WorldWideScience

Sample records for table-top earthquakes learn

  1. Table-top earthquakes; a demonstration of seismology for teachers and students that can be used to augment lessons in earth science, physics, math, social studies, geography

    Science.gov (United States)

    Lahr, J.C.

    1998-01-01

    The apparatus consists of a heavy object that is dragged steadily with an elastic cord. Although pulled with a constant velocity, the heavy object repeatedly slides and then stops. A small vibration sensor, attached to a computer display, graphically monitors this intermittent motion. This intermittent sliding mimics the intermittent fault slippage that characterizes earthquake fault zones. In tectonically active regions, the Earth's outer brittle shell, which is about 50 km thick, is slowly deformed elastically along active faults. As the deformation increases, stress also increases, until fault slippage releases the stored elastic energy. This process is called elastic rebound. Detailed instructions are given for assembly and construction of this demonstration. Included are suggested sources for the vibration sensor (geophone) and the computer interface. Exclusive of the personal computer, the total cost is between $125 and $150. I gave a talk at the Geological Society of America's Cordilleran Section Centennial meeting on June 2, 1999. The slides show how this table-top demonstration can be used to help meet many of the K-12 teaching goals described in Benchmarks for Science Literacy (American Association for the Advancement of Science, 1993).
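
    The stick-slip behavior that this demonstration makes visible can also be reproduced numerically. The Python sketch below integrates a single spring-slider block pulled at constant velocity with static/kinetic Coulomb friction; all parameter values (mass, stiffness, friction coefficients, pull speed) are illustrative assumptions, not values taken from the article.

```python
# Minimal spring-slider ("stick-slip") sketch of the table-top demonstration:
# a block dragged through an elastic cord at constant speed, with
# static/kinetic Coulomb friction. All parameter values are illustrative
# assumptions, not taken from the article.
import numpy as np

m = 1.0            # block mass (kg)
k = 50.0           # cord stiffness (N/m)
v_pull = 0.01      # speed of the pulled end (m/s)
mu_s, mu_k = 0.6, 0.4   # static / kinetic friction coefficients
g, dt = 9.81, 1e-3

x, v = 0.0, 0.0            # block position and velocity
sliding, slip_start = False, 0.0
events = []                # (time, slip distance) of each "earthquake"

for i in range(int(200.0 / dt)):
    t = i * dt
    stretch = v_pull * t - x          # cord extension
    spring_force = k * stretch
    if not sliding and abs(spring_force) > mu_s * m * g:
        sliding, slip_start = True, x  # static friction overcome: slip begins
    if sliding:
        friction = -np.sign(v if v != 0 else spring_force) * mu_k * m * g
        v += (spring_force + friction) / m * dt
        x += v * dt
        if v <= 0:                     # block re-sticks: end of slip event
            events.append((t, x - slip_start))
            v, sliding = 0.0, False

print(f"{len(events)} slip events, mean slip "
      f"{np.mean([s for _, s in events]):.3f} m")
```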

  2. Table-top job analysis

    Energy Technology Data Exchange (ETDEWEB)

    1994-12-01

    The purpose of this Handbook is to establish general training program guidelines for training personnel in developing training for operation, maintenance, and technical support personnel at Department of Energy (DOE) nuclear facilities. Table-top job analysis (TTJA) is not the only method of job analysis; however, when conducted properly TTJA can be cost effective, efficient, and self-validating, and represents an effective method of defining job requirements. The table-top job analysis is suggested in the DOE Training Accreditation Program manuals as an acceptable alternative to traditional methods of analyzing job requirements. DOE 5480-20A strongly endorses and recommends it as the preferred method for analyzing jobs for positions addressed by the Order.

  3. Scenario-based table top simulations

    DEFF Research Database (Denmark)

    Broberg, Ole; Edwards, Kasper; Nielsen, J.

    2012-01-01

    This study developed and tested a scenario-based table top simulation method in a user-driven innovation setting. A team of researchers worked together with a user group of five medical staff members from the existing clinic. Table top simulations of a new clinic were carried out in a simple model...

  4. Table-top diffuse optical imaging

    NARCIS (Netherlands)

    Sturgeon, K.A.; Bakker, L.P.

    2006-01-01

    This report describes the work done during a six-month internship at Philips Research for a Master's in Electronic and Electrical Engineering. An existing table-top tomography system for measuring light in phantom breasts was restored. Updated software control and image reconstruction software was …

  5. Table Top Tennis: A Vehicle for Teaching Sportspersonship and Responsibility

    Science.gov (United States)

    Schwager, Susan; Stylianou, Michalis

    2012-01-01

    Table top tennis is a game that can be played in the classroom or lunchroom when the gymnasium is unavailable. It is a good activity for developing sportspersonship and responsibility in students in grades four and up. This article provides a description of table top tennis, including basic rules and strategies; an explanation of how it can…

  6. A novel shape-changing haptic table-top display

    Science.gov (United States)

    Wang, Jiabin; Zhao, Lu; Liu, Yue; Wang, Yongtian; Cai, Yi

    2018-01-01

    A shape-changing table-top display with haptic feedback allows its users to perceive 3D visual and texture displays interactively. Since few existing devices are developed as accurate displays with regulated haptic feedback, a novel attentive and immersive shape-changing mechanical interface (SCMI) consisting of an image processing unit and a transformation unit was proposed in this paper. In order to support a precise 3D table-top display with an offset of less than 2 mm, a custom-made mechanism was developed to form a precise surface and regulate the feedback force. The proposed image processing unit was capable of extracting texture data from a 2D picture for rendering the shape-changing surface and realizing 3D modeling. The preliminary evaluation result proved the feasibility of the proposed system.

  7. Feasibility study for an industrial superconducting table-top electron accelerator; Machbarkeitstudie fuer einen industriellen supraleitenden Table Top Elektronenbeschleuniger

    Energy Technology Data Exchange (ETDEWEB)

    Buettig, H.; Enghardt, W.; Gabriel, F.; Janssen, D.; Michel, P.; Pobell, F.; Prade, H.; Schneider, C.; Kudryavtsev, A.; Haberstroh, C.; Sandner, W.; Will, I.

    2004-07-01

    A concept of a table-top accelerator, consisting of a superconducting resonator and six subsequent standard TESLA cells operating at a frequency of 1.3 GHz, is presented. The electron gun is based on a photocathode. The photocathode section, the laser system, the cryostat module, the RF system, the beam extraction, and the cryogenic facility are described in particular. Finally, the efficiency and the costs are considered. (HSI)

  8. A 10 tesla table-top controlled waveform magnet.

    Science.gov (United States)

    Roy Choudhury, Aditya N; Venkataraman, V

    2012-04-01

    Controlled waveform magnets (CWMs) are a class of pulsed magnets whose pulse shape with time can be programmed by the user. With a CWM, the user gains control not only over the magnitude of the field but also over its rate of change. In this work we present a table-top CWM, driven by a capacitor bank, capable of producing virtually any user-shaped magnetic field waveform up to 10 tesla. Insulated gate bipolar transistor chips have been paralleled to form the high current switch and paralleled chips of SiC Schottky diodes form the crowbar diode module. Sample controlled waveforms including flat-tops up to 10 tesla and some triangular magnetic field pulses have been successfully generated for 10-20 ms with a ripple …
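
    As a rough sketch of the physics behind such a source, the lumped-circuit equations below describe a capacitor bank discharging into a magnet coil through a gated switch; the model and notation are generic textbook assumptions for illustration, not taken from the paper.

```latex
% Simplified lumped-circuit model of a capacitor-bank-driven pulsed magnet
% (generic textbook form; the symbols are not the paper's notation):
\[
L\,\frac{dI}{dt} + R\,I = V_C(t), \qquad
C\,\frac{dV_C}{dt} = -I, \qquad
B(t) = \kappa\, I(t),
\]
% L, R: coil inductance and resistance; C, V_C: bank capacitance and voltage;
% \kappa: the coil's field-per-ampere constant. Shaping the waveform amounts
% to gating the IGBT switch so that I(t) tracks a programmed reference
% despite these LCR dynamics; the crowbar diode carries the coil current
% when the switch is off.
```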

  9. Automatic Earthquake Detection by Active Learning

    Science.gov (United States)

    Bergen, K.; Beroza, G. C.

    2017-12-01

    In recent years, advances in machine learning have transformed fields such as image recognition, natural language processing and recommender systems. Many of these performance gains have relied on the availability of large, labeled data sets to train high-accuracy models; labeled data sets are those for which each sample includes a target class label, such as waveforms tagged as either earthquakes or noise. Earthquake seismologists are increasingly leveraging machine learning and data mining techniques to detect and analyze weak earthquake signals in large seismic data sets. One of the challenges in applying machine learning to seismic data sets is the limited labeled data problem; learning algorithms need to be given examples of earthquake waveforms, but the number of known events, taken from earthquake catalogs, may be insufficient to build an accurate detector. Furthermore, earthquake catalogs are known to be incomplete, resulting in training data that may be biased towards larger events and contain inaccurate labels. This challenge is compounded by the class imbalance problem; the events of interest, earthquakes, are infrequent relative to noise in continuous data sets, and many learning algorithms perform poorly on rare classes. In this work, we investigate the use of active learning for automatic earthquake detection. Active learning is a type of semi-supervised machine learning that uses a human-in-the-loop approach to strategically supplement a small initial training set. The learning algorithm incorporates domain expertise through interaction between a human expert and the algorithm, with the algorithm actively posing queries to the user to improve detection performance. We demonstrate the potential of active machine learning to improve earthquake detection performance with limited available training data.
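
    A minimal sketch of pool-based active learning with uncertainty sampling is shown below, written with scikit-learn on synthetic data; the data, the logistic-regression "detector" and the query budget are placeholder assumptions for illustration and are not the detectors or query strategy used in the study.

```python
# Minimal pool-based active learning with uncertainty sampling.
# Generic sketch on synthetic data (scikit-learn); NOT the detectors or
# query strategy of the study -- data, model and budgets are placeholders.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# imbalanced two-class problem standing in for "noise" vs. rare "earthquake"
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.95, 0.05], random_state=0)

pos, neg = np.where(y == 1)[0], np.where(y == 0)[0]
# small initial labeled set (seeded with a few known events, as from a catalog)
labeled = list(rng.choice(pos, 5, replace=False)) + \
          list(rng.choice(neg, 15, replace=False))
pool = [i for i in range(len(X)) if i not in set(labeled)]

clf = LogisticRegression(max_iter=1000)
for round_ in range(10):
    clf.fit(X[labeled], y[labeled])
    print(f"round {round_}: {len(labeled)} labels, "
          f"accuracy on labeled set = {clf.score(X[labeled], y[labeled]):.3f}")
    # query the pool samples the model is least certain about (prob ~ 0.5);
    # in practice a human analyst would label these queried waveforms
    proba = clf.predict_proba(X[pool])[:, 1]
    query = [pool[j] for j in np.argsort(np.abs(proba - 0.5))[:10]]
    labeled.extend(query)
    pool = [i for i in pool if i not in set(query)]
```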

  10. Generation of Medical X-ray and Terahertz Beams of Radiation Using Table-Top Accelerators

    OpenAIRE

    Baryshevsky, V.; Gurinovich, A.; Gurnevich, E.; Lobko, A.

    2013-01-01

    Theoretical and experimental studies of PXR and diffracted radiation of an oscillator in crystals, combined with the development of VFEL generators with photonic crystals, give a promising basis for the creation of X-ray and THz sources using the same table-top accelerator. A multi-modal medical facility can be developed on the basis of one dedicated table-top electron accelerator of some tens of MeV energy. Such a system could find a lot of applications in medical practice and biomedical investigati...

  11. Learning from Earthquakes: 2014 Napa Valley Earthquake Reconnaissance Report

    OpenAIRE

    Fischer, Erica

    2014-01-01

    Structural damage was observed during reconnaissance after the 2014 South Napa Earthquake, and included damage to wine storage and fermentation tanks, collapse of wine storage barrel racks, unreinforced masonry building partial or full collapse, and residential building damage. This type of damage is not unique to the South Napa Earthquake, and was observed after other earthquakes such as the 1977 San Juan Earthquake, and the 2010 Maule Earthquake. Previous research and earthquakes have demon...

  12. Lessons learned from the 1994 Northridge Earthquake

    International Nuclear Information System (INIS)

    Eli, M.W.; Sommer, S.C.

    1995-01-01

    Southern California has a history of major earthquakes and also has one of the largest metropolitan areas in the United States. The 1994 Northridge Earthquake challenged the industrial facilities and lifeline infrastructure in the northern Los Angeles (LA) area. Lawrence Livermore National Laboratory (LLNL) sent a team of engineers to conduct an earthquake damage investigation in the Northridge area, on a project funded jointly by the United States Nuclear Regulatory Commission (USNRC) and the United States Department of Energy (USDOE). Many of the structures, systems, and components (SSCs) and lifelines that suffered damage are similar to those found in nuclear power plants and in USDOE facilities. Lessons learned from these experiences can have some applicability at commercial nuclear power plants.

  13. Magnetic turbulence in a table-top laser-plasma relevant to astrophysical scenarios

    Science.gov (United States)

    Chatterjee, Gourab; Schoeffler, Kevin M.; Kumar Singh, Prashant; Adak, Amitava; Lad, Amit D.; Sengupta, Sudip; Kaw, Predhiman; Silva, Luis O.; Das, Amita; Kumar, G. Ravindra

    2017-06-01

    Turbulent magnetic fields abound in nature, pervading astrophysical, solar, terrestrial and laboratory plasmas. Understanding the ubiquity of magnetic turbulence and its role in the universe is an outstanding scientific challenge. Here, we report on the transition of magnetic turbulence from an initially electron-driven regime to one dominated by ion-magnetization in a laboratory plasma produced by an intense, table-top laser. Our observations at the magnetized ion scale of the saturated turbulent spectrum bear a striking resemblance with spacecraft measurements of the solar wind magnetic-field spectrum, including the emergence of a spectral kink. Despite originating from diverse energy injection sources (namely, electrons in the laboratory experiment and ion free-energy sources in the solar wind), the turbulent spectra exhibit remarkable parallels. This demonstrates the independence of turbulent spectral properties from the driving source of the turbulence and highlights the potential of small-scale, table-top laboratory experiments for investigating turbulence in astrophysical environments.

  14. Table-top trainings in radiation protection. Educational element or emergency planning?

    International Nuclear Information System (INIS)

    Stolar, A.

    2009-01-01

    Education plays an important role in emergency management, preparing members at all levels of management for the worst-case scenario. The mission that organizations have to deal with is based on the application of fundamental knowledge, accumulated know-how, and knowledge of the interfaces and abilities of the participating organizations. Table-top trainings are an effective, safe and resource-saving way to prepare for disasters. What helped great warlords win centuries ago is now increasingly anchored on a statutory basis and introduced into emergency planning. (orig.)

  15. Intraoperative Catastrophic Failure of a Mizuho OSI Orthopedic Trauma Table Top: A Case Report.

    Science.gov (United States)

    Andrews, Colin R; Stapinski, Brian; Fox, Edward

    2016-01-01

    During orthopaedic open reduction and internal fixation, early fatigue failure of a Mizuho OSI Orthopedic Trauma Table Top occurred. The patient fell toward the ground but was uninjured. A material failure characterized by a crack in the spar tube leading to complete table component separation was identified. To our knowledge, this report is the first of its kind to specifically highlight surgical table device failure intraoperatively. Although rare, early fatigue failure of operating tables is possible, leading to hazardous intraoperative situations and the potential for serious patient injury or death. Operating tables and equipment should be inspected rigorously and with proper documentation to prevent such events.

  16. Learning Earthquake Design and Construction 16. How to make ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 10; Issue 3. Learning Earthquake Design and Construction 16. How to make Stone Masonry Buildings Earthquake Resistant? C V R Murty. Classroom Volume 10 Issue 3 March 2005 pp 92-95 ...

  17. Learning Earthquake Design and Construction–Why are Open ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 10; Issue 10. Learning Earthquake Design and Construction – Why are Open-Ground Storey Buildings Vulnerable in Earthquakes? C V R Murty. Classroom Volume 10 Issue 10 October 2005 pp 84-87 ...

  18. Learning Earthquake Design and Construction–12. How do Brick ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 10; Issue 1. Learning Earthquake Design and Construction – 12. How do Brick Masonry Houses Behave during Earthquakes? C V R Murty. Classroom Volume 10 Issue 1 January 2005 pp 88-90 ...

  19. Learning Earthquake Design and Construction 6. How Architectural ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 9; Issue 10. Learning Earthquake Design and Construction – 6. How Architectural Features Affect Buildings During Earthquakes? C V R Murty. Classroom Volume 9 Issue 10 October 2004 pp 82-85 ...

  20. Learning Earthquake Design and Construction–Why are Short ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 10; Issue 10. Learning Earthquake Design and Construction – Why are Short Columns more Damaged During Earthquakes? C V R Murty. Classroom Volume 10 Issue 10 October 2005 pp 88-91 ...

  1. Learning Earthquake Design and Construction–10. How Flexibility ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 9; Issue 12. Learning Earthquake Design and Construction – 10. How Flexibility of Buildings Affects their Earthquake Response. C V R Murty. Classroom Volume 9 Issue 12 December 2004 pp 74-77 ...

  2. Learning Earthquake Design and Construction-17. How do ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 10; Issue 4. Learning Earthquake Design and Construction - 17. How do Earthquakes Affect Reinforced Concrete Buildings? C V R Murty. Classroom Volume 10 Issue 4 April 2005 pp 83-86 ...

  3. Learning Earthquake Design and Construction 20. How do Beam ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 10; Issue 6. Learning Earthquake Design and Construction – How do Beam–Column Joints in RC Buildings Resist Earthquakes? C V R Murty. Classroom Volume 10 Issue 6 June 2005 pp 82-85 ...

  4. Thermal CFD study and improvement of table top fridge evaporator by virtual prototyping

    Directory of Open Access Journals (Sweden)

    Georgi Todorov

    2017-09-01

    Full Text Available The present paper aims to assess and improve the existing design of evaporators for household table top refrigeration appliances using Computational Fluid Dynamics (CFD). This category of refrigerator is a compact and cheap solution for domestic appliances. The requirement for a low-cost solution does not remove the need for high efficiency, usually expressed as the "energy class". The evaporator is an important component of the refrigerator's heat transport system and central to its efficiency. The existing evaporator design is improved in two directions, the shape of the serpentine and its cross section, constrained by an overall cost limit. Two groups of thermal CFD analyses are performed over various design variants. The virtual prototypes used make it possible to view the heat transfer process in detail and to reach a better solution in terms of overall price/performance. This study shows the effect of serpentine geometry on evaporator performance and demonstrates the benefits of virtual prototyping when targeting optimization and improvement.

  5. International experience with a multidisciplinary table top exercise for response to a PWR accident

    International Nuclear Information System (INIS)

    Lakey, J.R.A.

    1996-01-01

    Table Top Exercises are used for the training of emergency response personnel from a wide range of disciplines whose duties range from strategic to tactical, from managerial to operational. The exercise reported in this paper simulates the first two or three hours of an imaginary accident on a generic PWR site (named Seaside or Lakeside depending on its location). It is designed to exercise the early response of staff of the utility, government, local authority and the media, and some players represent the public. The relatively few scenarios used for this exercise are based on actual events scaled to give off-site consequences which demand early assessment and therefore stress the communication procedures. The exercise is applicable in different cultures and has been used in over 20 short courses held in the USA, UK, Sweden, Prague, and Hong Kong. There are two styles of support for players: a linear program which ensures that all players follow the desired path through the event, and an open program which is triggered by umpires (who play the reactor crew from a script) and by requests from other players. In both cases the exercise ends with a Press Conference. Players have an initial briefing and are assigned to roles; those who must speak at interviews and at the Press Conference are given a separate briefing by an expert in Public Affairs. The exercise runs with up to six groups and the communication rate reaches about 30 to 40 messages per hour for each group. The exercise can be applied to test management and communication systems and to study human response to emergencies, because the merits of individual players are highlighted in the relatively stressful conditions of the initial stage of an accident.

  6. The 2016 Kumamoto, Japan, earthquakes and lessons learned for large earthquakes in urban areas

    Science.gov (United States)

    Hirata, Naoshi; Kato, Aitaro; Nakamura, Kouji; Hiyama, Yohei

    2017-04-01

    A series of devastating earthquakes hit the Kumamoto districts in Kyushu, Japan, in April 2016. A M6.5 event occurred at 21:26 on April 14th (JST) and, 28 hours later, a M7.3 event occurred at 01:25 on April 16th (JST) at almost the same location at a depth of 10 km. Both earthquakes were felt in the town of Mashiki with a seismic intensity of 7 on the Japan Meteorological Agency (JMA) scale, the highest level of the scale. Very strong accelerations were observed: 1,580 gal at the KiK-net Mashiki station for the M6.5 event and 1,791 gal at the Ohtsu City station for the M7.3 event. As a result, more than 8,000 houses totally collapsed, 26,000 were heavily damaged, and 120,000 were partially damaged. More than 170 people were killed by the two earthquakes. The important lesson from the Kumamoto earthquake is that very strong ground motions may hit within a few days after a first large event. This can have serious impacts on houses already damaged by the first large earthquake. In the 2016 Kumamoto sequence, there were also many strong aftershocks, including M5.8-5.9 events, until April 18th. More than 180,000 people had to take shelter because of ongoing strong aftershocks. We discuss both the natural and human aspects of the Kumamoto earthquake disaster caused by inland shallow large earthquakes. We will report on the lessons learned for large earthquakes hitting the metropolitan area of Tokyo, Japan.

  7. "Earthquake!"--A Cooperative Learning Experience.

    Science.gov (United States)

    Hodder, A. Peter W.

    2001-01-01

    Presents an exercise designed as a team building experience for managers that can be used to demonstrate to science students the potential benefit of group decision-making. Involves the ranking of options for surviving a large earthquake. Yields quantitative measures of individual student knowledge and how well the groups function. (Author/YDS)

  8. Toward the Extreme Ultra Violet Four Wave Mixing Experiments: From Table Top Lasers to Fourth Generation Light Sources

    OpenAIRE

    Riccardo Cucini; Andrea Battistoni; Filippo Bencivenga; Alessandro Gessini; Riccardo Mincigrucci; Erika Giangrisostomi; Emiliano Principi; Flavio Capotondi; Emanuele Pedersoli; Michele Manfredda; Maya Kiskinova; Claudio Masciovecchio

    2015-01-01

    Three different Transient Grating setups are presented, with pulsed and continuous wave probe at different wavelengths, ranging from infrared to the extreme ultra violet region. Both heterodyne and homodyne detections are considered. Each scheme introduces variations with respect to the previous one, allowing moving from classical table top laser experiments towards a new four wave mixing scheme based on free electron laser radiation. A comparison between the various setups and the first resu...

  9. Learning Earthquake Design and Construction 13. Why Should ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 10; Issue 2. Learning Earthquake Design and Construction 13. Why Should Masonry Buildings have Simple Structural Configuration? C V R Murty. Classroom Volume 10 Issue 2 February 2005 pp 79-82 ...

  10. Learning Earthquake Design and Construction 8. What is the ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 9; Issue 11. Learning Earthquake Design and Construction – 8. What is the Seismic Design Philosophy for Buildings? C V R Murty. Classroom Volume 9 Issue 11 November 2004 pp 89-93 ...

  11. Learning Earthquake Design and Construction–4. Where are the ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 9; Issue 9. Learning Earthquake Design and Construction – 4. Where are the Seismic Zones in India? C V R Murty. Classroom Volume 9 Issue 9 September 2004 pp 83-87.

  12. Learning Earthquake Design and Construction-2. How the Ground ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 9; Issue 8. Learning Earthquake Design and Construction – 2. How the Ground Shakes! C V R Murty. Classroom Volume 9 Issue 8 August 2004 pp 79-82.

  13. Learning Earthquake Design and Construction 14. Why are ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 10; Issue 2. Learning Earthquake Design and Construction 14. Why are Horizontal Bands Necessary in Masonry Buildings? C V R Murty. Classroom Volume 10 Issue 2 February 2005 pp 83-85 ...

  14. Learning Earthquake Design and Construction–23. Why are ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 10; Issue 11. Learning Earthquake Design and Construction – 23. Why are Buildings with Shear Walls Preferred in Seismic Regions? C V R Murty. Classroom Volume 10 Issue 11 November 2005 pp 85-88 ...

  15. GIS learning tool for world's largest earthquakes and their causes

    Science.gov (United States)

    Chatterjee, Moumita

    The objective of this thesis is to increase awareness about earthquakes among people, especially young students, by showing the five largest and two most predictable earthquake locations in the world and their plate tectonic settings. This is a geography-based interactive tool which can be used for learning about the causes of great earthquakes in the past and the safest places on the earth in order to avoid the direct effects of earthquakes. This approach provides an effective way of learning for students as it is very user friendly and more aligned to the interests of the younger generation. In this tool the user can click on the various points located on the world map, which opens a picture and a link to the webpage for that point, showing detailed information on the earthquake history of that place, including the magnitude of the quake, the years of past quakes and the plate tectonic settings that made the place earthquake prone. Apart from the earthquake-related information, students will also be able to customize the tool to suit their needs or interests. Students will be able to add/remove layers, measure the distance between any two points on the map, select any place on the map and get more information for that place, create a layer from this set to do a detailed analysis, run a query, change display settings, etc. At the end of this tool the user has to go through the earthquake safety guidelines in order to be safe during an earthquake. This tool uses Java as its programming language and uses Map Objects Java Edition (MOJO) provided by ESRI. The tool is developed for educational purposes and hence its interface has been kept simple and easy to use, so that students can gain maximum knowledge through it instead of having a hard time installing it. There are many details to explore which show what a GIS-based tool is capable of. The only thing needed to run this tool is the latest Java edition installed on the machine. This approach makes study more fun and
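
    One of the operations listed above, measuring the distance between two map points, reduces to a great-circle computation. The Python sketch below illustrates that calculation only; the actual tool is written in Java with ESRI's MOJO, and the example coordinates are approximate.

```python
# Great-circle (haversine) distance between two map points -- the kind of
# "measure distance" operation the tool exposes. Python illustration only;
# the actual tool is written in Java using ESRI's MOJO, and the example
# coordinates below are approximate.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r_earth_km=6371.0):
    """Distance in km between two (lat, lon) points given in degrees."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2.0 * r_earth_km * asin(sqrt(a))

# e.g. roughly Valdivia, Chile (1960 M9.5) to Anchorage, Alaska (1964 M9.2)
print(f"{haversine_km(-39.8, -73.2, 61.2, -149.9):.0f} km")
```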

  16. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  17. Earthquakes.

    Science.gov (United States)

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  18. Lessons learned from the Wenchuan earthquake.

    Science.gov (United States)

    Shen, Ji; Kang, Junxing; Shi, Yingkang; Li, Youping; Li, Yuanfeng; Su, Lin; Wu, Jianlin; Zheng, Shangwei; Jiang, Jie; Hu, Weijian; Yang, Yong; Tang, Xuefeng; Wen, Jin; Li, Ling; Shen, Jiantong; Zhong, Dake

    2012-05-01

    To draw experience and lessons from the emergency medical rescue after the Wenchuan Earthquake through a national and overall review, for consideration in worldwide catastrophe rescue in the future. A huge amount of primary data was systematically collected and analyzed, and conclusions and lessons were drawn in five respects: quake-damage conditions, the command system, emergency medical rescue, prevention and control of infectious diseases, and pairing-assistance for medical system and service reconstruction. 1. The numbers of dead, injured and displaced made the Wenchuan Earthquake one of the top 9 catastrophes around the world during the past two decades. 2. Countermeasures such as four-level linkage in a nation-province-city-county model, mutual assistance between military and local forces, and frontline commanding effectively ensured the dispatch of and cooperation among rescue forces. 3. Three-level medical transfers, "four concentrations" prevention and treatment, and whole-course rehabilitation from an early stage managed to lower mortality and disability rates to minimum levels. 4. "Four-keynote infectious disease control" under whole coverage and "five measures and four reinforcement measures" in settlements kept infectious disease rates below the average levels of the three pre-quake years. 5. Pairing-assistance in terms of talent, finance, materials and capacity building between 18 other provinces/municipalities and the 18 extremely or severely stricken areas in Sichuan Province ensured efficient post-quake reconstruction, system reconstruction and long-term mechanism construction. The successful experience from the Wenchuan Earthquake can be summarized as: one goal, people-oriented life-saving; two tasks, medical rescue for the diseases of the injured and healthcare and anti-epidemic work for the safety of lives; three strategies, namely medical transfers after on-site triage

  19. Toward the Extreme Ultra Violet Four Wave Mixing Experiments: From Table Top Lasers to Fourth Generation Light Sources

    Directory of Open Access Journals (Sweden)

    Riccardo Cucini

    2015-01-01

    Full Text Available Three different Transient Grating setups are presented, with pulsed and continuous wave probe at different wavelengths, ranging from infrared to the extreme ultra violet region. Both heterodyne and homodyne detections are considered. Each scheme introduces variations with respect to the previous one, allowing moving from classical table top laser experiments towards a new four wave mixing scheme based on free electron laser radiation. A comparison between the various setups and the first results from extreme ultra violet transient grating experiments is also discussed.

  20. A table-top x-ray FEL based on a laser wakefield accelerator-undulator system

    Energy Technology Data Exchange (ETDEWEB)

    Nakajima, K.; Kawakubo, T.; Nakanishi, H. [National Lab. for High Energy Physics, Ibaraki-ken (Japan)] [and others]

    1995-12-31

    Ultrahigh-gradient electron acceleration has been confirmed owing to the laser wakefield acceleration mechanism driven by an intense short laser pulse in an underdense plasma. The laser wakefield acceleration makes it possible to build a compact electron linac capable of producing an ultra-short bunched electron beam. While the acceleration is attributed to longitudinal wakefields, transverse wakefields simultaneously generated by a short laser pulse can serve as a plasma undulator with a very short wavelength equal to half the plasma wavelength. We propose a new FEL concept for X-rays based on a laser wakefield accelerator-undulator system driven by intense short laser pulses delivered from table-top terawatt lasers. The system is composed of an accelerator stage and an undulator stage in a table-top size. A low energy electron beam is accelerated and bunched into microbunches by laser wakefields in the accelerator stage. The micro-bunched beam, travelling in the direction opposite to the driving laser pulses, produces coherent X-ray radiation in the undulator stage. A practical configuration and its analyses are presented.

  1. High Resolution Imaging of a Dense Micro-capillary Plasma with a Table-top Soft X-Ray Laser.

    Science.gov (United States)

    Rocca, J. J.; Marconi, M. C.; Moreno, C. H.; Macchietto, C. D.; Shlyaptsev, V. N.

    1998-11-01

    We report the first use of a table-top soft x-ray laser in the imaging of dense plasmas. Due to their short wavelength, high brightness, short pulse duration and high degree of collimation, soft x-ray lasers are excellent radiation sources for shadowgraphy studies of dense plasmas. Recently, a Ne-like Y x-ray laser pumped by the Nova laser was used at Lawrence Livermore National Lab. to image laser-accelerated and laser-exploded foils with micrometer-scale resolution (R. Cauble et al., Phys. Rev. Lett. 74, 3816 (1995)). Now, the advent of saturated table-top soft x-ray lasers (J. J. Rocca et al., Phys. Rev. Lett. 77, 1476 (1996); B. Benware et al., Opt. Lett. 22, 796 (1997)) has opened the possibility to probe a wide variety of dense plasmas. We have obtained a sequence of high resolution (≈ 5 μm, 0.6-0.7 ns) shadowgrams that map the evolution of the plasma of a 380 μm micro-capillary discharge using a capillary discharge-pumped 46.9 nm laser backlighter. The measurements show that the plasma evolves from an initially non-uniform distribution into a cylindrically symmetric plasma column with a density minimum on axis. This work was supported by DOE grant DE-FG03-98DP00208. We also acknowledge the support of NSF for the development of the laser.

  2. The 2015 Nepal earthquake disaster: lessons learned one year on.

    Science.gov (United States)

    Hall, M L; Lee, A C K; Cartwright, C; Marahatta, S; Karki, J; Simkhada, P

    2017-04-01

    The 2015 earthquake in Nepal killed over 8000 people, injured more than 21,000 and displaced a further 2 million. One year later, a national workshop was organized with various Nepali stakeholders involved in the response to the earthquake. The workshop provided participants an opportunity to reflect on their experiences and sought to learn lessons from the disaster. One hundred and thirty-five participants took part and most had been directly involved in the earthquake response. They included representatives from the Ministry of Health, local and national government, the armed forces, non-governmental organizations, health practitioners, academics, and community representatives. Participants were divided into seven focus groups based around the following topics: water, sanitation and hygiene, hospital services, health and nutrition, education, shelter, policy and community. Facilitated group discussions were conducted in Nepalese and the key emerging themes are presented. Participants described a range of issues encountered, some specific to their area of expertize but also more general issues. These included logistics and supply chain challenges, leadership and coordination difficulties, impacts of the media as well as cultural beliefs on population behaviour post-disaster. Lessons identified included the need for community involvement at all stages of disaster response and preparedness, as well as the development of local leadership capabilities and community resilience. A 'disconnect' between disaster management policy and responses was observed, which may result in ineffective, poorly planned disaster response. Finding time and opportunity to reflect on and identify lessons from disaster response can be difficult but are fundamental to improving future disaster preparedness. The Nepal Earthquake National Workshop offered participants the space to do this. It garnered an overwhelming sense of wanting to do things better, of the need for a Nepal-centric approach

  3. Evaluation of dosimetric effects caused by the treatment table top; Avaliacao dos efeitos dosimetricos causados pelo tampo da mesa de tratamento

    Energy Technology Data Exchange (ETDEWEB)

    Camargo, Andre Vinicius de; Alvares, Bruno; Fioravante, Gustavo Donisete; Silva, Diego da Cunha Silveira Alves da; Giglioli, Milena; Batista, Felipe Placido; Silva, Lais Bueno da; Radicchi, Lucas Augusto [Hospital de Cancer de Barretos, SP (Brazil)

    2016-07-01

    The attenuation and bolus effect of two table tops from different manufacturers were investigated for 6 MV photons. The bolus effect of the couch was compared with a 0.5 cm bolus (water equivalent). The maximum attenuation found for the Exact Couch table was 6.9% and the minimum was 0.63%. For the rail of the Exact Couch, with the beam at 180 deg, an attenuation of 13.61% was observed. As with attenuation, the surface dose differed for each region of the Exact Couch and for the different components of the iBeam evo. The percentage dose at a depth of 1.8 mm was greater for the table top of the Exact Couch (66.2%). The extender of the iBeam evo table increased the dose by 38.3% and its table top by 51.9% at the same depth. The bolus increased the surface dose by 61.1%. The results of this study showed that table tops in contact with the surface of the patient may significantly increase surface dose and beam attenuation. (author)
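
    The percentages above are ratios of paired dose measurements. The small Python sketch below only illustrates how such attenuation and surface-dose figures are computed from two readings; the dose values in the example calls are placeholders, not the paper's measurements.

```python
# How couch attenuation and surface-dose increase percentages like those
# above are derived from paired measurements. The dose readings in the
# example calls are placeholders, not the paper's measured values.
def attenuation_pct(dose_open, dose_through_couch):
    """Percent dose reduction caused by the table top in the beam path."""
    return 100.0 * (dose_open - dose_through_couch) / dose_open

def surface_dose_increase_pct(dose_with, dose_without):
    """Percent increase in surface dose from the couch's bolus effect."""
    return 100.0 * (dose_with - dose_without) / dose_without

print(attenuation_pct(100.0, 93.1))            # placeholder readings -> 6.9 %
print(surface_dose_increase_pct(64.4, 40.0))   # placeholder readings -> 61.0 %
```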

  4. Earthquakes

    Science.gov (United States)


  5. Seismic-resistant design of nuclear power stations in Japan, earthquake country. Lessons learned from Chuetsu-oki earthquake

    International Nuclear Information System (INIS)

    Irikura, Kojiro

    2008-01-01

    The new assessment (back-check) of earthquake-proof safety was being conducted at the Kashiwazaki-Kariwa Nuclear Power Plants of Tokyo Electric Co., in response to a request based on the guideline for reactor evaluation for the seismic-resistant design code revised in 2006, when the 2007 Chuetsu-oki Earthquake occurred and brought about an unexpectedly huge tremor in this area: although the magnitude of the earthquake was only 6.8, the intensity of the earthquake motion exceeded the assumed level by more than a factor of 2.5. This paper introduces how and why the guideline for seismic-resistant design of nuclear facilities was revised in 2006, the outline of the Chuetsu-oki Earthquake, and preliminary findings and lessons learned from the Earthquake. The paper specifically discusses (1) how we may identify in advance active geologic faults such as the one overlooked this time, (2) how we can build adequate models of the seismic source from which its characteristics can be extracted, and (3) how strong ground motion may be estimated by simulation for the ground vibration level of a possibly overlooked fault. (S. Ohno)

  6. A table top experiment to investigate production and properties of a plasma confined by a dipole magnet

    Science.gov (United States)

    Baitha, Anuj Ram; Kumar, Ashwani; Bhattacharjee, Sudeep

    2018-02-01

    We report a table top experiment to investigate production and properties of a plasma confined by a dipole magnet. A water cooled, strong, cylindrical permanent magnet (NdFeB) magnetized along the axial direction and having a surface magnetic field of ˜0.5 T is employed to create a dipole magnetic field. The plasma is created by electron cyclotron resonance heating. Visual observations of the plasma indicate that radiation belts appear due to trapped particles, similar to the earth's magnetosphere. The electron temperature lies in the range 2-13 eV and is hotter near the magnets and in a downstream region. It is found that the plasma (ion) density reaches a value close to 2 × 10¹¹ cm⁻³ and peaks at a radial distance about 3 cm from the magnet. The plasma beta β (β = plasma pressure/magnetic pressure) increases radially outward, and the maximum β for the present experimental system is ˜2%. It is also found that the singly charged ions are dominant in the discharge.
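
    For readers unfamiliar with the quantity quoted above, the standard definition of the plasma beta is sketched below; the notation is generic, not necessarily the authors' own.

```latex
% Standard definition of the plasma beta quoted above (generic notation):
\[
\beta \;=\; \frac{p_{\mathrm{plasma}}}{p_{\mathrm{magnetic}}}
      \;=\; \frac{n\,k_B\,(T_e + T_i)}{B^{2}/2\mu_0}.
\]
% With n ~ 2e11 cm^-3, T_e of a few eV, and a dipole field that falls off
% steeply with distance from the magnet, beta grows radially outward,
% consistent with the reported maximum of about 2%.
```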

  7. A table top experiment to investigate production and properties of a plasma confined by a dipole magnet.

    Science.gov (United States)

    Baitha, Anuj Ram; Kumar, Ashwani; Bhattacharjee, Sudeep

    2018-02-01

    We report a table top experiment to investigate production and properties of a plasma confined by a dipole magnet. A water cooled, strong, cylindrical permanent magnet (NdFeB) magnetized along the axial direction and having a surface magnetic field of ∼0.5 T is employed to create a dipole magnetic field. The plasma is created by electron cyclotron resonance heating. Visual observations of the plasma indicate that radiation belts appear due to trapped particles, similar to the earth's magnetosphere. The electron temperature lies in the range 2-13 eV and is hotter near the magnets and in a downstream region. It is found that the plasma (ion) density reaches a value close to 2 × 10¹¹ cm⁻³ and peaks at a radial distance about 3 cm from the magnet. The plasma beta β (β = plasma pressure/magnetic pressure) increases radially outward, and the maximum β for the present experimental system is ∼2%. It is also found that the singly charged ions are dominant in the discharge.

  8. Energy extraction and achievement of the saturation limit in a discharge pumped table-top soft x-ray amplifier.

    Science.gov (United States)

    Rocca, J. J.; Clark, D. P.; Chilla, J. L. A.; Shlyaptsev, V. N.; Marconi, M. C.

    1996-11-01

    There is significant interest in the demonstration of compact soft x-ray amplifiers capable of generating pulses of substantial energy for applications. This motivates the demonstration of gain media generated by compact devices that can be successfully scaled in length to reach gain saturation. To date, gain saturation had only been achieved in a few soft x-ray laser lines in plasmas generated by some of the world's largest laser facilities (B. J. MacGowan et al., Phys. Fluids B 4, 2326 (1992); A. Carillon et al., Phys. Rev. Lett. 68, 2917 (1992); B. Rus et al., in AIP Conf. Proc. 332, X-ray Lasers 1994, p. 152; S. Wang et al., ibid., p. 293). Previously we reported large amplification at 46.9 nm in Ne-like argon in a plasma column generated by a fast capillary discharge (J. J. Rocca et al., Phys. Rev. Lett. 73, 2192 (1994)). Herein we report the generation of laser pulse energies up to 30 μJ at 46.9 nm in such a discharge and the first clear evidence of gain saturation of a table-top soft x-ray amplifier. Single pass amplification experiments yielded laser pulse energies up to 6 μJ and double pass amplification using an iridium mirror yielded 30 μJ. The observed saturation of the gain and laser pulse energy are in good agreement with the results of radiation transport calculations. Work supported by the National Science Foundation.

  9. Kinetics of Polymer-Fullerene Phase Separation during Solvent Annealing Studied by Table-Top X-ray Scattering.

    Science.gov (United States)

    Vegso, Karol; Siffalovic, Peter; Jergel, Matej; Nadazdy, Peter; Nadazdy, Vojtech; Majkova, Eva

    2017-03-08

    Solvent annealing is an efficient way of driving phase separation in polymer-fullerene blends to optimize the bulk heterojunction (BHJ) morphology of the active layer in polymer solar cells. To track the process in real time across all relevant stages of solvent evaporation, laboratory-based in situ small- and wide-angle X-ray scattering measurements were applied simultaneously to a model P3HT:PCBM blend dissolved in dichlorobenzene. PCBM molecule agglomeration starts at ∼7 wt % concentration of the solid content of the blend in the solvent. Although PCBM agglomeration slows down at ∼10 wt % of solid content, the rate constant of phase separation is not changed, suggesting agglomeration and reordering of P3HT molecular chains. Having the longest duration, this stage most affects the BHJ morphology. Phase separation accelerates rapidly at a concentration of ∼25 wt %, having the same rate constant as the growth of P3HT crystals. P3HT crystallization is the driving force for phase separation in the final stages before complete solvent evaporation, having no visible temporal overlap with PCBM agglomeration. For the first time, such a study was done in the laboratory, demonstrating the potential of the latest generation of table-top high-brilliance X-ray sources as a viable alternative before more sophisticated X-ray scattering experiments at synchrotron facilities are performed.

  10. THz and Sub-THz Capabilities of a Table-Top Radiation Source Driven by an RF Thermionic Electron Gun

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, Alexei V.; Agustsson, R.; Boucher, S.; Campese, Tara; Chen, Y.C.; Hartzell, Josiah J.; Jocobson, B.T.; Murokh, A.; O' Shea, F.H.; Spranza, E.; Berg, W.; Borland, M.; Dooling, J. C.; Erwin, L.; Lindberg, R. R.; Pasky, S.J.; Sereno, N.; Sun, Y.; Zholents, A.

    2017-06-01

    Design features and experimental results are presented for a sub-mm wave source [1] based on the APS RF thermionic electron gun. The setup includes a compact alpha-magnet, quadrupoles, sub-mm-wave radiators, and THz optics. The sub-THz radiator is a planar, oversized structure with gratings. An upgrade of the source for generation at frequencies above 1 THz is discussed. The THz radiator will use a short-period undulator having 1 T field amplitude and ~20 cm length, integrated with a low-loss oversized waveguide. Both radiators are integrated with a miniature horn antenna and a small ~90° in-vacuum bending magnet. The electron beamline is designed to operate in different modes, including conversion to a flat beam interacting efficiently with the radiator. The source can be used for cancer diagnostics, surface defectoscopy, and non-destructive testing. The sub-THz experiment demonstrated the good potential of a robust, table-top system for generation of narrow-bandwidth THz radiation. This setup can be considered a prototype of a compact, laser-free, flexible source capable of generating long trains of sub-THz and THz pulses with repetition rates not available with laser-driven sources.

  11. People bouncing on trampolines: dramatic energy transfer, a table-top demonstration, complex dynamics and a zero sum game.

    Directory of Open Access Journals (Sweden)

    Manoj Srinivasan

    Full Text Available Jumping on trampolines is a popular backyard recreation. In some trampoline games (e.g., "seat drop war"), when two people land on the trampoline with only a small time-lag, one person bounces much higher than the other, as if energy has been transferred from one to the other. First, we illustrate this energy transfer in a table-top demonstration, consisting of two balls dropped onto a mini-trampoline, landing almost simultaneously, sometimes resulting in one ball bouncing much higher than the other. Next, using a simple mathematical model of two masses bouncing passively on a massless trampoline with no dissipation, we show that with specific landing conditions, it is possible to transfer all the kinetic energy of one mass to the other through the trampoline - in a single bounce. For human-like parameters, starting with equal energy, the energy transfer is maximal when one person lands approximately when the other is at the bottom of her bounce. The energy transfer persists even for very stiff surfaces. The energy-conservative mathematical model exhibits complex non-periodic long-term motions. To complement this passive bouncing model, we also performed a game-theoretic analysis, appropriate when both players are acting strategically to steal the other player's energy. We consider a zero-sum game in which each player's goal is to gain the other player's kinetic energy during a single bounce, by extending her leg during flight. For high initial energy and a symmetric situation, the best strategy for both subjects (minimax strategy and Nash equilibrium) is to use the shortest available leg length and not extend their legs. On the other hand, an asymmetry in initial heights allows the player with more energy to gain even more energy in the next bounce. Thus synchronous bouncing is unstable both for passive bouncing and when leg lengths are controlled as in the game-theoretic equilibria.

  12. Lessons learned from the 2016 Kumamoto earthquake: Building damages and behavior of seismically isolated buildings

    Science.gov (United States)

    Morita, Keiko; Takayama, Mineo

    2017-10-01

    Powerful earthquakes struck Kumamoto and Oita Prefectures in Kyushu, Japan, beginning with the magnitude 6.5 foreshock at 21:26 JST on 14 April, followed by the magnitude 7.3 mainshock at 1:25 JST on 16 April 2016. The sequence also involved more than 1,700 perceptible earthquakes as of 13 June. The entire sequence was named the 2016 Kumamoto earthquake by the Japan Meteorological Agency. Thousands of buildings and many roads were damaged, and landslides occurred. The Japanese building standard law was revised in 1981, and structural damage was concentrated in buildings constructed prior to 1981. The areas of Mashiki and southern Aso were the most badly affected, with wooden houses in particular suffering extreme damage. In Japan, Prof. Hideyuki Tada (title at the time) began research on laminated rubber bearings in 1978 and put them into practical use in 1981. The single-family house at Yachiyodai, Chiba Prefecture, completed in 1983, was the first seismically isolated building in Japan equipped with laminated rubber bearings. Afterward, this system was gradually adopted, mainly for office buildings such as research laboratories, hospitals, computer centers and other offices. In the 1994 Northridge earthquake, the 1995 Kobe earthquake and the 2011 Tohoku earthquake, seismically isolated buildings showed good performance, and recently the number of such buildings has increased, mainly in areas with high earthquake risk. Many people believed that Kumamoto was a low-risk area, but there were 24 seismically isolated buildings in Kumamoto Prefecture at the time. The seismically isolated buildings showed excellent performance during the earthquakes. They protected people, buildings and other important facilities from damage caused by the earthquake. The purpose of this paper is to discuss lessons learned from the 2016 Kumamoto earthquake and the behavior of seismically isolated buildings in the earthquake.

  13. Earthquake: Game-based learning for 21st century STEM education

    Science.gov (United States)

    Perkins, Abigail Christine

    To play is to learn. A lack of empirical research within game-based learning literature, however, has hindered educational stakeholders to make informed decisions about game-based learning for 21st century STEM education. In this study, I modified a research and development (R&D) process to create a collaborative-competitive educational board game illuminating elements of earthquake engineering. I oriented instruction- and game-design principles around 21st century science education to adapt the R&D process to develop the educational game, Earthquake. As part of the R&D, I evaluated Earthquake for empirical evidence to support the claim that game-play results in student gains in critical thinking, scientific argumentation, metacognitive abilities, and earthquake engineering content knowledge. I developed Earthquake with the aid of eight focus groups with varying levels of expertise in science education research, teaching, administration, and game-design. After developing a functional prototype, I pilot-tested Earthquake with teacher-participants (n=14) who engaged in semi-structured interviews after their game-play. I analyzed teacher interviews with constant comparison methodology. I used teachers' comments and feedback from content knowledge experts to integrate game modifications, implementing results to improve Earthquake. I added player roles, simplified phrasing on cards, and produced an introductory video. I then administered the modified Earthquake game to two groups of high school student-participants (n = 6), who played twice. To seek evidence documenting support for my knowledge claim, I analyzed videotapes of students' game-play using a game-based learning checklist. My assessment of learning gains revealed increases in all categories of students' performance: critical thinking, metacognition, scientific argumentation, and earthquake engineering content knowledge acquisition. Players in both student-groups improved mostly in critical thinking, having

  14. Learning from physics-based earthquake simulators: a minimal approach

    Science.gov (United States)

    Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele

    2017-04-01

    Physics-based earthquake simulators aim to generate synthetic seismic catalogs of arbitrary length, accounting for fault interaction, elastic rebound, realistic fault networks, and a simple earthquake nucleation process such as rate-and-state friction. Through comparison of synthetic and real catalogs, seismologists can gain insight into the earthquake occurrence process. Moreover, earthquake simulators can be used to infer some aspects of the statistical behavior of earthquakes within the simulated region, by analyzing timescales not accessible through observations. The development of earthquake simulators is commonly guided by the approach "the more physics, the better", pushing seismologists towards ever more earth-like simulators. However, despite its immediate attractiveness, we argue that this kind of approach makes it more and more difficult to understand which physical parameters are really relevant to describe the features of the seismic catalog in which we are interested. For this reason, here we take the opposite, minimal approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that a simple model may be more informative than a complex one for some specific scientific objectives, because it is more understandable. The model has three main components: the first is a realistic tectonic setting, i.e., a fault dataset of California; the other two components are quantitative laws for earthquake generation on each single fault, and the Coulomb Failure Function for modeling fault interaction. The final goal of this work is twofold. On one hand, we aim to identify the minimum set of physical ingredients that can satisfactorily reproduce the features of the real seismic catalog, such as short-term seismic clustering, and to investigate the hypothetical long-term behavior and fault synchronization. On the other hand, we want to investigate the limits of predictability of the model itself.
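
    The fault-interaction ingredient named above, the Coulomb Failure Function, is commonly written in the standard form sketched below (e.g. King, Stein & Lin, 1994); the notation is generic and not necessarily the simulator's own.

```latex
% Coulomb Failure Function used for fault interaction (standard form, e.g.
% King, Stein & Lin, 1994; generic notation, not necessarily the simulator's):
\[
\Delta\mathrm{CFF} \;=\; \Delta\tau \;+\; \mu'\,\Delta\sigma_n ,
\]
% \Delta\tau: change in shear stress resolved in the slip direction on the
% receiver fault; \Delta\sigma_n: change in normal stress (positive for
% unclamping); \mu': effective friction coefficient. A positive \Delta CFF
% moves the receiver fault closer to failure, which is how events on one
% fault can promote events on another.
```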

  15. Open table-top device positioning technique to reduce small bowel obstruction. Positioning accuracy and impact on conformal radiation therapy techniques

    International Nuclear Information System (INIS)

    Rudat, V.; Flentje, M.; Engenhart, R.; Metzger, M.; Wannenmacher, M.

    1995-01-01

    The immobilization error of patients positioned on the open table-top device in prone position, as well as the movement of the small bowel out of the pelvis achieved by this positioning technique, was determined. The positioning error is of special importance for 3-dimensional treatment planning for conformal radiotherapy. The positioning error was determined by superposing 106 portal films with the corresponding simulator films from 21 patients with carcinoma of the rectum who received 3D-planned conformal radiotherapy (o-field technique with irregular blocks). The movement of the small bowel out of the pelvis was studied by comparing simulator films after barium swallow in the supine and open table-top positions, as well as with 3D-treatment plans of the same patient in both positions in 3 cases. The positioning error along the medio-lateral, dorso-ventral and cranio-caudal axes was 1.4/-0.6/1.8 mm and the standard deviation 4.4/6.8/6.3 mm, respectively. In comparison to the supine position, more rotation errors in the sagittal view were observed (37% and 9%, respectively), with a median of 5.1°. Six out of 22 patients showed no adhesions of the small bowel and a complete movement out of the treatment field was achieved. 14 out of 16 patients with adhesions revealed a partial movement of the small bowel out of the treatment field. Comparing 3D-treatment plans in both positions again demonstrated a marked reduction of the irradiated small bowel volume with the use of the open table-top device. (orig.)

  16. Demonstration of a 100 Hz repetition rate gain-saturated diode-pumped table-top soft x-ray laser.

    Science.gov (United States)

    Reagan, Brendan A; Wernsing, Keith A; Curtis, Alden H; Furch, Federico J; Luther, Bradley M; Patel, Dinesh; Menoni, Carmen S; Rocca, Jorge J

    2012-09-01

    We demonstrate the operation of a gain-saturated table-top soft x-ray laser at 100 Hz repetition rate. The laser generates an average power of 0.15 mW at λ=18.9  nm, the highest laser power reported to date from a sub-20-nm wavelength compact source. Picosecond laser pulses of 1.5 μJ energy were produced at λ=18.9  nm by amplification in a Mo plasma created by tailoring the temporal intensity profile of single pump pulses with 1 J energy produced by a diode-pumped chirped pulse amplification Yb:YAG laser. Lasing was also obtained in the 13.9 nm line of Ni-like Ag. These results increase by an order of magnitude the repetition rate of plasma-based soft x-ray lasers opening the path to milliwatt average power table-top lasers at sub-20 nm wavelengths.

  17. Learning Earthquake Design and Construction–23. Why are ...

    Indian Academy of Sciences (India)

    Earthquake Tips have been brought out by the Department of Civil Engineering, IIT Kanpur, and sponsored by the Building Materials and Technology Promotion Council, New Delhi, India. These articles are reproduced here with permission from IIT Kanpur and BMTPC, New Delhi. C V R Murty, Indian Institute of Technology.

  18. Learning Earthquake Design and Construction–23. Why are ...

    Indian Academy of Sciences (India)

    RC shafts around the elevator core of buildings also act as shear walls, and should be taken advantage of to resist earthquake forces. Reinforcement Bars in RC Walls: Steel reinforcing bars are to be provided in walls in regularly spaced vertical and ...

  19. Learning Earthquake Design and Construction-2. How the Ground ...

    Indian Academy of Sciences (India)

    Large strain energy released during an earthquake travels as seismic waves in all directions through the Earth's layers, reflecting and refracting at each interface. These waves are of two types - body waves and surface waves; the latter are restricted to near the Earth's surface (Figure 1). Body waves consist of Primary ...

  20. Proactive vs. reactive learning on buildings response and earthquake risks, in schools of Romania

    Directory of Open Access Journals (Sweden)

    Daniela DOBRE

    2015-07-01

    During the last 20 years, many specific activities of earthquake education and preparedness were initiated and supported in Romania by drafting materials for citizens, students, professors, etc. (Georgescu et al., 2004, 2006). Education, training and information on earthquake disaster potential are important factors in mitigating earthquake effects. Such activities, however, need time to be developed and may take different forms of presentation in order to capture attention, to increase interest, and to develop skills and attitudes that induce proper behavior towards safety preparedness. They should also be based on the accumulation of concerns and knowledge, which are, in principle, a consequence of motivation, but which depend on the methods applied and actions taken for efficient earthquake preparedness, assessed and updated following actual earthquakes (Masuda, Midorikawa, Miki and Ohmachi, 1988). We are now at a crossroads, and the proactive attitude and behavior (anticipative and participative) need to be extended in learning, within an institutional framework, but correlated with the usual targets of schools and the proactive concerns of teenagers (ROEDUSEIS-NET; Page and Page, 2003), by encouraging students in activities closer to earthquake engineering.

  1. Application of Geostatistical Methods and Machine Learning for spatio-temporal Earthquake Cluster Analysis

    Science.gov (United States)

    Schaefer, A. M.; Daniell, J. E.; Wenzel, F.

    2014-12-01

    Earthquake clustering tends to be an increasingly important part of general earthquake research, especially in terms of seismic hazard assessment and earthquake forecasting and prediction approaches. The distinct identification and definition of foreshocks, aftershocks, mainshocks and secondary mainshocks is carried out using a point-based spatio-temporal clustering algorithm originating from the field of classic machine learning. This can be further applied for declustering purposes to separate background seismicity from triggered seismicity. The results are interpreted and processed to assemble 3D-(x,y,t) earthquake clustering maps which are based on smoothed seismicity records in space and time. In addition, multi-dimensional Gaussian functions are used to capture clustering parameters for spatial distribution and dominant orientations. Clusters are further processed using methodologies originating from geostatistics, which have mostly been applied and developed in mining projects during recent decades. A 2.5D variogram analysis is applied to identify spatio-temporal homogeneity in terms of earthquake density and energy output. The results are interpolated using Kriging to provide an accurate mapping solution for clustering features. As a case study, seismic data of New Zealand and the United States are used, covering events since the 1950s, from which an earthquake cluster catalogue is assembled for most of the major events, including a detailed analysis of the Landers and Christchurch sequences.
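
    As an illustration of the kind of point-based spatio-temporal clustering described above, the sketch below applies DBSCAN to a synthetic (x, y, t) catalog after rescaling time into an equivalent distance. DBSCAN, the kilometre-per-day scaling, and all parameter values are illustrative assumptions, not the algorithm or settings used in the study.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Synthetic catalog: columns are easting (km), northing (km), origin time (days).
rng = np.random.default_rng(0)
background = np.column_stack([rng.uniform(0, 500, 300),
                              rng.uniform(0, 500, 300),
                              rng.uniform(0, 365, 300)])
sequence = np.column_stack([50 + rng.normal(0, 5, 100),
                            200 + rng.normal(0, 5, 100),
                            120 + rng.exponential(3, 100)])
catalog = np.vstack([background, sequence])

# Rescale time so that one day "costs" the same as time_scale_km of distance,
# turning (x, y, t) into a single metric space for density-based clustering.
time_scale_km = 2.0
features = catalog.copy()
features[:, 2] *= time_scale_km

labels = DBSCAN(eps=15.0, min_samples=10).fit_predict(features)
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print("clusters found:", n_clusters)
print("events labelled as noise:", int(np.sum(labels == -1)))
```

    Events labelled -1 fall into no cluster and, in a declustering workflow of the kind mentioned above, would be treated as background seismicity.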

  2. The 2011 Virginia earthquake: What are scientists learning?

    Science.gov (United States)

    Horton, J. Wright, Jr.; Williams, Robert A.

    2012-08-01

    Nearly 1 year ago, on 23 August, tens of millions of people in the eastern United States and southeastern Canada were startled in the middle of their workday (1:51 P.M. local time) by the sudden onset of moderate to strong ground shaking from a rare magnitude (M) 5.8 earthquake in central Virginia. Treating the shaking as if it were a fire drill, millions of workers in Washington, D. C., New York City, and other eastern cities hurriedly exited their buildings, exposing themselves to potentially greater danger from falling bricks and glass; "drop, cover, and hold" would have been a better response. Fortunately, the strong shaking stopped after about 5 seconds and did not cause widespread severe damage or serious injuries. The central Virginia earthquake, among the largest on the eastern seaboard during the approximately 400-year historic record, occurred as the result of reverse slip on a previously unrecognized north-to-northeast striking fault within the Central Virginia seismic zone (CVSZ) (Figure 1a). Many old faults are mapped in the CVSZ, yet no individual strands were previously confirmed to be active. However, persistent low-level seismicity has been observed during historical times, and instrumental recordings since about 1970 detect ongoing distributed seismicity within the CVSZ [Bollinger and Hopper, 1971], which has been identified by the U.S. Geological Survey (USGS) as an area of elevated earthquake hazard since 1976 [Algermissen and Perkins, 1976].

  3. From Multi-Sensors Observations Towards Cross-Disciplinary Study of Pre-Earthquake Signals. What have We Learned from the Tohoku Earthquake?

    Science.gov (United States)

    Ouzounov, D.; Pulinets, S.; Papadopoulos, G.; Kunitsyn, V.; Nesterov, I.; Hayakawa, M.; Mogi, K.; Hattori, K.; Kafatos, M.; Taylor, P.

    2012-01-01

    The lessons we have learned from the Great Tohoku EQ (Japan, 2011) and how this knowledge will affect our future observation and analysis are the main focus of this presentation. We present multi-sensor observations and multidisciplinary research in our investigation of phenomena preceding major earthquakes. These observations revealed the existence of atmospheric and ionospheric phenomena occurring prior to the M9.0 Tohoku earthquake of March 11, 2011, which indicates new evidence of a distinct coupling between the lithosphere and atmosphere/ionosphere, as related to underlying tectonic activity. Similar results have been reported before the catastrophic events in Chile (M8.8, 2010), Italy (M6.3, 2009) and Sumatra (M9.3, 2004). For the Tohoku earthquake, our analysis shows a synergy between several independent observations characterizing the state of lithosphere/atmosphere coupling several days before the onset of the earthquake, namely: (i) foreshock sequence change (rate, space and time); (ii) Outgoing Longwave Radiation (OLR) measured at the top of the atmosphere; and (iii) anomalous variations of ionospheric parameters revealed by multi-sensor observations. We are presenting a cross-disciplinary analysis of the observed pre-earthquake anomalies and will discuss current research in the detection of these signals in Japan. We expect that our analysis will shed light on the underlying physics of pre-earthquake signals associated with some of the largest earthquake events

  4. Lessons learned from the Japan earthquake and tsunami, 2011.

    Science.gov (United States)

    Fuse, Akira; Yokota, Hiroyuki

    2012-01-01

    On March 11, 2011, an earthquake occurred off the coast of Honshu, Japan. The quake was followed by a powerful tsunami that caused extensive damage to the east coast of the Tohoku and Kanto regions. This disaster destroyed the medical system in place and thus drastically reduced the ability of the healthcare system to handle the large number of casualties. During the initial response to this disaster, we participated in several types of outreach medical relief teams dispatched to the affected area from the day of the earthquake onwards. The ratio of persons injured to persons missing or dead for the 2011 Japan disaster (0.31; 5,994 to 19,371) was much lower than for the Indian Ocean Tsunami of 2004 in Thailand (1.01; 8,457 to 8,393) and for the Great Hanshin-Awaji Earthquake of 1995 in Japan (6.80; 43,792 to 6,437). The different ratios for the different types of disasters indicate that medical relief efforts in response to natural disasters should be tailored to the type of disaster to optimize the effectiveness of the response and prevent further deaths. From a medical viewpoint, unnecessary deaths must be prevented following natural disasters. Doing so requires appropriate information transmission and an understanding of the mission's overall and specific objectives: 1) rapid search and rescue; 2) early care in the field, evacuation centers, and primary clinics; 3) definitive evaluation at disaster base hospitals; and 4) proper evacuation to unaffected areas. We propose a descriptive device that can guide headquarters in dealing with the commonalities of a disaster.

  5. Multi-Sensors Observations of Pre-Earthquake Signals. What We Learned from the Great Tohoku Earthquake?

    Science.gov (United States)

    Ouzonounov, D.; Pulinets, S.; Papadopoulos, G.; Kunitsyn, V.; Nesterov, I.; Hattori, K.; Kafatos, M.; Taylor, P.

    2012-01-01

    How the lessons learned from the Great Tohoku EQ (Japan, 2011) will affect our future observations and analysis is the main focus of this presentation. Multi-sensor observations and multidisciplinary research are presented in our study of the phenomena preceding major earthquakes. Our approach is based on a systematic analysis of several physical and environmental parameters, which have been reported by others in connection with earthquake processes: thermal infrared radiation; temperature; concentration of electrons in the ionosphere; radon/ion activities; and atmospheric temperature/humidity [Ouzounov et al, 2011]. We used the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model, one of several possible paradigms [Pulinets and Ouzounov, 2011], to interpret our observations. We retrospectively analyzed the temporal and spatial variations of three different physical parameters characterizing the state of the atmosphere, ionosphere and ground surface several days before the March 11, 2011 M9 Tohoku earthquake, namely: (i) Outgoing Longwave Radiation (OLR) measured at the top of the atmosphere; (ii) anomalous variations of ionospheric parameters revealed by multi-sensor observations; and (iii) the change in the foreshock sequence (rate, space and time). Our results show that on March 8th, 2011 a rapid increase of emitted infrared radiation was observed and an anomaly developed near the epicenter, with the largest value occurring on March 11 at 07:30 LT. The GPS/TEC data indicate an increase and variation in electron density reaching a maximum value on March 8. Starting from this day, an abnormal TEC variation was also observed in the lower ionosphere over the epicenter. From March 3 to 11 a large increase in electron concentration was recorded at all four Japanese ground-based ionosondes, which returned to normal after the main earthquake. We use the Japanese GPS network stations and the method of Radio Tomography to study the spatiotemporal structure of ionospheric

  6. Near-edge x-ray absorption fine structure spectroscopy at atmospheric pressure with a table-top laser-induced soft x-ray source

    Energy Technology Data Exchange (ETDEWEB)

    Kühl, Frank-Christian, E-mail: Frank-christian.kuehl@mail.de; Müller, Matthias, E-mail: matthias.mueller@llg-ev.de; Schellhorn, Meike; Mann, Klaus [Laser-Laboratorium Göttingen e.V., Hans-Adolf-Krebs-Weg 1, D-37077 Göttingen (Germany); Wieneke, Stefan [Hochschule für angewandte Wissenschaft und Kunst, Von-Ossietzky-Str 99, D-37085 Göttingen (Germany); Eusterhues, Karin [Friedrich-Schiller-Universität Jena, Fürstengraben 1, D-07743 Jena (Germany)

    2016-07-15

    The authors present a table-top soft x-ray absorption spectrometer, enabling investigations of the near-edge x-ray absorption fine structure (NEXAFS) in a laboratory environment. The system is based on a low-debris plasma ignited by a picosecond laser in a pulsed krypton gas jet, emitting soft x-ray radiation in the range from 1 to 5 nm. For absorption spectroscopy in and around the “water window” (2.3–4.4 nm), a compact helium-purged sample compartment for experiments at atmospheric pressure has been constructed and tested. NEXAFS measurements on CaCl₂ and KMnO₄ samples were conducted at the calcium and manganese L-edges, as well as at the oxygen K-edge in air, atmospheric helium, and under vacuum, respectively. The results indicate the importance of atmospheric conditions for an investigation of sample hydration processes.

  7. Comparison of 3D, Assist-as-Needed Robotic Arm/Hand Movement Training Provided with Pneu-WREX to Conventional Table Top Therapy Following Chronic Stroke

    Science.gov (United States)

    Reinkensmeyer, David J.; Wolbrecht, Eric T.; Chan, Vicky; Chou, Cathy; Cramer, Steven C.; Bobrow, James E.

    2012-01-01

    Objective: Robot-assisted movement training can help individuals with stroke reduce arm and hand impairment, but robot therapy is typically only about as effective as conventional therapy. Refining the way that robots assist during training may make them more effective than conventional therapy. Here we measured the therapeutic effect of a robot that required individuals with a stroke to achieve virtual tasks in three dimensions against gravity. Design: The robot continuously estimated how much assistance patients needed to perform the tasks and provided slightly less assistance than needed in order to reduce patient slacking. Individuals with a chronic stroke (n = 26, baseline upper extremity Fugl-Meyer score = 23 ± 8) were randomized into two groups and underwent 24 one-hour training sessions over 2 months. One group received the assist-as-needed robot training and the other received conventional table top therapy with the supervision of a physical therapist. Results: Training helped both groups significantly reduce their motor impairment, as measured by the primary outcome measure, the Fugl-Meyer score, but the improvement was small (3.0 ± 4.9 points for robot therapy, versus 0.9 ± 1.7 for conventional therapy). There was a trend for greater reduction for the robot-trained group (p = 0.07). The robot group largely sustained this gain at the three-month follow-up. The robot-trained group also experienced significant improvements in Box and Blocks score and hand grip strength, while the control group did not, but these improvements were not sustained at follow-up. In addition, the robot-trained group showed a trend toward greater improvement in sensory function, as measured by the Nottingham Sensory Test (p = 0.06). Conclusions: These results suggest that, in patients with chronic stroke and moderate-severe deficits, assisting in three-dimensional virtual tasks with an assist-as-needed controller may make robotic training more effective than conventional table top

  8. Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California

    Science.gov (United States)

    Degroot, R. M.; Springer, K.; Brooks, C. J.; Schuman, L.; Dalton, D.; Benthien, M. L.

    2009-12-01

    In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at Fingerprints Youth Museum in Hemet. These projects and involvement with the San Bernardino County Museum in Redlands beginning in 2007 led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for the development of the network was to broaden participation in The Great Southern California ShakeOut. In 2009 it grew to be more comprehensive in its scope, including its evolution into a statewide network. EPIcenters constitute a variety of free-choice learning institutions, representing museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year-round. Network members at Kidspace Museum in Pasadena and San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and safety and security. This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this

  9. Sustaining a biobank through a series of earthquake swarms: lessons learned from our New Zealand experience.

    Science.gov (United States)

    Morrin, Helen R; Robinson, Bridget A

    2013-08-01

    In the early hours of September 4, 2010, the city of Christchurch in New Zealand was awakened by a major magnitude 7.1 earthquake event that was the start of a series of earthquake swarms. By January 2012, the city had sustained over 10,000 earthquakes and aftershocks, including 4 major events. New Zealand is positioned along the geological Pacific Rim of Fire and is subject to volcanic and seismic movements. However, this series of earthquakes arose from a previously undetected fault that had been dormant for over 10,000 years. The impact on the city, businesses, and people of Christchurch has been profound. Sustaining our cancer biobank through this period has been extremely challenging, as our city's infrastructure including utilities, telecommunication, and transport facilities were fractured, buildings collapsed, and a National State of Emergency was declared. What had not been anticipated was that this impact would continue to be felt up to the present time. After each major earthquake event, the immediate focus of our response was to ensure the safety of all personnel. The secondary response was to ensure the continued preservation of stored specimens. Our third response was to reestablish operational processes without endangering staff. Our responses have been reviewed and lessons formulated that can be incorporated into biobank emergency response plans. They include operational aspects of equipment restraint, cryostorage, staff trauma, specimen relocation, legislation, and management of the repair processes. Emergency response planning for a biobank is a "best practice" standard. Future-proofing a biobank from a significant natural disaster such as a series of earthquake swarms is limited. However, lessons learned from our experience may help to mitigate the impact of future events within our global community.

  10. Lessons Learned from Creating the Public Earthquake Resource Center at CERI

    Science.gov (United States)

    Patterson, G. L.; Michelle, D.; Johnston, A.

    2004-12-01

    The Center for Earthquake Research and Information (CERI) at the University of Memphis opened the Public Earthquake Resource Center (PERC) in May 2004. The PERC is an interactive display area that was designed to increase awareness of seismology, Earth Science, earthquake hazards, and earthquake engineering among the general public and K-12 teachers and students. Funding for the PERC is provided by the US Geological Survey, The NSF-funded Mid America Earthquake Center, and the University of Memphis, with input from the Incorporated Research Institutions for Seismology. Additional space at the facility houses local offices of the US Geological Survey. PERC exhibits are housed in a remodeled residential structure at CERI that was donated by the University of Memphis and the State of Tennessee. Exhibits were designed and built by CERI and US Geological Survey staff and faculty with the help of experienced museum display subcontractors. The 600 square foot display area interactively introduces the basic concepts of seismology, real-time seismic information, seismic network operations, paleoseismology, building response, and historical earthquakes. Display components include three 22" flat screen monitors, a touch sensitive monitor, 3 helicorder elements, oscilloscope, AS-1 seismometer, life-sized liquefaction trench, liquefaction shake table, and building response shake table. All displays include custom graphics, text, and handouts. The PERC website at www.ceri.memphis.edu/perc also provides useful information such as tour scheduling, ask a geologist, links to other institutions, and will soon include a virtual tour of the facility. Special consideration was given to address State science standards for teaching and learning in the design of the displays and handouts. We feel this consideration is pivotal to the success of any grass roots Earth Science education and outreach program and represents a valuable lesson that has been learned at CERI over the last several

  11. An Optimized Table-Top Small-Angle X-ray Scattering Set-up for the Nanoscale Structural Analysis of Soft Matter

    KAUST Repository

    Sibillano, T.

    2014-11-10

    The paper shows how a table-top superbright microfocus laboratory X-ray source and an innovative data-restoring algorithm, used in combination, allow the supramolecular structure of soft matter to be analyzed by means of ex-situ Small Angle X-ray Scattering (SAXS) experiments. The proposed theoretical approach aims to restore diffraction features from SAXS profiles collected from low-scattering biomaterials or soft tissues, and therefore to deal with extremely noisy SAXS diffraction profiles/maps. As biological test cases we inspected: i) residues of exosomes' drops from a healthy epithelial colon cell line and colorectal cancer cells; ii) collagen/human elastin artificial scaffolds developed for vascular tissue engineering applications; iii) apoferritin protein in solution. Our results show how this combination can provide morphological/structural nanoscale information to characterize new artificial biomaterials and/or to get insight into the transition between healthy and pathological tissues during the progression of a disease, or to morphologically characterize nanoscale proteins, based on SAXS data collected in a room-sized laboratory.

  12. Detection of Urban Damage Using Remote Sensing and Machine Learning Algorithms: Revisiting the 2010 Haiti Earthquake

    Directory of Open Access Journals (Sweden)

    Austin J. Cooner

    2016-10-01

    Remote sensing continues to be an invaluable tool in earthquake damage assessments and emergency response. This study evaluates the effectiveness of multilayer feedforward neural networks, radial basis neural networks, and Random Forests in detecting earthquake damage caused by the 2010 Port-au-Prince, Haiti 7.0 moment magnitude (Mw) event. Additionally, textural and structural features including entropy, dissimilarity, Laplacian of Gaussian, and rectangular fit are investigated as key variables for high spatial resolution imagery classification. Our findings show that each of the algorithms achieved nearly a 90% kernel density match using the United Nations Operational Satellite Applications Programme (UNITAR/UNOSAT) dataset as validation. The multilayer feedforward network was able to achieve an error rate below 40% in detecting damaged buildings. Spatial features of texture and structure were far more important in algorithmic classification than spectral information, highlighting the potential for future implementation of machine learning algorithms which use panchromatic or pansharpened imagery alone.
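
    A minimal sketch of one of the classifiers named above (a Random Forest) trained on stand-in texture/structure feature vectors; the feature distributions, class balance, and hyperparameters are invented for illustration and do not reproduce the study's data or results.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic stand-ins for per-building feature vectors; in the study these
# would be texture/structure measures (entropy, dissimilarity, Laplacian of
# Gaussian response, rectangular fit) computed from high-resolution imagery.
rng = np.random.default_rng(1)
n = 1000
X_intact = rng.normal([2.0, 0.5, 0.1, 0.8], 0.2, size=(n, 4))
X_damaged = rng.normal([3.5, 1.5, 0.6, 0.4], 0.3, size=(n, 4))
X = np.vstack([X_intact, X_damaged])
y = np.concatenate([np.zeros(n), np.ones(n)])   # 0 = intact, 1 = damaged

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te),
                            target_names=["intact", "damaged"]))
```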

  13. Integrating Machine Learning into a Crowdsourced Model for Earthquake-Induced Damage Assessment

    Science.gov (United States)

    Rebbapragada, Umaa; Oommen, Thomas

    2011-01-01

    On January 12th, 2010, a catastrophic 7.0M earthquake devastated the country of Haiti. In the aftermath of an earthquake, it is important to rapidly assess damaged areas in order to mobilize the appropriate resources. The Haiti damage assessment effort introduced a promising model that uses crowdsourcing to map damaged areas in freely available remotely-sensed data. This paper proposes the application of machine learning methods to improve this model. Specifically, we apply work on learning from multiple, imperfect experts to the assessment of volunteer reliability, and propose the use of image segmentation to automate the detection of damaged areas. We wrap both tasks in an active learning framework in order to shift volunteer effort from mapping a full catalog of images to the generation of high-quality training data. We hypothesize that the integration of machine learning into this model improves its reliability, maintains the speed of damage assessment, and allows the model to scale to higher data volumes.
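
    The volunteer-reliability idea can be illustrated with the simplest possible scheme: score each volunteer against a provisional majority vote, then re-vote with reliability weights. This is only a toy stand-in on synthetic labels, not the "learning from multiple, imperfect experts" method proposed in the paper.

```python
import numpy as np

# Toy crowdsourced labels: rows = volunteers, columns = image tiles; 1 = "damaged".
rng = np.random.default_rng(7)
truth = rng.integers(0, 2, 40)
labels = np.array([np.where(rng.random(40) < 0.90, truth, 1 - truth),  # careful volunteer
                   np.where(rng.random(40) < 0.85, truth, 1 - truth),  # careful volunteer
                   rng.integers(0, 2, 40)])                            # unreliable volunteer

# Step 1: provisional consensus by simple majority vote.
consensus = (labels.mean(axis=0) > 0.5).astype(int)
# Step 2: score each volunteer against that consensus...
reliability = (labels == consensus).mean(axis=1)
# Step 3: ...and re-vote, weighting each volunteer by the estimated reliability.
weighted = (reliability[:, None] * labels).sum(axis=0) / reliability.sum()
final = (weighted > 0.5).astype(int)

print("estimated volunteer reliability:", reliability.round(2))
print("majority-vote accuracy:", (consensus == truth).mean())
print("weighted-vote accuracy:", (final == truth).mean())
```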

  14. The TRIPOD e-learning Platform for the Training of Earthquake Safety Assessment

    International Nuclear Information System (INIS)

    Coppari, S.; Di Pasquale, G.; Goretti, A.; Papa, F.; Papa, S.; Paoli, G.; Pizza, A. G.; Severino, M.

    2008-01-01

    The paper summarizes the results of the in-progress EU project TRIPOD (Training Civil Engineers on Post-Earthquake Safety Assessment of Damaged Buildings), funded under the Leonardo Da Vinci program. The main theme of the project is the development of a methodology and a learning platform for the training of technicians involved in post-earthquake building safety inspections. In the event of a catastrophic earthquake, emergency building inspections constitute a major undertaking with severe social impact. Given the inevitable chaotic conditions and the urgent need for a great number of specialized individuals to carry out inspections, past experience indicates that inspection teams are often formed in an ad hoc manner, under stressful conditions, at varying levels of technical expertise and experience, sometimes impairing the reliability and consistency of the inspection results. Furthermore, each country has its own building damage and safety assessment methodology, developed according to its experience, laws, building technology and seismicity. This holds also for the partners participating in the project (Greece, Italy, Turkey, Cyprus), all of which come from seismically sensitive Mediterranean countries. The project aims at alleviating the above shortcomings by designing and developing a training methodology and e-platform, forming a complete training program targeted at inspection engineers, specialized personnel and civil protection agencies. The e-learning platform will provide flexible and friendly authoring mechanisms, self-teaching and assessment capabilities, course and trainee management, etc. Courses will also be made available as stand-alone multimedia applications on CD and in the form of a complete pocket handbook. Moreover, the project will offer the possibility of upgrading different experiences and practices: a first step towards the harmonization of methodologies and tools of different countries sharing similar problems. Finally, through wide

  15. The TRIPOD e-learning Platform for the Training of Earthquake Safety Assessment

    Science.gov (United States)

    Coppari, S.; Di Pasquale, G.; Goretti, A.; Papa, F.; Papa, S.; Paoli, G.; Pizza, A. G.; Severino, M.

    2008-07-01

    The paper summarizes the results of the in-progress EU project TRIPOD (Training Civil Engineers on Post-Earthquake Safety Assessment of Damaged Buildings), funded under the Leonardo Da Vinci program. The main theme of the project is the development of a methodology and a learning platform for the training of technicians involved in post-earthquake building safety inspections. In the event of a catastrophic earthquake, emergency building inspections constitute a major undertaking with severe social impact. Given the inevitable chaotic conditions and the urgent need for a great number of specialized individuals to carry out inspections, past experience indicates that inspection teams are often formed in an ad hoc manner, under stressful conditions, at varying levels of technical expertise and experience, sometimes impairing the reliability and consistency of the inspection results. Furthermore, each country has its own building damage and safety assessment methodology, developed according to its experience, laws, building technology and seismicity. This holds also for the partners participating in the project (Greece, Italy, Turkey, Cyprus), all of which come from seismically sensitive Mediterranean countries. The project aims at alleviating the above shortcomings by designing and developing a training methodology and e-platform, forming a complete training program targeted at inspection engineers, specialized personnel and civil protection agencies. The e-learning platform will provide flexible and friendly authoring mechanisms, self-teaching and assessment capabilities, course and trainee management, etc. Courses will also be made available as stand-alone multimedia applications on CD and in the form of a complete pocket handbook. Moreover, the project will offer the possibility of upgrading different experiences and practices: a first step towards the harmonization of methodologies and tools of different countries sharing similar problems. Finally, through wide

  16. A table-top LHC

    CERN Multimedia

    Barbara Warmbein

    2011-01-01

    Many years ago, when ATLAS was no more than a huge empty underground cavern and Russian artillery shell casings were being melted down to become part of the CMS calorimetry system, science photographer Peter Ginter started documenting the LHC’s progress. He was there when special convoys of equipment crossed the Jura at night, when cranes were lowering down detector slices and magnet coils were being wound in workshops. Some 18 years of LHC history have been documented by Ginter, and the result has just come out as a massive coffee table book full of double-page spreads of Ginter’s impressive images. Published by the Austrian publisher Edition Lammerhuber in cooperation with CERN and UNESCO Publishing, LHC: the Large Hadron Collider is an unusual piece in the company’s portfolio. As the publisher’s first science book, LHC: the Large Hadron Collider weighs close to five kilos and comes in a s...

  17. SU-F-J-28: Development of a New Imaging Filter to Remove the Shadows From the Carbon Fiber Grid Table Top

    Energy Technology Data Exchange (ETDEWEB)

    Maehana, W [Kanagawa Cancer Center, Yokohama, Kanagawa (Japan); Yokohama National University, Yokohama, kanagawa (Japan); Nagao, T [Yokohama National University, Yokohama, kanagawa (Japan)

    2016-06-15

    Purpose: For image-guided radiation therapy (IGRT), the shadows caused by the construction of the treatment couch top adversely affect visual evaluation. Therefore, we developed a new imaging filter in order to remove the shadows. The performance of the new filter was evaluated using clinical images. Methods: The new filter was composed of a band-pass filter (BPF) weighted by the k factor and a low-pass filter (LPF). In the frequency domain, the stop bandwidths were 8.3×10³ mm⁻¹ in the u direction and 11.1×10³ mm⁻¹ in the v direction for the BPF, and the pass bandwidths were 8.3×10³ mm⁻¹ in the u direction and 11.1×10³ mm⁻¹ in the v direction for the LPF. After applying the filter, the shadows from the carbon fiber grid table top (CFGTT, Varian) on the kV-image were removed. To check the filter effect, we compared clinical images of the thorax and thoracoabdominal region with and without the filter. A subjective evaluation test using a three-point scale (agree, neither agree nor disagree, disagree) was performed with 15 persons in the department of radiation oncology. Results: We succeeded in removing all shadows of the CFGTT using the new filter. The filter is very useful, as shown by the results of the subjective evaluation, in which 23/30 responses agreed with the filtered clinical images. Conclusion: We conclude that the proposed method is a useful tool for IGRT and that the new filter leads to improvement of the accuracy of radiation therapy.
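
    A simplified sketch of the frequency-domain idea: transform the kV image, suppress the band where a regular grid shadow lives, and transform back. The cutoff frequencies, the synthetic image, and the use of a plain low-pass mask are assumptions for illustration; the published filter combines a weighted band-stop (BPF) with a low-pass (LPF) at the bandwidths quoted above.

```python
import numpy as np

def remove_high_frequencies(image, u_cut, v_cut):
    """Keep only spatial frequencies |u| <= u_cut and |v| <= v_cut
    (cycles/pixel); a regular grid shadow lives at higher frequencies."""
    F = np.fft.fftshift(np.fft.fft2(image))
    u = np.fft.fftshift(np.fft.fftfreq(image.shape[1]))
    v = np.fft.fftshift(np.fft.fftfreq(image.shape[0]))
    U, V = np.meshgrid(u, v)
    passband = (np.abs(U) <= u_cut) & (np.abs(V) <= v_cut)
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * passband)))

# Synthetic kV image: smooth "anatomy" plus a periodic grid shadow (period 8 px).
yy, xx = np.mgrid[0:256, 0:256]
anatomy = np.exp(-((xx - 128) ** 2 + (yy - 128) ** 2) / (2 * 60.0 ** 2))
shadow = 0.1 * np.sin(2 * np.pi * xx / 8.0)
filtered = remove_high_frequencies(anatomy + shadow, u_cut=0.05, v_cut=0.05)
print("residual shadow amplitude:", round(float(np.std(filtered - anatomy)), 4))
```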

  18. Audio-based, unsupervised machine learning reveals cyclic changes in earthquake mechanisms in the Geysers geothermal field, California

    Science.gov (United States)

    Holtzman, B. K.; Paté, A.; Paisley, J.; Waldhauser, F.; Repetto, D.; Boschi, L.

    2017-12-01

    The earthquake process reflects complex interactions of stress, fracture and frictional properties. New machine learning methods reveal patterns in time-dependent spectral properties of seismic signals and enable identification of changes in faulting processes. Our methods are based closely on those developed for music information retrieval and voice recognition, using the spectrogram instead of the waveform directly. Unsupervised learning involves identification of patterns based on differences among signals without any additional information provided to the algorithm. Clustering of 46,000 earthquakes of 0.3
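
    A toy version of the spectrogram-first pipeline: compute a log-spectrogram for each event and cluster the flattened spectrograms. The synthetic "events", k-means with two clusters, and all parameters are assumptions for illustration; the study's actual models, built on music-information-retrieval techniques, are not reproduced here.

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.cluster import KMeans

# Synthetic "events": decaying bursts whose dominant frequency differs, standing
# in for earthquakes with different spectral character.
rng = np.random.default_rng(3)
fs, n_samples, n_events = 200.0, 2000, 60

def make_event(center_freq):
    t = np.arange(n_samples) / fs
    return np.exp(-3.0 * t) * np.sin(2 * np.pi * center_freq * t) \
           + 0.1 * rng.normal(size=n_samples)

events = [make_event(10.0) for _ in range(n_events // 2)] + \
         [make_event(35.0) for _ in range(n_events // 2)]

# Feature vector = flattened log-spectrogram of each event.
features = np.array([np.log10(spectrogram(ev, fs=fs, nperseg=256)[2] + 1e-12).ravel()
                     for ev in events])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print("cluster sizes:", np.bincount(labels))
```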

  19. Lessons Learned about Best Practices for Communicating Earthquake Forecasting and Early Warning to Non-Scientific Publics

    Science.gov (United States)

    Sellnow, D. D.; Sellnow, T. L.

    2017-12-01

    Earthquake scientists are without doubt experts in understanding earthquake probabilities, magnitudes, and intensities, as well as their potential consequences for community infrastructures and inhabitants. One critical challenge these scientific experts face, however, rests with communicating what they know to the people they want to help. Helping scientists translate scientific information to non-scientists is something Drs. Tim and Deanna Sellnow have been committed to for decades. As such, they have compiled a host of data-driven best practices for communicating effectively to non-scientific publics about earthquake forecasting, probabilities, and warnings. In this session, they will summarize what they have learned as it may help earthquake scientists, emergency managers, and other key spokespersons share these important messages with disparate publics in ways that result in positive outcomes, the most important of which is saving lives.

  20. High-flux table-top soft x-ray source driven by sub-2-cycle, CEP stable, 1.85-μm 1-kHz pulses for carbon K-edge spectroscopy.

    Science.gov (United States)

    Cousin, S L; Silva, F; Teichmann, S; Hemmer, M; Buades, B; Biegert, J

    2014-09-15

    We report on the first table-top high-flux source of coherent soft x-ray radiation up to 400 eV, operating at 1 kHz. This source covers the carbon K-edge with a beam brilliance of (4.3±1.2)×10¹⁵ photons/s/mm²/strad/10% bandwidth and a photon flux of (1.85±0.12)×10⁷ photons/s/1% bandwidth. We use this source to demonstrate table-top x-ray near-edge fine-structure spectroscopy at the carbon K-edge of a polyimide foil and retrieve the specific absorption features corresponding to the binding orbitals of the carbon atoms in the foil.

  1. Lessons learned from the 12 May 2008 Wenchuan earthquake: Impact on industry

    Science.gov (United States)

    Krausmann, E.; Cruz, A. M.; Affeltranger, B.

    2009-04-01

    The earthquake that shook Wenchuan County in China's Sichuan Province on 12 May 2008 was a major event with a moment magnitude of MW = 7.9 and a depth of only 19 km. It caused a fault rupture of 270 km length and affected a total area of about 500,000 km². With the intensity reaching XI in the region near the epicentre and peak ground acceleration values as high as 0.63 g, the earthquake killed almost 70,000 people, injured over 374,000 and rendered 5,000,000 homeless. Over 18,000 are still listed as missing. Prior to the earthquake the area was considered a region of moderate seismicity with a design intensity of 7. Sichuan Province is home to a significant proportion of Chinese chemical and nuclear industry and consequently has a very strong economy. The direct economic loss due to the earthquake amounts to over 1.1 billion Euros. In addition to economic damage there is also concern about earthquake-triggered damage to and destruction of industrial facilities housing or processing hazardous substances and the potential consequences of their release to man or the environment. In order to understand how well the chemical industry fared in the earthquake-affected areas, a reconnaissance field trip was organised from 15-21 November 2008, which included visits to industry in Deyang, Shifang, Mianzhu, Mianyang, Anxian and Dujiangyan. In total we collected information on earthquake effects at 18 industrial facilities. Lessons learned from this reconnaissance field trip confirm the devastating consequences that natural disasters can have on industrial facilities. In addition to casualties and environmental harm, the economic losses due to damage, prolonged shut-down periods and business interruption are often ruinous and may result in the lay-off of workers. In the case of the visited facilities the shut-down time was up to 6 months. Two facilities were damaged beyond repair and have resulted in significant ammonia, sulphuric acid and other releases that in addition to

  2. Natech Events of August 17, 1999 Kocaeli Earthquake: Aftermath and Lessons Learned

    Science.gov (United States)

    Girgin, Serkan

    2010-05-01

    The Kocaeli earthquake (Mw 7.4) of August 17, 1999 was one of the most devastating natural disasters in the modern history of Turkey. Occurring at 03:02 a.m. local time, the earthquake resulted in about 17,500 fatalities and 44,000 injured, and affected 15 million people, with total property damage of over 15 billion USD. The area struck by the earthquake is one of the industrial heartlands of the country; it is densely populated and heavily industrialized, accounting for 35% of the gross national product. The earthquake caused significant structural damage and machine and equipment loss at industrial facilities, which led to many Natech events ranging from small toxic chemical releases to enormous fires. Among these events, two were especially remarkable due to their extent and consequences: the massive fire at the TUPRAS Izmit Refinery and the acrylonitrile (ACN) spill at the AKSA acrylic fiber production plant. The TUPRAS Izmit Refinery, one of the four refineries in Turkey, was the industrial facility that suffered the most damage in the earthquake. The breaking of pipes at the port terminal caused a significant amount of oil to spill into the sea. The collapse of a stack initiated a fire at the crude oil unit, while sparks created by the bouncing of floating roofs initiated another fire at the naphtha tank farm. Damage to the infrastructure and reduced human resources owing to the earthquake hindered response and firefighting activities. The entire population in the vicinity of the refinery had been evacuated, which prevented rescue activities in the debris. The fire at the tank farm lasted for 3 days and could only be extinguished through international cooperation. The refinery became operational within 2.5 months and reached its full capacity in 12 months at a cost of 57.8 million USD. The earthquake damaged three of the storage tanks at the AKSA acrylic fiber production plant and caused 6,400 tons of ACN, which is highly flammable, toxic and carcinogenic, to be released into

  3. The natech events during the 17 August 1999 Kocaeli earthquake: aftermath and lessons learned

    Directory of Open Access Journals (Sweden)

    S. Girgin

    2011-04-01

    Natural-hazard triggered technological accidents (natechs) at industrial facilities have been recognized as an emerging risk. Adequate preparedness, proper emergency planning, and effective response are crucial for the prevention of natechs and mitigation of the consequences. Under the conditions of a natural disaster, the limited resources, the possible unavailability of mitigation measures, and the lack of adequate communication complicate the management of natechs. The analysis of past natechs is crucial for learning lessons and for preventing or preparing for future natechs. The 17 August 1999 Kocaeli earthquake, which was a devastating disaster hitting one of the most industrialized regions of Turkey, offers opportunities in this respect. Among the many natechs that occurred due to the earthquake, the massive fire at the TUPRAS Izmit refinery and the acrylonitrile spill at the AKSA acrylic fiber production plant were especially important and highlight problems in the consideration of natechs in emergency planning, response to industrial emergencies during natural hazards, and information to the public during and following the incidents. The analysis of these events shows that even the largest and seemingly well-prepared facilities can be vulnerable to natechs if risks are not considered adequately.

  4. Don't forget about the Christchurch earthquake: Lessons learned from this disaster

    Science.gov (United States)

    Hamburger, Michael W.; Mooney, Walter D.

    2011-01-01

    In the aftermath of the devastating magnitude-9.0 earthquake and tsunami that struck the Tohoku region of Japan on March 11, attention quickly turned away from a much smaller, but also highly destructive earthquake that struck the city of Christchurch, New Zealand, just a few weeks earlier, on Feb. 22. Both events are stark reminders of human vulnerability to natural disasters and provide a harsh reality check: Even technologically advanced countries with modern building codes are not immune from earthquake disasters. The Christchurch earthquake carried an additional message: Urban devastation can be triggered even by moderate-sized earthquakes.

  5. An Arduino project to record ground motion and to learn on earthquake hazard at high school

    Science.gov (United States)

    Saraò, Angela; Barnaba, Carla; Clocchiatti, Marco; Zuliani, David

    2015-04-01

    Through multidisciplinary work that integrates Technology education with Earth Sciences, we implemented an educational program to raise students' awareness of seismic hazard and to disseminate good practices of earthquake safety. Using free software and low-cost open hardware, the students of a senior class of the high school Liceo Paschini in Tolmezzo (NE Italy) implemented a seismograph using the Arduino open-source electronics platform and ADXL345 sensors to emulate a low-cost seismometer (e.g. the O-NAVI sensor of the Quake-Catcher Network, http://qcn.stanford.edu). To accomplish their task the students were directed to use web resources for technical support and troubleshooting. Shell scripts, running on local computers under Linux OS, controlled the recording and display of data. The main part of the experiment was documented in DokuWiki style. Some propaedeutic lessons in computer science and electronics were needed to build up the necessary skills of the students and to fill gaps in their background knowledge. In addition, lectures by seismologists and laboratory activity allowed the class to explore different aspects of the physics of earthquakes, particularly of seismic waves, and to become familiar with the topic of seismic hazard through inquiry-based learning. The Arduino seismograph achieved can be used for educational purposes and can display tremors on the local network of the school. It can certainly record the ground motion due to a seismic event occurring in the area, but further improvements are necessary for a quantitative analysis of the recorded signals.
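
    On the computer side of such a setup, a minimal Python logger might read one accelerometer sample per serial line and append it to a CSV file. The port name, baud rate, and "ax,ay,az" line format below are placeholder assumptions, not the actual configuration of the school project (which used shell scripts).

```python
# PC-side logger: read "ax,ay,az" lines from the Arduino over USB serial and
# append timestamped samples to a CSV file. Port, baud rate and line format
# are placeholders -- adjust them to match the sketch running on the board.
import csv
import time

import serial  # provided by the pyserial package

PORT, BAUD = "/dev/ttyACM0", 115200

with serial.Serial(PORT, BAUD, timeout=1) as link, \
     open("quake_log.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["unix_time", "ax", "ay", "az"])
    while True:
        line = link.readline().decode(errors="ignore").strip()
        if not line:
            continue                      # timeout or empty line
        try:
            ax, ay, az = (float(v) for v in line.split(","))
        except ValueError:
            continue                      # skip malformed or partial lines
        writer.writerow([time.time(), ax, ay, az])
```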

  6. More Schooling, Less Youth Crime? Learning from an Earthquake in Japan

    OpenAIRE

    Aoki, Yu

    2014-01-01

    This paper aims to identify the causal effect of schooling on youth crime. To identify the causal effect, I use the policy interventions that occurred after the Kobe earthquake that hit Japan in 1995 as a natural experiment inducing exogenous variation in schooling. Based on a comparison of the arrest rates between municipalities exposed to similar degrees of earthquake damage but with and without the policy interventions, I find that a higher high school participation rate reduces juvenile a...

  7. Lessons learned from the total evacuation of a hospital after the 2016 Kumamoto Earthquake.

    Science.gov (United States)

    Yanagawa, Youichi; Kondo, Hisayoshi; Okawa, Takashi; Ochi, Fumio

    The 2016 Kumamoto Earthquakes were a series of earthquakes that included a foreshock earthquake (magnitude 6.2) on April 14 and a main shock (magnitude 7.0) on April 16, 2016. A number of hospitals in Kumamoto were severely damaged by the two major earthquakes and required total evacuation. The authors retrospectively analyzed the activity data of the Disaster Medical Assistance Teams using the Emergency Medical Information System records to investigate the cases in which the total evacuation of a hospital was attempted following the 2016 Kumamoto Earthquake. Total evacuation was attempted at 17 hospitals. The evacuation of one of these hospitals was canceled. Most of the hospital buildings were more than 20 years old. The danger of collapse was the most frequent reason for evacuation. Various transportation methods were employed, some of which involved the Japan Ground Self-Defense Force; no preventable deaths occurred during transportation. The hospitals must now be renovated to improve their earthquake resistance. The coordinated and combined use of military and civilian resources is beneficial and can significantly reduce human suffering in large-scale disasters.

  8. Care of children in a natural disaster: lessons learned from the Great East Japan earthquake and tsunami.

    Science.gov (United States)

    Yonekura, Takeo; Ueno, Shigeru; Iwanaka, Tadashi

    2013-10-01

    The Great East Japan earthquake was one of the most devastating natural disasters ever to hit Japan. We present features of the disaster and the radioactive accident in Fukushima. About 19,000 are dead or remain missing mainly due to the tsunami, but children accounted for only 6.5% of the deaths. The Japanese Society of Pediatric Surgeons set up the Committee of Aid for Disaster, and collaborated with the Japanese Society of Emergency Pediatrics to share information and provide pediatric medical care in the disaster area. Based on the lessons learned from the experiences, the role of pediatric surgeons and physicians in natural disasters is discussed.

  9. Preliminary Report Summarizes Tsunami Impacts and Lessons Learned from the September 7, 2017, M8.1 Tehuantepec Earthquake

    Science.gov (United States)

    Wilson, R. I.; Ramirez-Herrera, M. T.; Dengler, L. A.; Miller, K.; LaDuke, Y.

    2017-12-01

    The preliminary tsunami impacts from the September 7, 2017, M8.1 Tehuantepec Earthquake have been summarized in the following report: https://www.eeri.org/wp-content/uploads/EERI-Recon-Rpt-090717-Mexico-tsunami_fn.pdf. Although the tsunami impacts were not as significant as those from the earthquake itself (98 fatalities and 41,000 homes damaged), the following are highlights and lessons learned: The Tehuantepec earthquake was one of the largest down-slab normal faulting events ever recorded. This situation complicated the tsunami forecast since forecast methods and pre-event modeling are primarily associated with megathrust earthquakes where the most significant tsunamis are generated. Adding non-megathrust source modeling to the tsunami forecast databases of conventional warning systems should be considered. Offshore seismic and tsunami hazard analyses using past events should incorporate the potential for large earthquakes occurring along sources other than the megathrust boundary. From an engineering perspective, initial reports indicate there was only minor tsunami damage along the Mexico coast. There was damage to Marina Chiapas where floating docks overtopped their piles. Increasing pile heights could reduce the potential for damage to floating docks. Tsunami warning notifications did not get to the public in time to assist with evacuation. Streamlining the messaging in Mexico from the warning system directly to the public should be considered. And, for local events, preparedness efforts should place emphasis on responding to feeling the earthquake and not waiting to be notified. Although the U.S. tsunami warning centers were timely with their international and domestic messaging, there were some issues with how those messages were presented and interpreted. The use of a "Tsunami Threat" banner on the new main warning center website created confusion with emergency managers in the U.S. where no tsunami threat was expected to exist. Also, some U.S. states and

  10. Machine-Learning Inspired Seismic Phase Detection for Aftershocks of the 2008 MW7.9 Wenchuan Earthquake

    Science.gov (United States)

    Zhu, L.; Li, Z.; Li, C.; Wang, B.; Chen, Z.; McClellan, J. H.; Peng, Z.

    2017-12-01

    The spatio-temporal evolution of aftershocks is important for illuminating earthquake physics and for rapid response to devastating earthquakes. To improve aftershock catalogs of the 2008 MW7.9 Wenchuan earthquake in Sichuan, China, Alibaba Cloud and the China Earthquake Administration jointly launched a seismological contest in May 2017 [Fang et al., 2017]. This abstract describes how we handled this problem in the competition. We first used Short-Term Average/Long-Term Average (STA/LTA) and kurtosis functions to obtain over 55,000 candidate phase picks (P or S). Based on the Signal-to-Noise Ratio (SNR), about 40,000 phases (P or S) were selected. So far, these 40,000 phases have a hit rate of 40% among the manual picks. The causes include that 1) there exist false picks (neither P nor S); and 2) some P and S arrivals are mislabeled. To improve our results, we correlate the 40,000 phases over continuous waveforms to obtain the phases missed during the first pass. This results in 120,000 events. After constructing an affinity matrix based on the cross-correlation for newly detected phases, subspace clustering methods [Vidal, 2011] are applied to group those phases into separate subspaces. Initial results show good agreement between empirical and clustered labels of P phases. Half of the empirical S phases are clustered into the P phase cluster. This may be a combined effect of 1) mislabeling isolated P phases as S phases and 2) clustering errors due to a small, incomplete sample pool. Phases that were falsely detected in the initial results can also be teased out. To better characterize P and S phases, our next step is to apply subspace clustering methods directly to the waveforms, instead of using the cross-correlation coefficients of detected phases. After that, supervised learning, e.g., a convolutional neural network, can be employed to improve pick accuracy. Updated results will be presented at the meeting.
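
    A bare-bones version of the STA/LTA step mentioned above, using trailing windows on the squared amplitude of a synthetic trace; the window lengths, threshold, and synthetic data are illustrative assumptions, not the contest settings.

```python
import numpy as np

def sta_lta(trace, n_sta, n_lta):
    """Trailing-window STA/LTA characteristic function on the squared amplitude."""
    energy = np.asarray(trace, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    ratio = np.zeros(len(energy))
    for i in range(n_lta, len(energy)):
        sta = (csum[i + 1] - csum[i + 1 - n_sta]) / n_sta
        lta = (csum[i + 1] - csum[i + 1 - n_lta]) / n_lta
        ratio[i] = sta / max(lta, 1e-12)
    return ratio

# Synthetic trace: Gaussian noise with an "arrival" starting at sample 3000.
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 6000)
trace[3000:3200] += rng.normal(0.0, 6.0, 200)

cf = sta_lta(trace, n_sta=50, n_lta=1000)
threshold = 4.0
trigger = int(np.argmax(cf > threshold))   # first sample exceeding the threshold
print("declared pick at sample:", trigger)
```

    In practice such picks would be screened further, e.g. by the SNR criterion described in the abstract, before entering a catalog.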

  11. What Can We Learn from a Simple Physics-Based Earthquake Simulator?

    Science.gov (United States)

    Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele

    2018-03-01

    Physics-based earthquake simulators are becoming a popular tool to investigate the earthquake occurrence process. So far, the development of earthquake simulators has commonly been led by the approach "the more physics, the better". However, this approach may hamper the comprehension of the outcomes of the simulator; in fact, within complex models, it may be difficult to understand which physical parameters are the most relevant to the features of the seismic catalog in which we are interested. For this reason, here, we take the opposite approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that a simple simulator may be more informative than a complex one for some specific scientific objectives, because it is more understandable. Our earthquake simulator has three main components: the first is a realistic tectonic setting, i.e., a fault data set of California; the second is the application of quantitative laws for earthquake generation on each single fault; and the last is the modeling of fault interaction through the Coulomb Failure Function. The analysis of this simple simulator shows that: (1) short-term clustering can be reproduced by a set of faults with an almost periodic behavior, which interact according to a Coulomb failure function model; (2) a long-term behavior showing supercycles of the seismic activity exists only in a markedly deterministic framework, and quickly disappears when a small degree of stochasticity is introduced in the recurrence of earthquakes on a fault; (3) faults that are strongly coupled in terms of the Coulomb failure function model are synchronized in time only in a markedly deterministic framework and, as before, such synchronization disappears when a small degree of stochasticity is introduced in the recurrence of earthquakes on a fault. Overall, the results show that even in a simple and perfectly known earthquake occurrence world, introducing a small degree of
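
    In the spirit of the minimal model described above (steady tectonic loading, full stress drop at failure, a fixed Coulomb-type stress transfer to neighbouring faults, and optional randomness on fault strength), the toy loop below generates a synthetic catalog. Every number in it is an arbitrary placeholder, not a calibrated parameter of the California fault model; setting stochastic_sd to zero recovers the fully deterministic case discussed in the record.

```python
import numpy as np

rng = np.random.default_rng(42)

n_faults = 5
loading_rate = np.full(n_faults, 1.0)            # stress units per year
strength = rng.uniform(80.0, 120.0, n_faults)    # failure threshold per fault
stress = rng.uniform(0.0, strength)              # arbitrary initial state
transfer = 2.0 * (np.ones((n_faults, n_faults)) - np.eye(n_faults))  # CFF-like coupling
stochastic_sd = 3.0                              # 0.0 -> fully deterministic recurrence

dt, n_steps = 0.1, 200_000                       # roughly 20,000 years of simulation
catalog = []                                     # (time, fault index) pairs
for step in range(n_steps):
    t = step * dt
    stress += loading_rate * dt                  # steady tectonic loading
    for f in np.where(stress >= strength)[0]:    # faults reaching failure
        catalog.append((t, f))
        stress[f] = 0.0                          # elastic rebound: full stress drop
        stress += transfer[f]                    # static stress transfer to the others
        if stochastic_sd > 0.0:                  # perturb the next failure threshold
            strength[f] = max(1.0, strength[f] + rng.normal(0.0, stochastic_sd))

times = np.array([t for t, f in catalog if f == 0])
recurrence = np.diff(times)
print(f"events: {len(catalog)}, fault-0 mean recurrence: {recurrence.mean():.1f} yr, "
      f"CoV: {recurrence.std() / recurrence.mean():.2f}")
```

    Comparing the inter-event times of coupled faults with and without the stochastic term gives a qualitative feel for how quickly the synchronization and supercycle behaviour discussed above can be destroyed.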

  12. Lessons Learned from Data Management Activities after the Great East Japan Earthquake in March 2011

    Directory of Open Access Journals (Sweden)

    A Kitamoto

    2013-02-01

    This paper summarizes our effort towards managing the multi-disciplinary disaster-related data from the Great East Japan Earthquake, which happened on March 11, 2011 off the coast of Northeast Japan. This earthquake caused the largest tsunami in the recorded history of Japan, killed many people along the coast, and caused a nuclear disaster in Fukushima, which continues to affect a large area of Japan. Just after the earthquake, we started crisis-response data management activities to provide useful information for supporting disaster response and recovery. This paper introduces the various types of datasets we made from the viewpoint of data management processing and draws lessons from our post-disaster activities.

  13. Learning from Abruzzo earthquake buildings behaviour seen from the Building According-to-the-book perspectives

    International Nuclear Information System (INIS)

    Bazurro, P.; Benedettini, F.; Clemente, P.; Salvatori, A.

    2009-01-01

    A brief description, with the related photographs, is reported on the effects of the recent Abruzzo earthquake on buildings, particularly reinforced-concrete and masonry buildings. Following an overview of the technical building codes particularly focused on the area of L'Aquila, mention is made of cultural heritage structures. [it

  14. Schools "as" Communities and "for" Communities: Learning from the 2010-2011 New Zealand Earthquakes

    Science.gov (United States)

    Mutch, Carol

    2016-01-01

    The author followed five primary (elementary) schools over three years as they responded to and began to recover from the 2010-2011 earthquakes in and around the city of Christchurch in the Canterbury region of New Zealand. The purpose was to capture the stories for the schools themselves, their communities, and for New Zealand's historical…

  15. DOE handbook table-top needs analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-03-01

    Purpose of this handbook is to establish guidelines for training personnel in developing training for operation, maintenance, and technical support personnel at DOE nuclear facilities. Information from this handbook can be used to conduct needs analysis as the information and methods apply. Operating contractors are encouraged to use good practices selectively in developing or improving programs to meet the specific needs of their facility.

  16. L'Aquila's reconstruction challenges: has Italy learned from its previous earthquake disasters?

    Science.gov (United States)

    Ozerdem, Alpaslan; Rufini, Gianni

    2013-01-01

    Italy is an earthquake-prone country and its disaster emergency response experiences over the past few decades have varied greatly, with some being much more successful than others. Overall, however, its reconstruction efforts have been criticised for being ad hoc, delayed, ineffective, and untargeted. In addition, while the emergency relief response to the L'Aquila earthquake of 6 April 2009-the primary case study in this evaluation-seems to have been successful, the reconstruction initiative got off to a very problematic start. To explore the root causes of this phenomenon, the paper argues that, owing to the way in which Italian Prime Minister Silvio Berlusconi has politicised the process, the L'Aquila reconstruction endeavour is likely to suffer problems with local ownership, national/regional/municipal coordination, and corruption. It concludes with a set of recommendations aimed at addressing the pitfalls that may confront the L'Aquila reconstruction process over the next few years. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  17. Lessons learned from the Youngstown, Ohio induced earthquake sequence from January 2011 to January 2012

    Directory of Open Access Journals (Sweden)

    A.P. Morris

    2017-10-01

    The Youngstown earthquake sequence of 2011 is one of the clearest examples of inadvertently induced seismicity for which detailed documentation is available. In this paper, we investigate (i) likely stress states in the vicinity of the injection well, (ii) a range of likely permeability scenarios, and (iii) relatively simple methods by which induced seismicity can be evaluated and mitigated. We use relocated hypocenters from the seismic sequence to construct a basement fault structure, which then serves as a reference surface within the basement, and on which we calculate the effects of pore pressure changes induced by the injection activities of the Northstar #1 injection well. We also deduce an in situ (pre-injection) strike-slip stress regime, where σ2 ≈ σ3, which is consistent with both recent earthquake data and published stress estimates for the region. If the reactivation characteristics of the basement are known or assumed, a critical or threshold slip tendency can be determined and the basement faults can be analyzed for the likelihood of reactivation in a perturbed pore pressure field. Comparison of well injection pressures and simulated pore pressure perturbations within the basement below the injection well indicates that permeability anisotropy is necessary to generate sufficient pore pressure perturbation to induce fault reactivation. Simulations of the well's injection history show that our estimate of the in situ stress state, coupled with a highly anisotropic permeability structure, can generate sufficient pore pressure perturbation on the inferred basement structure to cause reactivation, potentially slipping an area of approximately 4 × 10⁵ m².
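
    The slip-tendency screening mentioned above can be illustrated with a short calculation: resolve an assumed stress tensor onto a fault plane and compare the ratio of shear to effective normal stress against a threshold as pore pressure rises. The stress magnitudes, fault orientation, friction threshold, and pore pressures below are illustrative assumptions, not the paper's inputs.

```python
# Hedged slip-tendency sketch: tau / (sigma_n - Pp) on an assumed fault plane.
import numpy as np

def plane_normal(strike_deg, dip_deg):
    """Upward unit normal of a fault plane (x = east, y = north, z = up)."""
    s, d = np.radians(strike_deg), np.radians(dip_deg)
    return np.array([np.cos(s) * np.sin(d), -np.sin(s) * np.sin(d), np.cos(d)])

def slip_tendency(sigma, normal, pore_pressure):
    """Resolved shear stress divided by effective normal stress on the plane."""
    traction = sigma @ normal
    sigma_n = float(normal @ traction)
    tau = float(np.linalg.norm(traction - sigma_n * normal))
    return tau / (sigma_n - pore_pressure)

# assumed strike-slip regime with sigma2 ~ sigma3 (compression positive, MPa)
sigma = np.diag([80.0, 44.0, 45.0])   # x = east (sigma1), y = north (sigma3), z = vertical (sigma2)
n = plane_normal(strike_deg=120.0, dip_deg=85.0)   # assumed steep fault oblique to sigma1
for pp in (20.0, 25.0, 30.0):                      # increasing pore pressure from injection
    ts = slip_tendency(sigma, n, pp)
    print(f"Pp = {pp:4.1f} MPa -> slip tendency {ts:.2f}",
          "(reactivation likely)" if ts > 0.6 else "")
```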

  18. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  19. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  20. The academic health center in complex humanitarian emergencies: lessons learned from the 2010 Haiti earthquake.

    Science.gov (United States)

    Babcock, Christine; Theodosis, Christian; Bills, Corey; Kim, Jimin; Kinet, Melodie; Turner, Madeleine; Millis, Michael; Olopade, Olufunmilayo; Olopade, Christopher

    2012-11-01

    On January 12, 2010, a 7.0-magnitude earthquake struck Haiti. The event disrupted infrastructure and was marked by extreme morbidity and mortality. The global response to the disaster was rapid and immense, comprising multiple actors-including academic health centers (AHCs)-that provided assistance in the field and from home. The authors retrospectively examine the multidisciplinary approach that the University of Chicago Medicine (UCM) applied to postearthquake Haiti, which included the application of institutional structure and strategy, systematic deployment of teams tailored to evolving needs, and the actual response and recovery. The university mobilized significant human and material resources for deployment within 48 hours and sustained the effort for over four months. In partnership with international and local nongovernmental organizations as well as other AHCs, the UCM operated one of the largest and more efficient acute field hospitals in the country. The UCM's efforts in postearthquake Haiti provide insight into the role AHCs can play, including their strengths and limitations, in complex disasters. AHCs can provide necessary intellectual and material resources as well as technical expertise, but the cost and speed required for responding to an emergency, and ongoing domestic responsibilities, may limit the response of a large university and hospital system. The authors describe the strong institutional backing, the detailed predeployment planning and logistical support UCM provided, the engagement of faculty and staff who had previous experience in complex humanitarian emergencies, and the help of volunteers fluent in the local language which, together, made UCM's mission in postearthquake Haiti successful.

  1. Inexpressible Memories and Learning for Reconstruction: Between the Major Earthquake Disasters in Postwar in Japan

    Science.gov (United States)

    Yamazumi, Katsuhiro

    2013-01-01

    Learning for disaster reconstruction carried out by teachers and children in schools faces the fundamental contradiction of how tragic memories that leave deep scars can be told and shared, and the attempts to deal with this problem. In this paper, in order to approach the issue of whether an educational practice which overcomes this contradiction is…

  2. Lessons Learned Preparing Volunteer Midwives for Service in Haiti: After the Earthquake.

    Science.gov (United States)

    Floyd, Barbara O'Malley

    2013-01-01

    Midwives for Haiti is an organization that focuses on the education and training of skilled birth attendants in Haiti, a country with a high rate of maternal and infant mortality and where only 26% of births are attended by skilled health workers. Following the 2010 earthquake, Midwives for Haiti received requests to expand services and numerous professional midwives answered the call to volunteer. This author was one of those volunteers. The purpose of the study was: 1) to develop a description of the program's strengths and its deficits in order to determine if there was a need to improve the preparation of volunteers prior to service and 2) to make recommendations aimed at strengthening the volunteers' contributions to the education of Haiti and auxiliary midwives. Three distinct but closely related questionnaires were developed to survey Haitian students, staff midwives, and volunteers who served with Midwives for Haiti. Questions were designed to elicit information about how well the volunteers were prepared for their experience, the effectiveness of translation services, and suggestions for improving the preparation of volunteers and strengthening the education program. Analysis of the surveys of volunteers, staff midwives, and the Haitian students generated several common themes. The 3 groups agreed that the volunteers made an effective contribution to the program of education and that the volunteer midwives need more preparation prior to serving in Haiti. The 3 groups also agreed on the need for better translators and recommended more structure to the education program. The results of this study are significant to international health care organizations that use volunteer health care professionals to provide services. The results support a growing body of knowledge that international health aid organizations may use to strengthen the preparation, support, and effectiveness of volunteer health providers.

  3. Make an Earthquake: Ground Shaking!

    Science.gov (United States)

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  4. Nowcasting Earthquakes

    Science.gov (United States)

    Rundle, J. B.; Donnellan, A.; Grant Ludwig, L.; Turcotte, D. L.; Luginbuhl, M.; Gail, G.

    2016-12-01

    Nowcasting is a term originating from economics and finance. It refers to the process of determining the uncertain state of the economy or markets at the current time by indirect means. We apply this idea to seismically active regions, where the goal is to determine the current state of the fault system, and its current level of progress through the earthquake cycle. In our implementation of this idea, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. Our method does not involve any model other than the idea of an earthquake cycle. Rather, we define a specific region and a specific large earthquake magnitude of interest, ensuring that we have enough data to span at least 20 or more large earthquake cycles in the region. We then compute the earthquake potential score (EPS), which is defined as the cumulative probability distribution P(n < n(t)) for the count of small earthquakes in the region. From the count n(t) of small earthquakes since the last large earthquake, we determine the value of EPS = P(n < n(t)), which measures the level of progress through the earthquake cycle in the defined region at the current time.
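
    A minimal sketch of the nowcasting bookkeeping described above: tally the number of "small" events between successive "large" events and report the empirical cumulative probability of the current count. The magnitude cutoffs and the synthetic catalog are placeholder assumptions, not the authors' data.

```python
# Earthquake-potential-score sketch from a time-ordered magnitude list (illustrative only).
import numpy as np

def nowcast_eps(magnitudes, m_small=4.0, m_large=6.0):
    """Return (current_count, EPS) for the most recent open earthquake cycle."""
    counts, current = [], 0
    for m in magnitudes:
        if m >= m_large:
            counts.append(current)   # one completed large-earthquake cycle
            current = 0
        elif m >= m_small:
            current += 1
    counts = np.array(counts)
    # EPS: fraction of past cycles whose small-event count did not exceed the current one
    eps = float(np.mean(counts <= current)) if len(counts) else float("nan")
    return current, eps

# synthetic time-ordered catalog: mostly small events, occasional large ones
rng = np.random.default_rng(2)
catalog = rng.uniform(4.0, 5.5, 3000).tolist()
for idx in rng.choice(3000, size=25, replace=False):
    catalog[idx] = rng.uniform(6.0, 7.0)
count, eps = nowcast_eps(catalog)
print(f"{count} small events since the last large one -> EPS = {eps:.2f}")
```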

  5. Earthquake Facts

    Science.gov (United States)

    ... estimated 830,000 people. In 1976 another deadly earthquake struck in Tangshan, China, where more than 250,000 people were killed. Florida and North Dakota have the smallest number of earthquakes in the United States. The deepest earthquakes typically ...

  6. Preliminary findings and lessons learned from the 16 July 2007 earthquake at Kashiwazaki-Kariwa NPP- 'The Niigataken Chuetsu-Oki earthquake, Kashiwazaki-Kariwa NPP and Tokyo, Japan, 6-10 August 2007. Mission report. V. 1

    International Nuclear Information System (INIS)

    2007-01-01

    Upon request from the Government of Japan an IAEA expert mission was conducted at the Kashiwazaki-Kariwa NPP following a strong earthquake that affected the plant on 16 July 2007. The objective, as agreed with the Japanese counterpart, was to conduct a fact finding mission and to identify the preliminary lessons learned that might have implications for the international nuclear safety regime. Although the Niigataken Chuetsu-Oki earthquake on 16 July 2007 significantly exceeded the level of the seismic input taken into account in the design of the plant, the installation behaved in a safe manner during and after the earthquake. In particular, the automatic shutdown of the reactors of Units 3, 4 and 7, which were operating at full power, and of the reactor of Unit 2, which was in the start-up state, was performed successfully. Based on the reports from experts from the Tokyo Electric Power Company (TEPCO) and the limited but representative plant walkdowns and visual observations performed by the IAEA team, safety related structures, systems and components of the plant seem to be in a much better general condition than might be expected for such a strong earthquake, and there is no visible significant damage. This is probably due to the conservatisms introduced at different stages of the design process. The combined effects of these conservatisms were apparently sufficient to compensate for uncertainties in the data and methods available at the time of the design of the plant, which led to the underestimation of the original seismic input. However, important components like the reactor vessels, the core internals and the fuel elements have not yet been examined and in-depth inspections are still to be performed. On the other hand, non-safety related structures, systems and components were affected by significant damage such as soil and anchorage failures and oil leakages. A re-evaluation of the seismic safety of the Kashiwazaki-Kariwa NPP needs to be done with account

  7. Lessons learned from the Great East Japan Earthquake: The need for disaster preparedness in the area of disaster mental health for children.

    Science.gov (United States)

    Kozu, Shuei; Homma, Hiroaki

    2014-01-01

    The Great East Japan Earthquake on March 11, 2011 brought unprecedented challenges to individuals, families, and communities of the Tohoku region in Japan. Children are especially vulnerable to the postdisaster risk factors that impact their ability to heal. The destruction of the infrastructure by the disasters made it more challenging to reach out to children in an area where the stigma against mental illness is persistent. The authors share their experiences, what they heard from patients, and their reflections on lessons learned. The authors recommend the development of a coordinated mental health response system in preparation for the next disaster.

  8. Undead earthquakes

    Science.gov (United States)

    Musson, R. M. W.

    This short communication deals with the problem of fake earthquakes that keep returning into circulation. The particular events discussed are some very early earthquakes supposed to have occurred in the U.K., which all originate from a single enigmatic 18th century source.

  9. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  10. Framing a Conflict! How Media Report on Earthquake Risks Caused by Gas Drilling: A Longitudinal Analysis Using Machine Learning Techniques of Media Reporting on Gas Drilling from 1990 to 2015

    NARCIS (Netherlands)

    Opperhuizen, A.E. (Alette Eva); K. Schouten (Kim); E-H. Klijn (Erik-Hans)

    2018-01-01

    Using a new analytical tool, supervised machine learning (SML), a large number of newspaper articles is analysed to answer the question of how newspapers frame the news of public risks, in this case of earthquakes caused by gas drilling in The Netherlands. SML enabled the study of 2265 news

  11. The use of volunteer interpreters during the 2010 Haiti earthquake: lessons learned from the USNS COMFORT Operation Unified Response Haiti.

    Science.gov (United States)

    Powell, Clydette; Pagliara-Miller, Claire

    2012-01-01

    On January 12, 2010, a magnitude-7.0 (Richter scale) earthquake devastated Haiti, leading to the world's largest humanitarian effort in 60 years. The catastrophe caused massive destruction of homes and buildings and the loss of more than 200,000 lives, and it overwhelmed the host nation's response and its public health infrastructure. Among the many responders, the United States Government acted immediately by sending assistance to Haiti including a naval hospital ship as a tertiary care medical center, the USNS COMFORT. To adequately respond to the acute needs of patients, healthcare professionals on the USNS COMFORT relied on Haitian Creole-speaking volunteers who were recruited by the American Red Cross (ARC). These volunteers complemented full-time Creole-speaking military staff on board. The ARC provided 78 volunteers who were each able to serve up to 4 weeks on board. Volunteers' demographics, such as age and gender, as well as linguistic skills, work background, and prior humanitarian assistance experience varied. Volunteer efforts were critical in assisting with informed consent for surgery, family reunification processes, explanation of diagnosis and treatment, comfort to patients and families in various stages of grieving and death, and helping healthcare professionals to understand the cultural context and sensitivities unique to Haiti. This article explores key lessons learned in the use of volunteer interpreters in earthquake disaster relief in Haiti and highlights the approaches that optimize volunteer services in such a setting, and which may be applicable in similar future events.

  12. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  13. The development of damage identification methods for buildings with image recognition and machine learning techniques utilizing aerial photographs of the 2016 Kumamoto earthquake

    Science.gov (United States)

    Shohei, N.; Nakamura, H.; Fujiwara, H.; Naoichi, M.; Hiromitsu, T.

    2017-12-01

    Schematic information about the damage situation immediately after an earthquake, obtained from photographs shot from an airplane, is important for the investigation and decision-making of authorities. In the case of the 2016 Kumamoto earthquake, we acquired more than 1,800 orthographic projection photographs of the damaged areas. These photos were taken by airplane between April 16th and 19th; we then classified the damage to all buildings into 4 levels and organized the results as approximately 296,000 GIS records corresponding to the Fundamental Geospatial Data published by the Geospatial Information Authority of Japan. These data were organized through the effort of hundreds of engineers. However, relying on human effort alone is not considered practical for more extensive disasters such as a Nankai Trough earthquake. We have therefore been developing an automatic damage identification method utilizing image recognition and machine learning techniques. First, we extracted training data of more than 10,000 buildings with damage levels evenly divided among the 4 grades. With these training data, we raster-scan the entire images and clip patch images representing each damage level. Using these patch images, we have been developing discriminant models in two ways. One is a model using the Support Vector Machine (SVM): we extract a feature quantity from each patch image, calculate a histogram with the Bag of Visual Words (BoVW) method, and then classify the boundaries between damage grades with the SVM. The other is a model using a multi-layered neural network: we design the network, input patch images and damage levels based on visual judgement, and optimize the learning parameters with the error backpropagation method. Using both discriminant models, we are going to discriminate damage levels in each patch, then create the image that shows
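
    The BoVW-plus-SVM branch described above can be sketched roughly as follows. This is an illustration under assumed inputs (random stand-in patches and simple block statistics in place of real local descriptors), not the authors' pipeline.

```python
# Illustrative bag-of-visual-words + SVM sketch for 4 damage grades (stand-in data).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

def block_descriptors(patch, block=8):
    """Split a grayscale patch into blocks and describe each by (mean, std)."""
    h, w = patch.shape
    descs = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            b = patch[i:i + block, j:j + block]
            descs.append([b.mean(), b.std()])
    return np.array(descs)

def bovw_histogram(patch, codebook):
    """Histogram of visual-word assignments for one patch."""
    words = codebook.predict(block_descriptors(patch))
    return np.bincount(words, minlength=codebook.n_clusters).astype(float)

# stand-in data: 200 labelled 32x32 patches, damage grades 0..3
rng = np.random.default_rng(3)
grades = rng.integers(0, 4, 200)
patches = [rng.normal(g, 1.0, (32, 32)) for g in grades]

codebook = KMeans(n_clusters=16, n_init=10, random_state=0)
codebook.fit(np.vstack([block_descriptors(p) for p in patches]))   # build the visual vocabulary
X = np.array([bovw_histogram(p, codebook) for p in patches])       # one BoVW histogram per patch

clf = SVC(kernel="rbf", C=10.0).fit(X[:150], grades[:150])
print("held-out accuracy:", clf.score(X[150:], grades[150:]))
```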

  14. Understanding Earthquakes

    Science.gov (United States)

    Davis, Amanda; Gray, Ron

    2018-01-01

    December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

  15. Earthquake Hazard Mitigation Strategy in Indonesia

    Science.gov (United States)

    Karnawati, D.; Anderson, R.; Pramumijoyo, S.

    2008-05-01

    Because of the active tectonic setting of the region, the risks of geological hazards inevitably increase in the Indonesian archipelago and other Asian countries. Encouraging communities living in vulnerable areas to adapt to the nature of the geology will be the most appropriate strategy for earthquake risk reduction. Updating the earthquake hazard maps, enhancing the existing land-use management, establishing a public education strategy and method, strengthening linkages among the stakeholders of disaster mitigation institutions, and establishing continuous public consultation are the main strategic programs for community resilience in earthquake-vulnerable areas. This paper highlights some important achievements of earthquake hazard mitigation programs in Indonesia, together with the difficulties in implementing such programs. Case examples of the Yogyakarta and Bengkulu earthquake mitigation efforts will also be discussed as lessons learned. A new approach for developing earthquake hazard maps, initiated by mapping the psychological aspects of the people living in vulnerable areas, will be addressed as well.

  16. Seismic measures and defence in depth of nuclear power plant. Lessons learned from the great east Japan earthquake

    International Nuclear Information System (INIS)

    Ochiai, Kanehiro

    2011-01-01

    The Great East Japan Earthquake of March 11, 2011 brought about a severe accident at a nuclear power plant, which provided significant lessons to nuclear experts concerned with safety measures. The concept of defence in depth was the basic philosophy for assuring the safety of a nuclear power plant even against uncertainties exceeding the design basis. This concept consisted of prevention, monitoring, and action to mitigate the consequences of failures, such as a series of physical barriers between the reactor core and the environment, called multiple safety systems, each with backup and designed to accommodate human error. For natural disasters, deep understanding of the characteristics of natural phenomena and their effects, together with engineering judgment, is of prime importance. The different waveforms of ground motion at Fukushima and Onagawa during the Great East Japan Earthquake showed that the design ground motion carries large uncertainties. To cope with these uncertainties, robust seismic measures based on experience include design for static seismic intensity and rigid structures with a natural period of less than 0.1 sec. As for tsunami, defence-in-depth measures were prepared for the cooling of the reactor core, spent fuel and the related electric generation equipment, taking into account that 1) there is a time lag between tsunami generation and arrival, 2) the tsunami-affected area can be limited by a coastal levee or anti-inundation measures, 3) system redundancy can be assured by placing equipment in different locations, and 4) repair work can be done by shipping replacement equipment from outside because the affected region is limited. The successful examples of the Onagawa, Tokai unit 2, Fukushima Daiichi unit 6 and Fukushima Daini Nuclear Power Plants suggest definite tsunami defence-in-depth measures. A containment vent system as the final heat sink and an emergency condenser for reactor core cooling during an outage should be properly utilized at the Fukushima Daiichi unit 1 Nuclear Power Plant. (T. Tanaka)

  17. Strategic crisis and risk communication during a prolonged natural hazard event: lessons learned from the Canterbury earthquake sequence

    Science.gov (United States)

    Wein, A. M.; Potter, S.; Becker, J.; Doyle, E. E.; Jones, J. L.

    2015-12-01

    While communication products are developed for monitoring and forecasting hazard events, less thought may have been given to crisis and risk communication plans. During larger (and rarer) events responsible science agencies may find themselves facing new and intensified demands for information and unprepared for effectively resourcing communications. In a study of the communication of aftershock information during the 2010-12 Canterbury Earthquake Sequence (New Zealand), issues are identified and implications for communication strategy noted. Communication issues during the responses included reliability and timeliness of communication channels for immediate and short decision time frames; access to scientists by those who needed information; unfamiliar emergency management frameworks; information needs of multiple audiences, audience readiness to use the information; and how best to convey empathy during traumatic events and refer to other information sources about what to do and how to cope. Other science communication challenges included meeting an increased demand for earthquake education, getting attention on aftershock forecasts; responding to rumor management; supporting uptake of information by critical infrastructure and government and for the application of scientific information in complex societal decisions; dealing with repetitive information requests; addressing diverse needs of multiple audiences for scientific information; and coordinating communications within and outside the science domain. For a science agency, a communication strategy would consider training scientists in communication, establishing relationships with university scientists and other disaster communication roles, coordinating messages, prioritizing audiences, deliberating forecasts with community leaders, identifying user needs and familiarizing them with the products ahead of time, and practicing the delivery and use of information via scenario planning and exercises.

  18. Learning Change from Synthetic Aperture Radar Images: Performance Evaluation of a Support Vector Machine to Detect Earthquake and Tsunami-Induced Changes

    Directory of Open Access Journals (Sweden)

    Marc Wieland

    2016-09-01

    This study evaluates the performance of a Support Vector Machine (SVM) classifier to learn and detect changes in single- and multi-temporal X- and L-band Synthetic Aperture Radar (SAR) images under varying conditions. The purpose is to provide guidance on how to train a powerful learning machine for change detection in SAR images and to contribute to a better understanding of the potentials and limitations of supervised change detection approaches. This becomes particularly important against the background of a rapidly growing demand for SAR change detection to support rapid situation awareness in case of natural disasters. The application environment of this study thus focuses on detecting changes caused by the 2011 Tohoku earthquake and tsunami disaster, where single-polarized TerraSAR-X and ALOS PALSAR intensity images are used as input. An unprecedented reference dataset of more than 18,000 buildings that were visually inspected for damage by local authorities after the disaster forms a solid statistical population for the performance experiments. Several critical choices commonly made during the training stage of a learning machine are assessed for their influence on the change detection performance, including the sampling approach, location and number of training samples, classification scheme, change feature space and the acquisition dates of the satellite images. Furthermore, the proposed machine learning approach is compared with the widely used change image thresholding. The study concludes that a well-trained and tuned SVM can provide highly accurate change detections that outperform change image thresholding. While good performance is achieved in the binary change detection case, a distinction between multiple change classes in terms of damage grades leads to poor performance in the tested experimental setting. The major drawback of a machine learning approach is related to the high costs of training. The outcomes of this study, however
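
    The "change image thresholding" baseline that the study compares against can be sketched in a few lines: form a log-ratio image from co-registered pre- and post-event intensities and threshold it. The synthetic speckle model and the fixed threshold below are assumptions for illustration only, not the study's data or settings.

```python
# Minimal log-ratio change-image thresholding sketch (synthetic SAR-like intensities).
import numpy as np

def log_ratio_change(pre, post, threshold):
    """Binary change mask from two co-registered intensity images."""
    eps = 1e-6
    log_ratio = np.abs(np.log((post + eps) / (pre + eps)))
    return log_ratio > threshold

# synthetic speckled intensities with a "damaged" block in the post image
rng = np.random.default_rng(4)
pre = rng.gamma(shape=4.0, scale=1.0, size=(200, 200))
post = pre * rng.gamma(shape=4.0, scale=0.25, size=(200, 200))  # multiplicative speckle only
post[60:120, 80:160] *= 6.0                                     # genuine change

mask = log_ratio_change(pre, post, threshold=1.0)
print("flagged pixels:", int(mask.sum()), "of", mask.size)
```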

  19. Building with Earthquakes in Mind

    Science.gov (United States)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in non-traditional earthquake prone areas are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students were challenged in this project to construct a basic structure, using limited resources, that could withstand a simulated tremor through the use of an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Due to the fact that most buildings in New York City were not designed to withstand earthquake shaking, the students were able to gain an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  20. Regulatory point of view on Hengchun earthquake

    International Nuclear Information System (INIS)

    Niu, H.C.; Hsu, M.T.; Chen, Y.B.

    2008-01-01

    On the night of December 26th, 2006, a series of earthquakes struck the Hengchun area where Maanshan NPS (MNPS) is located. Two main earthquakes with magnitudes of 7.0 (Richter scale) occurred at 20:26 and 20:34 respectively. The epicenter of the 20:34 earthquake, which was closer to the shore than that of the 20:26 earthquake, was located 33.5 km west of MNPS at a depth of 50.2 km. Before the earthquakes, both MNPS units were operating at rated power. The unit no.2 operators tripped the reactor manually due to high vibration alarms from the reactor coolant pumps and the main turbine. Although the unit no.1 operators had decided to take the same action, the intensity of the shaking diminished, so the shift supervisor decided to keep unit no.1 in operation. The maximum peak ground acceleration recorded by the MNPS seismic monitoring system was 0.16g, which was still below the MNPS seismic design basis: the safe shutdown earthquake (SSE: 0.4g) and the operating basis earthquake (OBE: 0.2g). Post-earthquake inspection of both units showed no major damage to any SSCs. It was still the strongest earthquake ever recorded in Taiwan's NPS site areas since 1978, when the first nuclear power station was declared in commercial operation. From the regulatory point of view, it is important to take account of the experience and lessons learned from the Hengchun earthquake. In particular, the training requirements for operators and the standard operating procedures during and after an earthquake need to be re-evaluated to enhance the ability to prevent hazards during an earthquake event. (author)

  1. Darwin's earthquake.

    Science.gov (United States)

    Lee, Richard V

    2010-07-01

    Charles Darwin experienced a major earthquake in the Concepción-Valdivia region of Chile 175 years ago, in February 1835. His observations dramatically illustrated the geologic principles of James Hutton and Charles Lyell which maintained that the surface of the earth was subject to alterations by natural events, such as earthquakes, volcanoes, and the erosive action of wind and water, operating over very long periods of time. Changes in the land created new environments and fostered adaptations in life forms that could lead to the formation of new species. Without the demonstration of the accumulation of multiple crustal events over time in Chile, the biologic implications of the specific species of birds and tortoises found in the Galapagos Islands and the formulation of the concept of natural selection might have remained dormant.

  2. We Need More Focus On Pre-Disaster Preparedness: Early Lessons Learned From Recent Earthquakes in Northwest of Iran

    Directory of Open Access Journals (Sweden)

    Abdolreza Shaghaghi

    2012-12-01

    Dear Editor-in-Chief: Two strong earthquakes with magnitudes of 6.4 and 6.3 at a depth of 9.9 km, which rattled Iran's northwest region within 60 km of Tabriz, the capital city of East Azerbaijan province, on August 11, 2012, caused extensive damage in about 1000 villages, killed at least 258 people and injured 1380. The quakes most severely affected villages close to three impacted towns in the disaster area: Varzegan, Ahar and Heris. Some of the villages that were hit are in remote areas with limited access to transport routes. Within the early hours after the twin devastating incidents, ordinary people and those who had relatives in the affected area rushed toward the region to rescue victims, mainly in their own cars. Independent groups, such as small units from the armed forces, were also sent to the region to support the rescue operation. Meanwhile, some of the survivors tried to transfer severely injured survivors to nearby hospitals, and even to the central hospital in Tabriz, using public transportation such as taxis, vans or any vehicle available at the time. All these unplanned efforts created traffic jams on the roads leading to the disaster area and delayed the rescue operation by trained staff. Now, after the earthquakes that rumbled through the disaster area, about 36,000 quake-stricken people have been given shelter and are being provided with their basic needs in makeshift camps. Humanitarian aid is reaching the affected zone from all over the country and internationally, but there are inadequacies in the proper distribution of foodstuffs and equipment among the survivors. Piles of water bottles left in the heat in front of tents, and clothes and canned foods distributed by volunteers in excess of the victims' current needs, can be seen in the disaster area. At the same time, an insufficient supply of drinking water, canned foods and portable washrooms was reported by the authorities in the first days after the

  3. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  4. Tsunami field survey in French Polynesia of the 2015 Chilean earthquake Mw = 8.2 and what we learned.

    Science.gov (United States)

    Jamelot, Anthony; Reymond, Dominique; Savigny, Jonathan; Hyvernaud, Olivier

    2016-04-01

    The tsunami generated by the earthquake of magnitude Mw = 8.2 near the coast of central Chile on 16 September 2015 was observed on 7 tide gauges distributed over the five archipelagoes composing French Polynesia, a territory as large as Europe. We sum up all the observations of the tsunami and the field survey done in Tahiti (Society Islands) and Hiva-Oa (Marquesas Islands) to evaluate the preliminary tsunami forecast tool (MERIT) and the detailed tsunami forecast tool (COASTER) of the French Polynesian Tsunami Warning Center. The preliminary tool forecast a maximal tsunami height of between 0.5 m and 2.3 m over the Marquesas Islands, but only the island of Hiva-Oa had a forecast greater than 1 meter, especially in Tahauku Bay, well known for its local response due to its resonance properties. In Tahauku Bay, the tide gauge located at the entrance of the bay recorded a maximal tsunami height above mean sea level of ~1.7 m; at the bottom of the bay we measured a run-up of about 2.8 m at 388 m inland from the shoreline in the river bed, and a run-up of 2.5 m located 155 m inland. The multi-grid simulation over Tahiti was done one hour after the origin time of the earthquake and gave a very localized tsunami impact on the north shore. Our forecast indicated inundation about 10 m inland, which led the civil authorities to evacuate 6 houses. It was the first operational use of this new fine grid covering the northern part of Tahiti, which is not protected by a coral reef, so we were attentive to the feedback from the alert, which confirmed the forecast of the maximal height arriving 1 hour after the first arrival. The tsunami warning system forecasts strong as well as low impacts well, as long as we have an early, robust description of the seismic parameters and fine grids of about 10 m spatial resolution to simulate the tsunami impact. As of January 2016 we are able to forecast tsunami heights for 72 points located over 35 islands of French Polynesia.

  5. Pharmaceutical supply for disaster victims who need chronic disease management in aging region based on lessons learned from the Noto Peninsula Earthquake in 2007

    OpenAIRE

    奥村, 順子; 西田, 祥啓; 木村, 和子

    2008-01-01

    The lessons from the Great Hanshin-Awaji Earthquake and Chuuetsu Earthquake showed us how difficult it is to keep chronic disease management for survivors of such large-scale earthquakes, particularly for elderly people. To solve the problem, an ordinance for enforcement on exceptional practices was issued for the Pharmaceutical Affairs Law Article 49 Clause 1. The law allows selling prescription medicines for patients with chronic diseases who have difficulties to continue their medications ...

  6. From Colfiorito to L'Aquila Earthquake: learning from the past to communicating the risk of the present

    Science.gov (United States)

    Lanza, T.; Crescimbene, M.; La Longa, F.

    2012-04-01

    Italy is a country at risk of an impending earthquake in the near future. Very probably, as already happened in the 13 years between the last two important seismic events (Colfiorito 1997 and L'Aquila 2009), there will not be enough time to solve all the problems connected to seismic risk: first of all, the corruption related to building politics; the lack of the money necessary to strengthen existing buildings, historical centres, monuments and masterpieces of art; the difficult relations of the institutions with the traditional media (newspapers, radio and TV) and, at the same time, with the new media (the web); and the difficulties for scientists in reaching important results in the immediate future, due to the lack of funding and, last but not least, to the conflicting relationships within the scientific community itself. In this scenario, communication and education play a crucial role in minimizing the risk to the population. In the present work we reconsider the past with the intent of starting to trace a path toward a future strategy of risk communication in which everyone involved, including the population, does their best to face the next emergency.

  7. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from between 1901 and 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  8. Predictable earthquakes?

    Science.gov (United States)

    Martini, D.

    2002-12-01

    acceleration) and the global number of earthquakes for this period from the published literature, which give us a good picture of the dynamical geophysical phenomena. Methodology: Computing linear correlation coefficients gives us a chance to quantitatively characterise the relation among the data series, if we suppose a linear dependence as a first step. The correlation coefficients among the Earth's rotational acceleration, the Z-orbit acceleration (perpendicular to the ecliptic plane) and the global number of earthquakes were compared. The results clearly demonstrate a common feature of both the Earth's rotation and the Earth's Z-acceleration around the Sun, and also between the Earth's rotational acceleration and the earthquake number. This might mean a strong relation among these phenomena. The mentioned rather strong correlation (r = 0.75) and the 29-year period (Saturn's synodic period) were clearly shown in the computed cross-correlation function, which gives the dynamical characteristic of the correlation between the Earth's orbital (Z-direction) and rotational acceleration. This basic period (29 years) was also obvious in the earthquake-number data sets, with clear common features in time. Conclusion: The core, which is involved in the secular variation of the Earth's magnetic field, is the only sufficiently mobile part of the Earth with sufficient mass to modify the rotation, which probably affects the global time distribution of earthquakes. Therefore it might mean that the secular variation of earthquakes is inseparable from the changes in the Earth's magnetic field, i.e., that the interior process of the Earth's core belongs to the dynamical state of the solar system. Therefore, if the described idea is real, the global distribution of earthquakes in time is predictable.
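
    The kind of correlation analysis described above can be reproduced in outline as follows; the two series here are synthetic stand-ins with an injected 29-year cycle, not the published rotation or earthquake-count data.

```python
# Pearson correlation and normalized cross-correlation of two annual series (synthetic data).
import numpy as np

def normalized_xcorr(a, b):
    """Normalized cross-correlation of two equal-length series (lags -(n-1)..+(n-1))."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.correlate(a, b, mode="full")

years = np.arange(1900, 2001)
rng = np.random.default_rng(5)
common = np.sin(2 * np.pi * years / 29.0)            # a shared ~29-year cycle, assumed
rotation_accel = common + 0.5 * rng.normal(size=len(years))
quake_count = 100 + 10 * common + 5 * rng.normal(size=len(years))

r = np.corrcoef(rotation_accel, quake_count)[0, 1]
xc = normalized_xcorr(rotation_accel, quake_count)
best_lag = int(np.argmax(xc)) - (len(years) - 1)
print(f"Pearson r = {r:.2f}, strongest cross-correlation at lag {best_lag} yr")
```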

  9. Improved nuclear emergency management system reflecting lessons learned from the emergency response at Fukushima Daini Nuclear Power Station after the Great East Japan Earthquake

    International Nuclear Information System (INIS)

    Kawamura, Shinichi; Narabayashi, Tadashi

    2016-01-01

    Three nuclear reactors at Fukushima Daini Nuclear Power Station lost all their ultimate heat sinks owing to damage from the tsunami caused by the Great East Japan Earthquake on March 11, 2011. Water was injected into the reactors by alternate measures, damaged cooling systems were restored with promptly supplied substitute materials, and all the reactors were brought to a cold shutdown state within four days. Lessons learned from this experience were identified to improve emergency management, especially in the areas of strategic response planning, logistics, and functions supporting response activities continuing over a long period. It was found that continuous planning activities reflecting information from plant parameters and response action results were important, and that relevant functions in emergency response organizations should be integrated. Logistics were handled successfully but many difficulties were experienced. Therefore, their functions should be clearly established and improved by emergency response organizations. Supporting emergency responders in the aspects of their physical and mental conditions was important for sustaining continuous response. As a platform for improvement, the concept of the Incident Command System was applied for the first time to a nuclear emergency management system, with specific improvement ideas such as a phased approach in response planning and common operation pictures. (author)

  10. The Influence of Table Top Technology in Full- service Restaurants

    OpenAIRE

    Susskind, Alex M.; Curry, Benjamin

    2016-01-01

    The use of tabletop technology continues to grow in the restaurant industry, and this study identifies the strengths and weakness of the technology, how it influences customers, and how it can improve the bottom line for managers and business owners. Results from two studies involving a full-service casual dining chain show that dining time was significantly reduced among patrons who used the tabletop hardware to order or pay for their meals, as was the time required for servers t...

  11. Table tops game:and the winner is...

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    The challenge to identify the events on the coffee tables in Restaurant 1 was clearly a tough one. Congratulations to the few brave entrants! Tuula Maki was the only person to correctly identify all 16 events on the coffee tables in Restaurant 1 and so receives the prize of a copy of the pop-up book, Voyage to the Heart of Matter. An LHC key-ring will go to runners-up Vincent Chabaud,  Eric Jansen, and Andre David Tinoco Mendes, with 14/16 correct.  

  12. Towards a table-top synchrotron based on supercontinuum generation

    DEFF Research Database (Denmark)

    Petersen, Christian Rosenberg; Moselund, Peter M.; Huot, Laurent

    2018-01-01

    Recently, high brightness and broadband supercontinuum (SC) sources reaching far into the infrared (IR) have emerged with the potential to rival traditional broadband sources of IR radiation. Here, the brightness of these IR SC sources is compared with that of synchrotron IR beamlines and SiC the...

  13. Earthquake friction

    Science.gov (United States)

    Mulargia, Francesco; Bizzarri, Andrea

    2016-12-01

    Laboratory friction slip experiments on rocks provide firm evidence that the static friction coefficient μ has values ∼0.7. This would imply large amounts of heat produced by seismically active faults, but no heat flow anomaly is observed, and mineralogic evidence of frictional heating is virtually absent. This argues for lower μ values ∼0.2, as also required by the observed orientation of faults with respect to the maximum compressive stress. We show that accounting for the thermal and mechanical energy balance of the system removes this inconsistency, implying a multi-stage strain release process. The first stage consists of a small and slow aseismic slip at high friction on pre-existing stress concentrators within the fault volume but angled with the main fault as Riedel cracks. This introduces a second stage dominated by frictional temperature increase inducing local pressurization of pore fluids around the slip patches, which is in turn followed by a third stage in which thermal diffusion extends the frictionally heated zones making them coalesce into a connected pressurized region oriented as the fault plane. Then, the system enters a state of equivalent low static friction in which it can undergo the fast elastic radiation slip prescribed by dislocation earthquake models.
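
    A back-of-the-envelope version of the heat-flow argument, using commonly assumed representative values that are not taken from this paper:

```latex
% Illustrative frictional heating estimate (assumed representative values).
\[
  Q = \mu\,\sigma_n\,D, \qquad
  \Delta T \;\approx\; \frac{\mu\,\sigma_n\,D}{\rho\,c\,w}
\]
% With assumed \sigma_n = 100\,\mathrm{MPa}, slip D = 1\,\mathrm{m},
% \rho c = 2.7\,\mathrm{MJ\,m^{-3}\,K^{-1}}, slip-zone width w = 1\,\mathrm{cm}:
% \mu = 0.7 gives \Delta T \approx 2600\,\mathrm{K}, enough for melting or alteration,
% whereas \mu = 0.2 gives \Delta T \approx 740\,\mathrm{K}.
```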

  14. The Development of an Earthquake Mind Mapping

    Directory of Open Access Journals (Sweden)

    Sri Adelila Sari

    2016-05-01

    Students found earthquakes difficult to understand because teachers were still using classic teaching methods. The teachers used only a textbook, without any other supporting equipment. Learning through the discourse method makes students think monotonically, concentrating only on their understanding of the material presented by the teacher. This study therefore aimed to develop an earthquake mind map to help students remember and record the material being taught. The study was of the Research and Development (R & D) type. Data were analyzed using descriptive statistics. The sample was class I-3 of Madrasah Tsanawiyah (MTs) Darul Ulum Banda Aceh, totaling 30 students. The results showed that the mind map was developed through the 5 stages of the ADDIE model: analysis (analyzing the problem and finding a solution), design (determining the learning strategies), development (producing an earthquake mind map to be used in the learning process), implementation (implementing learning activities using the media), and evaluation (evaluating the learning activities). When students were instructed to create their own mind maps, 73.33% of the products were categorized as skilled and 26.66% as quite skilled. As a recommendation, the earthquake mind map could be applied as a useful and effective learning tool.

  15. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  16. Follow-up IAEA mission in relation to the findings and lessons learned from the 16 July 2007 earthquake at Kashiwazaki-Kariwa NPP - 'The Niigataken Chuetsu-oki earthquake', Tokyo and Kashiwazaki-Kariwa NPP, Japan, 28 January - 1 February 2008. Mission report. V. 1

    International Nuclear Information System (INIS)

    2008-01-01

    On 16 July 2007, a strong earthquake, the Niigataken Chuetsu-oki earthquake, with a moment magnitude of 6.6 (M_JMA = 6.8 according to the Japanese Meteorological Agency), occurred at 10:13 h local time with its hypocentre below the seabed of the Jo-chuetsu area in Niigata prefecture (37 deg. 33' N, 138 deg. 37' E) in Japan, affecting the Kashiwazaki-Kariwa Nuclear Power Plant (NPP) located approximately 16 km south of its epicentre. Kashiwazaki-Kariwa NPP is the biggest nuclear power plant site in the world. It is located in the Niigata prefecture, on the northwest coast of Japan, and it is operated by Tokyo Electric Power Company (TEPCO). The site has seven units with a total of 7965 MW net installed capacity. Five reactors are of BWR type and two reactors are of ABWR type. The five BWR units entered commercial operation between 1985 and 1994 and the two ABWRs in 1996 and 1997, respectively. Following this event, the Government of Japan through the Nuclear and Industrial Safety Agency (NISA) requested the IAEA to carry out a fact finding mission with the main purpose of identifying the preliminary findings and lessons learned from this event in order to share them with the international nuclear community. This first mission took place from 6 - 10 August 2007 and the mission report of the August 2007 mission is available on the IAEA web page http://www.iaea.org. The purpose of the second IAEA mission was to conduct - six months after the event - a follow-up of the preliminary findings of the August 2007 mission on the basis of the results available in January 2008 of the related studies and investigations performed. In accordance with the terms of reference for the follow-up mission and the availability of results from the performed studies and investigations, the scope of the follow-up mission focussed on three subject areas: (1) seismic design basis - design basis ground motions, including the evaluation of the seismic hazard; (2) plant behaviour - integrity

  17. What should be learned from the 2011 off the Pacific coast of Tohoku Earthquake to formulate the standard seismic motion of a nuclear power plant

    International Nuclear Information System (INIS)

    Nozu, Atsushi

    2017-01-01

    Although interplate earthquakes are also taken into account in the formulation of the standard seismic motion of a nuclear power plant, the lessons of the 2011 off the Pacific coast of Tohoku Earthquake are not fully utilized in the evaluation of the strong-motion record. This paper discusses the following three points. (1) During the 2011 off the Pacific coast of Tohoku Earthquake, sharp pulses generated from a narrow area (SPGA: strong motion pulse generation area) on the fault plane determined the maximum amplitude of the earthquake ground motion at the nuclear power plant. (2) The current SMGA model is insufficient for accurately calculating this pulse. (3) The seismic motion evaluated on the assumption that an SPGA is close to a nuclear power plant can exceed the assumptions made by power companies. From the studies so far, it is clear that the current SMGA model cannot accurately calculate the technically important pulse waves with a time width of 1 to 2 seconds, and the causes of this are clear. For this reason, the SMGA model is not suitable as a seismic source model for establishing the standard seismic motion of a nuclear power plant. It is necessary to have a model that can reproduce the earthquake ground motion due to the 2011 off the Pacific coast of Tohoku Earthquake with an accuracy equal to or higher than that of the SPGA model. Pulse waves with a time width of 1 to 2 seconds that simultaneously bring about large accelerations and velocities are likely to be accompanied by plasticization and to cause major damage. (A.O.)

  18. Redefining Earthquakes and the Earthquake Machine

    Science.gov (United States)

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  19. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  20. Operational earthquake forecasting can enhance earthquake preparedness

    Science.gov (United States)

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).
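    As a rough illustration of the arithmetic behind such a forecast, the sketch below combines assumed short-term rupture probabilities with assumed conditional exceedance probabilities for a single site; every source name and number in it is an illustrative placeholder, not a value from this record.

```python
# Minimal sketch (illustrative numbers only): combine short-term rupture
# probabilities with conditional ground-motion exceedance to get a
# time-dependent hazard estimate at one site.

# Hypothetical sources near the site: 1-week rupture probability and the
# probability that, given rupture, peak ground acceleration exceeds 0.2 g.
sources = [
    {"name": "fault_A", "p_rupture_1wk": 0.002, "p_exceed_given_rupture": 0.60},
    {"name": "fault_B", "p_rupture_1wk": 0.0005, "p_exceed_given_rupture": 0.25},
]

def exceedance_probability(sources):
    """P(PGA > threshold within the window), assuming independent sources."""
    p_no_exceed = 1.0
    for s in sources:
        p_no_exceed *= 1.0 - s["p_rupture_1wk"] * s["p_exceed_given_rupture"]
    return 1.0 - p_no_exceed

print(f"1-week exceedance probability: {exceedance_probability(sources):.5f}")
```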

  1. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  2. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all with magnitudes of 7.0 or greater) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  3. Pharmaceutical supply for disaster victims who need chronic disease management in region with aging population based on lessons learned from the Noto Peninsula Earthquake in 2007.

    Science.gov (United States)

    Okumura, Junko; Nishita, Yoshihiro; Kimura, Kazuko

    2008-09-01

    The lessons from the Great Hanshin-Awaji Earthquake and the Chuuetsu Earthquake showed us how difficult it is to maintain chronic disease management for survivors of such large-scale earthquakes, particularly for elderly people. To solve the problem, an ordinance for enforcement on exceptional practices was issued for the Pharmaceutical Affairs Law Article 49 Clause 1. The law allows selling prescription medicines to patients with chronic diseases who have difficulty continuing their medications due to a large-scale disaster. To make it work, the patient should demonstrate that he or she continuously received the medication by presenting either a Medication Notebook or a prescription book recorded by the pharmacist. However, the Separation Rate of Prescription and Dispensing in Japan is still low; in particular, that in Ishikawa prefecture, where the Noto Peninsula Earthquake (M 6.9) occurred on March 25, 2007, is very low. This means that few victims hold a Medication Notebook. In consideration of this situation, we conducted a questionnaire survey of elderly victims of the Noto Peninsula Earthquake, with key-informant interviews, during the period from July through August 2007. This study revealed that: 1) only 16% (18/110) of respondents kept a Medication Notebook; 2) 75% (82/110) had chronic diseases and received medication regularly; 3) of the 81 who had chronic diseases, 42% (34/81) always had their medicines dispensed at the same pharmacy (the rest received them either from a clinic or from a pharmacy that changed according to the clinic location); and 4) the diseases reported included hypertension, cardiovascular diseases, diabetes, and so on. Based on these results, we discuss the establishment of a pharmaceutical supply system that can effectively distribute appropriate medicines to patients under the difficult conditions following a large-scale disaster in Japan.

  4. Preliminary findings and lessons learned from the 16 July 2007 earthquake at Kashiwazaki-Kariwa NPP - 'The Niigataken Chuetsu-Oki earthquake', Kashiwazaki-Kariwa NPP and Tokyo, Japan, 6-10 August 2007. Mission report. V. 2

    International Nuclear Information System (INIS)

    2007-01-01

    Upon request from the Government of Japan, an IAEA expert mission was conducted at the Kashiwazaki-Kariwa NPP following a strong earthquake that affected the plant on 16 July 2007. Thus, the mission complemented the ongoing safety evaluations of the incident as they are currently being performed by Japan's Nuclear and Industrial Safety Agency, Japan's Nuclear Safety Commission and the plant operator, the Tokyo Electric Power Company. The scope of the mission was limited to three subject areas. Area 1: Seismic design basis - design basis ground motions: preliminary investigations of the actual earthquake and its ground motions and comparison with the design basis ground motions for the plant seismic design. Area 2: Plant behaviour - structures, systems and components: observation of the damage that occurred as a consequence of the earthquake of 16 July 2007 to the seven units at the Kashiwazaki-Kariwa nuclear power plant site, on the basis of the information gathered and made available by TEPCO and by performing limited but representative plant walkdowns. Area 3: Operational safety management: preliminary investigations of the operational safety management response and releases of radioactive material during and after the earthquake, on the basis of the examination of documents and of discussions with TEPCO. The mission report is composed of two volumes, Volume I and Volume II. This Volume II contains all supporting documentation and information collected during the mission and provided by the counterpart to the IAEA Expert Team. It is arranged in a way that it will be relatively easy for the reader to find the necessary information. There is a significant amount of information contained in Volume II that has come from different sources and that has been gathered for different purposes. The information has been compiled under headings that indicate its origin and purpose as well as its relationship to the observations and topics discussed in Volume I. First, a few

  5. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on basic design processes important to both non-specialists and engineers so that readers become suitably well-informed without needing to deal with the details of specialist understanding. The content of this encyclopedia provides technically inclined and informed readers about the ways in which earthquakes can affect our infrastructure and how engineers would go about designing against, mitigating and remediating these effects. The coverage ranges from buildings, foundations, underground construction, lifelines and bridges, roads, embankments and slopes. The encycl...

  6. Earthquake at 40 feet

    Science.gov (United States)

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects - not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography when the earthquake struck.

  7. Anesthesia department preparedness for a multiple-casualty incident: lessons learned from the Fukushima earthquake and the Japanese nuclear power disaster.

    Science.gov (United States)

    Murakawa, Masahiro

    2013-03-01

    In the Great East Japan Earthquake, which occurred on March 11, 2011, many lives were lost in the accompanying giant tsunami. Fukushima prefecture was widely contaminated with radioactive substances emitted by the accident at the nuclear power plant. Only a few trauma and emergency patients were brought to our hospital by ambulance, and an unexpectedly small number of emergency surgeries were performed. There were patients with radiation-induced sickness and injury, but no cases of severe exposure requiring surgery or intensive care. As a logistic support hospital, we should prepare for and simulate these cases to respond to any such future occurrence. Copyright © 2013 Elsevier Inc. All rights reserved.

  8. Earthquakes and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  9. The Antiquity of Earthquakes

    Indian Academy of Sciences (India)

    Department of Earth Sciences, University of Roorkee. Her interest is in computer-based solutions to geophysical and other earth science problems. If we adopt the definition that an earthquake is shaking of the earth due to natural causes, then we may argue that earthquakes have been occurring since the very beginning.

  10. Bam Earthquake in Iran

    CERN Document Server

    2004-01-01

    Following its request for help from members of international organisations, the Permanent Mission of the Islamic Republic of Iran has given the following bank account number, where you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  11. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  12. The Antiquity of Earthquakes

    Indian Academy of Sciences (India)

    there are few estimates about this earthquake as it probably occurred in that early period of the earth's history about which astronomers, physicists, chemists and earth scientists are still sorting out their ideas. Yet, the notion of the earliest earthquake excites interest. We explore this theme here partly also because.

  13. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes: not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding the nature of strong ground motion near the source of the earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times, field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities, and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that for future construction of NPPs, near-field strong motion must be explicitly considered in design

  14. Spatiotemporal correlations of earthquakes

    International Nuclear Information System (INIS)

    Farkas, J.; Kun, F.

    2007-01-01

    Complete text of publication follows. An earthquake is the result of a sudden release of energy in the Earth's crust that creates seismic waves. At the present technological level, earthquakes of magnitude larger than three can be recorded all over the world. In spite of the apparent randomness of earthquake occurrence, long term measurements have revealed interesting scaling laws of earthquake characteristics: the rate of aftershocks following major earthquakes has a power law decay (Omori law); the magnitude distribution of earthquakes exhibits a power law behavior (Gutenberg-Richter law); furthermore, it has recently been pointed out that epicenters form fractal networks in fault zones (Kagan law). The theoretical explanation of earthquakes is based on plate tectonics: the earth's crust has been broken into plates which slowly move under the action of the flowing magma. Neighboring plates touch each other along ridges (fault zones) where a large amount of energy is stored in deformation. Earthquakes occur when the stored energy exceeds a material dependent threshold value and gets released in a sudden jump of the plate. The Burridge-Knopoff (BK) model of earthquakes represents the earth's crust as a coupled system of driven oscillators where nonlinearity occurs through a stick-slip frictional instability. Laboratory experiments have revealed that under high pressure the friction of rock interfaces exhibits a weakening with increasing velocity. In the present project we extend recent theoretical studies of the BK model by taking into account a realistic velocity-weakening friction force between tectonic plates. Varying the strength of weakening, a broad spectrum of interesting phenomena is obtained: the model reproduces the Omori and Gutenberg-Richter laws of earthquakes; furthermore, it provides information on the correlation of earthquake sequences. We showed by computer simulations that the spatial and temporal correlations of consecutive earthquakes are very
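    For readers who want to experiment with the stick-slip dynamics this record describes, the following is a heavily simplified numerical sketch of a one-dimensional Burridge-Knopoff chain with a velocity-weakening friction law; the parameter values, the specific weakening form and the re-sticking rule are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

# Sketch of a 1-D Burridge-Knopoff chain with velocity-weakening friction.
N = 50                 # number of blocks
kc, kp = 1.0, 0.5      # neighbour-spring and driver-spring stiffness
v_drive = 0.01         # driving plate velocity
f_static = 1.0         # static friction threshold
v_char = 0.5           # characteristic velocity of the weakening law
dt, steps = 0.01, 100_000

rng = np.random.default_rng(0)
x = rng.uniform(-0.1, 0.1, N)      # block positions (small random offsets)
v = np.zeros(N)                    # block velocities
slip_per_step = np.zeros(steps)    # crude "moment rate" record

def elastic_force(x, t):
    """Force from the driver spring plus the two neighbour springs."""
    f = kp * (v_drive * t - x)
    f[1:] += kc * (x[:-1] - x[1:])
    f[:-1] += kc * (x[1:] - x[:-1])
    return f

for step in range(steps):
    t = step * dt
    f = elastic_force(x, t)
    sliding = (v != 0.0) | (np.abs(f) > f_static)
    direction = np.where(v != 0.0, np.sign(v), np.sign(f))
    # Velocity-weakening kinetic friction: the faster a block slides,
    # the weaker the friction resisting it.
    friction = direction * f_static / (1.0 + np.abs(v) / v_char)
    a = np.where(sliding, f - friction, 0.0)    # unit block mass
    v_new = np.where(sliding, v + a * dt, 0.0)
    # A sliding block re-sticks when its velocity tries to reverse.
    v_new = np.where(sliding & (v_new * direction < 0.0), 0.0, v_new)
    x = x + v_new * dt
    slip_per_step[step] = np.abs(v_new).sum() * dt
    v = v_new

print(f"time steps with slip: {np.count_nonzero(slip_per_step)} of {steps}")
```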

  15. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500 people.

  16. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some buildings still in use today, which survived centennial earthquakes, represents a challenge to better understand the limitations of our current earthquake design methods. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study, the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since accelerograms of the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. It is therefore recommended to use more suitable instrumental parameters in the future, such as the destructiveness potential factor, to describe earthquake demand

  17. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed
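    A small worked example, under assumed Gutenberg-Richter parameters (not taken from the studies cited above), shows how strongly the lower bound magnitude alone changes the rate of events entering a hazard calculation:

```python
# Worked example with assumed Gutenberg-Richter a- and b-values: how the
# lower-bound magnitude changes the annual rate of events that enter a
# probabilistic hazard calculation.
a_value, b_value = 4.0, 1.0          # illustrative G-R parameters

def annual_rate(m_min, a=a_value, b=b_value):
    """Annual number of earthquakes with magnitude >= m_min (G-R law)."""
    return 10.0 ** (a - b * m_min)

for m_min in (3.75, 5.0):
    print(f"M >= {m_min}: {annual_rate(m_min):8.2f} events/yr")

print(f"rate ratio (3.75 vs 5.0): {annual_rate(3.75) / annual_rate(5.0):.1f}x")
```

    With a b-value of 1.0, moving the lower bound from magnitude 3.75 to 5.0 removes roughly 94% of the events from the calculation, which is why sensitivity studies flag this choice even though most of the removed events are small.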

  18. Earthquakes and emergence

    Science.gov (United States)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  19. Earthquake Ground Motion Selection

    Science.gov (United States)

    2012-05-01

    Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

  20. 1988 Spitak Earthquake Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 1988 Spitak Earthquake database is an extensive collection of geophysical and geological data, maps, charts, images and descriptive text pertaining to the...

  1. Earthquake education in California

    Science.gov (United States)

    MacCabe, M. P.

    1980-01-01

    In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern. 

  2. Electromagnetic Manifestation of Earthquakes

    OpenAIRE

    Uvarov Vladimir

    2017-01-01

    In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.

  3. Electromagnetic Manifestation of Earthquakes

    Directory of Open Access Journals (Sweden)

    Uvarov Vladimir

    2017-01-01

    In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.

  4. Injection-induced earthquakes.

    Science.gov (United States)

    Ellsworth, William L

    2013-07-12

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  5. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    As 2009 is the 200th anniversary of Darwin's birth, it has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked, volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of the great earthquake during and immediately after. These effects have sometimes been repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  6. Landslide-dammed lake at Tangjiashan, Sichuan province, China (triggered by the Wenchuan Earthquake, May 12, 2008): Risk assessment, mitigation strategy, and lessons learned

    Science.gov (United States)

    Cui, P.; Dang, C.; Zhuang, J.; You, Y.; Chen, X.; Scott, K.M.

    2012-01-01

    Landslides and rock avalanches triggered by the 2008 Wenchuan Earthquake produced 257 landslide dams, mainly situated along the eastern boundary of the Qinghai-Tibet Plateau where rivers descend approximately 3,000 m into the Sichuan Basin. The largest of these dams blocked the Tongkou River (a tributary of the Fujiang River) at Tangjiashan. The blockage, consisting of 2.04 × 10⁷ m³ of landslide debris, impounded a lake with a projected maximum volume of 3.15 × 10⁸ m³, potentially inundating 8.92 km² of terrain. Its creation during the rainy season and the possibility of an uncontrolled release posed a serious, impending threat to at least 1.3 million people downstream that could add substantially to the total of 69,200 individuals directly killed by the earthquake. Risk assessment of the blockage indicated that it was unlikely to collapse suddenly, and that eventual overtopping could be mitigated by notching the structure in order to create an engineered breach and achieve safe drainage of the lake. In addition to the installation of monitoring and warning instrumentation, for emergency planning we estimated several outburst scenarios equivalent to 20, 25, 33, and 50% of the dam failing suddenly, creating, respectively, 3.35, 3.84, 4.22, and 4.65 km² of flooded area, and overbank water depths of 4.6, 5.1, 5.7, and 6.2 m, respectively, in Mianyang, the second largest city in Sichuan Province, 48 km downstream from the blockage. Based on these scenarios, recommendations and plans for excavating a sluiceway, draining the lake, and downstream evacuation were proposed and later were implemented successfully, with the blockage breached by overtopping on June 10, less than a month after dam emplacement. The peak discharge of the release only slightly exceeded the flood of record at Mianyang City. No lives were lost, and significant property damage was avoided. Post-breaching evaluation reveals how future similar mitigation can be improved. Although

  7. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk

  8. Indoor radon and earthquake

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time on the basis of the Spitak earthquake of December 1988 (Armenia, December 1988) experience it is found out that the earthquake causes intensive and prolonged radon splashes which, rapidly dispersing in the open space of close-to-earth atmosphere, are contrastingly displayed in covered premises (dwellings, schools, kindergartens) even if they are at considerable distance from the earthquake epicenter, and this multiplies the radiation influence on the population. The interval of splashes includes the period from the first fore-shock to the last after-shock, i.e. several months. The area affected by radiation is larger vs. Armenia's territory. The scale of this impact on population is 12 times higher than the number of people injured in Spitak, Leninakan and other settlements (toll of injured - 25 000 people, radiation-induced diseases in people - over 300 000). The influence of radiation directly correlates with the earthquake force. Such a conclusion is underpinned by indoor radon monitoring data for Yerevan since 1987 (120 km from epicenter) 5450 measurements and multivariate analysis with identification of cause-and-effect linkages between geo dynamics of indoor radon under stable and conditions of Earth crust, behavior of radon in different geological mediums during earthquakes, levels of room radon concentrations and effective equivalent dose of radiation impact of radiation dose on health and statistical data on public health provided by the Ministry of Health. The following hitherto unexplained facts can be considered as consequences of prolonged radiation influence on human organism: long-lasting state of apathy and indifference typical of the population of Armenia during the period of more than a year after the earthquake, prevalence of malignant cancer forms in disaster zones, dominating lung cancer and so on. All urban territories of seismically active regions are exposed to the threat of natural earthquake-provoked radiation influence

  9. Quantification of social contributions to earthquake mortality

    Science.gov (United States)

    Main, I. G.; NicBhloscaidh, M.; McCloskey, J.; Pelling, M.; Naylor, M.

    2013-12-01

    Death tolls in earthquakes, which continue to grow rapidly, are the result of complex interactions between physical effects, such as strong shaking, and the resilience of exposed populations and supporting critical infrastructures and institutions. While it is clear that the social context in which the earthquake occurs has a strong effect on the outcome, the influence of this context can only be exposed if we first decouple, as much as we can, the physical causes of mortality from our consideration. (Our modelling assumes that building resilience to shaking is a social factor governed by national wealth, legislation and enforcement and governance leading to reduced levels of corruption.) Here we attempt to remove these causes by statistically modelling published mortality, shaking intensity and population exposure data; unexplained variance from this physical model illuminates the contribution of socio-economic factors to increasing earthquake mortality. We find that this variance partitions countries in terms of basic socio-economic measures and allows the definition of a national vulnerability index identifying both anomalously resilient and anomalously vulnerable countries. In many cases resilience is well correlated with GDP; people in the richest countries are unsurprisingly safe from even the worst shaking. However some low-GDP countries rival even the richest in resilience, showing that relatively low cost interventions can have a positive impact on earthquake resilience and that social learning between these countries might facilitate resilience building in the absence of expensive engineering interventions.
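    A hedged sketch of the decoupling step described above: fit a simple physical model of mortality and read country-level residuals as a relative vulnerability index. The model form, column layout and all numbers below are illustrative placeholders, not the authors' data.

```python
import numpy as np

# Placeholder event table: each row is one earthquake.
events = np.array([
    # log10(exposed population), mean shaking intensity (MMI),
    # log10(deaths + 1), country index
    [5.8, 7.5, 2.9, 0],
    [6.2, 8.0, 3.8, 1],
    [5.1, 6.5, 1.2, 0],
    [6.0, 7.8, 4.2, 2],
    [5.5, 7.0, 2.0, 1],
])
X = np.column_stack([np.ones(len(events)), events[:, 0], events[:, 1]])
y = events[:, 2]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # simple "physical" model fit
residuals = y - X @ coef                       # mortality the model cannot explain

# Average residual per country = crude relative vulnerability index:
# positive means more deaths than the physical predictors alone would suggest.
for country in np.unique(events[:, 3]).astype(int):
    idx = events[:, 3] == country
    print(f"country {country}: vulnerability index {residuals[idx].mean():+.2f}")
```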

  10. Towards the Future "Earthquake" School in the Cloud: Near-real Time Earthquake Games Competition in Taiwan

    Science.gov (United States)

    Chen, K. H.; Liang, W. T.; Wu, Y. F.; Yen, E.

    2014-12-01

    To prepare for future threats from natural disasters, it is important to understand how a disaster happened, why lives were lost, and what lessons have been learned. In this way, the attitude of society toward natural disasters can be transformed from training to learning. The citizen-seismologists-in-Taiwan project is designed to elevate the quality of earthquake science education by means of incorporating earthquake/tsunami stories and near-real time earthquake games competition into the traditional curricula in schools. Through pilot courses and professional development workshops, we have worked closely with teachers from elementary, junior high, and senior high schools to design workable teaching plans through a practical operation of seismic monitoring at home or school. We will introduce how 9-year-olds do P- and S-wave picking and measure seismic intensity through an interactive learning platform, how scientists and school teachers work together, and how we create an environment to facilitate continuous learning (i.e., a near-real time earthquake games competition) to make earthquake science fun.
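    The classroom calculation behind P- and S-wave picking can be stated in a few lines; the crustal velocities below are assumed averages, not values from the project:

```python
# Rough epicentral distance from an S-minus-P time, the classic classroom
# calculation behind P- and S-wave picking (velocities are assumed averages).
VP, VS = 6.0, 3.5          # assumed average crustal P and S velocities, km/s

def distance_from_sp(sp_seconds):
    """Distance (km) implied by the S-P arrival-time difference."""
    return sp_seconds * (VP * VS) / (VP - VS)

print(f"S-P = 10 s  ->  ~{distance_from_sp(10.0):.0f} km from the epicenter")
```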

  11. Earthquake impact scale

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. On the basis of the estimated cost of damage, one is most suitable for domestic events; the other, on the basis of estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lead to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake-resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. Useful alerts should
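    Because the thresholds are quoted explicitly above, the alerting logic reduces to a small lookup. The sketch below encodes the fatality and loss thresholds as listed; the function names and the rule of taking the higher of the two criteria are illustrative assumptions.

```python
# Sketch of the dual-criteria alert levels quoted above: fatality thresholds
# of 1 / 100 / 1,000 and loss thresholds of $1M / $100M / $1B for the
# yellow / orange / red levels. Taking the higher of the two criteria is an
# illustrative assumption.
LEVELS = ["green", "yellow", "orange", "red"]

def fatality_level(fatalities):
    return sum(fatalities >= t for t in (1, 100, 1000))

def loss_level(losses_usd):
    return sum(losses_usd >= t for t in (1e6, 1e8, 1e9))

def alert(fatalities, losses_usd):
    return LEVELS[max(fatality_level(fatalities), loss_level(losses_usd))]

print(alert(fatalities=30, losses_usd=5e8))   # -> "orange" (driven by losses)
```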

  12. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
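    A minimal sketch of the dispersion check that motivates the negative-binomial description: compare the mean and variance of counts per time window and, if the counts are overdispersed, estimate the two NBD parameters by the method of moments. The synthetic event times and the bin width below are placeholders for a real catalogue.

```python
import numpy as np

# Overdispersion check on counts per time window. Poisson predicts
# variance == mean; clustered seismicity typically gives variance >> mean.
rng = np.random.default_rng(2)
background = rng.uniform(0, 1000, 400)                 # synthetic background events
bursts = np.concatenate([t + rng.exponential(0.5, 60)  # synthetic aftershock bursts
                         for t in rng.uniform(0, 1000, 8)])
times = np.sort(np.concatenate([background, bursts]))

counts, _ = np.histogram(times, bins=np.arange(0, 1001, 10))   # 10-unit windows
mean, var = counts.mean(), counts.var(ddof=1)
print(f"mean = {mean:.2f}, variance = {var:.2f}")
print(f"dispersion index (var/mean) = {var / mean:.2f}  "
      "(≈1 for Poisson, >1 indicates NBD-like clustering)")

# Method-of-moments NBD parameters (only defined when var > mean):
if var > mean:
    r = mean**2 / (var - mean)       # NBD size parameter
    p = mean / var                   # NBD probability parameter
    print(f"NBD moment estimates: r = {r:.2f}, p = {p:.2f}")
```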

  13. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the (1) study of earthquakes; (2) origin, propagation, and energy of seismic phenomena; (3) prediction of these phenomena; and (4) investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  14. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered as a phenomenon of wave energy radiation by rupture (fracture) of the solid Earth. However, the physics of the dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much of the former investigation in seismology evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable.

  15. Exploring the Impact of Prior Knowledge and Appropriate Feedback on Students' Perceived Cognitive Load and Learning Outcomes: Animation-Based Earthquakes Instruction

    Science.gov (United States)

    Yeh, Ting-Kuang; Tseng, Kuan-Yun; Cho, Chung-Wen; Barufaldi, James P.; Lin, Mei-Shin; Chang, Chun-Yen

    2012-01-01

    The aim of this study was to develop an animation-based curriculum and to evaluate the effectiveness of animation-based instruction; the report involved the assessment of prior knowledge and the appropriate feedback approach, for the purpose of reducing perceived cognitive load and improving learning. The curriculum was comprised of five subunits…

  16. Turkish Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  17. Organizational changes at Earthquakes & Volcanoes

    Science.gov (United States)

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  18. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of the scientific content, by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what are the principles underlying that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results into a choreographic model with the aim to convert earthquake sound to a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience into a multisensory exploration of the earthquake phenomenon, through the stimulation of the hearing, eyesight and perception of the movements (neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  19. Seismic resistance of equipment and building service systems: review of earthquake damage design requirements, and research applications in the USA

    International Nuclear Information System (INIS)

    Skjei, R.E.; Chakravartula, B.C.; Yanev, P.I.

    1979-01-01

    The history of earthquake damage and the resulting code design requirements for earthquake hazard mitigation for equipment in the USA is reviewed. Earthquake damage to essential service systems is summarized; observations for the 1964 Alaska and the 1971 San Fernando, California, earthquakes are stressed, and information from other events is included. USA building codes that reflect lessons learned from these earthquakes are discussed; brief summaries of widely used codes are presented. In conclusion there is a discussion of the desirability of adapting advanced technological concepts from the nuclear industry to equipment in conventional structures. (author)

  20. PAGER--Rapid assessment of an earthquake's impact

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.

  1. The 1976 Tangshan earthquake

    Science.gov (United States)

    Fang, Wang

    1979-01-01

    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m., Beijing (Peking) local time, and had magnitude 7.8, a focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt.

  2. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing the earthquake culture. Iran was considered as a research case study and fifteen large earthquake disasters in Iran were investigated and analyzed over more than a century-time period. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  3. The mechanism of earthquake

    Science.gov (United States)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    The physical mechanism of earthquakes remains a challenging issue to be clarified. Seismologists used to attribute shallow earthquakes to the elastic rebound of crustal rocks. The seismic energy calculated following the elastic rebound theory and with the data of experimental results upon rocks, however, shows a large discrepancy with measurement — a fact that has been dubbed “the heat flow paradox”. For the intermediate-focus and deep-focus earthquakes, both occurring in the region of the mantle, there is no reasonable explanation either. This paper will discuss the physical mechanism of earthquakes from a new perspective, starting from the fact that both the crust and the mantle are discrete collective systems of matter with slow dynamics, as well as from the basic principles of physics, especially some new concepts of condensed matter physics that have emerged in recent years. (1) Stress distribution in the Earth's crust: Without taking the tectonic force into account, according to the rheological principle of “everything flows”, the normal stress and transverse stress must be balanced due to the effect of gravitational pressure over a long period of time; thus no differential stress in the original crustal rocks is to be expected. The tectonic force is successively transferred and accumulated via stick-slip motions of rock blocks to squeeze the fault gouge and is then exerted upon other rock blocks. The superposition of such additional lateral tectonic force and the original stress gives rise to the real-time stress in crustal rocks. The mechanical characteristics of fault gouge are different from those of rocks, as it consists of granular matter. The elastic moduli of the fault gouges are much less than those of rocks, and they become larger with increasing pressure. This peculiarity of the fault gouge leads to a tectonic force increasing with depth in a nonlinear fashion. The distribution and variation of the tectonic stress in the crust are specified. (2) The

  4. Homogeneous catalogs of earthquakes.

    Science.gov (United States)

    Knopoff, L; Gardner, J K

    1969-08-01

    The usual bias in earthquake catalogs against shocks of small magnitudes can be removed by testing the randomness of the magnitudes of successive shocks. The southern California catalog, 1933-1967, is found to be unbiased in the sense of the test at magnitude 4 or above; the cutoff is improved to M = 3 for the subcatalog 1953-1967.
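    One way to implement the idea, sketched under stated assumptions: scan candidate cutoff magnitudes and accept the first one above which successive magnitudes show no detectable serial dependence. The lag-1 rank correlation used here is a convenient stand-in and not necessarily the statistic Knopoff and Gardner applied; the synthetic catalogue is only a usage placeholder.

```python
import numpy as np
from scipy.stats import spearmanr

# Find a magnitude cutoff above which successive magnitudes look serially
# independent (a proxy for an unbiased, complete subcatalogue).
def completeness_cutoff(magnitudes, cutoffs, alpha=0.05):
    mags = np.asarray(magnitudes)
    for m_c in cutoffs:
        above = mags[mags >= m_c]
        if above.size < 30:          # too few events to test
            break
        rho, p_value = spearmanr(above[:-1], above[1:])
        if p_value > alpha:          # no detectable lag-1 dependence
            return m_c, rho, p_value
    return None

# Toy usage with a synthetic, unbiased catalogue (exponential magnitudes,
# consistent with a Gutenberg-Richter law); a real biased catalogue would
# typically return a higher cutoff.
rng = np.random.default_rng(3)
synthetic = 2.0 + rng.exponential(1.0 / np.log(10), 5000)
print(completeness_cutoff(synthetic, cutoffs=np.arange(2.0, 5.1, 0.25)))
```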

  5. HOMOGENEOUS CATALOGS OF EARTHQUAKES*

    Science.gov (United States)

    Knopoff, Leon; Gardner, J. K.

    1969-01-01

    The usual bias in earthquake catalogs against shocks of small magnitudes can be removed by testing the randomness of the magnitudes of successive shocks. The southern California catalog, 1933-1967, is found to be unbiased in the sense of the test at magnitude 4 or above; the cutoff is improved to M = 3 for the subcatalog 1953-1967. PMID:16578700

  6. Earthquake in Haiti

    DEFF Research Database (Denmark)

    Holm, Isak Winkel

    2012-01-01

    In the vocabulary of modern disaster research, Heinrich von Kleist's seminal short story "The Earthquake in Chile" from 1806 is a tale of disaster vulnerability. The story is not just about a natural disaster destroying the innocent city of Santiago but also about the ensuing social disaster...

  7. Earthquake-proof plants

    International Nuclear Information System (INIS)

    Francescutti, P.

    2008-01-01

    In the wake of the damage suffered by the Kashiwazaki-Kariwa nuclear power plant as a result of an earthquake last July, this article looks at the seismic risk affecting the Spanish plants and the safety measures in place to prevent it. (Author)

  8. Earthquakes and market crashes

    Indian Academy of Sciences (India)

    We find prominent similarities in the features of the time series for the (model earthquakes or) overlap of two Cantor sets when one set moves with uniform relative velocity over the other and the time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations.

  9. The HayWired earthquake scenario—Earthquake hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  11. Simulated earthquake ground motions

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.; Gasparini, D.A.

    1977-01-01

    The paper reviews current methods for generating synthetic earthquake ground motions. Emphasis is on the special requirements demanded of procedures to generate motions for use in nuclear power plant seismic response analysis. Specifically, very close agreement is usually sought between the response spectra of the simulated motions and prescribed, smooth design response spectra. The features and capabilities of the computer program SIMQKE, which has been widely used in power plant seismic work are described. Problems and pitfalls associated with the use of synthetic ground motions in seismic safety assessment are also pointed out. The limitations and paucity of recorded accelerograms together with the widespread use of time-history dynamic analysis for obtaining structural and secondary systems' response have motivated the development of earthquake simulation capabilities. A common model for synthesizing earthquakes is that of superposing sinusoidal components with random phase angles. The input parameters for such a model are, then, the amplitudes and phase angles of the contributing sinusoids as well as the characteristics of the variation of motion intensity with time, especially the duration of the motion. The amplitudes are determined from estimates of the Fourier spectrum or the spectral density function of the ground motion. These amplitudes may be assumed to be varying in time or constant for the duration of the earthquake. In the nuclear industry, the common procedure is to specify a set of smooth response spectra for use in aseismic design. This development and the need for time histories have generated much practical interest in synthesizing earthquakes whose response spectra 'match', or are compatible with a set of specified smooth response spectra
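    The superposition procedure described above can be sketched in a few lines of code. The following is a hedged illustration only, not the SIMQKE program: it sums sinusoids with random phase angles, uses placeholder amplitudes where a real application would derive them from a target Fourier or response spectrum, and applies a simple trapezoidal intensity envelope.

      import numpy as np

      def synthesize_motion(freqs_hz, amplitudes, duration=20.0, dt=0.01, seed=0):
          """Sum sinusoids with random phases and shape them with an intensity envelope."""
          rng = np.random.default_rng(seed)
          t = np.arange(0.0, duration, dt)
          phases = rng.uniform(0.0, 2.0 * np.pi, size=len(freqs_hz))
          acc = np.zeros_like(t)
          for f, amp, ph in zip(freqs_hz, amplitudes, phases):
              acc += amp * np.sin(2.0 * np.pi * f * t + ph)
          # Trapezoidal build-up / strong-motion / decay envelope (illustrative shape only).
          envelope = np.interp(t, [0.0, 0.2 * duration, 0.6 * duration, duration],
                               [0.0, 1.0, 1.0, 0.0])
          return t, envelope * acc

      # Flat amplitude spectrum between 0.5 and 20 Hz, purely for demonstration.
      t, acc = synthesize_motion(np.linspace(0.5, 20.0, 40), np.ones(40))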

  12. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has a moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average recurrence period is about 75 years. For this reason, historical earthquake research has been of special importance in Austria. Interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant should start up. The applied methods are briefly introduced along with the most important studies and, as an example of a recently completed case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  13. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs

  14. Earthquake Drill using the Earthquake Early Warning System at an Elementary School

    Science.gov (United States)

    Oki, Satoko; Yazaki, Yoshiaki; Koketsu, Kazuki

    2010-05-01

    Japan frequently suffers from many kinds of disasters such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. On average, we lose about 120 people a year to natural hazards in this decade. Above all, earthquakes are noteworthy, since one may kill thousands of people in a moment, as in Kobe in 1995. People know that we may have "a big one" some day as long as we live on this land, and they know what to do: retrofit houses, anchor heavy furniture to walls, add latches to kitchen cabinets, and prepare emergency packs. Yet most of them do not take action, and the result is the loss of many lives. It is only the victims who learn something from an earthquake, and that knowledge has never become shared national lore. One of the most essential ways to reduce the damage is to educate the general public to be able to make sound decisions on what to do at the moment an earthquake hits. This requires knowledge of the background of the ongoing phenomenon. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore issued a public call to choose several model areas in which scientific education would be brought to local elementary schools. This presentation reports on a year and a half of courses that we held at the model elementary school in the Tokyo Metropolitan Area. The tectonic setting of this area is very complicated: the Pacific and Philippine Sea plates are subducting beneath the North America and Eurasia plates. The subduction of the Philippine Sea plate causes mega-thrust earthquakes such as the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to have a 70 % probability of occurring within 30 years. This is of immediate concern for the devastating loss of life and property because the Tokyo urban region now has a population of 42 million and is the center of approximately 40 % of the nation's activities, which may cause great global

  15. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
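    As a hedged illustration of the bin-based scoring idea described above (not the official RELM/CSEP implementation), the joint log-likelihood of observed earthquake counts under a forecast of expected rates per bin can be computed by treating each bin as an independent Poisson variable:

      import numpy as np
      from scipy.special import gammaln

      def poisson_log_likelihood(forecast_rates, observed_counts):
          """Sum of log Poisson probabilities over all space-magnitude bins."""
          lam = np.asarray(forecast_rates, dtype=float)
          n = np.asarray(observed_counts, dtype=float)
          return float(np.sum(-lam + n * np.log(lam) - gammaln(n + 1.0)))

      # Toy example with four bins: forecast rates versus observed counts.
      print(poisson_log_likelihood([0.1, 0.5, 2.0, 0.05], [0, 1, 3, 0]))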

  16. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  17. Earthquake forecasting in Italy, before and after Umbria-Marche seismic sequence 1997. A review of the earthquake occurrence modeling at different spatio-temporal-magnitude scales.

    Directory of Open Access Journals (Sweden)

    W. Marzocchi

    2008-06-01

    The main goal of this work is to review the scientific research carried out before and after the Umbria-Marche sequence related to earthquake forecasting/prediction in Italy. In particular, I focus attention on models that aim to address three main practical questions: was (is) Umbria-Marche a region with a high probability of occurrence of a destructive earthquake? Was precursory activity recorded before the mainshock(s)? What was our capability to model the spatio-temporal-magnitude evolution of that seismic sequence? The models are reviewed, pointing out what we have learned after the Umbria-Marche earthquakes in terms of physical understanding of the earthquake occurrence process, and of improving our capability to forecast earthquakes and to track seismic sequences in real time.

  18. Identified EM Earthquake Precursors

    Science.gov (United States)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance making it a high quality insulator. Penetrative flow could not be corroborated as well, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. The antennae have mobility and observations were noted for

  19. Pain after earthquake

    Directory of Open Access Journals (Sweden)

    Angeletti Chiara

    2012-06-01

    Introduction: On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people, and induced significant damage to more than 10,000 buildings in the L'Aquila region. Objectives: This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods: 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results: A third of patients reported pain (a prevalence of 34.6%). More than half of pain patients reported severe pain (58.8%). Analgesic agents were limited to available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth week, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions: This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and current drug treatments in this region may be adequate for emergency situations.

  20. Do Earthquakes Shake Stock Markets?

    Science.gov (United States)

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  1. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  2. Earthquake resistant design of structures

    International Nuclear Information System (INIS)

    Choi, Chang Geun; Kim, Gyu Seok; Lee, Dong Geun

    1990-02-01

    This book covers the occurrence of earthquakes and earthquake damage analysis, the equivalent static analysis method and its application, dynamic analysis methods such as time-history analysis by mode superposition and by direct integration, and design spectrum analysis for earthquake-resistant design in Korea, including the analysis model and vibration modes, calculation of base shear, calculation of story seismic loads, and combination of analysis results.

  3. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

    Science.gov (United States)

    Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

    2014-01-01

    On March 27, 1964, at 5:36 p.m., a magnitude 9.2 earthquake, the largest recorded earthquake in U.S. history, struck southcentral Alaska (fig. 1). The Great Alaska Earthquake (also known as the Good Friday Earthquake) occurred at a pivotal time in the history of earth science, and helped lead to the acceptance of plate tectonic theory (Cox, 1973; Brocher and others, 2014). All large subduction zone earthquakes are understood through insights learned from the 1964 event, and observations and interpretations of the earthquake have influenced the design of infrastructure and seismic monitoring systems now in place. The earthquake caused extensive damage across the State, and triggered local tsunamis that devastated the Alaskan towns of Whittier, Valdez, and Seward. In Anchorage, the main cause of damage was ground shaking, which lasted approximately 4.5 minutes. Many buildings could not withstand this motion and were damaged or collapsed even though their foundations remained intact. More significantly, ground shaking triggered a number of landslides along coastal and drainage valley bluffs underlain by the Bootlegger Cove Formation, a composite of facies containing variably mixed gravel, sand, silt, and clay which were deposited over much of upper Cook Inlet during the Late Pleistocene (Ulery and others, 1983). Cyclic (or strain) softening of the more sensitive clay facies caused overlying blocks of soil to slide sideways along surfaces dipping by only a few degrees. This guide is the document version of an interactive web map that was created as part of the commemoration events for the 50th anniversary of the 1964 Great Alaska Earthquake. It is accessible at the U.S. Geological Survey (USGS) Alaska Science Center website: http://alaska.usgs.gov/announcements/news/1964Earthquake/. The website features a map display with suggested tour stops in Anchorage, historical photographs taken shortly after the earthquake, repeat photography of selected sites, scanned documents

  4. Earthquake Forecasting System in Italy

    Science.gov (United States)

    Falcone, G.; Marzocchi, W.; Murru, M.; Taroni, M.; Faenza, L.

    2017-12-01

    In Italy, after the 2009 L'Aquila earthquake, a procedure was developed for gathering and disseminating authoritative information about the time dependence of seismic hazard to help communities prepare for a potentially destructive earthquake. The most striking time dependency of the earthquake occurrence process is the time clustering, which is particularly pronounced in time windows of days and weeks. The Operational Earthquake Forecasting (OEF) system that is developed at the Seismic Hazard Center (Centro di Pericolosità Sismica, CPS) of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) is the authoritative source of seismic hazard information for Italian Civil Protection. The philosophy of the system rests on a few basic concepts: transparency, reproducibility, and testability. In particular, the transparent, reproducible, and testable earthquake forecasting system developed at CPS is based on ensemble modeling and on a rigorous testing phase. This phase is carried out according to the guidance proposed by the Collaboratory for the Study of Earthquake Predictability (CSEP, an international infrastructure aimed at evaluating earthquake prediction and forecast models quantitatively through purely prospective and reproducible experiments). In the OEF system, the two most popular short-term models were used: the Epidemic-Type Aftershock Sequences (ETAS) and the Short-Term Earthquake Probabilities (STEP). Here, we report the results from OEF's 24-hour earthquake forecasting during the main phases of the 2016-2017 sequence that occurred in the Central Apennines (Italy).

  5. Earthquake forecasting and its verification

    Directory of Open Access Journals (Sweden)

    J. R. Holliday

    2005-01-01

    No proven method is currently available for the reliable short-time prediction of earthquakes (minutes to months). However, it is possible to make probabilistic hazard assessments for earthquake risk. In this paper we discuss a new approach to earthquake forecasting based on a pattern informatics (PI) method which quantifies temporal variations in seismicity. The output, which is based on an association of small earthquakes with future large earthquakes, is a map of areas in a seismogenic region ("hotspots") where earthquakes are forecast to occur in a future 10-year time span. This approach has been successfully applied to California, to Japan, and on a worldwide basis. Because a sharp decision threshold is used, these forecasts are binary: an earthquake is forecast either to occur or to not occur. The standard approach to the evaluation of a binary forecast is the use of the relative (or receiver) operating characteristic (ROC) diagram, which is a more restrictive test and less subject to bias than maximum likelihood tests. To test our PI method, we made two types of retrospective forecasts for California. The first is the PI method and the second is a relative intensity (RI) forecast based on the hypothesis that future large earthquakes will occur where most smaller earthquakes have occurred in the recent past. While both retrospective forecasts are for the ten-year period 1 January 2000 to 31 December 2009, we performed an interim analysis 5 years into the forecast. The PI method outperforms the RI method under most circumstances.
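    The ROC-style evaluation of a binary forecast mentioned above can be sketched as follows. This is a minimal illustration with invented cell values, not the authors' PI code: each spatial cell carries a forecast score and a flag for whether a target earthquake actually occurred, and sweeping a decision threshold yields hit rates versus false alarm rates.

      import numpy as np

      def roc_points(scores, observed, thresholds):
          """Return (false alarm rate, hit rate) for each decision threshold."""
          scores = np.asarray(scores, dtype=float)
          observed = np.asarray(observed, dtype=bool)
          fpr, tpr = [], []
          for th in thresholds:
              alarm = scores >= th
              tpr.append(np.mean(alarm[observed]))    # fraction of earthquake cells alarmed
              fpr.append(np.mean(alarm[~observed]))   # fraction of quiet cells falsely alarmed
          return np.array(fpr), np.array(tpr)

      # Five cells with made-up forecast scores; True marks cells where a quake occurred.
      fpr, tpr = roc_points([0.9, 0.7, 0.2, 0.1, 0.05],
                            [True, False, True, False, False],
                            thresholds=np.linspace(0.0, 1.0, 11))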

  6. Researches on Application of GPS to Earthquake Monitoring and Prediction

    Directory of Open Access Journals (Sweden)

    Wanju BO

    2007-10-01

    The earliest research on the application of GPS to earthquake monitoring and prediction in China began in the 1980s and was limited to learning the relevant technology from other countries and running tests with a few instruments. With improvements in data-processing software and falling hardware costs, several local GPS networks were gradually set up by the end of the 1990s, and more systematic GPS monitoring, data processing, and application research have since been carried out. In this paper, three research examples of the application of GPS to earthquake monitoring and prediction are presented.

  7. UCERF3: A new earthquake forecast for California's complex fault system

    Science.gov (United States)

    Field, Edward H.; ,

    2015-01-01

    With innovations, fresh data, and lessons learned from recent earthquakes, scientists have developed a new earthquake forecast model for California, a region under constant threat from potentially damaging events. The new model, referred to as the third Uniform California Earthquake Rupture Forecast, or "UCERF" (http://www.WGCEP.org/UCERF3), provides authoritative estimates of the magnitude, location, and likelihood of earthquake fault rupture throughout the state. Overall the results confirm previous findings, but with some significant changes because of model improvements. For example, compared to the previous forecast (Uniform California Earthquake Rupture Forecast 2), the likelihood of moderate-sized earthquakes (magnitude 6.5 to 7.5) is lower, whereas that of larger events is higher. This is because of the inclusion of multifault ruptures, where earthquakes are no longer confined to separate, individual faults, but can occasionally rupture multiple faults simultaneously. The public-safety implications of this and other model improvements depend on several factors, including site location and type of structure (for example, family dwelling compared to a long-span bridge). Building codes, earthquake insurance products, emergency plans, and other risk-mitigation efforts will be updated accordingly. This model also serves as a reminder that damaging earthquakes are inevitable for California. Fortunately, there are many simple steps residents can take to protect lives and property.

  8. Earthquakes Threaten Many American Schools

    Science.gov (United States)

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  9. Earthquake Preparedness Checklist for Schools.

    Science.gov (United States)

    1999

    A brochure provides a checklist highlighting the important questions and activities that should be addressed and undertaken as part of a school safety and preparedness program for earthquakes. It reminds administrators and other interested parties on what not to forget in preparing schools for earthquakes, such as staff knowledge needs, evacuation…

  10. Earthquake Loss Estimation Uncertainties

    Science.gov (United States)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of loss assessment following strong earthquakes when worldwide systems are applied in emergency mode. Timely and correct action just after an event can result in significant benefits in saving lives. In this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offering humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials who are in charge of emergency response at national and international levels often lack prompt and reliable information on the disaster scope. Uncertainties on the parameters used in the estimation process are numerous and large: knowledge about physical phenomena and uncertainties on the parameters used to describe them; global adequacy of modeling techniques to the actual physical phenomena; actual distribution of the population at risk at the very time of the shaking (with respect to immediate threat: buildings or the like); knowledge about the source of shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not out of the reach of engineers for a large portion of the building stock); while a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. How the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is far from precisely known. The paper analyzes the influence of uncertainties in the determination of strong event parameters by alert seismological surveys and of the simulation models used at all stages, from estimating shaking intensity

  11. Generation of earthquake signals

    International Nuclear Information System (INIS)

    Kjell, G.

    1994-01-01

    Seismic verification can be performed either as a full scale test on a shaker table or as numerical calculations. In both cases it is necessary to have an earthquake acceleration time history. This report describes generation of such time histories by filtering white noise. Analogue and digital filtering methods are compared. Different methods of predicting the response spectrum of a white noise signal filtered by a band-pass filter are discussed. Prediction of both the average response level and the statistical variation around this level are considered. Examples with both the IEEE 301 standard response spectrum and a ground spectrum suggested for Swedish nuclear power stations are included in the report
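    A minimal digital sketch of the approach the report describes is given below. Assumptions of this illustration: a fourth-order Butterworth band-pass filter with illustrative corner frequencies and a simple build-up/decay envelope; the report itself compares several analogue and digital filtering methods and response-spectrum predictions.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def filtered_noise_history(duration=20.0, dt=0.005, f_low=0.5, f_high=15.0, seed=1):
          """Band-pass filtered Gaussian white noise shaped by an intensity envelope."""
          rng = np.random.default_rng(seed)
          t = np.arange(0.0, duration, dt)
          noise = rng.standard_normal(len(t))
          b, a = butter(4, [f_low, f_high], btype="bandpass", fs=1.0 / dt)
          acc = filtfilt(b, a, noise)                             # zero-phase filtering
          envelope = np.interp(t, [0.0, 2.0, 10.0, duration], [0.0, 1.0, 1.0, 0.0])
          return t, envelope * acc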

  12. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  13. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  14. Early Earthquakes of the Americas

    Science.gov (United States)

    Ni, James

    2004-11-01

    Robert Kovach's second book looks at the interplay of earthquake and volcanic events, archeology, and history in the Americas. Throughout history, major earthquakes have caused the deaths of millions of people and have damaged countless cities. Earthquakes undoubtedly damaged prehistoric cities in the Americas, and evidence of these events could be preserved in archeological records. Kovach asks: did indigenous native cultures (Indians of the Pacific Northwest, Aztecs, Mayas, and Incas) document their natural history? Some events have been explicitly documented, for example, in Mayan codices, but many may have been recorded as myth and legend. Kovach's discussions of how early cultures dealt with fearful events such as earthquakes and volcanic eruptions are colorful, informative, and entertaining, and include, for example, a depiction of how the Maya would talk to maize plants in their fields during earthquakes to reassure them.

  15. Are Earthquakes a Critical Phenomenon?

    Science.gov (United States)

    Ramos, O.

    2014-12-01

    Earthquakes, granular avalanches, superconducting vortices, solar flares, and even stock markets are known to evolve through power-law distributed events. For decades, the formalism of equilibrium phase transitions has labeled these phenomena as critical, which implies that they are also unpredictable. This work revises these ideas and uses earthquakes as the paradigm to demonstrate that slowly driven systems evolving through uncorrelated and power-law distributed avalanches (UPLA) are not necessarily critical systems, and therefore not necessarily unpredictable. By linking the correlation length to the pdf of the distribution, and comparing it with the one obtained at a critical point, a condition of criticality is introduced. Simulations in the classical Olami-Feder-Christensen (OFC) earthquake model confirm the findings, showing that earthquakes are not a critical phenomenon. However, one single catastrophic earthquake may show critical properties and, paradoxically, the emergence of this temporal critical behaviour may eventually carry precursory signs of catastrophic events.
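    For readers unfamiliar with the Olami-Feder-Christensen (OFC) model referred to above, the sketch below is a minimal, illustrative implementation only (open boundaries, conservation parameter alpha; the parameter values are arbitrary and the code is written for clarity, not for the large simulations used in such studies).

      import numpy as np

      def ofc_avalanche_sizes(L=32, alpha=0.2, n_events=2000, seed=0):
          """Return the avalanche size of each driven event in a simple OFC automaton."""
          rng = np.random.default_rng(seed)
          stress = rng.uniform(0.0, 1.0, size=(L, L))
          sizes = []
          for _ in range(n_events):
              stress += 1.0 - stress.max()              # drive uniformly up to the next failure
              size = 0
              while True:
                  over = np.argwhere(stress >= 1.0)
                  if len(over) == 0:
                      break
                  for i, j in over:
                      s = stress[i, j]
                      stress[i, j] = 0.0
                      for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                          ni, nj = i + di, j + dj
                          if 0 <= ni < L and 0 <= nj < L:   # open boundaries dissipate stress
                              stress[ni, nj] += alpha * s
                      size += 1
              sizes.append(size)
          return np.array(sizes)                        # sizes approach a power-law distribution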

  16. Lecture Demonstrations on Earthquakes for K-12 Teachers and Students

    Science.gov (United States)

    Dry, M. D.; Patterson, G. L.

    2005-12-01

    geophone, a touch-screen monitor, and various manipulatives. CERI is also developing suitcase kits and activities for teachers to borrow and use in their classrooms. The suitcase kits include activities based on state learning standards, such as layers of the Earth and plate tectonics. Items included in the suitcase modules include a shake table and dollhouse, an oscilloscope and geophone, a resonance model, a Slinky, Silly putty, Popsicle sticks, and other items. Almost all of the activities feature a lecture demonstration component. These projects would not be possible without leveraged funding from the Mid-America Earthquake Center (MAEC) and the Center for Earthquake Research and Information, with additional funding from the National Earthquake Hazards Reduction Program (NEHRP).

  17. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global

  18. The CATDAT damaging earthquakes database

    Science.gov (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto (214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  19. From a physical approach to earthquake prediction, towards long and short term warnings ahead of large earthquakes

    Science.gov (United States)

    Stefansson, R.; Bonafede, M.

    2012-04-01

    For 20 years the South Iceland Seismic Zone (SISZ) was a test site for multinational earthquake prediction research, partly bridging the gap between laboratory tests samples, and the huge transform zones of the Earth. The approach was to explore the physics of processes leading up to large earthquakes. The book Advances in Earthquake Prediction, Research and Risk Mitigation, by R. Stefansson (2011), published by Springer/PRAXIS, and an article in the August issue of the BSSA by Stefansson, M. Bonafede and G. Gudmundsson (2011) contain a good overview of the findings, and more references, as well as examples of partially successful long and short term warnings based on such an approach. Significant findings are: Earthquakes that occurred hundreds of years ago left scars in the crust, expressed in volumes of heterogeneity that demonstrate the size of their faults. Rheology and stress heterogeneity within these volumes are significantly variable in time and space. Crustal processes in and near such faults may be observed by microearthquake information decades before the sudden onset of a new large earthquake. High pressure fluids of mantle origin may in response to strain, especially near plate boundaries, migrate upward into the brittle/elastic crust to play a significant role in modifying crustal conditions on a long and short term. Preparatory processes of various earthquakes can not be expected to be the same. We learn about an impending earthquake by observing long term preparatory processes at the fault, finding a constitutive relationship that governs the processes, and then extrapolating that relationship into near space and future. This is a deterministic approach in earthquake prediction research. Such extrapolations contain many uncertainties. However the long time pattern of observations of the pre-earthquake fault process will help us to put probability constraints on our extrapolations and our warnings. The approach described is different from the usual

  20. The Alaska earthquake, March 27, 1964: lessons and conclusions

    Science.gov (United States)

    Eckel, Edwin B.

    1970-01-01

    subsidence was superimposed on regional tectonic subsidence to heighten the flooding damage. Ground and surface waters were measurably affected by the earthquake, not only in Alaska but throughout the world. Expectably, local geologic conditions largely controlled the extent of structural damage, whether caused directly by seismic vibrations or by secondary effects such as those just described. Intensity was greatest in areas underlain by thick saturated unconsolidated deposits, least on indurated bedrock or permanently frozen ground, and intermediate on coarse well-drained gravel, on morainal deposits, or on moderately indurated sedimentary rocks. Local and even regional geology also controlled the distribution and extent of the earthquake's effects on hydrologic systems. In the conterminous United States, for example, seiches in wells and bodies of surface water were controlled by geologic structures of regional dimension. Devastating as the earthquake was, it had many long-term beneficial effects. Many of these were socioeconomic or engineering in nature; others were of scientific value. Much new and corroborative basic geologic and hydrologic information was accumulated in the course of the earthquake studies, and many new or improved investigative techniques were developed. Chief among these, perhaps, were the recognition that lakes can be used as giant tiltmeters, the refinement of methods for measuring land-level changes by observing displacements of barnacles and other sessile organisms, and the relating of hydrology to seismology by worldwide study of hydroseisms in surface-water bodies and in wells. The geologic and hydrologic lessons learned from studies of the Alaska earthquake also lead directly to better definition of the research needed to further our understanding of earthquakes and of how to avoid or lessen the effects of future ones. Research is needed on the origins and mechanisms of earthquakes, on crustal structure, and on the generation of tsunamis and

  1. Is Your Class a Natural Disaster? It can be... The Real Time Earthquake Education (RTEE) System

    Science.gov (United States)

    Whitlock, J. S.; Furlong, K.

    2003-12-01

    In cooperation with the U.S. Geological Survey (USGS) and its National Earthquake Information Center (NEIC) in Golden, Colorado, we have implemented an autonomous version of the NEIC's real-time earthquake database management and earthquake alert system (Earthworm). This is the same system used professionally by the USGS in its earthquake response operations. Utilizing this system, Penn State University students participating in natural hazard classes receive real-time alerts of worldwide earthquake events on cell phones distributed to the class. The students are then responsible for reacting to actual earthquake events, in real-time, with the same data (or lack thereof) as earthquake professionals. The project was first implemented in Spring 2002, and although it had an initial high intrigue and "coolness" factor, the interest of the students waned with time. Through student feedback, we observed that scientific data presented on its own without an educational context does not foster student learning. In order to maximize the impact of real-time data and the accompanying e-media, the students need to become personally involved. Therefore, in collaboration with the Incorporated Research Institutes of Seismology (IRIS), we have begun to develop an online infrastructure that will help teachers and faculty effectively use real-time earthquake information. The Real-Time Earthquake Education (RTEE) website promotes student learning by integrating inquiry-based education modules with real-time earthquake data. The first module guides the students through an exploration of real-time and historic earthquake datasets to model the most important criteria for determining the potential impact of an earthquake. Having provided the students with content knowledge in the first module, the second module presents a more authentic, open-ended educational experience by setting up an earthquake role-play situation. Through the Earthworm system, we have the ability to "set off
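    The Earthworm-based setup described above is a professional system; as a hedged sketch only, a classroom version of the same idea could poll the USGS public real-time GeoJSON feeds. The feed URL and field names below reflect the publicly documented summary feed and should be treated as assumptions to verify, not as part of the RTEE system itself.

      import requests

      FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_hour.geojson"

      def recent_events(min_magnitude=2.5):
          """Return (place, magnitude, epoch-ms time) for recent events above a threshold."""
          data = requests.get(FEED, timeout=10).json()
          events = []
          for feature in data["features"]:
              props = feature["properties"]
              mag = props.get("mag")
              if mag is not None and mag >= min_magnitude:
                  events.append((props["place"], mag, props["time"]))
          return events

      for place, mag, t in recent_events():
          print(f"M{mag:.1f}  {place}")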

  2. Evaluating the role of large earthquakes on aquifer dynamics using data fusion and knowledge discovery techniques

    Science.gov (United States)

    Friedel, Michael; Cox, Simon; Williams, Charles; Holden, Caroline

    2016-04-01

    Artificial adaptive systems are evaluated for their usefulness in modeling earthquake hydrology of the Canterbury region, NZ. For example, an unsupervised machine-learning technique, self-organizing map, is used to fuse about 200 disparate and sparse data variables (such as, well pressure response, ground acceleration, intensity, shaking, stress and strain; aquifer and well characteristics) associated with the M7.1 Darfield earthquake in 2010 and the M6.3 Christchurch earthquake in 2011. The strength of correlations, determined using cross-component plots, varied between earthquakes with pressure changes more strongly related to dynamic- than static stress-related variables during the M7.1 earthquake, and vice versa during the M6.3. The method highlights the importance of data distribution and that driving mechanisms of earthquake-induced pressure change in the aquifers are not straight forward to interpret. In many cases, data mining revealed that confusion and reduction in correlations are associated with multiple trends in the same plot: one for confined and one for unconfined earthquake response. The autocontractive map and minimum spanning tree techniques are used for grouping variables of similar influence on earthquake hydrology. K-means clustering of neural information identified 5 primary regions influenced by the two earthquakes. The application of genetic doping to a genetic algorithm is used for identifying optimal subsets of variables in formulating predictions of well pressures. Predictions of well pressure changes are compared and contrasted using machine-learning network and symbolic regression models with prediction uncertainty quantified using a leave-one-out cross-validation strategy. These preliminary results provide impetus for subsequent analysis with information from another 100 earthquakes that occurred across the South Island.
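    One of the steps described above, grouping monitoring sites by their earthquake response, can be illustrated with a standard k-means clustering sketch (here using scikit-learn; the variables and values below are invented placeholders, not the study's actual data-fusion inputs).

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      # Rows are wells; columns might be co-seismic pressure change (m), peak ground
      # acceleration (g), and epicentral distance (km) -- synthetic values only.
      X = np.array([[0.80, 0.30, 12.0],
                    [0.70, 0.28, 15.0],
                    [0.10, 0.05, 90.0],
                    [0.20, 0.07, 85.0],
                    [0.50, 0.18, 40.0]])

      X_scaled = StandardScaler().fit_transform(X)        # put variables on a common scale
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_scaled)
      print(labels)                                       # wells grouped by similar response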

  3. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  4. Determination of Design Basis Earthquake ground motion

    International Nuclear Information System (INIS)

    Kato, Muneaki

    1997-01-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, the earthquake response spectrum, and simulated seismic waves. In the appendix of this paper, furthermore, the seismic safety review for nuclear power plants designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  5. Determination of Design Basis Earthquake ground motion

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Muneaki [Japan Atomic Power Co., Tokyo (Japan)

    1997-03-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, the earthquake response spectrum, and simulated seismic waves. In the appendix of this paper, furthermore, the seismic safety review for nuclear power plants designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  6. Designing plants to withstand earthquakes

    International Nuclear Information System (INIS)

    Nedderman, J.

    1995-01-01

    The approach used in Japan to design nuclear plants capable of withstanding earthquakes is described. Earthquakes are classified into two types, S1 and S2. In an S1 earthquake a nuclear plant must be capable of surviving essentially undamaged. In the more severe S2 earthquake, some damage may occur but there should be no release of radioactivity to the outside world. The starting point for the designer is the ground response spectrum of the earthquake, which shows both the ground acceleration and the frequencies of the vibrations. From the ground response spectra, synthetic seismic waves for S1 and S2 earthquakes are developed, which can then be used to analyse a "lumped-mass" model of the reactor building to arrive at floor response spectra. These spectra are then used in further analyses of the design of reactor equipment, piping systems and instrument racks and supports. When a plant is constructed, results from tests with a vibration exciter are used to verify the floor response spectra and principal building resonances. Much of the equipment can be tested on vibrating tables. One large table with a maximum loading capacity of 1000 t is used to test large-scale models of containment vessels, pressure vessels and steam generators. Such tests have shown that the plants have considerable safety margins in their ability to withstand the design basis earthquakes. (UK)
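    As a hedged numerical illustration of the "ground response spectrum" the designer starts from (not the Japanese design procedure itself): for each oscillator period, a damped single-degree-of-freedom system is driven by the ground acceleration record and its peak total acceleration is kept. The sketch below uses the Newmark average-acceleration scheme with unit mass; the input record would be a real or synthetic accelerogram sampled at interval dt.

      import numpy as np

      def response_spectrum(ag, dt, periods, damping=0.05):
          """Peak total acceleration of a damped SDOF oscillator for each period."""
          ag = np.asarray(ag, dtype=float)
          beta, gamma = 0.25, 0.5                      # Newmark average acceleration
          Sa = []
          for T in periods:
              wn = 2.0 * np.pi / T
              c, k = 2.0 * damping * wn, wn ** 2       # unit mass
              u, v = 0.0, 0.0
              a = -ag[0] - c * v - k * u               # relative acceleration at t = 0
              kh = k + gamma / (beta * dt) * c + 1.0 / (beta * dt ** 2)
              peak = abs(a + ag[0])
              for i in range(1, len(ag)):
                  dp = -(ag[i] - ag[i - 1])
                  dph = (dp + (1.0 / (beta * dt) + gamma / beta * c) * v
                         + (1.0 / (2.0 * beta) + dt * (gamma / (2.0 * beta) - 1.0) * c) * a)
                  du = dph / kh
                  dv = (gamma / (beta * dt) * du - gamma / beta * v
                        + dt * (1.0 - gamma / (2.0 * beta)) * a)
                  da = du / (beta * dt ** 2) - v / (beta * dt) - a / (2.0 * beta)
                  u, v, a = u + du, v + dv, a + da
                  peak = max(peak, abs(a + ag[i]))     # total = relative + ground acceleration
              Sa.append(peak)
          return np.array(Sa)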

  7. Ionospheric phenomena before strong earthquakes

    Directory of Open Access Journals (Sweden)

    A. S. Silina

    2001-01-01

    A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5 located less than 500 km from an ionospheric vertical sounding station is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h ≤ 33 km) earthquakes were analysed separately. Data of nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs and Es-spread at the middle latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00 LT–06:00 LT) and of fbEs averaged over the nighttime hours (18:00 LT–06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency which characterizes the degree of small-scale turbulence, it was shown that this value increases 1–4 days before "crust" earthquakes, and it does not change before "deep" earthquakes. Studying Es-spread which manifests itself as diffuse Es track on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1–3 days before the earthquakes; for "deep" earthquakes the effect is more intensive. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere occur for "deep" and "crust" events.

  8. Fracking, wastewater disposal, and earthquakes

    Science.gov (United States)

    McGarr, Arthur

    2016-03-01

    In the modern oil and gas industry, fracking of low-permeability reservoirs has resulted in a considerable increase in the production of oil and natural gas, but these fluid-injection activities also can induce earthquakes. Earthquakes induced by fracking are an inevitable consequence of the injection of fluid at high pressure, where the intent is to enhance permeability by creating a system of cracks and fissures that allow hydrocarbons to flow to the borehole. The micro-earthquakes induced during these highly-controlled procedures are generally much too small to be felt at the surface; indeed, the creation or reactivation of a large fault would be contrary to the goal of enhancing permeability evenly throughout the formation. Accordingly, the few case histories for which fracking has resulted in felt earthquakes have been due to unintended fault reactivation. Of greater consequence for inducing earthquakes, modern techniques for producing hydrocarbons, including fracking, have resulted in considerable quantities of coproduced wastewater, primarily formation brines. This wastewater is commonly disposed by injection into deep aquifers having high permeability and porosity. As reported in many case histories, pore pressure increases due to wastewater injection were channeled from the target aquifers into fault zones that were, in effect, lubricated, resulting in earthquake slip. These fault zones are often located in the brittle crystalline rocks in the basement. Magnitudes of earthquakes induced by wastewater disposal often exceed 4, the threshold for structural damage. Even though only a small fraction of disposal wells induce earthquakes large enough to be of concern to the public, there are so many of these wells that this source of seismicity contributes significantly to the seismic hazard in the United States, especially east of the Rocky Mountains where standards of building construction are generally not designed to resist shaking from large earthquakes.

  9. Geotechnical effects of the 2015 magnitude 7.8 Gorkha, Nepal, earthquake and aftershocks

    Science.gov (United States)

    Moss, Robb E. S.; Thompson, Eric M.; Kieffer, D Scott; Tiwari, Binod; Hashash, Youssef M A; Acharya, Indra; Adhikari, Basanta; Asimaki, Domniki; Clahan, Kevin B.; Collins, Brian D.; Dahal, Sachindra; Jibson, Randall W.; Khadka, Diwakar; Macdonald, Amy; Madugo, Chris L M; Mason, H Benjamin; Pehlivan, Menzer; Rayamajhi, Deepak; Uprety, Sital

    2015-01-01

    This article summarizes the geotechnical effects of the 25 April 2015 M 7.8 Gorkha, Nepal, earthquake and aftershocks, as documented by a reconnaissance team that undertook a broad engineering and scientific assessment of the damage and collected perishable data for future analysis. Brief descriptions are provided of ground shaking, surface fault rupture, landsliding, soil failure, and infrastructure performance. The goal of this reconnaissance effort, led by Geotechnical Extreme Events Reconnaissance, is to learn from earthquakes and mitigate hazards in future earthquakes.

  10. Teaching through 10,000 Earthquakes: Constructive Practice for Instructors in a Post-Disaster Environment

    Science.gov (United States)

    Wright, Sarah; Wordsworth, Russell

    2013-01-01

    The authors describe their experiences of teaching through a series of major earthquakes and the lessons learned regarding sustaining teaching and learning through an ongoing natural disaster. Student feedback data from across the university is analyzed to generate a model of constructive practice for instructors responding to a crisis. The…

  11. Earthquakes, July-August 1992

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0≤M<8.0) during this reporting period: one occurred in Kyrgyzstan on August 19, and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake on June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

  12. Earthquake Zoning Maps of Turkey

    International Nuclear Information System (INIS)

    Pampal, S.

    2007-01-01

    Earthquake Zoning Maps (1945, 1947, 1963, 1972 and 1996) and Specifications for Construction in Disaster Areas (1947, 1953, 1962, 1968, 1975, 1996, 1997 and 2006) have been changed many times, following developments in engineering seismology, advances in tectonics and seismotectonics, and improved earthquake data collection. The aim of this study is to give information about these maps, which have come into force at different dates since the introduction of the first official Earthquake Zoning Map published in 1945, and to assist in a better understanding of the development phases of these maps

  13. Seismology: dynamic triggering of earthquakes.

    Science.gov (United States)

    Gomberg, Joan; Johnson, Paul

    2005-10-06

    After an earthquake, numerous smaller shocks are triggered over distances comparable to the dimensions of the mainshock fault rupture, although they are rare at larger distances. Here we analyse the scaling of dynamic deformations (the stresses and strains associated with seismic waves) with distance from, and magnitude of, their triggering earthquake, and show that they can cause further earthquakes at any distance if their amplitude exceeds several microstrain, regardless of their frequency content. These triggering requirements are remarkably similar to those measured in the laboratory for inducing dynamic elastic nonlinear behaviour, which suggests that the underlying physics is similar.
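
    A rough way to relate the microstrain threshold quoted above to ground motion (a back-of-envelope estimate with an assumed crustal shear-wave speed, not a figure from the paper): for a propagating seismic wave the peak dynamic strain is approximately the peak ground velocity divided by the phase velocity,

      \varepsilon_{\text{peak}} \approx \frac{v_{\text{peak}}}{c_s},\qquad v_{\text{peak}} \approx 3\times10^{-6}\times 3.5\ \text{km/s} \approx 1\ \text{cm/s},

    so a threshold of a few microstrain corresponds to peak ground velocities on the order of a centimetre per second if c_s ≈ 3.5 km/s is taken for crustal shear waves.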

  14. USGS Earthquake Program GPS Use Case : Earthquake Early Warning

    Science.gov (United States)

    2015-03-12

    USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...

  15. Role of the professional helper in disaster intervention: examples from the Wenchuan Earthquake in China.

    Science.gov (United States)

    Wang, Xiying; Lum, Terry Y

    2013-01-01

    This article highlights the different roles that social workers played in disaster intervention after the Wenchuan earthquake. Using 3 stages (i.e., rescue, temporary relocation, and reconstruction) as a time framework, we describe social workers' roles, their performance, and the achievements and challenges they faced while providing service to the people and communities affected by the earthquake. Moreover, we draw conclusions on best practices and lessons learned, and make recommendations for future practices and research.

  16. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
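
    The abstract describes the detector only in outline, so the following is a minimal Python sketch of a short-term-average/long-term-average trigger applied to a tweet-count time series; the per-minute binning, window lengths and threshold are illustrative assumptions, not the USGS settings.

      import numpy as np

      def sta_lta_detect(counts, sta_win=2, lta_win=60, threshold=5.0):
          """Flag bins where the short-term average of 'earthquake' tweet counts
          greatly exceeds the long-term background average."""
          counts = np.asarray(counts, dtype=float)
          detections = []
          for i in range(lta_win, len(counts)):
              sta = counts[i - sta_win:i].mean()   # recent activity
              lta = counts[i - lta_win:i].mean()   # background level
              if lta > 0 and sta / lta >= threshold:
                  detections.append(i)
          return detections

      # Toy usage: quiet background of ~2 tweets/minute with a burst at minute 120.
      rng = np.random.default_rng(0)
      series = rng.poisson(2, 300)
      series[120:125] += np.array([80, 60, 40, 20, 10])
      print(sta_lta_detect(series))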

  17. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
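
    The statement that the energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution can be checked directly; the sketch below assumes the generic logarithmic energy-magnitude relation log10 E = a + bM (a and b are placeholder constants, not values from the paper). If the maximum energy has the Fréchet form, then

      F(E) = \exp\!\left[-(E/\sigma)^{-\alpha}\right],\qquad M = \frac{\log_{10}E - a}{b},

      P(M_{\max}\le m) = \exp\!\left[-\left(\frac{10^{\,a+bm}}{\sigma}\right)^{-\alpha}\right] = \exp\!\left[-e^{-(m-\mu)/\beta}\right],

    which is a Gumbel law with location μ = (log10 σ - a)/b and scale β = 1/(αb ln 10).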

  18. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    The geographic information system of the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of damage and the isoseismals of the earthquake are studied. By comparison with the standard earthquake intensity attenuation relationship, the abnormal damage distribution of the earthquake is identified, and the relationship of this abnormal distribution with tectonics, site conditions and basins is analyzed. In this paper, the influence on ground motion of the earthquake source and of the underground structures near the source is also studied. The implications of the abnormal damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction and earthquake emergency response are discussed.

  19. Haiti Earthquake: Crisis and Response

    Science.gov (United States)

    2010-02-19

    ...years ago, in 1860. Haitian ministries are addressing issues such as long-term housing for those left homeless by the earthquake as they operate out... CRS Report for Congress, prepared for Members and Committees of Congress: Haiti Earthquake: Crisis and Response, by Rhoda Margesson...

  20. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Science.gov (United States)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from Copernicus Sentinel-1A and -1B satellites and JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extended > 150 km NW from the hypocenter, longer than slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  1. The great East Japan earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Fluke, R.

    2011-06-15

    'Full text:' More formally called the Tohoku-Chihou-Taiheiyo-Oki Earthquake of March 11, 2011, it was the ensuing tsunami that caused the most death and destruction to the north-east coastal region of Japan. It is also what caused the multiple meltdowns at Fukushima Dai-ichi. Reactor Unit 1, ironically, was scheduled to be permanently shut down for decommissioning just two weeks later. The Fukushima Dai-ichi nuclear power plant has a tsunami protection barrier designed for the worst recorded tsunami in that area since 1896 - to a height of 5.7 m. The plant itself is on an elevated grade of about 10 m. The tsunami, reported to be 14-15 m, caused inundation of the entire site with at least four metres of seawater. The seawater flooded the turbine building and damaged electrical equipment including the emergency diesel generators, leaving the entire six-unit nuclear power plant without any source of AC power, known as the 'station blackout scenario'. There are numerous reports available on-line at various sites. The Japanese Government report is frank and forthcoming on the causes and the lessons learned, and the IAEA Mission report is in-depth and well presented, not only as a factual account of the events but as a unified source of the conclusions and lessons learned. Photos of the catastrophe are available at the TEPCO web site: http://www.tepco.co.jp/en/index-e.html. In this edition of the Bulletin there is a 'layman's' description of CANDU and BWR design in terms of the fundamental safety principles - Control, Cool and Contain - as well as a description of how these principles were met, or not met, at Fukushima Dai-ichi. Also, an excerpt from the IAEA Expert Mission is included. We 'technocrats' sometimes forget about the human aspects of a nuclear disaster. An essay by Dr. Michael Edwards is included, entitled 'Psychology, Philosophy and Nuclear Science'. Other references to the events appear throughout this edition.

  2. The great East Japan earthquake

    International Nuclear Information System (INIS)

    Fluke, R.

    2011-01-01

    'Full text:' More formally called the Tohoku-Chihou-Taiheiyo-Oki Earthquake of March 11, 2011, it was the ensuing tsunami that caused the most death and destruction to the north-east coastal region of Japan. It is also what caused the multiple meltdowns at Fukushima Dai-ichi. Reactor Unit 1, ironically, was scheduled to be permanently shut down for decommissioning just two weeks later. The Fukushima Dai-ichi nuclear power plant has a tsunami protection barrier designed for the worst recorded tsunami in that area since 1896 - to a height of 5.7 m. The plant itself is on an elevated grade of about 10 m. The tsunami, reported to be 14-15 m, caused inundation of the entire site with at least four metres of seawater. The seawater flooded the turbine building and damaged electrical equipment including the emergency diesel generators, leaving the entire six-unit nuclear power plant without any source of AC power, known as the 'station blackout scenario'. There are numerous reports available on-line at various sites. The Japanese Government report is frank and forthcoming on the causes and the lessons learned, and the IAEA Mission report is in-depth and well presented, not only as a factual account of the events but as a unified source of the conclusions and lessons learned. Photos of the catastrophe are available at the TEPCO web site: http://www.tepco.co.jp/en/index-e.html. In this edition of the Bulletin there is a 'layman's' description of CANDU and BWR design in terms of the fundamental safety principles - Control, Cool and Contain - as well as a description of how these principles were met, or not met, at Fukushima Dai-ichi. Also, an excerpt from the IAEA Expert Mission is included. We 'technocrats' sometimes forget about the human aspects of a nuclear disaster. An essay by Dr. Michael Edwards is included, entitled 'Psychology, Philosophy and Nuclear Science'. Other references to the events appear throughout this edition. (author)

  3. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  4. Earthquake damage to underground facilities

    Energy Technology Data Exchange (ETDEWEB)

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

    The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document the damage and non-damage due to earthquakes to underground facilities. Damage from earthquakes to tunnels, shafts, and wells and damage (rock bursts) from mining operations were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrance to these facilities. Data indicate vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository.

  5. Earthquake damage to underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

    The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document the damage and non-damage due to earthquakes to underground facilities. Damage from earthquakes to tunnels, shafts, and wells and damage (rock bursts) from mining operations were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrance to these facilities. Data indicate vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository

  6. Global earthquake fatalities and population

    Science.gov (United States)

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
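
    A minimal numerical sketch of the rate-proportional-to-population idea described above (the population trajectory below is a crude linear placeholder, not data from the paper; only the 10.1 billion figure for 2100 and the four 20th-century events come from the abstract):

      import numpy as np

      # Nonstationary Poisson model: rate(t) = k * population(t).
      years_20th = np.arange(1900, 2000)
      years_21st = np.arange(2000, 2100)
      pop_20th = np.interp(years_20th, [1900, 2000], [1.6e9, 6.1e9])   # assumed linear growth
      pop_21st = np.interp(years_21st, [2000, 2100], [6.1e9, 10.1e9])  # reaches 10.1e9 by 2100

      # Calibrate k from the 4 catastrophic (>100,000-fatality) events observed in the 20th century.
      k = 4 / pop_20th.sum()                # events per person-year
      expected_21st = k * pop_21st.sum()    # expected 21st-century count
      print(round(expected_21st, 1))        # ~8.4 with these placeholder numbers, near the paper's 8.7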

  7. Southern California Earthquake Center (SCEC) Communication, Education and Outreach Program

    Science.gov (United States)

    Benthien, M. L.

    2003-12-01

    The SCEC Communication, Education, and Outreach Program (CEO) offers student research experiences, web-based education tools, classroom curricula, museum displays, public information brochures, online newsletters, and technical workshops and publications. This year, much progress has been made on the development of the Electronic Encyclopedia of Earthquakes (E3), a collaborative project with CUREE and IRIS. The E3 development system is now fully operational, and 165 entries are in the pipeline. When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. To coordinate activities for the 10-year anniversary of the Northridge Earthquake in 2004 (and beyond), the "Earthquake Country Alliance" is being organized by SCEC CEO to present common messages, to share or promote existing resources, and to develop new activities and products jointly (such as a new version of Putting Down Roots in Earthquake Country). The group includes earthquake science and engineering researchers and practicing professionals, preparedness experts, response and recovery officials, news media representatives, and education specialists. A web portal, http://www.earthquakecountry.info, is being developed, with links to web pages and descriptions of other resources and services that the Alliance members provide. Another ongoing strength of SCEC is the Summer Intern program, which now has a year-round counterpart with students working on IT projects at USC. Since Fall 2002, over 32 students have participated in the program, including 7 students working with scientists throughout SCEC, 17 students involved in the USC "Earthquake Information Technology" intern program, and 7 students involved in CEO projects. These and other activities of the SCEC CEO program will be presented, along with lessons learned during program design and

  8. Beam properties of fully optimized, table-top, coherent source at 30 nm

    Czech Academy of Sciences Publication Activity Database

    Jakubczak, Krzysztof; Mocek, Tomáš; Rus, Bedřich; Polan, Jiří; Hřebíček, Jan; Sawicka, Magdalena; Sikocinski, Pawel; Sobota, Jaroslav; Fořt, Tomáš; Pína, L.

    2011-01-01

    Roč. 19, č. 2 (2011), s. 169-175 ISSN 1230-3402 R&D Projects: GA AV ČR KAN300100702; GA MŠk(CZ) LC528; GA ČR GC202/07/J008 Grant - others:AV ČR(CZ) M100100911 Institutional research plan: CEZ:AV0Z10100523; CEZ:AV0Z20650511 Keywords : laser applications * high−order harmonic generation * coherent extreme ultraviolet radiation * ultrafast optics Subject RIV: BH - Optics, Masers, Lasers Impact factor: 0.966, year: 2011 http://www.springerlink.com/content/y0057067wvw03234/

  9. Table top exercise: State Response to a terrorist attack against an NPP. STAR Contract

    International Nuclear Information System (INIS)

    Nannini, A.; Aurelle, J.

    2012-01-01

    The response to a severe attack on an NPP encompasses protection of the public and the environment, and maintaining public order and protection. To this end, the large number of local and national entities involved in the response will have to cooperate efficiently (security and safety authorities, operator teams, dedicated response forces, judicial authorities). The objectives of the STAR exercise are to develop a cultural integration of the agencies involved, namely a mutual understanding between the safety and security stakes, through national exercises, and to share with our European counterparts common modes for emergency management of a nuclear crisis. STAR is a scenario-driven case study, a time-stepped facilitated discussion to address crisis decision management. The scenario considers an attack against an NPP requiring an emergency response at the national level. Although NPPs are designed to withstand such attacks, the emergency preparedness and response management has to be prepared. The scenario provides successive failures of safety functions requiring timely and appropriate measures to be taken to stop the aggression and to restore safety. The objective is to identify and develop key issues related to the effectiveness of the response through a facilitated discussion. The scenario is organised by time-steps and used by facilitators to lead participants through the case study to express their comments, points of view, and criticisms. The final discussion allows identifying good practices and recommendations. The first STAR session showed the importance of human factors and organizational issues such as: the coordination between the involved agencies, the need for a common language, the need to simplify the decision channels, the management of contradictory orders, or the management of available skills. The article is followed by the slides of the presentation.

  10. Effectiveness of table top water pitcher filters to remove arsenic from drinking water.

    Science.gov (United States)

    Barnaby, Roxanna; Liefeld, Amanda; Jackson, Brian P; Hampton, Thomas H; Stanton, Bruce A

    2017-10-01

    Arsenic contamination of drinking water is a serious threat to the health of hundreds of millions of people worldwide. In the United States, ~3 million individuals drink well water that contains arsenic levels above the Environmental Protection Agency (EPA) maximum contaminant level (MCL) of 10 μg/L. Several technologies are available to remove arsenic from well water, including anion exchange, adsorptive media and reverse osmosis. In addition, bottled water is an alternative to drinking well water contaminated with arsenic. However, there are several drawbacks associated with these approaches, including relatively high cost and, in the case of bottled water, the generation of plastic waste. In this study, we tested the ability of five tabletop water pitcher filters to remove arsenic from drinking water. We report that only one tabletop water pitcher filter tested, ZeroWater®, reduced the arsenic concentration, both As3+ and As5+, from 1000 μg/L to below the EPA MCL; this filter is also cost effective compared to bottled water, and its use reduces the plastic waste associated with bottled water. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  11. On wave-driven "shingle" beach dynamics in a table-top Hele-Shaw cell

    NARCIS (Netherlands)

    Bokhove, Onno; van der Horn, Avraham/Bram; van der Horn, A.J.; van der Meer, Roger M.; Thornton, Anthony Richard; Zweers, W.; Lynett, P.J.

    2014-01-01

    The primary evolution of beaches by wave action takes place during storms. Beach evolution by non-linear breaking waves is 3D, multi-scale, and involves particle-wave interactions. We will show how a novel, three-phase extension to the classic "Hele-Shaw" laboratory experiment is designed to create

  12. Table-top femtosecond soft X-ray laser by collisional ionization gating

    Czech Academy of Sciences Publication Activity Database

    Depresseux, A.; Oliva, E.; Gautier, J.; Tissandier, F.; Nejdl, Jaroslav; Kozlová, Michaela; Maynard, G.; Goddet, J.P.; Tafzi, A.; Lifschitz, A.; Kim, H. T.; Jacquemot, S.; Malka, V.; Phuoc, K.T.; Thaury, C.; Rousseau, P.; Iaquaniello, G.; Lefrou, T.; Flacco, A.; Vodungbo, B.; Lambert, G.; Rousse, A.; Zeitoun, P.; Sebban, S.

    2015-01-01

    Roč. 9, č. 12 (2015), s. 817-821 ISSN 1749-4885 R&D Projects: GA MŠk ED1.1.00/02.0061; GA MŠk EE2.3.20.0279 Grant - others:ELI Beamlines(XE) CZ.1.05/1.1.00/02.0061; LaserZdroj (OP VK 3)(XE) CZ.1.07/2.3.00/20.0279 Institutional support: RVO:68378271 Keywords : ultrafast photonics Subject RIV: BL - Plasma and Gas Discharge Physics Impact factor: 31.167, year: 2015

  13. High current table-top setup for femtosecond gas electron diffraction

    Directory of Open Access Journals (Sweden)

    Omid Zandi

    2017-07-01

    Full Text Available We have constructed an experimental setup for gas phase electron diffraction with femtosecond resolution and a high average beam current. While gas electron diffraction has been successful at determining molecular structures, it has been a challenge to reach femtosecond resolution while maintaining sufficient beam current to retrieve structures with high spatial resolution. The main challenges are the Coulomb force that leads to broadening of the electron pulses and the temporal blurring that results from the velocity mismatch between the laser and electron pulses as they traverse the sample. We present here a device that uses pulse compression to overcome the Coulomb broadening and deliver femtosecond electron pulses on a gas target. The velocity mismatch can be compensated using laser pulses with a tilted intensity front to excite the sample. The temporal resolution of the setup was determined with a streak camera to be better than 400 fs for pulses with up to half a million electrons and a kinetic energy of 90 keV. The high charge per pulse, combined with a repetition rate of 5 kHz, results in an average beam current that is between one and two orders of magnitude higher than previously demonstrated.
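
    To see why the velocity mismatch matters at these parameters, here is a back-of-envelope Python estimate using the 90 keV kinetic energy from the abstract; the gas-jet thickness is an assumed, illustrative value.

      import math

      m_e_c2 = 511e3   # electron rest energy, eV
      c = 3.0e8        # speed of light, m/s
      E_k = 90e3       # electron kinetic energy from the abstract, eV

      gamma = 1 + E_k / m_e_c2
      beta = math.sqrt(1 - 1 / gamma**2)   # ~0.53: 90 keV electrons travel at about half the speed of light

      d = 200e-6                            # assumed gas-jet thickness, m
      blur = d / (beta * c) - d / c         # extra electron transit time relative to the laser pulse
      print(f"beta = {beta:.2f}, velocity-mismatch blur ~ {blur * 1e15:.0f} fs")

    With these numbers the mismatch alone contributes several hundred femtoseconds, comparable to the 400 fs resolution quoted, which is why a tilted laser intensity front is used to compensate.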

  14. Electron radiography using a table-top laser-cluster plasma accelerator

    Czech Academy of Sciences Publication Activity Database

    Bussolino, G.C.; Faenov, A.; Giulietti, A.; Giulietti, D.; Koester, P.; Labate, L.; Levato, Tadzio; Pikuz, T.; Gizzi, L.A.

    2013-01-01

    Roč. 46, č. 24 (2013), "245501-1"-"245501-8" ISSN 0022-3727 Institutional support: RVO:68378271 Keywords : radiochromic film * beams * radiation * pulses Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 2.521, year: 2013

  15. Demonstration Experiments for Solid-State Physics Using a Table-Top Mechanical Stirling Refrigerator

    Science.gov (United States)

    Osorio, M. R.; Morales, A. Palacio; Rodrigo, J. G.; Suderow, H.; Vieira, S.

    2012-01-01

    Liquid-free cryogenic devices are acquiring importance in basic science and engineering. But they can also lead to improvements in teaching low temperature and solid-state physics to graduate students and specialists. Most of the devices are relatively expensive, but small-sized equipment is slowly becoming available. Here, we have designed…

  16. Process Intensification. Continuous Two-Phase Catalytic Reactions in a Table-Top Centrifugal Contact Separator

    NARCIS (Netherlands)

    Kraai, Gerard N.; Schuur, Boelo; van Zwol, Floris; Haak, Robert M.; Minnaard, Adriaan J.; Feringa, Ben L.; Heeres, Hero J.; de Vries, Johannes G.; Prunier, ML

    2009-01-01

    Production of fine chemicals is mostly performed in batch reactors. Use of continuous processes has many advantages which may reduce the cost of production. We have developed the use of centrifugal contact separators (CCSs) for continuous two-phase catalytic reactions. This equipment has previously

  17. A table-top PXI based low-field spectrometer for solution dynamic nuclear polarization.

    Science.gov (United States)

    Biller, Joshua R; Stupic, Karl F; Moreland, J

    2018-03-01

    We present the development of a portable dynamic nuclear polarization (DNP) instrument based on the PCI eXtensions for Instrumentation platform. The main purpose of the instrument is the study of 1H polarization enhancements in solution through the Overhauser mechanism at low magnetic fields. A DNP probe set was constructed for use at 6.7 mT, using a modified Alderman-Grant resonator at 241 MHz for saturation of the electron transition. The solenoid for detection of the enhanced 1H signal at 288 kHz was constructed with Litz wire. The largest observed 1H enhancement (ε) at 6.7 mT for the 14N-CTPO radical in air-saturated aqueous solution was ε~65. A concentration dependence of the enhancement is observed, with maximum ε at 5.5 mM. A low resonator efficiency for saturation of the electron paramagnetic resonance transition results in a decrease in ε for the 10.3 mM sample. At high incident powers (42 W) and long pump times, capacitor heating effects can also decrease the enhancement. The core unit and program described here could be easily adopted for multi-frequency DNP work, depending on available main magnets and selection of the "plug and play" arbitrary waveform generator, digitizer, and radiofrequency synthesizer PCI eXtensions for Instrumentation cards. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
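
    As a quick consistency check (our arithmetic, not a figure from the paper), the 1H detection frequency quoted above follows from the proton gyromagnetic ratio at the stated field:

      f_{^1\mathrm{H}} = \frac{\gamma_H}{2\pi}\,B_0 \approx 42.58\ \mathrm{MHz/T}\times 6.7\ \mathrm{mT}\approx 285\ \mathrm{kHz},

    close to the 288 kHz at which the Litz-wire solenoid is tuned. The electron saturation frequency of 241 MHz is not simply γ_e B0/2π (which would be roughly 188 MHz at 6.7 mT), presumably because at fields this low the nitrogen hyperfine coupling of the nitroxide shifts the EPR transitions appreciably.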

  18. NATO CBRN Medical Working Group Table Top Exercise on International Health Regulations: Documentation and Output

    Science.gov (United States)

    2014-05-01

    ...poliomyelitis due to wild-type poliovirus, human influenza caused by a new subtype, severe acute respiratory syndrome, cholera, pneumonic plague, yellow fever... Figure 5 and Figure 6, with this assumption in place, there was again a growing consensus around a single response: report to the JFC Medical Advisor

  19. A mm-Wave, Table Top Cerenkov Free-Electron Laser

    CERN Document Server

    De la Fuente, Isabel; Van der Slot, Peter

    2004-01-01

    We have designed and constructed a compact (0.5 x 1.5 m), 100 kV Cerenkov FEL operating at a frequency of 50 GHz. The electron beam is produced by a gridded thermionic electron gun with a beam current of 800 mA. Simulations show that 800 mA is sufficient to produce an output power of ~ 1 kW peak at 50 GHz using a total cavity reflectivity of about 10 to 20 %. The average power approaches 1 kW when the electron pulse length is extended to CW. A depressed collector will be used to increase the overall efficiency of this device. Special attention has been given to the outcoupler, which has to combine multiple functions. First, it has to separate the radiation field from the electron beam. Second, it has to be transparent to the electron beam and act as a partial reflector for the radiation. Finally, it has to convert the generated TM01 mode in the interaction region into the fundamental TE01 mode of the standard rectangular output port. We will present the overall design and experimental set-up, first experimental res...
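
    A back-of-envelope note on why a depressed collector is worthwhile here (simple arithmetic from the quoted numbers, not an analysis from the paper): the DC beam power and radiation extraction efficiency are roughly

      P_{\text{beam}} = 100\ \text{kV}\times 0.8\ \text{A} = 80\ \text{kW},\qquad \eta \approx \frac{1\ \text{kW}}{80\ \text{kW}} \approx 1\%,

    so only about one percent of the beam power is converted to radiation, and most of the remaining kinetic energy can in principle be recovered at the collector.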

  20. Towards a Table-Top Laser Driven XUV/X-Ray Source

    Science.gov (United States)

    2015-08-27

    ...surface of the wires are injected into the laser pulse and... phase, laser-irradiated micro-engineered Si micro-wire arrays were investigated. An order of magnitude... summarize how the mechanism works: hot electrons generated from the laser-plasma interaction near...

  1. Space Laboratory on a Table Top: A Next Generative ECLSS design and diagnostic tool

    Science.gov (United States)

    Ramachandran, N.

    2005-01-01

    This paper describes the development plan for a comprehensive research and diagnostic tool for aspects of advanced life support systems in space-based laboratories. Specifically it aims to build a high fidelity tabletop model that can be used for the purpose of risk mitigation, failure mode analysis, contamination tracking, and testing reliability. We envision a comprehensive approach involving experimental work coupled with numerical simulation to develop this diagnostic tool. It envisions a 10% scale transparent model of a space platform such as the International Space Station that operates with water or a specific matched index of refraction liquid as the working fluid. This allows the scaling of a 10 ft x 10 ft x 10 ft room with air flow to a 1 ft x 1 ft x 1 ft tabletop model with water/liquid flow. Dynamic similitude for this length scale dictates model velocities to be 67% of full-scale and thereby the time scale of the model to represent 15% of the full-scale system; meaning identical processes in the model are completed in 15% of the full-scale time. The use of an index matching fluid (fluid that matches the refractive index of cast acrylic, the model material) allows making the entire model (with complex internal geometry) transparent and hence conducive to non-intrusive optical diagnostics. So using such a system one can test environment control parameters such as core flows (axial flows), cross flows (from registers and diffusers), potential problem areas such as flow short circuits, inadequate oxygen content, build up of other gases beyond desirable levels, test mixing processes within the system at local nodes or compartments and assess the overall system performance. The system allows quantitative measurements of contaminants introduced in the system and allows testing and optimizing the tracking process and removal of contaminants. The envisaged system will be modular and hence flexible for quick configuration change and subsequent testing. The data and inferences from the tests will allow for improvements in the development and design of next generation life support systems and configurations. Preliminary experimental and modeling work in this area will be presented. This involves testing of a single inlet-exit model with detailed 3-D flow visualization and quantitative diagnostics and computational modeling of the system.
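
    The quoted 67% velocity ratio and 15% time ratio are what Reynolds-number similitude gives for a 1:10 water model of an air-filled cabin; the short Python check below uses textbook kinematic viscosities (approximate values assumed here, not numbers from the paper).

      # Reynolds-number matching: V_m * L_m / nu_water = V_f * L_f / nu_air
      nu_air = 1.5e-5      # kinematic viscosity of air,   m^2/s (approximate)
      nu_water = 1.0e-6    # kinematic viscosity of water, m^2/s (approximate)
      length_ratio = 0.1   # 1:10 scale model

      vel_ratio = (1 / length_ratio) * (nu_water / nu_air)   # V_model / V_full ~ 0.67
      time_ratio = length_ratio / vel_ratio                  # t_model / t_full ~ 0.15
      print(f"velocity ratio ~ {vel_ratio:.2f}, time ratio ~ {time_ratio:.2f}")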

  2. Ultrafast table-top dynamic radiography of spontaneous or stimulated events

    Science.gov (United States)

    Smilowitz, Laura; Henson, Bryan

    2018-01-16

    Disclosed herein are representative embodiments of methods, apparatus, and systems for performing radiography. For example, certain embodiments concern X-ray radiography of spontaneous events. Particular embodiments of the disclosed technology provide continuous high-speed x-ray imaging of spontaneous dynamic events, such as explosions, reaction-front propagation, and even material failure. Further, in certain embodiments, x-ray activation and data collection activation are triggered by the object itself that is under observation (e.g., triggered by a change of state detected by one or more sensors monitoring the object itself).

  3. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.

  4. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is apparently changed after the time series is rearranged. This suggests that the SOC theory should not be used to oppose the efforts of earthquake prediction
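
    The invariance test described above can be sketched in a few lines of Python; the shuffling scheme (randomly reassigning magnitudes among the fixed occurrence times) and the two-sample comparison are illustrative choices, not necessarily the exact procedure of the Letter.

      import numpy as np
      from scipy.stats import ks_2samp

      def return_times(times, mags, m_min):
          """First-return (inter-event) times for events with magnitude >= m_min."""
          t = np.sort(np.asarray(times)[np.asarray(mags) >= m_min])
          return np.diff(t)

      def shuffle_test(times, mags, m_min, seed=0):
          """Compare the return-time distribution of the original catalog with that of a
          randomly rearranged catalog; a small p-value suggests the statistic is not invariant."""
          rng = np.random.default_rng(seed)
          surrogate_mags = rng.permutation(mags)
          return ks_2samp(return_times(times, mags, m_min),
                          return_times(times, surrogate_mags, m_min))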

  5. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  6. Earthquake-induced landslides from horseback surveys through GIS analyses (Sergey Soloviev Medal Lecture)

    Science.gov (United States)

    Keefer, David K.

    2010-05-01

    /hr. Accurately characterizing earthquake-induced landslides thus involves documenting many parameters during immediate post-earthquake investigations and then developing analyses to treat a wide variety of mechanisms and conditions. The generation of research up to the present has greatly increased our understanding of earthquake-induced landslides. On a regional scale we now have developed general relations between the magnitude of the triggering earthquake and several measures of landslide abundance and distribution, including numbers of landslides, areas affected by landslide occurrence, and maximum distances of landslides from the earthquake epicenters and fault ruptures. We have similar general relations between landslide occurrence and seismic intensities. We also have several empirical and analytical methods to forecast where landslides are most likely to occur in future earthquake scenarios. On the scale of individual landslides, we know much about the types of landslides specifically triggered by earthquakes and the types of slopes that produce each. We have analytical models of the main failure mechanisms, and can carry out analyses to determine whether particular slopes are likely to fail given specified future earthquake shaking. However, we still have much to learn. New remote sensing capabilities enable us to map the landslides triggered by an earthquake more completely, and GIS analyses enable us to develop much more detailed and specific relations between seismic and geologic parameters, on the one hand, and landslide occurrence on the other. Additional development and application in these areas, along with the continuing development of analytical techniques to characterize initiation and, especially, movement of landslides can be expected in the next generation of research. Such research should lead to a more detailed and specific understanding of where earthquake-induced landslides will occur and what their characteristics will be. Ultimately, this research

  7. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    Full Text Available A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.

  8. Earthquake fault superhighways

    Science.gov (United States)

    Robinson, D. P.; Das, S.; Searle, M. P.

    2010-10-01

    Motivated by the observation that the rare earthquakes which propagated for significant distances at supershear speeds occurred on very long straight segments of faults, we examine every known major active strike-slip fault system on land worldwide and identify those with long (> 100 km) straight portions capable not only of sustained supershear rupture speeds but having the potential to reach compressional wave speeds over significant distances, and call them "fault superhighways". The criteria used for identifying these are discussed. These superhighways include portions of the 1000 km long Red River fault in China and Vietnam passing through Hanoi, the 1050 km long San Andreas fault in California passing close to Los Angeles, Santa Barbara and San Francisco, the 1100 km long Chaman fault system in Pakistan north of Karachi, the 700 km long Sagaing fault connecting the first and second cities of Burma, Rangoon and Mandalay, the 1600 km Great Sumatra fault, and the 1000 km Dead Sea fault. Of the 11 faults so classified, nine are in Asia and two in North America, with seven located near areas of very dense populations. Based on the current population distribution within 50 km of each fault superhighway, we find that more than 60 million people today have increased seismic hazards due to them.

  9. Laboratory generated M -6 earthquakes

    Science.gov (United States)

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-01-01

    We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than characteristics of the experimental apparatus. The large size of the experimental apparatus, high fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separates this study from traditional acoustic emission analyses and allows these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.
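
    As a rough check that mm-scale slip patches, magnitudes near M -6 and MPa-level stress drops are mutually consistent, the sketch below combines the standard Hanks-Kanamori moment-magnitude relation with the circular-crack stress-drop formula; the 3 MPa value is an assumed mid-range stress drop, not a number taken from this particular event set.

      def moment_from_magnitude(mw):
          """Seismic moment in N*m from moment magnitude (Hanks-Kanamori relation)."""
          return 10 ** (1.5 * mw + 9.1)

      def source_radius(m0, stress_drop):
          """Circular-crack source radius in m for a given moment (N*m) and stress drop (Pa)."""
          return (7.0 * m0 / (16.0 * stress_drop)) ** (1.0 / 3.0)

      m0 = moment_from_magnitude(-6.0)   # ~1.3 N*m
      r = source_radius(m0, 3e6)         # assume a 3 MPa stress drop
      print(f"M0 ~ {m0:.2f} N*m, source radius ~ {r * 1e3:.1f} mm")

    With these numbers the implied source radius is a few millimetres, consistent with the mm-scale patches described in the abstract.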

  10. Post-earthquake action plan for existing NPP

    International Nuclear Information System (INIS)

    Chandra Basu, Prabir

    2011-01-01

    In recent years, a number of nuclear power plants (NPPs), mainly in Japan, have been affected by strong earthquakes. In some cases, the measured ground motions have exceeded the design or evaluation bases. There have also been such earthquake experiences in other countries. The experience from these events shows that operating plants were shut down immediately following the event. In most cases, no significant damage was identified in these nuclear power plant units. Plants remained shut down for extended periods for studies, investigations and evaluations to assess their safety. In a limited number of cases, upgrades were implemented to meet a) new definitions of the design basis, or b) requirements for beyond-design-basis earthquakes. The experience of the above events demonstrated the need for formulating a Post-Earthquake Action Plan (PEqAP) covering specific and detailed criteria and procedures for addressing situations where the original seismic design or evaluation bases are exceeded by actual seismic events. In response to this need, the IAEA published Earthquake Preparedness and Response for Nuclear Power Plants, Safety Report Series No. 66 (SR 66). The seismic safety knowledge and experience of different countries from earthquakes up to 2010 were collected and disseminated through this Safety Report, providing updated guidance for the actions to be taken in preparation for, and following, a felt earthquake at NPPs. The present article highlights salient features of SR 66. The Great East Japan Earthquake, of magnitude 9.0, on 11 March 2011 generated extreme ground motion and a large tsunami that struck the east coast of Japan. Several nuclear power plants (NPPs) at Tokai Dai-ni, Higashi Dori, Onagawa, and TEPCO's Fukushima Dai-ichi and Dai-ni were affected. The operating units at these sites were successfully shut down. However, the large tsunami waves that followed the ground motion challenged the safety systems of four units of Fukushima

  11. Indonesian earthquake: earthquake risk from co-seismic stress.

    Science.gov (United States)

    McCloskey, John; Nalbant, Suleyman S; Steacy, Sandy

    2005-03-17

    Following the massive loss of life caused by the Sumatra-Andaman earthquake in Indonesia and its tsunami, the possibility of a triggered earthquake on the contiguous Sunda trench subduction zone is a real concern. We have calculated the distributions of co-seismic stress on this zone, as well as on the neighbouring, vertical strike-slip Sumatra fault, and find an increase in stress on both structures that significantly boosts the already considerable earthquake hazard posed by them. In particular, the increased potential for a large subduction-zone event in this region, with the concomitant risk of another tsunami, makes the need for a tsunami warning system in the Indian Ocean all the more urgent.

  12. Using earthquake intensities to forecast earthquake occurrence times

    Directory of Open Access Journals (Sweden)

    J. R. Holliday

    2006-01-01

    Full Text Available It is well known that earthquakes do not occur randomly in space and time. Foreshocks, aftershocks, precursory activation, and quiescence are just some of the patterns recognized by seismologists. Using the Pattern Informatics technique along with relative intensity analysis, we create a scoring method based on time dependent relative operating characteristic diagrams and show that the occurrences of large earthquakes in California correlate with time intervals where fluctuations in small earthquakes are suppressed relative to the long term average. We estimate a probability of less than 1% that this coincidence is due to random clustering. Furthermore, we show that the methods used to obtain these results may be applicable to other parts of the world.

  13. The music of earthquakes and Earthquake Quartet #1

    Science.gov (United States)

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  14. Earthquake and Tsunami booklet based on two Indonesia earthquakes

    Science.gov (United States)

    Hayashi, Y.; Aci, M.

    2014-12-01

    Many destructive earthquakes occurred during the last decade in Indonesia. These experiences are very important precepts for people around the world who live in earthquake and tsunami countries. We are collecting the testimonies of tsunami survivors to clarify successful evacuation processes and to make clear the characteristic physical behaviors of tsunamis near the coast. We studied two tsunami events, the 2004 Indian Ocean tsunami and the 2010 Mentawai slow-earthquake tsunami. Many videos and photographs were taken by people at some places during the 2004 Indian Ocean tsunami disaster; nevertheless, these covered only a few restricted points, and we did not know the tsunami behavior at other places. In this study, we tried to collect extensive information about tsunami behavior not only at many places but also over a wide time range after the strong shaking. In the Mentawai case, the earthquake occurred at night, so there are no impressive photos. To collect detailed information about the evacuation process from tsunamis, we devised an interview method. This method involves making pictures of the tsunami experience from the scenes in victims' stories. In the 2004 Aceh case, none of the survivors knew about tsunami phenomena. Because there had been no big earthquakes with tsunamis for one hundred years in the Sumatra region, the public had no knowledge about tsunamis. This situation was greatly improved in the 2010 Mentawai case. TV programs and NGO or governmental public education programs about tsunami evacuation are widespread in Indonesia, and many people now have fundamental knowledge of earthquake and tsunami disasters. We made a drill book based on victims' stories and painted impressive scenes of the two events. We used the drill book in a disaster education event with a school committee in West Java. About 80% of the students and teachers evaluated the contents of the drill book as useful for correct understanding.

  15. Book review: Earthquakes and water

    Science.gov (United States)

    Bekins, Barbara A.

    2012-01-01

    It is really nice to see assembled in one place a discussion of the documented and hypothesized hydrologic effects of earthquakes. The book is divided into chapters focusing on particular hydrologic phenomena including liquefaction, mud volcanism, stream discharge increases, groundwater level, temperature and chemical changes, and geyser period changes. These hydrologic effects are inherently fascinating, and the large number of relevant publications in the past decade makes this summary a useful milepost. The book also covers hydrologic precursors and earthquake triggering by pore pressure. A natural need to limit the topics covered resulted in the omission of tsunamis and the vast literature on the role of fluids and pore pressure in frictional strength of faults. Regardless of whether research on earthquake-triggered hydrologic effects ultimately provides insight into the physics of earthquakes, the text provides welcome common ground for interdisciplinary collaborations between hydrologists and seismologists. Such collaborations continue to be crucial for investigating hypotheses about the role of fluids in earthquakes and slow slip. 

  16. Unbonded Prestressed Columns for Earthquake Resistance

    Science.gov (United States)

    2012-05-01

    Modern structures are able to survive significant shaking caused by earthquakes. By implementing unbonded post-tensioned tendons in bridge columns, the damage caused by an earthquake can be significantly lower than that of a standard reinforced concr...

  17. Global Earthquake Hazard Distribution - Peak Ground Acceleration

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Distribution-Peak Ground Acceleration is a 2.5 by 2.5 minute grid of global earthquake hazards developed using Global Seismic Hazard Program...

  18. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 by 2.5 minute global grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  19. Extreme value distribution of earthquake magnitude

    Science.gov (United States)

    Zi, Jun Gan; Tung, C. C.

    1983-07-01

    Probability distribution of maximum earthquake magnitude is first derived for an unspecified probability distribution of earthquake magnitude. A model for energy release of large earthquakes, similar to that of Adler-Lomnitz and Lomnitz, is introduced from which the probability distribution of earthquake magnitude is obtained. An extensive set of world data for shallow earthquakes, covering the period from 1904 to 1980, is used to determine the parameters of the probability distribution of maximum earthquake magnitude. Because of the special form of probability distribution of earthquake magnitude, a simple iterative scheme is devised to facilitate the estimation of these parameters by the method of least-squares. The agreement between the empirical and derived probability distributions of maximum earthquake magnitude is excellent.
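
    For orientation, the classical extreme-value construction behind results of this kind can be written out explicitly. This is a standard textbook illustration using the Gutenberg-Richter law, not the specific Adler-Lomnitz-type energy-release model used in the paper.

```latex
% If individual magnitudes are i.i.d. with CDF F(m) and N events occur in the
% time interval, the maximum magnitude M_max has CDF
\[
  F_{\max}(m) \;=\; \Pr\left(M_{\max} \le m\right) \;=\; F(m)^{N}.
\]
% For a Gutenberg-Richter (exponential) magnitude law, F(m) = 1 - 10^{-b(m - m_0)}
% for m >= m_0, this becomes
\[
  F_{\max}(m) \;=\; \left[\,1 - 10^{-b(m - m_0)}\right]^{N}
  \;\approx\; \exp\!\left[-N\,10^{-b(m - m_0)}\right],
\]
% which approaches a Gumbel (type I extreme-value) form for large N.
```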

  20. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  1. Global Earthquake Proportional Economic Loss Risk Deciles

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Proportional Economic Loss Risk Deciles is a 2.5 minute grid of earthquake hazard economic loss as proportions of Gross Domestic Product (GDP) per...

  2. Global Earthquake Total Economic Loss Risk Deciles

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Total Economic Loss Risk Deciles is a 2.5 minute grid of global earthquake total economic loss risks. A process of spatially allocating Gross...

  3. Global Earthquake Mortality Risks and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Mortality Risks and Distribution is a 2.5 minute grid of global earthquake mortality risks. Gridded Population of the World, Version 3 (GPWv3) data...

  4. Global Earthquake Mortality Risks and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  5. Global Earthquake Hazard Distribution - Peak Ground Acceleration

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Distribution-peak ground acceleration is a 2.5 minute grid of global earthquake hazards developed using Global Seismic Hazard Program...

  6. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Directory of Open Access Journals (Sweden)

    Mustafa ULAS

    2013-01-01

    A lot of people die because of earthquakes every year. It is therefore crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are targeted to guide new research aimed at finding novel prediction methods.

  7. The HayWired earthquake scenario—We can outsmart disaster

    Science.gov (United States)

    Hudnut, Kenneth W.; Wein, Anne M.; Cox, Dale A.; Porter, Keith A.; Johnson, Laurie A.; Perry, Suzanne C.; Bruce, Jennifer L.; LaPointe, Drew

    2018-04-18

    The HayWired earthquake scenario, led by the U.S. Geological Survey (USGS), anticipates the impacts of a hypothetical magnitude-7.0 earthquake on the Hayward Fault. The fault is along the east side of California’s San Francisco Bay and is among the most active and dangerous in the United States, because it runs through a densely urbanized and interconnected region. One way to learn about a large earthquake without experiencing it is to conduct a scientifically realistic scenario. The USGS and its partners in the HayWired Coalition and the HayWired Campaign are working to energize residents and businesses to engage in ongoing and new efforts to prepare the region for such a future earthquake.

  8. Fault geometry and earthquake mechanics

    Directory of Open Access Journals (Sweden)

    D. J. Andrews

    1994-06-01

    Earthquake mechanics may be determined by the geometry of a fault system. Slip on a fractal branching fault surface can explain: (1) regeneration of stress irregularities in an earthquake; (2) the concentration of stress drop in an earthquake into asperities; (3) starting and stopping of earthquake slip at fault junctions; and (4) self-similar scaling of earthquakes. Slip at fault junctions provides a natural realization of barrier and asperity models without appealing to variations of fault strength. Fault systems are observed to have a branching fractal structure, and slip may occur at many fault junctions in an earthquake. Consider the mechanics of slip at one fault junction. In order to avoid a stress singularity of order 1/r, an intersection of faults must be a triple junction and the Burgers vectors on the three fault segments at the junction must sum to zero. In other words, to lowest order the deformation consists of rigid block displacement, which ensures that the local stress due to the dislocations is zero. The elastic dislocation solution, however, ignores the fact that the configuration of the blocks changes at the scale of the displacement. A volume change occurs at the junction; either a void opens or intense local deformation is required to avoid material overlap. The volume change is proportional to the product of the slip increment and the total slip since the formation of the junction. Energy absorbed at the junction, equal to confining pressure times the volume change, is not large enough to prevent slip at a new junction. The ratio of energy absorbed at a new junction to elastic energy released in an earthquake is no larger than P/µ, where P is confining pressure and µ is the shear modulus. At a depth of 10 km this dimensionless ratio has the value P/µ = 0.01. As slip accumulates at a fault junction in a number of earthquakes, the fault segments are displaced such that they no longer meet at a single point. For this reason the
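
    The quoted value of P/µ can be checked with a short back-of-envelope calculation; the density and shear modulus below are typical crustal values assumed for illustration, not taken from the paper.

```latex
% Lithostatic confining pressure at 10 km depth, with rho ~ 2700 kg/m^3:
\[
  P \;\approx\; \rho g h \;\approx\; 2700 \times 9.8 \times 10^{4}\ \text{Pa}
  \;\approx\; 2.6\times10^{8}\ \text{Pa},
\]
% and with a typical crustal shear modulus mu ~ 3 x 10^10 Pa:
\[
  \frac{P}{\mu} \;\approx\; \frac{2.6\times10^{8}}{3\times10^{10}} \;\approx\; 0.01 .
\]
```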

  9. Historical earthquake investigations in Greece

    Directory of Open Access Journals (Sweden)

    K. Makropoulos

    2004-06-01

    The active tectonics of the area of Greece and its seismic activity have always been present in the country's history. Many researchers, tempted to work on Greek historical earthquakes, have realized that this is a task not easily fulfilled. The existing catalogues of strong historical earthquakes are useful tools to perform general SHA studies. However, a variety of supporting datasets, non-uniformly distributed in space and time, need to be further investigated. In the present paper, a review of historical earthquake studies in Greece is attempted. The seismic history of the country is divided into four main periods. In each one of them, characteristic examples, studies and approaches are presented.

  10. Associating an ionospheric parameter with major earthquake ...

    Indian Academy of Sciences (India)

    With time, ionospheric variation analysis is gaining over lithospheric monitoring in providing precursors for earthquake forecasting. The current paper highlights the association of major (Ms ≥ 6.0) and medium (4.0 ≤ Ms < 6.0) earthquake occurrences throughout the world in different ranges of the Ionospheric Earthquake ...

  11. Earthquakes: A Teacher's Package for K-6.

    Science.gov (United States)

    National Science Teachers Association, Washington, DC.

    Like rain, an earthquake is a natural occurrence which may be mild or catastrophic. Although an earthquake may last only a few seconds, the processes that cause it have operated within the earth for millions of years. Until recently, the cause of earthquakes was a mystery and the subject of fanciful folklore to people all around the world. This…

  12. Earthquakes in the New Zealand Region.

    Science.gov (United States)

    Wallace, Cleland

    1995-01-01

    Presents a thorough overview of earthquakes in New Zealand, discussing plate tectonics, seismic measurement, and historical occurrences. Includes 10 figures illustrating such aspects as earthquake distribution, intensity, and fissures in the continental crust. Tabular data includes a list of most destructive earthquakes and descriptive effects…

  13. Earthquakes in Zimbabwe | Clark | Zimbabwe Science News

    African Journals Online (AJOL)

    Earthquakes are one of the most destructive natural forces, in both human and economic terms. For example, since 1900, 10 earthquakes have occurred that each killed over 50 000 people. Earthquakes in modern industrialized areas can also be very costly, even if well designed and constructed buildings save many ...

  14. Can Dams and Reservoirs Cause Earthquakes?

    Indian Academy of Sciences (India)

    indirect investigations of these regions are subject to inevitable multiple interpretations. Still, a measure of understanding about reservoir-induced earthquakes has been achieved. It is my aim to put the phenomenon in perspective on this basis. I saw the Koyna Earthquake Recorded. Koyna earthquake of December 10, ...

  15. 13 CFR 120.174 - Earthquake hazards.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  16. Fault failure with moderate earthquakes

    Science.gov (United States)

    Johnston, M.J.S.; Linde, A.T.; Gladwin, M.T.; Borcherdt, R.D.

    1987-01-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10⁻⁸), with borehole dilatometers (resolution 10⁻¹⁰) and a 3-component borehole strainmeter (resolution 10⁻⁹). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure. © 1987.

  17. Fault failure with moderate earthquakes

    Science.gov (United States)

    Johnston, M. J. S.; Linde, A. T.; Gladwin, M. T.; Borcherdt, R. D.

    1987-12-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10⁻⁸), with borehole dilatometers (resolution 10⁻¹⁰) and a 3-component borehole strainmeter (resolution 10⁻⁹). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure.

  18. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  19. Computational methods in earthquake engineering

    CERN Document Server

    Plevris, Vagelis; Lagaros, Nikos

    2017-01-01

    This is the third book in a series on Computational Methods in Earthquake Engineering. The purpose of this volume is to bring together the scientific communities of Computational Mechanics and Structural Dynamics, offering a wide coverage of timely issues on contemporary Earthquake Engineering. This volume will facilitate the exchange of ideas in topics of mutual interest and can serve as a platform for establishing links between research groups with complementary activities. The computational aspects are emphasized in order to address difficult engineering problems of great social and economic importance.

  20. Dancing Earthquake Science Assists Recovery from the Christchurch Earthquakes

    Science.gov (United States)

    Egan, Candice J.; Quigley, Mark C.

    2015-01-01

    The 2010-2012 Christchurch (Canterbury) earthquakes in New Zealand caused loss of life and psychological distress in residents throughout the region. In 2011, student dancers of the Hagley Dance Company and dance professionals choreographed the performance "Move: A Seismic Journey" for the Christchurch Body Festival that explored…

  1. Testing hypotheses of earthquake occurrence

    Science.gov (United States)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

    We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter. For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of
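
    The pairwise comparison described above can be sketched in a few lines under the usual assumption that bin counts are independent Poisson variables. The rates and counts below are hypothetical, and the full RELM procedure additionally estimates significance levels (alpha, beta) by simulating synthetic catalogs, which this sketch omits.

```python
# Minimal sketch (not the RELM test code): comparing two gridded earthquake-rate
# forecasts with a Poisson log-likelihood. `rate_a` and `rate_b` are forecast
# numbers of earthquakes per bin over the test period; `observed` holds the
# observed counts in the same bins.
import numpy as np
from scipy.stats import poisson

def log_likelihood(rates, observed):
    """Joint log-likelihood of the observed counts under independent Poisson bins."""
    return poisson.logpmf(observed, mu=rates).sum()

def compare_forecasts(rate_a, rate_b, observed):
    """Log-likelihood ratio of forecast A over forecast B; positive favours A."""
    return log_likelihood(rate_a, observed) - log_likelihood(rate_b, observed)

# Toy example with three magnitude-space bins.
rate_a = np.array([0.2, 1.5, 0.05])
rate_b = np.array([0.5, 0.5, 0.50])
observed = np.array([0, 2, 0])
print(round(compare_forecasts(rate_a, rate_b, observed), 2))  # positive: A is favoured
```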

  2. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    Science.gov (United States)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

    A devastating earthquake had been predicted for May 11, 2011 in Rome. This prediction was never released officially by anyone, but it grew on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions. Indeed, around May 11, 2011, a planetary alignment was really expected, and this contributed to giving credibility to the earthquake prediction among the public. During the previous months, INGV was overwhelmed with requests for information about this supposed prediction from Roman inhabitants and tourists. Given the considerable media impact of this expected earthquake, INGV decided to organize an Open Day at its headquarters in Rome for people who wanted to learn more about Italian seismicity and the earthquake as a natural phenomenon. The Open Day was preceded by a press conference two days before, in which we talked about this prediction, presented the Open Day, and had a scientific discussion with journalists about earthquake prediction and, more generally, about the real problem of seismic risk in Italy. About 40 journalists from newspapers, local and national TV stations, press agencies and web news outlets attended the press conference, and hundreds of articles appeared in the following days, advertising the May 11 Open Day. INGV opened to the public all day long (9 a.m. - 9 p.m.) with the following program: i) meetings with INGV researchers to discuss scientific issues; ii) visits to the seismic monitoring room, open 24h/7 all year; iii) guided tours through interactive exhibitions on earthquakes and the Earth's deep structure; iv) lectures on general topics, from the social impact of rumors to seismic risk reduction; v) 13 new videos on the channel YouTube.com/INGVterremoti to explain the earthquake process and give updates on various aspects of seismic monitoring in Italy; vi) distribution of books and brochures. Surprisingly, more than 3000 visitors came to visit INGV

  3. A critical history of British earthquakes

    OpenAIRE

    R. M. W. Musson

    2004-01-01

    This paper reviews the history of the study of historical British earthquakes. The publication of compendia of British earthquakes goes back as early as the late 16th Century. A boost to the study of earthquakes in Britain was given in the mid 18th Century as a result of two events occurring in London in 1750 (analogous to the general increase in earthquakes in Europe five years later after the 1755 Lisbon earthquake). The 19th Century saw a number of significant studies, culminating in th...

  4. Predecessors of the giant 1960 Chile earthquake.

    Science.gov (United States)

    Cisternas, Marco; Atwater, Brian F; Torrejón, Fernando; Sawai, Yuki; Machuca, Gonzalo; Lagos, Marcelo; Eipert, Annaliese; Youlton, Cristián; Salgado, Ignacio; Kamataki, Takanobu; Shishikura, Masanobu; Rajendran, C P; Malik, Javed K; Rizal, Yan; Husni, Muhammad

    2005-09-15

    It is commonly thought that the longer the time since last earthquake, the larger the next earthquake's slip will be. But this logical predictor of earthquake size, unsuccessful for large earthquakes on a strike-slip fault, fails also with the giant 1960 Chile earthquake of magnitude 9.5 (ref. 3). Although the time since the preceding earthquake spanned 123 years (refs 4, 5), the estimated slip in 1960, which occurred on a fault between the Nazca and South American tectonic plates, equalled 250-350 years' worth of the plate motion. Thus the average interval between such giant earthquakes on this fault should span several centuries. Here we present evidence that such long intervals were indeed typical of the last two millennia. We use buried soils and sand layers as records of tectonic subsidence and tsunami inundation at an estuary midway along the 1960 rupture. In these records, the 1960 earthquake ended a recurrence interval that had begun almost four centuries before, with an earthquake documented by Spanish conquistadors in 1575. Two later earthquakes, in 1737 and 1837, produced little if any subsidence or tsunami at the estuary and they therefore probably left the fault partly loaded with accumulated plate motion that the 1960 earthquake then expended.

  5. Summary of earthquake experience database

    International Nuclear Information System (INIS)

    1999-01-01

    Strong-motion earthquakes frequently occur throughout the Pacific Basin, where power plants or industrial facilities are included in the affected areas. By studying the performance of these earthquake-affected (or database) facilities, a large inventory of various types of equipment installations can be compiled that have experienced substantial seismic motion. The primary purposes of the seismic experience database are summarized as follows: to determine the most common sources of seismic damage, or adverse effects, on equipment installations typical of industrial facilities; to determine the thresholds of seismic motion corresponding to various types of seismic damage; to determine the general performance of equipment during earthquakes, regardless of the levels of seismic motion; to determine minimum standards in equipment construction and installation, based on past experience, to assure the ability to withstand anticipated seismic loads. To summarize, the primary assumption in compiling an experience database is that the actual seismic hazard to industrial installations is best demonstrated by the performance of similar installations in past earthquakes

  6. Earthquake design for controlled structures

    Directory of Open Access Journals (Sweden)

    Nikos G. Pnevmatikos

    2017-04-01

    An alternative design philosophy, for structures equipped with control devices, capable of resisting an expected earthquake while remaining in the elastic range, is described. The idea is that a portion of the earthquake loading is undertaken by the control system and the remainder by the structure, which is designed to resist elastically. The earthquake forces assuming elastic behavior (elastic forces) and elastoplastic behavior (design forces) are first calculated according to the codes. The required control forces are calculated as the difference between the elastic and design forces. The maximum capacity of the control devices is then compared to the required control force. If the capacity of the control devices is larger than the required control force, then the control devices are accepted and installed in the structure, and the structure is designed according to the design forces. If the capacity is smaller than the required control force, then a scale factor, α, reducing the elastic forces to new design forces is calculated. The structure is redesigned and the devices are installed. The proposed procedure ensures that the structure behaves elastically (without damage) for the expected earthquake at no additional cost, excluding that of buying and installing the control devices.
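
    The acceptance check described above reduces to a simple force balance. The sketch below is one possible reading of it; the force values are illustrative, and the formula used for the scale factor α is an assumption, since the abstract does not define it explicitly.

```python
# Minimal sketch of the acceptance check for a controlled structure.
# All numbers are illustrative; the definition of alpha below is an assumption.

def control_design(elastic_force, design_force, device_capacity):
    """Return (control force taken by devices, new design force, scale factor alpha)."""
    required = elastic_force - design_force        # portion assigned to the control system
    if device_capacity >= required:
        # Devices can supply the whole gap: keep the code-level design forces.
        return required, design_force, design_force / elastic_force
    # Devices cannot cover the gap: scale the elastic demand down so that the
    # devices take exactly their capacity and the structure resists the rest elastically.
    alpha = 1.0 - device_capacity / elastic_force
    return device_capacity, alpha * elastic_force, alpha

# Example: elastic demand 1000 kN, code design force 400 kN, device capacity 450 kN.
print(control_design(1000.0, 400.0, 450.0))   # -> (450.0, 550.0, 0.55)
```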

  7. Control rod behaviour in earthquakes

    International Nuclear Information System (INIS)

    Kawakami, S.; Akiyama, H.; Shibata, H.; Watabe, M.; Ichikawa, T.; Fujita, K.

    1990-01-01

    For some years the Japanese have been working on a major research programme to determine the likely effects of an earthquake on nuclear plant internals. One aspect of this was a study of the behaviour of Pressurized Water Reactor control rods as they are being inserted in the core, which is reported here. (author)

  8. Building Resilient Mountain Communities: Earthquake ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2015-04-25

    A powerful 7.8 Richter magnitude earthquake hit central Nepal on April 25, 2015, causing over 8,700 deaths and more than 22,000 injuries. Hundreds of thousands of homes were flattened, some 15,000 government buildings and 288,797 residential buildings were destroyed, and some 8,000 schools and 1,000 health ...

  9. Explanation of earthquake response spectra

    OpenAIRE

    Douglas, John

    2017-01-01

    This is a set of five slides explaining how earthquake response spectra are derived from strong-motion records and simple models of structures and their purpose within seismic design and assessment. It dates from about 2002 and I have used it in various introductory lectures on engineering seismology.

  10. Earthquake swarms in South America

    Science.gov (United States)

    Holtkamp, S. G.; Pritchard, M. E.; Lohman, R. B.

    2011-10-01

    We searched for earthquake swarms in South America between 1973 and 2009 using the global Preliminary Determination of Epicenters (PDE) catalogue. Seismicity rates vary greatly over the South American continent, so we employ a manual search approach that aims to be insensitive to spatial and temporal scales or to the number of earthquakes in a potential swarm. We identify 29 possible swarms involving 5-180 earthquakes each (with total swarm moment magnitudes between 4.7 and 6.9) within a range of tectonic and volcanic locations. Some of the earthquake swarms on the subduction megathrust occur as foreshocks and delineate the limits of main shock rupture propagation for large earthquakes, including the 2010 Mw 8.8 Maule, Chile and 2007 Mw 8.1 Pisco, Peru earthquakes. Also, subduction megathrust swarms commonly occur at the location of subduction of aseismic ridges, including areas of long-standing seismic gaps in Peru and Ecuador. The magnitude-frequency relationship of swarms we observe appears to agree with previously determined magnitude-frequency scaling for swarms in Japan. We examine geodetic data covering five of the swarms to search for an aseismic component. Only two of these swarms (at Copiapó, Chile, in 2006 and near Ticsani Volcano, Peru, in 2005) have suitable satellite-based Interferometric Synthetic Aperture Radar (InSAR) observations. We invert the InSAR geodetic signal and find that the ground deformation associated with these swarms does not require a significant component of aseismic fault slip or magmatic intrusion. Three swarms in the vicinity of the volcanic arc in southern Peru appear to be triggered by the Mw= 8.5 2001 Peru earthquake, but predicted static Coulomb stress changes due to the main shock were very small at the swarm locations, suggesting that dynamic triggering processes may have had a role in their occurrence. Although we identified few swarms in volcanic regions, we suggest that particularly large volcanic swarms (those that

  11. Earthquake sources near Uturuncu Volcano

    Science.gov (United States)

    Keyson, L.; West, M. E.

    2013-12-01

    Uturuncu, located in southern Bolivia near the Chile and Argentina border, is a dacitic volcano that was last active 270 ka. It is part of the Altiplano-Puna Volcanic Complex, which spans 50,000 km² and comprises a series of ignimbrite flare-ups since ~23 Ma. Two sets of evidence suggest that the region is underlain by a significant magma body. First, seismic velocities show a low velocity layer consistent with a magmatic sill below depths of 15-20 km. This inference is corroborated by high electrical conductivity between 10 km and 30 km. This magma body, the so-called Altiplano-Puna Magma Body (APMB), is the likely source of volcanic activity in the region. InSAR studies show that during the 1990s, the volcano experienced an average uplift of about 1 to 2 cm per year. The deformation is consistent with an expanding source at depth. Though the Uturuncu region exhibits high rates of crustal seismicity, any connection between the inflation and the seismicity is unclear. We investigate the root causes of these earthquakes using a temporary network of 33 seismic stations - part of the PLUTONS project. Our primary approach is based on hypocenter locations and magnitudes paired with correlation-based relative relocation techniques. We find a strong tendency toward earthquake swarms that cluster in space and time. These swarms often last a few days and consist of numerous earthquakes with similar source mechanisms. Most seismicity occurs in the top 10 kilometers of the crust and is characterized by well-defined phase arrivals and significant high frequency content. The frequency-magnitude relationship of this seismicity demonstrates b-values consistent with tectonic sources. There is a strong clustering of earthquakes around the Uturuncu edifice. Earthquakes elsewhere in the region align in bands striking northwest-southeast consistent with regional stresses.

  12. "ABC's Earthquake" (Experiments and models in seismology)

    Science.gov (United States)

    Almeida, Ana

    2017-04-01

    Ana Almeida, Escola Básica e Secundária Dr. Vieira de Carvalho, Moreira da Maia, Portugal. The purpose of this presentation, in poster format, is to describe an activity that I planned and carried out in a school in the north of Portugal, using a kit of simple, easy-to-use materials - the sismo-box. The activity "ABC's Earthquake" was developed within the discipline of Natural Sciences, with students from the 7th grade, geoscience teachers and teachers from other areas. The possibility of working with the sismo-box was seen as an exciting and promising opportunity to promote science, and seismology more specifically; to do science by using the models in the box to apply the scientific method; to work on and consolidate content and skills in Natural Sciences; and to share these materials with classmates and with teachers from different areas. Throughout the activity, with both students and teachers, it was possible to see their admiration for the models in the sismo-box, as well as their interest and enthusiasm in handling them and understanding the results of the procedure proposed in the script. With this activity we managed to promote: educational success in this subject; a "school culture" of active participation, with quality, rules, discipline and citizenship values; the full integration of students with special educational needs; a strengthening of the school's role as a cultural, informational and formative institution; up-to-date and innovative activities; and knowledge of "being and doing", contributing to a moment of joy and discovery. Learn by doing!

  13. What caused a large number of fatalities in the Tohoku earthquake?

    Science.gov (United States)

    Ando, M.; Ishida, M.; Nishikawa, Y.; Mizuki, C.; Hayashi, Y.

    2012-04-01

    the 1960 Chile tsunami, which was significantly smaller than the 11 March tsunami. This sense of "knowing" put their lives at high risk. 5. Some local residents believed that, with the presence of a breakwater, only slight flooding would occur. 6. Many people did not understand how a tsunami is generated under the sea, so the connection between an earthquake and a tsunami was not clear to them. These interviews made it clear that many deaths resulted because current technology and earthquake science underestimated tsunami heights, warning systems failed, and breakwaters were not strong or high enough. However, even if these problems recur in future earthquakes, better knowledge of earthquake and tsunami hazards could save more lives. It is therefore necessary for children to learn the basic mechanism of tsunami generation in elementary school, while their minds are fresh.

  14. Wellbeing and the Curriculum: One School's Story Post-­Earthquake

    Science.gov (United States)

    Ormandy, Sally

    2014-01-01

    This is the post-earthquake story of how we as the staff and community of Opawa Primary School have lived with the tremors (literally and metaphorically) and trauma of this national tragedy, whilst endeavouring to maximise student learning and enhance the wellbeing of all the members of our school community. This is not founded on research but is…

  15. Countermeasures to earthquakes in nuclear plants

    International Nuclear Information System (INIS)

    Sato, Kazuhide

    1979-01-01

    The contribution of atomic energy to mankind is immeasurable, but the danger of radioactivity is of a special kind. Therefore, in the design of nuclear power plants, safety has been regarded as paramount, and in Japan, where earthquakes occur frequently, countermeasures against earthquakes have naturally been incorporated in the safety review. The radioactive substances handled in nuclear power stations and spent fuel reprocessing plants are briefly explained. The occurrence of earthquakes cannot be predicted reliably, and the disasters they cause can be remarkably large. In nuclear plants, damage to the facilities must be prevented and their functions maintained during earthquakes. Regarding the siting of nuclear plants, the history of earthquakes, the possible magnitude of earthquakes, the properties of the ground, and the position of the plants should be examined. After the place of installation has been decided, the earthquake used for design is selected by evaluating active faults and determining the standard earthquakes. As the fundamentals of aseismic design, the classification according to importance, the design earthquakes corresponding to the classes of importance, the combination of loads, and the allowable stresses are explained. (Kako, I.)

  16. Napa earthquake: An earthquake in a highly connected world

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

    The Napa earthquake recently occurred close to Silicon Valley. This makes it a good candidate for studying what social networks, wearable devices and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the ratio of people publishing tweets and the ratio of people visiting the EMSC (European-Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves and that 2 minutes of website traffic are sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare these with the publication times of messages on Twitter. Finally, we check whether the numbers of tweets and of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together, these results will tell us whether the reaction of eyewitnesses to ground shaking, as observed through Twitter and the EMSC website, is tool-specific or whether it reflects people's actual reactions.

  17. Estimation of source parameters of Chamoli Earthquake, India

    Indian Academy of Sciences (India)

    R. Narasimhan, Krishtel eMaging Solutions

    experienced two more devastating earthquakes of magnitude greater than 6.0 in the last decade, namely the Uttarkashi earthquake in 1991 and the Chamoli earthquake in 1999 (Rajendran et al 2000; Rastogi 2000). The effects of these earthquakes were felt up to approximately 300 km away, in the city of Delhi. In the recent earthquake ...

  18. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) is, even after 25 years, one of the most painful events and remains etched in Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead, 10,000 injured, and 3000 families losing their homes). Nowadays, the most frequent and important question is: what if this earthquake were repeated today? In this study, we simulate the ground-motion shaking of an earthquake of the same size as that of 12 October 1992 and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the assessment clearly indicates that the losses and damage could be two or three times greater in Cairo than in the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Sharq) are at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates indicate that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risk. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb). Moreover, about 75% of the casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  19. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    International Nuclear Information System (INIS)

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19 September 2006. Six factors in the evaluation of earthquake vibration are considered on the basis of recent earthquakes: 1) evaluation of earthquake vibration by a method using a fault model, 2) investigation and approval of active faults, 3) direct-hit earthquakes, 4) assumption of short active faults as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) remaining risk. A guiding principle of the revision required a new method of evaluating earthquake vibration using fault models, and an evaluation of the probability of earthquake vibration. The remaining risk means that the facilities and people are put in danger when an earthquake stronger than the design earthquake occurs; accordingly, this scatter has to be considered in the evaluation of earthquake vibration. The earthquake belt and strong vibration pulses of the 1995 Hyogo-Nanbu earthquake, the relation between the length of the surface earthquake fault and the hypocentral fault, and the distribution of seismic intensity of the 1993 off-Kushiro earthquake are shown. (S.Y.)

  20. Istanbul Earthquake Early Warning System

    Science.gov (United States)

    Alcik, H.; Mert, A.; Ozel, O.; Erdik, M.

    2007-12-01

    As part of the preparations for a future earthquake in Istanbul, a Rapid Response and Early Warning system is in operation in the metropolitan area. For the Early Warning system, ten strong-motion stations were installed as close as possible to the fault zone. Continuous on-line data from these stations, transmitted via digital radio modem, provide early warning for potentially disastrous earthquakes. Considering the complexity of fault rupture and the short fault distances involved, a simple and robust Early Warning algorithm, based on the exceedance of specified threshold time-domain amplitude levels, is implemented. The band-pass filtered accelerations and the cumulative absolute velocity (CAV) are compared with specified threshold levels. When any acceleration or CAV (on any channel) at a given station exceeds its threshold value, this is considered a vote. Whenever two stations vote within a selectable time interval after the first vote, the first alarm is declared. In order to specify appropriate threshold levels, a data set of near-field strong ground motion records from Turkey and around the world has been analyzed, and correlations among these thresholds in terms of epicentral distance and earthquake magnitude have been studied. The encrypted early warning signals will be communicated to the respective end users. Depending on the location of the earthquake (the initiation of fault rupture) and the recipient facility, the warning time can be as much as about 8 s. The first users of the early warning signal will be the Istanbul gas company (IGDAS) and the metro line using the immersed tube tunnel (MARMARAY). Other prospective users are power plants and power distribution systems, nuclear research facilities, critical chemical factories, petroleum facilities and high-rise buildings. In this study, different algorithms based on PGA, CAV and various definitions of instrumental intensity will be discussed and the triggering threshold levels of these parameters will be studied
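
    The threshold-and-voting logic described above can be sketched as follows. This is a minimal illustration, not the operational Istanbul code: the thresholds, voting window, and vote count are placeholder values, and the band-pass filtering step applied to the real records is omitted.

```python
# Minimal sketch of CAV-based threshold voting for earthquake early warning.
import numpy as np

def cav(acc, dt):
    """Cumulative absolute velocity: running integral of |acceleration| over time."""
    return np.cumsum(np.abs(acc)) * dt

def station_vote_time(acc, dt, pga_threshold, cav_threshold):
    """Earliest time (s) at which either the acceleration or the CAV threshold is exceeded."""
    exceed = (np.abs(acc) > pga_threshold) | (cav(acc, dt) > cav_threshold)
    return float(np.argmax(exceed)) * dt if exceed.any() else None

def declare_alarm(vote_times, window=5.0, votes_needed=2):
    """Alarm when `votes_needed` stations vote within `window` seconds of each other."""
    times = sorted(t for t in vote_times if t is not None)
    for i in range(len(times) - votes_needed + 1):
        if times[i + votes_needed - 1] - times[i] <= window:
            return times[i + votes_needed - 1]   # time of the vote that completes the alarm
    return None

# Example: three stations, two of which vote at 2.0 s and 4.5 s; one never votes.
print(declare_alarm([2.0, None, 4.5], window=5.0, votes_needed=2))   # -> 4.5
```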

  1. Ground Shaking and Earthquake Engineering Aspects of the M 8.8 Chile Earthquake of 2010 - Applications to Cascadia and Other Subduction Zones (Invited)

    Science.gov (United States)

    Cassidy, J. F.; Boroschek, R.; Ventura, C.; Huffman, S.

    2010-12-01

    The M 8.8 Maule, Chile earthquake of February 27, 2010 was the fifth largest earthquake ever recorded by seismographs and provides a rare opportunity to compare strong shaking observations with earthquake rupture and damage patterns. This subduction earthquake was caused by up to 13 m of eastward slip of the Nazca plate beneath the South American plate. The rupture zone extended nearly 600 km along the Chile coast and covered the most populated region of the country - extending from south of Concepcion to just south of Valparaiso (near the latitude of Santiago). As this is the type of earthquake that is expected along the Cascadia subduction zone of western Canada and the U.S., and given that modern building codes and construction styles in Chile and Cascadia are very similar, the Canadian Association of Earthquake Engineers sent a team of 10 engineers and a seismologist to the earthquake zone to learn from this earthquake. In this presentation we focus on sites where strong ground shaking was recorded (the data available to date range from about 0.1g to 0.66g). The recorded waveforms showed strong shaking for up to 2-3 minutes, with two distinct bursts of energy that may correspond to two large asperities that ruptured. At many locations, particularly along the coast, the recorded shaking levels exceeded code values, especially at longer periods (~ 1 second and longer). There was significant damage to older hospitals and schools. Twenty-five hospitals were severely damaged (17 collapsed, 8 repairable) and in the Maule region, 45% of the hospital beds were lost. More than 2500 schools were damaged and more than 780,000 students were affected. Of about 12,000 bridges in Chile, only 40 were damaged, 20 severely (many of these were newer overpasses). Modern high-rise buildings, in general, did very well. Of the 10,000 3-storey or higher buildings constructed since 1985, only 4 collapsed, and 50-150 were badly damaged. This clearly demonstrates the importance of modern

  2. The Canterbury Tales: Lessons from the Canterbury Earthquake Sequence to Inform Better Public Communication Models

    Science.gov (United States)

    McBride, S.; Tilley, E. N.; Johnston, D. M.; Becker, J.; Orchiston, C.

    2015-12-01

    This research evaluates the public earthquake education material available prior to the Canterbury earthquake sequence (2010-present) and examines communication lessons to create recommendations for improving the implementation of these types of campaigns in the future. The research comes from the practitioner perspective of someone who worked on these campaigns in Canterbury prior to the earthquake sequence and who was also the Public Information Manager Second in Command during the earthquake response in February 2011. Documents created prior to the earthquake sequence, specifically those addressing seismic risk, were analyzed for how closely they aligned with best-practice academic research, using a "best practice matrix" created by the researcher. Readability tests and word counts were also employed to assist with triangulation of the data, as was practitioner involvement. This research also outlines the lessons learned by practitioners and explores their experiences in creating these materials and how they perceive them now, given all that has happened since the booklets' inception. The findings showed that these documents lacked many of the attributes of best practice. The overly long, jargon-filled text had few positive outcome-expectancy messages and probably failed to persuade anyone that earthquakes were a real threat in Canterbury. Paradoxically, these booklets may instead have created fatalism in the publics who read them. While the overall intention was positive - for scientists to explain earthquakes, tsunami, landslides and other risks and to encourage the public to prepare for these events - the implementation could be greatly improved. The final component of the research highlights points of improvement for the implementation of more successful campaigns in the future. The importance of preparedness and science information campaigns lies not only in preparing the population but also in the development of

  3. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Lifelines

    Science.gov (United States)

    Schiff, Anshel J.

    1998-01-01

    To the general public who had their televisions tuned to watch the World Series, the 1989 Loma Prieta earthquake was a lifelines earthquake. It was the images seen around the world of the collapsed Cypress Street viaduct, with the frantic and heroic efforts to pull survivors from the structure that was billowing smoke; the collapsed section of the San Francisco-Oakland Bay Bridge and subsequent home video of a car plunging off the open span; and the spectacular fire in the Marina District of San Francisco fed by a broken gasline. To many of the residents of the San Francisco Bay region, the relation of lifelines to the earthquake was characterized by sitting in the dark because of power outage, the inability to make telephone calls because of network congestion, and the slow and snarled traffic. Had the public been aware of the actions of the engineers and tradespeople working for the utilities and other lifeline organizations on the emergency response and restoration of lifelines, the lifeline characteristics of this earthquake would have been even more significant. Unobserved by the public were the warlike devastation in several electrical-power substations, the 13 miles of gas-distribution lines that had to be replaced in several communities, and the more than 1,200 leaks and breaks in water mains and service connections that had to be excavated and repaired. Like the 1971 San Fernando, Calif., earthquake, which was a seminal event for activity to improve the earthquake performance of lifelines, the 1989 Loma Prieta earthquake demonstrated that the tasks of preparing lifelines in 'earthquake country' were incomplete-indeed, new lessons had to be learned.

  4. Why local people did not present a problem in the 2016 Kumamoto earthquake, Japan though people accused in the 2009 L'Aquila earthquake?

    Science.gov (United States)

    Sugimoto, M.

    2016-12-01

    Risk communication has been a big issue among seismologists worldwide since the 2009 L'Aquila earthquake. Many people remember the seven researchers, the "L'Aquila 7", who were accused in Italy. Seismologists have said that it is impossible to predict an earthquake with today's science and technology, and they have joined more outreach activities. "In a subsequent inquiry of the handling of the disaster, seven members of the Italian National Commission for the Forecast and Prevention of Major Risks were accused of giving "inexact, incomplete and contradictory" information about the danger of the tremors prior to the main quake. On 22 October 2012, six scientists and one ex-government official were convicted of multiple manslaughter for downplaying the likelihood of a major earthquake six days before it took place. They were each sentenced to six years' imprisonment (Wikipedia)". Ultimately, the six scientists were found not guilty. The 2016 Kumamoto earthquake hit Kyushu, Japan in April. The seismological situations of the 2016 Kumamoto earthquake and the 2009 L'Aquila earthquake are very similar. The foreshock, Mj 6.5 and Mw 6.2, occurred on 14 April 2016; the mainshock was Mj 7.3 and Mw 7.0. The Japan Meteorological Agency (JMA) initially treated the foreshock as the mainshock, before the mainshock occurred. Forty-one people died in the mainshock in Japan. However, local people in Japan did not accuse scientists. There had been few large earthquakes in Kumamoto for about 100 years, so people in Kyushu were not well prepared to handle earthquake information. Why are there such differences between Japan and Italy? We can learn about outreach activities for scientists from this case.

  5. Great East Japan Earthquake Tsunami

    Science.gov (United States)

    Iijima, Y.; Minoura, K.; Hirano, S.; Yamada, T.

    2011-12-01

    The 11 March 2011, Mw 9.0 Great East Japan Earthquake, already among the most destructive earthquakes in modern history, emanated from a fault rupture that extended an estimated 500 km along the Pacific coast of Honshu. This earthquake is the fourth largest among the five strongest temblors recorded since AD 1900 and the largest in Japan since modern instrumental recordings began 130 years ago. The earthquake triggered a huge tsunami, which inundated the seaside areas of the Pacific coast of eastern Japan, causing devastating damage along the coast. Artificial structures were destroyed and planted forests were thoroughly eroded. The inrush of turbulent flows washed over backshore areas and dunes, and coastal materials, including beach sand, were transported inland by the run-up currents. Just after the tsunami, we started a field investigation in the Sendai plain, measuring the thickness and distribution of the sediment layers deposited by the tsunami and the inundation depth of the water. Ripple marks showing the direction of sediment transport were an important object of observation. We used a soil auger for collecting sediments in the field, and sediment samples were submitted for grain-size and interstitial water chemistry analyses. Satellite images and aerial photographs are very useful for estimating the hydrogeological effects of tsunami inundation. We checked the correspondence of micro-topography, vegetation and sediment cover before and after the tsunami. The most conspicuous phenomenon is the damage to pine forests planted for the purpose of preventing sand drift. About ninety-five percent of the vegetation cover was lost during the period of rapid currents following the first wave. The landward slopes of seawalls were mostly damaged or destroyed. Some aerial photographs provide detailed records of wave destruction just behind seawalls, which indicates the occurrence of supercritical flows. The large-scale erosion of the backshore behind seawalls is interpreted to have been caused by

  6. The 2016 Kumamoto earthquake sequence.

    Science.gov (United States)

    Kato, Aitaro; Nakamura, Kouji; Hiyama, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest.

  7. Earthquake lights and rupture processes

    Directory of Open Access Journals (Sweden)

    T. V. Losseva

    2005-01-01

    Full Text Available A physical model of earthquake lights is proposed. It is suggested that the magnetic diffusion from the electric and magnetic fields source region is a dominant process, explaining the rather high localization of the light flashes. A 3D numerical code has been developed that allows taking into account an arbitrary distribution of currents caused by ground motion, the conductivity in the ground and at its surface, and the existence of sea water above the epicenter or (and) near the ruptured segments of the fault. Simulations for the 1995 Kobe earthquake were conducted taking into account the existence of sea water with a realistic geometry of the shores. The results do not contradict the eyewitness reports and scarce measurements of the electric and magnetic fields at large distances from the epicenter.

  8. The 2016 Kumamoto earthquake sequence

    Science.gov (United States)

    KATO, Aitaro; NAKAMURA, Kouji; HIYAMA, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest. PMID:27725474

  9. Dim prospects for earthquake prediction

    Science.gov (United States)

    Geller, Robert J.

    I was misquoted by C. Lomnitz's [1998] Forum letter (Eos, August 4, 1998, p. 373), which said: “I wonder whether Sasha Gusev [1998] actually believes that branding earthquake prediction a ‘proven nonscience’ [Geller, 1997a] is a paradigm for others to copy.” Readers are invited to verify for themselves that neither “proven nonscience” nor any similar phrase was used by Geller [1997a].

  10. Data base pertinent to earthquake design basis

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1988-01-01

    Mitigation of earthquake risk from impending strong earthquakes is possible provided the hazard can be assessed, and translated into appropriate design inputs. This requires defining the seismic risk problem, isolating the risk factors and quantifying risk in terms of physical parameters, which are suitable for application in design. Like all other geological phenomena, past earthquakes hold the key to the understanding of future ones. Quantification of seismic risk at a site calls for investigating the earthquake aspects of the site region and building a data base. The scope of such investigations is illustrated in Figures 1 and 2. A more detailed definition of the earthquake problem in engineering design is given elsewhere (Sharma, 1987). The present document discusses the earthquake data base, which is required to support a seismic risk evaluation programme in the context of the existing state of the art. (author). 8 tables, 10 figs., 54 refs

  11. On the plant operators performance during earthquake

    International Nuclear Information System (INIS)

    Kitada, Y.; Yoshimura, S.; Abe, M.; Niwa, H.; Yoneda, T.; Matsunaga, M.; Suzuki, T.

    1994-01-01

    There is little data on which to judge the performance of plant operators during and after strong earthquakes. In order to obtain such data and enhance the reliability of plant operation, a Japanese utility and a power plant manufacturer carried out a vibration test using a shaking table. The purpose of the test was to investigate operator performance, i.e., the quickness and correctness in switch handling and panel meter read-out. The movement of chairs during an earthquake was also of interest, because if the chairs moved significantly or turned over during a strong earthquake, some arresting mechanism would be required for the chair. Although there were differences between the simulated earthquake motions used and actual earthquakes, mainly due to the specifications of the shaking table, the earthquake motions had almost no influence on the operators' capability (performance) in operating the simulated console and the personal computers.

  12. Intelligent earthquake data processing for global adjoint tomography

    Science.gov (United States)

    Chen, Y.; Hill, J.; Li, T.; Lei, W.; Ruan, Y.; Lefebvre, M. P.; Tromp, J.

    2016-12-01

    Due to the increased computational capability afforded by modern and future computing architectures, the seismology community is demanding a more comprehensive understanding of the full waveform information from the recorded earthquake seismograms. Global waveform tomography is a complex workflow that matches observed seismic data with synthesized seismograms by iteratively updating the earth model parameters based on the adjoint state method. This methodology allows us to compute a very accurate model of the earth's interior. The synthetic data is simulated by solving the wave equation in the entire globe using a spectral-element method. In order to ensure the inversion accuracy and stability, both the synthesized and observed seismograms must be carefully pre-processed. Because the scale of the inversion problem is extremely large and there is a very large volume of data to both be read and written, an efficient and reliable pre-processing workflow must be developed. We are investigating intelligent algorithms based on a machine-learning (ML) framework that will automatically tune parameters for the data processing chain. One straightforward application of ML in data processing is to classify all possible misfit calculation windows into usable and unusable ones, based on some intelligent ML models such as neural networks, support vector machines, or principal component analysis. The intelligent earthquake data processing framework will enable the seismology community to compute the global waveform tomography using seismic data from an arbitrarily large number of earthquake events in the fastest, most efficient way.
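
    A minimal sketch of the window-classification step described in this record, assuming a support-vector machine and three toy window features (cross-correlation, signal-to-noise ratio, time shift); the features, labels, and thresholds are illustrative assumptions, not the authors' actual processing chain.

```python
# Hedged illustration: classify measurement windows as usable/unusable with an SVM,
# one of the ML options the record lists. Features and labels are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Toy features per window: cross-correlation, signal-to-noise ratio, time shift (s).
X = np.column_stack([
    rng.uniform(0.0, 1.0, 1000),    # cross-correlation coefficient
    rng.uniform(0.5, 20.0, 1000),   # signal-to-noise ratio
    rng.uniform(-10.0, 10.0, 1000)  # time shift in seconds
])
# Assumed labeling rule: a window is "usable" (1) when correlation and SNR are both high.
y = ((X[:, 0] > 0.7) & (X[:, 1] > 3.0)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```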

  13. Crowdsourcing earthquake damage assessment using remote sensing imagery

    Directory of Open Access Journals (Sweden)

    Stuart Gill

    2011-06-01

    Full Text Available This paper describes the evolution of recent work on using crowdsourced analysis of remote sensing imagery, particularly high-resolution aerial imagery, to provide rapid, reliable assessments of damage caused by earthquakes and potentially other disasters. The initial effort examined online imagery taken after the 2008 Wenchuan, China, earthquake. A more recent response to the 2010 Haiti earthquake led to the formation of an international consortium: the Global Earth Observation Catastrophe Assessment Network (GEO-CAN). The success of GEO-CAN in contributing to the official damage assessments made by the Government of Haiti, the United Nations, and the World Bank led to further development of a web-based interface. A current initiative in Christchurch, New Zealand, is underway where remote sensing experts are analyzing satellite imagery, geotechnical engineers are marking liquefaction areas, and structural engineers are identifying building damage. The current site includes online training to improve the accuracy of the assessments and make it possible for even novice users to contribute to the crowdsourced solution. The paper discusses lessons learned from these initiatives and presents a way forward for using crowdsourced remote sensing as a tool for rapid assessment of damage caused by natural disasters around the world.

  14. Deviant Earthquakes: Data-driven Constraints on the Variability in Earthquake Source Properties and Seismic Hazard

    OpenAIRE

    Trugman, Daniel T

    2017-01-01

    The complexity of the earthquake rupture process makes earthquakes inherently unpredictable. Seismic hazard forecasts often presume that the rate of earthquake occurrence can be adequately modeled as a space-time homogeneous or stationary Poisson process and that the relation between the dynamical source properties of small and large earthquakes obeys self-similar scaling relations. While these simplified models provide useful approximations and encapsulate the first-order statistical feature...

  15. 3. Waveform and Spectral Features of Earthquake Swarms and Foreshocks : in Special Reference to Earthquake Prediction

    OpenAIRE

    Tsujiura, Masaru

    1983-01-01

    Through the analyses of waveforms and spectra for the earthquake swarm, foreshock and ordinary seismic activities, some differences in the activity mode are found among those activities. The most striking difference is the "similarity of waveform". The earthquake swarm activity which occurred in a certain short time interval mainly consists of events with similar waveforms, belonging to the event group called "similar earthquakes" or an "earthquake family". On the other hand, the foresh...

  16. Local earthquake tomography of Scotland

    Science.gov (United States)

    Luckett, Richard; Baptie, Brian

    2015-03-01

    Scotland is a relatively aseismic region for the use of local earthquake tomography, but 40 yr of earthquakes recorded by a good and growing network make it possible. A careful selection is made from the earthquakes located by the British Geological Survey (BGS) over the last four decades to provide a data set maximising arrival time accuracy and ray path coverage of Scotland. A large number of 1-D velocity models with different layer geometries are considered and differentiated by employing quarry blasts as ground-truth events. Then, SIMULPS14 is used to produce a robust 3-D tomographic P-wave velocity model for Scotland. In areas of high resolution the model shows good agreement with previously published interpretations of seismic refraction and reflection experiments. However, the model shows relatively little lateral variation in seismic velocity except at shallow depths, where sedimentary basins such as the Midland Valley are apparent. At greater depths, higher velocities in the northwest parts of the model suggest that the thickness of crust increases towards the south and east. This observation is also in agreement with previous studies. Quarry blasts used as ground truth events and relocated with the preferred 3-D model are shown to be markedly more accurate than when located with the existing BGS 1-D velocity model.

  17. Pre-earthquake Magnetic Pulses

    Science.gov (United States)

    Scoville, J.; Heraud, J. A.; Freund, F. T.

    2015-12-01

    A semiconductor model of rocks is shown to describe unipolar magnetic pulses, a phenomenon that has been observed prior to earthquakes. These pulses are suspected to be generated deep in the Earth's crust, in and around the hypocentral volume, days or even weeks before earthquakes. Their extremely long wavelength allows them to pass through kilometers of rock. Interestingly, when the sources of these pulses are triangulated, the locations coincide with the epicenters of future earthquakes. We couple a drift-diffusion semiconductor model to a magnetic field in order to describe the electromagnetic effects associated with electrical currents flowing within rocks. The resulting system of equations is solved numerically and it is seen that a volume of rock may act as a diode that produces transient currents when it switches bias. These unidirectional currents are expected to produce transient unipolar magnetic pulses similar in form, amplitude, and duration to those observed before earthquakes, and this suggests that the pulses could be the result of geophysical semiconductor processes.

  18. Earthquake-protective pneumatic foundation

    Science.gov (United States)

    Shustov, Valentin

    2000-04-01

    The main objective of the research in progress is to evaluate the applicability of an innovative earthquake-protective system called pneumatic foundation to building construction and industrial equipment. The system represents a kind of seismic soil isolation. The research is analytical and accompanied by limited testing on a shake table. The concept of partial suppression of seismic energy flow inside a structure is known as seismic or base isolation. Normally, this technique needs some pads to be inserted into all major load-carrying elements in the base of the building. It also requires creating additional rigidity diaphragms in the basement and a moat around the building, as well as making additional provisions against overturning and/or the P-Δ effect. Besides, potential benefits of base isolation techniques should not be taken for granted: they depend on many internal and external factors. The author developed a new earthquake protective technique called pneumatic foundation. Its main components are: a horizontal protective layer located under the footing at a certain depth, and a vertical one installed along the horizontal protective layer perimeter. The first experiments proved a sizable screening effect of the pneumatic foundation: two identical models of a steel frame building, put simultaneously on the same vibrating support simulating an earthquake, performed in a strikingly different manner: while the regular building model shook vigorously, the model on a pneumatic foundation just slightly trembled.

  19. Understanding Great Earthquakes in Japan's Kanto Region

    Science.gov (United States)

    Kobayashi, Reiji; Curewitz, Daniel

    2008-10-01

    Third International Workshop on the Kanto Asperity Project; Chiba, Japan, 16-19 February 2008; The 1703 (Genroku) and 1923 (Taisho) earthquakes in Japan's Kanto region (M 8.2 and M 7.9, respectively) caused severe damage in the Tokyo metropolitan area. These great earthquakes occurred along the Sagami Trough, where the Philippine Sea slab is subducting beneath Japan. Historical records, paleoseismological research, and geophysical/geodetic monitoring in the region indicate that such great earthquakes will repeat in the future.

  20. If pandas scream, an earthquake is coming

    Energy Technology Data Exchange (ETDEWEB)

    Magida, P.

    Feature article: Use of the behavior of animals to predict weather has spanned several ages and dozens of countries. While animals may behave in diverse ways to indicate weather changes, they all tend to behave in more or less the same way before earthquakes. The geophysical community in the U.S. has begun testing animal behavior before earthquakes. It has been determined that animals have the potential of acting as accurate geosensors to detect earthquakes before they occur. (5 drawings)

  1. April 25, 2015, Gorkha Earthquake, Nepal and Sequence of Aftershocks: Key Lessons

    Science.gov (United States)

    Guragain, R.; Dixit, A. M.; Shrestha, S. N.

    2015-12-01

    The Gorkha Earthquake of M7.8 hit Nepal on April 25, 2015 at 11:56 am local time. The epicenter of this earthquake was Barpak, Gorkha, 80 km northwest of Kathmandu Valley. The main shock was followed by hundreds of aftershocks, including M6.6 and M6.7 events within 48 hours and an M7.3 event on May 12, 2015. According to the Government of Nepal, a total of 8,686 people lost their lives, 16,808 people were injured, over 500,000 buildings completely collapsed, and more than 250,000 buildings were partially damaged. The National Society for Earthquake Technology - Nepal (NSET), a not-for-profit civil society organization that has focused on earthquake risk reduction in Nepal for the past 21 years, conducted various activities to support people and the government in responding to the earthquake disaster. The activities included: i) assisting people and critical facility institutions in conducting rapid visual building damage assessments, including training; ii) an information campaign to provide proper information regarding earthquake safety; iii) supporting rescue organizations in search and rescue operations; and iv) providing technical support to ordinary people on the repair and retrofit of damaged houses. NSET is also involved in carrying out studies related to earthquake damage, geotechnical problems, and causes of building damage. Additionally, NSET has done post-earthquake detailed damage assessment of buildings throughout the affected areas. Prior to the earthquake, NSET had been working with several institutions to improve the seismic performance of school buildings, private residential houses, and other critical structures. Such activities implemented during the past decade have shown the effectiveness of risk reduction. Retrofitted school buildings performed very well during the earthquake. Preparedness activities implemented at community levels have helped communities to respond immediately and save lives. A higher level of earthquake awareness was achieved, including safe behavior, better understanding of

  2. Near-real-time Earthquake Notification and Response in the Classroom: Exploiting the Teachable Moment

    Science.gov (United States)

    Furlong, K. P.; Whitlock, J. S.; Benz, H. M.

    2002-12-01

    Earthquakes occur globally, on a regular but (as yet) non-predictable basis, and their effects are both dramatic and often devastating. Additionally they serve as a primary tool to image the earth and define the active processes that drive tectonics. As a result, earthquakes can be an extremely effective tool for helping students to learn about active earth processes, natural hazards, and the myriad of issues that arise with non-predictable but potentially devastating natural events. We have developed and implemented a real-time earthquake alert system (EAS) built on the USGS Earthworm system to bring earthquakes into the classroom. Through our EAS, students in our General Education class on Natural Hazards (Earth101 - Natural Disasters: Hollywood vs. Reality) participate in earthquake response activities in ways similar to earthquake hazard professionals - they become part of the response to the event. Our implementation of the Earthworm system allows our students to be paged via cell-phone text messaging (Yes, we provide cell phones to the 'duty seismologists'), and they respond to those pages as appropriate for their role. A parallel web server is maintained that provides the earthquake details (location maps, waveforms etc.) and students produce time-critical output such as news releases, analyses of earthquake trends in the region, and reports detailing implications of the events. Since this is a course targeted at non-science majors, we encourage them to bring their own expertise into the analyses. For example, business or economics majors may investigate the economic impacts of an earthquake, secondary education majors may work on teaching modules based on the information they gather, etc. Since the students know that they are responding to real events they develop ownership of the information they gather and they recognize the value of real-time response. Our educational goals in developing this system include: (1) helping students develop a sense of the

  3. Methodology to determine the parameters of historical earthquakes in China

    Science.gov (United States)

    Wang, Jian; Lin, Guoliang; Zhang, Zhe

    2017-12-01

    China is one of the countries with the longest cultural traditions. Meanwhile, China has suffered very heavy earthquake disasters, so there are abundant earthquake records. In this paper, we try to sketch out historical earthquake sources and research achievements in China. We introduce some basic information about the collections of historical earthquake sources, the establishment of an intensity scale, and the editions of historical earthquake catalogues. Spatial-temporal and magnitude distributions of historical earthquakes are analyzed briefly. Besides traditional methods, we also illustrate a new approach to amend the parameters of historical earthquakes or even identify candidate zones for large historical or palaeo-earthquakes. In the new method, a relationship between instrumentally recorded small earthquakes and strong historical earthquakes is built up. The abundant historical earthquake sources and the achievements of historical earthquake research in China are a valuable part of the world's cultural heritage.

  4. Global Significant Earthquake Database, 2150 BC to present

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Significant Earthquake Database is a global listing of over 5,700 earthquakes from 2150 BC to the present. A significant earthquake is classified as one that...

  5. The 2015 Gorkha (Nepal) earthquake: unfinished business Large ...

    Indian Academy of Sciences (India)

    jaj2

    The 2015 Gorkha (Nepal) earthquake: unfinished business. Large earthquakes in the Himalaya and India. James Jackson, Bullard Laboratories, University of Cambridge. (Presentation fragments only; topics include Moho depth and earthquake depth; cited works: Jackson, GSA Today, 2002; Sloan et al., GJI, 2011.)

  6. Smoking prevalence increases following Canterbury earthquakes.

    Science.gov (United States)

    Erskine, Nick; Daley, Vivien; Stevenson, Sue; Rhodes, Bronwen; Beckert, Lutz

    2013-01-01

    A magnitude 7.1 earthquake hit Canterbury in September 2010. This earthquake and associated aftershocks took the lives of 185 people and drastically changed residents' living, working, and social conditions. To explore the impact of the earthquakes on smoking status and levels of tobacco consumption in the residents of Christchurch. Semistructured interviews were carried out in two city malls and the central bus exchange 15 months after the first earthquake. A total of 1001 people were interviewed. In August 2010, prior to any earthquake, 409 (41%) participants had never smoked, 273 (27%) were currently smoking, and 316 (32%) were ex-smokers. Since the September 2010 earthquake, 76 (24%) of the 316 ex-smokers had smoked at least one cigarette and 29 (38.2%) had smoked more than 100 cigarettes. Of the 273 participants who were current smokers in August 2010, 93 (34.1%) had increased consumption following the earthquake, 94 (34.4%) had not changed, and 86 (31.5%) had decreased their consumption. 53 (57%) of the 93 people whose consumption increased reported the earthquake and subsequent lifestyle changes as a reason for increased smoking. 24% of ex-smokers resumed smoking following the earthquake, resulting in increased smoking prevalence. Tobacco consumption levels increased in around one-third of current smokers.

  7. Thermal infrared anomalies of several strong earthquakes.

    Science.gov (United States)

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies which have been interpreted as preseismic precursor in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after the 8 great earthquakes with magnitude up to Ms7.0 by using the satellite infrared remote sensing information. We used new types of data and method to extract the useful anomaly information. Based on the analyses of 8 earthquakes, we got the results as follows. (1) There are significant thermal radiation anomalies before and after earthquakes for all cases. The overall performance of anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by method of "time-frequency relative power spectrum." (2) There exist evident and different characteristic periods and magnitudes of thermal abnormal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation has obvious characteristics in abnormal duration, range, and morphology. In summary, we should be sure that earthquake thermal infrared anomalies as useful earthquake precursor can be used in earthquake prediction and forecasting.

  8. Impact- and earthquake- proof roof structure

    International Nuclear Information System (INIS)

    Shohara, Ryoichi.

    1990-01-01

    Building roofs are constituted of roof slabs, an earthquake-proof layer on the upper surface thereof, and an impact-proof layer made of iron-reinforced concrete disposed further above. Since the roofs constitute an earthquake-proof structure in which building dampers on the upper surface of the slabs are loaded by the concrete layer, the seismic input of earthquakes to the buildings can be moderated, and the impact-proof layer is formed to ensure safety against external events such as earthquakes or airplane crashes in important facilities such as reactor buildings. (T.M.)

  9. Evaluation and cataloging of Korean historical earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kew Hwa; Han, Young Woo; Lee, Jun Hui; Park, Ji Eok; Na, Kwang Wooing; Shin, Byung Ju [The Research Institute of Basic Sciences, Seoul National Univ., Seoul (Korea, Republic of)]

    1998-03-15

    In order to systematically collect and analyze the historical earthquake data of the Korean peninsula, which are very important to seismologists and historians in analyzing the seismicity and seismic risk of the peninsula, extensive governmental and private historical documents were investigated and the relative reliabilities of these documents were examined. This research unearthed about 70 new earthquake records and revealed the change over time in the cultural, political and social effects of earthquakes in Korea. Also, results of a vibration test of a traditional Korean wooden house were obtained in order to better estimate the intensities of the historical earthquakes.

  10. Smoking Prevalence Increases following Canterbury Earthquakes

    Directory of Open Access Journals (Sweden)

    Nick Erskine

    2013-01-01

    Full Text Available Background. A magnitude 7.1 earthquake hit Canterbury in September 2010. This earthquake and associated aftershocks took the lives of 185 people and drastically changed residents’ living, working, and social conditions. Aim. To explore the impact of the earthquakes on smoking status and levels of tobacco consumption in the residents of Christchurch. Methods. Semistructured interviews were carried out in two city malls and the central bus exchange 15 months after the first earthquake. A total of 1001 people were interviewed. Results. In August 2010, prior to any earthquake, 409 (41%) participants had never smoked, 273 (27%) were currently smoking, and 316 (32%) were ex-smokers. Since the September 2010 earthquake, 76 (24%) of the 316 ex-smokers had smoked at least one cigarette and 29 (38.2%) had smoked more than 100 cigarettes. Of the 273 participants who were current smokers in August 2010, 93 (34.1%) had increased consumption following the earthquake, 94 (34.4%) had not changed, and 86 (31.5%) had decreased their consumption. 53 (57%) of the 93 people whose consumption increased reported the earthquake and subsequent lifestyle changes as a reason for increased smoking. Conclusion. 24% of ex-smokers resumed smoking following the earthquake, resulting in increased smoking prevalence. Tobacco consumption levels increased in around one-third of current smokers.

  11. Thermal Infrared Anomalies of Several Strong Earthquakes

    Directory of Open Access Journals (Sweden)

    Congxin Wei

    2013-01-01

    Full Text Available In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies which have been interpreted as preseismic precursor in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after the 8 great earthquakes with magnitude up to Ms7.0 by using the satellite infrared remote sensing information. We used new types of data and method to extract the useful anomaly information. Based on the analyses of 8 earthquakes, we got the results as follows. (1) There are significant thermal radiation anomalies before and after earthquakes for all cases. The overall performance of anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by method of “time-frequency relative power spectrum.” (2) There exist evident and different characteristic periods and magnitudes of thermal abnormal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation has obvious characteristics in abnormal duration, range, and morphology. In summary, we should be sure that earthquake thermal infrared anomalies as useful earthquake precursor can be used in earthquake prediction and forecasting.

  12. A minimalist model of characteristic earthquakes

    Directory of Open Access Journals (Sweden)

    M. Vázquez-Prada

    2002-01-01

    Full Text Available In a spirit akin to the sandpile model of self-organized criticality, we present a simple statistical model of the cellular-automaton type which simulates the role of an asperity in the dynamics of a one-dimensional fault. This model produces an earthquake spectrum similar to the characteristic-earthquake behaviour of some seismic faults. This model, that has no parameter, is amenable to an algebraic description as a Markov Chain. This possibility illuminates some important results, obtained by Monte Carlo simulations, such as the earthquake size-frequency relation and the recurrence time of the characteristic earthquake.

  13. A minimalist model of characteristic earthquakes

    DEFF Research Database (Denmark)

    Vázquez-Prada, M.; González, Á.; Gómez, J.B.

    2002-01-01

    In a spirit akin to the sandpile model of self- organized criticality, we present a simple statistical model of the cellular-automaton type which simulates the role of an asperity in the dynamics of a one-dimensional fault. This model produces an earthquake spectrum similar to the characteristic-earthquake...... behaviour of some seismic faults. This model, that has no parameter, is amenable to an algebraic description as a Markov Chain. This possibility illuminates some important results, obtained by Monte Carlo simulations, such as the earthquake size-frequency relation and the recurrence time...... of the characteristic earthquake....

  14. Statistical tests of simple earthquake cycle models

    Science.gov (United States)

    Devries, Phoebe M. R.; Evans, Eileen

    2016-01-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM ~ 4.6 × 10²⁰ Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
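
    A minimal sketch of the kind of two-sample Kolmogorov-Smirnov comparison this record describes, using scipy; the "observed" and "modeled" samples below are synthetic placeholders rather than the 15 fault data sets or model predictions of the study.

```python
# Hedged illustration: compare an "observed" sample against a "modeled" sample
# with a two-sample KS test at alpha = 0.05. All numbers are synthetic.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
observed = rng.normal(loc=1.0, scale=0.3, size=15)   # stand-in for observed rate ratios
modeled = rng.normal(loc=1.4, scale=0.3, size=500)   # stand-in for one model's predictions

stat, p_value = ks_2samp(observed, modeled)
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3f}")
if p_value < 0.05:
    print("Reject this cycle model at the alpha = 0.05 level.")
else:
    print("Cannot reject this cycle model at the alpha = 0.05 level.")
```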

  15. Thermal Infrared Anomalies of Several Strong Earthquakes

    Science.gov (United States)

    Wei, Congxin; Guo, Xiao; Qin, Manzhong

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies which have been interpreted as preseismic precursor in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after the 8 great earthquakes with magnitude up to Ms7.0 by using the satellite infrared remote sensing information. We used new types of data and method to extract the useful anomaly information. Based on the analyses of 8 earthquakes, we got the results as follows. (1) There are significant thermal radiation anomalies before and after earthquakes for all cases. The overall performance of anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by method of “time-frequency relative power spectrum.” (2) There exist evident and different characteristic periods and magnitudes of thermal abnormal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation has obvious characteristics in abnormal duration, range, and morphology. In summary, we should be sure that earthquake thermal infrared anomalies as useful earthquake precursor can be used in earthquake prediction and forecasting. PMID:24222728

  16. Call for action for setting up an infectious disease control action plan for disaster area activities: learning from the experience of checking suffering volunteers in the field after the Great East Japan Earthquake.

    Science.gov (United States)

    Takahashi, Kenzo; Kodama, Mitsuya; Kanda, Hideyuki

    2013-12-01

    After the Great East Japan Earthquake on March 11th, 2011, a journalist visited the disaster area with febrile symptoms and was diagnosed with measles of the D genotype, which is not indigenous to Japan. After continuing activities in disaster areas and Tokyo, 11 measles cases were reported, some of which were identified as genotype D. Meanwhile non-profit activities directed towards volunteers were offered including interviews to screen for subjective symptoms, check body temperature and advise volunteers to refrain from working in shelter areas during the period of sickness. As a consequence, disease transmission was controlled among volunteers. In disaster areas, anyone can be an infection vector. In order to prevent transmission of infectious diseases, a field action plan, which includes body temperature checks and standard precautions should be formulated and put into place. If the action plans are shared among local governments and non-governmental organizations (NGOs), they can become a norm and be expected to control infectious disease transmission.

  17. Limiting the effects of earthquakes on gravitational-wave interferometers

    Science.gov (United States)

    Coughlin, Michael; Earle, Paul; Harms, Jan; Biscans, Sebastien; Buchanan, Christopher; Coughlin, Eric; Donovan, Fred; Fee, Jeremy; Gabbard, Hunter; Guy, Michelle; Mukund, Nikhil; Perry, Matthew

    2017-01-01

    Ground-based gravitational wave interferometers such as the Laser Interferometer Gravitational-wave Observatory (LIGO) are susceptible to ground shaking from high-magnitude teleseismic events, which can interrupt their operation in science mode and significantly reduce their duty cycle. It can take several hours for a detector to stabilize enough to return to its nominal state for scientific observations. The down time can be reduced if advance warning of impending shaking is received and the impact is suppressed in the isolation system with the goal of maintaining stable operation even at the expense of increased instrumental noise. Here, we describe an early warning system for modern gravitational-wave observatories. The system relies on near real-time earthquake alerts provided by the U.S. Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA). Preliminary low latency hypocenter and magnitude information is generally available within 5 to 20 min of a significant earthquake depending on its magnitude and location. The alerts are used to estimate arrival times and ground velocities at the gravitational-wave detectors. In general, 90% of the predictions for ground-motion amplitude are within a factor of 5 of measured values. The error in both arrival time and ground-motion prediction introduced by using preliminary, rather than final, hypocenter and magnitude information is minimal. By using a machine learning algorithm, we develop a prediction model that calculates the probability that a given earthquake will prevent a detector from taking data. Our initial results indicate that by using detector control configuration changes, we could prevent interruption of operation from 40 to 100 earthquake events in a 6-month time-period.
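
    A heavily simplified sketch of the two steps outlined above: predicting ground velocity at a detector from a preliminary magnitude and distance, then converting that prediction into a probability of losing lock. The functional forms and every coefficient are placeholders, not the published model.

```python
# Hedged illustration only: a toy amplitude relation and a toy logistic lockloss
# curve. None of the coefficients come from the cited study.
import numpy as np

def predicted_ground_velocity(magnitude, distance_km):
    """Toy amplitude relation: log10(v [m/s]) = a*M - b*log10(R) - c (coefficients assumed)."""
    a, b, c = 0.8, 1.5, 8.0
    return 10.0 ** (a * magnitude - b * np.log10(distance_km) - c)

def lockloss_probability(velocity, v50=5e-6, k=2.0):
    """Toy logistic curve: 50% probability of losing lock at ground velocity v50 [m/s]."""
    return 1.0 / (1.0 + (v50 / velocity) ** k)

v = predicted_ground_velocity(magnitude=7.8, distance_km=9000.0)
print(f"predicted ground velocity ~ {v:.2e} m/s")
print(f"probability of lockloss   ~ {lockloss_probability(v):.3f}")
```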

  18. A Bayesian Approach to Real-Time Earthquake Phase Association

    Science.gov (United States)

    Benz, H.; Johnson, C. E.; Earle, P. S.; Patton, J. M.

    2014-12-01

    Real-time location of seismic events requires a robust and extremely efficient means of associating and identifying seismic phases with hypothetical sources. An association algorithm converts a series of phase arrival times into a catalog of earthquake hypocenters. The classical approach based on time-space stacking of the locus of possible hypocenters for each phase arrival using the principle of acoustic reciprocity has been in use now for many years. One of the most significant problems that has emerged over time with this approach is related to the extreme variations in seismic station density throughout the global seismic network. To address this problem we have developed a novel, Bayesian association algorithm, which looks at the association problem as a dynamically evolving complex system of "many to many relationships". While the end result must be an array of one to many relations (one earthquake, many phases), during the association process the situation is quite different. Both the evolving possible hypocenters and the relationships between phases and all nascent hypocenters are many to many (many earthquakes, many phases). The computational framework we are using to address this is a responsive, NoSQL graph database where the earthquake-phase associations are represented as intersecting Bayesian Learning Networks. The approach directly addresses the network inhomogeneity issue while at the same time allowing the inclusion of other kinds of data (e.g., seismic beams, station noise characteristics, priors on estimated location of the seismic source) by representing the locus of intersecting hypothetical loci for a given datum as joint probability density functions.
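
    For illustration only, the toy sketch below associates a handful of P-wave picks with a candidate hypocenter by grid search under a Gaussian pick-error model; it is a didactic stand-in, not the Bayesian graph-database system described in this record.

```python
# Hedged illustration: score candidate source locations against synthetic P picks.
import numpy as np

V_P = 6.0  # assumed constant P-wave speed, km/s

# Station coordinates (km) and synthetic picks generated from a "true" source.
stations = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [40.0, 40.0]])
true_src, true_t0 = np.array([20.0, 10.0]), 5.0
picks = true_t0 + np.linalg.norm(stations - true_src, axis=1) / V_P

best = None
for x in np.arange(0.0, 60.0, 2.0):
    for y in np.arange(0.0, 60.0, 2.0):
        dist = np.linalg.norm(stations - np.array([x, y]), axis=1)
        t0 = np.mean(picks - dist / V_P)              # best-fitting origin time for this cell
        resid = picks - (t0 + dist / V_P)
        loglike = -0.5 * np.sum((resid / 0.2) ** 2)   # assume 0.2 s Gaussian pick error
        if best is None or loglike > best[0]:
            best = (loglike, x, y, t0)

print("best (x, y, t0):", best[1:], " log-likelihood:", round(best[0], 2))
```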

  19. Limiting the effects of earthquakes on gravitational-wave interferometers

    Science.gov (United States)

    Coughlin, Michael; Earle, Paul; Harms, Jan; Biscans, Sebastien; Buchanan, Christopher; Coughlin, Eric; Donovan, Fred; Fee, Jeremy; Gabbard, Hunter; Guy, Michelle; Mukund, Nikhil; Perry, Matthew

    2017-02-01

    Ground-based gravitational wave interferometers such as the Laser Interferometer Gravitational-wave Observatory (LIGO) are susceptible to ground shaking from high-magnitude teleseismic events, which can interrupt their operation in science mode and significantly reduce their duty cycle. It can take several hours for a detector to stabilize enough to return to its nominal state for scientific observations. The down time can be reduced if advance warning of impending shaking is received and the impact is suppressed in the isolation system with the goal of maintaining stable operation even at the expense of increased instrumental noise. Here, we describe an early warning system for modern gravitational-wave observatories. The system relies on near real-time earthquake alerts provided by the U.S. Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA). Preliminary low latency hypocenter and magnitude information is generally available within 5 to 20 min of a significant earthquake depending on its magnitude and location. The alerts are used to estimate arrival times and ground velocities at the gravitational-wave detectors. In general, 90% of the predictions for ground-motion amplitude are within a factor of 5 of measured values. The error in both arrival time and ground-motion prediction introduced by using preliminary, rather than final, hypocenter and magnitude information is minimal. By using a machine learning algorithm, we develop a prediction model that calculates the probability that a given earthquake will prevent a detector from taking data. Our initial results indicate that by using detector control configuration changes, we could prevent interruption of operation from 40 to 100 earthquake events in a 6-month time-period.

  20. Insurance Stock Prices Following the 1994 Los Angeles Earthquake

    OpenAIRE

    Thomas A. Aiuppa; Thomas M. Krueger

    1995-01-01

    This study examines the changes in insurance firm value following the 1994 Los Angeles earthquake. While prior studies found that the 1989 San Francisco earthquake was associated with an increase in earthquake insurers’ firm value, the findings of this study indicate that earthquake firms sustained their value following the 1994 earthquake. These results and their implications provide insight for investors, regulators, and other policymakers as regards future earthquakes.

  1. How Can Museum Exhibits Enhance Earthquake and Tsunami Hazard Resiliency?

    Science.gov (United States)

    Olds, S. E.

    2015-12-01

    Creating a natural disaster-ready community requires interoperating scientific, technical, and social systems. In addition to the technical elements that need to be in place, communities and individuals need to be prepared to react when a natural hazard event occurs. Natural hazard awareness and preparedness training and education often take place through informal learning at science centers and formal k-12 education programs as well as through awareness raising via strategically placed informational tsunami warning signs and placards. Museums and science centers are influential in raising science literacy within a community; however, can science centers enhance earthquake and tsunami resiliency by providing hazard science content and preparedness exhibits? Museum docents and informal educators are uniquely situated within the community. They are transmitters and translators of science information to broad audiences. Through interaction with the public, docents are well positioned to be informants of the knowledge, beliefs, and feelings of science center visitors. They themselves are life-long learners, both constantly learning from the museum content around them and sharing this content with visitors. They are also members of the community where they live. In-depth interviews with museum informal educators and docents were conducted at a science center in the coastal Pacific Northwest. This region has the potential to be struck by a great 9+ Mw earthquake and subsequent tsunami. During the interviews, docents described how they applied learning from natural hazard exhibits at a science visitor center to their daily lives. During the individual interviews, the museum docents described their awareness (knowledge, attitudes, and behaviors) of natural hazards where they live and work, the feelings evoked as they learned about their hazard vulnerability, the extent to which they applied this learning and awareness to their lives, such as creating an evacuation plan, whether

  2. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we approach the challenges in spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a decaying rate with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
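
    A minimal sketch of the power-law smoothing idea described above: each simulated epicenter contributes rate to every grid cell through a kernel that decays with epicentral distance. The kernel constants are placeholders, not the values used in the study.

```python
# Hedged illustration: spread toy epicenters over a grid with an ETAS-style
# power-law kernel; d0 and q are assumed, not the study's parameters.
import numpy as np

def smoothed_rate_map(epicenters, grid_x, grid_y, d0=5.0, q=1.5):
    """Spread each event over the grid with a (d^2 + d0^2)^(-q) kernel, then normalize."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    rate = np.zeros_like(gx)
    for ex, ey in epicenters:
        d2 = (gx - ex) ** 2 + (gy - ey) ** 2
        rate += (d2 + d0 ** 2) ** (-q)
    return rate / rate.sum()

events = np.array([[30.0, 40.0], [32.0, 41.0], [80.0, 10.0]])  # toy epicenters (km)
grid = np.arange(0.0, 100.0, 5.0)
rates = smoothed_rate_map(events, grid, grid)
print("cell with the highest forecast rate:", np.unravel_index(rates.argmax(), rates.shape))
```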

  3. Stress Drops for Potentially Induced Earthquake Sequences

    Science.gov (United States)

    Huang, Y.; Beroza, G. C.; Ellsworth, W. L.

    2015-12-01

    Stress drop, the difference between shear stress acting across a fault before and after an earthquake, is a fundamental parameter of the earthquake source process and the generation of strong ground motions. Higher stress drops usually lead to more high-frequency ground motions. Hough [2014 and 2015] observed low intensities in "Did You Feel It?" data for injection-induced earthquakes, and interpreted them to be a result of low stress drops. It is also possible that the low recorded intensities could be a result of propagation effects. Atkinson et al. [2015] show that the shallow depth of injection-induced earthquakes can lead to a lack of high-frequency ground motion as well. We apply the spectral ratio method of Imanishi and Ellsworth [2006] to analyze stress drops of injection-induced earthquakes, using smaller earthquakes with similar waveforms as empirical Green's functions (eGfs). Both the effects of path and linear site response should be cancelled out through the spectral ratio analysis. We apply this technique to the Guy-Greenbrier earthquake sequence in central Arkansas. The earthquakes migrated along the Guy-Greenbrier Fault while nearby injection wells were operating in 2010-2011. Huang and Beroza [GRL, 2015] improved the magnitude of completeness to about -1 using template matching and found that the earthquakes deviated from Gutenberg-Richter statistics during the operation of nearby injection wells. We identify 49 clusters of highly similar events in the Huang and Beroza [2015] catalog and calculate stress drops using the source model described in Imanishi and Ellsworth [2006]. Our results suggest that stress drops of the Guy-Greenbrier sequence are similar to tectonic earthquakes at Parkfield, California (the attached figure). We will also present stress drop analysis of other suspected induced earthquake sequences using the same method.
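
    As context for the spectral-ratio approach mentioned above, the sketch below shows the standard Brune-model step that converts an estimated corner frequency into a stress drop; the shear-wave speed, the constant k = 0.37, and the example corner frequencies are assumed values, and the spectral-ratio fitting itself is omitted.

```python
# Hedged illustration: Brune-model stress drop from a corner frequency.
import numpy as np

def brune_stress_drop(moment_nm, corner_freq_hz, beta_m_s=3500.0, k=0.37):
    """Stress drop = (7/16) * M0 / r^3 with Brune source radius r = k * beta / fc."""
    r = k * beta_m_s / corner_freq_hz        # source radius in meters
    return 7.0 / 16.0 * moment_nm / r ** 3   # stress drop in Pa

M0 = 10.0 ** (1.5 * 3.0 + 9.1)               # seismic moment of an Mw 3.0 event, N*m
for fc in (5.0, 10.0, 20.0):                 # candidate corner frequencies, Hz
    print(f"fc = {fc:4.1f} Hz -> stress drop ~ {brune_stress_drop(M0, fc) / 1e6:.1f} MPa")
```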

  4. Estimation of Maximum Magnitudes of Subduction Earthquakes

    Science.gov (United States)

    Muldashev, Iskander; Sobolev, Stephan

    2017-04-01

    Even though methods of instrumentally observing earthquakes at subduction zones have rapidly improved in recent decades, the characteristic recurrence interval of giant subduction earthquakes (Mw>8.5) is much larger than the currently available observational record and therefore the necessary conditions for giant earthquakes are not clear. However, the statistical studies have recognized the importance of the slab shape and its surface roughness, state of the strain of the upper plate and thickness of sediments filling the trenches. Here we apply cross-scale seismic cycle modeling technique (Sobolev and Muldashev, under review) to study key factors controlling maximum magnitudes of earthquakes in subduction zones. Our models employ elasticity, non-linear transient viscous rheology and rate-and-state friction. They generate spontaneous earthquake sequences and by using adaptive time-step algorithm, recreate the deformation process as observed naturally during seismic cycle and multiple seismic cycles. We explore effects of slab geometry, megathrust friction coefficients, and convergence rates on the magnitude of earthquakes. We found that the low-angle subduction (largest effect) and low static friction, likely caused by thick sediments in the subduction channel (smaller effect) are the key factors controlling magnitude of great earthquakes, while the change of subduction velocity from 10 to 3.5 cm/yr has much lower effect. Modeling results also suggest that thick sediments in the subduction channel causing low static friction, result in neutral or compressive deformation in the overriding plate for low-angle subduction zones in agreement with observations for the giant earthquakes. The model also predicts the magnitudes of the largest possible earthquakes for subduction zones of given dipping angles. We demonstrate that our predictions are consistent with all known giant subduction earthquakes of 20th and 21st centuries and with estimations for historical
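
    For reference, the sketch below writes out the standard rate-and-state friction law and aging law that seismic cycle models of this kind typically employ; the parameter values are generic illustrations, not those of the cited cross-scale models.

```python
# Hedged illustration of rate-and-state friction; parameters are generic.
import numpy as np

def rate_state_friction(v, theta, mu0=0.6, a=0.010, b=0.015, v0=1e-6, dc=0.01):
    """mu = mu0 + a*ln(V/V0) + b*ln(V0*theta/Dc), for slip rate v [m/s] and state theta [s]."""
    return mu0 + a * np.log(v / v0) + b * np.log(v0 * theta / dc)

def aging_law(v, theta, dc=0.01):
    """Dieterich aging law: d(theta)/dt = 1 - V*theta/Dc."""
    return 1.0 - v * theta / dc

# At steady state (d theta/dt = 0) the state variable is theta = Dc / V, so friction
# reduces to mu0 + (a - b)*ln(V/V0): velocity weakening here because a < b.
for v in (1e-9, 1e-6, 1e-3):
    theta_ss = 0.01 / v
    print(f"V = {v:.0e} m/s -> steady-state friction = {rate_state_friction(v, theta_ss):.4f}")
```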

  5. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part B, historical earthquakes

    Science.gov (United States)

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax: the moment magnitude of the largest earthquake that is thought to be possible within a specified geographic region. The region specified in this report is the Central and Eastern United States and adjacent Canada. Parts A and B of this report describe the construction of a global catalog of moderate to large earthquakes that occurred worldwide in tectonic analogs of the Central and Eastern United States. Examination of histograms of the magnitudes of these earthquakes allows estimation of Central and Eastern United States Mmax. The catalog and Mmax estimates derived from it are used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. Part A deals with prehistoric earthquakes, and this part deals with historical events.

  6. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    Science.gov (United States)

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327
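
    A minimal sketch of the Coulomb failure stress change referred to in this record; the shear and normal stress changes are placeholder numbers, not values from the Costa Rica study.

```python
# Hedged illustration: Coulomb failure stress change on a receiver fault.
def coulomb_stress_change(d_shear_pa, d_normal_pa, mu_eff=0.4):
    """dCFS = d_tau + mu' * d_sigma_n, with d_sigma_n > 0 meaning unclamping of the fault."""
    return d_shear_pa + mu_eff * d_normal_pa

# Placeholder stress changes of a few kPa resolved onto the future rupture plane.
d_cfs = coulomb_stress_change(d_shear_pa=2.0e3, d_normal_pa=1.0e3)
print(f"dCFS ~ {d_cfs / 1e6:.4f} MPa (often-quoted static triggering thresholds are ~0.01 MPa)")
```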

  7. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
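
    A small sketch of the kind of Poisson joint log-likelihood score commonly used to compare RELM-style gridded forecasts; the forecast and observation arrays are toy values, not any participant's submission, and the small expected counts are treated interchangeably with probabilities.

```python
# Hedged illustration: score two toy gridded forecasts against observed counts.
import numpy as np
from scipy.special import gammaln

def poisson_log_likelihood(expected, observed):
    """Joint log-likelihood of observed cell counts under independent Poisson rates."""
    expected = np.asarray(expected, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return float(np.sum(-expected + observed * np.log(expected) - gammaln(observed + 1.0)))

# Toy 4-cell forecasts (expected numbers of M >= 4.95 events) and observed counts.
forecast_a = np.array([0.02, 0.50, 0.10, 0.01])
forecast_b = np.array([0.10, 0.20, 0.20, 0.10])
observed = np.array([0, 1, 0, 0])

for name, forecast in (("A", forecast_a), ("B", forecast_b)):
    print(f"forecast {name}: joint log-likelihood = {poisson_log_likelihood(forecast, observed):.3f}")
```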

  8. GEM - The Global Earthquake Model

    Science.gov (United States)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments on the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public-at-large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  9. Earthquakes, detecting and understanding them

    International Nuclear Information System (INIS)

    2008-05-01

    The surface of the Earth is continually changing on a geological timescale. The tectonic plates, which make up this surface, are moving in relation to each other. On a human timescale, these movements take the form of earthquakes, which suddenly release energy accumulated over a period of time. The vibrations they produce propagate through the interior of the Earth: these are seismic waves. However, other phenomena can generate seismic waves, such as volcanoes, quarry blasts, etc. The surf of the ocean waves on the coasts, the wind in the trees and human activity (industry and road traffic) all contribute to the 'seismic background noise'. Sensors are able to detect signals from events, which are then discriminated, analyzed and located. Earthquakes and active volcanoes are not distributed randomly over the surface of the globe: they mainly coincide with mountain chains and ocean trenches and ridges. 'An earthquake results from the abrupt release of the energy accumulated by movements and rubbing of different plates.' The study of the propagation of seismic waves has made it possible to determine the outline of the plates inside the Earth and has highlighted their movements. There are seven major plates which are colliding, diverging or sliding past each other. Each year the continents move several centimeters with respect to one another. This process, known as 'continental drift', was finally explained by plate tectonics. The initial hypothesis for this science dates from the beginning of the 20th century, but it was not confirmed until the 1960s. It explains that convection inside the Earth is the source of the forces required for these movements. This science, as well as explaining these great movements, has provided a coherent, unifying and quantitative framework, which unites the explanations for all geophysical phenomena under one mechanism. (authors)

  10. Forecasting characteristic earthquakes in a minimalist model

    DEFF Research Database (Denmark)

    Vázquez-Prada, M.; Pacheco, A.; González, Á.

    2003-01-01

    Using error diagrams, we quantify the forecasting of characteristic-earthquake occurrence in a recently introduced minimalist model. Initially we connect the earthquake alarm at a fixed time after the occurrence of a characteristic event. The evaluation of this strategy leads to a one...
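
    Error diagrams of the kind mentioned above plot the fraction of target events missed against the fraction of time occupied by alarms. The sketch below computes one such point for a generic alarm strategy; it is not the minimalist model itself, and the event times and alarm windows are invented for illustration.

    ```python
    def error_diagram_point(event_times, alarm_windows, total_time):
        """Return (fraction of events missed, fraction of time under alarm)
        for a set of alarm windows given as (start, end) tuples."""
        missed = sum(1 for t in event_times
                     if not any(start <= t < end for start, end in alarm_windows))
        alarm_time = sum(end - start for start, end in alarm_windows)
        return missed / len(event_times), alarm_time / total_time

    events = [120.0, 340.0, 610.0]             # characteristic-event times (arbitrary units)
    alarms = [(100.0, 150.0), (560.0, 640.0)]  # alarm windows (arbitrary)
    print(error_diagram_point(events, alarms, total_time=700.0))  # one event missed, ~19% time under alarm
    ```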

  11. Refresher Course on Physics of Earthquakes -98 ...

    Indian Academy of Sciences (India)

    The objective of this course is to help teachers gain an understanding of the earthquake phenomenon and the physical processes involved in its genesis, as well as of the earthquake waves which propagate the energy released by the earthquake rupture outward from the source. The Course will begin with mathematical ...

  12. Wood-framed houses for earthquake zones

    DEFF Research Database (Denmark)

    Hansen, Klavs Feilberg

    Wood-framed houses with sheathing are suitable for use in earthquake zones. The Direction describes a method of determining the earthquake forces on a house and shows how these forces can be resisted by diaphragm action in the walls, floors, and roof of the house. An appendix explains how...

  13. Napa Earthquake impact on water systems

    Science.gov (United States)

    Wang, J.

    2014-12-01

    The South Napa earthquake occurred in Napa, California, on August 24 at 3 a.m. local time, with a magnitude of 6.0. It was the largest earthquake in the San Francisco Bay Area since the 1989 Loma Prieta earthquake. Economic losses topped $1 billion. Winemakers cleaned up and estimated the damage to tourism; around 15,000 cases of cabernet poured into the garden at the Hess Collection. Earthquakes can raise water pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so understanding how to prevent groundwater and surface-water pollution from earthquakes is valuable. This research gives a clear view of the drinking water system in California and of pollution of river systems, as well as an estimate of earthquake impacts on the water supply. The Sacramento-San Joaquin River Delta (close to Napa) is the center of the state's water distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and saltwater intrusion has reduced freshwater outflows. Strong shaking from a nearby earthquake can liquefy saturated, loose, sandy soils and could potentially damage major Delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: a similar event could potentially damage the freshwater supply system.

  14. Instruction system upon occurrence of earthquakes

    International Nuclear Information System (INIS)

    Inagaki, Masakatsu; Morikawa, Matsuo; Suzuki, Satoshi; Fukushi, Naomi.

    1987-01-01

    Purpose: To enable rapid restarting of a nuclear reactor after an earthquake by informing operators of the properties of the earthquake encountered and properly displaying the state of damage in comparison with the design standard values of the facilities. Constitution: Even when the maximum accelerations of an encountered earthquake exceed the design standard values, equipment may still remain intact, depending on the wave components of the seismic motion and the vibration properties inherent to the equipment. Taking note of this, the instruction device comprises a system that indicates the relationship between the seismic waveforms of the encountered earthquake and the scram setting values, a system that compares the floor response spectrum of the encountered earthquake's waveforms with the design floor response spectrum used in the design of the equipment, and a system that indicates which equipment requires inspection after the earthquake. Accordingly, it is possible to improve operability upon scram of a nuclear power plant that has undergone an earthquake, and to improve power saving and safety by clearly defining what must be inspected after the earthquake. (Kawakami, Y.)
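
    The comparison logic described here (floor response spectrum of the recorded motion versus the design floor response spectrum) can be sketched as a simple exceedance check. The data structures, frequencies, and spectral values below are hypothetical, chosen only to show the idea of flagging equipment for post-earthquake inspection.

    ```python
    def exceeded_frequencies(recorded_spectrum, design_spectrum):
        """Return the frequencies (Hz) at which the recorded floor response
        spectrum exceeds the design floor response spectrum.
        Both inputs map frequency -> spectral acceleration (g)."""
        return [f for f, sa in recorded_spectrum.items()
                if sa > design_spectrum.get(f, float("inf"))]

    recorded = {2.0: 0.35, 5.0: 0.62, 10.0: 0.41}   # from the encountered earthquake (assumed values)
    design   = {2.0: 0.50, 5.0: 0.55, 10.0: 0.60}   # design floor response spectrum (assumed values)
    print(exceeded_frequencies(recorded, design))    # [5.0] -> equipment tuned near 5 Hz flagged for inspection
    ```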

  15. A minimalist model of characteristic earthquakes

    DEFF Research Database (Denmark)

    Vázquez-Prada, M.; González, Á.; Gómez, J.B.

    2002-01-01

    -earthquake behaviour of some seismic faults. This model, which has no parameters, is amenable to an algebraic description as a Markov chain. This possibility illuminates some important results, obtained by Monte Carlo simulations, such as the earthquake size-frequency relation and the recurrence time...

  16. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impact in terms of risk and damage. Countries such as China, Japan, and Indonesia lie on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been used, for example analyzing seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In applying them, it is necessary to first review the effectiveness of each technique. Considering time efficiency and data accuracy, remote sensing is used as a reference to assess earthquake hazard accurately and quickly, as only limited time is available for the right decision-making shortly after a disaster. Exposed areas and areas potentially vulnerable to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and excellence in the use of remote sensing as one of the methods for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular, and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  17. Rapid Inventory of Earthquake Damage (RIED)

    NARCIS (Netherlands)

    Duque, Adriana; Hack, Robert; Montoya, L.; Scarpas, Tom; Slob, Siefko; Soeters, Rob; van Westen, Cees

    2001-01-01

    The 25 January 1999 Quindío earthquake in Colombia was a major disaster for the coffee-growing region in Colombia. Most of the damage occurred in the city of Armenia and surrounding villages. Damage due to earthquakes is strongly related to topographic and subsurface geotechnical conditions

  18. Simultaneous estimation of earthquake source parameters and ...

    Indian Academy of Sciences (India)

    stress drop values are quite large compared to other similar-size Indian intraplate earthquakes, which can be attributed ... Keywords: earthquake source parameters; crustal Q-value; simultaneous inversion; S-wave spectra; aftershocks.

  19. The Earthquake Preparedness Task Force Report. Revised.

    Science.gov (United States)

    Roybal-Allard, Lucille

    A report on Earthquake Preparedness presents California school districts with direction for complying with existing earthquake preparedness planning laws. It first contains two sets of recommendations. The first set requires state action and is presented to the Legislature for consideration. The second set consists of policy statements and…

  20. Earthquake effect on the geological environment

    International Nuclear Information System (INIS)

    Kawamura, Makoto

    1999-01-01

    Acceleration caused by earthquakes, changes in water pressure, and rock-mass strain were monitored for a series of 344 earthquakes from 1990 to 1998 at the Kamaishi In Situ Test Site. The largest acceleration, 57.14 gal, was registered for the 'North coast of Iwate Earthquake' (M4.4), which occurred in June 1996. Changes in water pressure were recorded for 27 earthquakes; the largest change was -0.35 kgf/cm². The water-pressure change caused by earthquakes was, however, usually smaller than that caused by rainfall in this area. No change in the electric conductivity or pH of groundwater was detected before or after earthquakes throughout the entire monitoring period. The rock-mass strain was measured with an extensometer whose detection limit was of the order of 10⁻⁸ to 10⁻⁹, and a remaining strain of about 2.5×10⁻⁹ was detected following the 'Offshore Miyagi Earthquake' (M5.1) of October 1997. (H. Baba)

  1. Earthquake engineering research program in Chile

    Science.gov (United States)

    Saragoni, G. R.

    1982-01-01

    Earthquake engineering research in Chile has been carried out for more than 30 years. Systematic research is done at the University of Chile in Santiago. Other universities, such as the Catholic University, the University of Concepción, and the Federico Santa Maria Technical University, have begun to teach and conduct research in earthquake engineering in recent years.

  2. Elevated Tank Due to Earthquake Event

    Directory of Open Access Journals (Sweden)

    Kotrasová Kamila

    2017-12-01

    Full Text Available Elevated reservoirs are mainly used for storing a variety of water. During earthquake activity, the fluid exerts impulsive and convective (sloshing) effects on the walls and bottom of the tank. This paper provides the theoretical background for the analytical calculation of an elevated water tank under an earthquake event and deals with simplified seismic design procedures for elevated tanks.

  3. Tutorial on earthquake rotational effects: historical examples

    Czech Academy of Sciences Publication Activity Database

    Kozák, Jan

    2009-01-01

    Vol. 99, 2B (2009), pp. 998-1010. ISSN 0037-1106. Institutional research plan: CEZ:AV0Z30120515. Keywords: rotational seismic models * earthquake rotational effects * historical earthquakes. Subject RIV: DC - Seismology, Volcanology, Earth Structure. Impact factor: 1.860, year: 2009

  4. Simultaneous estimation of earthquake source parameters and ...

    Indian Academy of Sciences (India)

    This paper presents the simultaneous estimation of source parameters and crustal Q values for small to moderate-size aftershocks (Mw 2.1–5.1) of the Mw 7.7 2001 Bhuj earthquake. The horizontal-component S-waves of 144 well-located earthquakes (2001–2010) recorded at 3–10 broadband seismograph sites in the ...

  5. Post-earthquake inspection of utility buildings

    Energy Technology Data Exchange (ETDEWEB)

    Matsuda, E.; Cluff, L.; Savage, W. [Pacific Gas and Electric Co., San Francisco, CA (United States). Geosciences Dept.; Poland, C. [Degenkolb Engineers, San Francisco, CA (United States)

    1995-12-31

    The evacuation of safe buildings and the inability to reoccupy them immediately after earthquakes can have significant impacts on lifeline utilities, including delays in the restoration of essential services. For many of Pacific Gas and Electric Company's 3400 buildings, the potential for unnecessary evacuations and delays in reentry was judged unacceptable. A Post-Earthquake Investigation Program, developed jointly by PG&E and Degenkolb Engineers, facilitates the post-earthquake use of essential buildings. The details of the program were developed taking into consideration the effects of high-likelihood scenario earthquakes on PG&E facilities, and the potential disruption of transportation and communication systems. Qualified engineers were pre-assigned to inspect key facilities following prescribed earthquakes. The inspections will be facilitated by pre-earthquake evaluations and post-earthquake manuals. Building department personnel support the program, because it promotes the timely and accurate assessment of essential buildings within their jurisdiction. The program was developed for a gas and electric utility; however, it is applicable to other organizations in earthquake regions.

  6. Simultaneous estimation of earthquake source parameters and ...

    Indian Academy of Sciences (India)

    This paper presents the simultaneous estimation of source parameters and crustal Q values for small to moderate-size aftershocks (Mw 2.1–5.1) of the Mw 7.7 2001 Bhuj earthquake. The horizontal-component S-waves of 144 well-located earthquakes (2001–2010) recorded at 3–10 broadband seismograph sites in.

  7. Bayesian exploration of recent Chilean earthquakes

    Science.gov (United States)

    Duputel, Zacharie; Jiang, Junle; Jolivet, Romain; Simons, Mark; Rivera, Luis; Ampuero, Jean-Paul; Liang, Cunren; Agram, Piyush; Owen, Susan; Ortega, Francisco; Minson, Sarah

    2016-04-01

    The South-American subduction zone is an exceptional natural laboratory for investigating the behavior of large faults over the earthquake cycle. It is also a playground to develop novel modeling techniques combining different datasets. Coastal Chile was impacted by two major earthquakes in the last two years: the 2015 M 8.3 Illapel earthquake in central Chile and the 2014 M 8.1 Iquique earthquake that ruptured the central portion of the 1877 seismic gap in northern Chile. To gain better understanding of the distribution of co-seismic slip for those two earthquakes, we derive joint kinematic finite fault models using a combination of static GPS offsets, radar interferograms, tsunami measurements, high-rate GPS waveforms and strong motion data. Our modeling approach follows a Bayesian formulation devoid of a priori smoothing thereby allowing us to maximize spatial resolution of the inferred family of models. The adopted approach also attempts to account for major sources of uncertainty in the Green's functions. The results reveal different rupture behaviors for the 2014 Iquique and 2015 Illapel earthquakes. The 2014 Iquique earthquake involved a sharp slip zone and did not rupture to the trench. The 2015 Illapel earthquake nucleated close to the coast and propagated toward the trench with significant slip apparently reaching the trench or at least very close to the trench. At the inherent resolution of our models, we also present the relationship of co-seismic models to the spatial distribution of foreshocks, aftershocks and fault coupling models.

  8. Designing an Earthquake-Resistant Building

    Science.gov (United States)

    English, Lyn D.; King, Donna T.

    2016-01-01

    How do cross-bracing, geometry, and base isolation help buildings withstand earthquakes? These important structural design features involve fundamental geometry that elementary school students can readily model and understand. The problem activity, Designing an Earthquake-Resistant Building, was undertaken by several classes of sixth- grade…

  9. The 2010 Haiti earthquake response.

    Science.gov (United States)

    Raviola, Giuseppe; Severe, Jennifer; Therosme, Tatiana; Oswald, Cate; Belkin, Gary; Eustache, Eddy

    2013-09-01

    This article presents an overview of the mental health response to the 2010 Haiti earthquake. Discussion includes consideration of complexities that relate to emergency response, mental health and psychosocial response in disasters, long-term planning of systems of care, and the development of safe, effective, and culturally sound mental health services in the Haitian context. This information will be of value to mental health professionals and policy specialists interested in mental health in Haiti, and in the delivery of mental health services in particularly resource-limited contexts in the setting of disasters. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Development of an earthquake catalog management program

    International Nuclear Information System (INIS)

    Eum, H. S.; Choi, I. K.

    1999-01-01

    The Earthquake Catalog Management Program was developed for earthquake engineering and research. The program is composed of a catalog database and an application program. The catalog database currently has more than 720 catalog records from earthquake data recorded between December 1994 and May 1998 in Korea. Seventeen parameters derived from the earthquake data constitute each record. These parameters include information on the triggering events, the recording station, and station-specific recorded values. The catalog database also has information on 12 recording stations. The application program is a tool for accessing and managing the catalog database and recorded earthquake data files. The program provides various functions such as search, sort, and display of catalog subsets; file retrieval from hard disk or CD-ROM; file type conversion; and multiple output options including computer screen, printer, and disk files
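
    A minimal data structure conveying the catalog idea described above might look like the following. The field names, station codes, and filter API are hypothetical stand-ins; the actual program stores 17 parameters per record and supports more functions than shown here.

    ```python
    from dataclasses import dataclass

    @dataclass
    class CatalogRecord:
        # A few illustrative fields; the real records carry 17 parameters.
        event_id: str
        origin_time: str          # ISO 8601 timestamp
        magnitude: float
        station: str              # recording station code (hypothetical)
        peak_acceleration: float  # gal
        data_file: str            # path to the recorded waveform file

    def search(records, min_magnitude=None, station=None):
        """Filter the catalog by magnitude and/or station, sorted by origin time."""
        out = records
        if min_magnitude is not None:
            out = [r for r in out if r.magnitude >= min_magnitude]
        if station is not None:
            out = [r for r in out if r.station == station]
        return sorted(out, key=lambda r: r.origin_time)

    catalog = [
        CatalogRecord("EV-0001", "1996-06-10T04:12:00", 4.4, "ST01", 57.1, "st01_19960610.dat"),
        CatalogRecord("EV-0002", "1997-10-03T11:05:00", 5.1, "ST02", 23.8, "st02_19971003.dat"),
    ]
    print([r.event_id for r in search(catalog, min_magnitude=5.0)])   # ['EV-0002']
    ```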

  11. Parallelization of the Coupled Earthquake Model

    Science.gov (United States)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.

    2007-01-01

    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, predicting tsunamis on the Internet had never been done before. This new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  12. Antioptimization of earthquake excitation and response

    Directory of Open Access Journals (Sweden)

    G. Zuccaro

    1998-01-01

    Full Text Available The paper presents a novel approach to predict the response of earthquake-excited structures. The earthquake excitation is expanded in terms of a series of deterministic functions. The coefficients of the series are represented as a point in N-dimensional space. Each available accelerogram at a certain site is then represented as a point in the above space, modeling the available fragmentary historical data. The minimum-volume ellipsoid containing all points is constructed. The ellipsoidal models of uncertainty, pertinent to earthquake excitation, are developed. The maximum response of a structure subjected to the earthquake excitation, within ellipsoidal modeling of the latter, is determined. This procedure of determining the least favorable response was termed in the literature (Elishakoff, 1991) an antioptimization. It appears that under the inherent uncertainty of earthquake excitation, antioptimization analysis is a viable alternative to the stochastic approach.
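
    For a linear response quantity, the antioptimization step over an ellipsoidal uncertainty set has a closed form, which the sketch below exploits. The sensitivity vector, ellipsoid centre, and shape matrix are invented three-coefficient examples, not values from the paper.

    ```python
    import numpy as np

    def worst_case_response(c, x0, W):
        """Least favourable (maximum) value of a linear response r = c . x when
        the uncertain coefficients x lie in the ellipsoid
        (x - x0)^T W (x - x0) <= 1:  r_max = c.x0 + sqrt(c^T W^-1 c)."""
        return float(c @ x0 + np.sqrt(c @ np.linalg.inv(W) @ c))

    c  = np.array([0.8, -0.3, 0.5])   # response sensitivity to each series coefficient (assumed)
    x0 = np.array([0.1,  0.0, 0.2])   # ellipsoid centre fitted to recorded accelerograms (assumed)
    W  = np.diag([4.0, 9.0, 2.5])     # ellipsoid shape matrix (assumed)
    print(worst_case_response(c, x0, W))   # ~0.70
    ```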

  13. Earthquake Prediction Techniques: Their Application in Japan

    Science.gov (United States)

    Kisslinger, Carl

    Japan is serious about solving the earthquake prediction problem. A well-organized and well-funded program of research has been under way for almost 20 years in pursuit of the national goal of protecting the dense population of this earthquake-prone country through reliable predictions. This rather amazing book, edited by Toshi Asada, retired director of the Geophysical Institute of the University of Tokyo, has been written by 10 scientists, each of whom has made important contributions to earthquake science, but who have not been known in the past as principal spokesmen for the Japanese earthquake prediction program. The result is a combination of a very readable tutorial presentation of basic earthquake science that will make the book understandable to the nonspecialist, a good summary of Japanese data and research conclusions, and a bare-knuckles appraisal of current philosophy and strategy for prediction in Japan.

  14. Low cost earthquake resistant ferrocement small house

    International Nuclear Information System (INIS)

    Saleem, M.A.; Ashraf, M.; Ashraf, M.

    2008-01-01

    The greatest humanitarian challenge faced even today, one year after the Kashmir (Hazara) earthquake, is that of providing shelter. Currently, one in seven people on the globe lives in a slum or refugee camp. The earthquake of October 2005 resulted in a great loss of life and property. This research work is mainly focused on developing a design for a small, low-cost, earthquake-resistant house. Ferrocement panels are recommended as the main structural elements, with a lightweight truss roofing system. Earthquake resistance is ensured by analyzing the structure in ETABS for seismic activity in zone 4. The behavior of the structure is found to be satisfactory under earthquake loading. An estimate of cost is also presented, which shows that it is an economical solution. (author)

  15. Ionospheric Anomaly before Kyushu, Japan Earthquake

    Directory of Open Access Journals (Sweden)

    YANG Li

    2017-05-01

    Full Text Available GIM data released by IGS are used in this article, and a new method combining the sliding time window method with ionospheric TEC correlation analysis of adjacent grid points is proposed to study the relationship between pre-earthquake ionospheric anomalies and the earthquake. By analyzing the abnormal change of TEC at the 5 grid points around the seismic region, abnormal changes of ionospheric TEC are found before the earthquake, and the correlation between the TEC sequences of the grid points is significantly affected by the earthquake. Based on the analysis of the spatial distribution of the TEC anomaly, anomalies of 6 h, 12 h and 6 h were found near the epicenter three days before the earthquake. Finally, ionospheric tomography is used to perform tomographic inversion of the electron density, and the distribution of the electron density in the ionospheric anomaly is further analyzed.
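
    A sliding-time-window check for TEC anomalies can be sketched as a running mean-and-sigma test. The 27-sample window and 2-sigma threshold below are common choices in ionospheric precursor studies, but they are assumptions for illustration and not necessarily the parameters used in the paper above.

    ```python
    import statistics

    def sliding_window_anomalies(tec_series, window=27, k=2.0):
        """Return indices whose TEC value lies outside mean +/- k*sigma of the
        preceding `window` samples (a generic sliding-window anomaly test)."""
        anomalies = []
        for i in range(window, len(tec_series)):
            past = tec_series[i - window:i]
            mu, sigma = statistics.mean(past), statistics.pstdev(past)
            if sigma > 0 and abs(tec_series[i] - mu) > k * sigma:
                anomalies.append(i)
        return anomalies

    tec = [20.0 + 0.5 * (i % 5) for i in range(60)]   # synthetic quiet-time TEC (TECU)
    tec[45] = 28.0                                    # injected positive disturbance
    print(sliding_window_anomalies(tec))              # [45]
    ```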

  16. Report on the 2010 Chilean earthquake and tsunami response

    Science.gov (United States)

    ,

    2011-01-01

    disaster response strategies and operations of Chilean agencies, including perceived or actual failures in disaster preparation that impacted the medical disaster response; post-disaster health and medical interventions to save lives and limit suffering; and the lessons learned by public health and medical personnel as a result of their experiences. Despite devastating damage to the health care and civic infrastructure, the health care response to the Chilean earthquake appeared highly successful due to several factors. Like other first responders, the medical community had the ability and resourcefulness to respond without centralized control in the early response phase. The health care community maintained patient care under austere conditions, despite many obstacles that could have prevented such care. National and international resources were rapidly mobilized to support the medical response. The Emergency Services Team sought to collect information on all phases of emergency management (preparedness, mitigation, response, and recovery) and determine what worked well and what could be improved upon. The Chileans reported being surprised that they were not as ready for this event as they thought they were. The use of mass care sheltering was limited, given the scope of the disaster, because of the resiliency of the population. The impacts of the earthquake and the tsunami were quite different, as were the needs of urban and rural dwellers, necessitating different response activities. The Volunteer Services Team examined the challenges faced in mobilizing a large number of volunteers to assist in the aftermath of a disaster of this scale. One of the greatest challenges expressed was difficulty in communication; the need for redundancy in communication mechanisms was cited. The flexibility and ability to work autonomously by the frontline volunteers was a significant factor in effective response. It was also important for volunteer leadership to know the emergency plans

  17. Rapid Assessment of Earthquakes with Radar and Optical Geodetic Imaging and Finite Fault Models (Invited)

    Science.gov (United States)

    Fielding, E. J.; Sladen, A.; Simons, M.; Rosen, P. A.; Yun, S.; Li, Z.; Avouac, J.; Leprince, S.

    2010-12-01

    Earthquake responders need to know where the earthquake has caused damage and what is the likely intensity of damage. The earliest information comes from global and regional seismic networks, which provide the magnitude and locations of the main earthquake hypocenter and moment tensor centroid and also the locations of aftershocks. Location accuracy depends on the availability of seismic data close to the earthquake source. Finite fault models of the earthquake slip can be derived from analysis of seismic waveforms alone, but the results can have large errors in the location of the fault ruptures and spatial distribution of slip, which are critical for estimating the distribution of shaking and damage. Geodetic measurements of ground displacements with GPS, LiDAR, or radar and optical imagery provide key spatial constraints on the location of the fault ruptures and distribution of slip. Here we describe the analysis of interferometric synthetic aperture radar (InSAR) and sub-pixel correlation (or pixel offset tracking) of radar and optical imagery to measure ground coseismic displacements for recent large earthquakes, and lessons learned for rapid assessment of future events. These geodetic imaging techniques have been applied to the 2010 Leogane, Haiti; 2010 Maule, Chile; 2010 Baja California, Mexico; 2008 Wenchuan, China; 2007 Tocopilla, Chile; 2007 Pisco, Peru; 2005 Kashmir; and 2003 Bam, Iran earthquakes, using data from ESA Envisat ASAR, JAXA ALOS PALSAR, NASA Terra ASTER and CNES SPOT5 satellite instruments and the NASA/JPL UAVSAR airborne system. For these events, the geodetic data provided unique information on the location of the fault or faults that ruptured and the distribution of slip that was not available from the seismic data and allowed the creation of accurate finite fault source models. In many of these cases, the fault ruptures were on previously unknown faults or faults not believed to be at high risk of earthquakes, so the area and degree of

  18. Design Games to Learn

    DEFF Research Database (Denmark)

    Marchetti, Emanuela; Valente, Andrea

    2014-01-01

    In this paper we argue that there is a need for digital games that could be easy to alter by young learners. Unfortunately it was found that digital games do not enable children to express their creativity at full, in contrast with low-fidelity prototypes and non-digital toys (such as card or table...... top games). Therefore, we propose here a middle ground between digital and traditional table top games, so to grant children more freedom to express themselves, articulate their understanding and difficulties individually or socially; this approach is an alternative to the current trend of associating...... programming with digital creativity. In our preliminary study we transposed a digital game into a card game and observed students while shifting between playing and design thinking. Results from this study suggest that the notion of altering a digital game through a card-based transposition of the same game...

  19. Learning via Game Design

    DEFF Research Database (Denmark)

    Marchetti, Emanuela; Valente, Andrea

    2015-01-01

    In this paper we consider the problem of making design of digital games accessible to primary school children and their teachers, and we argue for the need of digital games that are easy to alter by young learners. We know from previous research projects that digital games do not enable children...... to express their creativity at full, in contrast with low-fidelity prototypes and non-digital toys (such as card or table top games). Therefore, we propose here a novel approach that serves as a middle ground between digital and traditional table top games, and grants children more freedom to express...... themselves, articulate their understanding and difficulties both individually and socially. This approach, called card-based model for digital game design, is an alternative to the current trend of associating programming with digital creativity. A preliminary study was conducted by transposing a digital...

  20. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    Science.gov (United States)

    Kossobokov, Vladimir

    2013-04-01

    demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete learning of the actual range of earthquake hazards to local communities and populations, and (c) a more ethically responsible control over how seismic hazard and seismic risk are implemented to protect public safety. It follows that the international project GEM is on the wrong track if it continues to base seismic risk estimates on the standard method of assessing seismic hazard. The situation is not hopeless and could be improved dramatically due to available geological, geomorphologic, seismic, and tectonic evidence and data combined with deterministic pattern recognition methodologies, specifically when intending to PREDICT THE PREDICTABLE, but not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized systems of blocks-and-faults has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term middle- to narrow-range earthquake prediction algorithms tested in real-time applications over the last decades. It proves that contemporary science can do a better job in disclosing natural hazards, assessing risks, and delivering such information in advance of extreme catastrophes, which are LOW-PROBABILITY EVENTS THAT HAPPEN WITH CERTAINTY. Geoscientists must initiate a shift in the minds of the community from pessimistic disbelief to optimistically challenging the issues of neo-deterministic hazard predictability.

  1. Earthquake activity along the Himalayan orogenic belt

    Science.gov (United States)

    Bai, L.; Mori, J. J.

    2017-12-01

    The collision between the Indian and Eurasian plates formed the Himalayas, the largest orogenic belt on the Earth. The entire region accommodates shallow earthquakes, while intermediate-depth earthquakes are concentrated at the eastern and western Himalayan syntaxes. Here we investigate the focal depths, fault plane solutions, and source rupture process for three earthquake sequences, which are located at the western, central and eastern regions of the Himalayan orogenic belt. The Pamir-Hindu Kush region is located at the western Himalayan syntaxis and is characterized by extreme shortening of the upper crust and strong interaction of various layers of the lithosphere. Many shallow earthquakes occur on the Main Pamir Thrust at focal depths shallower than 20 km, while intermediate-depth earthquakes are mostly located below 75 km. Large intermediate-depth earthquakes occur frequently at the western Himalayan syntaxis about every 10 years on average. The 2015 Nepal earthquake is located in the central Himalayas. It is a typical megathrust earthquake that occurred on the shallow portion of the Main Himalayan Thrust (MHT). Many of the aftershocks are located above the MHT and illuminate faulting structures in the hanging wall with dip angles that are steeper than the MHT. These observations provide new constraints on the collision and uplift processes for the Himalayan orogenic belt. The Indo-Burma region is located south of the eastern Himalayan syntaxis, where the strike of the plate boundary suddenly changes from nearly east-west at the Himalayas to nearly north-south at the Burma Arc. The Burma arc subduction zone is a typical oblique plate convergence zone. The eastern boundary is the north-south striking dextral Sagaing fault, which hosts many shallow earthquakes with focal depth less than 25 km. In contrast, intermediate-depth earthquakes along the subduction zone reflect east-west trending reverse faulting.

  2. Literature review of health impact post-earthquakes in China 1906-2007.

    Science.gov (United States)

    Chan, E Y Y; Gao, Y; Griffiths, S M

    2010-03-01

    Over the last 100 years, China has experienced the world's three most fatal earthquakes. The Sichuan Earthquake in May 2008 once again reminded us of the huge human toll a geological disaster can lead to. In order to learn lessons about the impact of earthquakes on health in China during the past century, we conducted a bilingual literature search of the publicly available health-related disaster databases published between 1906 and 2007. Our search found that research was limited and there were major gaps in the published literature about the impact on health in the post-earthquake period. However, the experiences recorded were similar to those of other parts of the world. The available studies provide useful information about preparedness and rapid early response. Gaps identified included care of chronic disease. Our literature review highlights the paucity of literature on the impact on health post-earthquake in China between 1906 and 2007. Disaster mitigation policies need to reflect not only the disaster-related impacts on health but also the ongoing health needs of the chronically ill, and to establish safeguards for the well-being of vulnerable populations.

  3. Statistics of Earthquake Influence on Buildings by means of Seismic Acceleration

    Science.gov (United States)

    Pentaris, Fragkiskos P.; Makris, John P.

    2014-05-01

    This work aims to investigate the statistics of earthquake influence on buildings by studying the correlation of earthquake parameters (magnitude, epicentral distance, azimuth, depth) with the observed seismic acceleration on different floors of a building, as well as in buildings of different age. Crete lies on the Hellenic Arc, a region of very high seismicity. The study exploits the significant and varied seismicity of the southern Hellenic Arc (Greece). Structural health monitoring systems (SHMs), composed of high-sensitivity accelerometers, are installed in two neighboring buildings of different age, each consisting of two floors, at the Technological Educational Institute of Crete (TEI), located in a suburb of the city of Chania (western Crete). Both SHMs have been operating continuously for more than a year and have recorded a large amount of seismic acceleration data from low-, medium- and high-magnitude earthquakes with various epicentral distances and azimuths. A detailed statistical analysis is being performed in order to correlate the seismic responses of the two buildings, which are characterized by different vulnerability, with key parameters of the associated earthquakes. Furthermore, we examine the earthquake influence on the two buildings before and after a major nearby seismic event to investigate a possible change in the buildings' vulnerability. Acknowledgements: This research has been co-financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARCHIMEDES III. Investing in knowledge society through the European Social Fund.

  4. Review Article: A comparison of flood and earthquake vulnerability assessment indicators

    Directory of Open Access Journals (Sweden)

    M. C. de Ruiter

    2017-07-01

    Full Text Available In a cross-disciplinary study, we carried out an extensive literature review to increase understanding of vulnerability indicators used in the disciplines of earthquake- and flood vulnerability assessments. We provide insights into potential improvements in both fields by identifying and comparing quantitative vulnerability indicators grouped into physical and social categories. Next, a selection of index- and curve-based vulnerability models that use these indicators are described, comparing several characteristics such as temporal and spatial aspects. Earthquake vulnerability methods traditionally have a strong focus on object-based physical attributes used in vulnerability curve-based models, while flood vulnerability studies focus more on indicators applied to aggregated land-use classes in curve-based models. In assessing the differences and similarities between indicators used in earthquake and flood vulnerability models, we only include models that separately assess either of the two hazard types. Flood vulnerability studies could be improved using approaches from earthquake studies, such as developing object-based physical vulnerability curve assessments and incorporating time-of-the-day-based building occupation patterns. Likewise, earthquake assessments could learn from flood studies by refining their selection of social vulnerability indicators. Based on the lessons obtained in this study, we recommend future studies for exploring risk assessment methodologies across different hazard types.

  5. Review Article: A comparison of flood and earthquake vulnerability assessment indicators

    Science.gov (United States)

    de Ruiter, Marleen C.; Ward, Philip J.; Daniell, James E.; Aerts, Jeroen C. J. H.

    2017-07-01

    In a cross-disciplinary study, we carried out an extensive literature review to increase understanding of vulnerability indicators used in the disciplines of earthquake- and flood vulnerability assessments. We provide insights into potential improvements in both fields by identifying and comparing quantitative vulnerability indicators grouped into physical and social categories. Next, a selection of index- and curve-based vulnerability models that use these indicators are described, comparing several characteristics such as temporal and spatial aspects. Earthquake vulnerability methods traditionally have a strong focus on object-based physical attributes used in vulnerability curve-based models, while flood vulnerability studies focus more on indicators applied to aggregated land-use classes in curve-based models. In assessing the differences and similarities between indicators used in earthquake and flood vulnerability models, we only include models that separately assess either of the two hazard types. Flood vulnerability studies could be improved using approaches from earthquake studies, such as developing object-based physical vulnerability curve assessments and incorporating time-of-the-day-based building occupation patterns. Likewise, earthquake assessments could learn from flood studies by refining their selection of social vulnerability indicators. Based on the lessons obtained in this study, we recommend future studies for exploring risk assessment methodologies across different hazard types.

  6. Earthquake damage to underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Stephenson, D.E.; Zandt, G.; Bouchon, M.; Hustrulid, W.A.

    1980-01-01

    In order to assess the seismic risk for an underground facility, a data base was established and analyzed to evaluate the potential for seismic disturbance. Substantial damage to underground facilities is usually the result of displacements primarily along pre-existing faults and fractures, or at the surface entrance to these facilities. Evidence of this comes from both earthquakes and large explosions. Therefore, the displacement due to earthquakes as a function of depth is important in the evaluation of the hazard to underground facilities. To evaluate potential displacements due to seismic effects of block motions along pre-existing or induced fractures, the displacement fields surrounding two types of faults were investigated. Analytical models were used to determine relative displacements of shafts and near-surface displacement of large rock masses. Numerical methods were used to determine the displacement fields associated with pure strike-slip and vertical normal faults. Results are presented as displacements for various fault lengths as a function of depth and distance. This provides input to determine potential displacements in terms of depth and distance for underground facilities, important for assessing potential sites and design parameters

  7. Understanding earthquake from the granular physics point of view — Causes of earthquake, earthquake precursors and predictions

    Science.gov (United States)

    Lu, Kunquan; Hou, Meiying; Jiang, Zehui; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    We treat the Earth's crust and mantle as large-scale discrete matter based on the principles of granular physics and existing experimental observations. The main outcomes are: a granular model of the structure and movement of the Earth's crust and mantle is established; the formation mechanism of the tectonic forces that cause earthquakes and a model of propagation for precursory information are proposed; properties of the seismic precursory information and its relevance to earthquake occurrence are illustrated, and ways to detect effective seismic precursors are elaborated; the mechanism of deep-focus earthquakes is also explained by the jamming-unjamming transition of granular flow. Some earthquake phenomena which were previously difficult to understand are explained, and the predictability of earthquakes is discussed. Due to the discrete nature of the Earth's crust and mantle, continuum theory no longer applies during the quasi-static seismological process. In this paper, based on the principles of granular physics, we study the causes of earthquakes, earthquake precursors and predictions, and a new understanding, different from the traditional seismological viewpoint, is obtained.

  8. Probabilistic approach to earthquake prediction.

    Directory of Open Access Journals (Sweden)

    G. D'Addezio

    2002-06-01

    Full Text Available The evaluation of any earthquake forecast hypothesis requires the application of rigorous statistical methods. It implies a univocal definition of the model characterising the concerned anomaly or precursor, so that it can be objectively recognised in any circumstance and by any observer. A valid forecast hypothesis is expected to maximise successes and minimise false alarms. The probability gain associated with a precursor is also a popular way to estimate the quality of the predictions based on such a precursor. Some scientists make use of a statistical approach based on the computation of the likelihood of an observed realisation of seismic events, and on the comparison of the likelihood obtained under different hypotheses. This method can be extended to algorithms that allow the computation of the density distribution of the conditional probability of earthquake occurrence in space, time and magnitude. Whatever method is chosen for building up a new hypothesis, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this test could, however, be problematic for seismicity characterised by long-term recurrence intervals. Even using the historical record, which may span time windows extremely variable between a few centuries and a few millennia, we have a low probability of catching more than one or two events on the same fault. By extending the record of earthquakes of the past back in time up to several millennia, paleoseismology represents a great opportunity to study how earthquakes recur through time and thus provide innovative contributions to time-dependent seismic hazard assessment. Sets of paleoseismologically dated earthquakes have been established for some faults in the Mediterranean area: the Irpinia fault in Southern Italy, the Fucino fault in Central Italy, the El Asnam fault in Algeria and the Skinos fault in Central Greece. By using the age of the
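
    The probability gain mentioned above has a direct definition: the conditional probability of the target earthquake given the precursor divided by its unconditional (background) probability. The numbers in the sketch are invented for illustration.

    ```python
    def probability_gain(p_event_given_precursor, p_event_background):
        """Probability gain G of a precursor: G > 1 means the precursor raises
        the odds of the target earthquake above the background rate."""
        return p_event_given_precursor / p_event_background

    # Hypothetical values: 0.5% background monthly probability,
    # 4% probability within a month of observing the precursor
    print(probability_gain(0.04, 0.005))   # G = 8.0
    ```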

  9. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    Science.gov (United States)

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.

  10. 78 FR 64973 - Scientific Earthquake Studies Advisory Committee (SESAC)

    Science.gov (United States)

    2013-10-30

    Pursuant to Public Law 106-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next meeting; topics include earthquake early warning and national earthquake hazard mapping. (Department of the Interior, Geological Survey [GX14GG009950000])

  11. Smartphone MEMS accelerometers and earthquake early warning

    Science.gov (United States)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    The low-cost MEMS accelerometers in smartphones are attracting more and more attention from the science community due to their vast numbers and potential applications in various areas. We are using the accelerometers inside smartphones to detect earthquakes. We performed shake table tests to show that these accelerometers are also suitable for recording large shaking caused by earthquakes. We developed an Android app, MyShake, which can distinguish earthquake movements from daily human activities in the recordings from the accelerometers in personal smartphones and upload trigger information/waveforms to our server for further analysis. The data from these smartphones form a unique dataset for seismological applications, such as earthquake early warning. In this talk I will lay out the method we used to recognize earthquake-like movement from a single smartphone, and give an overview of the whole system that harnesses the information from a network of smartphones for rapid earthquake detection. This type of system can be easily deployed and scaled up around the globe and provides additional insights into earthquake hazards.
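
    MyShake's own discrimination algorithm is not reproduced here; as a generic illustration of how shaking can be picked out of a continuous acceleration stream, the sketch below applies a classic short-term-average / long-term-average (STA/LTA) trigger to a synthetic amplitude series. Window lengths, threshold, and the synthetic trace are all assumptions.

    ```python
    def sta_lta_trigger(amplitude, sta_len=50, lta_len=500, threshold=4.0):
        """Return sample indices where the short-term average of |acceleration|
        exceeds `threshold` times the long-term average (generic STA/LTA trigger,
        not the MyShake classifier)."""
        triggers = []
        for i in range(lta_len, len(amplitude)):
            sta = sum(amplitude[i - sta_len:i]) / sta_len
            lta = sum(amplitude[i - lta_len:i]) / lta_len
            if lta > 0 and sta / lta > threshold:
                triggers.append(i)
        return triggers

    # Synthetic trace: low-level noise with a burst of strong shaking at sample 800
    trace = [0.01] * 1200
    for i in range(800, 900):
        trace[i] = 0.2
    print(sta_lta_trigger(trace)[:1])   # first triggered sample, shortly after the onset
    ```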

  12. Strong motion duration and earthquake magnitude relationships

    International Nuclear Information System (INIS)

    Salmon, M.W.; Short, S.A.; Kennedy, R.P.

    1992-06-01

    Earthquake duration is the total time of ground shaking from the arrival of seismic waves until the return to ambient conditions. Much of this time is at relatively low shaking levels which have little effect on seismic structural response and on earthquake damage potential. As a result, a parameter termed "strong motion duration" has been defined by a number of investigators to be used for the purpose of evaluating seismic response and assessing the potential for structural damage due to earthquakes. This report presents methods for determining strong motion duration and a time history envelope function appropriate for various evaluation purposes, for earthquake magnitude and distance, and for site soil properties. There are numerous definitions of strong motion duration. For most of these definitions, empirical studies have been completed which relate duration to earthquake magnitude and distance and to site soil properties. Each of these definitions recognizes that only the portion of an earthquake record which has sufficiently high acceleration amplitude, energy content, or some other parameter significantly affects seismic response. Studies have been performed which indicate that the portion of an earthquake record in which the power (average rate of energy input) is maximum correlates most closely with potential damage to stiff nuclear power plant structures. Hence, this report will concentrate on energy-based strong motion duration definitions.
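
    One widely used energy-based definition is the significant duration: the time between the instants at which the cumulative squared acceleration (proportional to Arias intensity) reaches fixed fractions of its final value. The 5-75% bounds and the synthetic record below are illustrative choices, not the specific definition adopted in this report.

    ```python
    def significant_duration(accel, dt, lo=0.05, hi=0.75):
        """Time between the instants when cumulative squared acceleration
        reaches the `lo` and `hi` fractions of its total (significant duration)."""
        cumulative, total = [], 0.0
        for a in accel:
            total += a * a * dt
            cumulative.append(total)
        t_lo = next(i for i, e in enumerate(cumulative) if e >= lo * total) * dt
        t_hi = next(i for i, e in enumerate(cumulative) if e >= hi * total) * dt
        return t_hi - t_lo

    # Synthetic record (dt = 0.01 s): 2 s weak motion, 6 s strong motion, 12 s coda
    accel = [0.02] * 200 + [0.30] * 600 + [0.01] * 1200
    print(round(significant_duration(accel, dt=0.01), 2))   # ~4.2 s, set by the strong phase
    ```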

  13. The Road to Total Earthquake Safety

    Science.gov (United States)

    Frohlich, Cliff

    Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas,Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake.What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.

  14. Strong motion duration and earthquake magnitude relationships

    Energy Technology Data Exchange (ETDEWEB)

    Salmon, M.W.; Short, S.A. [EQE International, Inc., San Francisco, CA (United States); Kennedy, R.P. [RPK Structural Mechanics Consulting, Yorba Linda, CA (United States)

    1992-06-01

    Earthquake duration is the total time of ground shaking from the arrival of seismic waves until the return to ambient conditions. Much of this time is at relatively low shaking levels which have little effect on seismic structural response and on earthquake damage potential. As a result, a parameter termed "strong motion duration" has been defined by a number of investigators to be used for the purpose of evaluating seismic response and assessing the potential for structural damage due to earthquakes. This report presents methods for determining strong motion duration and a time history envelope function appropriate for various evaluation purposes, for earthquake magnitude and distance, and for site soil properties. There are numerous definitions of strong motion duration. For most of these definitions, empirical studies have been completed which relate duration to earthquake magnitude and distance and to site soil properties. Each of these definitions recognizes that only the portion of an earthquake record which has sufficiently high acceleration amplitude, energy content, or some other parameter significantly affects seismic response. Studies have been performed which indicate that the portion of an earthquake record in which the power (average rate of energy input) is maximum correlates most closely with potential damage to stiff nuclear power plant structures. Hence, this report will concentrate on energy-based strong motion duration definitions.

  15. Stress triggering and the Canterbury earthquake sequence

    Science.gov (United States)

    Steacy, Sandy; Jiménez, Abigail; Holden, Caroline

    2014-01-01

    The Canterbury earthquake sequence, which includes the devastating Christchurch event of 2011 February, has to date led to losses of around 40 billion NZ dollars. The location and severity of the earthquakes were a surprise to most inhabitants, as the seismic hazard model was dominated by an expected Mw > 8 earthquake on the Alpine fault and an Mw 7.5 earthquake on the Porters Pass fault, 150 and 80 km to the west of Christchurch. The sequence to date has included an Mw = 7.1 earthquake and 3 Mw ≥ 5.9 events which migrated from west to east. Here we investigate whether the later events are consistent with stress triggering and whether a simple stress map produced shortly after the first earthquake would have accurately indicated the regions where the subsequent activity occurred. We find that 100 per cent of M > 5.5 earthquakes occurred in positive stress areas computed using a slip model for the first event that was available within 10 d of its occurrence. We further find that the stress changes at the starting points of major slip patches of post-Darfield main events are consistent with triggering, although this is not always true at the hypocentral locations. Our results suggest that Coulomb stress changes contributed to the evolution of the Canterbury sequence, and we note additional areas of increased stress in the Christchurch region and on the Porters Pass fault.
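
    The Coulomb failure stress change used in this kind of analysis is the shear stress change resolved on the receiver fault plus an effective friction coefficient times the normal stress change (positive for unclamping). The sketch below shows that relation; the friction value and stress changes are generic assumptions, not values from the Canterbury study.

    ```python
    def coulomb_stress_change(d_shear_mpa, d_normal_mpa, effective_friction=0.4):
        """dCFS = d_tau + mu' * d_sigma_n, with the shear stress change positive in
        the slip direction and the normal stress change positive for unclamping."""
        return d_shear_mpa + effective_friction * d_normal_mpa

    # A receiver fault loaded by 0.15 MPa of shear and 0.05 MPa of unclamping
    print(coulomb_stress_change(0.15, 0.05))   # +0.17 MPa -> moved toward failure
    ```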

  16. New geological perspectives on earthquake recurrence models

    International Nuclear Information System (INIS)

    Schwartz, D.P.

    1997-01-01

    In most areas of the world the record of historical seismicity is too short or uncertain to accurately characterize the future distribution of earthquakes of different sizes in time and space. Most faults have not ruptured once, let alone repeatedly. Ultimately, the ability to correctly forecast the magnitude, location, and probability of future earthquakes depends on how well one can quantify the past behavior of earthquake sources. Paleoseismological trenching of active faults, historical surface ruptures, liquefaction features, and shaking-induced ground deformation structures provides fundamental information on the past behavior of earthquake sources. These studies quantify (a) the timing of individual past earthquakes and fault slip rates, which lead to estimates of recurrence intervals and the development of recurrence models and (b) the amount of displacement during individual events, which allows estimates of the sizes of past earthquakes on a fault. When timing and slip per event are combined with information on fault zone geometry and structure, models that define individual rupture segments can be developed. Paleoseismicity data, in the form of timing and size of past events, provide a window into the driving mechanism of the earthquake engine--the cycle of stress build-up and release

  17. Ionospheric precursors for crustal earthquakes in Italy

    Directory of Open Access Journals (Sweden)

    L. Perrone

    2010-04-01

    Full Text Available Crustal earthquakes with magnitude 6.0>M≥5.5 observed in Italy for the period 1979–2009, including the most recent at L'Aquila on 6 April 2009, were considered to check whether the relationships obtained earlier for ionospheric precursors of strong Japanese earthquakes are valid for the Italian moderate earthquakes. The ionospheric precursors are based on the observed variations of the sporadic E-layer parameters (h'Es, fbEs) and foF2 at the ionospheric station Rome. Empirical dependencies for the seismo-ionospheric disturbances relating the earthquake magnitude and the epicenter distance are obtained and have been shown to be similar to those obtained earlier for Japanese earthquakes. The dependencies indicate that the disturbance spreads from the epicenter towards the periphery during the earthquake preparation process. The large lead times for precursor occurrence (up to 34 days for M=5.8–5.9) point to a prolonged preparation period. A possibility of using the obtained relationships for earthquake prediction is discussed.

  18. Losses Associated with Secondary Effects in Earthquakes

    Directory of Open Access Journals (Sweden)

    James E. Daniell

    2017-06-01

    Full Text Available The number of earthquakes with high damage and high losses has been limited to around 100 events since 1900. Looking at historical losses from 1900 onward, we see that around 100 key earthquakes (or around 1% of damaging earthquakes) have caused around 93% of fatalities globally. What is particularly interesting about this statistic is that within these events, secondary effects have played a major role, causing around 40% of economic losses and fatalities, as compared to shaking effects. Disaggregating the economic losses and fatalities due to secondary effects, and thereby demonstrating the relative influence of direct earthquake shaking in comparison to tsunami, fire, landslides, liquefaction, fault rupture, and other types of losses, is important if we are to understand the key causes of post-earthquake losses. The trends and major event impacts of secondary effects are explored in terms of their historic impact, as well as through improved ways to disaggregate them, using two case studies: the Tohoku 2011 event, covering the earthquake, tsunami, liquefaction, fire, and the nuclear impact; and the Chilean 1960 earthquake and tsunami.

  19. A critical history of British earthquakes

    Directory of Open Access Journals (Sweden)

    R. M. W. Musson

    2004-06-01

    Full Text Available This paper reviews the history of the study of historical British earthquakes. The publication of compendia of British earthquakes goes back as early as the late 16th Century. A boost to the study of earthquakes in Britain was given in the mid 18th Century as a result of two events occurring in London in 1750 (analogous to the general increase in earthquake studies in Europe five years later, after the 1755 Lisbon earthquake). The 19th Century saw a number of significant studies, culminating in the work of Davison, whose book-length catalogue was finally published in 1924. After that there is a gap, until interest in the subject was renewed in the mid 1970s. The expansion of the U.K. nuclear programme in the 1980s led to a series of large-scale investigations of historical British earthquakes, all based almost completely on primary historical data and conducted to high standards. The catalogue published by BGS in 1994 is a synthesis of these studies, and presents a parametric catalogue in which historical earthquakes are assessed from intensity data points based on primary source material. Since 1994, revisions to parameters have been minor and the new events discovered have been restricted to a few small events.

  20. On seismic intensities of questionnaires for 1996 earthquake near Akita-Miyagi prefecture

    Energy Technology Data Exchange (ETDEWEB)

    Nogoshi, M.; Sasaki, N. [Akita University, Akita (Japan). College of Education; Nakamura, M. [Nippon Geophysical Prospecting Co. Ltd., Tokyo (Japan)

    1997-05-27

    The earthquake that occurred in 1996 near the border of Akita and Miyagi Prefectures was a seismic event in a mountainous area with low population density. However, since a seismic intensity survey was considered necessary, a questionnaire investigation was carried out. The investigation focused on the following points: (1) to learn the seismic intensity distribution in the vicinity of the epicenter from the replies to the questionnaire, and (2) to learn what evacuation actions the residents took to avoid disasters from the earthquake, the first inland local earthquake to occur since the Hyogoken-nanbu earthquake in 1995. Because the main shock occurred on the Akita Prefecture side, the shocks were concentrated in the Akinomiya, Takamatsu, Sugawa and Koyasu areas, where the intensities were 4.0 to 4.5 in most cases. The largest aftershocks were concentrated on the Miyagi Prefecture side, with an intensity of 6.0 reported most often, followed by 5.5. The questionnaire on evacuation actions revealed that about 37% of the replies said "I jumped out of my house before I knew what had happened" and "I remember nothing about what I did because I was acting totally instinctively". The answers show how intense the experience was. This result indicates that how to turn such unconscious actions into conscious actions is an important issue in preventing disasters. 11 figs.

  1. Sense of Community and Depressive Symptoms among Older Earthquake Survivors Following the 2008 Earthquake in Chengdu China

    Science.gov (United States)

    Li, Yawen; Sun, Fei; He, Xusong; Chan, Kin Sun

    2011-01-01

    This study examined the impact of an earthquake as well as the role of sense of community as a protective factor against depressive symptoms among older Chinese adults who survived an 8.0 magnitude earthquake in 2008. A household survey of a random sample was conducted 3 months after the earthquake and 298 older earthquake survivors participated…

  2. A 'new generation' earthquake catalogue

    Directory of Open Access Journals (Sweden)

    E. Boschi

    2000-06-01

    Full Text Available In 1995, we published the first release of the Catalogo dei Forti Terremoti in Italia, 461 a.C. - 1980, in Italian (Boschi et al., 1995). Two years later this was followed by a second release, again in Italian, that included more earthquakes, more accurate research and a longer time span (461 B.C. to 1990) (Boschi et al., 1997). Aware that the record of Italian historical seismicity is probably the most extensive of the whole world, and hence that our catalogue could be of interest for a wider international readership, Italian was clearly not the appropriate language to share this experience with colleagues from foreign countries. Three years after publication of the second release therefore, and after much additional research and fine tuning of methodologies and algorithms, I am proud to introduce this third release in English. All the tools and accessories have been translated along with the texts describing the development of the underlying research strategies and current contents. The English title is Catalogue of Strong Italian Earthquakes, 461 B.C. to 1997. This Preface briefly describes the scientific context within which the Catalogue of Strong Italian Earthquakes was conceived and progressively developed. The catalogue is perhaps the most important outcome of a well-established joint project between the Istituto Nazionale di Geofisica, the leading Italian institute for basic and applied research in seismology and solid earth geophysics, and SGA (Storia Geofisica Ambiente), a private firm specialising in the historical investigation and systematisation of natural phenomena. In her contribution "Method of investigation, typology and taxonomy of the basic data: navigating between seismic effects and historical contexts", Emanuela Guidoboni outlines the general framework of modern historical seismology, its complex relation with instrumental seismology on the one hand and historical research on the other. This presentation also highlights

  3. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    Science.gov (United States)

    Wang, Yan; Lu, Yi

    2018-01-01

    The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data along the Longmenshan thrust fault three years after the Lushan earthquake. We used the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects in this study included 3944 student survivors in eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in the earthquake, or had relatives who died in the earthquake. The study concludes that researchers need to pay more attention to children and adolescents. The government should pay more attention to these groups and provide more economic support.

  4. ASSESSMENT OF EARTHQUAKE HAZARDS ON WASTE LANDFILLS

    DEFF Research Database (Denmark)

    Zania, Varvara; Tsompanakis, Yiannis; Psarropoulos, Prodromos

    Earthquake hazards may arise as a result of: (a) transient ground deformation, which is induced by seismic wave propagation, and (b) permanent ground deformation, which is caused by abrupt fault dislocation. Since the adequate performance of waste landfills after an earthquake is of utmost...... importance, the current study examines the impact of both types of earthquake hazards by performing efficient finite-element analyses. These also took into account the potential slip displacement development along the geosynthetic interfaces of the composite base liner. At first, the development of permanent...

  5. Sismosima: A pioneer project for earthquake detection

    International Nuclear Information System (INIS)

    Echague, C. de

    2015-01-01

    Currently it is only possible to study how earthquakes occur and to minimize their consequences, but the Sismosima project studies earthquakes with the aim of issuing a pre-alert if possible. The Geological and Mining Institute of Spain (IGME) launched this project, which in tests has already recorded, in instrumented caves, an increase in carbon dioxide (CO2) coinciding with the occurrence of an earthquake. It remains to be checked whether the gas emission occurs simultaneously with, before, or after the shaking. If it were before, a couple of minutes would be enough to give an early warning that could save lives and protect facilities. (Author)

  6. Wave-equation Based Earthquake Location

    Science.gov (United States)

    Tong, P.; Yang, D.; Yang, X.; Chen, J.; Harris, J.

    2014-12-01

    Precisely locating earthquakes is fundamentally important for studying earthquake physics, fault orientations and Earth's deformation. In industry, accurately determining hypocenters of microseismic events triggered in the course of a hydraulic fracturing treatment can help improve the production of oil and gas from unconventional reservoirs. We develop a novel earthquake location method based on solving full wave equations to accurately locate earthquakes (including microseismic earthquakes) in complex and heterogeneous structures. Traveltime residuals or differential traveltime measurements with the waveform cross-correlation technique are iteratively inverted to obtain the locations of earthquakes. The inversion process involves the computation of the Fréchet derivative with respect to the source (earthquake) location via the interaction between a forward wavefield emitting from the source to the receiver and an adjoint wavefield reversely propagating from the receiver to the source. When there is a source perturbation, the Fréchet derivative not only measures the influence of source location but also the effects of heterogeneity, anisotropy and attenuation of the subsurface structure on the arrival of seismic wave at the receiver. This is essential for the accuracy of earthquake location in complex media. In addition, to reduce the computational cost, we can first assume that seismic wave only propagates in a vertical plane passing through the source and the receiver. The forward wavefield, adjoint wavefield and Fréchet derivative with respect to the source location are all computed in a 2D vertical plane. By transferring the Fréchet derivative along the horizontal direction of the 2D plane into the ones along Latitude and Longitude coordinates or local 3D Cartesian coordinates, the source location can be updated in a 3D geometry. The earthquake location obtained with this combined 2D-3D approach can then be used as the initial location for a true 3D wave
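
    The full-wave Fréchet derivatives described above are beyond a short example, but the iterative structure of the location update can be illustrated with the classical traveltime-based analogue (a Geiger-style relocation in a homogeneous medium). The stations, velocity and synthetic picks below are invented for the demonstration; this is a simplified stand-in, not the authors' wave-equation method.

```python
# Geiger-style relocation: linearize traveltimes around a trial hypocentre and
# iteratively solve for corrections to (x, y, z, origin time) from residuals.
import numpy as np

def locate(stations, t_obs, v=6.0, x0=(0.0, 0.0, 10.0, 0.0), n_iter=10):
    """stations: (N, 3) receiver coordinates [km]; t_obs: P arrival times [s]."""
    m = np.array(x0, dtype=float)                 # unknowns: x, y, z, t0
    for _ in range(n_iter):
        d = stations - m[:3]
        r = np.linalg.norm(d, axis=1)             # source-receiver distances
        res = t_obs - (m[3] + r / v)              # traveltime residuals
        # Partial derivatives of predicted time w.r.t. (x, y, z) and t0
        G = np.hstack([-d / (v * r[:, None]), np.ones((len(r), 1))])
        dm, *_ = np.linalg.lstsq(G, res, rcond=None)
        m += dm
    return m

stations = np.array([[0, 30, 0], [25, -10, 0], [-20, -15, 0], [40, 20, 0]], float)
true = np.array([5.0, 8.0, 12.0, 0.5])
t_obs = true[3] + np.linalg.norm(stations - true[:3], axis=1) / 6.0
print(np.round(locate(stations, t_obs), 2))       # recovers the true source
```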

  7. Earthquake consequences and measures for reduction of seismic risk.

    Science.gov (United States)

    Jurukovski, D

    1997-09-01

    Earthquakes are one of the most destructive of all natural disasters. This article discusses the consequences of earthquakes on material property. In addition, measures for the control and reduction of the consequences of earthquakes are described. Emphasis is placed on appropriate preparation by the general population and the need for a rapid and efficient response of governmental agencies. Finally, the experience of the staff of the Institute of Earthquake Engineering and Engineering Seismology in minimizing the consequences of earthquakes is described.

  8. Modified Mercalli intensities for some recent California earthquakes and historic San Francisco Bay Region earthquakes

    Science.gov (United States)

    Bakun, William H.

    1998-01-01

    Modified Mercalli Intensity (MMI) data for recent California earthquakes were used by Bakun and Wentworth (1997) to develop a strategy for bounding the location and moment magnitude M of earthquakes from MMI observations only. Bakun (Bull. Seismol. Soc. Amer., submitted) used the Bakun and Wentworth (1997) strategy to analyze 19th century and early 20th century San Francisco Bay Region earthquakes. The MMI data and site corrections used in these studies are listed in this Open-file Report. 
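
    The bounding strategy can be illustrated as a grid search: each trial epicentre converts every MMI observation into a magnitude estimate through an intensity attenuation relation, the mean gives M, and the spread of the estimates measures misfit. The sketch below uses a hypothetical attenuation form with made-up coefficients and made-up observations; it is not the Bakun and Wentworth (1997) relation or data.

```python
# Conceptual grid search for bounding epicentre and M from MMI data alone.
# The attenuation relation and all numbers here are hypothetical placeholders.
import numpy as np

def intensity_magnitude(mmi, dist_km, a=1.0, b=1.5, c=0.003):
    # Hypothetical form: MMI = b*M - a*log10(dist) - c*dist, solved for M
    return (mmi + a * np.log10(dist_km) + c * dist_km) / b

def grid_search(sites, mmi, lons, lats):
    best = None
    for lon in lons:
        for lat in lats:
            d = np.hypot((sites[:, 0] - lon) * 88.0, (sites[:, 1] - lat) * 111.0)
            d = np.maximum(d, 5.0)                 # avoid vanishing distances
            mi = intensity_magnitude(mmi, d)       # per-site magnitude estimates
            if best is None or mi.std() < best[0]:
                best = (mi.std(), lon, lat, mi.mean())
    return best                                    # (misfit, lon, lat, M)

sites = np.array([[-122.0, 37.5], [-121.4, 36.9], [-122.6, 38.1]])
mmi = np.array([6.0, 5.0, 4.5])
print(grid_search(sites, mmi, np.linspace(-123, -121, 21), np.linspace(36.5, 38.5, 21)))
```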

  9. Nowcasting Earthquakes: A Comparison of Induced Earthquakes in Oklahoma and at the Geysers, California

    Science.gov (United States)

    Luginbuhl, Molly; Rundle, John B.; Hawkins, Angela; Turcotte, Donald L.

    2018-01-01

    Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitudes are selected, one large, say Mλ ≥ 4, and one small, say Mσ ≥ 2. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes. The cumulative probability distribution of these values is obtained. The earthquake potential score (EPS) is defined by the number of small earthquakes that have occurred since the last large earthquake; the point where this number falls on the cumulative probability distribution of interevent counts defines the EPS. A major advantage of nowcasting is that it utilizes "natural time", earthquake counts between events, rather than clock time. Thus, it is not necessary to decluster aftershocks, and the results are applicable if the level of induced seismicity varies in time. The application of natural time to the accumulation of the seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling. The increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the number of Mσ ≥ 2.75 earthquakes in Oklahoma to nowcast the number of Mλ ≥ 4.0 earthquakes in Oklahoma. The applicability of the scaling is illustrated during the rapid build-up of injection-induced seismicity between 2012 and 2016, and the subsequent reduction in seismicity associated with a reduction in fluid injections. The same method is applied to the geothermal-induced seismicity at the Geysers, California, for comparison.
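
    The natural-time counting behind the EPS can be written in a few lines. The sketch below is a schematic reading of the procedure described above, applied to a synthetic Gutenberg-Richter-like catalogue; the thresholds and catalogue are assumptions, not the authors' data or code.

```python
# Nowcasting sketch: count small events between successive large events, build
# the distribution of those counts, and score the current count against it.
import numpy as np

def earthquake_potential_score(mags, m_small=2.0, m_large=4.0):
    """mags: catalogue magnitudes in time order (no declustering required)."""
    counts, n = [], 0
    for m in mags:
        if m >= m_large:
            counts.append(n)          # small-event count closing this cycle
            n = 0
        elif m >= m_small:
            n += 1
    counts = np.array(counts)
    # EPS: fraction of past interevent counts that are <= the current count
    # of small events accumulated since the last large earthquake.
    return float(np.mean(counts <= n)) if counts.size else float("nan")

# Toy GR-like catalogue with b = 1 above magnitude 2
rng = np.random.default_rng(1)
mags = 2.0 + rng.exponential(1.0 / np.log(10.0), size=20000)
print(round(earthquake_potential_score(mags), 2))
```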

  10. Source of 1629 Banda Mega-Thrust Earthquake and Tsunami: Implications for Tsunami Hazard Evaluation in Eastern Indonesia

    Science.gov (United States)

    Major, J. R.; Liu, Z.; Harris, R. A.; Fisher, T. L.

    2011-12-01

    in 1629 to the Seram and Timor Troughs. For the Seram Trough source, a Mw 8.8 earthquake produces run-up heights in the Banda Islands of 15.5 m with an arrival time of 17 minutes. For a Timor Trough earthquake near the Tanimbar Islands, a Mw 9.2 is needed to produce a 15 m run-up height with an arrival time of 25 minutes. The main problem with the Timor Trough source is that it predicts run-up heights in Ambon of 10 m, which would likely have been recorded. Therefore, we conclude that the most likely source of the 1629 mega-thrust earthquake is the Seram Trough. No large earthquakes have been reported along the Seram Trough for over 200 years, although high rates of strain are measured across it. This study suggests that earthquakes generated along this fault zone could be extremely devastating to Eastern Indonesia. We strive to raise awareness among the local government not to underestimate the natural hazard of this region, based on lessons learned from the 2004 Sumatra and 2011 Tohoku tsunamigenic mega-thrust earthquakes.

  11. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    Science.gov (United States)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake induced surface effects of liquefaction against a traditional pixel based change technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise in automatically extracting earthquake-induced damages from high-resolution aerial/satellite imagery.

  12. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    Science.gov (United States)

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.

  13. Classrooms without Borders: New Spaces and Places of Learning

    Science.gov (United States)

    Pawson, Eric

    2016-01-01

    This article identifies what can be learned from seeking to adapt teaching and learning styles in a post-disaster environment. It focuses on the development of student research through community-based learning as a means of increasing engagement and contributing to recovery in an earthquake-damaged city. It urges consideration of the…

  14. Can mine tremors be predicted? Observational studies of earthquake nucleation, triggering and rupture in South African mines

    CSIR Research Space (South Africa)

    Durrheim, RJ

    2012-05-01

    Full Text Available -related stresses are likely to induce significant seismic activity; (2) to learn more about earthquake rupture and damage phenomena by deploying strong ground motion sensors close to potential rupture zones and on the walls of stopes; and (3) to upgrade the South...

  15. Strong ground motion prediction using virtual earthquakes.

    Science.gov (United States)

    Denolle, M A; Dunham, E M; Prieto, G A; Beroza, G C

    2014-01-24

    Sedimentary basins increase the damaging effects of earthquakes by trapping and amplifying seismic waves. Simulations of seismic wave propagation in sedimentary basins capture this effect; however, there exists no method to validate these results for earthquakes that have not yet occurred. We present a new approach for ground motion prediction that uses the ambient seismic field. We apply our method to a suite of magnitude 7 scenario earthquakes on the southern San Andreas fault and compare our ground motion predictions with simulations. Both methods find strong amplification and coupling of source and structure effects, but they predict substantially different shaking patterns across the Los Angeles Basin. The virtual earthquake approach provides a new approach for predicting long-period strong ground motion.

  16. DYFI data for Induced Earthquake Studies

    Data.gov (United States)

    Department of the Interior — The significant rise in seismicity rates in Oklahoma and Kansas (OK–KS) in the last decade has led to an increased interest in studying induced earthquakes. Although...

  17. Coping with earthquakes induced by fluid injection

    Science.gov (United States)

    McGarr, Arthur F.; Bekins, Barbara; Burkardt, Nina; Dewey, James W.; Earle, Paul S.; Ellsworth, William L.; Ge, Shemin; Hickman, Stephen H.; Holland, Austin F.; Majer, Ernest; Rubinstein, Justin L.; Sheehan, Anne

    2015-01-01

    Large areas of the United States long considered geologically stable with little or no detected seismicity have recently become seismically active. The increase in earthquake activity began in the mid-continent starting in 2001 (1) and has continued to rise. In 2014, the rate of occurrence of earthquakes with magnitudes (M) of 3 and greater in Oklahoma exceeded that in California (see the figure). This elevated activity includes larger earthquakes, several with M > 5, that have caused significant damage (2, 3). To a large extent, the increasing rate of earthquakes in the mid-continent is due to fluid-injection activities used in modern energy production (1, 4, 5). We explore potential avenues for mitigating effects of induced seismicity. Although the United States is our focus here, Canada, China, the UK, and others confront similar problems associated with oil and gas production, whereas quakes induced by geothermal activities affect Switzerland, Germany, and others.

  18. Masonry infill performance during the Northridge earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Flanagan, R.D. [Lockheed Martin Energy Systems, Oak Ridge, TN (United States); Bennett, R.M.; Fischer, W.L. [Univ. of Tennessee, Knoxville, TN (United States); Adham, S.A. [Agbabian Associates, Pasadena, CA (United States)]

    1996-03-08

    The response of masonry infills during the 1994 Northridge, California earthquake is described in terms of three categories: (1) lowrise and midrise structures experiencing large near field seismic excitations, (2) lowrise and midrise structures experiencing moderate far field excitation, and (3) highrise structures experiencing moderate far field excitation. In general, the infills provided a positive beneficial effect on the performance of the buildings, even those experiencing large peak accelerations near the epicenter. Varying types of masonry infills, structural frames, design conditions, and construction deficiencies were observed and their performance during the earthquake indicated. A summary of observations of the performance of infills in other recent earthquakes is given. Comparison with the Northridge earthquake is made and expected response of infill structures in lower seismic regions of the central and eastern United States is discussed.

  19. SHOCK WAVE IN IONOSPHERE DURING EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    V.V. Kuznetsov

    2016-11-01

    Full Text Available Fundamentally new model of the shock wave (SW generation in atmosphere and ionosphere during earthquake is proposed. The model proceeds from the idea of cooperative shock water crystallization in a cloud

  20. Safety and survival in an earthquake

    Science.gov (United States)

    ,

    1969-01-01

    Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations.

  1. NGA Nepal Earthquake Support Data Services

    Data.gov (United States)

    National Geospatial Intelligence Agency — In support of the Spring 2015 Nepal earthquake response, NGA is providing to the public and humanitarian disaster response community these Nepal data services. They...

  2. The 15 April 1909 Taipei Earthquake

    Directory of Open Access Journals (Sweden)

    Jeen-Hwa Wang

    2011-01-01

    Full Text Available In the very early morning, at 03 h 53.7 m on 15 April 1909 (local time), a large earthquake occurred in northern Taiwan. In all, 9 persons were killed and 51 injured; 122 houses collapsed along with damage to another 1050 houses. This earthquake was one of the largest and most damaging events of the 20th century for the Taipei Metropolitan Area. The epicenter estimated by Hsu (1971) was determined to be 25°N, 121.53°E, and its focal depth and earthquake magnitude evaluated by Gutenberg and Richter (1954) were ~80 km and MGR = 7.3, respectively. The event took place underneath the Taipei Metropolitan Area and might be located at the western edge of the subduction zone of the Philippine Sea plate. In this study, the magnitudes of the earthquake determined by others will also be described.

  3. Drinking Water Earthquake Resilience Paper Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — Data for the 9 figures contained in the paper, A SOFTWARE FRAMEWORK FOR ASSESSING THE RESILIENCE OF DRINKING WATER SYSTEMS TO DISASTERS WITH AN EXAMPLE EARTHQUAKE...

  4. Mexican Earthquakes and Tsunamis Catalog Reviewed

    Science.gov (United States)

    Ramirez-Herrera, M. T.; Castillo-Aja, R.

    2015-12-01

    Today the availability of information on the internet makes online catalogs very easy to access, both for scholars and for the public in general. The catalog in the "Significant Earthquake Database", managed by the National Center for Environmental Information (NCEI, formerly NCDC), NOAA, allows access to tabular and cartographic data related to the earthquakes and tsunamis contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in the places where a disaster occurred, and on whether a description of it was preserved in documents and oral tradition. In the case of instrumental data, availability depends on the distribution and quality of seismic stations. Therefore, the availability of information for the first half of the 20th century can be improved by careful analysis of the available information and by searching for and resolving inconsistencies. This study shows the advances we made in upgrading and refining data for the earthquake and tsunami catalog of Mexico from 1500 CE to the present, presented as a table and a map. Data analysis allowed us to identify the following sources of error in the location of epicenters in existing catalogs: incorrect coordinate entry; erroneous or mistaken place names; data too general to locate the epicenter, mainly for older earthquakes; and inconsistency between earthquake and tsunami occurrence, e.g. an earthquake epicenter located too far inland being reported as tsunamigenic. The process of completing the catalogs depends directly on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and

  5. Constraining subducted slab properties with deep earthquakes

    Science.gov (United States)

    Zhan, Z.; Yang, T.; Gurnis, M.; Shen, Z.; Wu, F.

    2017-12-01

    The discovery of deep earthquakes and the Wadati-Benioff zone was a critical piece in the early history of plate tectonics. Today, deep earthquakes continue to serve as important markers/probes of subducted slab geometry, structure, and stress state. Here we discuss three examples in which we have recently used deep earthquakes to provide new insights into subducted slab properties. In the first application, we investigate the slab morphology and stress regimes under different trench motion histories with geodynamic models. We find that the isolation of the 2015 Mw 7.9 Bonin Islands deep earthquake from the background Wadati-Benioff zone may be explained as a result of Pacific slab buckling in response to the slow trench retreat. Additionally, the subducted slab is inherently heterogeneous due to non-linear viscosity, contributing to the occurrence of isolated deep earthquakes. In the second application, we quantify the coda waveform differences from nearby deep earthquakes to image fine-scale slab structures. We find that the large metastable olivine wedge suggested by several previous studies cannot fit our observations. Therefore, the effects of metastable olivine on slab dynamics should be re-assessed. In the third application, we take advantage of P and S differential travel times from deep earthquake clusters to isolate signatures of Vp/Vs ratios within slabs from the ambient mantle. We observe substantial deviations of slab Vp/Vs from that in 1D reference Earth models, and even possible lateral variations. This sheds light on potential differences in slab temperature or water content. All three applications underscore that deep earthquakes are still incredibly useful in telling us more about subducted slabs.
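
    One simple way to see how differential travel times constrain Vp/Vs, under strong simplifying assumptions, is the Wadati-type estimate sketched below: for events clustered closely enough that ray paths are shared, demeaned S differential times plotted against demeaned P differential times have a slope of roughly Vp/Vs. This is only a conceptual stand-in with synthetic numbers, not the authors' procedure.

```python
# Vp/Vs from P and S differential times of an event cluster (Wadati-type idea):
# after demeaning, the least-squares slope of dtS versus dtP estimates Vp/Vs.
import numpy as np

def vpvs_from_differential_times(dtp, dts):
    dtp = dtp - dtp.mean()
    dts = dts - dts.mean()
    return float(np.sum(dtp * dts) / np.sum(dtp * dtp))   # slope through origin

rng = np.random.default_rng(2)
true_ratio = 1.75
dtp = rng.normal(0.0, 0.3, 200)                        # synthetic P differentials
dts = true_ratio * dtp + rng.normal(0.0, 0.02, 200)    # S differentials + noise
print(round(vpvs_from_differential_times(dtp, dts), 3))
```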

  6. Natural Gas Extraction, Earthquakes and House Prices

    OpenAIRE

    Hans R.A. Koster; Jos N. van Ommeren

    2015-01-01

    The production of natural gas is strongly increasing around the world. Long-run negative external effects of extraction are understudied and often ignored in (social) cost-benefit analyses. One important example is that natural gas extraction leads to soil subsidence and subsequent induced earthquakes that may occur only after a couple of decades. We show that induced earthquakes that are noticeable to residents generate substantial non-monetary economic effects, as measured by their effects o...

  7. Earthquake geology of the Bulnay Fault (Mongolia)

    Science.gov (United States)

    Rizza, Magali; Ritz, Jean-Franciois; Prentice, Carol S.; Vassallo, Ricardo; Braucher, Regis; Larroque, Christophe; Arzhannikova, A.; Arzhanikov, S.; Mahan, Shannon; Massault, M.; Michelot, J-L.; Todbileg, M.

    2015-01-01

    The Bulnay earthquake of July 23, 1905 (Mw 8.3-8.5), in north-central Mongolia, is one of the world's largest recorded intracontinental earthquakes and one of four great earthquakes that occurred in the region during the 20th century. The 375-km-long surface rupture of the left-lateral, strike-slip, N095°E trending Bulnay Fault associated with this earthquake is remarkable for its pronounced expression across the landscape and for the size of features produced by previous earthquakes. Our field observations suggest that in many areas the width and geometry of the rupture zone is the result of repeated earthquakes; however, in those areas where it is possible to determine that the geomorphic features are the result of the 1905 surface rupture alone, the size of the features produced by this single earthquake is singular in comparison to most other historical strike-slip surface ruptures worldwide. Along the 80 km stretch between 97.18°E and 98.33°E, the fault zone is several meters wide, and the mean left-lateral 1905 offset is 8.9 ± 0.6 m, with two measured cumulative offsets that are twice the 1905 slip. These observations suggest that the displacement produced during the penultimate event was similar to the 1905 slip. Morphotectonic analyses carried out at three sites along the eastern part of the Bulnay fault allow us to estimate a mean horizontal slip rate of 3.1 ± 1.7 mm/yr over the Late Pleistocene-Holocene period. In parallel, paleoseismological investigations show evidence for two earthquakes prior to the 1905 event, with recurrence intervals of ~2700-4000 years.

  8. Evaluation of near-field earthquake effects

    Energy Technology Data Exchange (ETDEWEB)

    Shrivastava, H.P.

    1994-11-01

    Structures and equipment, which are qualified for the design basis earthquake (DBE) and have anchorage designed for the DBE loading, do not require an evaluation of the near-field earthquake (NFE) effects. However, safety class 1 acceleration sensitive equipment such as electrical relays must be evaluated for both NFE and DBE since they are known to malfunction when excited by high frequency seismic motions.

  9. The Christchurch earthquake stroke incidence study.

    Science.gov (United States)

    Wu, Teddy Y; Cheung, Jeanette; Cole, David; Fink, John N

    2014-03-01

    We examined the impact of major earthquakes on acute stroke admissions by a retrospective review of stroke admissions in the 6 weeks following the 4 September 2010 and 22 February 2011 earthquakes. The control period was the corresponding 6 weeks in the previous year. In the 6 weeks following the September 2010 earthquake there were 97 acute stroke admissions, with 79 (81.4%) ischaemic infarctions. This was similar to the 2009 control period which had 104 acute stroke admissions, of whom 80 (76.9%) had ischaemic infarction. In the 6 weeks following the February 2011 earthquake, there were 71 stroke admissions, and 61 (79.2%) were ischaemic infarction. This was less than the 96 strokes (72 [75%] ischaemic infarction) in the corresponding control period. None of the comparisons were statistically significant. There was also no difference in the rate of cardioembolic infarction from atrial fibrillation between the study periods. Patients admitted during the February 2011 earthquake period were less likely to be discharged directly home when compared to the control period (31.2% versus 46.9%, p=0.036). There was no observable trend in the number of weekly stroke admissions between the 2 weeks leading to and 6 weeks following the earthquakes. Our results suggest that severe psychological stress from earthquakes did not influence the subsequent short term risk of acute stroke, but the severity of the earthquake in February 2011 and associated civil structural damages may have influenced the pattern of discharge for stroke patients. Copyright © 2013 Elsevier Ltd. All rights reserved.
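
    The discharge comparison can be checked with a standard two-proportion z-test. The counts below (22 of 71 versus 45 of 96) are reconstructed approximately from the percentages quoted in the abstract, so they are assumptions and the resulting p-value will only be close to the reported 0.036.

```python
# Two-proportion z-test on the "discharged directly home" comparison; the
# counts are approximate reconstructions from the reported percentages.
from math import sqrt
from scipy.stats import norm

def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                         # pooled proportion
    z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return z, 2 * norm.sf(abs(z))                     # two-sided p-value

z, p = two_proportion_z(22, 71, 45, 96)
print(round(z, 2), round(p, 3))                       # p comes out near 0.04
```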

  10. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    Science.gov (United States)

    Madariaga, R.

    2013-05-01

    The 27 February 2010 Maule earthquake occurred in a well identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we will review the information gathered so far. The event broke a region that was much longer along strike than the gap left over from the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when the earthquake occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously broken in a similar earthquake in 1751, but several events in the magnitude 8 range occurred in the area, principally in 1835 as already mentioned and, more recently, on 1 December 1928 to the North and on 21 May 1960 (1 1/2 days before the big Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the North are undergoing a very large increase in seismicity, with numerous clusters of seismicity that move along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the North of the 2010 earthquake broke in a very large megathrust event in July 1730. This is the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions in M 8 range earthquakes, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but it has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough stress from the 1730 rupture zone. Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  11. Experimental study of structural response to earthquakes

    International Nuclear Information System (INIS)

    Clough, R.W.; Bertero, V.V.; Bouwkamp, J.G.; Popov, E.P.

    1975-01-01

    The objectives, methods, and some of the principal results obtained from experimental studies of the behavior of structures subjected to earthquakes are described. Although such investigations are being conducted in many laboratories throughout the world, the information presented deals specifically with projects being carried out at the Earthquake Engineering Research Center (EERC) of the University of California, Berkeley. A primary purpose of these investigations is to obtain detailed information on the inelastic response mechanisms in typical structural systems so that the experimentally observed performance can be compared with computer generated analytical predictions. Only by such comparisons can the mathematical models used in dynamic nonlinear analyses be verified and improved. Two experimental procedures for investigating earthquake structural response are discussed: the earthquake simulator facility, which subjects the base of the test structure to acceleration histories similar to those recorded in actual earthquakes, and systems of hydraulic rams, which impose specified displacement histories on the test components, equivalent to motions developed in structures subjected to actual earthquakes. The general concept and performance of the 20-ft-square EERC earthquake simulator are described, and the testing of a two story concrete frame building is outlined. Correlation of the experimental results with analytical predictions demonstrates that satisfactory agreement can be obtained only if the mathematical model incorporates a stiffness deterioration mechanism which simulates the cracking and other damage suffered by the structure.

  12. Relationship of heat and cold to earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y.

    1980-06-26

    An analysis of 54 earthquakes of magnitude 7 and above, including 13 of magnitude 8 and above, between 780 BC and the present, shows that the vast majority of them fell in the four major cool periods during this time span, or on the boundaries of these periods. Between 1800 and 1876, four periods of earthquake activity in China can be recognized, and these tend to correspond to relatively cold periods over that time span. An analysis of earthquakes of magnitude 6 or above over the period 1951 to 1965 gives the following results: earthquakes in north and southwest China tended to occur when the preceding year had an above-average annual temperature and winter temperature; in the northeast they tended to occur in a year after a year with an above-average winter temperature; in the northwest there was also a connection with a preceding warm winter, but to a less pronounced degree. The few earthquakes in South China seemed to follow cold winters. Both the Tangshan and Yongshan Pass earthquakes were preceded by unusually warm years and relatively high winter temperatures.

  13. Do weak global stresses synchronize earthquakes?

    Science.gov (United States)

    Bendick, R.; Bilham, R.

    2017-08-01

    Insofar as slip in an earthquake is related to the strain accumulated near a fault since a previous earthquake, and this process repeats many times, the earthquake cycle approximates an autonomous oscillator. Its asymmetric slow accumulation of strain and rapid release is quite unlike the harmonic motion of a pendulum and need not be time predictable, but still resembles a class of repeating systems known as integrate-and-fire oscillators, whose behavior has been shown to demonstrate a remarkable ability to synchronize to either external or self-organized forcing. Given sufficient time and even very weak physical coupling, the phases of sets of such oscillators, with similar though not necessarily identical period, approach each other. Topological and time series analyses presented here demonstrate that earthquakes worldwide show evidence of such synchronization. Though numerous studies demonstrate that the composite temporal distribution of major earthquakes in the instrumental record is indistinguishable from random, the additional consideration of event renewal interval serves to identify earthquake groupings suggestive of synchronization that are absent in synthetic catalogs. We envisage the weak forces responsible for clustering originate from lithospheric strain induced by seismicity itself, by finite strains over teleseismic distances, or by other sources of lithospheric loading such as Earth's variable rotation. For example, quasi-periodic maxima in rotational deceleration are accompanied by increased global seismicity at multidecadal intervals.
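
    A toy pulse-coupled (integrate-and-fire) simulation conveys the synchronization idea: each oscillator's state grows steadily, resets when it "fires", and every firing gives all the others a weak nudge. All parameters below are arbitrary illustrative choices, not values from the study.

```python
# Weakly coupled integrate-and-fire oscillators: even a small coupling slowly
# pulls oscillators with slightly different loading rates toward common firing.
import numpy as np

def simulate(n=50, steps=100000, dt=1e-3, coupling=2e-3, seed=3):
    rng = np.random.default_rng(seed)
    rate = 1.0 + 0.05 * rng.standard_normal(n)       # slightly different periods
    state = rng.random(n)                            # random initial phases
    last_fire = np.zeros(n)
    for k in range(steps):
        state += rate * dt
        fired = state >= 1.0
        if fired.any():
            state[fired] = 0.0                       # "earthquake": strain release
            state[~fired] += coupling * fired.sum()  # weak nudge to the others
            last_fire[fired] = k * dt
    return last_fire

last = simulate()
# Crude synchrony measure: spread of the most recent firing times
print("spread of last firing times:", round(float(last.max() - last.min()), 3))
```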

  14. Satellite-based emergency mapping using optical imagery: experience and reflections from the 2015 Nepal earthquakes

    Directory of Open Access Journals (Sweden)

    J. G. Williams

    2018-01-01

    Full Text Available Landslides triggered by large earthquakes in mountainous regions contribute significantly to overall earthquake losses and pose a major secondary hazard that can persist for months or years. While scientific investigations of coseismic landsliding are increasingly common, there is no protocol for rapid (hours-to-days humanitarian-facing landslide assessment and no published recognition of what is possible and what is useful to compile immediately after the event. Drawing on the 2015 Mw 7.8 Gorkha earthquake in Nepal, we consider how quickly a landslide assessment based upon manual satellite-based emergency mapping (SEM can be realistically achieved and review the decisions taken by analysts to ascertain the timeliness and type of useful information that can be generated. We find that, at present, many forms of landslide assessment are too slow to generate relative to the speed of a humanitarian response, despite increasingly rapid access to high-quality imagery. Importantly, the value of information on landslides evolves rapidly as a disaster response develops, so identifying the purpose, timescales, and end users of a post-earthquake landslide assessment is essential to inform the approach taken. It is clear that discussions are needed on the form and timing of landslide assessments, and how best to present and share this information, before rather than after an earthquake strikes. In this paper, we share the lessons learned from the Gorkha earthquake, with the aim of informing the approach taken by scientists to understand the evolving landslide hazard in future events and the expectations of the humanitarian community involved in disaster response.

  15. Satellite-based emergency mapping using optical imagery: experience and reflections from the 2015 Nepal earthquakes

    Science.gov (United States)

    Williams, Jack G.; Rosser, Nick J.; Kincey, Mark E.; Benjamin, Jessica; Oven, Katie J.; Densmore, Alexander L.; Milledge, David G.; Robinson, Tom R.; Jordan, Colm A.; Dijkstra, Tom A.

    2018-01-01

    Landslides triggered by large earthquakes in mountainous regions contribute significantly to overall earthquake losses and pose a major secondary hazard that can persist for months or years. While scientific investigations of coseismic landsliding are increasingly common, there is no protocol for rapid (hours-to-days) humanitarian-facing landslide assessment and no published recognition of what is possible and what is useful to compile immediately after the event. Drawing on the 2015 Mw 7.8 Gorkha earthquake in Nepal, we consider how quickly a landslide assessment based upon manual satellite-based emergency mapping (SEM) can be realistically achieved and review the decisions taken by analysts to ascertain the timeliness and type of useful information that can be generated. We find that, at present, many forms of landslide assessment are too slow to generate relative to the speed of a humanitarian response, despite increasingly rapid access to high-quality imagery. Importantly, the value of information on landslides evolves rapidly as a disaster response develops, so identifying the purpose, timescales, and end users of a post-earthquake landslide assessment is essential to inform the approach taken. It is clear that discussions are needed on the form and timing of landslide assessments, and how best to present and share this information, before rather than after an earthquake strikes. In this paper, we share the lessons learned from the Gorkha earthquake, with the aim of informing the approach taken by scientists to understand the evolving landslide hazard in future events and the expectations of the humanitarian community involved in disaster response.

  16. My Road to Transform Faulting 1963; Long-Term Precursors to Recent Great Earthquakes

    Science.gov (United States)

    Sykes, L. R.

    2017-12-01

    My road to plate tectonics started serendipitously in 1963 in a remote area of the southeast Pacific when I was studying the propagation of short-period seismic surface waves for my PhD. The earthquakes I used as sources were poorly located. I discovered that my relocated epicenters followed the crest of the East Pacific Rise but then suddenly took a sharp turn to the east at what I interpreted to be a major fracture zone 1000 km long before turning again to the north near 55 degrees south. I noted that earthquakes along that zone only occurred between the two ridge crests, an observation Tuzo Wilson used to develop his hypothesis of transform faulting. Finding a great, unknown fracture zone led me to conclude that work on similar faults that intersect the Mid-Oceanic Ridge System was more important than my study of surface waves. I found similar great faults over the next two years and obtained refined locations of earthquakes along several island arcs. When I was in Fiji and Tonga during 1965 studying deep earthquakes, James Dorman wrote to me about Wilson's paper and I thought about testing his hypothesis. I started work on it the spring of 1966 immediately after I learned about the symmetrical "magic magnetic anomaly profile" across the East Pacific Rise of Pitman and Heirtzler. I quickly obtained earthquake mechanisms that verified the transform hypothesis and its related concepts of seafloor spreading and continental drift. As an undergraduate in the late 1950s, my mentor told me that respectable young earth scientists should not work on vague and false mobilistic concepts like continental drift since continents cannot plow through strong oceanic crust. Hence, until spring 1966, I did not take continental drift seriously. The second part of my presentation involves new evidence from seismology and GPS of what appear to be long-term precursors to a number of great earthquakes of the past decade.

  17. Jumping over the hurdles to effectively communicate the Operational Earthquake Forecast

    Science.gov (United States)

    McBride, S.; Wein, A. M.; Becker, J.; Potter, S.; Tilley, E. N.; Gerstenberger, M.; Orchiston, C.; Johnston, D. M.

    2016-12-01

    Probabilities, uncertainties, statistics, science, and threats are notoriously difficult topics to communicate to members of the public. The Operational Earthquake Forecast (OEF) is designed to provide an understanding of the potential numbers and sizes of earthquakes, and its communication must address all of those challenges. Furthermore, there are other barriers to effective communication of the OEF. These barriers include the erosion of trust in scientists and experts, oversaturation of messages, fear and threat messages magnified by media sensationalisation, fractured media environments and online echo chambers. Given the complexities and challenges of the OEF, how can we overcome barriers to effective communication? Crisis and risk communication research, applied to the opportunities and challenges of practice, can inform the development of communication strategies to increase public understanding and use of the OEF. We explore ongoing research regarding how the OEF can be more effectively communicated, including the channels, tools and message composition to engage with a variety of publics. We also draw on past experience and a study of OEF communication during the Canterbury Earthquake Sequence (CES). We demonstrate how research and experience have guided OEF communications during subsequent events in New Zealand, including the M5.7 Valentine's Day earthquake in 2016 (CES), the M6.0 Wilberforce earthquake in 2015, and the Cook Strait/Lake Grassmere earthquakes in 2013. We identify the successes and lessons learned in the practical communication of the OEF. Finally, we present future projects and directions in the communication of the OEF, informed by both practice and research.

  18. Geological and historical evidence of irregular recurrent earthquakes in Japan.

    Science.gov (United States)

    Satake, Kenji

    2015-10-28

    Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres releasing strains accumulated from decades to centuries of plate motions. Assuming a simple 'characteristic earthquake' model that similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes including geological traces from giant (M∼9) earthquakes indicate a variety of size and recurrence interval of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that average recurrence interval of great earthquakes is approximately 100 years, but the tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake with an interval of approximately 600 years is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence interval of a few hundred years and a few thousand years had been recognized, but studies show that the recent three Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for the long-term forecast, and several attempts such as use of geological data for the evaluation of future earthquake probabilities or the estimation of maximum earthquake size in each subduction zone are being conducted by government committees. © 2015 The Author(s).

  19. Radon anomalies prior to earthquakes (2). Atmospheric radon anomaly observed before the Hyogoken-Nanbu earthquake

    International Nuclear Information System (INIS)

    Ishikawa, Tetsuo; Tokonami, Shinji; Yasuoka, Yumi; Shinogi, Masaki; Nagahama, Hiroyuki; Omori, Yasutaka; Kawada, Yusuke

    2008-01-01

    Before the 1995 Hyogoken-Nanbu earthquake, various geochemical precursors were observed in the aftershock area: chloride ion concentration, groundwater discharge rate, groundwater radon concentration and so on. Kobe Pharmaceutical University (KPU) is located about 25 km northeast from the epicenter and within the aftershock area. Atmospheric radon concentration had been continuously measured from 1984 at KPU, using a flow-type ionization chamber. The radon concentration data were analyzed using the smoothed residual values which represent the daily minimum of radon concentration with the exclusion of normalized seasonal variation. The radon concentration (smoothed residual values) demonstrated an upward trend about two months before the Hyogoken-Nanbu earthquake. The trend can be well fitted to a log-periodic model related to earthquake fault dynamics. As a result of model fitting, a critical point was calculated to be between 13 and 27 January 1995, which was in good agreement with the occurrence date of earthquake (17 January 1995). The mechanism of radon anomaly before earthquakes is not fully understood. However, it might be possible to detect atmospheric radon anomaly as a precursor before a large earthquake, if (1) the measurement is conducted near the earthquake fault, (2) the monitoring station is located on granite (radon-rich) areas, and (3) the measurement is conducted for more than several years before the earthquake to obtain background data. (author)
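
    The log-periodic fit can be sketched with a generic log-periodic power-law form and a nonlinear least-squares solver; the functional form, parameters and synthetic data below are assumptions for illustration and do not reproduce the paper's model or radon data.

```python
# Fit a generic log-periodic power law to a synthetic pre-seismic trend and
# recover the critical time tc (here planted at 100 days).
import numpy as np
from scipy.optimize import curve_fit

def log_periodic(t, a, b, tc, m, c, omega, phi):
    dt = np.clip(tc - t, 1e-6, None)               # keep (tc - t) positive
    return a + b * dt**m * (1.0 + c * np.cos(omega * np.log(dt) + phi))

t = np.linspace(0.0, 95.0, 400)
rng = np.random.default_rng(4)
y = log_periodic(t, 10.0, -0.5, 100.0, 0.6, 0.2, 6.0, 1.0) + rng.normal(0.0, 0.05, t.size)

p0 = [10.0, -0.5, 102.0, 0.5, 0.1, 5.0, 0.0]       # rough initial guess
popt, _ = curve_fit(log_periodic, t, y, p0=p0, maxfev=20000)
print("estimated critical time tc ~", round(popt[2], 1), "days")
```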

  20. Continuing megathrust earthquake potential in Chile after the 2014 Iquique earthquake.

    Science.gov (United States)

    Hayes, Gavin P; Herman, Matthew W; Barnhart, William D; Furlong, Kevin P; Riquelme, Sebástian; Benz, Harley M; Bergman, Eric; Barrientos, Sergio; Earle, Paul S; Samsonov, Sergey

    2014-08-21

    The seismic gap theory identifies regions of elevated hazard based on a lack of recent seismicity in comparison with other portions of a fault. It has successfully explained past earthquakes (see, for example, ref. 2) and is useful for qualitatively describing where large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile, which had not ruptured in a megathrust earthquake since a M ∼8.8 event in 1877. On 1 April 2014 a M 8.2 earthquake occurred within this seismic gap. Here we present an assessment of the seismotectonics of the March-April 2014 Iquique sequence, including analyses of earthquake relocations, moment tensors, finite fault models, moment deficit calculations and cumulative Coulomb stress transfer. This ensemble of information allows us to place the sequence within the context of regional seismicity and to identify areas of remaining and/or elevated hazard. Our results constrain the size and spatial extent of rupture, and indicate that this was not the earthquake that had been anticipated. Significant sections of the northern Chile subduction zone have not ruptured in almost 150 years, so it is likely that future megathrust earthquakes will occur to the south and potentially to the north of the 2014 Iquique sequence.

  1. The Nankai Trough earthquake tsunamis in Korea: Numerical studies of the 1707 Hoei earthquake and physics-based scenarios

    Science.gov (United States)

    Kim, S.; Saito, T.; Fukuyama, E.; Kang, T. S.

    2016-12-01

    Historical documents in Korea and China report abnormal waves in the sea and rivers close to the date of the 1707 Hoei earthquake, which occurred in the Nankai Trough off southwestern Japan. This indicates that the tsunami caused by the Hoei earthquake might have reached Korea and China, which suggests a potential hazard in Korea from large earthquakes in the Nankai Trough. We conducted tsunami simulations to study the details of tsunamis in Korea caused by large earthquakes. We employed the 1707 Hoei earthquake source model and physics-based scenarios of anticipated earthquakes in the Nankai subduction zone. We also considered the effect of horizontal displacement on tsunami generation. Our simulation results from the Hoei earthquake model and the anticipated earthquake models showed that the maximum tsunami height along the Korean coast was less than 0.5 m. Even though the tsunami is not life-threatening, the effect of larger earthquakes should still be considered.

  2. An integrated digital system for earthquake damage reconnaissance

    Science.gov (United States)

    Deaton, Scott Lowrey

    surrounding buildings that suffered collateral damage as the towers collapsed. PQuake provides the ability to obtain damage data that is comprehensive and accurate. In order to learn as much as possible from catastrophic events, civil engineers must adopt new technologies and incorporate new reconnaissance protocols. This dissertation presents the development of an integrated digital system for earthquake damage reconnaissance that serves as a tool and a means for implementing the reconnaissance procedures.

  3. Integrating Real-time Earthquakes into Natural Hazard Courses

    Science.gov (United States)

    Furlong, K. P.; Benz, H. M.; Whitlock, J. S.; Bittenbinder, A. N.; Bogaert, B. B.

    2001-12-01

    Natural hazard courses are playing an increasingly important role in college and university earth science curricula. Students' intrinsic curiosity about the subject and the potential to make the course relevant to the interests of both science and non-science students make natural hazards courses popular additions to a department's offerings. However, one vital aspect of "real-life" natural hazard management that has not translated well into the classroom is the real-time nature of both events and response. The lack of a way to entrain students into the event/response mode has made implementing such real-time activities in the classroom problematic. Although a variety of web sites provide near real-time postings of natural hazards, students essentially learn of the event after the fact. This is particularly true for earthquakes and other events with few precursors. As a result, the "time factor" and personal responsibility associated with natural hazard response is lost to the students. We have integrated the real-time aspects of earthquake response into two natural hazard courses at Penn State (a 'general education' course for non-science majors, and an upper-level course for science majors) by implementing a modification of the USGS Earthworm system. The Earthworm Database Management System (E-DBMS) catalogs current global seismic activity. It provides earthquake professionals with real-time email/cell phone alerts of global seismic activity and access to the data for review/revision purposes. We have modified this system so that real-time response can be used to address specific scientific, policy, and social questions in our classes. As a prototype of using the E-DBMS in courses, we have established an Earthworm server at Penn State. This server receives national and global seismic network data and, in turn, transmits the tailored alerts to "on-duty" students (e-mail, pager/cell phone notification). These students are responsible for reacting to the alarm

  4. Should Fermi Have Secured his Water Heater Against Earthquakes?

    Science.gov (United States)

    Brooks, E. M.; Diggory, M.; Gomez, E.; Salaree, A.; Schmid, M.; Saloor, N.; Stein, S. A.

    2015-12-01

    A common student response to quantitative questions in science with no obvious answer is "I have no idea." Often these questions can be addressed by Fermi estimation, in which an apparently difficult-to-estimate quantity for which one has little intuitive sense can be sensibly estimated by combining order of magnitude estimates of easier-to-estimate quantities. Although this approach is most commonly used for numerical estimates, it can also be applied to issues combining both science and policy. Either application involves dividing an issue into tractable components and addressing them separately. To learn this method, our natural hazard policy seminar considered a statement by the Illinois Emergency Management Agency that homeowners should secure water heaters to prevent them from being damaged by earthquakes. We divided this question into subtopics, researched each, and discussed them weekly to reach a synthesis. We used a simple model to estimate the net benefit, the difference between the expected value of damage and the cost of securing a water heater. This benefit is positive, indicating that securing is worthwhile, only if the probability of damage during the heater's life is relatively large, approximately 1 - 10%. To assess whether the actual probability is likely to be this high, we assume that major water heater damage is likely only for shaking with MMI intensity VIII ("heavy furniture overturned") or greater. Intensity data for the past 200 years of Illinois earthquakes show that this level was reached only in the very southernmost part of the state for the 1811-1812 New Madrid earthquakes. As expected, the highest known shaking generally decreases northward toward Chicago. This history is consistent with the fact that we find no known cases of earthquake-toppled water heaters in Illinois. We compared the rate of return on securing a water heater in Chicago to buying a lottery ticket when the jackpot is large, and found that the latter would be a
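
    The net-benefit reasoning outlined above can be captured in a few lines of back-of-the-envelope arithmetic. The sketch below is a minimal illustration of that calculation; the cost, replacement value and lifetime figures are placeholder assumptions, not numbers from the seminar.

        # Fermi-style net benefit of securing a water heater against earthquake shaking.
        # All numbers are illustrative placeholders.
        cost_to_secure = 50.0          # USD: straps plus installation
        loss_if_toppled = 1200.0       # USD: replacement plus water damage
        lifetime_years = 15.0

        def net_benefit(annual_prob_damaging_shaking):
            """Expected avoided loss over the heater's life minus the cost of securing it."""
            p_life = 1.0 - (1.0 - annual_prob_damaging_shaking) ** lifetime_years
            return p_life * loss_if_toppled - cost_to_secure

        for p in (1e-5, 1e-4, 1e-3, 1e-2):
            print(f"annual probability {p:.0e}: net benefit {net_benefit(p):+8.0f} USD")

    With these placeholder numbers the break-even lifetime probability is roughly cost/loss ≈ 4%, which is consistent with the 1-10% range quoted in the abstract.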

  5. Soil structure interactions of eastern U.S. type earthquakes

    International Nuclear Information System (INIS)

    Chang Chen; Serhan, S.

    1991-01-01

    Two types of earthquakes have occurred in the eastern US in the past. One type was the infrequent major events such as the 1811-1812 New Madrid earthquakes or the 1886 Charleston earthquake. The other type was the frequent shallow earthquakes with high frequency content, short duration and high accelerations. Two eastern US nuclear power plants, V.C. Summer and Perry, went through extensive licensing efforts to obtain fuel load licenses after this type of earthquake was recorded on site and exceeded the design bases in the region beyond 10 hertz. This paper discusses the soil-structure interactions of the latter type of earthquakes.

  6. Earthquakes - a danger to deep-lying repositories?

    International Nuclear Information System (INIS)

    2012-03-01

    This booklet issued by the Swiss National Cooperative for the Disposal of Radioactive Waste NAGRA takes a look at geological factors concerning earthquakes and the safety of deep-lying repositories for nuclear waste. The geological processes involved in the occurrence of earthquakes are briefly looked at and the definitions for magnitude and intensity of earthquakes are discussed. Examples of damage caused by earthquakes are given. The earthquake situation in Switzerland is looked at and the effects of earthquakes on sub-surface structures and deep-lying repositories are discussed. Finally, the ideas proposed for deep-lying geological repositories for nuclear wastes are discussed

  7. The earthquake problem in engineering design: generating earthquake design basis information

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1987-01-01

    Designing earthquake resistant structures requires certain design inputs specific to the seismotectonic status of the region in which a critical facility is to be located. Generating these inputs requires collecting earthquake-related information using present-day techniques in seismology and geology, and processing the collected information to integrate it into a consolidated picture of the seismotectonics of the region. The earthquake problem in engineering design has been outlined in the context of seismic design of nuclear power plants vis-à-vis current state-of-the-art techniques. The extent to which the accepted procedures for assessing seismic risk in the region and generating the design inputs have been adhered to determines, to a great extent, the safety of the structures against future earthquakes. The document is a step towards developing an approach for generating these inputs, which form the earthquake design basis. (author)

  8. Earthquake engineering development before and after the March 4, 1977, Vrancea, Romania earthquake

    International Nuclear Information System (INIS)

    Georgescu, E.-S.

    2002-01-01

    Twenty-five years after the Vrancea earthquake of 4 March 1977, we can analyze in an open and critical way its impact on the evolution of earthquake engineering codes and protection policies in Romania. The earthquake (M_G-R = 7.2; M_w = 7.5) produced 1,570 casualties and more than 11,300 injured persons (90% of the victims in Bucharest), and seismic losses were estimated at more than USD 2 billion. The 1977 earthquake represented a significant episode of the 20th century in the seismic zones of Romania and neighbouring countries. The INCERC seismic record of March 4, 1977 revealed for the first time the spectral content of long-period seismic motions of Vrancea earthquakes, their duration, number of cycles and values of actual accelerations, with important overloading effects upon flexible structures. The seismic coefficients k_s, the spectral curve (the dynamic coefficient β_r), the seismic zonation map and the requirements in the antiseismic design norms were drastically changed, the microzonation maps of the time ceased to be used, and the specific Vrancea earthquake recurrence was reconsidered on the basis of hazard studies. Thus, the paper emphasises: - the existing engineering knowledge, earthquake code and zoning map requirements until 1977, as well as seismological and structural lessons since 1977; - recent aspects of implementing the Earthquake Code P.100/1992 and its harmonization with Eurocodes, in conjunction with the specifics of urban and rural seismic risk and enforcement policies on strengthening of existing buildings; - a strategic view of disaster prevention, using earthquake scenarios and loss assessments, insurance, earthquake education and training; - the need for a closer transfer of knowledge between seismologists, engineers and officials in charge of disaster prevention public policies. (author)

  9. Earthquake damage orientation to infer seismic parameters in archaeological sites and historical earthquakes

    Science.gov (United States)

    Martín-González, Fidel

    2018-01-01

    Studies that provide information on the seismic parameters and seismic sources of historical and archaeological seismic events are used to better evaluate the seismic hazard of a region. This is of special interest when no surface rupture is recorded or the seismogenic fault cannot be identified. The orientation pattern of the earthquake damage (ED) (e.g., fallen columns, dropped keystones) that affected architectonic elements of cities after earthquakes has traditionally been used in historical and archaeoseismological studies to infer seismic parameters. However, in the literature, depending on the authors, the parameters that can be obtained are contradictory (the epicenter location, the orientation of the P-waves, the orientation of the compressional strain and the fault kinematics have all been proposed), and some authors even question these relations with the earthquake damage. The earthquakes of Lorca in 2011, Christchurch in 2011 and Emilia Romagna in 2012 present an opportunity to measure systematically a large number and wide variety of earthquake damage in historical buildings (the same structures that are used in historical and archaeological studies). The damage pattern orientation has been compared with modern instrumental data, which is not possible in historical and archaeoseismological studies. From measurements and quantification of the orientation patterns in the studied earthquakes, it is observed that there is a systematic pattern of the earthquake damage orientation (EDO) in the proximity of the seismic source (fault trace), and that this orientation is normal to the fault trend (±15°). This orientation can be generated by a pulse of motion that, in the near-fault region, has a distinguishable acceleration normal to the fault due to the polarization of the S-waves. Therefore, the earthquake damage orientation could be used to estimate the seismogenic fault trend in studies of historical earthquakes where no instrumental data are available.

  10. Earthquake response of inelastic structures

    International Nuclear Information System (INIS)

    Parulekar, Y.M.; Vaity, K.N.; Reddy, .R.; Vaze, K.K.; Kushwaha, H.S.

    2004-01-01

    The most commonly used method in the seismic analysis of structures is the response spectrum method. For seismic re-evaluation of existing facilities, the elastic response spectrum method cannot be used directly, as large deformations above yield may be observed under the Safe Shutdown Earthquake (SSE). The plastic deformation, i.e. the hysteretic characteristics of various elements of the structure, causes dissipation of energy. Hence the damping values given by the code, which do not account for hysteretic energy dissipation, cannot be used directly. In this paper, appropriate damping values are evaluated for 5-storey, 10-storey and 15-storey shear beam structures, which deform beyond their yield limit. Linear elastic analysis is performed for the same structures using these damping values, and the storey forces are compared with those obtained using inelastic time history analysis. A damping model, which relates the ductility of the structure and damping, is developed. Using this damping model, a practical structure is analysed, the results are compared with inelastic time history analysis, and the comparison is found to be good.

  11. Development of an Earthquake Impact Scale

    Science.gov (United States)

    Wald, D. J.; Marano, K. D.; Jaiswal, K. S.

    2009-12-01

    With the advent of the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, domestic (U.S.) and international earthquake responders are reconsidering their automatic alert and activation levels as well as their response procedures. To help facilitate rapid and proportionate earthquake response, we propose and describe an Earthquake Impact Scale (EIS) founded on two alerting criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is more appropriate for most global events. Simple thresholds, derived from the systematic analysis of past earthquake impact and response levels, turn out to be quite effective in communicating predicted impact and response level of an event, characterized by alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (major disaster, necessitating international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses exceeding $1M, $10M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness dominate in countries where vernacular building practices typically lend themselves to high collapse and casualty rates, and it is these impacts that set prioritization for international response. In contrast, it is often financial and overall societal impacts that trigger the level of response in regions or countries where prevalent earthquake-resistant construction practices greatly reduce building collapse and associated fatalities. Any newly devised alert protocols, whether financial or casualty based, must be intuitive and consistent with established lexicons and procedures. In this analysis, we make an attempt
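
    The alert thresholds quoted above map naturally onto a small lookup function. The sketch below illustrates that mapping; it is a toy reading of the thresholds in the abstract, not the operational PAGER/EIS implementation.

        def impact_scale_alert(est_fatalities=None, est_loss_usd=None):
            """Return a color alert from the thresholds quoted in the abstract:
            fatalities 1/100/1000 and losses $1M/$10M/$1B for yellow/orange/red."""
            def level(value, thresholds):
                return sum(value >= t for t in thresholds)   # 0 (green) .. 3 (red)
            fat = level(est_fatalities, (1, 100, 1000)) if est_fatalities is not None else 0
            dam = level(est_loss_usd, (1e6, 1e7, 1e9)) if est_loss_usd is not None else 0
            return ("green", "yellow", "orange", "red")[max(fat, dam)]

        print(impact_scale_alert(est_fatalities=0, est_loss_usd=5e6))   # yellow
        print(impact_scale_alert(est_fatalities=250))                   # orange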

  12. Crowd-Sourced Global Earthquake Early Warning

    Science.gov (United States)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
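
    A minimal version of the sensor-fusion idea described above, in which noisy consumer GNSS displacements correct the drift of integrated consumer accelerometer data, can be written as a small Kalman filter. The sketch below is a one-dimensional illustration with invented noise levels and a synthetic motion history; it is not the authors' algorithm.

        import numpy as np

        def fuse_gnss_accel(acc, gnss, dt, acc_noise=0.1, gnss_noise=0.3):
            """Minimal 1D Kalman filter: the accelerometer drives the process model and
            noisy GNSS displacement measurements correct the integration drift."""
            F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition for [disp, vel]
            B = np.array([0.5 * dt**2, dt])           # how acceleration enters the state
            H = np.array([[1.0, 0.0]])                # GNSS observes displacement only
            Q = acc_noise**2 * np.outer(B, B)         # process noise from accel errors
            R = np.array([[gnss_noise**2]])           # GNSS measurement noise
            x, P, out = np.zeros(2), np.eye(2), []
            for a, z in zip(acc, gnss):
                # Predict with the accelerometer sample.
                x = F @ x + B * a
                P = F @ P @ F.T + Q
                # Update with the (noisier) GNSS displacement.
                y = z - H @ x
                S = H @ P @ H.T + R
                K = P @ H.T @ np.linalg.inv(S)
                x = x + (K @ y).ravel()
                P = (np.eye(2) - K @ H) @ P
                out.append(x[0])
            return np.array(out)

        # Synthetic test: constant 0.01 m/s^2 acceleration for 20 s (final displacement ~2 m).
        dt, n = 0.1, 200
        t = np.arange(n) * dt
        true_disp = 0.5 * 0.01 * t**2
        rng = np.random.default_rng(1)
        acc = 0.01 + rng.normal(scale=0.1, size=n)
        gnss = true_disp + rng.normal(scale=0.3, size=n)
        est = fuse_gnss_accel(acc, gnss, dt)
        print(f"final displacement estimate: {est[-1]:.2f} m (true {true_disp[-1]:.2f} m)")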

  13. Fractals and Forecasting in Earthquakes and Finance

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.

    2011-12-01

    It is now recognized that Benoit Mandelbrot's fractals play a critical role in describing a vast range of physical and social phenomena. Here we focus on two systems, earthquakes and finance. Since 1942, earthquakes have been characterized by the Gutenberg-Richter magnitude-frequency relation, which in more recent times is often written as a moment-frequency power law. A similar relation can be shown to hold for financial markets. Moreover, a recent New York Times article, titled "A Richter Scale for the Markets" [1], summarized the emerging viewpoint that stock market crashes can be described with similar ideas as large and great earthquakes. The idea that stock market crashes can be related in any way to earthquake phenomena has its roots in Mandelbrot's 1963 work on speculative prices in commodities markets such as cotton [2]. He pointed out that Gaussian statistics did not account for the excessive number of booms and busts that characterize such markets. Here we show that earthquakes and financial crashes can both be described by a common Landau-Ginzburg-type free energy model, involving the presence of a classical limit of stability, or spinodal. These metastable systems are characterized by fractal statistics near the spinodal. For earthquakes, the independent ("order") parameter is the slip deficit along a fault, whereas for the financial markets, it is the financial leverage in place. For financial markets, asset values play the role of a free energy. In both systems, a common set of techniques can be used to compute the probabilities of future earthquakes or crashes. In the case of financial models, the probabilities are closely related to implied volatility, an important component of Black-Scholes models for stock valuations. [2] B. Mandelbrot, The variation of certain speculative prices, J. Business, 36, 294 (1963)
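
    The Gutenberg-Richter magnitude-frequency relation mentioned above, log10 N(>=M) = a - b*M, is the kind of fractal (power-law) statistic common to both systems. The sketch below estimates the b-value of a synthetic catalog with the standard maximum-likelihood estimator; the catalog and completeness magnitude are invented for the example.

        import numpy as np

        def gr_b_value(magnitudes, m_min):
            """Maximum-likelihood b-value for the Gutenberg-Richter relation
            log10 N(>=M) = a - b*M, using events at or above the completeness magnitude m_min."""
            m = np.asarray(magnitudes)
            m = m[m >= m_min]
            return np.log10(np.e) / (m.mean() - m_min)

        # Synthetic catalog drawn from a G-R distribution with b = 1.0 above M 3.0.
        rng = np.random.default_rng(2)
        mags = 3.0 + rng.exponential(scale=1.0 / (1.0 * np.log(10)), size=5000)
        print(f"estimated b-value: {gr_b_value(mags, 3.0):.2f}")   # ~1.0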

  14. Using remote sensing to predict earthquake impacts

    Science.gov (United States)

    Fylaktos, Asimakis; Yfantidou, Anastasia

    2017-09-01

    Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), which led to substantial loss of life and extensive damage to properties, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel 1A and optical imagery from Landsat 8 to examine differences in topography with the aid of the multi-temporal monitoring technique. This technique is well suited to the observation of any surface deformation. The database is a cluster of information regarding the consequences of the earthquakes, grouped into categories such as property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformations, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of these types of earthquakes. In the future, we can enrich this database by including more regions and enhance the variety of its applications. For instance, we could predict the future impacts of any type of earthquake in several areas and design a preliminary emergency model for immediate evacuation and quick recovery response. It is important to know how the surface moves, particularly in geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.

  15. Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information

    Science.gov (United States)

    Thompson, K. J.; Krantz, D. H.

    2010-12-01

    The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D & Tversky, A (1979). Prospect

  16. The Napa (California, US) earthquake of 24 August 2014 (10.24 UT) Magnitude = 6.0

    International Nuclear Information System (INIS)

    Scotti, Oona

    2014-01-01

    This publication briefly presents the characteristics of an earthquake which occurred in California in August 2014, indicates some data recorded by local seismic stations, and gives a brief overview of human and economic damage. It analyses the geological location of the earthquake, recalls previous events and outlines the local seismic risk. After noting that there were no consequences for the closest nuclear power station (300 km away), it indicates the lessons learned from this seismic event concerning a surface crack, in order to better assess the risk of surface failure

  17. Estimating Casualties for Large Earthquakes Worldwide Using an Empirical Approach

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.; Hearne, Mike

    2009-01-01

    We developed an empirical country- and region-specific earthquake vulnerability model to be used as a candidate for post-earthquake fatality estimation by the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is based on past fatal earthquakes (earthquakes causing one or more deaths) in individual countries where at least four fatal earthquakes occurred during the catalog period (since 1973). Because only a few dozen countries have experienced four or more fatal earthquakes since 1973, we propose a new global regionalization scheme based on idealization of countries that are expected to have similar susceptibility to future earthquake losses given the existing building stock, its vulnerability, and other socioeconomic characteristics. The fatality estimates obtained using an empirical country- or region-specific model will be used along with other selected engineering risk-based loss models for generation of automated earthquake alerts. These alerts could potentially benefit the rapid-earthquake-response agencies and governments for better response to reduce earthquake fatalities. Fatality estimates are also useful to stimulate earthquake preparedness planning and disaster mitigation. The proposed model has several advantages as compared with other candidate methods, and the country- or region-specific fatality rates can be readily updated when new data become available.

  18. Clinical characteristics of patients with seizure following the 2016 Kumamoto earthquake.

    Science.gov (United States)

    Inatomi, Yuichiro; Nakajima, Makoto; Yonehara, Toshiro; Ando, Yukio

    2017-06-01

    To investigate the clinical characteristics of patients with seizure following the 2016 Kumamoto earthquake, we retrospectively studied patients with seizure admitted to our hospital during the 12 weeks following the earthquake. We compared the clinical backgrounds and characteristics of the patients before the earthquake (the same period in the previous 3 years) and after it, and between the early (first 2 weeks) and late (subsequent 10 weeks) phases. A total of 60 patients with seizure were admitted to the emergency room after the earthquake, versus 175 (58.3/year) before the earthquake. Of them, 35 patients with seizure were hospitalized in the Department of Neurology after the earthquake, versus 96 (32/year) before the earthquake. Among patients admitted after the earthquake, males and non-cerebrovascular diseases as the epileptogenic disease were seen more frequently than before the earthquake. During the early phase after the earthquake, female, first-attack, and non-focal-type patients were seen more frequently than during the late phase. These characteristics of patients with seizure during the early phase after the earthquake suggest that many patients had non-epileptic seizures. To prevent seizures following earthquakes, the mental stress and physical status of evacuees must be assessed. Copyright © 2017. Published by Elsevier Ltd.

  19. Educator professional development as a component of earthquake and tsunami readiness and early warning systems

    Science.gov (United States)

    Pratt-Sitaula, B. A.; Butler, R. F.; Lillie, R. J.; Hunter, N.; Magura, B.; Groom, R.; Hedeen, C.; Johnson, J. A.; Olds, S. E.; Charlevoix, D.; Coe, M.

    2014-12-01

    activities that they and their learners had undertaken related to earthquake and tsunami science and preparedness. Thousands of students, park visitors, and community members have subsequently learned Cascadia-specific earthquake and tsunami science and preparedness from CEETEP participant educators.

  20. Characteristics of Large Earthquakes Occurring on the Shallowest Portion of the Mexican Subduction Megathrust

    Science.gov (United States)

    Hjorleifsdottir, V.; Flores, K.; Singh, S. K.; Iglesias, A.; Castillo, J.; Vallee, M.; Perez-Campos, X.; Ji, C.

    2017-12-01

    The large magnitude and enormous tsunami of the 2011 Tohoku earthquake came as a surprise to the seismological community. An event of this magnitude had not been documented in the zone for more than 1000 years, and the events in the previous 100 years were of Mw 7.8 or smaller. In particular, the very large slip of up to 60 m near the trench [Ito et al., 2011; Kido et al., 2011; Sato et al., 2011] was unexpected. Similarly, the Mexican subduction interface has repeatedly ruptured in magnitude Mw 7-8 earthquakes in the last 100 years [e.g. Singh et al., 1981, Kostoglodov and Pacheco, 1999]. Most of the events have rupture areas centered on the coast, breaking a depth range from 10-30 km and not the shallowest, near-trench portion. However, in 1787, an M 8.6 earthquake broke the Oaxaca segment of the subduction zone, causing a tsunami that reached more than 5 km inland [Suarez and Albini, 2009], suggesting that the near-trench area broke in this event. In order to learn more about the seismic behavior of the shallowest part of the subduction interface, we have studied the four largest events that have been suggested to break that portion of the fault in Mexico: the Mw 8.0, 1995 Jalisco earthquake [Hjorleifsdottir et al., to be submitted], the Mw 6.7, 2002 Guerrero earthquake [Flores et al., to be submitted], and the Mw 7.2 and Mw 6.7, 1996 and 1997 Offshore Oaxaca earthquakes. We confirm that these events do break the shallowest part of the fault interface and observe that they do not have the smooth source time function observed for tsunami earthquakes [Kanamori and Kikuchi, 1993; Polet and Thio, 2003], but a rugged one, suggesting rupture of multiple separate asperities. The Jalisco event ruptured updip with an average rupture velocity of 2.5 km/s, with the shallow part of the rupture breaking much more slowly. The Guerrero event, on the other hand, broke a 60 km long and 10-20 km wide section, in at least 2 asperities, with a rupture velocity around 1 km/s. The Mw 6.7 Offshore Oaxaca

  1. Applications of a table-top time-resolved luminescence spectrometer with nanosecond soft X-ray pulse excitation

    Czech Academy of Sciences Publication Activity Database

    Brůža, P.; Pánek, D.; Fidler, V.; Benedikt, P.; Čuba, V.; Gbur, T.; Boháček, Pavel; Nikl, Martin

    2014-01-01

    Vol. 61, No. 1 (2014), pp. 448-451 ISSN 0018-9499 R&D Projects: GA ČR GA13-09876S Institutional support: RVO:68378271 Keywords: LiCaAlF6 * luminescence * scintillators * soft x-ray * SrHfO3 * time-resolved spectroscopy * ZnO:Ga Subject RIV: BM - Solid Matter Physics; Magnetism Impact factor: 1.283, year: 2014

  2. After Action Report: Black Sea Initiative Table Top Exercise Albatross 2007 Batumi, Georgia, 12-15 February 2007

    Science.gov (United States)

    2007-08-01

    ...questions sent and the ECG took note (as elaborated upon in the TTX Mechanics section below). Language comprehension issues translate into a greater time... Emulsification: Wave action mixes water into the oil, forming a heavy and sticky water-in-oil emulsion, sometimes called chocolate mousse...

  3. Fabrication of a Highly Aligned Neural Scaffold via a Table Top Stereolithography 3D Printing and Electrospinning.

    Science.gov (United States)

    Lee, Se-Jun; Nowicki, Margaret; Harris, Brent; Zhang, Lijie Grace

    2017-06-01

    Three-dimensional (3D) bioprinting is a rapidly emerging technique in the field of tissue engineering for fabricating extremely intricate and complex biomimetic scaffolds in the range of micrometers. Such customized 3D printed constructs can be used for the regeneration of complex tissues such as cartilage, vessels, and nerves. However, 3D printing techniques often offer limited control over resolution and compromised mechanical properties due to the limited selection of printable inks. To address these limitations, we combined stereolithography and electrospinning techniques to fabricate a novel 3D biomimetic neural scaffold with a tunable porous structure and embedded aligned fibers. By employing two different types of biofabrication methods, we successfully utilized both synthetic and natural materials with varying chemical composition as bioinks to enhance the biocompatibility and mechanical properties of the scaffold. The resulting microfibers, composed of polycaprolactone (PCL) polymer and PCL mixed with gelatin, were embedded in the 3D printed hydrogel scaffold. Our results showed that 3D printed scaffolds with electrospun fibers significantly improve neural stem cell adhesion compared to those without fibers. Furthermore, 3D scaffolds embedded with aligned fibers showed an enhancement in cell proliferation relative to bare control scaffolds. More importantly, confocal microscopy images illustrated that the scaffold with PCL/gelatin fibers greatly increased the average neurite length and directed neurite extension of primary cortical neurons along the fibers. The results of this study demonstrate the potential to create unique 3D neural tissue constructs by combining 3D bioprinting and electrospinning techniques.

  4. Results from the second Galaxy Serpent web-based table top exercise utilizing the concept of nuclear forensics libraries

    International Nuclear Information System (INIS)

    Borgardt, James; Canaday, Jodi; Chamberlain, David

    2017-01-01

    Galaxy Serpent is a unique, virtual, web-based international tabletop series of exercises designed to mature the concept of National Nuclear Forensics Libraries (NNFLs). Teams participating in the second version of the exercise were provided synthetic sealed radioactive source data used to compile a model NNFL which then served as a comparative instrument in hypothetical scenarios involving sources out of regulatory control, allowing teams to successfully down-select and determine whether investigated sources were consistent with holdings in their model library. The methodologies utilized and aggregate results of the exercise will be presented, along with challenges encountered and benefits realized. (author)

  5. Economic consequences of earthquakes: bridging research and practice with HayWired

    Science.gov (United States)

    Wein, A. M.; Kroll, C.

    2016-12-01

    The U.S. Geological Survey partners with organizations and experts to develop multiple hazard scenarios. The HayWired earthquake scenario refers to a rupture of the Hayward fault in the Bay Area of California and addresses the potential chaos related to interconnectedness at many levels: the fault afterslip and aftershocks, interdependencies of lifelines, wired/wireless technology, communities at risk, and ripple effects throughout today's digital economy. The scenario is intended for diverse audiences. HayWired analyses translate earthquake hazards (surface rupture, ground shaking, liquefaction, landslides) into physical engineering and environmental health impacts, and into societal consequences. Damages to life and property and lifeline service disruptions are direct causes of business interruption. Economic models are used to estimate the economic impacts and resilience in the regional economy. The objective of the economic analysis is to inform policy discourse about economic resilience at all three levels of the economy: macro, meso, and micro. Stakeholders include businesses, economic development, and community leaders. Previous scenario analyses indicate the size of an event: large earthquakes and large winter storms are both "big ones" for California. They motivate actions to reduce the losses from fire following earthquake and water supply outages. They show the effect that resilience can have on reducing economic losses. Evaluators find that stakeholders learned the most about the economic consequences.

  6. Cross-cultural comparisons between the earthquake preparedness models of Taiwan and New Zealand.

    Science.gov (United States)

    Jang, Li-Ju; Wang, Jieh-Jiuh; Paton, Douglas; Tsai, Ning-Yu

    2016-04-01

    Taiwan and New Zealand are both located in the Pacific Rim where 81 per cent of the world's largest earthquakes occur. Effective programmes for increasing people's preparedness for these hazards are essential. This paper tests the applicability of the community engagement theory of hazard preparedness in two distinct cultural contexts. Structural equation modelling analysis provides support for this theory. The paper suggests that the close fit between theory and data that is achieved by excluding trust supports the theoretical prediction that familiarity with a hazard negates the need to trust external sources. The results demonstrate that the hazard preparedness theory is applicable to communities that have previously experienced earthquakes and are therefore familiar with the associated hazards and the need for earthquake preparedness. The paper also argues that cross-cultural comparisons provide opportunities for collaborative research and learning as well as access to a wider range of potential earthquake risk management strategies. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.

  7. Seismic waveform classification using deep learning

    Science.gov (United States)

    Kong, Q.; Allen, R. M.

    2017-12-01

    MyShake is a global smartphone seismic network that harnesses the power of crowdsourcing. It has an Artificial Neural Network (ANN) algorithm running on the phone to distinguish earthquake motion from human activities recorded by the on-board accelerometer. Once the ANN detects earthquake-like motion, it sends a 5-min chunk of acceleration data back to the server for further analysis. The time-series data collected contain both earthquake data and human activity data that the ANN confused. In this presentation, we will show the Convolutional Neural Network (CNN) we built, under the umbrella of supervised learning, to pick out the earthquake waveforms. The waveforms of the recorded motion can easily be treated as images, and by taking advantage of the power of CNNs in processing images, we achieved a very high success rate in selecting the earthquake waveforms. Since there are many more non-earthquake waveforms than earthquake waveforms, we also built an anomaly detection algorithm using the CNN. Both methods can easily be extended to other waveform classification problems.
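
    To make the classification step concrete, the sketch below defines a tiny 1D convolutional network in PyTorch that labels a fixed-length, three-component acceleration window as earthquake vs. human activity. The architecture, window length and class labels are assumptions for illustration only; this is not the network described by the authors.

        import torch
        import torch.nn as nn

        class WaveformCNN(nn.Module):
            """Tiny 1D CNN over a (batch, 3, n_samples) block of three-component acceleration."""
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv1d(3, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
                    nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
                    nn.AdaptiveAvgPool1d(1),
                )
                self.classifier = nn.Linear(32, 2)   # 2 classes: earthquake / human activity

            def forward(self, x):
                z = self.features(x).squeeze(-1)     # (batch, 32)
                return self.classifier(z)            # (batch, 2) logits

        model = WaveformCNN()
        dummy = torch.randn(8, 3, 2000)              # batch of 8 windows, 2000 samples each
        logits = model(dummy)
        loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 2, (8,)))
        loss.backward()
        print(logits.shape)                          # torch.Size([8, 2])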

  8. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    Science.gov (United States)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not common understanding in China when the M 7.9 Wenchuan earthquake struck the Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save life in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  9. Normal Fault Type Earthquakes Off Fukushima Region - Comparison of the 1938 Events and Recent Earthquakes -

    Science.gov (United States)

    Murotani, S.; Satake, K.

    2017-12-01

    Off the Fukushima region, Mjma 7.4 (event A) and 6.9 (event B) events occurred on November 6, 1938, following thrust fault type earthquakes of Mjma 7.5 and 7.3 on the previous day. These earthquakes were estimated to be normal fault earthquakes by Abe (1977, Tectonophysics). An Mjma 7.0 earthquake occurred on July 12, 2014 near event B, and an Mjma 7.4 earthquake occurred on November 22, 2016 near event A. These recent events are the only M 7 class earthquakes to have occurred off Fukushima since 1938. Except for the two 1938 events, normal fault earthquakes did not occur until the many aftershocks of the 2011 Tohoku earthquake. We compared the observed tsunami and seismic waveforms of the 1938, 2014, and 2016 earthquakes to examine the normal fault earthquakes occurring off the Fukushima region. It is difficult to compare the tsunami waveforms of the 1938, 2014 and 2016 events because there were only a few observations at the same stations. The teleseismic body wave inversion of the 2016 earthquake yielded a focal mechanism with strike 42°, dip 35°, and rake -94°. Other source parameters were as follows: source area 70 km x 40 km, average slip 0.2 m, maximum slip 1.2 m, seismic moment 2.2 x 10^19 Nm, and Mw 6.8. A large slip area is located near the hypocenter, and it is compatible with the tsunami source area estimated from tsunami travel times. The 2016 tsunami source area is smaller than that of the 1938 event, consistent with the difference in Mw: 7.7 for event A estimated by Abe (1977) and 6.8 for the 2016 event. Although the 2014 epicenter is very close to that of event B, the teleseismic waveforms of the 2014 event are similar to those of event A and the 2016 event. While Abe (1977) assumed that the mechanism of event B was the same as that of event A, the initial motions at some stations are opposite, indicating that the focal mechanisms of events A and B are different and that more detailed examination is needed. The normal fault type earthquake seems to occur following the

  10. Where was the 1898 Mare Island Earthquake? Insights from the 2014 South Napa Earthquake

    Science.gov (United States)

    Hough, S. E.

    2014-12-01

    The 2014 South Napa earthquake provides an opportunity to reconsider the Mare Island earthquake of 31 March 1898, which caused severe damage to buildings at a Navy yard on the island. Revisiting archival accounts of the 1898 earthquake, I estimate a lower intensity magnitude, 5.8, than the value in the current Uniform California Earthquake Rupture Forecast (UCERF) catalog (6.4). However, I note that intensity magnitude can differ from Mw by upwards of half a unit depending on stress drop, which for a historical earthquake is unknowable. In the aftermath of the 2014 earthquake, there has been speculation that the apparently severe effects on Mare Island in 1898 were due to the vulnerability of local structures. No surface rupture has ever been identified from the 1898 event, which is commonly associated with the Hayward-Rodgers Creek fault system, some 10 km west of Mare Island (e.g., Parsons et al., 2003). Reconsideration of detailed archival accounts of the 1898 earthquake, together with a comparison of the intensity distributions for the two earthquakes, points to genuinely severe, likely near-field ground motions on Mare Island. The 2014 earthquake did cause significant damage to older brick buildings on Mare Island, but the level of damage does not match the severity of the documented damage in 1898. The high-intensity fields for the two earthquakes are, moreover, spatially shifted, with the centroid of the 2014 distribution near the town of Napa and that of the 1898 distribution near Mare Island, east of the Hayward-Rodgers Creek system. I conclude that the 1898 Mare Island earthquake was centered on or near Mare Island, possibly involving rupture of one or both strands of the Franklin fault, a low-slip-rate fault sub-parallel to the Rodgers Creek fault to the west and the West Napa fault to the east. I estimate Mw 5.8 assuming an average stress drop; the data are also consistent with Mw 6.4 if the stress drop was a factor of ≈3 lower than average for California earthquakes. I

  11. Electrostatically actuated resonant switches for earthquake detection

    KAUST Repository

    Ramini, Abdallah H.

    2013-04-01

    The modeling and design of electrostatically actuated resonant switches (EARS) for earthquake and seismic applications are presented. The basic concept is to operate an electrically actuated resonator close to instability bands of frequency, where it is forced to collapse (pull-in) if operated within these bands. By careful tuning, the resonator can be made to enter the instability zone upon detection of the earthquake signal, thereby pulling in as a switch. Such a switching action can serve useful functions, such as shutting off gas pipelines in the case of earthquakes, or can be used to activate a network of sensors for seismic activity recording in health monitoring applications. By placing a resonator on a printed circuit board (PCB) with a natural frequency close to that of the earthquake's frequency, we show a significant improvement in the detection limit of the EARS, lowering it considerably to less than 60% of that of the EARS by itself without the PCB. © 2013 IEEE.

  12. Earthquake risk assessment of building structures

    International Nuclear Information System (INIS)

    Ellingwood, Bruce R.

    2001-01-01

    During the past two decades, probabilistic risk analysis tools have been applied to assess the performance of new and existing building structural systems. Structural design and evaluation of buildings and other facilities with regard to their ability to withstand the effects of earthquakes requires special considerations that are not normally a part of such evaluations for other occupancy, service and environmental loads. This paper reviews some of these special considerations, specifically as they pertain to probability-based codified design and reliability-based condition assessment of existing buildings. Difficulties experienced in implementing probability-based limit states design criteria for earthquake are summarized. Comparisons of predicted and observed building damage highlight the limitations of using current deterministic approaches for post-earthquake building condition assessment. The importance of inherent randomness and modeling uncertainty in forecasting building performance is examined through a building fragility assessment of a steel frame with welded connections that was damaged during the Northridge Earthquake of 1994. The prospects for future improvements in earthquake-resistant design procedures based on a more rational probability-based treatment of uncertainty are examined

  13. Roaming earthquakes in China highlight midcontinental hazards

    Science.gov (United States)

    Liu, Mian; Wang, Hui

    2012-11-01

    Before dawn on 28 July 1976, a magnitude (M) 7.8 earthquake struck Tangshan, a Chinese industrial city only 150 kilometers from Beijing (Figure 1a). In a brief moment, the earthquake destroyed the entire city and killed more than 242,000 people [Chen et al., 1988]. More than 30 years have passed, and upon the ruins a new Tangshan city has been built. However, the memory of devastation remains fresh. For this reason, a sequence of recent small earthquakes in the Tangshan region, including an M 4.8 event on 28 May and an M 4.0 event on 18 June 2012, has caused widespread concerns and heated debate in China. In the science community, the debate is whether the recent Tangshan earthquakes are the aftershocks of the 1976 earthquake despite the long gap in time since the main shock or harbingers of a new period of active seismicity in Tangshan and the rest of North China, where seismic activity seems to fluctuate between highs and lows over periods of a few decades [Ma, 1989].

  14. Image Recognition Techniques for Earthquake Early Warning

    Science.gov (United States)

    Boese, M.; Heaton, T. H.; Hauksson, E.

    2011-12-01

    When monitoring on his/her PC a map of seismic stations whose colors scale with the real-time transmitted ground motion amplitudes observed in a dense seismic network, an experienced person will fairly easily recognize when and where an earthquake occurs. Using the maximum amplitudes at stations at close epicentral distances, he/she might even be able to roughly estimate the size of the event. From the number and distribution of stations turning 'red', the person might also be able to recognize the rupturing fault in a large earthquake (M>>7.0) and to estimate the rupture dimensions while the rupture is still developing. Following this concept, we are adopting techniques for automatic image recognition to provide earthquake early warning. We rapidly correlate a set of templates with real-time ground motion observations in a seismic network. If a 'suspicious' pattern of ground motion amplitudes is detected, the algorithm starts estimating the location of the earthquake and its magnitude. For large earthquakes the algorithm estimates finite source dimensions and the direction of rupture propagation. These predictions are continuously updated using the current 'image' of ground motion observations. A priori information, such as the orientation of major faults, helps enhance estimates in less dense networks. The approach will be demonstrated for multiple simulated and real events in California.
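
    The template-correlation idea can be illustrated with a few lines of array code. The sketch below slides a small template of expected shaking amplitudes over a gridded snapshot of network amplitudes and reports the best-matching cell; the grid, template and scoring are invented for the example and are not the authors' algorithm.

        import numpy as np

        def match_template(amplitude_grid, template):
            """Return the (row, col) of peak normalized correlation between a small
            template and every same-sized patch of the amplitude grid."""
            gh, gw = amplitude_grid.shape
            th, tw = template.shape
            tn = (template - template.mean()) / (template.std() + 1e-12)
            best, best_ij = -np.inf, (0, 0)
            for i in range(gh - th + 1):
                for j in range(gw - tw + 1):
                    patch = amplitude_grid[i:i + th, j:j + tw]
                    pn = (patch - patch.mean()) / (patch.std() + 1e-12)
                    score = float((pn * tn).mean())
                    if score > best:
                        best, best_ij = score, (i, j)
            return best_ij, best

        # Synthetic 20x20 amplitude snapshot with a bump of strong shaking near row 12, col 7.
        rng = np.random.default_rng(3)
        grid = rng.normal(scale=0.1, size=(20, 20))
        yy, xx = np.mgrid[0:5, 0:5]
        bump = np.exp(-((yy - 2)**2 + (xx - 2)**2) / 2.0)
        grid[10:15, 5:10] += bump
        loc, score = match_template(grid, bump)
        print(f"best match at grid cell {loc} with score {score:.2f}")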

  15. A Deterministic Approach to Earthquake Prediction

    Directory of Open Access Journals (Sweden)

    Vittorio Sgrigna

    2012-01-01

    The paper aims at giving suggestions for a deterministic approach to investigate possible earthquake prediction and warning. A fundamental contribution can come from observations and physical modeling of earthquake precursors, aiming to see the earthquake phenomenon in perspective within the framework of a unified theory able to explain the causes of its genesis and the dynamics, rheology, and microphysics of its preparation, occurrence, postseismic relaxation, and interseismic phases. Studies based on combined ground and space observations of earthquake precursors are essential to address the issue. Unfortunately, up to now, what is lacking is the demonstration of a causal relationship (with explained physical processes) obtained by looking for a correlation between data gathered simultaneously and continuously by space observations and ground-based measurements. In doing this, modern and/or new methods and technologies have to be adopted to try to solve the problem. Coordinated space- and ground-based observations imply available test sites on the Earth's surface to correlate ground data, collected by appropriate networks of instruments, with space data detected on board Low-Earth-Orbit (LEO) satellites. Moreover, a strong new theoretical scientific effort is necessary to try to understand the physics of the earthquake.

  16. Bridge seismic retrofit measures considering subduction zone earthquakes.

    Science.gov (United States)

    2015-07-01

    Over the years, earthquakes have exposed the vulnerability of reinforced concrete structures under seismic loads. The recent occurrence of highly devastating earthquakes near instrumented regions, e.g. 2010 Maule, Chile and 2011 Tohoku, Japan, ha...

  17. United States Earthquake Intensity Database, 1638-1985

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The United States Earthquake Intensity Database is a collection of damage and felt reports for over 23,000 U.S. earthquakes from 1638-1985. The majority of...

  18. Tilt Precursors before Earthquakes on the San Andreas Fault, California.

    Science.gov (United States)

    Johnston, M J; Mortensen, C E

    1974-12-13

    An array of 14 biaxial shallow-borehole tiltmeters (at 10^-7 radian sensitivity) has been installed along 85 kilometers of the San Andreas fault during the past year. Earthquake-related changes in tilt have been simultaneously observed on up to four independent instruments. At earthquake distances greater than 10 earthquake source dimensions, there are few clear indications of tilt change. For the four instruments with the longest records (> 10 months), 26 earthquakes have occurred since July 1973 with at least one instrument closer than 10 source dimensions and 8 earthquakes with more than one instrument within that distance. Precursors in tilt direction have been observed before more than 10 earthquakes or groups of earthquakes, and no similar effect has yet been seen without the occurrence of an earthquake.

  19. Parent Guidelines for Helping Children After an Earthquake

    Science.gov (United States)

    NCTSN Resource. Resource Description: Offers parents guidance on ...

  20. 76 FR 18165 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Science.gov (United States)

    2011-04-01

    ...U.S. Geological Survey (USGS) Scientific Earthquake Studies Advisory Committee (SESAC) serves in an ex... Earthquake Hazards Reduction Meeting. AGENCY: National Institute of Standards and Technology, Department of Commerce. ACTION: Notice of open meeting.