Sample records for rapid earthquake information

  1. How citizen seismology is transforming rapid public earthquake information and interactions between seismologists and society (United States)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Frédéric; Etivant, Caroline


    Historical earthquakes are only known to us through written recollections, so seismologists have long experience of interpreting eyewitness reports, which probably explains why seismology has been a pioneer in crowdsourcing and citizen science. Today, the Internet is transforming this situation: it can be considered a digital nervous system, comprising digital veins and intertwined sensors that capture the pulse of our planet in near real-time. How can both seismology and the public benefit from this new monitoring system? This paper presents the strategy implemented at the Euro-Mediterranean Seismological Centre (EMSC) to leverage this new nervous system to detect and diagnose the impact of earthquakes within minutes rather than hours, and how it has transformed information systems and interactions with the public. We will show how social network monitoring and flashcrowds (massive traffic increases on the EMSC website) are used to automatically detect felt earthquakes before seismic detections, how damaged areas can be mapped through the concomitant loss of Internet sessions (visitors being disconnected), and the benefit of collecting felt reports and geolocated pictures to further constrain rapid impact assessment of global earthquakes. We will also describe how public expectations within tens of seconds of ground shaking are at the basis of improved, diversified information tools which integrate this user-generated content. Special attention will be given to LastQuake, the most complex and sophisticated Twitter QuakeBot, smartphone application and browser add-on, which deals only with the earthquakes that matter to the public: felt and damaging earthquakes. In conclusion, we will demonstrate that eyewitnesses are today real-time earthquake sensors and active actors in rapid earthquake information.
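    The flashcrowd detection described above, felt earthquakes revealed by sudden surges of visitors on the EMSC website, can be sketched as a simple baseline-plus-threshold test on per-minute visit counts. This is a minimal sketch under the assumption that per-minute hit counts are available; the window length, multiplier and minimum hit count are illustrative values, not EMSC's operational parameters:

```python
from collections import deque

def detect_flashcrowd(visits_per_minute, window=30, factor=5.0, min_hits=50):
    """Yield the minute indices where traffic jumps well above the recent
    baseline.  `window`, `factor` and `min_hits` are illustrative values,
    not EMSC's operational parameters."""
    history = deque(maxlen=window)
    for i, count in enumerate(visits_per_minute):
        if len(history) == window:
            mean = sum(history) / window
            var = sum((c - mean) ** 2 for c in history) / window
            std = var ** 0.5
            # Flag the minute if it exceeds both an absolute floor and
            # the baseline plus `factor` standard deviations.
            if count > max(min_hits, mean + factor * std):
                yield i
        history.append(count)
```

Because the spike itself enters the baseline window afterwards, a single surge is flagged once rather than repeatedly, which is the behaviour wanted for an alerting trigger.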

  2. Expanding the Delivery of Rapid Earthquake Information and Warnings for Response and Recovery (United States)

    Blanpied, M. L.; McBride, S.; Hardebeck, J.; Michael, A. J.; van der Elst, N.


    Scientific organizations like the United States Geological Survey (USGS) release information to support effective responses during an earthquake crisis. Information is delivered to the White House, the National Command Center, the Departments of Defense, Homeland Security (including FEMA), Transportation, Energy, and Interior. Other crucial stakeholders include state officials and decision makers, emergency responders, numerous public and private infrastructure management centers (e.g., highways, railroads and pipelines), the media, and the public. To meet the diverse information requirements of these users, rapid earthquake notifications have been developed for delivery by e-mail and text message, while a suite of earthquake information resources such as ShakeMaps, Did You Feel It?, PAGER impact estimates, and the underlying data are delivered via the web. The ShakeAlert earthquake early warning system being developed for the U.S. West Coast will identify and characterize an earthquake a few seconds after it begins, estimate the likely intensity of ground shaking, and deliver brief but critically important warnings to people and infrastructure in harm's way. Currently the USGS is also developing a capability to deliver Operational Earthquake Forecasts (OEF). These provide estimates of potential seismic behavior after large earthquakes and during evolving aftershock sequences. Similar work is underway in New Zealand, Japan, and Italy. In the development of OEF forecasts, social science research conducted during these sequences indicates that aftershock forecasts are valued for a variety of reasons, from informing critical response and recovery decisions to psychologically preparing for more earthquakes. New tools will allow users to customize map-based, spatiotemporal forecasts to their specific needs. Hazard curves and other advanced information will also be available. For such authoritative information to be understood and used during the pressures of an earthquake

  3. The Benefits and Limitations of Crowdsourced Information for Rapid Damage Assessment of Global Earthquakes (United States)

    Bossu, R.; Landès, M.; Roussel, F.


    The Internet has accelerated the collection of felt reports and macroseismic data after global earthquakes. At the European-Mediterranean Seismological Centre (EMSC), where traditional online questionnaires have been replaced by thumbnail-based questionnaires, an average of half of the reports are collected within 10 minutes of an earthquake's occurrence. In regions where the EMSC is well identified, this drops to 5 minutes. The user simply selects the thumbnail corresponding to the observed effects, erasing language barriers and improving collection via small smartphone screens. A previous study has shown that EMSC data correlate well with "Did You Feel It?" (DYFI) data and with three independent, manually collected datasets. The efficiency and rapidity of felt-report collection through thumbnail-based questionnaires does not necessarily mean that they offer a complete picture of the situation for all intensity values, especially the higher ones. There are several potential limitations. Demographics probably play a role, but so might eyewitnesses' behaviour: reporting is probably not their priority when their own safety and that of their loved ones is at stake. We propose to test this hypothesis on EMSC felt reports and to extend the study to uses of the LastQuake smartphone application. LastQuake is a free smartphone app providing very rapid information on felt earthquakes. There are currently 210,000 active users around the world, covering almost every country except a few in Sub-Saharan Africa. Along with felt reports, we also analyze the characteristics of LastQuake app launches. For both composite datasets, created from 108 earthquakes, we analyze the rapidity of eyewitnesses' reactions, how they change with intensity, and surmise how they reflect different types of behaviour. We will show the intrinsic limitations of crowdsourced information for rapid situation awareness.
More importantly, we will show in which cases the lack of crowdsourced information could
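    The behavioural hypothesis above, that eyewitnesses at higher intensities report later (if at all), can be explored with a very simple aggregation of report delays by intensity. This is a minimal sketch; the (intensity, delay) pair representation is an assumption for illustration, not EMSC's actual data schema:

```python
from collections import defaultdict
from statistics import median

def median_delay_by_intensity(reports):
    """Given (intensity, delay_seconds) pairs, return the median report
    delay for each intensity value.  A systematic increase of the median
    with intensity would support the safety-first behavioural hypothesis.
    The pair format is illustrative, not EMSC's schema."""
    buckets = defaultdict(list)
    for intensity, delay in reports:
        buckets[intensity].append(delay)
    return {i: median(delays) for i, delays in buckets.items()}
```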

  4. Twitter as Information Source for Rapid Damage Estimation after Major Earthquakes (United States)

    Eggert, Silke; Fohringer, Joachim


    Natural disasters like earthquakes require a fast response from local authorities. Well-trained rescue teams have to be available, equipment and technology have to be set up and ready, and information has to be directed to the right positions so that the headquarters can manage the operation precisely. The main goal is to reach the most affected areas in a minimum of time. But even with the best preparation for these cases, there will always be uncertainty about what really happened in the affected area. Modern geophysical sensor networks provide high-quality data. These measurements, however, only map disjoint values at their respective locations for a limited set of parameters. Using the observations of witnesses represents one approach to enhancing the values measured by sensors ("humans as sensors"). These observations are increasingly disseminated via social media platforms. These "social sensors" offer several advantages over common sensors, e.g. high mobility, high versatility of captured parameters, and rapid distribution of information. Moreover, the amount of data offered by social media platforms is quite extensive. We analyze messages distributed via Twitter after major earthquakes to get rapid information on what eyewitnesses report from the epicentral area. We use this information to (a) quickly learn about damage and losses to support fast disaster response and (b) densify geophysical networks in areas with sparse information to gain a more detailed insight into felt intensities. We present a case study from the Mw 7.1 Philippines (Bohol) earthquake of Oct. 15, 2013. We extract Twitter messages (so-called tweets) containing one or more specified keywords from the semantic field of "earthquake" and use them for further analysis. For the time frame of Oct. 15 to Oct. 18 we obtain a database of 50,000 tweets in total, of which 2,900 are geo-localized and 470 have a photo attached.
Analyses for both national level and locally for
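    Keyword-based tweet extraction of the kind described above can be sketched as a filter that counts matching tweets and, among those, the geolocated ones and the ones with photos. The keyword list and the dict keys ('text', 'coords', 'media') are illustrative assumptions, not the authors' actual pipeline or Twitter's API fields:

```python
# Keywords from the semantic field of "earthquake"; the list (including
# the Filipino "lindol") is illustrative, not the authors' actual set.
EARTHQUAKE_KEYWORDS = ("earthquake", "quake", "lindol")

def classify_tweets(tweets):
    """Count keyword-matching tweets and, among those, how many carry
    coordinates or an attached photo.  The dict schema is assumed for
    illustration."""
    matching = geolocated = with_photo = 0
    for tweet in tweets:
        text = tweet["text"].lower()
        if any(keyword in text for keyword in EARTHQUAKE_KEYWORDS):
            matching += 1
            if tweet.get("coords"):
                geolocated += 1
            if tweet.get("media"):
                with_photo += 1
    return matching, geolocated, with_photo
```

Applied to a harvested time window, the three counts correspond to the totals the authors report (all matching tweets, geo-localized tweets, tweets with photos).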

  5. How citizen seismology is transforming rapid public earthquake information: the example of LastQuake smartphone application and Twitter QuakeBot (United States)

    Bossu, R.; Etivant, C.; Roussel, F.; Mazet-Roux, G.; Steed, R.


    Smartphone applications have swiftly become one of the most popular tools for the public to rapidly receive earthquake information. Wherever they are located, users can be automatically informed when an earthquake has struck simply by setting a magnitude threshold and an area of interest. No need to browse the Internet: the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? A while after damaging earthquakes, many eyewitnesses delete the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of notifications relate to unfelt earthquakes, thereby only increasing anxiety in the population at each new update. Felt and damaging earthquakes are the ones of societal importance, even when of small magnitude. The LastQuake app and Twitter feed (QuakeBot) focus on these earthquakes that matter for the public by collating different information threads covering tsunamigenic, damaging and felt earthquakes. Non-seismic detections and macroseismic questionnaires collected online are combined to identify felt earthquakes regardless of their magnitude. Non-seismic detections include Twitter earthquake detections, developed by the USGS, in which the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking that they have just felt. We will present the identification process for felt earthquakes, the smartphone application and the 27 automatically generated tweets, and how, by providing better public services, we collect more data from citizens.

  6. CISN Display - Reliable Delivery of Real-time Earthquake Information, Including Rapid Notification and ShakeMap to Critical End Users (United States)

    Rico, H.; Hauksson, E.; Thomas, E.; Friberg, P.; Given, D.


    The California Integrated Seismic Network (CISN) Display is part of a Web-enabled earthquake notification system alerting users in near real-time of seismicity, as well as delivering valuable geophysical information following a large earthquake. It will replace the Caltech/USGS Broadcast of Earthquakes (CUBE) and Rapid Earthquake Data Integration (REDI) Display as the principal means of delivering graphical earthquake information to users at emergency operations centers and other organizations. Features distinguishing the CISN Display from other GUI tools are a stateful client/server relationship, a scalable message format supporting automated hyperlink creation, and a configurable, platform-independent client with a GIS mapping tool supporting the decision-making activities of critical users. The CISN Display is the front-end of a client/server architecture known as the QuakeWatch system. It comprises the CISN Display (and other potential clients), message queues, a server, server "feeder" modules, and messaging middleware, schema and generators. It is written in Java, making it platform-independent and offering the latest in Internet technologies. QuakeWatch's object-oriented design allows components to be easily upgraded through a well-defined set of application programming interfaces (APIs). Central to the CISN Display's role as a gateway to other earthquake products is its comprehensive XML schema. The message model starts with the CUBE message format but extends it by provisioning additional attributes for currently available products, and for those yet to be considered. The supporting metadata in the XML message provides the data necessary for the client to create a hyperlink and associate it with a unique event ID. Earthquake products deliverable to the CISN Display are ShakeMap, Ground Displacement, Focal Mechanisms, Rapid Notifications, OES Reports, and Earthquake Commentaries. Leveraging the power of the XML format, the CISN Display provides prompt access to

  7. Seismogeodesy for rapid earthquake and tsunami characterization (United States)

    Bock, Y.


    Rapid estimation of earthquake magnitude and fault mechanism is critical for earthquake and tsunami warning systems. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. These methods are well developed for ocean basin-wide warnings but are not timely enough to protect vulnerable populations and infrastructure from the effects of local tsunamis, where waves may arrive within 15-30 minutes of earthquake onset time. Direct measurement of displacements by GPS networks at subduction zones allows for rapid magnitude and slip estimation in the near-source region, unaffected by the instrumental limitations and magnitude saturation experienced by local seismic networks. However, GPS displacements by themselves are too noisy for strict earthquake early warning (P-wave detection). Optimally combining high-rate GPS and seismic data (in particular, accelerometers that do not clip), an approach referred to as seismogeodesy, provides a broadband instrument that does not clip in the near field, is impervious to magnitude saturation, and provides accurate static and dynamic displacements and velocities in real time. Here we describe a NASA-funded effort to integrate GPS and seismogeodetic observations into NOAA's Tsunami Warning Centers in Alaska and Hawaii. It consists of a series of plug-in modules that allow for a hierarchy of rapid seismogeodetic products, including automatic P-wave picking, hypocenter estimation, S-wave prediction, magnitude scaling relationships based on P-wave amplitude (Pd) and peak ground displacement (PGD), and finite-source CMT solutions and fault-slip models as input for tsunami warnings and models. For the NOAA/NASA project, the modules are being integrated into an existing USGS Earthworm environment, currently limited to traditional seismic data. We are focused on a network of
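    The PGD magnitude scaling mentioned above has the published form log10(PGD) = A + B*Mw + C*Mw*log10(R) (e.g. Crowell et al., 2013), which can be inverted for magnitude once peak ground displacement is observed at a known hypocentral distance. The sketch below shows the inversion; the coefficient values are illustrative placeholders, not the operationally regressed ones:

```python
import math

# Coefficients for log10(PGD [cm]) = A + B*Mw + C*Mw*log10(R [km]);
# the numerical values below are illustrative placeholders, not the
# regressed coefficients used operationally.
A, B, C = -4.434, 1.047, -0.138

def magnitude_from_pgd(pgd_cm, hypocentral_km):
    """Invert the PGD scaling law for moment magnitude, given peak ground
    displacement in cm and hypocentral distance in km."""
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(hypocentral_km))
```

Because PGD keeps growing with earthquake size rather than saturating, this relation is what lets GPS-based systems recover large magnitudes that clip-limited seismic estimates underpredict.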

  8. Rapid Estimates of Rupture Extent for Large Earthquakes Using Aftershocks (United States)

    Polet, J.; Thio, H. K.; Kremer, M.


    The spatial distribution of aftershocks is closely linked to the rupture extent of the mainshock that preceded them and a rapid analysis of aftershock patterns therefore has potential for use in near real-time estimates of earthquake impact. The correlation between aftershocks and slip distribution has frequently been used to estimate the fault dimensions of large historic earthquakes for which no, or insufficient, waveform data is available. With the advent of earthquake inversions that use seismic waveforms and geodetic data to constrain the slip distribution, the study of aftershocks has recently been largely focused on enhancing our understanding of the underlying mechanisms in a broader earthquake mechanics/dynamics framework. However, in a near real-time earthquake monitoring environment, in which aftershocks of large earthquakes are routinely detected and located, these data may also be effective in determining a fast estimate of the mainshock rupture area, which would aid in the rapid assessment of the impact of the earthquake. We have analyzed a considerable number of large recent earthquakes and their aftershock sequences and have developed an effective algorithm that determines the rupture extent of a mainshock from its aftershock distribution, in a fully automatic manner. The algorithm automatically removes outliers by spatial binning, and subsequently determines the best fitting “strike” of the rupture and its length by projecting the aftershock epicenters onto a set of lines that cross the mainshock epicenter with incremental azimuths. For strike-slip or large dip-slip events, for which the surface projection of the rupture is recti-linear, the calculated strike correlates well with the strike of the fault and the corresponding length, determined from the distribution of aftershocks projected onto the line, agrees well with the rupture length. In the case of a smaller dip-slip rupture with an aspect ratio closer to 1, the procedure gives a measure
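    The core of the algorithm above, projecting aftershock epicenters onto lines through the mainshock epicenter at incremental azimuths, can be sketched as follows. The outlier removal by spatial binning is omitted, a flat-earth approximation is used, and taking the azimuth that maximises the along-line spread is one plausible reading of the "best fitting strike" criterion, which the abstract does not fully specify:

```python
import math

def rupture_strike_and_length(mainshock, aftershocks, step_deg=5):
    """Estimate rupture strike (deg from north) and length (km) from
    aftershock epicenters.  mainshock: (lat, lon); aftershocks: list of
    (lat, lon).  Flat-earth approximation; outlier removal omitted."""
    lat0, lon0 = mainshock
    km_per_deg = 111.19
    # Convert epicenters to local east/north km relative to the mainshock.
    xy = [((lon - lon0) * km_per_deg * math.cos(math.radians(lat0)),
           (lat - lat0) * km_per_deg) for lat, lon in aftershocks]
    best = None
    for az in range(0, 180, step_deg):
        ux, uy = math.sin(math.radians(az)), math.cos(math.radians(az))
        # Project every epicenter onto the line through the mainshock
        # at this azimuth; the spread of projections is a length estimate.
        proj = [x * ux + y * uy for x, y in xy]
        spread = max(proj) - min(proj)
        if best is None or spread > best[1]:
            best = (az, spread)
    return best  # (strike_deg, length_km)
```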

  9. The 2010 Chile Earthquake: Rapid Assessments of Tsunami

    Michelini, A.; Lauciani, V.; Selvaggi, G.; Lomax, A.


    After an earthquake underwater, rapid real-time assessment of earthquake parameters is important for emergency response related to infrastructure damage and, perhaps more exigently, for issuing warnings of the possibility of an impending tsunami. Since 2005, the Istituto Nazionale di Geofisica e Vulcanologia (INGV) has worked on the rapid quantification of earthquake magnitude and tsunami potential, especially for the Mediterranean area. This work includes quantification of earthquake size fr...

  10. The key role of eyewitnesses in rapid earthquake impact assessment (United States)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Frédéric; Etivant, Caroline


    Uncertainties in rapid earthquake impact models are intrinsically large, even when excluding potential indirect losses (fires, landslides, tsunami…). The reason is that they are based on several factors which are themselves difficult to constrain, such as the geographical distribution of shaking intensity, the building-type inventory and the vulnerability functions. The difficulties can be illustrated by two boundary cases. For moderate (around M6) earthquakes, the size of the potential damage zone and the epicentral location uncertainty are of comparable dimensions, about 10-15 km. When such an earthquake strikes close to an urban area, as in 1999 in Athens (M5.9), location uncertainties alone can lead to dramatically different impact scenarios. Furthermore, for moderate magnitudes, the overall impact is often controlled by individual accidents, as in 2002 in Molise, Italy (M5.7), in Bingol, Turkey (M6.4) in 2003, or in Christchurch, New Zealand (M6.3), where respectively 23 of 30, 84 of 176 and 115 of 185 of the casualties perished in a single building failure. Contrastingly, for major earthquakes (M>7), the point-source approximation is no longer valid, and impact assessment requires knowing exactly where the seismic rupture took place, whether it was unilateral, bilateral, etc., and this information is not readily available directly after the earthquake's occurrence. In-situ observations of actual impact provided by eyewitnesses can dramatically reduce impact model uncertainties. We will present the overall strategy developed at the EMSC, which comprises crowdsourcing and flashsourcing techniques, the development of citizen-operated seismic networks, and the use of social networks to engage with eyewitnesses within minutes of an earthquake's occurrence. For instance, testimonies are collected through online questionnaires available in 32 languages and automatically processed into maps of effects. Geo-located pictures are collected and then

  11. The Temblor mobile seismic risk app, v2: Rapid and seamless earthquake information to inspire individuals to recognize and reduce their risk (United States)

    Stein, R. S.; Sevilgen, V.; Sevilgen, S.; Kim, A.; Jacobson, D. S.; Lotto, G. C.; Ely, G.; Bhattacharjee, G.; O'Sullivan, J.


    Temblor quantifies and personalizes earthquake risk and offers solutions by connecting users with qualified retrofit and insurance providers. Temblor's daily blog on current earthquakes, seismic swarms, eruptions, floods, and landslides makes the science accessible to the public. Temblor is available on iPhone, Android, and mobile web app platforms. The app presents both scenario (worst case) and probabilistic (most likely) financial losses for homes and commercial buildings, and estimates the impact of seismic retrofit and insurance on the losses and safety. Temblor's map interface has clickable earthquakes (with source parameters and links) and active faults (name, type, and slip rate) around the world, and layers for liquefaction, landslides, tsunami inundation, and flood zones in the U.S. The app draws from the 2014 USGS National Seismic Hazard Model and the 2014 USGS Building Seismic Safety Council ShakeMap scenario database. The Global Earthquake Activity Rate (GEAR) model is used worldwide, with active faults displayed in 75 countries. The Temblor real-time global catalog is merged from global and national catalogs, with aftershocks discriminated from mainshocks. Earthquake notifications are issued to Temblor users within 30 seconds of their occurrence, with approximate locations and magnitudes that are rapidly refined in the ensuing minutes. Launched in 2015, Temblor has 650,000 unique users, including 250,000 in the U.S. and 110,000 in Chile, as well as 52,000 Facebook followers. All data shown in Temblor are gathered from authoritative or published sources and synthesized to be intuitive and actionable for the public. Principal data sources include USGS, FEMA, EMSC, GEM Foundation, NOAA, GNS Science (New Zealand), INGV (Italy), PHIVOLCS (Philippines), GSJ (Japan), Taiwan Earthquake Model, EOS Singapore (Southeast Asia), MTA (Turkey), PB2003 (plate boundaries), CICESE (Baja California), California Geological Survey, and 20 other state

  12. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information (United States)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.


    The Euro-Med Seismological Centre (EMSC) operates the second most visited earthquake information website worldwide, which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly appeared that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community. By definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing the expected earthquake information and services, collect their observations, and collate them into improved earthquake information services that attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from the real-time analysis of the traffic on our website in the first minutes following an earthquake's occurrence, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses on our website follow the propagation of the generated seismic waves, and hence that eyewitnesses can be considered ground motion sensors. Flashsourcing discriminates felt

  13. Real Time Earthquake Information System in Japan (United States)

    Doi, K.; Kato, T.


    An early earthquake notification system in Japan was developed by the Japan Meteorological Agency (JMA), the governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for the prompt provision of tsunami forecasts to the public, locating an earthquake and estimating its magnitude as quickly as possible. Years later, a system for the prompt provision of seismic intensity information, as an index of the degree of disaster caused by strong ground motion, was also developed so that the governmental organizations concerned can decide whether they need to launch an emergency response. At present, JMA issues the following kinds of information in succession when a large earthquake occurs: 1) a prompt report of the occurrence of a large earthquake and the major seismic intensities it caused, about two minutes after the earthquake occurrence; 2) a tsunami forecast in around three minutes; 3) information on the expected arrival times and maximum heights of tsunami waves in around five minutes; 4) information on the hypocenter and magnitude of the earthquake, the seismic intensity at each observation station, and the times of high tides in addition to the expected tsunami arrival times, in 5-7 minutes. To issue the information above, JMA has established: an advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation, including about 2,800 seismic intensity stations maintained by local governments; data telemetry networks via landlines and partly via a satellite communication link; real-time data processing techniques, for example the automatic calculation of earthquake location and magnitude and the database-driven method for quantitative tsunami estimation; and dissemination networks via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally

  14. The Key Role of Eyewitnesses in Rapid Impact Assessment of Global Earthquake (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.; Etivant, C.; Frobert, L.; Godey, S.


    Uncertainties in rapid impact assessments of global earthquakes are intrinsically large because they rely on three main elements (ground motion prediction models, building stock inventory and related vulnerability) whose values and/or spatial variations are poorly constrained. Furthermore, variations of hypocentral location and magnitude within their respective uncertainty domains can lead to significantly different shaking levels for centers of population and change the scope of the disaster. We present the strategy and methods implemented at the Euro-Med Seismological Centre (EMSC) to rapidly collect in-situ observations of earthquake effects from eyewitnesses in order to reduce the uncertainties of rapid earthquake impact assessment. It comprises crowdsourced information (online questionnaires, pictures) as well as information derived from the real-time analysis of web traffic (the flashsourcing technique) and, more recently, the deployment of QCN (Quake Catcher Network) low-cost sensors. We underline the importance of merging the results of different methods to improve the performance and reliability of the collected data. We try to better understand and respond to public demands and expectations after earthquakes through improved information services and a diversification of information tools (social networks, smartphone apps, browser add-ons…), which, in turn, drive more eyewitnesses to our services and improve data collection. We will notably present our LastQuake Twitter feed (QuakeBot) and smartphone applications (iOS and Android), which only report earthquakes that matter for the public and authorities, i.e. felt and damaging earthquakes identified thanks to citizen-generated information.

  15. Rapid estimation of the economic consequences of global earthquakes (United States)

    Jaiswal, Kishor; Wald, David J.


    The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid-2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensity. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system, it is now possible to produce rapid estimates of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S.
Geological Survey's PAGER system is
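    The loss-estimation component referred to above follows the empirical model of Jaiswal and others (2009), in which the fatality rate at shaking intensity S is the lognormal form Phi(ln(S/theta)/beta), with country-specific regression parameters theta and beta, summed over the population exposed at each intensity. A minimal sketch (any parameter values passed in are illustrative, not the published country fits):

```python
import math

def normal_cdf(x):
    """Standard normal cumulative distribution via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def expected_fatalities(exposure_by_intensity, theta, beta):
    """PAGER-style empirical fatality estimate: the fatality rate at
    intensity s is Phi(ln(s / theta) / beta), applied to the population
    exposed at each intensity.  theta and beta are country-specific
    regression parameters; values used here are illustrative."""
    return sum(pop * normal_cdf(math.log(s / theta) / beta)
               for s, pop in exposure_by_intensity.items())
```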

  16. Rapid earthquake magnitude determination for Vrancea early warning system

    International Nuclear Information System (INIS)

    Marmureanu, Alexandru


    Due to the huge amount of recorded data, an automatic procedure was developed and used to test different methods for rapidly evaluating earthquake magnitude from the first seconds of the P wave. In order to test all the algorithms involved in detection and rapid magnitude estimation, several tests were performed to avoid false alarms. A special detection algorithm was developed, based on the classical STA/LTA algorithm and tuned for early-warning purposes. A method is proposed to rapidly estimate the magnitude within 4 seconds of the detection of the P wave at the epicenter. The method was tested on all recorded data, and the magnitude determination error is acceptable considering that the magnitude is computed from only 3 stations in a very short time interval. (author)
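    The classical STA/LTA scheme that the detection algorithm above builds on compares a short-term average of signal energy against a long-term average and triggers when their ratio exceeds a threshold. A minimal sketch; the window lengths and threshold are illustrative, not the values tuned for the Vrancea early-warning system:

```python
def sta_lta_trigger(samples, sta_len=50, lta_len=500, threshold=4.0):
    """Return the first sample index where the short-term average (STA)
    of signal energy exceeds `threshold` times the long-term average
    (LTA), or None if no trigger occurs.  Window lengths and threshold
    are illustrative, not the tuned Vrancea values."""
    energy = [s * s for s in samples]
    for i in range(lta_len + sta_len, len(samples)):
        # STA over the most recent sta_len samples; LTA over the
        # lta_len samples immediately preceding the STA window.
        sta = sum(energy[i - sta_len:i]) / sta_len
        lta = sum(energy[i - lta_len - sta_len:i - sta_len]) / lta_len
        if lta > 0 and sta / lta > threshold:
            return i
    return None
```

Operational implementations use recursive (constant-time) averages; the sliding sums above keep the sketch short and explicit.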

  17. Flash-sourcing or the rapid detection and characterisation of earthquake effects through clickstream data analysis (United States)

    Bossu, R.; Mazet-Roux, G.; Roussel, F.; Frobert, L.


    Rapid characterisation of earthquake effects is essential for a timely and appropriate response in favour of victims and/or eyewitnesses. In the case of damaging earthquakes, any field observations that can fill the information gap characterising their immediate aftermath can contribute to more efficient rescue operations. This paper presents the latest developments of a method called "flash-sourcing" that addresses these issues. It relies on eyewitnesses, the first informed and the first concerned by an earthquake's occurrence. More precisely, their use of the EMSC earthquake information website is analysed in real time to map the area where the earthquake was felt and to identify, at least under certain circumstances, zones of widespread damage. The approach is based on the natural and immediate convergence on the website of eyewitnesses who rush to the Internet to investigate the cause of the shaking they have just felt, causing our traffic to increase. The area where an earthquake was felt is mapped simply by locating the Internet Protocol (IP) addresses active during traffic surges. In addition, the presence of eyewitnesses browsing our website within minutes of an earthquake's occurrence excludes the possibility of widespread damage in the localities they originate from: in case of severe damage, the networks would be down. The validity of the information derived from this clickstream analysis is confirmed by comparisons with EMS98 macroseismic maps obtained from online questionnaires. The name of this approach, "flash-sourcing", is a combination of "flash-crowd" and "crowdsourcing", intended to reflect the rapidity of data collation from the public. For computer scientists, a flash-crowd is a traffic surge on a website. Crowdsourcing means work being done by a "crowd" of people; it also characterises Internet and mobile applications collecting information from the public, such as online macroseismic questionnaires. Like crowdsourcing techniques, flash-sourcing is a
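    The felt-area mapping step described above, locating the IP addresses active during a traffic surge, can be sketched as a coarse gridding of geolocated hits, with cells containing several independent visitors flagged as "felt". The grid size and the visitor cutoff below are illustrative choices, not EMSC's operational settings:

```python
from collections import Counter

def felt_area(hits, grid_deg=0.5):
    """Bin geolocated website hits ((lat, lon) pairs from IP geolocation)
    arriving shortly after an earthquake into a coarse grid; cells with
    at least 3 visitors are flagged as 'felt'.  Grid size and the
    3-visitor cutoff are illustrative, not EMSC's settings."""
    cells = Counter((round(lat / grid_deg), round(lon / grid_deg))
                    for lat, lon in hits)
    return {cell for cell, n in cells.items() if n >= 3}
```

Requiring several visitors per cell filters out IP-geolocation noise and background traffic, so the surviving cells approximate the felt area.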

  18. Earthquakes, fluid pressures and rapid subduction zone metamorphism (United States)

    Viete, D. R.


    High-pressure/low-temperature (HP/LT) metamorphism is commonly incomplete, meaning that large tracts of rock can remain metastable at blueschist- and eclogite-facies conditions for timescales up to millions of years [1]. When HP/LT metamorphism does take place, it can occur over extremely short durations, highlighting the role of fluids in providing heat for metamorphism [2] or catalyzing metamorphic reactions [1]. Earthquakes in subduction zone settings can occur to depths of 100s of km. Metamorphic dehydration and the associated development of elevated pore pressures in HP/LT metamorphic rocks have been identified as a cause of earthquake activity at such great depths [3-4]. The process of fracturing/faulting significantly increases rock permeability, causing channelized fluid flow and dissipation of pore pressures [3-4]. Thus, deep subduction zone earthquakes are thought to reflect an evolution in fluid pressure involving: (1) an initial increase in pore pressure by heating-related dehydration of subduction zone rocks, and (2) rapid relief of pore pressures by faulting and channelized flow. Models for earthquakes at depth in subduction zones have focussed on the in situ effects of dehydration followed by the sudden escape of fluids from the rock mass after fracturing [3-4]. On the other hand, existing models for rapid and incomplete metamorphism in subduction zones have focussed only on the effects of heating and/or hydration with the arrival of external fluids [1-2]. Significant changes in pressure over very short timescales should result in rapid mineral growth and/or disequilibrium texture development in response to overstepping of mineral reaction boundaries. The repeated cycle of dehydration, pore-pressure development, earthquake, and pore-pressure relief could conceivably produce a record of episodic HP/LT metamorphism driven by rapid pressure pulses. A new hypothesis is presented for the origins of HP/LT metamorphism: that HP/LT metamorphism is driven by effective pressure

  19. Rapid acceleration leads to rapid weakening in earthquake-like laboratory experiments (United States)

    Chang, Jefferson C.; Lockner, David A.; Reches, Z.


    After nucleation, a large earthquake propagates as an expanding rupture front along a fault. This front activates countless fault patches that slip by consuming energy stored in Earth’s crust. We simulated the slip of a fault patch by rapidly loading an experimental fault with energy stored in a spinning flywheel. The spontaneous evolution of strength, acceleration, and velocity indicates that our experiments are proxies of fault-patch behavior during earthquakes of moment magnitude (Mw) = 4 to 8. We show that seismically determined earthquake parameters (e.g., displacement, velocity, magnitude, or fracture energy) can be used to estimate the intensity of the energy release during an earthquake. Our experiments further indicate that high acceleration imposed by the earthquake’s rupture front quickens dynamic weakening by intense wear of the fault zone.

  20. Rapid post-earthquake modelling of coseismic landslide intensity and distribution for emergency response decision support

    Directory of Open Access Journals (Sweden)

    T. R. Robinson


    Current methods to identify coseismic landslides immediately after an earthquake using optical imagery are too slow to effectively inform emergency response activities. Issues with cloud cover, data collection and processing, and manual landslide identification mean that even the most rapid mapping exercises are often incomplete when the emergency response ends. In this study, we demonstrate how traditional empirical methods for modelling the total distribution and relative intensity (in terms of point density) of coseismic landsliding can be successfully undertaken in the hours and days immediately after an earthquake, allowing the results to effectively inform stakeholders during the response. The method uses fuzzy logic in a GIS (Geographic Information System) to quickly assess and identify the location-specific relationships between predisposing factors and landslide occurrence during the earthquake, based on small initial samples of identified landslides. We show that this approach can accurately model both the spatial pattern and the number density of landsliding from the event based on just several hundred mapped landslides, provided they have sufficiently wide spatial coverage, improving upon previous methods. This suggests that systematic high-fidelity mapping of landslides following an earthquake is not necessary for informing rapid modelling attempts. Instead, mapping should focus on rapid sampling from the entire affected area to generate results that can inform the modelling. This method is therefore suited to conditions in which imagery is affected by partial cloud cover or in which the total number of landslides is so large that mapping requires significant time to complete. The method thus has the potential to provide a quick assessment of landslide hazard after an earthquake and may inform emergency operations more effectively than current practice.
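    A fuzzy-logic susceptibility overlay of the kind described above can be sketched in a few lines (the membership functions, value ranges, and weights below are invented for illustration, not the authors' calibrated model; the gamma operator is a standard choice for combining fuzzy layers in GIS overlay analysis):

```python
# Toy fuzzy-logic landslide susceptibility: map each predisposing factor
# (e.g. slope angle, shaking intensity) to a membership in [0, 1], then
# combine the memberships with the fuzzy gamma operator.
def linear_membership(x, lo, hi):
    """Map x linearly to [0, 1] between lo and hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def fuzzy_gamma(memberships, gamma=0.9):
    """(algebraic sum)^gamma * (algebraic product)^(1 - gamma)."""
    prod, comp = 1.0, 1.0
    for m in memberships:
        prod *= m
        comp *= (1.0 - m)
    algebraic_sum = 1.0 - comp
    return (algebraic_sum ** gamma) * (prod ** (1.0 - gamma))

# A steep, strongly shaken cell scores higher than a gentle, weakly shaken one.
steep = [linear_membership(40, 10, 50), linear_membership(0.6, 0.1, 0.8)]
gentle = [linear_membership(15, 10, 50), linear_membership(0.2, 0.1, 0.8)]
assert fuzzy_gamma(steep) > fuzzy_gamma(gentle)
```

    In the workflow of the abstract, the membership ranges would be tuned against the small initial sample of mapped landslides rather than fixed a priori.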

  1. Rapid characterization of the 2015 Mw 7.8 Gorkha, Nepal, earthquake sequence and its seismotectonic context (United States)

    Hayes, Gavin; Briggs, Richard; Barnhart, William D.; Yeck, William; McNamara, Daniel E.; Wald, David J.; Nealy, Jennifer; Benz, Harley M.; Gold, Ryan D.; Jaiswal, Kishor S.; Marano, Kristin; Earle, Paul S.; Hearne, Mike; Smoczyk, Gregory M.; Wald, Lisa A.; Samsonov, Sergey


    Earthquake response and related information products are important for placing recent seismic events into context and particularly for understanding the impact earthquakes can have on the regional community and its infrastructure. These tools are even more useful if they are available quickly, ahead of detailed information from the areas affected by such earthquakes. Here we provide an overview of the response activities and related information products generated and provided by the U.S. Geological Survey National Earthquake Information Center in association with the 2015 M 7.8 Gorkha, Nepal, earthquake. This group monitors global earthquakes 24 hours a day, 7 days a week, to provide rapid information on the location and size of recent events and to characterize the source properties, tectonic setting, and potential fatalities and economic losses associated with significant earthquakes. We present the timeline over which these products became available, discuss what they tell us about the seismotectonics of the Gorkha earthquake and its aftershocks, and examine how their information is used today, and might be used in the future, to help mitigate the impact of such natural disasters.

  2. An information infrastructure for earthquake science (United States)

    Jordan, T. H.; Scec/Itr Collaboration


    The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, IRIS, and the USGS, has received a large five-year grant from the NSF's ITR Program and its Geosciences Directorate to build a new information infrastructure for earthquake science. In many respects, the SCEC/ITR Project presents a microcosm of the IT efforts now being organized across the geoscience community, including the EarthScope initiative. The purpose of this presentation is to discuss the experience gained by the project thus far and lay out the challenges that lie ahead; our hope is to encourage cross-discipline collaboration in future IT advancements. Project goals have been formulated in terms of four "computational pathways" related to seismic hazard analysis (SHA). For example, Pathway 1 involves the construction of an open-source, object-oriented, and web-enabled framework for SHA computations that can incorporate a variety of earthquake forecast models, intensity-measure relationships, and site-response models, while Pathway 2 aims to utilize the predictive power of wavefield simulation in modeling time-dependent ground motion for scenario earthquakes and constructing intensity-measure relationships. The overall goal is to create a SCEC "community modeling environment" or collaboratory that will comprise the curated (on-line, documented, maintained) resources needed by researchers to develop and use these four computational pathways. Current activities include (1) the development and verification of the computational modules, (2) the standardization of data structures and interfaces needed for syntactic interoperability, (3) the development of knowledge representation and management tools, (4) the construction of SCEC computational and data grid testbeds, and (5) the creation of user interfaces for knowledge-acquisition, code execution, and visualization.
I will emphasize the increasing role of standardized

  3. Recent applications for rapid estimation of earthquake shaking and losses with ELER Software

    International Nuclear Information System (INIS)

    Demircioglu, M.B.; Erdik, M.; Kamer, Y.; Sesetyan, K.; Tuzun, C.


    A methodology and software package entitled Earthquake Loss Estimation Routine (ELER) was developed for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region. The work was carried out under the Joint Research Activity-3 (JRA3) of the EC FP6 project entitled Network of Research Infrastructures for European Seismology (NERIES). The ELER methodology entails: 1) estimation of the most likely location of the earthquake source using a regional seismotectonic database; 2) estimation of the spatial distribution of selected ground motion parameters at engineering bedrock through region-specific ground motion prediction models, bias-correcting the ground motion estimates with strong ground motion data, if available; 3) estimation of the spatial distribution of site-corrected ground motion parameters using a regional geology database and appropriate amplification models; and 4) estimation of the losses and uncertainties at various orders of sophistication (buildings, casualties). The multi-level methodology developed for real-time loss estimation is capable of incorporating regional variability and the sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships coded into ELER. The present paper provides brief information on the ELER methodology and an example application to the recent major earthquake that hit the Van province in eastern Turkey on 23 October 2011 with a moment magnitude (Mw) of 7.2. For this earthquake, the Kandilli Observatory and Earthquake Research Institute (KOERI) provided almost real-time estimates of building damage and casualty distribution using ELER. (author)
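    The four-stage pipeline above can be illustrated end to end with a toy calculation (the attenuation, amplification, fragility, and casualty relations below are invented placeholders; ELER's actual GMPEs and vulnerability curves are region-specific):

```python
# Schematic of the ELER-style chain: bedrock ground motion -> site
# correction -> building damage -> casualty estimate.
import math

def bedrock_pga(magnitude, dist_km):
    """Toy ground-motion prediction: PGA (g) decaying with distance."""
    return math.exp(0.5 * magnitude - 1.5 * math.log(dist_km + 10.0) - 1.0)

def site_corrected(pga, amplification):
    """Apply a site amplification factor from a geology database."""
    return pga * amplification

def damage_ratio(pga, fragility_midpoint=0.3):
    """Toy fragility curve: fraction of buildings damaged."""
    return min(1.0, pga / (pga + fragility_midpoint))

def losses(pga, buildings, occupants_per_building, casualty_rate=0.05):
    """Combine damage ratio with exposure to get damaged buildings and casualties."""
    dr = damage_ratio(pga)
    return dr * buildings, casualty_rate * dr * buildings * occupants_per_building

pga = site_corrected(bedrock_pga(7.2, 20.0), amplification=1.8)
damaged, casualties = losses(pga, buildings=10000, occupants_per_building=4)
assert 0 < damaged < 10000 and casualties > 0
```

    The multi-level character of ELER would correspond to swapping progressively more sophisticated models into each of these four stages.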

  4. Urban MEMS based seismic network for post-earthquakes rapid disaster assessment (United States)

    D'Alessandro, Antonino; Luzio, Dario; D'Anna, Giuseppe


    Life losses following a disastrous earthquake depend mainly on building vulnerability, the intensity of shaking, and the timeliness of rescue operations. In recent decades, the increase in population and industrial density has significantly increased the earthquake exposure of urban areas. The potential impact of a strong earthquake on a town center can be reduced by timely and correct actions of the emergency management centers. A real-time urban seismic network can drastically reduce casualties immediately following a strong earthquake by providing timely information about the distribution of ground shaking levels. Emergency management centers, with functions in the immediate post-earthquake period, could use this information to allocate and prioritize resources to minimize loss of human life. However, due to the high cost of seismological instrumentation, urban seismic networks that might reduce the rate of fatalities have not been realized. Recent developments in MEMS (Micro Electro-Mechanical Systems) technology could today allow the realization of a high-density urban seismic network for rapid post-earthquake disaster assessment, suitable for the mitigation of earthquake effects. In the 1990s, MEMS accelerometers revolutionized the automotive airbag industry and are today widely used in laptops, game controllers and mobile phones. Owing to their great commercial success, research into and development of MEMS accelerometers are actively pursued around the world. Nowadays, the sensitivity and dynamic range of these sensors are sufficient to allow accurate recording of earthquakes of moderate to strong magnitude. Due to their low cost and small size, MEMS accelerometers may be employed for the realization of high-density seismic networks. They could be installed in sensitive places (of high vulnerability and exposure), such as schools, hospitals, public buildings and places of

  5. SalanderMaps: A rapid overview about felt earthquakes through data mining of web-accesses (United States)

    Kradolfer, Urs


    While seismological observatories detect and locate earthquakes based on measurements of ground motion, they neither know a priori whether an earthquake has been felt by the public nor where it has been felt. Such information is usually gathered by evaluating feedback reported by the public through online forms on the web. However, after a felt earthquake in Switzerland, many people visit the webpages of the Swiss Seismological Service (SED) at ETH Zurich, and each such visit leaves traces in the logfiles on our web servers. Data-mining techniques applied to these logfiles, combined with mining publicly available databases on the internet, open possibilities to obtain previously unknown information about our virtual visitors. In order to provide precise information to authorities and the media, it would be desirable to rapidly know from which locations these web accesses originate. The method 'Salander' (Seismic Activity Linked to Area codes - Nimble Detection of Earthquake Rumbles) will be introduced, and it will be explained how the IP addresses (each computer or router directly connected to the internet has a unique IP address) of a sufficient number of our virtual visitors were linked to their geographical areas. This allows us to know unprecedentedly quickly whether and where an earthquake was felt in Switzerland. It will also be explained why the Salander method is superior to commercial so-called geolocation products. The corresponding products of the Salander method, animated SalanderMaps, which are routinely generated after each earthquake with a magnitude of M>2 in Switzerland (available after March 2013), demonstrate how the wavefield of earthquakes propagates through Switzerland and where it was felt. Often, such information is available within less than 60 seconds after origin time, and we always get a clear picture within as little as five minutes after origin time.
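    The log-mining step described above amounts to bucketing web accesses by geographic area within a short window after origin time. A minimal sketch (the log format, IP-to-area lookup table, and window length are invented here, not the Salander internals):

```python
# Count web-server accesses per geographic area shortly after origin time.
from collections import Counter
from datetime import datetime, timedelta

# Toy lookup from IP address to area (documentation-range addresses).
AREA_OF_IP = {"192.0.2.1": "Zurich", "192.0.2.2": "Basel", "198.51.100.7": "Zurich"}

def accesses_by_area(log_lines, origin, window_s=60):
    """log_lines: '<ip> <ISO-time>' entries; returns Counter of areas hit
    within window_s seconds of the earthquake origin time."""
    counts = Counter()
    for line in log_lines:
        ip, stamp = line.split()
        t = datetime.fromisoformat(stamp)
        if origin <= t <= origin + timedelta(seconds=window_s):
            counts[AREA_OF_IP.get(ip, "unknown")] += 1
    return counts

origin = datetime(2013, 3, 1, 12, 0, 0)
log = ["192.0.2.1 2013-03-01T12:00:20",
       "198.51.100.7 2013-03-01T12:00:45",
       "192.0.2.2 2013-03-01T12:05:00"]      # outside the 60 s window
assert accesses_by_area(log, origin) == Counter({"Zurich": 2})
```

    Rendering such per-minute counters on a map, frame by frame, is essentially what an animated SalanderMap does.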

  6. Rapid Source Characterization of the 2011 Mw 9.0 off the Pacific coast of Tohoku Earthquake (United States)

    Hayes, Gavin P.


    On March 11th, 2011, a moment magnitude 9.0 earthquake struck off the coast of northeast Honshu, Japan, generating what may well turn out to be the most costly natural disaster ever. In the hours following the event, the U.S. Geological Survey National Earthquake Information Center led a rapid response to characterize the earthquake in terms of its location, size, faulting source, shaking and slip distributions, and population exposure, in order to place the disaster in a framework necessary for timely humanitarian response. As part of this effort, fast finite-fault inversions using globally distributed body- and surface-wave data were used to estimate the slip distribution of the earthquake rupture. Models generated within 7 hours of the earthquake origin time indicated that the event ruptured a fault up to 300 km long, roughly centered on the earthquake hypocenter, and involved peak slips of 20 m or more. Updates since this preliminary solution improve the details of this inversion solution and thus our understanding of the rupture process. However, significant observations such as the up-dip nature of rupture propagation and the along-strike length of faulting did not significantly change, demonstrating the usefulness of rapid source characterization for understanding the first order characteristics of major earthquakes.

  7. RICHTER: A Smartphone Application for Rapid Collection of Geo-Tagged Pictures of Earthquake Damage (United States)

    Skinnemoen, H.; Bossu, R.; Furuheim, K.; Bjorgo, E.


    RICHTER (Rapid geo-Images for Collaborative Help Targeting Earthquake Response) is a smartphone version of a professional application developed to provide high-quality geo-tagged image communication over challenging network links, such as satellites and poor mobile links. Developed for Android mobile phones, it allows eyewitnesses to share their pictures of earthquake damage easily and without cost with the Euro-Mediterranean Seismological Centre (EMSC). The goal is to engage citizens in the collection of the most up-to-date visual information on local damage for improved rapid impact assessment. RICHTER integrates the innovative and award-winning ASIGN protocol, initially developed for satellite communication between cameras / computers / satcom terminals and servers at HQ. ASIGN is a robust and optimal image and video communication management solution for bandwidth-limited communication networks, developed for use particularly in emergency and disaster situations. Unlike a simple Multimedia Messaging Service (MMS) message, RICHTER allows access to high-definition images with embedded location information. Location is automatically assigned from either the internal GPS, derived from the mobile network (triangulation), or the current Wi-Fi domain, in that order, as this corresponds to the expected positioning accuracy. Pictures are typically compressed to 20-30 KB of data for fast transfer and to avoid network overload. Full-size images can be requested by the EMSC either fully automatically or on a case-by-case basis, depending on the user preferences. ASIGN was initially developed in coordination with INMARSAT and the European Space Agency. It was used by the Rapid Mapping Unit of the United Nations, notably for the damage assessment of the January 12, 2010 Haiti earthquake, where more than 700 photos were collected. RICHTER will be freely distributed on the EMSC website to eyewitnesses in the event of significantly damaging earthquakes.
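    The ordered fallback for geo-tagging described above (GPS, then mobile-network triangulation, then Wi-Fi domain) can be sketched as follows; the function and field names are hypothetical, not RICHTER's actual API:

```python
# Pick the best available location fix in order of expected accuracy.
def pick_location(gps=None, cell=None, wifi=None):
    """Prefer GPS, then mobile-network triangulation, then the Wi-Fi domain,
    matching the expected positioning accuracy of each source."""
    for source, fix in (("gps", gps), ("cell", cell), ("wifi", wifi)):
        if fix is not None:
            return source, fix
    return None, None

assert pick_location(gps=(46.52, 6.63)) == ("gps", (46.52, 6.63))
assert pick_location(cell=(46.5, 6.6)) == ("cell", (46.5, 6.6))
assert pick_location() == (None, None)
```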
The EMSC is the second

  8. E-DECIDER Rapid Response to the M 6.0 South Napa Earthquake (United States)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.


    E-DECIDER initiated rapid response mode when the California Earthquake Clearinghouse was activated the morning following the M 6.0 Napa earthquake. Data products, including 1) rapid damage and loss estimates, 2) deformation magnitude and slope change maps, and 3) aftershock forecasts, were provided to the Clearinghouse partners within 24 hours of the event via XchangeCore Web Service Data Orchestration sharing. NASA data products were provided to end users via XchangeCore, the EERI and Clearinghouse websites, and ArcGIS Online for Napa response, reaching a wide response audience. The E-DECIDER team helped facilitate rapid delivery of NASA products to stakeholders and participated in Clearinghouse Napa earthquake briefings to update stakeholders on product information. Rapid response products from E-DECIDER can be used to help prioritize response efforts shortly after the event has occurred. InLET (Internet Loss Estimation Tool) post-event damage and casualty estimates were generated quickly after the Napa earthquake. InLET provides immediate post-event estimates of casualties and building damage by performing loss/impact simulations using USGS ground motion data and FEMA HAZUS damage estimation technology. These results were provided to E-DECIDER by their collaborators, ImageCat, Inc. and the Community Stakeholder Network (CSN). Strain magnitude and slope change maps were automatically generated when the Napa earthquake appeared on the USGS feed. These maps provide an early estimate of where the deformation has occurred and where damage may be localized. Using E-DECIDER critical infrastructure overlays with damage estimates, decision makers can direct response efforts that can be verified later with field reconnaissance and remote sensing-based observations. Earthquake aftershock forecast maps were produced within hours of the event.
These maps highlight areas where aftershocks are likely to occur and can also be coupled with infrastructure overlays to help direct response

  9. Geodetic Imaging for Rapid Assessment of Earthquakes: Airborne Laser Scanning (ALS) (United States)

    Carter, W. E.; Shrestha, R. L.; Glennie, C. L.; Sartori, M.; Fernandez-Diaz, J.; National CenterAirborne Laser Mapping Operational Center


    To the residents of an area struck by a strong earthquake, quantitative information on damage to the infrastructure, and its attendant impact on relief and recovery efforts, is urgent and of primary concern. To earth scientists, a strong earthquake offers an opportunity to learn more about earthquake mechanisms and to compare their models with the real world, in hopes of one day being able to accurately predict the precise locations, magnitudes, and times of large (and potentially disastrous) earthquakes. Airborne laser scanning (also referred to as airborne LiDAR or Airborne Laser Swath Mapping) is particularly well suited for rapid assessment of earthquakes, both for immediately estimating the damage to infrastructure and for providing information for the scientific study of earthquakes. ALS observations collected at low altitude (500-1000 m) from a relatively slow (70-100 m/s) aircraft can provide dense (5-15 points/m2) sets of surface features (buildings, vegetation, ground) extending over hundreds of square kilometers, with turnaround times of several hours to a few days. The actual response time to any given event depends on several factors, including bureaucratic issues such as approval of funds, export license formalities, and clearance to fly over the area to be mapped; operational factors, such as the deployment of the aircraft and ground teams, may also take a number of days for remote locations. Of course, the need for immediate mapping of earthquake damage is generally less urgent in remote regions with less infrastructure and few inhabitants. During August 16-19, 2010, the National Center for Airborne Laser Mapping (NCALM) mapped the area affected by the magnitude 7.2 El Mayor-Cucapah Earthquake (Northern Baja California Earthquake), which occurred on April 4, 2010, and was felt throughout southern California, Arizona, Nevada, and Baja California North, Mexico. From initial ground observations the fault rupture appeared to extend 75 km

  10. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide (United States)

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy


    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program website, consolidated in a concise one-page report, and sent in near real time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
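    The core exposure estimate above is an aggregation of gridded population over shaking-intensity bins. A toy version (the intensity grid and population values are invented; PAGER's real calculation uses ShakeMap grids and global population data):

```python
# Sum population in each shaking-intensity bin to estimate how many
# people were exposed to each level of shaking.
def exposure_by_intensity(cells):
    """cells: iterable of (mmi, population); returns {rounded MMI: people}."""
    out = {}
    for mmi, pop in cells:
        key = round(mmi)
        out[key] = out.get(key, 0) + pop
    return out

grid = [(7.2, 12000), (6.8, 30000), (5.1, 250000), (4.4, 900000)]
exp_map = exposure_by_intensity(grid)
assert exp_map[7] == 42000                 # people exposed to MMI ~VII
assert sum(exp_map.values()) == sum(p for _, p in grid)
```

    Extending such a table with region-specific fatality rates per intensity bin is, in spirit, the step from exposure to the loss estimates mentioned at the end of the abstract.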

  11. Performance of Real-time Earthquake Information System in Japan (United States)

    Nakamura, H.; Horiuchi, S.; Wu, C.; Yamamoto, S.


    Horiuchi et al. (2005) developed a real-time earthquake information system (REIS) using Hi-net, a densely deployed nationwide seismic network consisting of about 800 stations operated by NIED, Japan. REIS determines hypocenter locations and earthquake magnitudes automatically within a few seconds after P waves arrive at the closest station and calculates focal mechanisms within about 15 seconds. The obtained hypocenter parameters are transferred immediately, in XML format, to a computer at the Japan Meteorological Agency (JMA), which started an EEW service for special users in June 2005. JMA also developed an EEW system using 200 stations, and the results of the two systems are merged. Among all first-issued EEW reports from both systems, REIS information accounts for about 80 percent. This study examines the rapidity and credibility of REIS by analyzing the 4050 earthquakes with magnitude larger than 3.0 that have occurred around the Japanese islands since 2005. REIS re-determines hypocenter parameters every second as waveform data are revised; here, we discuss only the first reports. On rapidity, our results show that about 44 percent of the first reports are issued within 5 seconds after the P waves arrive at the closest stations. Note that this 5-second window includes data packaging and transmission delays of about 2 seconds. REIS waits until two stations detect P waves for events inside the network, but four stations for events outside it, so as to obtain reliable solutions. For earthquakes with hypocentral distances less than 100 km, 55 percent are warned within 5 seconds and 87 percent within 10 seconds. Most events with long delays are small and triggered by S-wave arrivals. About 80 percent of events have epicenter differences of less than 20 km relative to JMA's manually determined locations.
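    The station-count rule described above (two triggered stations suffice for events inside the network, four are required outside it) can be stated as a one-line predicate; this is a paraphrase of the abstract, not NIED code:

```python
# REIS-style triggering rule: require more P-wave detections for events
# outside the network, where solutions are less well constrained.
def enough_picks(n_picks, inside_network):
    """Return True once enough stations have detected P waves to locate."""
    return n_picks >= (2 if inside_network else 4)

assert enough_picks(2, inside_network=True)
assert not enough_picks(3, inside_network=False)
assert enough_picks(4, inside_network=False)
```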
Because of the existence of large lateral heterogeneity in seismic velocity, the difference depends

  12. A rapid stability assessment of China's IGS sites after the Ms 7.0 Lushan earthquake

    Directory of Open Access Journals (Sweden)

    Meng Jie


    A rapid and accurate assessment of the stability of surveying and mapping reference points is important for post-disaster rescue, relief and reconstruction activities. Using Precise Point Positioning (PPP) technology, a rapid assessment of the stability of the IGS sites in China was performed after the Ms 7.0 Lushan earthquake, using rapid precise ephemeris and rapid precise satellite clock products. The results show that the earthquake had a very small impact and did not cause significant permanent deformation at the IGS sites. Most of the sites were unaffected and remained stable after the earthquake.

  13. Seismogeodetic monitoring techniques for tsunami and earthquake early warning and rapid assessment of structural damage (United States)

    Haase, J. S.; Bock, Y.; Saunders, J. K.; Goldberg, D.; Restrepo, J. I.


    As part of an effort to promote the use of NASA-sponsored Earth science information for disaster risk reduction, real-time high-rate seismogeodetic data are being incorporated into early warning and structural monitoring systems. Seismogeodesy combines seismic acceleration and GPS displacement measurements using a tightly coupled Kalman filter to provide absolute estimates of seismic acceleration, velocity and displacement. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. Real-time seismogeodetic observations at subduction zones allow for more robust and rapid magnitude and slip estimation that increases warning time in the near-source region. A NASA-funded effort to utilize GPS and seismogeodesy in NOAA's Tsunami Warning Centers in Alaska and Hawaii integrates new modules for picking, locating, and estimating magnitudes and moment tensors for earthquakes into the USGS Earthworm environment at the TWCs. In a related project, NASA supports the transition of this research to seismogeodetic tools for disaster preparedness, specifically by implementing GPS and low-cost MEMS accelerometers for structural monitoring in partnership with earthquake engineers. Real-time high-rate seismogeodetic structural monitoring has been implemented on two structures. The first is a parking garage at the Autonomous University of Baja California Faculty of Medicine in Mexicali, not far from the rupture of the 2010 Mw 7.2 El Mayor-Cucapah earthquake, enabled through a UCMexus collaboration. The second is the 8-story Geisel Library at the University of California, San Diego (UCSD). The system has also been installed for several proof-of-concept experiments at the UCSD Network for Earthquake Engineering Simulation (NEES) Large High Performance Outdoor Shake Table.
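    The tightly coupled combination described above can be illustrated with a one-dimensional Kalman filter in which the accelerometer drives the state prediction and the GPS displacement corrects it. The process/measurement noise values and time step below are illustrative placeholders, not the published filter parameters:

```python
# 1-D seismogeodetic Kalman filter sketch: state is [displacement, velocity],
# acceleration is the control input, GPS displacement is the measurement.
def kalman_step(x, P, accel, gps_disp, dt=0.01, q=1e-4, r=1e-4):
    """x = [displacement, velocity]; P = 2x2 covariance as nested lists."""
    # Predict: integrate the accelerometer reading.
    d, v = x
    d_pred = d + v * dt + 0.5 * accel * dt * dt
    v_pred = v + accel * dt
    P00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q
    P01 = P[0][1] + dt * P[1][1]
    P10 = P[1][0] + dt * P[1][1]
    P11 = P[1][1] + q
    # Update: GPS measures the displacement component directly.
    S = P00 + r
    K0, K1 = P00 / S, P10 / S
    innov = gps_disp - d_pred
    x_new = [d_pred + K0 * innov, v_pred + K1 * innov]
    P_new = [[(1 - K0) * P00, (1 - K0) * P01],
             [P10 - K1 * P00, P11 - K1 * P01]]
    return x_new, P_new

# Constant acceleration of 1 m/s^2 with consistent GPS displacements:
x, P = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
for k in range(1, 101):
    t = k * 0.01
    x, P = kalman_step(x, P, accel=1.0, gps_disp=0.5 * t * t)
assert abs(x[0] - 0.5) < 0.05   # displacement after 1 s of 1 m/s^2
```

    The payoff, as the abstract notes, is an absolute displacement estimate that neither sensor provides alone: integrating the accelerometer drifts, while GPS alone is too noisy at high rates.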
We present MEMS-based seismogeodetic observations from the 10 June

  14. Money matters: Rapid post-earthquake financial decision-making (United States)

    Wald, David J.; Franco, Guillermo


    Post-earthquake financial decision-making is a realm beyond that of many people. In the immediate aftermath of a damaging earthquake, billions of dollars of relief, recovery, and insurance funds are in the balance through new financial instruments that allow those with resources to hedge against disasters and those at risk to limit their earthquake losses and receive funds for response and recovery.

  15. Thumbnail‐based questionnaires for the rapid and efficient collection of macroseismic data from global earthquakes (United States)

    Bossu, Remy; Landes, Matthieu; Roussel, Frederic; Steed, Robert; Mazet-Roux, Gilles; Martin, Stacey S.; Hough, Susan E.


    The collection of earthquake testimonies (i.e., qualitative descriptions of felt shaking) is essential for macroseismic studies (i.e., studies gathering information on how strongly an earthquake was felt in different places), and when done rapidly and systematically, improves situational awareness and in turn can contribute to efficient emergency response. In this study, we present advances made in the collection of testimonies following earthquakes around the world using a thumbnail-based questionnaire implemented on the European-Mediterranean Seismological Centre (EMSC) smartphone app and its mobile-compatible website. In both instances, the questionnaire consists of a selection of thumbnails, each representing an intensity level of the European Macroseismic Scale 1998. We find that testimonies are collected faster, and in larger numbers, by way of thumbnail-based questionnaires than by more traditional online questionnaires. Responses were received from all seismically active regions of our planet, suggesting that thumbnails overcome language barriers. We also observed that the app is not sufficient on its own, because the websites are the main source of testimonies when an earthquake strikes a region for the first time in a while; it is only for subsequent shocks that the app is widely used. Notably though, the speed of the collection of testimonies increases significantly when the app is used. We find that automated EMSC intensities as assigned by user-specified thumbnails are, on average, well correlated with "Did You Feel It?" (DYFI) responses and with the three independently and manually derived macroseismic datasets, but there is a tendency for EMSC to be biased low with respect to DYFI at moderate and large intensities. We address this by proposing a simple adjustment that will be verified in future earthquakes.
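    The thumbnail-to-intensity assignment, with a simple upward adjustment at the moderate-to-large intensities where the low bias appears, might be sketched as follows (the offset value and the threshold are hypothetical placeholders, not the published correction):

```python
# Assign an EMS-98 intensity from a thumbnail choice, optionally applying
# a simple bias correction at moderate and large intensities.
def intensity_from_thumbnail(thumbnail_index, adjust=True):
    """Thumbnail i (1..12) maps directly to EMS-98 intensity i; optionally
    nudge moderate-to-large values upward to counter the low bias vs. DYFI."""
    if not 1 <= thumbnail_index <= 12:
        raise ValueError("EMS-98 defines intensities I to XII")
    intensity = float(thumbnail_index)
    if adjust and intensity >= 5:
        intensity += 0.5          # hypothetical bias-correction offset
    return intensity

assert intensity_from_thumbnail(3) == 3.0
assert intensity_from_thumbnail(6) == 6.5
```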

  16. The earthquake problem in engineering design: generating earthquake design basis information

    International Nuclear Information System (INIS)

    Sharma, R.D.


    Designing earthquake-resistant structures requires certain design inputs specific to the seismotectonic status of the region in which a critical facility is to be located. Generating these inputs requires collecting earthquake-related information using present-day techniques in seismology and geology, and processing the collected information to integrate it into a consolidated picture of the seismotectonics of the region. The earthquake problem in engineering design is outlined in the context of the seismic design of nuclear power plants vis-à-vis current state-of-the-art techniques. The extent to which the accepted procedures for assessing seismic risk in the region and generating the design inputs have been adhered to determines to a great extent the safety of the structures against future earthquakes. The document is a step towards developing an approach for generating these inputs, which form the earthquake design basis. (author)

  17. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake (United States)

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.


    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
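    The trigger-association step described above can be sketched in a few lines of Python; the `Trigger` record, the 5 s correlation window, and the four-station threshold are illustrative assumptions, not the actual QCN server logic.

```python
from dataclasses import dataclass

@dataclass
class Trigger:
    station: str
    time: float        # trigger time, seconds
    peak_accel: float  # peak ground acceleration, m/s^2

def associate(triggers, window=5.0, min_stations=4):
    """Group triggers whose times fall within `window` seconds of the
    first trigger in the cluster; declare an event when at least
    `min_stations` distinct stations agree."""
    triggers = sorted(triggers, key=lambda t: t.time)
    events, cluster = [], []
    for trig in triggers:
        if cluster and trig.time - cluster[0].time > window:
            if len({t.station for t in cluster}) >= min_stations:
                events.append(list(cluster))
            cluster = []
        cluster.append(trig)
    if len({t.station for t in cluster}) >= min_stations:
        events.append(list(cluster))
    return events
```

    A real associator would additionally check inter-station distances against plausible wave travel times before declaring an event.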

  18. Rapid acceleration leads to rapid weakening in earthquake-like laboratory experiments (United States)

    Chang, J. C.; Lockner, D. A.; Reches, Z.


    We simulated the slip of a fault-patch during a large earthquake by rapidly loading an experimental, ring-shaped fault with energy stored in a spinning flywheel. The flywheel abruptly delivers a finite amount of energy by spinning the fault-patch, which spontaneously dissipates the energy without operator intervention. We conducted 42 experiments on Sierra White granite (SWG) samples and 24 experiments on Kasota dolomite (KD) samples. Each experiment starts by spinning a 225 kg disk-shaped flywheel to a prescribed angular velocity. We refer to this experiment as an "earthquake-like slip-event" (ELSE). The strength evolution in ELSE experiments is similar to the strength evolution proposed for earthquake models and observed in stick-slip experiments. Further, we found that ELSE experiments are similar to earthquakes in at least three ways: (1) slip driven by the release of a finite amount of stored energy; (2) the pattern of fault-strength evolution; and (3) seismically observed values, such as average slip, peak velocity, and rise time. By assuming that the measured slip, D, in ELSE experiments is equivalent to the average slip during an earthquake, we found that ELSE experiments (D = 0.003-4.6 m) correspond to earthquakes in the moment-magnitude range Mw = 4-8. In ELSE experiments, the critical slip distance, dc, has mean values of 2.7 cm (SWG) and 1.2 cm (KD), much shorter than the 1-10 m in steady-state classical experiments in rotary shear systems. We attribute these dc values to ELSE loading, in which the fault-patch is abruptly loaded by impact with a spinning flywheel. Under this loading, the friction-velocity relations are strikingly different from those under steady-state loading on the same rock samples with the same shear system (Reches and Lockner, Nature, 2010). We further note that the slip acceleration in ELSE evolves systematically with fault strength and wear-rate, and that the dynamic weakening is restricted to the period of intense
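    As a hedged illustration of how an average slip D maps onto moment magnitude, the standard seismic-moment definition M0 = μAD can be combined with the Hanks-Kanamori relation; the rigidity value and the rupture-area choice in the example are assumptions, since the paper's Mw 4-8 range also depends on the rupture dimensions assumed for each D.

```python
import math

def moment_magnitude(avg_slip_m, rupture_area_m2, rigidity_pa=3.0e10):
    """Mw from average slip D and rupture area A via the seismic moment
    M0 = mu * A * D (N*m) and Mw = (2/3) * (log10(M0) - 9.1)."""
    m0 = rigidity_pa * rupture_area_m2 * avg_slip_m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# Example: 4.6 m of average slip over an assumed 100 km x 100 km patch
print(round(moment_magnitude(4.6, 100e3 * 100e3), 1))  # ~8.0
```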

  19. An efficient rapid warning system for earthquakes in the European - Mediterranean region

    International Nuclear Information System (INIS)

    Mazet-Roux, G.; Bossu, R; Tome, M.; Giovambattista, R. Di


    Every year a few damaging earthquakes occur in the European-Mediterranean region. It is therefore indispensable to operate a real-time warning system in order to rapidly provide reliable estimates of the location, depth, and magnitude of these seismic events. In order to provide this information in a timely manner both to the scientific community and to the European and national authorities dealing with natural hazards and relief organisations, the European-Mediterranean Seismological Centre (EMSC) has federated a network of seismic networks exchanging their data in quasi real-time. Today, thanks to the Internet, the EMSC receives real-time information about earthquakes from about thirty seismological institutes. As soon as data reach the EMSC, they are displayed on the EMSC Web pages. A seismic alert is generated for any potentially damaging earthquake in the European-Mediterranean region and disseminated within one hour of its occurrence. Potentially damaging earthquakes are defined as seismic events of magnitude 5 or above in the European-Mediterranean region. The utility of this EMSC service is clearly demonstrated by its following among the public: about 300 institutions or individuals (ECHO, NGOs, civil defence services, seismological institutes) have subscribed to the EMSC e-mail dissemination list, and the rate of Internet connections to the EMSC web site increases dramatically following an alert. The aim of this presentation is to give a complete technical description of the EMSC warning system. We also take this opportunity to thank each of the contributing institutions for their support and efforts to enhance the system's performance. (authors)

  20. Rapid estimation of the moment magnitude of large earthquake from static strain changes (United States)

    Itaba, S.


    The 2011 off the Pacific coast of Tohoku earthquake, of moment magnitude (Mw) 9.0, occurred on March 11, 2011. The preliminary magnitude of 7.9 that the Japan Meteorological Agency announced just after the earthquake, based on seismic waves, was considerably smaller than the actual value. On the other hand, using nine borehole strainmeters of the Geological Survey of Japan, AIST, we estimated a fault model with Mw 8.7 for the earthquake on the boundary between the Pacific and North American plates. This model could be estimated about seven minutes after the origin time, and five minutes after wave arrival. In order to grasp the magnitude of a great earthquake earlier, several methods are now being suggested to reduce earthquake disasters, including tsunami (e.g., Ohta et al., 2012). Our simple method of using strain steps is one robust approach for rapid estimation of the magnitude of great earthquakes.
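    The authors' actual procedure inverts the strain steps for a fault model; as a much cruder hedged sketch, a point-source scaling in which static strain decays as 1/r³ yields an order-of-magnitude Mw directly from a single strain step (the rigidity and the example numbers are assumptions, not values from the study):

```python
import math

def mw_from_strain_step(strain, distance_m, rigidity_pa=3.0e10):
    """Order-of-magnitude Mw from a static strain step observed at
    hypocentral distance r, using M0 ~ mu * strain * r^3 and
    Mw = (2/3) * (log10(M0) - 9.1) (Hanks-Kanamori)."""
    m0 = rigidity_pa * strain * distance_m ** 3
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# Example: a 1e-7 strain step recorded 400 km from the source
print(round(mw_from_strain_step(1.0e-7, 400e3), 1))  # ~7.5
```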

  1. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing


    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information, based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three criteria: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects." With an overall accuracy of 90.38%, the method achieves more scientific and accurate results than the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing data.
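    A toy sketch of the fuzzy-rule idea follows; the feature names, thresholds, and weights are purely illustrative assumptions, not the rule set established in the study.

```python
def classify_segment(seg):
    """Toy rule-based sketch: label an image segment as collapsed or
    intact from (brightness, texture_entropy, rectangularity), each
    normalized to [0, 1]. Thresholds and weights are illustrative."""
    brightness, texture_entropy, rectangularity = seg
    score = 0.0
    if texture_entropy > 0.7:   # collapsed debris is texturally chaotic
        score += 0.5
    if rectangularity < 0.4:    # intact roofs keep a rectangular shape
        score += 0.3
    if brightness > 0.6:        # exposed rubble appears bright in VHR imagery
        score += 0.2
    return "collapsed" if score >= 0.5 else "intact"
```

    In a real object-oriented workflow these rules would be applied per segment of the multi-scale hierarchy, with membership functions rather than hard thresholds.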

  2. The Technical Efficiency of Earthquake Medical Rapid Response Teams Following Disasters: The Case of the 2010 Yushu Earthquake in China. (United States)

    Liu, Xu; Tang, Bihan; Yang, Hongyang; Liu, Yuan; Xue, Chen; Zhang, Lulu


    Performance assessments of earthquake medical rapid response teams (EMRRTs), particularly the first responders deployed to the hardest hit areas following major earthquakes, should consider efficient and effective use of resources. This study assesses the daily technical efficiency of EMRRTs in the emergency period immediately following the 2010 Yushu earthquake in China. Data on EMRRTs were obtained from official daily reports of the general headquarters for Yushu earthquake relief, the emergency office of the National Ministry of Health, and the Health Department of Qinghai Province, for a sample of data on 15 EMRRTs over 62 days. Data envelopment analysis was used to examine the technical efficiency in a constant returns to scale model, a variable returns to scale model, and the scale efficiency of EMRRTs. Tobit regression was applied to analyze the effects of corresponding influencing factors. The average technical efficiency scores under constant returns to scale, variable returns to scale, and the scale efficiency scores of the 62 units of analysis were 77.95%, 89.00%, and 87.47%, respectively. The staff-to-bed ratio was significantly related to global technical efficiency. The date of rescue was significantly related to pure technical efficiency. The type of institution to which an EMRRT belonged and the staff-to-bed ratio were significantly related to scale efficiency. This study provides evidence that supports improvements to EMRRT efficiency and serves as a reference for earthquake emergency medical rapid assistance leaders and teams.

  3. The Technical Efficiency of Earthquake Medical Rapid Response Teams Following Disasters: The Case of the 2010 Yushu Earthquake in China

    Directory of Open Access Journals (Sweden)

    Xu Liu


    Purpose: Performance assessments of earthquake medical rapid response teams (EMRRTs), particularly the first responders deployed to the hardest hit areas following major earthquakes, should consider efficient and effective use of resources. This study assesses the daily technical efficiency of EMRRTs in the emergency period immediately following the 2010 Yushu earthquake in China. Methods: Data on EMRRTs were obtained from official daily reports of the general headquarters for Yushu earthquake relief, the emergency office of the National Ministry of Health, and the Health Department of Qinghai Province, for a sample of data on 15 EMRRTs over 62 days. Data envelopment analysis was used to examine the technical efficiency in a constant returns to scale model, a variable returns to scale model, and the scale efficiency of EMRRTs. Tobit regression was applied to analyze the effects of corresponding influencing factors. Results: The average technical efficiency scores under constant returns to scale, variable returns to scale, and the scale efficiency scores of the 62 units of analysis were 77.95%, 89.00%, and 87.47%, respectively. The staff-to-bed ratio was significantly related to global technical efficiency. The date of rescue was significantly related to pure technical efficiency. The type of institution to which an EMRRT belonged and the staff-to-bed ratio were significantly related to scale efficiency. Conclusions: This study provides evidence that supports improvements to EMRRT efficiency and serves as a reference for earthquake emergency medical rapid assistance leaders and teams.

  4. Flash sourcing, or rapid detection and characterization of earthquake effects through website traffic analysis

    Directory of Open Access Journals (Sweden)

    Laurent Frobert


    This study presents the latest developments of an approach called 'flash sourcing', which provides information on the effects of an earthquake within minutes of its occurrence. Information is derived from an analysis of the website traffic surges of the European-Mediterranean Seismological Centre website after felt earthquakes. These surges are caused by eyewitnesses to a felt earthquake, who are the first to be informed of, and hence the first concerned by, an earthquake's occurrence. Flash sourcing maps the felt area and, at least in some circumstances, the regions affected by severe damage or network disruption. We illustrate how the flash-sourced information improves and speeds up the delivery of public earthquake information, and beyond seismology, we consider what it can teach us about public responses when experiencing an earthquake. Future developments should improve the description of earthquake effects and potentially contribute to more efficient earthquake responses by filling the information gap after the occurrence of an earthquake.
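    The surge-detection idea can be sketched as a trailing-baseline threshold test on per-minute hit counts; the window length, multiplier, and absolute floor below are illustrative assumptions, not EMSC's operational values.

```python
from collections import deque

def detect_surge(hits_per_minute, baseline_window=30, factor=5.0, min_hits=100):
    """Flag the first minute whose hit count exceeds `factor` times the
    trailing-median baseline (plus an absolute floor), as a crude
    flash-sourcing trigger. Returns the surge index, or None."""
    history = deque(maxlen=baseline_window)
    for i, hits in enumerate(hits_per_minute):
        if len(history) == history.maxlen:
            baseline = sorted(history)[len(history) // 2]  # trailing median
            if hits >= min_hits and hits > factor * max(baseline, 1):
                return i
        history.append(hits)
    return None
```

    The median baseline makes the trigger robust to isolated spikes; the operational system would additionally geolocate visitors to map the felt area.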

  5. Real-time earthquake monitoring: Early warning and rapid response (United States)


    A panel was established to investigate the subject of real-time earthquake monitoring (RTEM) and suggest recommendations on the feasibility of using a real-time earthquake warning system to mitigate earthquake damage in regions of the United States. The findings of the investigation and the related recommendations are described in this report. A brief review of existing real-time seismic systems is presented with particular emphasis given to the current California seismic networks. Specific applications of a real-time monitoring system are discussed along with issues related to system deployment and technical feasibility. In addition, several non-technical considerations are addressed including cost-benefit analysis, public perceptions, safety, and liability.

  6. LastQuake: a comprehensive strategy for rapid engagement of earthquake eyewitnesses, massive crowdsourcing and risk reduction (United States)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Steed, R.; Frobert, L.


    LastQuake is a smartphone app, browser add-on, and the most sophisticated Twitter robot (QuakeBot) for earthquakes currently in operation. It fulfills eyewitnesses' needs by offering information on felt earthquakes and their effects within tens of seconds of their occurrence. Associated with an active presence on Facebook, Pinterest, and websites, this proves a very efficient engagement strategy. For example, the app was installed thousands of times after the Gorkha earthquake in Nepal. Language barriers have been erased by using visual communication; for example, felt reports are collected through a set of cartoons representing different shaking levels. Within 3 weeks of the magnitude 7.8 Gorkha earthquake, 7,000 felt reports with thousands of comments were collected for the mainshock and tens of its aftershocks, as well as 100 informative geo-located pictures. The QuakeBot was essential in making us so identifiable and in interacting with those affected. LastQuake is also a risk reduction tool, since it provides rapid information. Rapid information is akin to prevention: when it does not exist, disasters can happen. When no information is available after a felt earthquake, the public blocks emergency lines trying to find out the cause of the shaking, crowds form, potentially leading to unpredictable crowd movement, and rumors spread. In its next release, LastQuake will also provide people with guidance immediately after shaking through a number of pop-up cartoons illustrating "do/don't" items (go to open places, do not phone emergency services unless people are injured…). LastQuake's app design is simple and intuitive and has a global audience. It benefited from a crowdfunding campaign (and the support of the Fondation MAIF), and more improvements have been planned after an online feedback campaign organized in early June with the Gorkha earthquake eyewitnesses. LastQuake is also a seismic risk reduction tool thanks to its very rapid

  7. Research on Collection of Earthquake Disaster Information from the Crowd (United States)

    Nian, Z.


    In China, the assessment of earthquake disaster information is mainly based on the inversion of the seismic source mechanism and pre-calculated population data models; the real information on the earthquake disaster is usually collected through government departments, and both the accuracy and the speed need to be improved. In a massive earthquake like the one in Mexico, the telecommunications infrastructure on the ground was damaged, and the quake zone was difficult to observe by satellites and aircraft in the bad weather. Only a small amount of information was sent out through the maritime satellite of another country. Thus, the timely and effective development of disaster relief was seriously affected. Now that Chinese communication satellites are in orbit, people no longer rely only on ground telecom base stations to keep in communication with the outside world, open web pages, log in to social networking sites, release information, and transmit images and videos. This paper establishes an earthquake information collection system in which the public can participate. Through popular social platforms and other information sources, the public can participate in the collection of earthquake information and supply quake-zone information, including photos, video, etc., especially information produced by unmanned aerial vehicles (UAVs) after an earthquake; the public can use computers, portable terminals, or mobile text messages to participate in the earthquake information collection. In the system, the information is divided into earthquake-zone basic information, earthquake disaster reduction information, earthquake site information, post-disaster reconstruction information, etc., which are processed and put into a database. The quality of the data is analyzed against multi-source information and checked against local public reports, to supplement the data collected by government departments in a timely manner and implement the calibration of simulation results, which will better guide

  8. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses (United States)

    Marano, K.D.; Wald, D.J.; Allen, T.I.


    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER's overall assessment of earthquakes losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference of the Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  9. Earthquakes (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  10. National Earthquake Information Center Seismic Event Detections on Multiple Scales (United States)

    Patton, J.; Yeck, W. L.; Benz, H.; Earle, P. S.; Soto-Cordero, L.; Johnson, C. E.


    The U.S. Geological Survey National Earthquake Information Center (NEIC) monitors seismicity on local, regional, and global scales using automatic picks from more than 2,000 near-real-time seismic stations. This presents unique challenges in automated event detection due to the high variability in data quality, network geometries and density, and distance-dependent variability in observed seismic signals. To lower the overall detection threshold while minimizing false detection rates, NEIC has begun to test the incorporation of new detection and picking algorithms, including multiband (Lomax et al., 2012) and kurtosis (Baillard et al., 2014) pickers, and a new Bayesian associator (Glass 3.0). The Glass 3.0 associator allows for simultaneous processing of variably scaled detection grids, each with a unique set of nucleation criteria (e.g., nucleation threshold, minimum associated picks, nucleation phases) to meet specific monitoring goals. We test the efficacy of these new tools on event detection in networks of various scales and geometries, compare our results with previous catalogs, and discuss lessons learned. For example, we find that on local and regional scales, rapid nucleation of small events may require event nucleation with both P and higher-amplitude secondary phases (e.g., S or Lg). We provide examples of the implementation of a scale-independent associator for an induced seismicity sequence (local scale), a large aftershock sequence (regional scale), and for monitoring global seismicity. Baillard, C., Crawford, W. C., Ballu, V., Hibert, C., & Mangeney, A. (2014). An automatic kurtosis-based P- and S-phase picker designed for local seismic networks. Bulletin of the Seismological Society of America, 104(1), 394-409. Lomax, A., Satriano, C., & Vassallo, M. (2012). Automatic picker developments and optimization: FilterPicker - a robust, broadband picker for real-time seismic monitoring and earthquake early warning. Seism. Res. Lett., 83, 531-540, doi: 10

  11. Rapid Large Earthquake and Run-up Characterization in Quasi Real Time (United States)

    Bravo, F. J.; Riquelme, S.; Koch, P.; Cararo, S.


    Several tests in quasi real time have been conducted by the rapid response group at the CSN (National Seismological Center) to characterize earthquakes in real time. These methods are known for their robustness and reliability in creating finite fault models (FFMs). The W-phase FFM inversion, the wavelet-domain FFM, and the body-wave FFM have been implemented in real time at the CSN; all these algorithms run automatically, triggered by the W-phase point-source inversion. Dimensions (length and width) are predefined by adopting scaling laws for earthquakes in subduction zones. We tested this scheme on the last four major earthquakes that occurred in Chile: the 2010 Mw 8.8 Maule earthquake, the 2014 Mw 8.2 Iquique earthquake, the 2015 Mw 8.3 Illapel earthquake, and the Mw 7.6 Melinka earthquake. We obtain many solutions as time elapses, and for each one we calculate the run-up using an analytical formula. Our results are in agreement with FFMs already accepted by the scientific community, as well as with run-up observations in the field.

  12. Real-Time GPS Monitoring for Earthquake Rapid Assessment in the San Francisco Bay Area (United States)

    Guillemot, C.; Langbein, J. O.; Murray, J. R.


    The U.S. Geological Survey Earthquake Science Center has deployed a network of eight real-time Global Positioning System (GPS) stations in the San Francisco Bay area and is implementing software applications to continuously evaluate the status of the deformation within the network. Real-time monitoring of the station positions is expected to provide valuable information for rapidly estimating source parameters should a large earthquake occur in the San Francisco Bay area. Because earthquake response applications require robust data access, as a first step we have developed a suite of web-based applications which are now routinely used to monitor the network's operational status and data streaming performance. The web tools provide continuously updated displays of important telemetry parameters such as data latency and receive rates, as well as source voltage and temperature information within each instrument enclosure. Automated software on the backend uses the streaming performance data to mitigate the impact of outages, radio interference and bandwidth congestion on deformation monitoring operations. A separate set of software applications manages the recovery of lost data due to faulty communication links. Displacement estimates are computed in real-time for various combinations of USGS, Plate Boundary Observatory (PBO) and Bay Area Regional Deformation (BARD) network stations. We are currently comparing results from two software packages (one commercial and one open-source) used to process 1-Hz data on the fly and produce estimates of differential positions. The continuous monitoring of telemetry makes it possible to tune the network to minimize the impact of transient interruptions of the data flow, from one or more stations, on the estimated positions. Ongoing work is focused on using data streaming performance history to optimize the quality of the position, reduce drift and outliers by switching to the best set of stations within the network, and

  13. Conceptualizing the Abstractions of Earthquakes Through an Instructional Sequence Using SeisMac and the Rapid Earthquake Viewer (United States)

    Taber, J.; Hubenthal, M.; Wysession, M.


    Newsworthy earthquakes provide an engaging hook for students in Earth science classes, particularly when discussing their effects on people and the landscape. However, engaging students in an analysis of earthquakes that extends beyond death and damage is frequently hampered by the abstraction of recorded ground-motion data in the form of raw seismograms and the inability of most students to personally relate to ground accelerations. To overcome these challenges, an educational sequence has been developed using two software tools: SeisMac by Daniel Griscom, and the Rapid Earthquake Viewer (REV) developed by the University of South Carolina in collaboration with IRIS and DLESE. This sequence presents a unique opportunity for Earth science teachers to "create" foundational experiences for students as they construct a framework for understanding abstract concepts. The first activity is designed to introduce the concept of a three-component seismogram and to directly address the very abstract nature of seismograms through a kinesthetic experience. Students first learn to take the pulse of their classroom through a guided exploration of SeisMac, which displays the output of the laptop's built-in Sudden Motion Sensor (a 3-component accelerometer). This exploration allows students to view a 3-component seismogram as they move or tap the laptop and encourages them to propose and carry out experiments to explain the meaning of the 3-component seismogram. Once that is completed, students are asked to apply this new knowledge to a real 3-component seismogram printed from REV. Next, the activity guides students through the process of identifying P and S waves, using SeisMac to connect the physical motion of the laptop to the "wiggles" they see on the SeisMac display, and then comparing those to the "wiggles" they see on their seismogram. At this point students are more fully prepared to engage in an S-P location exercise such as those included in many state standards.
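    The S-P location exercise rests on one relation: with assumed P- and S-wave speeds, the epicentral distance is d = Δt·vp·vs/(vp − vs). A minimal sketch, where the crustal velocities are typical textbook assumptions rather than values from the activity:

```python
def sp_distance_km(sp_seconds, vp=6.0, vs=3.5):
    """Epicentral distance from the S-P arrival-time difference, using
    d = dt * vp * vs / (vp - vs); vp and vs are in km/s."""
    return sp_seconds * vp * vs / (vp - vs)

# A 10 s S-P delay places the source roughly 84 km away
print(round(sp_distance_km(10.0)))  # 84
```

    With distances from three or more stations, students locate the epicenter by intersecting the corresponding circles on a map.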

  14. Utilizing Information Technology to Facilitate Rapid Acquisition (United States)


    SUBJECT TERMS: Rapid Acquisition, eCommerce, eProcurement, Information Technology, Contracting, Global Information Network...Agency. (Only fragmented excerpts of this report are available, concerning eCommerce, eProcurement, and information-technology knowledge in rapid acquisition, and its primary research questions.)

  15. Scientific Information Platform for the 2008 Great Wenchuan Earthquake (United States)

    Liang, C.


    The 2008 MS 8.0 Wenchuan earthquake is one of the deadliest in recent human history. This earthquake not only united the whole world in helping local people through the difficult time, it also fostered significant global cooperation to study the event from various aspects, including pre-seismic phenomena (such as seismicity, gravity, electromagnetic fields, well water levels, radon levels in water, etc.), co-seismic phenomena (fault slip, landslides, damage to man-made structures, etc.), and post-seismic phenomena (such as aftershocks, changing well water levels, etc.), as well as the disaster relief efforts. In the last four years, more than 300 scientific articles have been published in peer-reviewed journals; among them about 50% are published in Chinese, 30% in English, and about 20% in both languages. These studies have advanced our understanding of earthquake science in general. They have also sparked open debates in many aspects. Notably, the role of the Zipingpu reservoir (built not long before the earthquake) in triggering this monstrous earthquake is still one of many continuing debates. Given that all these articles are spread sporadically across different journals and numerous issues and in different languages, it can be very inefficient, sometimes impossible, to dig out the information that is needed. The Earthquake Research Group at the Chengdu University of Technology (ERGCDUT) has initiated an effort to develop an information platform to collect and analyze scientific research on or related to this earthquake, the hosting faults, and the surrounding tectonic regions. A preliminary website has been set up for this purpose. Up to this point (July 2012), articles published in 6 Chinese journals and 7 international journals have been collected. Articles are listed journal by journal and also grouped by content into four major categories, including pre-seismic events, co-seismic events, and post-seismic events

  16. Designing and Implementing a Retrospective Earthquake Detection Framework at the U.S. Geological Survey National Earthquake Information Center (United States)

    Patton, J.; Yeck, W.; Benz, H.


    The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking, and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data, as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival-time picking, social-media-based event detection, and automatic association of different seismic detection data into earthquake events. In addition, this framework enables retrospective detection processing, such as automated S-wave arrival-time picking for a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and verification of aftershock and induced-sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. The same infrastructure also provides an improved and convenient structure for access to automatic detection data for both research and algorithm development.
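The correlation-based detection named above can be illustrated with a toy, single-channel sketch in pure NumPy (function and parameter names are ours, not NEIC code; real detectors operate on multi-channel, band-passed data): a waveform template slides along continuous data, and windows whose normalized cross-correlation exceeds a threshold are flagged.

```python
import numpy as np

def correlation_detector(stream, template, threshold=0.7):
    """Slide a waveform template along a continuous trace and flag
    windows whose normalized cross-correlation exceeds `threshold`.
    Returns a list of (sample_index, correlation) pairs."""
    n = len(template)
    t = template - template.mean()
    t = t / (np.linalg.norm(t) + 1e-12)  # unit-norm, zero-mean template
    detections = []
    for i in range(len(stream) - n + 1):
        w = stream[i:i + n] - stream[i:i + n].mean()
        denom = np.linalg.norm(w)
        if denom == 0.0:
            continue  # flat window, correlation undefined
        cc = float(np.dot(w / denom, t))
        if cc >= threshold:
            detections.append((i, cc))
    return detections
```

Production systems compute this with FFT-based convolution for speed; the quadratic loop above is only meant to show the arithmetic.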

  17. An Earthquake Information Service with Free and Open Source Tools (United States)

    Schroeder, M.; Stender, V.; Jüngling, S.


    At the GFZ German Research Centre for Geosciences in Potsdam, the working group Earthquakes and Volcano Physics examines the spatiotemporal behavior of earthquakes. In this context, the hazards of volcanic eruptions and tsunamis are also explored. The aim is to collect related information after the occurrence of such extreme events and to make it available to science, and partly to the public, as quickly as possible. The overall objective of this research is to reduce the geological risks that emanate from such natural hazards. In order to meet these objectives, to get a quick overview of the seismicity of a particular region, and to compare the situation with historical events, a comprehensive visualization was desired. Based on the web-accessible data from the well-known GFZ GEOFON network, a user-friendly web mapping application was realized. This web service also integrates historical and current earthquake information from the USGS earthquake database, and further historical events from various other catalogues such as Pacheco and the International Seismological Centre (ISC). This compilation of sources is unique in the Earth sciences. Additionally, information about historical and current occurrences of volcanic eruptions and tsunamis is retrievable. Another special feature of the application is the filtering of events by time via a time-shifting tool: users can interactively vary the visualization by moving the time slider. Furthermore, the application was realized using the newest JavaScript libraries, which enables it to run on displays and devices of all sizes. Our contribution will present the making of, the architecture behind, and a few examples of the look and feel of this application.

  18. Rapid Modeling of and Response to Large Earthquakes Using Real-Time GPS Networks (Invited) (United States)

    Crowell, B. W.; Bock, Y.; Squibb, M. B.


    Real-time GPS networks have the advantage of capturing motions throughout the entire earthquake cycle (interseismic, seismic, coseismic, postseismic) and, because of this, are ideal for real-time monitoring of fault slip. Real-time GPS networks are the perfect supplement to seismic networks, which operate with lower noise and higher sampling rates than GPS networks but only measure accelerations or velocities, putting them at a distinct disadvantage for ascertaining the full extent of slip during a large earthquake in real time. Here we report on two examples of rapid modeling of recent large earthquakes near large regional real-time GPS networks. The first utilizes Japan's GEONET, consisting of about 1200 stations, during the 2003 Mw 8.3 Tokachi-Oki earthquake about 100 km offshore of Hokkaido Island; the second investigates the 2010 Mw 7.2 El Mayor-Cucapah earthquake, recorded by more than 100 stations in the California Real Time Network. The principal components of strain were computed throughout the networks and used as a trigger to initiate earthquake modeling. Total displacement waveforms were then computed in a simulated real-time fashion using a real-time network adjustment algorithm that fixes a station far from the rupture to obtain a stable reference frame. Initial peak ground displacement measurements can then be used to obtain an initial size estimate through scaling relationships. Finally, a full coseismic model of the event can be run minutes after the event, given predefined fault geometries, allowing emergency first responders and researchers to pinpoint the regions of highest damage. Furthermore, we are also investigating the use of total displacement waveforms for real-time moment tensor inversions to examine spatiotemporal variations in slip.
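The peak-ground-displacement scaling step can be sketched as follows. Published GPS PGD scaling studies use the form log10(PGD) = A + B·Mw + C·Mw·log10(R); the coefficient values below are representative of that literature but should be treated as illustrative, not as the authors' exact regression.

```python
import math

def mw_from_pgd(pgd_cm, dist_km, A=-5.013, B=1.219, C=-0.178):
    """Invert a PGD scaling law, log10(PGD) = A + B*Mw + C*Mw*log10(R),
    for moment magnitude.  PGD is in cm, hypocentral distance R in km.
    Coefficients are illustrative values from the GPS PGD literature."""
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(dist_km))
```

With these coefficients, a PGD of about 78 cm observed 100 km from the source maps to Mw ≈ 8, which is the kind of rapid first estimate the scaling step provides before a full coseismic inversion.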

  19. Rapid earthquake hazard and loss assessment for Euro-Mediterranean region (United States)

    Erdik, Mustafa; Sesetyan, Karin; Demircioglu, Mine; Hancilar, Ufuk; Zulfikar, Can; Cakti, Eser; Kamer, Yaver; Yenidogan, Cem; Tuzun, Cuneyt; Cagnan, Zehra; Harmandar, Ebru


    The near-real-time estimation of ground shaking and losses after a major earthquake in the Euro-Mediterranean region was performed in the framework of the Joint Research Activity 3 (JRA-3) component of the EU FP6 project "Network of Research Infrastructures for European Seismology" (NERIES). The approach consists of finding the most likely location of the earthquake source by estimating the fault rupture parameters through rapid inversion of data from on-line regional broadband stations. It also includes estimation of the spatial distribution of selected site-specific ground motion parameters at engineering bedrock through region-specific ground motion prediction equations (GMPEs) or physical simulation of ground motion. Using the Earthquake Loss Estimation Routine (ELER) software, the multi-level methodology developed for real-time loss estimation is capable of incorporating regional variability and the sources of uncertainty stemming from GMPEs, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships.
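A GMPE of the kind referred to above typically has a simple functional form in magnitude and distance. The sketch below uses the common shape ln(PGA) = c1 + c2·M + c3·ln(R + c4) with placeholder coefficients, purely to show the structure; it is not any published regional model, let alone the region-specific GMPEs used in NERIES.

```python
import math

def pga_gmpe(mag, r_km, c1=-3.5, c2=0.9, c3=-1.2, c4=10.0):
    """Illustrative ground motion prediction equation of the common
    functional form ln(PGA[g]) = c1 + c2*M + c3*ln(R + c4).
    Coefficients are placeholders, not a fitted regional model."""
    return math.exp(c1 + c2 * mag + c3 * math.log(r_km + c4))
```

Real GMPEs add site terms, style-of-faulting terms, and magnitude-dependent attenuation, but they retain this basic scaling: amplitude grows with magnitude and decays with distance.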

  20. End-User Applications of Real-Time Earthquake Information in Europe (United States)

    Cua, G. B.; Gasparini, P.; Giardini, D.; Zschau, J.; Filangieri, A. R.; Reakt Wp7 Team


    The primary objective of the European FP7 project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction) is to improve the efficiency of real-time earthquake risk mitigation methods and their capability of protecting structures, infrastructures, and populations. REAKT aims to address the issues of real-time earthquake hazard and response from end to end, with efforts directed along the full spectrum of methodology development in earthquake forecasting, earthquake early warning, and real-time vulnerability systems, through optimal decision-making, and engagement and cooperation of scientists and end users to establish best practices for the use of real-time information. Twelve strategic test cases/end users throughout Europe have been selected. This diverse group of applications/end users includes civil protection authorities, railway systems, hospitals, schools, industrial complexes, nuclear plants, lifeline systems, national seismic networks, and critical structures. The scale of target applications covers a wide range: from two school complexes in Naples, to individual critical structures such as the Rion-Antirion bridge in Patras and the Fatih Sultan Mehmet bridge in Istanbul, to large complexes such as the SINES industrial complex in Portugal and the Thessaloniki port area, to distributed lifeline and transportation networks and nuclear plants. Some end users are interested in in-depth feasibility studies for the use of real-time information and the development of rapid response plans, while others intend to install real-time instrumentation and develop customized automated control systems. From the outset, REAKT scientists and end users will work together on concept development and initial implementation using the data products and decision-making methodologies developed, with the goal of improving end-user risk mitigation. The aim of this scientist/end-user partnership is to ensure that scientific efforts are applicable to operational

  1. Insight into the Earthquake Risk Information Seeking Behavior of the Victims: Evidence from Songyuan, China

    Directory of Open Access Journals (Sweden)

    Shasha Li


    Efficient risk communication is a vital way to reduce the vulnerability of individuals facing emergency risks, especially earthquakes. It aims at improving the supply of risk information and fulfilling individuals' need for risk information. An investigation of individual-level information seeking behavior in earthquake risk contexts is therefore very important for improved earthquake risk communication. At present, however, very few studies have explored how individuals seek earthquake risk information. Guided by the Risk Information Seeking and Processing model and relevant practical findings, and using a structural equation model, this study attempts to identify the main determinants of an individual's earthquake risk information seeking behavior and to validate the mediating effect of information need during the seeking process. A questionnaire-based survey of 918 valid respondents in Songyuan, China, which had been hit by a small earthquake swarm, provides the empirical evidence for this study. Results indicate that information need played a noteworthy role in the earthquake risk information seeking process, acting both as an immediate predictor and as a mediator. Informational subjective norms drive earthquake risk information seeking through both direct and indirect paths. Perceived information gathering capacity, negative affective responses, and risk perception have an indirect effect on seeking behavior via information need. The implications for risk communication theory and practice are discussed.

  2. Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information (United States)

    Thompson, K. J.; Krantz, D. H.


    The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as the WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternative ways of presenting hazard data, to determine which presentation format most effectively conveys information from scientists to the public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazard map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of the factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk.

  3. Rapid modeling of complex multi-fault ruptures with simplistic models from real-time GPS: Perspectives from the 2016 Mw 7.8 Kaikoura earthquake (United States)

    Crowell, B.; Melgar, D.


    The 2016 Mw 7.8 Kaikoura earthquake is one of the most complex earthquakes in recent history, rupturing across at least 10 disparate faults with varying faulting styles and exhibiting intricate surface deformation patterns. The complexity of this event has motivated multidisciplinary geophysical studies of the underlying source physics, to better inform earthquake hazard models in the future. However, events like Kaikoura raise the question of how well (or how poorly) such earthquakes can be modeled automatically in real time while still serving the general public and emergency managers. To investigate this question, we perform a retrospective real-time GPS analysis of the Kaikoura earthquake with the G-FAST early warning module. We first compute simple point-source models of the earthquake using peak ground displacement scaling and a coseismic-offset-based centroid moment tensor (CMT) inversion. We predict ground motions based on these point sources, as well as on simple finite faults determined from source scaling studies, and validate against true recordings of peak ground acceleration and velocity. Second, we perform a slip inversion based on the CMT fault orientations and forward-model near-field maximum expected tsunami wave heights for comparison against available tide gauge records. We find remarkably good agreement between recorded and predicted ground motions when using a simple fault plane, with the majority of the disagreement in ground motions attributable to local site effects, not earthquake source complexity. Similarly, the near-field maximum tsunami amplitude predictions match tide gauge records well. We conclude that even though our models of the Kaikoura earthquake are devoid of its rich source complexities, the CMT-driven finite fault is a good enough "average" source and provides useful constraints for rapid forecasting of ground motion and near-field tsunami amplitudes.

  4. A new quantitative method for the rapid evaluation of buildings against earthquakes

    International Nuclear Information System (INIS)

    Mahmoodzadeh, Amir; Mazaheri, Mohammad Mehdi


    At present there are numerous weak buildings that are not able to withstand earthquakes. At the same time, both private and public developers are trying to use scientific methods to prioritize and allocate budgets for reinforcing these structures, given limited financial resources and time. In recent years, the procedure of seismic assessment before rehabilitation of vulnerable buildings has been implemented in many countries. It now seems logical to reinforce the existing procedures with the mass of available data about the effects of earthquakes on buildings. The main idea is derived from FMEA (Failure Mode and Effect Analysis) in quality management, where the main procedure is to recognize each failure, its causes, and the priority of each cause and failure. By specifying the causes and effects that lead to a certain shortcoming in structural behavior during earthquakes, an inventory is developed and each building is rated through a yes-or-no procedure. In this way, the rating of the structure is based on standard forms which, along with relative weights, are developed in this study. The criteria resulting from the rapid assessment indicate whether the structure is to be demolished, has high, medium, or low vulnerability, or is invulnerable.

  5. Study on the Forecast of Ground Motion Parameters from Real Time Earthquake Information Based on Wave Form Data at the Front Site


    Hagiwara, Yoshinori; Motosaka, Masato; Mitsuji, Kazuya; Nobata, Arihide (Obayashi Corporation Technical Research Institute; Graduate School of Engineering, Tohoku University; Faculty of Education, Art and Science, Yamagata University)


    The Japan Meteorological Agency (JMA) has provided Earthquake Early Warnings (EEW) to advanced users since August 1, 2006. Advanced EEW users can forecast seismic ground motion parameters (e.g., seismic intensity, peak ground acceleration) from the earthquake information in an EEW, but there are limits to the accuracy and timeliness of the forecasting. This paper describes a regression equation that decreases the error and increases the rapidity of the forecast of ground motion parameters from Real Time Earth...

  6. Rapid estimation of the moment magnitude of the 2011 off the Pacific coast of Tohoku earthquake from coseismic strain steps (United States)

    Itaba, S.; Matsumoto, N.; Kitagawa, Y.; Koizumi, N.


    The 2011 off the Pacific coast of Tohoku earthquake, of moment magnitude (Mw) 9.0, occurred at 14:46 Japan Standard Time (JST) on March 11, 2011. The coseismic strain steps caused by the fault slip of this earthquake were observed in the Tokai, Kii Peninsula, and Shikoku regions by borehole strainmeters carefully installed by the Geological Survey of Japan, AIST. Using these strain steps, we estimated a fault model for the earthquake on the boundary between the Pacific and North American plates. Our model, which is estimated from only several minutes of strain data, is largely consistent with the final fault models estimated from GPS and seismic wave data. The moment magnitude can be estimated about 6 minutes after the origin time, and 4 minutes after wave arrival. According to the fault model, the moment magnitude of the earthquake is 8.7. In contrast, the prompt magnitude based on seismic waves, which the Japan Meteorological Agency announced just after the earthquake occurred, was 7.9. Coseismic strain steps are generally considered less reliable than seismic waves and GPS data. However, our results show that coseismic strain steps observed by carefully installed and monitored borehole strainmeters are reliable enough to determine the earthquake magnitude precisely and rapidly. Several methods are now being proposed to grasp the magnitude of a great earthquake earlier and thereby reduce earthquake disasters, including tsunami. Our simple method using strain steps is a strong candidate for rapid estimation of the magnitude of great earthquakes.
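The magnitudes quoted above follow the standard moment magnitude definition, which can be evaluated directly once the seismic moment M0 has been obtained from a fault model (this is the textbook Hanks-Kanamori relation, not the authors' inversion code):

```python
import math

def moment_magnitude(m0_newton_meters):
    """Standard moment magnitude (Hanks & Kanamori):
    Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)
```

For example, a seismic moment of about 3.9 x 10^22 N*m corresponds to Mw ≈ 9.0; the logarithmic definition is why the JMA's prompt 7.9 and the final 9.0 differ by a factor of roughly 45 in moment.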

  7. Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT): Towards the Next Generation of Internship (United States)

    Perry, S.; Benthien, M.; Jordan, T. H.


    The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others majoring in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), a 3D visualization software package that was prototyped by interns the previous year using Java3D and an extensible plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

  8. Letter to the Editor : Rapidly-deployed small tent hospitals: lessons from the earthquake in Haiti.

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, Y.; Gurman, P.; Verna, E.; Elman, N.; Labor, E. (Materials Science Division); (Superior NanoBioSystems LLC); (Fast Israeli Rescue & Search Team); (Clinique Adonai); (Mass. Inst. Tech.); (Univ. Haifa)


    The damage to medical facilities resulting from the January 2010 earthquake in Haiti necessitated the establishment of field tent hospitals. Much of the local medical infrastructure was destroyed or operationally limited when the Fast Israeli Rescue and Search Team (FIRST) arrived in Haiti shortly after the January 2010 earthquake. FIRST deployed small tent hospitals in Port-au-Prince and in 11 remote areas outside the city. Each tent was set up in less than half an hour. The tents were staffed with an orthopedic surgeon, gynecologists, primary care and emergency care physicians, a physician with previous experience in tropical medicine, nurses, paramedics, medics, and psychologists. The rapidly deployable and temporary nature of the effort allowed the team to treat, educate, and provide supplies for thousands of refugees throughout Haiti. In addition, a local Haitian physician and his team created a small tent hospital to serve the Petion Refugee Camp and its environs. FIRST personnel also took shifts at this hospital.

  9. Information-Theoretic Framework for Earthquake Recurrence Models: Methodica Firma Per Terra Non-Firma

    International Nuclear Information System (INIS)

    Esmer, Oezcan


    This paper first evaluates the earthquake prediction method (1999) used by the US Geological Survey as the lead example, and also reviews the recent models. Secondly, it points out the ongoing debate on the predictability of earthquake recurrences and lists the main claims of both sides. The traditional methods and the 'frequentist' approach used in determining earthquake probabilities cannot end the complaints that earthquakes are unpredictable. It is argued that the prevailing 'crisis' in seismic research corresponds to a pre-MaxEnt age. This period of Kuhnian 'crisis' should give rise to a new paradigm based on an information-theoretic framework encompassing the inverse problem, MaxEnt, and Bayesian methods. The paper aims to show that information-theoretic methods can provide the required 'Methodica Firma' for earthquake prediction models.

  10. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins (United States)

    Zollo, Aldo


    The local magnitude (Ml) is estimated from the maximum amplitude of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic wave coda duration. Starting from the magnitude estimates, other relevant pieces of information are also computed, such as the corner frequency, the seismic moment, the source radius, and the seismic energy. Ground-shaking maps on a Google map are produced for peak ground acceleration (PGA), peak ground velocity (PGV), and instrumental intensity (in SHAKEMAP® format), or a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, the automatic discrimination between local earthquakes that occurred within the network and regional/teleseismic events that occurred outside the network is performed. Finally, for the largest events, if a sufficient number of P-wave polarity readings are available, the focal mechanism is also computed. For each event, all of the available pieces of information are stored in a local database, and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with the event location and stations, as well as a table listing all the events with their associated parameters. The catalogue fields are the event ID, origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded to the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level seismic data analysis, and it may represent a relevant tool not only for seismologists but also for non-seismologists.
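A duration magnitude of the kind computed here has the generic form Md = a + b·log10(t_coda) + c·R. The coefficients below are classic California values from the early coda-duration literature and are in reality station- and network-specific, so they are shown only as an illustration of the calculation, not as this tool's calibration:

```python
import math

def duration_magnitude(coda_seconds, dist_km, a=-0.87, b=2.0, c=0.0035):
    """Coda-duration magnitude, Md = a + b*log10(t_coda) + c*R.
    Coefficients are illustrative (they must be calibrated per network)."""
    return a + b * math.log10(coda_seconds) + c * dist_km
```

For example, a 100 s coda recorded 10 km from the source gives Md ≈ 3.2 with these coefficients, which is why the measurement can be made "rapidly": only the coda end-time and an epicentral distance are needed.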

  11. An overview of the National Earthquake Information Center acquisition software system, Edge/Continuous Waveform Buffer (United States)

    Patton, John M.; Ketchum, David C.; Guy, Michelle R.


    This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.

  12. Rapid Extraction of Landslide and Spatial Distribution Analysis after Jiuzhaigou Ms7.0 Earthquake Based on Uav Images (United States)

    Jiao, Q. S.; Luo, Y.; Shen, W. H.; Li, Q.; Wang, X.


    The Jiuzhaigou earthquake led to the collapse of mountains and triggered numerous landslides in the Jiuzhaigou scenic area and along surrounding roads, causing road blockage and serious ecological damage. Due to the urgency of the rescue, the authors carried an unmanned aerial vehicle (UAV) into the disaster area as early as August 9 to obtain aerial images near the epicenter. After summarizing the characteristics of earthquake landslides in aerial images, and using an object-oriented analysis method, landslide image objects were obtained by multi-scale segmentation, and the feature rule set of each level was automatically built with the SEaTH (Separability and Thresholds) algorithm to realize rapid landslide extraction. Compared with visual interpretation, the object-oriented automatic landslide extraction method achieved an accuracy of 94.3%. The spatial distribution of the earthquake landslides had a significant positive correlation with slope and relief, a negative correlation with roughness, and no obvious correlation with aspect; the probable reason for the latter may be that the distance between the study area and the seismogenic fault was too great. This work provided technical support for earthquake field emergency response, earthquake landslide prediction, and disaster loss assessment.
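The SEaTH algorithm ranks candidate object features by the Jeffries-Matusita (J-M) separability of the two classes (here, landslide vs. non-landslide), modeling each feature as a 1-D Gaussian per class; a minimal sketch of that separability measure, assuming the standard Bhattacharyya-based formulation:

```python
import math

def jeffries_matusita(mean1, var1, mean2, var2):
    """Jeffries-Matusita separability between two classes whose feature
    values are modeled as 1-D Gaussians (as used in SEaTH for feature
    ranking).  Ranges from 0 (inseparable) to 2 (fully separable)."""
    # Bhattacharyya distance between the two Gaussians
    b = (0.25 * (mean1 - mean2) ** 2 / (var1 + var2)
         + 0.5 * math.log((var1 + var2) / (2.0 * math.sqrt(var1 * var2))))
    return 2.0 * (1.0 - math.exp(-b))
```

Features with J-M values near 2 are kept, and a threshold between the two class distributions then yields the per-level classification rule.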



  14. Turning the rumor of the May 11, 2011 earthquake prediction in Rome, Italy, into an information day on earthquake hazard (United States)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team


    A devastating earthquake had been predicted for May 11, 2011 in Rome. This prediction was never released officially by anyone, but it grew on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions. Indeed, around May 11, 2011, a planetary alignment was expected, and this contributed to giving the earthquake prediction credibility among the public. During the preceding months, INGV was overwhelmed with requests for information about this supposed prediction from Roman inhabitants and tourists. Given the considerable media impact of this expected earthquake, INGV decided to organize an Open Day at its headquarters in Rome for people who wanted to learn more about Italian seismicity and earthquakes as natural phenomena. The Open Day was preceded by a press conference two days before, in which we talked about the prediction, presented the Open Day, and had a scientific discussion with journalists about earthquake prediction and, more generally, about the real problem of seismic risk in Italy. About 40 journalists from newspapers, local and national TV stations, press agencies, and web news outlets attended the press conference, and hundreds of articles appeared in the following days, advertising the May 11 Open Day. The INGV opened to the public all day long (9 am - 9 pm) with the following program: i) meetings with INGV researchers to discuss scientific issues; ii) visits to the seismic monitoring room, open 24/7 all year; iii) guided tours through interactive exhibitions on earthquakes and the Earth's deep structure; iv) lectures on general topics, from the social impact of rumors to seismic risk reduction; v) 13 new videos on our channel explaining the earthquake process and giving updates on various aspects of seismic monitoring in Italy; vi) distribution of books and brochures. Surprisingly, more than 3000 visitors came to visit INGV.

  15. Earthquake Magnitude and Shaking Intensity Dependent Fragility Functions for Rapid Risk Assessment of Buildings

    Directory of Open Access Journals (Sweden)

    Marie-José Nollet


    An integrated web application, referred to as ER2 for rapid risk evaluator, is under development for user-friendly seismic risk assessment by the non-expert public safety community. The assessment of likely negative consequences is based on pre-populated databases of seismic, building inventory, and vulnerability parameters. To further accelerate the computation for near real-time analyses, implicit building fragility curves were developed as functions of the magnitude and the intensity of the seismic shaking, defined with a single intensity measure: input spectral acceleration at 1.0 s, implicitly accounting for epicentral distance and local soil conditions. Damage probabilities were compared with those obtained with standard fragility functions that explicitly consider epicentral distances and local site classes in addition to earthquake magnitudes and the respective intensity of the seismic shaking. Different seismic scenarios were considered, first for 53 building classes common in Eastern Canada, and then for a reduced set of 24 combined building classes. Comparison of the results indicates that the damage predictions with implicit fragility functions for short (M ≤ 5.5) and medium (5.5 < M ≤ 7.5) strong-motion durations show low variation with distance and soil class, with an average error of less than 3.6%.
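    Fragility curves of the kind described are conventionally expressed as a lognormal CDF in the chosen intensity measure. A minimal sketch follows; the median and dispersion values are illustrative placeholders, not the calibrated parameters from the ER2 study.

```python
from math import log, erf, sqrt

def fragility(sa_1s, theta, beta):
    """P(reaching a damage state | Sa(1.0 s)), modeled as a lognormal CDF
    with median theta (in g) and log-standard deviation beta.
    Values here are illustrative, not the paper's calibrated curves."""
    return 0.5 * (1.0 + erf(log(sa_1s / theta) / (beta * sqrt(2.0))))

# Hypothetical curve: median 0.3 g, dispersion 0.6.
# At the median intensity, the damage probability is exactly 0.5.
p_median = fragility(0.3, 0.3, 0.6)
p_strong = fragility(0.6, 0.3, 0.6)   # stronger shaking -> higher probability
```

    In an implicit formulation like the paper's, the pair (theta, beta) would itself be tabulated per building class as a function of magnitude, so the run-time computation reduces to a single CDF evaluation per class.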

  16. Crowdsourcing Rapid Assessment of Collapsed Buildings Early after the Earthquake Based on Aerial Remote Sensing Image: A Case Study of Yushu Earthquake

    Directory of Open Access Journals (Sweden)

    Shuai Xie


    Remote sensing (RS) images play a significant role in disaster emergency response. Web 2.0 changes the way data are created, making it possible for the public to participate in scientific issues. In this paper, an experiment is designed to evaluate the reliability of crowdsourced building collapse assessment shortly after an earthquake, based on aerial remote sensing images. The procedure for RS data pre-processing and crowdsourced data collection is presented. A probabilistic model combining maximum likelihood estimation (MLE), Bayes’ theorem, and the expectation-maximization (EM) algorithm is applied to quantitatively estimate each participant’s error rate and the “ground truth” from multiple participants’ assessment results. An experimental area from the Yushu earthquake is used to present the results contributed by participants, followed by a discussion of accuracy and variation among participants. The features of buildings labeled with the same damage type were found to be highly consistent. This suggests that crowdsourced building damage assessments can be treated as reliable samples. This study shows the potential for rapid building collapse assessment through crowdsourcing, quantitatively inferring the “ground truth” from crowdsourced data soon after an earthquake based on aerial remote sensing imagery.
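    The MLE/Bayes/EM estimation of per-participant error rates and latent "ground truth" can be sketched with a one-coin Dawid–Skene-style model for binary collapse labels. Everything below (initial error rates, item and worker names, votes) is illustrative, not the paper's actual formulation or data.

```python
from collections import defaultdict

def em_consensus(labels, n_iter=50):
    """One-coin Dawid-Skene sketch for binary crowd labels.
    labels: list of (item, worker, vote) with vote in {0, 1}.
    Returns (posterior P[item's true label is 1], per-worker error rate)."""
    items = {i for i, _, _ in labels}
    workers = {w for _, w, _ in labels}
    post = {i: 0.5 for i in items}      # P(true label = 1), uninformative start
    err = {w: 0.2 for w in workers}     # assumed initial error rates
    for _ in range(n_iter):
        # E-step: posterior of the true label for each item (Bayes' theorem)
        for i in items:
            p1, p0 = 1.0, 1.0
            for it, w, v in labels:
                if it != i:
                    continue
                p1 *= (1 - err[w]) if v == 1 else err[w]
                p0 *= (1 - err[w]) if v == 0 else err[w]
            post[i] = p1 / (p1 + p0)
        # M-step: MLE re-estimate of each worker's error rate
        num, den = defaultdict(float), defaultdict(float)
        for i, w, v in labels:
            num[w] += post[i] if v == 0 else (1 - post[i])
            den[w] += 1.0
        err = {w: min(max(num[w] / den[w], 1e-6), 0.499) for w in workers}
    return post, err

# Toy votes: building b1 gets a 2-1 "collapsed" majority, b2 is unanimous "intact"
votes = [("b1", "w1", 1), ("b1", "w2", 1), ("b1", "w3", 0),
         ("b2", "w1", 0), ("b2", "w2", 0), ("b2", "w3", 0)]
post, err = em_consensus(votes)
```

    The EM loop jointly sharpens the consensus labels and downweights workers who disagree with them, which is the mechanism the abstract relies on to infer "ground truth" without a reference survey.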

  17. Rupture complexity of the Mw 8.3 sea of okhotsk earthquake: Rapid triggering of complementary earthquakes? (United States)

    Wei, Shengji; Helmberger, Don; Zhan, Zhongwen; Graves, Robert


    We derive a finite slip model for the 2013 Mw 8.3 Sea of Okhotsk Earthquake (Z = 610 km) by inverting calibrated teleseismic P waveforms. The inversion shows that the earthquake ruptured on a 10° dipping rectangular fault zone (140 km × 50 km) and evolved into a sequence of four large sub-events (E1–E4) with an average rupture speed of 4.0 km/s. The rupture process can be divided into two main stages. The first propagated south, rupturing sub-events E1, E2, and E4. The second stage (E3) originated near E2 with a delay of 12 s and ruptured northward, filling the slip gap between E1 and E2. This kinematic process produces an overall slip pattern similar to that observed in shallow swarms, except it occurs over a compressed time span of about 30 s and without many aftershocks, suggesting that sub-event triggering for deep events is significantly more efficient than for shallow events.

  18. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program (United States)

    Perry, S.; Jordan, T.


    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates toward science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  19. Seismogeodesy of the 2014 Mw6.1 Napa earthquake, California: Rapid response and modeling of fast rupture on a dipping strike-slip fault (United States)

    Melgar, Diego; Geng, Jianghui; Crowell, Brendan W.; Haase, Jennifer S.; Bock, Yehuda; Hammond, William C.; Allen, Richard M.


    Real-time high-rate geodetic data have been shown to be useful for rapid earthquake response systems during medium to large events. The 2014 Mw6.1 Napa, California earthquake is important because it provides an opportunity to study an event at the lower threshold of what can be detected with GPS. We show the results of GPS-only earthquake source products such as peak ground displacement magnitude scaling, centroid moment tensor (CMT) solution, and static slip inversion. We also highlight the retrospective real-time combination of GPS and strong motion data to produce seismogeodetic waveforms that have higher precision and longer period information than GPS-only or seismic-only measurements of ground motion. We show their utility for rapid kinematic slip inversion and conclude that it would have been possible, with current real-time infrastructure, to determine the basic features of the earthquake source. We supplement the analysis with strong motion data collected close to the source to obtain an improved postevent image of the source process. The model reveals unilateral fast propagation of slip to the north of the hypocenter with a delayed onset of shallow slip. The source model suggests that the multiple strands of observed surface rupture are controlled by the shallow soft sediments of Napa Valley and do not necessarily represent the intersection of the main faulting surface and the free surface. We conclude that the main dislocation plane is westward dipping and should intersect the surface to the east, either where the easternmost strand of surface rupture is observed or at the location where the West Napa fault has been mapped in the past.

  20. Some rapid and long traveled landslides triggered by the May 12, 2008 Sichuan earthquake (United States)

    Wang, G.; Kamai, T.; Chigira, M.; Wu, X. Y.; Zhang, D. X.


    On May 12, 2008, an M7.9 earthquake struck Sichuan Province, China, causing enormous numbers of deaths and injuries and great loss of property, making it the most damaging earthquake in China since the 1976 Tangshan earthquake. The collapse of buildings during the earthquake was the main cause of the casualties. A huge number of landslides were triggered by this earthquake. Almost all roads to the mountainous areas were blocked, and many dams were formed by the displaced landslide materials, creating great difficulties for rescue activities after the shocks. A significant portion of the casualties was also directly caused by the landslides. The authors made reconnaissance field trips to the landslides and performed preliminary investigations of some of the catastrophic ones. In this report, four landslides are introduced: the Xiejiadian landslide in Pengzhou City, the Donghekou and Magongxiang landslides in Qingchuan County, and the Niujuangou landslide in the epicentral area of Yingxiu Town. The characteristics of the deposited landslide masses at Donghekou were investigated by means of a multichannel surface wave technique. Two earthquake recorders were installed at the upper part and the deposit area of the Donghekou landslide; the seismic responses of different parts of the landslide were monitored and successfully recorded during the aftershocks that occurred in Qingchuan County on July 24, 2008. The drained and undrained dynamic shear behaviors of samples from the landslide areas were also examined. Some preliminary analysis results are presented in this report.


    Takahashi, Masanori; Takayama, Jun-Ichi; Nakayama, Shoichiro

    The Noto Peninsula earthquake struck Ishikawa Prefecture in March 2007, damaging the Noto Toll Road and many arterial roads. This led to considerable confusion in road traffic in the Noto Peninsula area and affected all kinds of social and economic activities. A method of providing traffic information to drivers is therefore important in disasters such as earthquakes. We carried out a questionnaire survey of local inhabitants and investigated road use at the time of the Noto Peninsula earthquake and how traffic information about it was obtained. We also analyzed whether or not the method of providing traffic information was appropriate. In addition, we examined the best traffic information to provide in the case of earthquakes.

  2. Earthquake ethics through scientific knowledge, historical memory and societal awareness: the experience of direct internet information. (United States)

    de Rubeis, Valerio; Sbarra, Paola; Sebaste, Beppe; Tosi, Patrizia


    The experience of collecting data on earthquake effects and disseminating information to the public, carried out through the "didyoufeelit" website managed by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), has shown constantly growing interest from Italian citizens. Started in 2007, the site has collected more than 520,000 completed intensity questionnaires, producing intensity maps of almost 6,000 earthquakes. One of the most distinctive features of this experience is its bi-directional information exchange: every person can record the observed effects of an earthquake and, at the same time, look at the generated maps. Seismologists, on their side, see each earthquake described in real time through its effects across the whole territory. In this way people, by giving local information, receive global information from the community, mediated and interpreted through seismological knowledge. The relationship among seismologists, mass media, and civil society is thus deep and rich. The participation of almost 20,000 permanent subscribers distributed across the whole Italian territory, alerted in case of an earthquake, has reinforced this exchange: each subscriber is constantly informed by the seismologists, through e-mail, about events occurring in their area, even those of very small magnitude. The alert service serves as a reminder that earthquakes are a continuously present phenomenon, while also showing that high-magnitude events are very rare. This kind of information is helpful as it fully complements that given by the media. We analyze the effects of our activity on society and the mass media. An awareness of seismic phenomena is present in each person, rooted in fear and in ideas of death and destruction, often with a deep belief that such events occur only very rarely. This position feeds refusal and repression. When a strong earthquake occurs, surprise immediately changes into shock and desperation. A

  3. Automatic recognition of damaged town buildings caused by earthquake using remote sensing information: Taking the 2001 Bhuj, India, earthquake and the 1976 Tangshan, China, earthquake as examples (United States)

    Liu, Jia-Hang; Shan, Xin-Jian; Yin, Jing-Yuan


    In high-resolution images, undamaged buildings generally show a natural textural pattern, while damaged or semi-damaged buildings exhibit low-grayscale blocks because of their coarsely damaged sections. If a proper threshold is used to classify the image grayscale, isolated holes appear in the damaged regions. Using statistical information such as the number of holes in each region, or the ratio between the area of the holes and that of the region, damaged buildings can be separated from undamaged ones, and automatic detection of damaged buildings can thus be realized. Based on these characteristics, a new method to automatically detect damaged buildings using regional structure and statistical texture information is presented in this paper. To test its validity, a 1-m-resolution IKONOS merged image of the 2001 Bhuj earthquake and grayscale aerial photos of the 1976 Tangshan earthquake were selected as two examples for automatically detecting damaged buildings. Satisfactory results were obtained.
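    The threshold-and-count-holes idea can be illustrated on a toy grayscale patch. The sketch below thresholds the pixels and counts connected dark components with a flood fill; the grids and the threshold value are made-up examples, not the paper's data or exact algorithm.

```python
def count_holes(gray, threshold):
    """Count connected low-grayscale blocks ("holes") in a grayscale patch,
    a toy version of the idea that damaged roofs break up into dark blocks
    while intact roofs stay uniformly bright.
    gray: 2D list of ints; threshold: grayscale cutoff for "dark"."""
    rows, cols = len(gray), len(gray[0])
    seen = [[False] * cols for _ in range(rows)]
    holes = 0
    for r in range(rows):
        for c in range(cols):
            if gray[r][c] < threshold and not seen[r][c]:
                holes += 1                      # found a new dark component
                stack = [(r, c)]
                while stack:                    # 4-connected flood fill
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and gray[y][x] < threshold and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return holes

intact  = [[200, 210], [205, 198]]   # uniform bright roof: no holes
damaged = [[200, 30], [40, 210]]     # two isolated dark blocks: two holes
```

    A per-building damage score could then combine the hole count with the hole-area ratio mentioned in the abstract, classifying a region as damaged when either statistic exceeds a calibrated cutoff.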

  4. Rapid weather information dissemination in Florida (United States)

    Martsolf, J. D.; Heinemann, P. H.; Gerber, J. F.; Crosby, F. L.; Smith, D. L.


    The development of the Florida Agricultural Services and Technology (FAST) plan to provide ports for users to call for weather information is described. FAST is based on the Satellite Frost Forecast System, which makes a broad base of weather data available to its users. The methods used for acquisition and dissemination of data from various networks under the FAST plan are examined. The system provides color-coded IR or thermal maps, precipitation maps, and textual forecast information. A diagram of the system is provided.

  5. Rapid Response Products of The ARIA Project for the M6.0 August 24, 2014 South Napa Earthquake (United States)

    Yun, S. H.; Owen, S. E.; Hua, H.; Milillo, P.; Fielding, E. J.; Hudnut, K. W.; Dawson, T. E.; Mccrink, T. P.; Jo, M. J.; Barnhart, W. D.; Manipon, G. J. M.; Agram, P. S.; Moore, A. W.; Jung, H. S.; Webb, F.; Milillo, G.; Rosinski, A.


    A magnitude 6.0 earthquake struck southern Napa County northeast of San Francisco, California, on Aug. 24, 2014, causing significant damage in the city of Napa and nearby areas. One day after the earthquake, the Advanced Rapid Imaging and Analysis (ARIA) team produced and released observations of coseismic ground displacement measured with continuous GPS stations of the Plate Boundary Observatory (operated by UNAVCO for the National Science Foundation) and the Bay Area Rapid Deformation network (operated by the Berkeley Seismological Laboratory). Three days after the earthquake (Aug. 27), the Italian Space Agency's (ASI) COSMO-SkyMed (CSK) satellite acquired its first post-event data. On the same day, the ARIA team, in collaboration with ASI and the University of Basilicata, produced and released a coseismic interferogram that revealed ground deformation and surface rupture. The depiction of the surface rupture - discontinuities of color fringes in the CSK interferogram - helped guide field geologists from the US Geological Survey and the California Geological Survey (CGS) to features that might otherwise have gone undetected. Small-scale cracks were found on a runway of the Napa County Airport, along with bridge damage and damaged roads. ARIA's response to this event highlighted the importance of timeliness in mapping surface deformation features. ARIA's rapid response products were shared through the Southern California Earthquake Center's response website and the California Earthquake Clearinghouse. A damage proxy map derived from InSAR coherence of CSK data was produced and distributed on Aug. 27. Field crews from the CGS identified true and false positives, including mobile home damage, newly planted grape vines, and a cripple wall failure of a house. Finite fault slip models constrained by CSK interferograms and continuous GPS observations reveal a north-propagating rupture with well-resolved slip from 0-10.5 km depth. We also measured along-track coseismic

  6. Rapid estimation of earthquake magnitude from the arrival time of the peak high‐frequency amplitude (United States)

    Noda, Shunta; Yamamoto, Shunroku; Ellsworth, William L.


    We propose a simple approach to measure earthquake magnitude M using the time difference (Top) between the body-wave onset and the arrival time of the peak high-frequency amplitude in an accelerogram. Measured in this manner, we find that Mw is proportional to 2logTop for earthquakes 5≤Mw≤7, which is the theoretical proportionality if Top is proportional to source dimension and stress drop is scale invariant. Using high-frequency (>2 Hz) data, the root mean square (rms) residual between Mw and MTop (M estimated from Top) is approximately 0.5 magnitude units. The rms residuals of the high-frequency data in passbands between 2 and 16 Hz are uniformly smaller than those obtained from lower-frequency data. Top depends weakly on epicentral distance, and this dependence can be ignored for the distance range considered. Applied to a great earthquake, the method produces a final magnitude estimate of M 9.0 at 120 s after the origin time. We conclude that Top of high-frequency (>2 Hz) accelerograms has value in the context of earthquake early warning for extremely large events.
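    The reported proportionality Mw ∝ 2 log Top can be turned into a toy estimator. The intercept below is an illustrative placeholder, not the calibrated constant from the study; only the 2·log10 scaling comes from the abstract.

```python
from math import log10

def magnitude_from_top(top_seconds, c=5.0):
    """Estimate Mw from the onset-to-peak time Top (in seconds), using the
    proportionality Mw ~ 2*log10(Top) reported in the abstract.
    The intercept c is a hypothetical placeholder, not the study's value."""
    return 2.0 * log10(top_seconds) + c

# Regardless of the intercept, doubling Top raises the estimate
# by 2*log10(2), about 0.6 magnitude units.
dm = magnitude_from_top(20.0) - magnitude_from_top(10.0)
```

    The appeal for early warning is that Top is available as soon as the peak amplitude has passed, so the estimate grows toward its final value while the rupture is still being recorded.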

  7. Medical Information & Technology: Rapidly Expanding Vast Horizons (United States)

    Sahni, Anil K.


    During the Medical Council of India's Platinum Jubilee Year (1933-2008) celebrations in 2008, several scientific meetings, seminars, and symposia on various topics of contemporary importance and relevance in the field of medical education and ethics were organized by different medical colleges at local, state, and national levels. The present discussion is a comprehensive summary of various aspects of medical information and communication technology, especially useful for an audience of medical and paramedical staff with no previous working knowledge of computer applications. It outlines: i. administrative applications, such as medical records; ii. clinical applications, such as the prospective scope of telemedicine; and iii. other applications, including efforts to improve medical education, medical presentations, and medical research. Medical transcription and related recent fields of study, e.g., modern pharmaceuticals, bio-engineering, bio-mechanics, and bio-technology, are summarized along with general considerations of computers and computer ergonomics, to raise awareness of the fundamentals of medical computing and its practical utility.

  8. Rapid GNSS and Data Communication System Deployments In Chile and Argentina Following the M8.8 Maule Earthquake (United States)

    Blume, F.; Meertens, C. M.; Brooks, B. A.; Bevis, M. G.; Smalley, R.; Parra, H.; Baez, J.


    Because the signal is so large, great earthquakes allow us to make quantum leaps in our understanding of Earth deformation processes and material properties. The Maule earthquake, with its occurrence near a large subaerial landmass and the large number of instruments available to study it, will surely become one of the most important geophysical events in modern memory. Much of the important signal, however, decays and changes rapidly in the short term following the event, so a rapid response is necessary. Actually delivering the data from the CGPS response stations represents an intellectual challenge in terms of properly matching the engineering realities with the scientific desiderata. We expect multiple major science advances to come from these data: (1) Understanding earthquake and tsunami genesis via use of the coseismic displacement field to create the most well-constrained fault slip and tsunami-genesis models. (2) The role of stress loading on both the principal thrust plane and subsidiary planes. (3) The relationship of fault afterslip to the main event as well as to the distribution of aftershocks. (4) Study of large aftershocks jointly using conventional seismology and high-rate GPS coseismic displacement seismograms. (5) Rheological behavior of the fault interface. (6) The mechanical response of the bulk earth to large stress perturbations. Within 10 days of the earthquake, 20 complete GPS systems were delivered by UNAVCO personnel to IGM and OSU staff in Santiago, and 5 were shipped via diplomatic pouch to Argentina. Consisting of 10 Trimble NetRS and 15 Topcon GB-1000 receivers, the units were deployed throughout the affected area during the following three weeks, using welded-in-place steel tripod monuments driven into soil or drilled into bedrock, or steel masts. Additional GPS hardware was procured from cooperating institutions and donated by GPS manufacturers, and a total of 43 post-earthquake GPS stations are continuously operating

  9. Seeking Information after the 2010 Haiti Earthquake: A Case Study in Mass-Fatality Management (United States)

    Gupta, Kailash


    The 2010 earthquake in Haiti, which killed an estimated 316,000 people, offered many lessons in mass-fatality management (MFM). The dissertation defined MFM in terms of information seeking and of the recovery, preservation, identification, and disposition of human remains. Specifically, it examined how mass fatalities were managed in Haiti, how affected…

  10. Real-Time Data Processing Systems and Products at the Alaska Earthquake Information Center (United States)

    Ruppert, N. A.; Hansen, R. A.


    The Alaska Earthquake Information Center (AEIC) receives data from over 400 seismic sites located within the state boundaries and the surrounding regions and serves as a regional data center. In 2007, the AEIC reported ~20,000 seismic events, the largest an M6.6 event in the Andreanof Islands. The real-time earthquake detection and data processing systems at AEIC are based on the Antelope system from BRTT, Inc. This modular and extensible processing platform provides an integrated system from data acquisition through catalog production. Multiple additional modules built with the Antelope toolbox have been developed to fit the particular needs of the AEIC. Real-time earthquake locations and magnitudes are determined within 2-5 minutes of event occurrence. AEIC maintains a 24/7 seismologist-on-duty schedule. Earthquake alarms are based on the real-time earthquake detections. Significant events are reviewed by the seismologist on duty within 30 minutes of occurrence, with information releases issued for significant events. This information is disseminated immediately via the AEIC website, the ANSS website via QDDS submissions, and through e-mail, cell phone and pager notifications, fax broadcasts, and recorded voice-mail messages. In addition, automatic regional moment tensors are determined for events with M>=4.0 and posted on the public website. ShakeMaps are calculated in real time, with the information currently accessible via a password-protected website. AEIC is designing an alarm system targeted at critical lifeline operations in Alaska. AEIC maintains an extensive computer network to provide adequate support for data processing and archival. For real-time processing, AEIC operates two identical, interoperable computer systems in parallel.

  11. Change of risk information disclosure in annual report. Before and after earthquake disaster

    International Nuclear Information System (INIS)

    Ueno, Takefumi


    This research examines how risk information disclosure in annual reports changed before and after the Great East Japan Earthquake disaster. Companies voluntarily disclose risk information in their annual reports, and managers can decide the style and items of that information. This paper explores the risk information disclosures of the Tokyo Electric Power Company, the Chubu Electric Power Company, the Kansai Electric Power Company, and the Toyota Motor Corporation. The managers, except at the Tokyo Electric Power Company, tended to disclose their own catastrophe risk before the disaster; however, they did not try to reduce that risk. The corporations' risk information is not linked with their own risk management. (author)

  12. Using JavaScript and the FDSN web service to create an interactive earthquake information system (United States)

    Fischer, Kasper D.


    The FDSN web service provides a web interface for accessing earthquake metadata (e.g., event or station information) and waveform data over the Internet. Requests are sent to a server as URLs, and the output is either XML or miniSEED, making it hard for humans to read but easy to process with software. Several data centers already support the FDSN web service, e.g., USGS, IRIS, and ORFEUS. The FDSN web service is also part of the SeisComP3 software. The Seismological Observatory of the Ruhr-University switched to SeisComP3 as its standard software for the analysis of mining-induced earthquakes at the beginning of 2014. This made it necessary to create a new web-based earthquake information service for publishing results to the general public. This has been done by processing the output of an FDSN web service query with JavaScript running in a standard browser. The result is an interactive map presenting the observed events, together with further information on events and stations, as a table and on a map on a single web page. In addition, users can download event information, waveform data, and station data in different formats such as miniSEED, QuakeML, or FDSNxml. The developed code and all libraries used are open source and freely available.
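    An FDSN event query of the kind described can be sketched as follows (in Python rather than the browser JavaScript the paper uses, for brevity). The block builds a request URL following the fdsnws-event parameter conventions and parses the service's pipe-separated "text" response format offline; the sample response line is fabricated for illustration, and any FDSN-compliant node could stand in for the base URL shown.

```python
from urllib.parse import urlencode

# Example FDSN event endpoint (IRIS); any compliant data center works.
FDSN_EVENT = "http://service.iris.edu/fdsnws/event/1/query"

def event_query_url(base, **params):
    """Build an FDSN event web-service request URL; parameter names follow
    the fdsnws-event specification (minmagnitude, starttime, limit, ...)."""
    return base + "?" + urlencode({"format": "text", **params})

def parse_events(text):
    """Parse the pipe-separated 'text' response format into dicts,
    keyed by the header fields of the first ('#'-prefixed) line."""
    lines = text.strip().splitlines()
    header = lines[0].lstrip("#").split("|")
    return [dict(zip(header, ln.split("|"))) for ln in lines[1:]]

url = event_query_url(FDSN_EVENT, minmagnitude=5, limit=10)

# Fabricated sample of the 'text' format for offline parsing:
sample = ("#EventID|Time|Latitude|Longitude|Depth/km|Author|Catalog|"
          "Contributor|ContributorID|MagType|Magnitude|MagAuthor|EventLocationName\n"
          "us1234|2014-08-24T10:20:44|38.22|-122.31|11.1|us|us|us|us1234|"
          "mw|6.0|us|South Napa")
events = parse_events(sample)
```

    In the browser setting the paper describes, the same query would be issued with `fetch`/XMLHttpRequest and the parsed records fed to a table and map widget.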

  13. OMG Earthquake! Can Twitter improve earthquake response? (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.


    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets from the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
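    The detection idea (keyword-tweet frequency jumping far above a background rate) can be sketched as a simple threshold detector. The background rate and trigger factor below are illustrative, not the parameters of the USGS system.

```python
def detect_spike(counts_per_minute, background_per_hour=1.0, factor=10.0):
    """Return the index of the first minute whose keyword-tweet count
    exceeds the background rate by `factor`, or None if no minute does.
    A toy version of detecting a felt earthquake from a jump in
    "earthquake" tweets; thresholds are illustrative assumptions."""
    background_per_minute = background_per_hour / 60.0
    for minute, count in enumerate(counts_per_minute):
        # Require at least a handful of tweets so a single stray
        # message above a tiny background rate does not trigger.
        if count > max(1.0, factor * background_per_minute):
            return minute
    return None

# Background chatter, then a burst like the Morgan Hill example:
minute = detect_spike([0, 0, 1, 150, 120, 80])
```

    A real system would additionally geolocate the triggering tweets, since the spatial clustering of the burst is what separates a felt event from a news-driven one.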

  14. Rapid Deterioration of Latent HBV Hepatitis during Cushing Disease and Posttraumatic Stress Disorder after Earthquake. (United States)

    Tashiro, Ryosuke; Ogawa, Yoshikazu; Tominaga, Teiji


    Reactivation of the hepatitis B virus (HBV) is a risk for the 350 million HBV carriers worldwide. HBV reactivation may cause hepatocellular carcinoma, cirrhosis, and fulminant hepatitis, and HBV reactivation accompanied by a malignant tumor and/or chemotherapy is a critical problem for patients with chronic HBV infection. Multiple risk factors causing an immunosuppressive state can also induce HBV reactivation. We present a case of HBV reactivation during an immunosuppressive state caused by Cushing disease and by physical and psychological stress after a disaster. A 47-year-old Japanese woman was an inactive HBV carrier until the Great East Japan Earthquake occurred and follow-up was discontinued. One year after the earthquake she had intractable hypertension, and her visual acuity gradually worsened. Head magnetic resonance imaging showed a sellar tumor compressing the optic chiasm, and hepatic dysfunction with HBV reactivation was identified. Endocrinologic examination established the diagnosis as Cushing disease. After normalization of hepatic function with antiviral therapy, transsphenoidal tumor removal was performed, resulting in subtotal removal except for the right cavernous portion. Steroid hormone supplementation was discontinued after 3 days of administration, and gamma knife therapy was performed for the residual tumor. Eighteen months after the operation, adrenocorticotropic hormone and cortisol values returned to normal. The patient has been free from tumor regrowth and HBV reactivation throughout the postoperative course. Normalization of intrinsic steroid levels with minimal steroid supplementation should be the goal, and precise operative procedures and careful treatment planning are essential to avoid HBV reactivation in patients with this threatening condition.

  15. Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)


    Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit presentation from the WPTO FY14-FY16 Peer Review. The toolkit is aimed at regulatory agencies, consultants, project developers, the public, and any other party interested in learning more about the hydropower regulatory process.

  16. The role of INGVterremoti blog in information management during the earthquake sequence in central Italy

    Directory of Open Access Journals (Sweden)

    Maurizio Pignone


    In this paper, we describe the role of the INGVterremoti blog in information management during the first part of the earthquake sequence in central Italy (August 24 to September 30). For the last four years, we have been working on the INGVterremoti blog in order to provide quick updates on ongoing seismic activity in Italy as well as in-depth scientific information, including articles on specific historical earthquakes, seismic hazard, geological interpretations, source models from different types of data, effects at the surface, and so on. We have delivered information in quasi-real time about all recent magnitude M≥4.0 earthquakes in Italy and the strongest events in the Mediterranean and the world. During the 2016 central Italy sequence, the INGVterremoti blog continuously released information with three types of posts: (i) updates on the ongoing seismic activity; (ii) reports on the activities carried out by the INGV teams in the field and other working groups; and (iii) in-depth scientific articles describing specific analyses and results. All blog posts were shared automatically and in real time on the other social media of the INGVterremoti platform, partly to counter bad information and fight rumors; these include Facebook, Twitter, and the INGVterremoti app on iOS and Android. In addition, both the main INGV home page and the INGV earthquake portal published the contents of the blog on dedicated pages that were fed automatically. The day-by-day work on the INGVterremoti blog was coordinated with the INGV Press Office, which wrote several press releases based on the contents of the blog. Since August 24, 53 articles have been published on the blog; they have had more than 1.9 million views and 1 million visitors. The peak in the number of views, more than 800,000 in a single day, was registered on August 24, 2016, following the M 6

  17. The population in China’s earthquake-prone areas has increased by over 32 million along with rapid urbanization (United States)

    He, Chunyang; Huang, Qingxu; Dou, Yinyin; Tu, Wei; Liu, Jifu


    Accurate assessments of the population exposed to seismic hazard are crucial in seismic risk mapping. Recent rapid urbanization in China has resulted in substantial changes in the size and structure of the population exposed to seismic hazard. Using the latest population census data and seismic maps, this work investigated spatiotemporal changes in the exposure of the population in the most seismically hazardous areas (MSHAs) in China from 1990 to 2010. In the context of rapid urbanization and massive rural-to-urban migration, nearly one-tenth of the Chinese population in 2010 lived in MSHAs. From 1990 to 2010, the MSHA population increased by 32.53 million at a significantly higher rate of change (33.6%) than the national average rate (17.7%). The elderly population in MSHAs increased by 81.4%, which is much higher than the group’s national growth rate of 58.9%. Greater attention should be paid to the demographic changes in earthquake-prone areas in China.

  18. A smartphone application for earthquakes that matter! (United States)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert


    Smartphone applications have swiftly become one of the most popular tools for rapid delivery of earthquake information to the public, some having been downloaded more than 1 million times! The advantages are obvious: wherever they are, users can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue comes from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses uninstall the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, only increasing anxiety among the population with each new update. Felt and damaging earthquakes are the ones that matter most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims to provide suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre). Potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment), developed and operated at EMSC, which rapidly assesses earthquake impact by comparing the population exposed to each expected
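The collation of different information threads into a single notification decision, as described above, can be sketched as a simple priority cascade. Everything below is illustrative only: the field names (`tsunami_message`, `expected_impact`, `felt_reports`) and the thresholds are invented for this sketch and are not EMSC's actual data model.

```python
# Hypothetical sketch of collating information threads (tsunamigenic,
# potentially damaging, felt) into one notification decision.
# Field names and thresholds are illustrative, not EMSC's.

def notification_level(event):
    """Return a priority label for an earthquake event dict."""
    if event.get("tsunami_message"):          # PTWC alert/information message
        return "tsunami"
    if event.get("expected_impact", 0) > 0:   # e.g. an EQIA-like impact score
        return "damaging"
    if event.get("felt_reports", 0) >= 10:    # crowdsourced felt reports
        return "felt"
    return "none"                             # unfelt: do not notify

events = [
    {"id": "a", "felt_reports": 42},
    {"id": "b", "tsunami_message": True},
    {"id": "c", "felt_reports": 2},
]
print([notification_level(e) for e in events])  # ['felt', 'tsunami', 'none']
```

The ordering of the checks encodes the societal importance argued for in the abstract: a tsunami message outranks a damage estimate, which outranks felt reports alone.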

  19. A rapid review of consumer health information needs and preferences. (United States)

    Ramsey, Imogen; Corsini, Nadia; Peters, Micah D J; Eckert, Marion


    This rapid review summarizes the best available evidence on consumers' needs and preferences for information about healthcare, with a focus on the Australian context. Three questions are addressed: 1) Where do consumers find, and what platforms do they use to access, information about healthcare? 2) How do consumers use the healthcare information that they find? 3) About which topics or subjects do consumers need healthcare information? A hierarchical approach was adopted, with evidence sought first from reviews and then from high-quality studies, using Medline (via PubMed), CINAHL, Embase, the JBI Database of Systematic Reviews and Implementation Reports, the Campbell Collaboration Library of Systematic Reviews, EPPI-Centre, and Epistemonikos. Twenty-eight articles were included: four systematic reviews, three literature reviews, thirteen quantitative studies, six qualitative studies, and two mixed-methods studies. Consumers seek health information at varying times along the healthcare journey and through various modes of delivery. Complacency with historical modes of health information delivery is no longer appropriate, and flexibility is essential to meet growing consumer demands. Health information should be readily available in different formats and not exclusive to any single medium. Copyright © 2017. Published by Elsevier B.V.

  20. Rapid automatic keyword extraction for information retrieval and analysis (United States)

    Rose, Stuart J [Richland, WA]; Cowley, Wendy E [Richland, WA]; Crow, Vernon L [Richland, WA]; Cramer, Nicholas O [Richland, WA]


    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
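The scoring scheme this record describes (the RAKE method) is compact enough to sketch directly from the abstract: split text into candidate phrases at delimiters and stop words, score each word as degree divided by frequency, and score each phrase as the sum of its word scores. This is a minimal reimplementation with a toy stop-word list; the published method uses fuller stop lists and delimiter handling.

```python
import re
from collections import defaultdict

# Toy stop-word list for illustration; real RAKE uses a much larger one.
STOPWORDS = {"a", "an", "the", "of", "for", "and", "or", "in", "on", "is", "are", "to"}

def rake(text, stopwords=STOPWORDS):
    # 1. Split into candidate phrases at stop words (delimiters removed by regex).
    words = re.findall(r"[a-zA-Z']+", text.lower())
    phrases, current = [], []
    for w in words:
        if w in stopwords:
            if current:
                phrases.append(current)
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(current)
    # 2. Word scores: co-occurrence degree divided by frequency.
    freq, degree = defaultdict(int), defaultdict(int)
    for phrase in phrases:
        for w in phrase:
            freq[w] += 1
            degree[w] += len(phrase)  # counts co-occurrences within the phrase
    word_score = {w: degree[w] / freq[w] for w in freq}
    # 3. Phrase score = sum of member word scores; rank descending.
    scored = {" ".join(p): sum(word_score[w] for w in p) for p in phrases}
    return sorted(scored.items(), key=lambda kv: -kv[1])

keywords = rake("Rapid automatic keyword extraction for information retrieval and analysis")
print(keywords[0])  # ('rapid automatic keyword extraction', 16.0)
```

Longer multi-word phrases naturally score higher, which is the behavior the degree/frequency ratio is designed to produce.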

  1. CISN Display Progress to Date - Reliable Delivery of Real-Time Earthquake Information, and ShakeMap to Critical End Users (United States)

    Rico, H.; Hauksson, E.; Thomas, E.; Friberg, P.; Frechette, K.; Given, D.


    The California Integrated Seismic Network (CISN) has collaborated to develop a next-generation earthquake notification system that is nearing its first operations-ready release. The CISN Display actively alerts users to seismic data and vital earthquake hazards information following a significant event. It will primarily replace the Caltech/USGS Broadcast of Earthquakes (CUBE) and Rapid Earthquake Data Integration (REDI) Display as the principal means of delivering geographical seismic data to emergency operations centers, utility companies and media outlets. A subsequent goal is to provide automated access to the many Web products produced by regional seismic networks after an earthquake. Another aim is to create a highly configurable client, allowing user organizations to overlay infrastructure data critical to their roles as first responders or lifeline operators. The final goal is to integrate these requirements into a package offering several layers of reliability to ensure delivery of services. Central to the CISN Display's role as a gateway to Web-based earthquake products is its comprehensive XML-messaging schema. The message model uses many of the same attributes as the CUBE format, but extends the old standard by provisioning additional elements for products currently available and others yet to be considered. The client consumes these XML messages, sorts them through a resident Quake Data Merge filter, and posts updates that also include hyperlinks associated with specific event IDs on the display map. Earthquake products available for delivery to the CISN Display are ShakeMap, focal mechanisms, waveform data, felt reports, aftershock forecasts and earthquake commentaries. By design, the XML-message schema can evolve as products and information needs change without breaking existing applications that rely on it. The latest version of the CISN Display can also automatically download ShakeMaps and display shaking intensity within the GIS system.
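The idea of a client consuming XML event messages can be illustrated with Python's standard library. The message format below is invented for illustration only; the actual CISN schema is not reproduced in this record, so every element and attribute name here is an assumption.

```python
import xml.etree.ElementTree as ET

# Hypothetical CUBE-like XML event message; the real CISN schema differs.
msg = """
<event id="ci1234567" version="2">
  <origin lat="34.05" lon="-118.25" depth="10.0" time="2003-02-22T12:19:10Z"/>
  <magnitude type="Mw" value="5.4"/>
  <product name="ShakeMap" href="https://example.org/shakemap/ci1234567"/>
</event>
"""

def parse_event(xml_text):
    """Extract the fields a display client would need from one message."""
    root = ET.fromstring(xml_text)
    origin = root.find("origin")
    mag = root.find("magnitude")
    return {
        "id": root.get("id"),
        "version": int(root.get("version")),
        "lat": float(origin.get("lat")),
        "lon": float(origin.get("lon")),
        "mag": float(mag.get("value")),
        "products": [p.get("name") for p in root.findall("product")],
    }

evt = parse_event(msg)
print(evt["id"], evt["mag"], evt["products"])  # ci1234567 5.4 ['ShakeMap']
```

A Quake Data Merge-style filter could then keep, for each event ID, only the message with the highest version number seen so far.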

  2. Incorporating indel information into phylogeny estimation for rapidly emerging pathogens

    Directory of Open Access Journals (Sweden)

    Suchard Marc A


    Full Text Available Abstract Background Phylogenies of rapidly evolving pathogens can be difficult to resolve because of the small number of substitutions that accumulate in the short times since divergence. To improve resolution of such phylogenies we propose using insertion and deletion (indel) information in addition to substitution information. We accomplish this through joint estimation of alignment and phylogeny in a Bayesian framework, drawing inference using Markov chain Monte Carlo. Joint estimation of alignment and phylogeny sidesteps biases that stem from conditioning on a single alignment by taking into account the ensemble of near-optimal alignments. Results We introduce a novel Markov chain transition kernel that improves computational efficiency by proposing non-local topology rearrangements and by block sampling alignment and topology parameters. In addition, we extend our previous indel model to increase biological realism by placing indels preferentially on longer branches. We demonstrate the ability of indel information to increase phylogenetic resolution in examples drawn from within-host viral sequence samples. We also demonstrate the importance of taking alignment uncertainty into account when using such information. Finally, we show that codon-based substitution models can significantly affect alignment quality and phylogenetic inference by unrealistically forcing indels to begin and end between codons. Conclusion These results indicate that indel information can improve phylogenetic resolution of recently diverged pathogens and that alignment uncertainty should be considered in such analyses.

  3. An attempt of using straight-line information for building damage detection based only on post-earthquake optical imagery

    International Nuclear Information System (INIS)

    Dong, Laigen; Ye, Yuanxin; Shan, Jie


    It is important to grasp damage information in stricken areas after an earthquake in order to perform quick rescue and recovery activities. Recent research into remote sensing techniques has shown a significant ability to generate quality damage information. Methods based only on post-earthquake data are especially widely researched, because pre-earthquake reference data do not exist for many cities of the world. This paper presents a method for detecting damaged buildings using only post-event satellite imagery, so that scientists and researchers can take advantage of the ability of helicopters and airplanes to fly over the damage faster. Statistical information on line segments extracted from post-event satellite imagery, such as the mean length (ML) and the weighted tilt angle standard deviation (WTASD), is used to discriminate between damaged and undamaged buildings
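The two statistics named above can be sketched directly: intact buildings tend to yield long, parallel line segments (roof and wall edges), while rubble yields short segments at scattered angles. The length-weighting scheme below is an assumption; the paper's exact formula may differ, and a real implementation would also handle the 0°/180° angle wraparound more carefully.

```python
import math

def segment_stats(segments):
    """Mean length (ML) and length-weighted tilt-angle standard deviation
    (WTASD) for segments given as ((x1, y1), (x2, y2)) endpoint pairs.
    The length weighting is an assumption, not the paper's exact formula."""
    lengths, angles = [], []
    for (x1, y1), (x2, y2) in segments:
        lengths.append(math.hypot(x2 - x1, y2 - y1))
        # Undirected tilt angle in [0, 180) degrees.
        angles.append(math.degrees(math.atan2(y2 - y1, x2 - x1)) % 180.0)
    total = sum(lengths)
    ml = total / len(lengths)
    mean_angle = sum(w * a for w, a in zip(lengths, angles)) / total
    wtasd = math.sqrt(sum(w * (a - mean_angle) ** 2
                          for w, a in zip(lengths, angles)) / total)
    return ml, wtasd

# Intact roof edges: long, parallel segments -> high ML, low WTASD.
intact = [((0, 0), (10, 0)), ((0, 2), (10, 2))]
# Rubble: short segments at scattered angles -> low ML, high WTASD.
damaged = [((0, 0), (1, 1)), ((2, 0), (2, 1)), ((3, 1), (4, 0))]
ml_i, sd_i = segment_stats(intact)
ml_d, sd_d = segment_stats(damaged)
print(ml_i > ml_d, sd_i < sd_d)  # True True
```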

  4. SCARDEC: a new technique for the rapid determination of seismic moment magnitude, focal mechanism and source time functions for large earthquakes using body-wave deconvolution (United States)

    Vallée, M.; Charléty, J.; Ferreira, A. M. G.; Delouis, B.; Vergoz, J.


    Accurate and fast magnitude determination for large, shallow earthquakes is of key importance for post-seismic response and tsunami alert purposes. When no local real-time data are available, which is today the case for most subduction earthquakes, the first information comes from teleseismic body waves. Standard body-wave methods give accurate magnitudes for earthquakes up to Mw = 7-7.5. For larger earthquakes, the analysis is more complex because of the non-validity of the point-source approximation and the interaction between direct and surface-reflected phases. The latter effect acts as a strong high-pass filter, which complicates the magnitude determination. We here propose an automated deconvolutive approach which does not impose any simplifying assumptions about the rupture process and is thus well adapted to large earthquakes. We first determine the source duration based on the length of the high-frequency (1-3 Hz) signal content. The deconvolution of synthetic double-couple point-source signals (depending on the four earthquake parameters strike, dip, rake and depth) from the windowed real body-wave signals (including P, PcP, PP, SH and ScS waves) gives the apparent source time function (STF). We search for the optimal combination of these four parameters that respects the physical features of any STF: causality, positivity and stability of the seismic moment at all stations. Once this combination is retrieved, integration of the STFs directly gives the moment magnitude. We apply this new approach, referred to as the SCARDEC method, to most of the major subduction earthquakes in the period 1990-2010. Magnitude differences between the Global Centroid Moment Tensor (CMT) and the SCARDEC method may reach 0.2, but the values are found to be consistent if we take into account that the Global CMT solutions for large, shallow earthquakes suffer from a known trade-off between dip and seismic moment. We show by modelling long-period surface waves of these events that

  5. Connecting slow earthquakes to huge earthquakes. (United States)

    Obara, Kazushige; Kato, Aitaro


    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  6. Incorporating Real-time Earthquake Information into Large Enrollment Natural Disaster Course Learning (United States)

    Furlong, K. P.; Benz, H.; Hayes, G. P.; Villasenor, A.


    Although most would agree that the occurrence of natural disaster events such as earthquakes, volcanic eruptions, and floods can provide effective learning opportunities for natural hazards-based courses, implementing compelling materials into the large-enrollment classroom environment can be difficult. These natural hazard events derive much of their learning potential from their real-time nature, and in the modern 24/7 news-cycle where all but the most devastating events are quickly out of the public eye, the shelf life for an event is quite limited. To maximize the learning potential of these events requires that both authoritative information be available and course materials be generated as the event unfolds. Although many events such as hurricanes, flooding, and volcanic eruptions provide some precursory warnings, and thus one can prepare background materials to place the main event into context, earthquakes present a particularly confounding situation of providing no warning, but where context is critical to student learning. Attempting to implement real-time materials into large enrollment classes faces the additional hindrance of limited internet access (for students) in most lecture classrooms. In Earth 101 Natural Disasters: Hollywood vs Reality, taught as a large enrollment (150+ students) general education course at Penn State, we are collaborating with the USGS’s National Earthquake Information Center (NEIC) to develop efficient means to incorporate their real-time products into learning activities in the lecture hall environment. Over time (and numerous events) we have developed a template for presenting USGS-produced real-time information in lecture mode. The event-specific materials can be quickly incorporated and updated, along with key contextual materials, to provide students with up-to-the-minute current information. In addition, we have also developed in-class activities, such as student determination of population exposure to severe ground

  7. Earthquake data base for Romania

    International Nuclear Information System (INIS)

    Rizescu, M.; Ghica, D.; Grecu, B.; Popa, M.; Borcia, I. S.


    A new earthquake database for Romania is being constructed, comprising complete, up-to-date earthquake information that is user-friendly and rapidly accessible. One main component of the database consists of the catalog of earthquakes that have occurred in Romania from 984 up to the present. The catalog contains information on locations and other source parameters, when available, and links to waveforms of important earthquakes. The other very important component is the 'strong motion database', developed for strong intermediate-depth Vrancea earthquakes for which instrumental data were recorded. Different parameters that characterize strong motion properties, such as effective peak acceleration, effective peak velocity, corner periods Tc and Td, and global response-spectrum-based intensities, were computed and recorded in this database. Also included is information on the recording seismic stations, such as maps giving their positions, photographs of the instruments, and site conditions ('free-field' or on buildings). Through the huge volume and quality of the gathered data, and through its friendly user interface, the Romanian earthquake database provides a very useful tool for the geosciences and civil engineering in the effort to reduce seismic risk in Romania. (authors)

  8. Earthquakes: Natural Science Museum and Civil Protection of Trento to inform citizens (United States)

    Lauro, Claudia; Avanzini, Marco


    During 2009 the Natural Science Museum of Trento organized the exhibition "Attraction Earth: Earthquakes and Terrestrial Magnetism" in collaboration with the INGV (Italian National Institute of Geophysics and Volcanology). In this exhibition a particular sector was devoted to seismic activity and its monitoring in the Province of Trento. The purpose was to inform local people about the geological features of their territory, the monitoring activity carried out by the Civil Protection, the potential earthquake hazards, and the correct behaviour to adopt in case of a seismic event. This sector, "The seismometric Trentino network", was organized by the Geological Service of the Trento Civil Protection and remains open until May 2010, both for the general public and for school students. For the latter, a particular education pack, created by the Educational Department of the Museum and consisting of a guided tour coupled with the laboratory activity "Waves upside-down: seismology", is proposed. The whole exhibition has also been coupled with a cycle of conferences targeted at adults, in which these topics have been explained by researchers and technicians of the INGV and of the Trento Geological Service. "The seismometric Trentino network" sector presents the daily monitoring activity of the Geological Service, which has been monitoring seismic activity for the last 30 years, and describes the deep-earth processes of the local territory, such as the presence of tectonic discontinuities and their activity. It consists of display panels, a seismometer with rotating drums, and a multimedia display that reports the monitoring activity of the seismometric network, with a real-time connection to the various monitoring stations. This allows visitors to observe instantly the local seismic events recorded by each station. The seismometric network was established by the institutions of Trento Province after the earthquakes that occurred in Friuli Venezia-Giulia and at Riva del Garda (1976).

  9. Development and utilization of USGS ShakeCast for rapid post-earthquake assessment of critical facilities and infrastructure (United States)

    Wald, David J.; Lin, Kuo-wan; Kircher, C.A.; Jaiswal, Kishor; Luco, Nicolas; Turner, L.; Slosky, Daniel


    The ShakeCast system is an openly available, near real-time post-earthquake information management system. ShakeCast is widely used by public and private emergency planners and responders, lifeline utility operators and transportation engineers to automatically receive and process ShakeMap products for situational awareness, inspection prioritization, or damage assessment of their own infrastructure or building portfolios. The success of ShakeCast to date and its broad, critical user base mandate improved software usability and functionality, including improved engineering-based damage and loss functions. In order to make the software more accessible to novice users, while still utilizing advanced users' technical and engineering background, we have developed a "ShakeCast Workbook", a well-documented, Excel-spreadsheet-based user interface that allows users to input notification and inventory data and export the XML files requisite for operating the ShakeCast system. Users are able to select structure types based on a minimum set of user-specified facility characteristics (building location, size, height, use, construction age, etc.). "Expert" users are able to import user-modified structural response properties into the facility inventory associated with the HAZUS Advanced Engineering Building Modules (AEBM). The goal of the ShakeCast system is to provide simplified real-time potential-impact and inspection metrics (i.e., green, yellow, orange and red priority ratings) that allow users to institute customized earthquake response protocols. Previously, fragilities were approximated using individual ShakeMap intensity measures (IMs, specifically PGA and 0.3 s and 1.0 s spectral accelerations) for each facility, but we are now performing capacity-spectrum damage-state calculations using a more robust characterization of spectral demand. We are also developing methods for the direct import of ShakeMap's multi-period spectra in lieu of the assumed three-domain design spectrum (at 0.3 s for
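The green/yellow/orange/red inspection-priority idea described above can be sketched as a comparison of a ShakeMap intensity measure at each facility against per-facility thresholds. The threshold values and facility names below are invented for illustration; real ShakeCast fragilities are facility-specific and, as the abstract notes, now derived from capacity-spectrum calculations rather than a single IM.

```python
# Illustrative priority assignment from a single intensity measure (PGA).
# Threshold values are invented, not ShakeCast's operational settings.

PRIORITY_THRESHOLDS = [   # (minimum PGA in %g, priority), checked in order
    (40.0, "red"),
    (20.0, "orange"),
    (10.0, "yellow"),
    (0.0,  "green"),
]

def inspection_priority(pga, thresholds=PRIORITY_THRESHOLDS):
    """Map a facility's shaking level to an inspection priority rating."""
    for floor, label in thresholds:
        if pga >= floor:
            return label
    return "green"

# Hypothetical facilities with PGA values read from a ShakeMap grid.
facilities = {"bridge_12": 45.2, "school_3": 14.8, "depot_7": 3.1}
print({name: inspection_priority(pga) for name, pga in facilities.items()})
# {'bridge_12': 'red', 'school_3': 'yellow', 'depot_7': 'green'}
```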



    Q. S. Jiao; Y. Luo; W. H. Shen; Q. Li; X. Wang


    The Jiuzhaigou earthquake caused mountain slopes to collapse and triggered many landslides in the Jiuzhaigou scenic area and on surrounding roads, which caused road blockage and serious ecological damage. Given the urgency of the rescue, the authors carried an unmanned aerial vehicle (UAV) into the disaster area as early as August 9 to obtain aerial images near the epicenter. On the basis of summarizing the characteristics of earthquake landslides in aerial images, by using the object-oriented an...

  11. Information needs for the rapid response team electronic clinical tool. (United States)

    Barwise, Amelia; Caples, Sean; Jensen, Jeffrey; Pickering, Brian; Herasevich, Vitaly


    Information overload in healthcare is dangerous. It can lead to critical errors and delays. During Rapid Response Team (RRT) activations, providers must make decisions quickly to rescue patients from physiological deterioration. In order to understand the clinical data required and how best to present that information in electronic systems, we aimed to better assess the data needs of providers on the RRT when they respond to an event. A web-based survey to evaluate clinical data requirements was created and distributed to all RRT providers at our institution. Participants were asked to rate the importance of each data item in guiding clinical decisions during an RRT event response. There were 96 surveys completed (24.5% response rate) with fairly even distribution across all clinical roles on the RRT. Physiological data including heart rate, respiratory rate, and blood pressure were ranked by more than 80% of responders as critical information. Resuscitation status was also considered critically useful by more than 85% of providers. There is a limited dataset that is considered important during an RRT event, and it is widely available in the EMR. The findings from this study could be used to improve user-centered EMR interfaces.

  12. Rapid sampling of molecular motions with prior information constraints. (United States)

    Raveh, Barak; Enosh, Angela; Schueler-Furman, Ora; Halperin, Dan


    Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.
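The core sampling loop of a rapidly exploring random tree (RRT), with prior-information constraints acting as rejection criteria, can be sketched in a toy 2-D workspace. This is only an analog of the scheme described above: the `constraint` predicate stands in for the clash-free, low-energy, and prior-information checks that PathRover applies in a high-dimensional conformational space, and all numeric parameters below are invented.

```python
import math, random

def rrt(start, goal, constraint, step=0.6, iters=5000, seed=1):
    """Minimal 2-D RRT: grow a tree from `start`, rejecting samples that
    violate `constraint`, until a node lands within `step` of `goal`."""
    random.seed(seed)
    nodes = [start]
    parent = {start: None}
    for _ in range(iters):
        # Goal-biased random sample in the 10 x 10 workspace.
        target = goal if random.random() < 0.2 else (
            random.uniform(0, 10), random.uniform(0, 10))
        near = min(nodes, key=lambda n: math.dist(n, target))
        d = math.dist(near, target)
        if d == 0:
            continue
        # Step a fixed distance from the nearest node toward the sample.
        new = (near[0] + step * (target[0] - near[0]) / d,
               near[1] + step * (target[1] - near[1]) / d)
        if not constraint(new):       # reject: violates a prior constraint
            continue
        nodes.append(new)
        parent[new] = near
        if math.dist(new, goal) < step:   # close enough: extract the path
            path = [new]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1]
    return None

# Prior-information constraint: avoid a circular "forbidden" region.
ok = lambda p: math.dist(p, (5, 5)) > 1.5
path = rrt((1, 1), (9, 9), ok)
print(path is not None)
```

Every node on a returned path satisfies the constraint, which is how hard prior information narrows the search: violating branches are never added to the tree.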

  13. Rapid sampling of molecular motions with prior information constraints.

    Directory of Open Access Journals (Sweden)

    Barak Raveh


    Full Text Available Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.

  14. Constraining the Long-Term Average of Earthquake Recurrence Intervals From Paleo- and Historic Earthquakes by Assimilating Information From Instrumental Seismicity (United States)

    Zoeller, G.


    Paleo- and historic earthquakes are the most important source of information for the estimation of long-term recurrence intervals in fault zones, because sequences of paleoearthquakes cover more than one seismic cycle. On the other hand, these events are often rare, dating uncertainties are enormous, and missing or misinterpreted events cause additional problems. Given these shortcomings, estimates of long-term recurrence intervals are usually unstable as long as no additional information is included. In the present study, we assume that the time to the next major earthquake depends on the rate of small and intermediate events between the large ones, in terms of a ``clock-change'' model that leads to a Brownian Passage Time distribution for recurrence intervals. We take advantage of an earlier finding that the aperiodicity of this distribution can be related to the Gutenberg-Richter b-value, which is usually around one and can be estimated easily from instrumental seismicity in the region under consideration. This allows the uncertainties in the estimation of the mean recurrence interval to be reduced significantly, especially for short paleoearthquake sequences and high dating uncertainties. We present illustrative case studies from Southern California and compare the method with the commonly used approach of exponentially distributed recurrence times, which assumes a stationary Poisson process.
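The key computational step described above, fixing the aperiodicity a priori (e.g. from the regional b-value) and then estimating only the mean of a Brownian Passage Time distribution from a short paleo-interval sequence, can be sketched with a grid-search maximum-likelihood fit. The interval data, the aperiodicity value, and the grid are all invented for illustration; the abstract does not give the actual b-value-to-aperiodicity mapping.

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time density with mean mu and aperiodicity alpha."""
    return (math.sqrt(mu / (2 * math.pi * alpha**2 * t**3))
            * math.exp(-(t - mu) ** 2 / (2 * mu * alpha**2 * t)))

def fit_mean(intervals, alpha, grid):
    """Maximum-likelihood mean recurrence interval with the aperiodicity
    fixed in advance; a grid search keeps the sketch dependency-free."""
    def loglik(mu):
        return sum(math.log(bpt_pdf(t, mu, alpha)) for t in intervals)
    return max(grid, key=loglik)

paleo = [210.0, 150.0, 290.0, 180.0]       # invented paleo-intervals, in years
grid = [m / 2 for m in range(200, 1001)]   # candidate means 100..500 years
mu_hat = fit_mean(paleo, alpha=0.5, grid=grid)
print(abs(mu_hat - sum(paleo) / len(paleo)) < 50)  # True
```

With only one free parameter left, even four intervals pin down the mean reasonably well, which is the point of assimilating the instrumental-seismicity information.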

  15. Twitter earthquake detection: Earthquake monitoring in a social world (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.


    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds of feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average (STA/LTA) algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
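The STA/LTA detector described above is easy to sketch on a tweet-frequency time series: trigger whenever the short-term average tweet rate exceeds a multiple of the long-term average. The window lengths and trigger ratio below are illustrative, not the USGS operational settings.

```python
# Sketch of an STA/LTA detector on a tweet-frequency time series.
# Window lengths and the trigger ratio are illustrative only.

def sta_lta_triggers(counts, sta_win=1, lta_win=10, ratio=5.0):
    """Return indices where the short-term average tweet rate exceeds
    `ratio` times the long-term average. `counts` is tweets per minute."""
    triggers = []
    for i in range(lta_win, len(counts)):
        sta = sum(counts[i - sta_win + 1:i + 1]) / sta_win
        lta = sum(counts[i - lta_win:i]) / lta_win   # window before sample i
        if lta > 0 and sta / lta >= ratio:
            triggers.append(i)
    return triggers

# Quiet background chatter, then a burst of "earthquake" tweets.
counts = [2, 3, 2, 1, 2, 3, 2, 2, 3, 2, 2, 40, 55, 30]
print(sta_lta_triggers(counts))  # [11, 12]
```

Because the long-term window trails the current sample, the burst itself inflates the LTA after a couple of minutes, which naturally ends the trigger; the same behavior is familiar from seismic STA/LTA pickers.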

  16. An evaluation of Health of the Nation Outcome Scales data to inform psychiatric morbidity following the Canterbury earthquakes. (United States)

    Beaglehole, Ben; Frampton, Chris M; Boden, Joseph M; Mulder, Roger T; Bell, Caroline J


    Following the onset of the Canterbury, New Zealand earthquakes, there were widespread concerns that mental health services were under severe strain as a result of adverse consequences on mental health. We therefore examined Health of the Nation Outcome Scales data to see whether this could inform our understanding of the impact of the Canterbury earthquakes on patients attending local specialist mental health services. Health of the Nation Outcome Scales admission data were analysed for Canterbury mental health services prior to and following the Canterbury earthquakes. These findings were compared to Health of the Nation Outcome Scales admission data from seven other large District Health Boards to delineate local from national trends. Percentage changes in admission numbers were also calculated before and after the earthquakes for Canterbury and the seven other large district health boards. Admission Health of the Nation Outcome Scales scores in Canterbury increased after the earthquakes for adult inpatient and community services, old age inpatient and community services, and Child and Adolescent inpatient services compared to the seven other large district health boards. Admission Health of the Nation Outcome Scales scores for Child and Adolescent community services did not change significantly, while admission Health of the Nation Outcome Scales scores for Alcohol and Drug services in Canterbury fell compared to other large district health boards. Subscale analysis showed that the majority of Health of the Nation Outcome Scales subscales contributed to the overall increases found. Percentage changes in admission numbers for the Canterbury District Health Board and the seven other large district health boards before and after the earthquakes were largely comparable with the exception of admissions to inpatient services for the group aged 4-17 years which showed a large increase. The Canterbury earthquakes were followed by an increase in Health of the Nation

  17. Instruction system upon occurrence of earthquakes

    International Nuclear Information System (INIS)

    Inagaki, Masakatsu; Morikawa, Matsuo; Suzuki, Satoshi; Fukushi, Naomi.


    Purpose: To enable rapid restart of a nuclear reactor after an earthquake by informing operators of the properties of the earthquake encountered and by clearly displaying the state of any damage in comparison with the design standard values of the facilities. Constitution: Even when the maximum accelerations of an encountered earthquake exceed the design standard values, equipment may remain intact depending on the wave components of the seismic motion and the vibration properties inherent to the equipment. Taking note of this fact, the instruction device comprises a system that indicates the relationship between the seismic waveforms of the encountered earthquake and the scram setting values, a system that compares the floor response spectrum of the encountered earthquake's waveforms with the design floor response spectrum used in the design of the equipment, and a system that indicates which equipment requires inspection after the earthquake. Accordingly, it is possible to improve operability during the scram of a nuclear power plant that undergoes an earthquake, and to improve plant availability and safety by clearly defining which portions require post-earthquake inspection. (Kawakami, Y.)

  18. Risk Informed Design Using Integrated Vehicle Rapid Assessment Tools (United States)

    National Aeronautics and Space Administration — A successful proof of concept was performed in FY 2012 integrating the Envision tool for parametric estimates of vehicle mass and the Rapid Response Risk Assessment...

  19. Smartphone MEMS accelerometers and earthquake early warning (United States)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.


    The low-cost MEMS accelerometers in smartphones are attracting more and more attention from the science community due to their vast numbers and potential applications in various areas. We are using the accelerometers inside smartphones to detect earthquakes. We performed shake-table tests showing that these accelerometers are also suitable for recording the large shaking caused by earthquakes. We developed an Android app, MyShake, which can distinguish earthquake motion from daily human activity in the recordings made by the accelerometers in personal smartphones and upload trigger information and waveforms to our server for further analysis. The data from these smartphones form a unique dataset for seismological applications such as earthquake early warning. In this talk I will lay out the method we use to recognize earthquake-like motion on a single smartphone, and give an overview of the whole system that harnesses the information from a network of smartphones for rapid earthquake detection. This type of system can be easily deployed and scaled up around the globe and provides additional insight into earthquake hazards.
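
    As a rough illustration of how earthquake-like motion might be separated from everyday phone handling, the sketch below flags records that combine a sizable peak amplitude with low-frequency content (human activity such as walking or shaking the phone tends to be higher-frequency). The features and thresholds here are illustrative assumptions, not MyShake's actual on-phone classifier, which is reported to use a trained artificial neural network.

```python
import numpy as np

def looks_like_earthquake(accel, fs, amp_thresh=0.03, freq_thresh=5.0):
    """Flag records combining a sizable peak amplitude with low-frequency
    content. amp_thresh (in g) and freq_thresh (Hz) are illustrative
    assumptions, not MyShake's trained-classifier parameters."""
    a = np.asarray(accel, dtype=float)
    a = a - a.mean()                      # remove gravity / DC offset
    peak = np.abs(a).max()
    # half the zero-crossing rate approximates the dominant frequency (Hz)
    crossings = np.count_nonzero(np.diff(np.sign(a)))
    dominant_freq = crossings * fs / (2.0 * len(a))
    return bool(peak > amp_thresh and dominant_freq < freq_thresh)
```

    For example, a strong 2 Hz oscillation (earthquake-like) passes this check, while an equally strong 20 Hz oscillation (handling-like) does not.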

  20. Exposure to rapid succession disasters: a study of residents at the epicenter of the Chilean Bío Bío earthquake. (United States)

    Garfin, Dana Rose; Silver, Roxane Cohen; Ugalde, Francisco Javier; Linn, Heiko; Inostroza, Manuel


    We examined cumulative and specific types of trauma exposure as predictors of distress and impairment following a multifaceted community disaster. Approximately 3 months after the 8.8 magnitude earthquake, tsunami, and subsequent looting in Bío Bío, Chile, face-to-face interviews were conducted in 5 provinces closest to the epicenter. Participants (N = 1,000) were randomly selected using military topographic records and census data. Demographics, exposure to discrete components of the disaster (earthquake, tsunami, looting), and exposure to secondary stressors (property loss, injury, death) were evaluated as predictors of posttraumatic stress (PTS) symptoms, global distress, and functional impairment. Prevalence of probable posttraumatic stress disorder was 18.95%. In adjusted models examining specificity of exposure to discrete disaster components and secondary stressors, PTS symptoms and global distress were associated with earthquake intensity, tsunami exposure, and injury to self/close other. Increased functional impairment correlated with earthquake intensity and injury to self/close other. In adjusted models, cumulative exposure to secondary stressors correlated with PTS symptoms, global distress, and functional impairment; cumulative count of exposure to discrete disaster components did not. Exploratory analyses indicated that, beyond direct exposure, appraising the tsunami and looting as the worst components of the disaster correlated with greater media exposure and higher socioeconomic status, respectively. Overall, threat to life indicators correlated with worse outcomes. As failure of government tsunami warnings resulted in many deaths, findings suggest disasters compounded by human errors may be particularly distressing. We advance theory regarding cumulative and specific trauma exposure as predictors of postdisaster distress and provide information for enhancing targeted postdisaster interventions. (c) 2014 APA, all rights reserved.

  1. Understanding Earthquakes (United States)

    Davis, Amanda; Gray, Ron


    December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

  2. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden


    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average (STA/LTA) algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
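
    The STA/LTA idea described above can be sketched on a per-minute tweet-count series: trigger when the short-term mean rate jumps well above the long-term background. Window lengths and the trigger ratio below are illustrative assumptions, not the tuned values from the study.

```python
import numpy as np

def sta_lta_detect(counts, sta_win=2, lta_win=20, ratio_thresh=5.0):
    """Return indices (e.g. minutes) where the short-term average of the
    tweet-count series exceeds ratio_thresh times the long-term average.
    Window lengths and threshold are illustrative, not the study's values."""
    counts = np.asarray(counts, dtype=float)
    triggers = []
    for i in range(lta_win, len(counts)):
        sta = counts[i - sta_win:i].mean()   # short-term average
        lta = counts[i - lta_win:i].mean()   # long-term (background) average
        if lta > 0 and sta / lta >= ratio_thresh:
            triggers.append(i)
    return triggers

# Quiet background of ~1 "earthquake" tweet per minute, then a felt event.
series = [1] * 30 + [40, 80, 60] + [5] * 10
print(sta_lta_detect(series))
```

    On this made-up series the detector fires within a minute of the burst, consistent with the fast detections reported above.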

  3. Fast mapping rapidly integrates information into existing memory networks. (United States)

    Coutanche, Marc N; Thompson-Schill, Sharon L


    Successful learning involves integrating new material into existing memory networks. A learning procedure known as fast mapping (FM), thought to simulate the word-learning environment of children, has recently been linked to distinct neuroanatomical substrates in adults. This idea suggested the (never-before-tested) hypothesis that FM may promote rapid incorporation into cortical memory networks. We test this hypothesis here in 2 experiments. In our 1st experiment, we introduced 50 participants to 16 unfamiliar animals and names through FM or explicit encoding (EE) and tested participants on the training day, and again after sleep. Learning through EE produced strong declarative memories, without immediate lexical competition, as expected from slow-consolidation models. Learning through FM, however, led to almost immediate lexical competition, which continued to the next day. Additionally, the learned words began to prime related concepts on the day following FM (but not EE) training. In a 2nd experiment, we replicated the lexical integration results and determined that presenting an already-known item during learning was crucial for rapid integration through FM. The findings presented here indicate that learned items can be integrated into cortical memory networks at an accelerated rate through fast mapping. The retrieval of a related known concept, in order to infer the target of the FM question, is critical for this effect. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  4. A rapid extraction of landslide disaster information research based on GF-1 image (United States)

    Wang, Sai; Xu, Suning; Peng, Ling; Wang, Zhiyi; Wang, Na


    In recent years, landslide disasters have occurred frequently because of seismic activity. They bring great harm to people's lives and have attracted high attention from the state and extensive concern from society. In the field of geological disasters, landslide information extraction based on remote sensing has been controversial, but high-resolution remote sensing imagery can effectively improve the accuracy of information extraction with its rich texture and geometric information. It is therefore feasible to extract information on earthquake-triggered landslides with serious surface damage at large scale. Taking Wenchuan County as the study area, this paper uses a multi-scale segmentation method to extract landslide image objects from domestic GF-1 images and DEM data, using the Estimation of Scale Parameter tool to determine the optimal segmentation scale. After comprehensively analyzing the characteristics of landslides in high-resolution imagery and selecting spectral, textural, geometric, and landform features of the image, we establish extraction rules to extract landslide disaster information. The extraction results show 20 landslides with a total area of 521279.31. Compared with visual interpretation results, the extraction accuracy is 72.22%. This study indicates that it is efficient and feasible to extract earthquake landslide disaster information based on high-resolution remote sensing, and it provides important technical support for post-disaster emergency investigation and disaster assessment.

  5. Proposal as to Efficient Collection and Exploitation of Earthquake Damage Information and Verification by Field Experiment at Toyohashi City (United States)

    Zama, Shinsaku; Endo, Makoto; Takanashi, Ken'ichi; Araiba, Kiminori; Sekizawa, Ai; Hosokawa, Masafumi; Jeong, Byeong-Pyo; Hisada, Yoshiaki; Murakami, Masahiro

    Based on the earlier finding that damage information can be gathered quickly in a municipality with a smaller population, it is proposed that damage information be gathered and analyzed using an area roughly equivalent to a primary school district as the basic unit. The introduction of this type of decentralized system is expected to quickly gather important information on each area. The information gathered by these communal disaster-prevention bases is sent to the disaster-prevention headquarters, which in turn feeds back more extensive information over a wider area to the communal disaster-prevention bases. Concrete systems were developed according to this framework, and we performed large-scale experiments simulating disaster information collection, transmission, and utilization for smooth responses to an earthquake disaster, in collaboration with Toyohashi City, Aichi Prefecture, which is considered highly likely to suffer extensive damage from the Tokai and Tonankai Earthquakes. Using disaster information collection/transmission equipment composed of a long-distance wireless LAN, a notebook computer, a Web camera, and an IP telephone, city staff could easily input and transmit information such as fires, collapsed houses, and impassable roads collected by the inhabitants who participated in the experiment. Headquarters could confirm this information on an automatically plotted map, as well as the state of each disaster-prevention facility by means of Web cameras and IP telephones. Based on the damage information, fire-spread, evacuation, and traffic simulations were automatically executed at the disaster countermeasure office, and their results were displayed on a large screen for use in making decisions such as residents' evacuation. These simulated results were simultaneously displayed at each disaster-prevention facility and served to help people understand the

  6. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui


    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on the basic design processes important to both non-specialists and engineers, so that readers become suitably well informed without needing to deal with the details of specialist understanding. The content of this encyclopedia informs technically inclined readers about the ways in which earthquakes can affect our infrastructure and how engineers go about designing against, mitigating, and remediating these effects. The coverage spans buildings, foundations, underground construction, lifelines and bridges, roads, embankments, and slopes. The encycl...

  7. Reviewing information support during the Great East Japan Earthquake disaster : From the perspective of a hospital library that received support (United States)

    Terasawa, Motoko

    The Great East Japan Earthquake of March 11, 2011 caused extensive damage over a widespread area. Our hospital library, which is located in the affected area, was no exception. A large collection of books was lost, and some web content was inaccessible due to damage to the network environment. This greatly hindered our efforts to continue providing post-disaster medical information services. Information support, such as free access to databases, journals, and other online content related to the disaster areas, helped us immensely during this time. We were fortunate to have the cooperation of various medical employees and library members via social networks, such as Twitter, during the process of obtaining this information support.

  8. The Advanced Rapid Imaging and Analysis (ARIA) Project: Status of SAR products for Earthquakes, Floods, Volcanoes and Groundwater-related Subsidence (United States)

    Owen, S. E.; Yun, S. H.; Hua, H.; Agram, P. S.; Liu, Z.; Sacco, G. F.; Manipon, G.; Linick, J. P.; Fielding, E. J.; Lundgren, P.; Farr, T. G.; Webb, F.; Rosen, P. A.; Simons, M.


    The Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards is focused on rapidly generating high-level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. Space-based geodetic measurement techniques including Interferometric Synthetic Aperture Radar (InSAR), differential Global Positioning System, and SAR-based change detection have become critical additions to our toolset for understanding and mapping the damage and deformation caused by earthquakes, volcanic eruptions, floods, landslides, and groundwater extraction. Up until recently, processing of these data sets has been handcrafted for each study or event and has not generated products rapidly and reliably enough for response to natural disasters or for timely analysis of large data sets. The ARIA project, a joint venture co-sponsored by the California Institute of Technology and by NASA through the Jet Propulsion Laboratory, has been capturing the knowledge applied to these responses and building it into an automated infrastructure to generate imaging products in near real-time that can improve situational awareness for disaster response. In addition to supporting the growing science and hazard response communities, the ARIA project has developed the capabilities to provide automated imaging and analysis capabilities necessary to keep up with the influx of raw SAR data from geodetic imaging missions such as ESA's Sentinel-1A/B, now operating with repeat intervals as short as 6 days, and the upcoming NASA NISAR mission. We will present the progress and results we have made on automating the analysis of Sentinel-1A/B SAR data for hazard monitoring and response, with emphasis on recent developments and end user engagement in flood extent mapping and deformation time series for both volcano

  9. Earthquake Early Warning: User Education and Designing Effective Messages (United States)

    Burkett, E. R.; Sellnow, D. D.; Jones, L.; Sellnow, T. L.


    The U.S. Geological Survey (USGS) and partners are transitioning from test-user trials of a demonstration earthquake early warning system (ShakeAlert) to deciding and preparing how to implement the release of earthquake early warning information, alert messages, and products to the public and other stakeholders. An earthquake early warning system uses seismic station networks to rapidly gather information about an occurring earthquake and send notifications to user devices ahead of the arrival of potentially damaging ground shaking at their locations. Earthquake early warning alerts can thereby allow time for actions to protect lives and property before arrival of damaging shaking, if users are properly educated on how to use and react to such notifications. A collaboration team of risk communications researchers and earth scientists is researching the effectiveness of a chosen subset of potential earthquake early warning interface designs and messages, which could be displayed on a device such as a smartphone. Preliminary results indicate, for instance, that users prefer alerts that include 1) a map to relate their location to the earthquake and 2) instructions for what to do in response to the expected level of shaking. A number of important factors must be considered to design a message that will promote appropriate self-protective behavior. While users prefer to see a map, how much information can be processed in limited time? Are graphical representations of wavefronts helpful or confusing? The most important factor to promote a helpful response is the predicted earthquake intensity, or how strong the expected shaking will be at the user's location. 
Unlike Japanese users of early warning, few Californians are familiar with the earthquake intensity scale, so we are exploring how differentiating instructions between intensity levels (e.g., "Be aware" for lower shaking levels and "Drop, cover, hold on" at high levels) can be paired with self-directed supplemental

  10. Earthquake magnitude estimation using the τ c and P d method for earthquake early warning systems (United States)

    Jin, Xing; Zhang, Hongcai; Li, Jun; Wei, Yongxiang; Ma, Qiang


    Earthquake early warning (EEW) systems are one of the most effective ways to reduce earthquake disasters. Earthquake magnitude estimation is one of the most important, and also the most difficult, parts of an EEW system. In this paper, based on 142 earthquake events and 253 seismic records recorded by KiK-net in Japan and from aftershocks of the large Wenchuan earthquake in Sichuan, we obtained earthquake magnitude estimation relationships using the τ c and P d methods. The standard variances of the magnitude calculations from these two formulas are ±0.65 and ±0.56, respectively. The P d value can also be used to estimate the peak ground velocity, and warning information can then be released to the public rapidly according to the estimation results. In order to ensure the stability and reliability of the magnitude estimates, we propose a compatibility test based on the nature of these two parameters. The reliability of the early warning information is significantly improved through this test.
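
    For context, τ c is commonly computed from the ratio of integrated squared displacement to integrated squared velocity over the first few seconds of the P wave, and magnitude then follows from a log-linear regression. A minimal sketch of that computation; the regression coefficients a and b below are placeholders, not the fitted values from this paper:

```python
import numpy as np

def tau_c(disp, vel, dt):
    """tau_c = 2*pi*sqrt( int u(t)^2 dt / int v(t)^2 dt ), with u the
    displacement and v the velocity over the initial P-wave window."""
    num = np.sum(np.square(disp)) * dt
    den = np.sum(np.square(vel)) * dt
    return 2.0 * np.pi * np.sqrt(num / den)

def magnitude_from_tau_c(tc, a=3.4, b=5.8):
    """Log-linear relation M = a*log10(tau_c) + b; a and b here are
    placeholder coefficients, not the paper's fitted values."""
    return a * np.log10(tc) + b

# Sanity check: a pure 1 Hz sinusoid has tau_c equal to its period (1 s).
dt = 0.01
t = np.arange(0.0, 3.0, dt)
u = np.sin(2 * np.pi * t)            # displacement
v = 2 * np.pi * np.cos(2 * np.pi * t)  # its exact time derivative
print(tau_c(u, v, dt))
```

    The sinusoid check is useful because τ c of a monochromatic signal reduces exactly to its dominant period, which is what makes it a magnitude proxy: larger events radiate longer-period initial P waves.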

  11. Information entropy of earthquake populations in northeastern Italy and western Slovenia (United States)

    Bressan, G.; Barnaba, C.; Gentili, S.; Rossi, G.


    The spatio-temporal evolution of eight seismicity populations, preceding and following moderate earthquake sequences that occurred in NE Italy and W Slovenia, is investigated by means of the normalized Shannon entropy and the fractal dimension. Three phases are recognized in the temporal seismic series. The period preceding the mainshock is characterized by oscillations of the Shannon entropy around a nearly constant level and by fluctuations of the fractal dimension. The phase of the mainshock and aftershock sequences is characterized by a significant decrease of the Shannon entropy. A simultaneous marked decrease of the fractal dimension is observed in five cases. After the sequence, the entropy recovers the nearly constant trend seen before the mainshock, and the fractal dimension is again characterized by fluctuations. We interpret the fluctuations of the normalized Shannon entropy and the fractal dimension as caused by the coupling between the stress field and the mechanical heterogeneities of the crust, which results in spatial and temporal fluctuations of the strain energy.
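
    The normalized Shannon entropy of a seismicity population can be illustrated on binned event counts: it is 1 when events are spread uniformly over the bins and drops toward 0 as events concentrate, which is why it falls during mainshock–aftershock clustering. The bin counts below are made up for illustration.

```python
import numpy as np

def normalized_shannon_entropy(counts):
    """H / H_max for events distributed over bins: 1.0 for a uniform
    spread, approaching 0.0 as events concentrate in one bin."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()   # empirical probabilities
    h = -np.sum(p * np.log(p))              # Shannon entropy
    return h / np.log(len(counts))          # normalize by H_max = ln(N bins)

print(normalized_shannon_entropy([5, 5, 5, 5]))    # uniform background
print(normalized_shannon_entropy([20, 0, 0, 0]))   # aftershock-like cluster
```
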

  12. Rapid Seismic Deployment for Capturing Aftershocks of the September 2017 Tehuantepec, Mexico (M=8.1) and Morelos-Puebla (M=7.1), Mexico Earthquakes (United States)

    Velasco, A. A.; Karplus, M. S.; Dena, O.; Gonzalez-Huizar, H.; Husker, A. L.; Perez-Campos, X.; Calo, M.; Valdes, C. M.


    The September 7 Tehuantepec, Mexico (M=8.1) and the September 19 Morelos-Puebla, Mexico (M=7.1) earthquakes ruptured with extensional faulting within the Cocos Plate at 70-km and 50-km depth, as it subducts beneath the continental North American Plate. Both earthquakes caused significant damage and loss of life. These events were followed by a M=6.1 extensional earthquake at only 10-km depth in Oaxaca on September 23, 2017. While the Morelos-Puebla earthquake was likely too far away to be statically triggered by the Tehuantepec earthquake, initial Coulomb stress analyses show that the M=6.1 event may have been an aftershock of the Tehuantepec earthquake. Many questions remain about these earthquakes, including: Did the Cocos Plate earthquakes load the upper plate, and could they possibly trigger an equal or larger earthquake on the plate interface? Are these the result of plate bending? Do the aftershocks migrate to the locked zone in the subduction zone? Why did the intermediate-depth earthquakes create so much damage? Are these earthquakes linked by dynamic stresses? Is it possible that a potential slow-slip event triggered both events? To address some of these questions, we deployed 10 broadband seismometers near the epicenter of the Tehuantepec, Mexico earthquake and 51 UTEP-owned nodes (5-Hz, 3-component geophones) to record aftershocks and augment networks deployed by the Universidad Nacional Autónoma de México (UNAM). The 10 broadband instruments will be deployed for 6 months, while the nodes were deployed for 25 days. The relative ease of deployment and larger numbers of the nodes allowed us to deploy them quickly in the area near the M=6.1 Oaxaca earthquake, just a few days after that earthquake struck. We deployed them near the heavily damaged cities of Juchitan, Ixtaltepec, and Ixtepec as well as in Tehuantepec and Salina Cruz, Oaxaca in order to test their capabilities for site characterization and aftershock studies. This is the first test of these

  13. Remote Sensing and Geographic Information Systems (GIS) Contribution to the Inventory of Infrastructure Susceptible to Earthquake and Flooding Hazards in North-Eastern Greece

    Directory of Open Access Journals (Sweden)

    Ioanna Papadopoulou


    Full Text Available For civil protection reasons there is a strong need to improve the inventory of areas that are more vulnerable to earthquake ground motions or to earthquake-related secondary effects, such as landslides, liquefaction, or soil amplification. The use of remote sensing and Geographic Information Systems (GIS) methods, along with the related geo-databases, can help local and national authorities to be better prepared and organized. Remote sensing and GIS techniques are investigated in north-eastern Greece in order to contribute to a systematic, standardized inventory of the areas that are more susceptible to earthquake ground motions, to earthquake-related secondary effects, and to tsunami waves. Knowledge of areas with an aggregated occurrence of causal (“negative”) factors influencing the earthquake shock, and thus the damage intensity, can be integrated into disaster preparedness and mitigation measures. The evaluation of satellite imagery, digital topographic data, and open-source geodata contributes to establishing the specific tectonic, geologic, and geomorphologic settings influencing local site conditions in an area and, thus, to estimating possible damage to be suffered.

  14. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.


    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  15. An interdisciplinary approach for earthquake modelling and forecasting (United States)

    Han, P.; Zhuang, J.; Hattori, K.; Ogata, Y.


    Earthquakes are among the most serious disasters and may cause heavy casualties and economic losses. In the past two decades especially, huge earthquakes have hit many countries. Effective earthquake forecasting (of time, location, and magnitude) has become extremely important and urgent. To date, various heuristically derived algorithms have been developed for forecasting earthquakes. Generally, they can be classified into two types: catalog-based approaches and non-catalog-based approaches. Thanks to the rapid development of statistical seismology over the past 30 years, we are now able to evaluate the performance of these earthquake forecast approaches quantitatively. Although a certain amount of precursory information is available in both earthquake catalogs and non-catalog observations, earthquake forecasting is still far from satisfactory, and in most cases precursory phenomena have been studied individually. An earthquake model that combines self-exciting and mutually exciting elements was developed by Ogata and Utsu from the Hawkes process. The core idea of this combined model is that the status of the process at present is controlled by the events themselves (self-exciting) and by external factors (mutually exciting) in the past. In essence, the conditional intensity function is a time-varying Poisson process with rate λ(t), composed of the background rate, a self-exciting term (information from past seismic events), and an external excitation term (information from past non-seismic observations). This model shows a way to integrate catalog-based and non-catalog-based forecasts. Against this background, we are developing a new earthquake forecast model that combines catalog-based and non-catalog-based approaches.
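
    The conditional intensity described above can be sketched directly: a background rate plus decaying contributions from past earthquakes (self-exciting) and from past non-seismic observations (mutually exciting). The exponential kernel and all parameter values below are illustrative assumptions; the Ogata–Utsu formulation is more general.

```python
import math

def conditional_intensity(t, quake_times, signal_times,
                          mu=0.1, alpha_self=0.5, alpha_ext=0.3, beta=1.0):
    """lambda(t) = mu (background rate)
                 + sum over past earthquakes       (self-exciting term)
                 + sum over past non-seismic obs.  (mutually exciting term),
    here with simple exponential decay kernels exp(-beta * elapsed)."""
    self_term = sum(alpha_self * math.exp(-beta * (t - ti))
                    for ti in quake_times if ti < t)
    ext_term = sum(alpha_ext * math.exp(-beta * (t - tj))
                   for tj in signal_times if tj < t)
    return mu + self_term + ext_term
```

    With no past events the rate is just the background μ; each past earthquake or precursory observation raises λ(t) by an amount that decays with elapsed time, which is how the model fuses catalog and non-catalog information in one intensity.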

  16. How to provide risk information. Based on citizen's evaluation of messages concerning radiation protection and anti-earthquake measures of nuclear power plant

    International Nuclear Information System (INIS)

    Tsuchiya, Tomoko; Kosugi, Motoko; Nakamura, Yasushi; Takahashi, Shigeaki; Harayama, Satoko


    The Framework for Nuclear Energy Policy in Japan, decided in 2005, calls for risk communication to regain social trust in the nuclear industry. Electric power companies, however, have little experience of providing risk information as the first step of risk communication. This report analyzes which messages are understandable, useful, and trustworthy by comparing two sets of different messages concerning radiation protection and anti-earthquake measures at nuclear power plants, based on an interview survey of 30 people living in the Tokyo metropolitan area. Participants in our survey evaluated the message about radiation protection that included risk information as more reliable than the one without risk information, but the former was less understandable and caused more anxiety than the latter. In the case of messages regarding the seismic measures of a nuclear power plant, people were not satisfied with the mere assertion that anti-earthquake measures had been implemented; they wanted adequate grounds on which to accept that those measures were thorough. Another message on seismic measures that we drafted contains basic knowledge about earthquake scales and explains that the nuclear industry will consider earthquakes larger than those in past records, learn from past experience, and improve its measures. 70% of participants assessed this message as more understandable, useful, and trustworthy than the one that only explained the seismic measures implemented. (author)

  17. Do I Really Sound Like That? Communicating Earthquake Science Following Significant Earthquakes at the NEIC (United States)

    Hayes, G. P.; Earle, P. S.; Benz, H.; Wald, D. J.; Yeck, W. L.


    The U.S. Geological Survey's National Earthquake Information Center (NEIC) responds to about 160 magnitude 6.0 and larger earthquakes every year and is regularly inundated with information requests following earthquakes that cause significant impact. These requests often start within minutes after the shaking occurs and come from a wide user base including the general public, media, emergency managers, and government officials. Over the past several years, the NEIC's earthquake response has evolved its communications strategy to meet the changing needs of users and the evolving media landscape. The NEIC produces a cascade of products starting with basic hypocentral parameters and culminating with estimates of fatalities and economic loss. We speed the delivery of content by prepositioning and automatically generating products such as aftershock plots, regional tectonic summaries, maps of historical seismicity, and event summary posters. Our goal is to have information immediately available so we can quickly address the response needs of a particular event or sequence. This information is distributed to hundreds of thousands of users through social media, email alerts, programmatic data feeds, and webpages. Many of our products are included in event summary posters that can be downloaded and printed for local display. After significant earthquakes, keeping up with direct inquiries and interview requests from TV, radio, and print reporters is always challenging. The NEIC works with the USGS Office of Communications and the USGS Science Information Services to organize and respond to these requests. Written executive summary reports are produced and distributed to USGS personnel and collaborators throughout the country. These reports are updated during the response to keep our message consistent and information up to date. This presentation will focus on communications during NEIC's rapid earthquake response but will also touch on the broader USGS traditional and

  18. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian


    The geographic information system (GIS) of the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of damage and the isoseismals of the earthquake are studied. By comparison with the standard earthquake intensity attenuation relationship, anomalous damage distributions of the earthquake are found, and their relationships with tectonics, site conditions and basins are analyzed. In this paper, the influence on ground motion of the earthquake source and of underground structures near the source is also studied. The implications of the anomalous damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction and earthquake emergency response are discussed.

  19. Assigning probability gain for precursors of four large Chinese earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, T.; Aki, K.


    We extend the concept of probability gain associated with a precursor (Aki, 1981) to a set of precursors which may be mutually dependent. Making use of a new formula, we derive a criterion for selecting precursors from a given data set in order to calculate the probability gain. The probabilities per unit time immediately before four large Chinese earthquakes are calculated. They are approximately 0.09, 0.09, 0.07 and 0.08 per day for the 1975 Haicheng (M = 7.3), 1976 Tangshan (M = 7.8), 1976 Longling (M = 7.6), and 1976 Songpan (M = 7.2) earthquakes, respectively. These results are encouraging because they suggest that the investigated precursory phenomena may have contained the complete information for earthquake prediction, at least for the above earthquakes. With this method, the step-by-step approach to prediction used in China may be quantified in terms of the probability of earthquake occurrence. The ln P versus t curve (where P is the probability of earthquake occurrence at time t) shows that ln P does not increase linearly with t but more rapidly as the time of the earthquake approaches.
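    The combined-precursor probability described above can be illustrated with a minimal sketch. This sketch assumes mutually independent precursors, the simple case that Aki's (1981) single-precursor gain extends; the authors' actual formula handles dependent precursors, and all numbers below are illustrative, not taken from the paper.

    ```python
    # Hypothetical sketch (not the authors' formula): combining precursor
    # probability gains under an independence assumption, after Aki (1981).
    def conditional_probability(base_rate: float, gains: list[float]) -> float:
        """Probability of an earthquake per unit time given observed precursors.

        base_rate: unconditional probability per day (e.g., from long-term seismicity).
        gains: probability gain g_i = P(precursor_i | earthquake) / P(precursor_i)
               for each observed precursor, assumed mutually independent here.
        """
        p = base_rate
        for g in gains:
            p *= g
        return min(p, 1.0)  # clip: a probability cannot exceed 1

    # Illustrative numbers only: a base rate of 1e-4 per day and three
    # precursors with gains of 30, 10, and 3 yield 0.09 per day, the order
    # of magnitude the abstract reports before the four Chinese earthquakes.
    p = conditional_probability(1e-4, [30, 10, 3])
    ```

    Each additional observed precursor multiplies the per-unit-time probability by its gain, which is why ln P rises faster than linearly as more precursors appear close to the event.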

  20. Rapid Information and Communication Technology Assessment Team (RTAT): enabling the "hands and feet" to win the "hearts and minds"


    Beeson, R. Travis


    Approved for public release; distribution is unlimited Large-scale disasters severely damage local information and communication technology (ICT) infrastructure. This negatively impacts responders’ ability to communicate and collaborate with one another. As a result, humanitarian assistance (HA) response organizations cannot maintain situational awareness and efforts remain disjointed and inefficient. Out of the rubble of the Haiti earthquake, a cross-organizational collection of first res...

  1. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru. (United States)

    Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan


    Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and the geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were obtained from Demographic and Health Surveys. Earthquake data were obtained from the U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake-affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case: a magnitude 8.4 earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS-based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.
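    The exposure-grouping step described above can be sketched as follows. The intensity thresholds, group labels, and numbers are hypothetical, for illustration only; the study's actual groups come from its ShakeMap shaking-intensity zones.

    ```python
    # Hypothetical sketch of the exposure-grouping step: assign each survey
    # cluster to an earthquake-exposure group from the ShakeMap intensity
    # (MMI) at its coordinates. The cut-offs and data are illustrative,
    # not those used in the study.
    def exposure_group(mmi: float) -> str:
        if mmi >= 7.0:
            return "high"
        elif mmi >= 5.0:
            return "moderate"
        else:
            return "low"

    # clusters: (cluster_id, MMI at cluster location, stunting prevalence)
    clusters = [(1, 7.8, 0.31), (2, 5.4, 0.24), (3, 3.2, 0.22)]
    groups: dict[str, list[float]] = {}
    for _cid, mmi, prevalence in clusters:
        groups.setdefault(exposure_group(mmi), []).append(prevalence)

    # Mean stunting prevalence per exposure group, the quantity one would
    # compare pre- and post-earthquake.
    means = {g: sum(v) / len(v) for g, v in groups.items()}
    ```

    Comparing these group means before and after the event is what allows exposure-dependent effects to be separated from secular trends.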

  2. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru.

    Directory of Open Access Journals (Sweden)

    Henny Rydberg

    Full Text Available Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and the geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were obtained from Demographic and Health Surveys. Earthquake data were obtained from the U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake-affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case: a magnitude 8.4 earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS-based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.

  3. Using Rapid Ethnography to Support the Design and Implementation of Health Information Technologies. (United States)

    Ackerman, Sara; Gleason, Nathaniel; Gonzales, Ralph


    Ethnography is the defining practice - and art - of anthropology. Among health information technology (IT) developers, however, ethnography remains a little-used and undervalued mode of inquiry and representation. In this chapter we demonstrate that ethnography can make important contributions to the design and implementation of more user-oriented health IT devices and systems. In particular, we propose 'rapid ethnography' as a pragmatic strategy that draws on classic ethnographic methods, but emphasizes shorter periods of fieldwork and quick turnaround of findings to inform (re)design, programming and implementation efforts. Rapid ethnography is theoretically and empirically situated in science and technology studies' explorations of a) the entanglement of social and technical dimensions of technology use; b) how getting tools to 'work' requires aligning interests across a wide range of human and non-human actors; and c) the ways in which humans and technology transform each other as they interact. We provide two detailed case studies to illustrate the evolution and uses of rapid ethnography at a U.S. academic medical center. By providing deeper insights into the experiences of users, and the contexts and communities in which new tools are introduced, rapid ethnography can serve as a valuable component of Techno-Anthropology and health IT innovation.

  4. MyShake: A smartphone seismic network for earthquake early warning and beyond. (United States)

    Kong, Qingkai; Allen, Richard M; Schreier, Louis; Kwon, Young-Woo


    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics.
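    To illustrate the kind of on-phone screening described above: MyShake's real detector is a trained artificial neural network running on the phone, so the classic STA/LTA trigger below is a stand-in, not the authors' algorithm, and the window lengths and threshold are arbitrary choices.

    ```python
    # Illustrative stand-in for on-device event screening: the classic
    # short-term-average / long-term-average (STA/LTA) ratio applied to an
    # accelerometer stream. MyShake itself uses a neural-network classifier.
    def sta_lta_trigger(samples: list[float], sta_n: int = 50,
                        lta_n: int = 500, threshold: float = 4.0) -> bool:
        """Return True if the STA/LTA ratio of absolute acceleration
        exceeds the threshold at any point in the stream."""
        if len(samples) < lta_n:
            return False
        a = [abs(x) for x in samples]
        for i in range(lta_n, len(a)):
            lta = sum(a[i - lta_n:i]) / lta_n  # slow-moving background level
            sta = sum(a[i - sta_n:i]) / sta_n  # fast-moving recent level
            if lta > 0 and sta / lta >= threshold:
                return True
        return False

    # Steady low-level noise should not trigger; a sudden strong onset should.
    quiet = [0.01] * 1000
    shaking = [0.01] * 600 + [0.5] * 400
    ```

    In a network system like the one described, such a local trigger would only cause the phone to report a candidate detection; confirmation that an earthquake is under way comes from the server-side algorithm aggregating many phones.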

  5. Chance findings about early holocene tidal marshes of Grays Harbor, Washington, in relation to rapidly rising seas and great subduction earthquakes (United States)

    Phipps, James B.; Hemphill-Haley, Eileen; Atwater, Brian F.


    Tidal marshes commonly build upward apace with gradual rise in the level of the sea. It is expected, however, that few tidal marshes will keep up with accelerated sea-level rise later in this century. Tidal marshes have been drowned, moreover, after subsiding during earthquakes.

  6. Rapid humanitarian assessments and rationality: a value-of-information study from Iraq, 2003-04. (United States)

    Benini, Aldo; Conley, Charles


    Rapid assessments are one of the standard informational tools in humanitarian response and are supposed to contribute to rational decision-making.(1) The extent to which the assessment organisation itself behaves rationally, however, is an open question. This can be evaluated against multiple criteria, such as the cost and value of the information it collects and its ability to flexibly adapt its design or samples when the survey environment changes unforeseeably. An unusual data constellation from two concurrent recent (2003-04) rapid assessments in northern Iraq permits us to model part of the actual assessment behaviour in terms of geographical, community and prior substantive information attributes. The model correctly predicts, for 79 per cent of the 2,425 local communities in focus, the decisions that data collector teams in the Emergency Mine Action Survey made to visit or not to visit. The analysis demonstrates variably rational behaviour under conditions of insecurity, repeated regrouping and incomplete sampling frames. A pronounced bias towards very small rural settlements is irrational for the overall results, but may be a rational strategy of individual survey workers seeking to prolong their employment. Implications for future assessments are sketched in the areas of tools for urban surveys, greater adaptability, including early feedback from users, and sensitivity to value-of-information concepts.

  7. Leveraging geodetic data to reduce losses from earthquakes (United States)

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.


    Seismic hazard assessments that are based on a variety of data and the best available science, coupled with rapid synthesis of real-time information from continuous monitoring networks to guide post-earthquake response, form a solid foundation for effective earthquake loss reduction. With this in mind, the Earthquake Hazards Program (EHP) of the U.S. Geological Survey (USGS) Natural Hazards Mission Area (NHMA) engages in a variety of undertakings, both established and emergent, in order to provide high quality products that enable stakeholders to take action in advance of and in response to earthquakes. Examples include the National Seismic Hazard Model (NSHM), development of tools for improved situational awareness such as earthquake early warning (EEW) and operational earthquake forecasting (OEF), research about induced seismicity, and new efforts to advance comprehensive subduction zone science and monitoring. Geodetic observations provide unique and complementary information directly relevant to advancing many aspects of these efforts (fig. 1). EHP scientists have long leveraged geodetic data for a range of influential studies, and they continue to develop innovative observation and analysis methods that push the boundaries of the field of geodesy as applied to natural hazards research. Given the ongoing, rapid improvement in availability, variety, and precision of geodetic measurements, considering ways to fully utilize this observational resource for earthquake loss reduction is timely and essential. 
This report presents strategies, and the underlying scientific rationale, by which the EHP could achieve the following outcomes: the EHP is an authoritative source for the interpretation of geodetic data and its use for earthquake loss reduction throughout the United States and its territories; the USGS consistently provides timely, high-quality geodetic data to stakeholders; and significant earthquakes are better characterized by incorporating geodetic data into USGS

  8. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management (United States)

    Jaiswal, Kishor; Wald, David J.


    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992) and fatality rates are likely to continue to rise with increased population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that a large fraction of the world's population still resides in informal, poorly constructed and non-engineered dwellings, which have high susceptibility to collapse during earthquakes. Moreover, with increasing urbanization half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use for other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the

  9. The HayWired Earthquake Scenario—Earthquake Hazards (United States)

    Detweiler, Shane T.; Wein, Anne M.


    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  10. Rapid and highly informative diagnostic assay for H5N1 influenza viruses.

    Directory of Open Access Journals (Sweden)

    Nader Pourmand

    Full Text Available A highly discriminative and information-rich diagnostic assay for H5N1 avian influenza would meet immediate patient care needs and provide valuable information for public health interventions, e.g., tracking of new and more dangerous variants by geographic area as well as avian-to-human or human-to-human transmission. In the present study, we have designed a rapid assay based on multilocus nucleic acid sequencing that focuses on the biologically significant regions of the H5N1 hemagglutinin gene. This allows the prediction of viral strain, clade, receptor binding properties, low- or high-pathogenicity cleavage site and glycosylation status. H5 HA genes were selected from nine known high-pathogenicity avian influenza subtype H5N1 viruses, based on their diversity in biologically significant regions of hemagglutinin and/or their ability to cause infection in humans. We devised a consensus pre-programmed pyrosequencing strategy, which may be used as a faster, more accurate alternative to de novo sequencing. The available data suggest that the assay described here is a reliable, rapid, information-rich and cost-effective approach for definitive diagnosis of H5N1 avian influenza. Knowledge of the predicted functional sequences of the HA will enhance H5N1 avian influenza surveillance efforts.

  11. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.


    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  12. Episodic Memory Retrieval Functionally Relies on Very Rapid Reactivation of Sensory Information. (United States)

    Waldhauser, Gerd T; Braun, Verena; Hanslmayr, Simon


    Episodic memory retrieval is assumed to rely on the rapid reactivation of sensory information that was present during encoding, a process termed "ecphory." We investigated the functional relevance of this scarcely understood process in two experiments in human participants. We presented stimuli to the left or right of fixation at encoding, followed by an episodic memory test with centrally presented retrieval cues. This allowed us to track the reactivation of lateralized sensory memory traces during retrieval. Successful episodic retrieval led to a very early (∼100-200 ms) reactivation of lateralized alpha/beta (10-25 Hz) electroencephalographic (EEG) power decreases in the visual cortex contralateral to the visual field at encoding. Applying rhythmic transcranial magnetic stimulation to interfere with early retrieval processing in the visual cortex led to decreased episodic memory performance specifically for items encoded in the visual field contralateral to the site of stimulation. These results demonstrate, for the first time, that episodic memory functionally relies on very rapid reactivation of sensory information. Remembering personal experiences requires a "mental time travel" to revisit sensory information perceived in the past. This process is typically described as a controlled, relatively slow process. However, by using electroencephalography to measure neural activity with a high time resolution, we show that such episodic retrieval entails a very rapid reactivation of sensory brain areas. Using transcranial magnetic stimulation to alter brain function during retrieval revealed that this early sensory reactivation is causally relevant for conscious remembering. These results provide the first neural evidence for a functional, preconscious component of episodic remembering. This provides new insight into the nature of human memory and may help in the understanding of psychiatric conditions that involve the automatic intrusion of unwanted memories.

  13. Indoor radon and earthquake

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.


    Based on the experience of the Spitak earthquake (Armenia, December 1988), it is found for the first time that an earthquake causes intense and prolonged radon splashes which, rapidly dispersing in the open near-surface atmosphere, are contrastingly displayed in enclosed premises (dwellings, schools, kindergartens) even at considerable distance from the earthquake epicenter, and this multiplies the radiation influence on the population. The interval of splashes extends from the first foreshock to the last aftershock, i.e., several months. The area affected by radiation is larger than Armenia's territory. The scale of this impact on the population is 12 times higher than the number of people injured in Spitak, Leninakan and other settlements (toll of injured: 25,000 people; radiation-induced diseases: over 300,000 people). The radiation influence correlates directly with the earthquake force. This conclusion is underpinned by indoor radon monitoring data for Yerevan since 1987 (120 km from the epicenter; 5,450 measurements) and by multivariate analysis identifying cause-and-effect linkages between the geodynamics of indoor radon under stable and unstable conditions of the Earth's crust, the behavior of radon in different geological media during earthquakes, levels of indoor radon concentration, the effective equivalent dose of radiation, the impact of radiation dose on health, and statistical data on public health provided by the Ministry of Health. The following hitherto unexplained facts can be considered consequences of prolonged radiation influence on the human organism: the long-lasting state of apathy and indifference typical of the population of Armenia for more than a year after the earthquake, and the prevalence of malignant cancer forms in disaster zones, dominated by lung cancer, and so on. 
All urban territories of seismically active regions are exposed to the threat of natural earthquake-provoked radiation influence

  14. Rapid processing of data based on high-performance algorithms for solving inverse problems and 3D-simulation of the tsunami and earthquakes (United States)

    Marinin, I. V.; Kabanikhin, S. I.; Krivorotko, O. I.; Karas, A.; Khidasheli, D. G.


    We consider new techniques and methods for earthquake- and tsunami-related problems, particularly inverse problems for the determination of tsunami source parameters, numerical simulation of long-wave propagation in soil and water, and tsunami risk estimation. In addition, we touch upon the issues of database management and destruction scenario visualization. New approaches and strategies, as well as mathematical tools and software, are shown. The long joint investigations by researchers of the Institute of Mathematical Geophysics and Computational Mathematics SB RAS and specialists from WAPMERR and Informap have produced special theoretical approaches, numerical methods, and software for tsunami and earthquake modeling (modeling of the propagation and run-up of tsunami waves on coastal areas), visualization, and risk estimation for tsunamis and earthquakes. Algorithms are developed for the operational determination of the origin and form of the tsunami source. The system TSS numerically simulates the source of a tsunami and/or earthquake and includes the possibility to solve both the direct and the inverse problem. It becomes possible to involve advanced mathematical results to improve models and to increase the resolution of inverse problems. Via TSS one can construct risk maps, online disaster scenarios, and estimates of potential damage to buildings and roads. One of the main tools for the numerical modeling is the finite volume method (FVM), which allows us to achieve stability with respect to possible input errors as well as optimum computing speed. Our approach to the inverse problem of tsunami and earthquake determination is based on recent theoretical results concerning the Dirichlet problem for the wave equation. This problem is intrinsically ill-posed. We use an optimization approach to solve this problem and SVD analysis to estimate the degree of ill-posedness and to find the quasi-solution. The software system we developed is intended to
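    The SVD-based quasi-solution mentioned above can be sketched for a generic ill-posed linear system. This is a textbook truncated-SVD illustration, not the TSS implementation, and the `rcond` truncation level and matrix are arbitrary examples.

    ```python
    # Minimal sketch (illustrative, not the TSS code) of a truncated-SVD
    # "quasi-solution" of an ill-posed linear inverse problem A m = d:
    # small singular values, the signature of ill-posedness, are discarded
    # before inversion so that data noise is not amplified.
    import numpy as np

    def tsvd_solve(A: np.ndarray, d: np.ndarray, rcond: float = 1e-3) -> np.ndarray:
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        keep = s > rcond * s[0]                       # truncation criterion
        inv_s = np.where(keep, 1.0 / np.where(keep, s, 1.0), 0.0)
        return Vt.T @ (inv_s * (U.T @ d))

    # A nearly rank-deficient system: plain inversion would blow up the
    # third component by 1e8, while the truncated solution stays bounded.
    A = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1e-8]])
    d = np.array([1.0, 2.0, 1.0])
    m = tsvd_solve(A, d)
    ```

    The number of singular values discarded by the truncation criterion is one simple way to quantify the degree of ill-posedness of the problem.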

  15. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.


    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  16. 2D Modelling of the Gorkha earthquake through the joint exploitation of Sentinel 1-A DInSAR measurements and geological, structural and seismological information (United States)

    De Novellis, Vincenzo; Castaldo, Raffaele; Solaro, Giuseppe; De Luca, Claudio; Pepe, Susi; Bonano, Manuela; Casu, Francesco; Zinno, Ivana; Manunta, Michele; Lanari, Riccardo; Tizzani, Pietro


    A Mw 7.8 earthquake struck Nepal on 25 April 2015 at 06:11:26 UTC, killing more than 9,000 people, injuring more than 23,000 and causing extensive damage. The main seismic event, known as the Gorkha earthquake, had its epicenter located ~82 km NW of Kathmandu and its hypocenter at a depth of approximately 15 km. After the main shock, about 100 aftershocks occurred during the following months, propagating toward the south-east; in particular, the most energetic shocks were the Mw 6.7 and Mw 7.3 events that occurred on 26 April and 12 May, respectively. In this study, we model the causative fault of the earthquake by jointly exploiting surface deformation retrieved from the DInSAR measurements collected by the Sentinel 1-A (S1A) space-borne sensor and the available geological, structural and seismological information. We first exploit the analytical solution, performing a back-analysis of the ground deformation detected by the first co-seismic S1A interferogram, computed by exploiting the 17/04/2015 and 29/04/2015 SAR acquisitions and encompassing the main earthquake and some aftershocks, to search for the location and geometry of the fault plane. Starting from these findings and benefiting from the available geological, structural and seismological data, we carry out a Finite Element (FE)-based 2D modelling of the causative fault, in order to evaluate the impact of the geological structures activated during the seismic event on the distribution of the ground deformation field. The obtained results show that the causative fault has a rather complex compressive structure, dipping northward, formed by segments with different dip angles: 6° for the deep segment and 60° for the shallower one. Therefore, although the hypocenters of the main shock and of most of the more energetic aftershocks are located along the deeper plane, corresponding to a segment of the Main Himalayan Thrust (MHT), the FE solution also indicates the contribution of the shallower

  17. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.


    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author)

  18. Design and application of the emergency response mobile phone-based information system for infectious disease reporting in the Wenchuan earthquake zone. (United States)

    Ma, Jiaqi; Zhou, Maigeng; Li, Yanfei; Guo, Yan; Su, Xuemei; Qi, Xiaopeng; Ge, Hui


    To describe the design and application of an emergency response mobile phone-based information system for infectious disease reporting. Software engineering and business modeling were used to design and develop the emergency response mobile phone-based information system for infectious disease reporting. Seven days after the initiation of the mobile phone-based reporting system, the reporting rate in the earthquake zone reached the level of the same period in 2007. Surveillance of the weekly report on morbidity in the earthquake zone after the initiation of the mobile phone reporting system showed the same trend as in the previous three years. The emergency response mobile phone-based information system for infectious disease reporting was an effective solution to transmit urgently needed reports and manage communicable disease surveillance information. This assured the consistency of disease surveillance and facilitated sensitive, accurate, and timely disease surveillance. It is an important backup for the internet-based direct reporting system for communicable disease. © 2009 Blackwell Publishing Asia Pty Ltd and Chinese Cochrane Center, West China Hospital of Sichuan University.

  19. Rrsm: The European Rapid Raw Strong-Motion Database (United States)

    Cauzzi, C.; Clinton, J. F.; Sleeman, R.; Domingo Ballesta, J.; Kaestli, P.; Galanis, O.


    We introduce the European Rapid Raw Strong-Motion database (RRSM), a Europe-wide system that provides parameterised strong motion information, as well as access to waveform data, within minutes of the occurrence of strong earthquakes. The RRSM differs significantly from traditional earthquake strong motion dissemination in Europe, which has focused on providing reviewed, processed strong motion parameters, typically with significant delays. As the RRSM provides rapid open access to raw waveform data and metadata and does not rely on external manual waveform processing, RRSM information is tailored to seismologists and strong-motion data analysts, earthquake and geotechnical engineers, international earthquake response agencies and the educated general public. Access to the RRSM database is via a web portal that allows users to query earthquake information, peak ground motion parameters and amplitudes of spectral response, and to select and download earthquake waveforms. All information is available within minutes of any earthquake with magnitude ≥ 3.5 occurring in the Euro-Mediterranean region. Waveform processing and database population are performed using the waveform processing module scwfparam, which is integrated in SeisComP3 (SC3). Earthquake information is provided by the EMSC, and all the seismic waveform data are accessed at the European Integrated waveform Data Archive (EIDA) at ORFEUS, where all on-scale data are used in the fully automated processing. As the EIDA community is continually growing, the already significant number of strong motion stations is also increasing, and the importance of this product is expected to increase accordingly. Real-time RRSM processing started in June 2014, while past events have been processed in order to provide a complete database back to 2005.

  20. Future Earth: Reducing Loss By Automating Response to Earthquake Shaking (United States)

    Allen, R. M.


    Earthquakes pose a significant threat to society in the U.S. and around the world. The risk is easily forgotten given the infrequent recurrence of major damaging events, yet the likelihood of a major earthquake in California in the next 30 years is greater than 99%. As our societal infrastructure becomes ever more interconnected, the potential impacts of these future events are difficult to predict. Yet, the same inter-connected infrastructure also allows us to rapidly detect earthquakes as they begin, and provide seconds, tens of seconds, or a few minutes of warning. A demonstration earthquake early warning system is now operating in California and is being expanded to the west coast. In recent earthquakes in the Los Angeles region, alerts were generated that could have provided warning to the vast majority of Angelenos who experienced the shaking. Efforts are underway to build a public system. Smartphone technology will be used not only to issue the alerts, but also to collect data and improve the warnings. The MyShake project at UC Berkeley is currently testing an app that attempts to turn millions of smartphones into earthquake detectors. As our development of the technology continues, we can anticipate ever-more automated response to earthquake alerts. Already, the BART system in the San Francisco Bay Area automatically stops trains based on the alerts. In the future, elevators will stop, machinery will pause, hazardous materials will be isolated, and self-driving cars will pull over to the side of the road. In this presentation we will review the current status of the earthquake early warning system in the US. We will illustrate how smartphones can contribute to the system. Finally, we will review applications of the information to reduce future losses.

  1. Mitigating the consequences of future earthquakes in historical centres: what perspectives from the joint use of past information and geological-geophysical surveys? (United States)

    Terenzio Gizzi, Fabrizio; Moscatelli, Massimiliano; Potenza, Maria Rosaria; Zotta, Cinzia; Simionato, Maurizio; Pileggi, Domenico; Castenetto, Sergio


    Mitigating the damage effects of earthquakes in urban areas, and particularly in historical centres prone to high seismic hazard, is an important task to be pursued. As a matter of fact, seismic history throughout the world informs us that earthquakes have caused deep changes in ancient urban conglomerations due to their high building vulnerability. Furthermore, some quarters can be exposed to an increase of seismic actions compared with adjacent areas due to the geological and/or topographical features of the site on which the historical centres lie. Usually, the strategies aimed at estimating the local seismic hazard make use only of geological-geophysical surveys. Through this approach we do not draw any lesson from what happened as a consequence of past earthquakes. With this in mind, we present the results of a joint use of historical data and the traditional geological-geophysical approach to analyse the effects of possible future earthquakes in historical centres. The research activity discussed here is carried out as a joint collaboration between the Department of Civil Protection of the Presidency of the Council of Ministers, the Institute of Environmental Geology and Geoengineering and the Institute of Archaeological and Monumental Heritage of the National (Italian) Research Council. In order to show the results, we discuss the preliminary achievements of the integrated study carried out on two historical towns located in the Southern Apennines, a portion of the Italian peninsula exposed to high seismic hazard. Taking advantage of these two test sites, we also discuss some methodological implications that could be taken as a reference in seismic microzonation studies.

  2. Incorporating rapid neocortical learning of new schema-consistent information into complementary learning systems theory. (United States)

    McClelland, James L


    The complementary learning systems theory of the roles of hippocampus and neocortex (McClelland, McNaughton, & O'Reilly, 1995) holds that the rapid integration of arbitrary new information into neocortical structures is avoided to prevent catastrophic interference with structured knowledge representations stored in synaptic connections among neocortical neurons. Recent studies (Tse et al., 2007, 2011) showed that neocortical circuits can rapidly acquire new associations that are consistent with prior knowledge. The findings challenge the complementary learning systems theory as previously presented. However, new simulations extending those reported in McClelland et al. (1995) show that new information that is consistent with knowledge previously acquired by a putatively cortexlike artificial neural network can be learned rapidly and without interfering with existing knowledge; it is when inconsistent new knowledge is acquired quickly that catastrophic interference ensues. Several important features of the findings of Tse et al. (2007, 2011) are captured in these simulations, indicating that the neural network model used in McClelland et al. has characteristics in common with neocortical learning mechanisms. An additional simulation generalizes beyond the network model previously used, showing how the rate of change of cortical connections can depend on prior knowledge in an arguably more biologically plausible network architecture. In sum, the findings of Tse et al. are fully consistent with the idea that hippocampus and neocortex are complementary learning systems. Taken together, these findings and the simulations reported here advance our knowledge by bringing out the role of consistency of new experience with existing knowledge and demonstrating that the rate of change of connections in real and artificial neural networks can be strongly prior-knowledge dependent. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  3. Rapid exposure and loss estimates for the May 12, 2008 Mw 7.9 Wenchuan earthquake provided by the U.S. Geological Survey's PAGER system (United States)

    Earle, P.S.; Wald, D.J.; Allen, T.I.; Jaiswal, K.S.; Porter, K.A.; Hearne, M.G.


    One half-hour after the May 12th Mw 7.9 Wenchuan, China earthquake, the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) system distributed an automatically generated alert stating that 1.2 million people were exposed to severe-to-extreme shaking (Modified Mercalli Intensity VIII or greater). It was immediately clear that a large-scale disaster had occurred. These alerts were widely distributed and referenced by the major media outlets and used by government, scientific, and relief agencies to guide their responses. The PAGER alerts and Web pages included predictive ShakeMaps showing estimates of ground shaking, maps of population density, and a list of estimated intensities at impacted cities. Manually revised alerts were issued in the following hours that included the dimensions of the fault rupture. Within a half-day, PAGER’s estimates of the population exposed to strong shaking levels stabilized at 5.2 million people. A coordinated research effort is underway to extend PAGER’s capability to include estimates of the number of casualties. We are pursuing loss models that will allow PAGER the flexibility to use detailed inventory and engineering results in regions where these data are available while also calculating loss estimates in regions where little is known about the type and strength of the built infrastructure. Prototype PAGER fatality estimates are currently implemented and can be manually triggered. In the hours following the Wenchuan earthquake, these models predicted fatalities in the tens of thousands.
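An exposure alert of this kind amounts to overlaying a population grid on a ShakeMap-style intensity grid and totalling population per intensity bin; a figure like "1.2 million people exposed to MMI VIII or greater" is such a sum. A toy Python sketch, not the actual PAGER implementation, with invented cell values:

```python
from collections import defaultdict

def exposure_by_intensity(cells):
    """cells: iterable of (population, mmi) pairs -> {mmi: total population}."""
    totals = defaultdict(int)
    for pop, mmi in cells:
        totals[mmi] += pop
    return dict(totals)

# Invented grid cells: (population, estimated Modified Mercalli Intensity)
cells = [(120_000, 9), (500_000, 8), (600_000, 8), (2_000_000, 6)]
exposed = exposure_by_intensity(cells)
severe = sum(pop for mmi, pop in exposed.items() if mmi >= 8)  # MMI VIII+
```

As revised ShakeMaps arrive (e.g. once the fault rupture dimensions are known), the same aggregation is simply rerun on the updated intensity grid.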

  4. Geophysical Anomalies and Earthquake Prediction (United States)

    Jackson, D. D.


    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require


    Directory of Open Access Journals (Sweden)

    Mustafa ULAS


    Many people die because of earthquakes every year. It is therefore crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are targeted to guide new research in order to find novel prediction methods.

  6. Development of fragility functions to estimate homelessness after an earthquake (United States)

    Brink, Susan A.; Daniell, James; Khazai, Bijan; Wenzel, Friedemann


    Immediately after an earthquake, many stakeholders need to make decisions about their response. These decisions often need to be made in a data-poor environment, as accurate information on the impact can take months or even years to be collected and publicized. Social fragility functions have been developed and applied to provide an estimate of the impact in terms of building damage, deaths and injuries in near real time. These rough estimates can help governments and response agencies determine what aid may be required, which can improve their emergency response and facilitate planning for longer term response. Due to building damage, lifeline outages, fear of aftershocks, or other causes, people may become displaced or homeless after an earthquake. Especially in cold and dangerous locations, the rapid provision of safe emergency shelter can be a lifesaving necessity. However, immediately after an event there is little information available about the number of homeless, their locations and whether they require public shelter to aid the response agencies in decision making. In this research, we analyze homelessness after historic earthquakes using the CATDAT Damaging Earthquakes Database. CATDAT includes information on the hazard as well as the physical and social impact of over 7200 damaging earthquakes from 1900-2013 (Daniell et al. 2011). We explore the relationship of both earthquake characteristics and area characteristics with homelessness after the earthquake. We consider modelled variables such as population density, HDI, year, and measures of ground motion intensity developed in Daniell (2014) over the time period from 1900-2013, as well as temperature. Using a methodology based on that of the PAGER fatality fragility curves developed by Jaiswal and Wald (2010), with regression through time on the socioeconomic parameters developed in Daniell et al. (2012) for "socioeconomic fragility functions", we develop a set of fragility curves that can be
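Fragility functions of this kind are often parameterised as a lognormal CDF of a shaking measure, as in the Jaiswal and Wald (2010) fatality model referenced above. A Python sketch with invented placeholder parameters; the real theta and beta would come from the regression on CATDAT:

```python
import math

def lognormal_fragility(x, theta, beta):
    """P(outcome | shaking x): lognormal CDF with median theta, spread beta."""
    return 0.5 * (1.0 + math.erf(math.log(x / theta) / (beta * math.sqrt(2.0))))

theta, beta = 8.0, 0.4      # invented: median intensity and lognormal spread
p_median = lognormal_fragility(8.0, theta, beta)   # 0.5 at the median, by construction
p_weak = lognormal_fragility(5.0, theta, beta)
p_strong = lognormal_fragility(10.0, theta, beta)
```

The "socioeconomic" extension would make theta and beta functions of covariates such as HDI, year and temperature rather than constants.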

  7. Earthquake number forecasts testing (United States)

    Kagan, Yan Y.


    and kurtosis both tend to zero for large earthquake rates: for the Gaussian law, these values are identically zero. A calculation of the NBD skewness and kurtosis levels, based on the values of the first two statistical moments of the distribution, shows a rapid increase of these higher-order moments. However, the observed catalogue values of skewness and kurtosis rise even faster. This means that for small time intervals, the earthquake number distribution is even more heavy-tailed than the NBD predicts. Therefore, for small time intervals, we propose using empirical number distributions, appropriately smoothed, for testing forecasted earthquake numbers.
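The NBD skewness and kurtosis discussed above follow from the first two moments alone. A small Python sketch, assuming the standard negative binomial parametrisation; the illustrative means and variances are invented:

```python
import math

def nbd_shape(mean, var):
    """Skewness and excess kurtosis of a negative binomial distribution
    recovered from its first two moments (requires var > mean)."""
    p = mean / var                   # success probability
    r = mean ** 2 / (var - mean)     # dispersion (number-of-successes) parameter
    skew = (2.0 - p) / math.sqrt(r * (1.0 - p))
    excess_kurt = 6.0 / r + p ** 2 / (r * (1.0 - p))
    return skew, excess_kurt

# Short interval (few events) vs long interval (many events), same overdispersion:
skew_short, kurt_short = nbd_shape(2.0, 4.0)      # skew 1.5
skew_long, kurt_long = nbd_shape(200.0, 400.0)    # skew 0.15
# Both moments shrink toward the Gaussian value of zero as the rate grows,
# which is the large-rate limit described in the abstract.
```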

  8. Assessment of earthquake effects - contribution from online communication (United States)

    D'Amico, Sebastiano; Agius, Matthew; Galea, Pauline


    The rapid increase of social media and online newspapers in recent years has given the opportunity to make a national investigation of macroseismic effects on the Maltese Islands based on felt earthquake reports. A magnitude 4.1 earthquake struck close to Malta on Sunday 24th April 2011 at 13:02 GMT. The earthquake was preceded and followed by a series of smaller magnitude quakes throughout the day, most of which were felt by the locals on the island. The continuous news media coverage during the day and the extensive sharing of the news item on social media resulted in a strong public response to fill in the 'Did you feel it?' online form on the website of the Seismic Monitoring and Research Unit (SMRU) at the University of Malta. The results yield interesting information about the demographics of the island, and the different felt experiences possibly relating to geological settings and to diverse structural and age-classified buildings. Based on this case study, the SMRU is in the process of developing a mobile phone application dedicated to sharing earthquake information with the local community. The application will automatically prompt users to fill in a simplified 'Did you feel it?' report for potentially felt earthquakes. Automatic location using Global Positioning Systems can be incorporated to provide a 'real time' intensity map that can be used by the Civil Protection Department.
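The 'real time' intensity map envisaged here can be sketched as a simple aggregation of geolocated felt reports by locality. A hypothetical Python example; the place names and intensity values are invented:

```python
from collections import defaultdict

def intensity_map(reports):
    """reports: (locality, reported intensity) pairs -> {locality: mean intensity}."""
    sums = defaultdict(lambda: [0.0, 0])
    for place, intensity in reports:
        sums[place][0] += intensity
        sums[place][1] += 1
    return {place: total / n for place, (total, n) in sums.items()}

reports = [("Valletta", 4), ("Valletta", 5), ("Mellieha", 3)]
cii = intensity_map(reports)   # {"Valletta": 4.5, "Mellieha": 3.0}
```

Operational systems such as the USGS "Did You Feel It?" service use a weighted questionnaire rather than a single reported value, but the per-locality averaging step is the same in spirit.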

  9. The 2014 Greeley, Colorado Earthquakes: Science, Industry, Regulation, and Media (United States)

    Yeck, W. L.; Sheehan, A. F.; Weingarten, M.; Nakai, J.; Ge, S.


    first time that such action had been taken by the COGCC. This presentation provides an overview of the interactions among academic researchers, industry, media, and regulators during the period of rapid response to this earthquake sequence, and the role of seismology in informing those responses.

  10. Rapid prototyping of soil moisture estimates using the NASA Land Information System (United States)

    Anantharaj, V.; Mostovoy, G.; Li, B.; Peters-Lidard, C.; Houser, P.; Moorhead, R.; Kumar, S.


    The Land Information System (LIS), developed at the NASA Goddard Space Flight Center, is a functional Land Data Assimilation System (LDAS) that incorporates a suite of land models in an interoperable computational framework. LIS has been integrated into a computational Rapid Prototyping Capabilities (RPC) infrastructure. LIS consists of a core, a number of community land models, data servers, and visualization systems - integrated in a high-performance computing environment. The land surface models (LSM) in LIS incorporate surface and atmospheric parameters of temperature, snow/water, vegetation, albedo, soil conditions, topography, and radiation. Many of these parameters are available from in-situ observations, numerical model analysis, and from NASA, NOAA, and other remote sensing satellite platforms at various spatial and temporal resolutions. The computational resources, available to LIS via the RPC infrastructure, support e-Science experiments involving the global modeling of land-atmosphere studies at 1 km spatial resolution as well as regional studies at finer resolutions. The Noah Land Surface Model, available within LIS, is being used to rapidly prototype soil moisture estimates in order to evaluate the viability of other science applications for decision making purposes. For example, LIS has been used to further extend the utility of the USDA Soil Climate Analysis Network of in-situ soil moisture observations. In addition, LIS also supports data assimilation capabilities that are used to assimilate remotely sensed soil moisture retrievals from the AMSR-E instrument onboard the Aqua satellite. The rapid prototyping of soil moisture estimates using LIS and their applications will be illustrated during the presentation.

  11. Lymphatic transport of exosomes as a rapid route of information dissemination to the lymph node. (United States)

    Srinivasan, Swetha; Vannberg, Fredrik O; Dixon, J Brandon


    It is well documented that cells secrete exosomes, which can transfer biomolecules that impact recipient cells' functionality in a variety of physiologic and disease processes. The role of lymphatic drainage and transport of exosomes is as yet unknown, although the lymphatics play critical roles in immunity and exosomes are in the ideal size-range for lymphatic transport. Through in vivo near-infrared (NIR) imaging we have shown that exosomes are rapidly transported within minutes from the periphery to the lymph node by lymphatics. Using an in vitro model of lymphatic uptake, we have shown that lymphatic endothelial cells actively enhanced lymphatic uptake and transport of exosomes to the luminal side of the vessel. Furthermore, we have demonstrated a differential distribution of exosomes in the draining lymph nodes that is dependent on the lymphatic flow. Lastly, through endpoint analysis of cellular distribution of exosomes in the node, we identified macrophages and B-cells as key players in exosome uptake. Together these results suggest that exosome transfer by lymphatic flow from the periphery to the lymph node could provide a mechanism for rapid exchange of infection-specific information that precedes the arrival of migrating cells, thus priming the node for a more effective immune response.

  12. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell


    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  13. The CATDAT damaging earthquakes database (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.


    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto (214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  14. Twitter Seismology: Earthquake Monitoring and Response in a Social World (United States)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.


    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of
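The STA/LTA trigger described above can be illustrated on a toy per-minute count of tweets containing the word "earthquake". The window lengths, trigger ratio and series below are illustrative, not the USGS settings:

```python
def sta_lta_triggers(counts, sta_len=2, lta_len=10, ratio=5.0):
    """Indices i where the short-term average (window ending at i) is at least
    `ratio` times the long-term average of the preceding `lta_len` samples."""
    triggers = []
    for i in range(lta_len, len(counts)):
        sta = sum(counts[i - sta_len + 1:i + 1]) / sta_len
        lta = sum(counts[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta >= ratio:
            triggers.append(i)
    return triggers

# Quiet background chatter, then a burst as an earthquake is widely felt.
series = [1, 2, 1, 1, 2, 1, 1, 2, 1, 1, 40, 80, 60]
hits = sta_lta_triggers(series)   # [10, 11, 12]: triggers start at the burst onset
```

Raising `ratio` trades missed detections for fewer false triggers, which is the sensitivity tuning the abstract refers to.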

  15. Chapter D. The Loma Prieta, California, Earthquake of October 17, 1989 - Aftershocks and Postseismic Effects (United States)

    Reasenberg, Paul A.


    While the damaging effects of the earthquake represent a significant social setback and economic loss, the geophysical effects have produced a wealth of data that have provided important insights into the structure and mechanics of the San Andreas Fault system. Generally, the period after a large earthquake is vitally important to monitor. During this part of the seismic cycle, the primary fault and the surrounding faults, rock bodies, and crustal fluids rapidly readjust in response to the earthquake's sudden movement. Geophysical measurements made at this time can provide unique information about fundamental properties of the fault zone, including its state of stress and the geometry and frictional/rheological properties of the faults within it. Because postseismic readjustments are rapid compared with corresponding changes occurring in the preseismic period, the amount and rate of information that is available during the postseismic period is relatively high. From a geophysical viewpoint, the occurrence of the Loma Prieta earthquake in a section of the San Andreas fault zone that is surrounded by multiple and extensive geophysical monitoring networks has produced nothing less than a scientific bonanza. The reports assembled in this chapter collectively examine available geophysical observations made before and after the earthquake and model the earthquake's principal postseismic effects. The chapter covers four broad categories of postseismic effect: (1) aftershocks; (2) postseismic fault movements; (3) postseismic surface deformation; and (4) changes in electrical conductivity and crustal fluids.

  16. Dating Informed Correlations and Large Earthquake Recurrence at the Hokuri Creek Paleoseismic Site, Alpine Fault, South Island, New Zealand (United States)

    Biasi, G. P.; Clark, K.; Berryman, K. R.; Cochran, U. A.; Prior, C.


    The Hokuri Creek paleoseismic site on the Alpine fault in south Westland, New Zealand has yielded a remarkable history of fault activity spanning the past ~7000 years. Evidence for earthquake occurrence and timing has been developed primarily from natural exposures created after a geologically major incision event a few hundred years ago. Prior to this event, the elevation of the spillway of Hokuri Creek into its previous drainage was controlled by NE translation of a shutter ridge during earthquakes. Each event increased the base level for sediment accumulation upstream by decimetres to perhaps a metre. Each increase in base level is associated with a period of accumulation principally of clean fine silts and rock flour. With infilling and time, the wetlands reestablish and sedimentation transitions to a slower and more organic-rich phase (Clark et al., this meeting). At least 18 such cycles have been identified at the site. Carbonaceous material is abundant in almost all layers. Much of the dating is done on macrofossils - individual beech tree leaves, reeds, and similar fragile features. Reworking is considered unlikely due to the fragility of the samples. All dates were developed by the Rafter Radiocarbon Laboratory of the National Isotope Centre at GNS. Delta 13C was measured and used to correct for fractionation. Dating earthquakes at the Hokuri Creek site presents some special challenges. Individual stratigraphic sections around the site expose different time intervals. The Main Section series provides the most complete single section, with over 5000 years represented. Nearby auxiliary exposures cover nearly 1500 years more. Date series from individual exposures tend to be internally very consistent with stratigraphic ordering, but by virtue of their spatial separation, correlations between sections are more difficult. We find, however, that the distinctive layering and the typical 2-4 centuries between primary silt layers provides a way to cross

  17. Humanitarian information management and systems

    NARCIS (Netherlands)

    van de Walle, B.A.; van den Eede, G.G.P.; Muhren, W.J.; Loffler, J.; Klann, M.


    In times of major disasters such as hurricane Katrina or the Sichuan earthquake, the need for accurate and timely information is as crucial as is rapid and coherent coordination among the responding humanitarian community. Effective humanitarian information systems that provide timely access to

  18. ELER software - a new tool for urban earthquake loss assessment (United States)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.


    Rapid loss estimation after potentially damaging earthquakes is critical for effective emergency response and public information. A methodology and software package, ELER-Earthquake Loss Estimation Routine, for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region was developed under the Joint Research Activity-3 (JRA3) of the EC FP6 Project entitled "Network of Research Infrastructures for European Seismology-NERIES". Recently, a new version (v2.0) of ELER software has been released. The multi-level methodology developed is capable of incorporating regional variability and uncertainty originating from ground motion predictions, fault finiteness, site modifications, inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships. Although primarily intended for quasi real-time estimation of earthquake shaking and losses, the routine is also equally capable of incorporating scenario-based earthquake loss assessments. This paper introduces the urban earthquake loss assessment module (Level 2) of the ELER software which makes use of the most detailed inventory databases of physical and social elements at risk in combination with the analytical vulnerability relationships and building damage-related casualty vulnerability models for the estimation of building damage and casualty distributions, respectively. Spectral capacity-based loss assessment methodology and its vital components are presented. The analysis methods of the Level 2 module, i.e. Capacity Spectrum Method (ATC-40, 1996), Modified Acceleration-Displacement Response Spectrum Method (FEMA 440, 2005), Reduction Factor Method (Fajfar, 2000) and Coefficient Method (ASCE 41-06, 2006), are applied to the selected building types for validation and verification purposes. The damage estimates are compared to the results obtained from the other studies available in the literature, i.e. 
SELENA v4.0 (Molina et al., 2008) and

  19. Connecting slow earthquakes to huge earthquakes


    Obara, Kazushige; Kato, Aitaro


    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  20. What Can Sounds Tell Us About Earthquake Interactions? (United States)

    Aiken, C.; Peng, Z.


    It is important not only for seismologists but also for educators to effectively convey information about earthquakes and the influences earthquakes can have on each other. Recent studies using auditory display [e.g. Kilb et al., 2012; Peng et al. 2012] have depicted catastrophic earthquakes and the effects large earthquakes can have on other parts of the world. Auditory display of earthquakes, which combines static images with time-compressed sound of recorded seismic data, is a new approach to disseminating information to a general audience about earthquakes and earthquake interactions. Earthquake interactions are important for understanding the underlying physics of earthquakes and other seismic phenomena such as tremors, in addition to their source characteristics (e.g. frequency content, amplitudes). Earthquake interactions can include, for example, a large, shallow earthquake followed by increased seismicity around the mainshock rupture (i.e. aftershocks) or even a large earthquake triggering earthquakes or tremors several hundreds to thousands of kilometers away [Hill and Prejean, 2007; Peng and Gomberg, 2010]. We use standard tools like MATLAB, QuickTime Pro, and Python to produce animations that illustrate earthquake interactions. Our efforts are focused on producing animations that depict cross-section (side) views of tremors triggered along the San Andreas Fault by distant earthquakes, as well as map (bird's eye) views of mainshock-aftershock sequences such as the 2011/08/23 Mw5.8 Virginia earthquake sequence. These examples of earthquake interactions include sonifying earthquake and tremor catalogs as musical notes (e.g. piano keys) as well as audifying seismic data using time-compression. Our overall goal is to use auditory display to invigorate a general interest in earthquake seismology that leads to the understanding of how earthquakes occur, how earthquakes influence one another as well as tremors, and what the musical properties of these
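    The audification technique described above (time-compressing a seismic record so its energy lands in the audible band) can be sketched in a few lines of Python. The signal, sampling rate, and 100x speedup below are invented for illustration and are not taken from the cited studies:

```python
import io
import math
import struct
import wave

def audify(samples, native_rate_hz, speedup):
    """Time-compress a seismic record into the audible band.

    Audification keeps the waveform intact and simply raises the
    playback rate: a speedup factor k multiplies every frequency by k,
    so 2 Hz tremor energy played back 100x faster is heard at 200 Hz.
    """
    return samples, int(native_rate_hz * speedup)

def to_wav_bytes(samples, rate_hz):
    # Normalize to 16-bit PCM and pack into an in-memory mono WAV file.
    peak = max(abs(s) for s in samples) or 1.0
    buf = io.BytesIO()
    with wave.open(buf, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate_hz)
        w.writeframes(b"".join(
            struct.pack("<h", int(32000 * s / peak)) for s in samples))
    return buf.getvalue()

# Hypothetical record: 60 s of a 2 Hz tremor-like signal sampled at 100 Hz.
fs = 100
signal = [math.sin(2 * math.pi * 2.0 * n / fs) for n in range(60 * fs)]
audio, playback_rate = audify(signal, fs, speedup=100)
wav = to_wav_bytes(audio, playback_rate)
print(playback_rate, len(wav))
```

    The same idea extends directly to real seismograms: the waveform is untouched, and only the declared playback rate changes.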

  1. Automated Determination of Magnitude and Source Length of Large Earthquakes (United States)

    Wang, D.; Kawakatsu, H.; Zhuang, J.; Mori, J. J.; Maeda, T.; Tsuruoka, H.; Zhao, X.


    Rapid determination of earthquake magnitude is important for estimating shaking damage and tsunami hazards. However, due to the complexity of the source process, accurately estimating the magnitude of great earthquakes within minutes of origin time is still a challenge. Mw is an accurate estimate for large earthquakes. However, calculating Mw requires the whole wave trains including P, S, and surface phases, which take tens of minutes to reach stations at tele-seismic distances. To speed up the calculation, methods using the W phase and body waves have been developed for rapidly estimating earthquake size. Besides these methods that involve Green's functions and inversions, there are other approaches that use empirically derived relations to estimate earthquake magnitudes, usually for large earthquakes. Their simple implementation and straightforward calculation have made these approaches widely applied at many institutions such as the Pacific Tsunami Warning Center, the Japan Meteorological Agency, and the USGS. Here we developed an approach originating from Hara [2007], estimating magnitude by considering P-wave displacement and source duration. We instead introduced a back-projection technique [Wang et al., 2016] to estimate source duration using array data from a high-sensitivity seismograph network (Hi-net). The introduction of back-projection improves the method in two ways. Firstly, the source duration can be accurately determined by the seismic array. Secondly, the results can be calculated more rapidly, and data from farther stations are not required. We propose to develop an automated system for determining fast and reliable source information for large shallow seismic events based on real-time data from a dense regional array and global data, for earthquakes that occur at distances of roughly 30°-85° from the array center. This system can offer fast and robust estimates of the magnitudes and rupture extents of large earthquakes in 6 to 13 min (plus

  2. Scenario-based earthquake hazard and risk assessment for Baku (Azerbaijan)

    Directory of Open Access Journals (Sweden)

    G. Babayev


    A rapid growth of population, intensive civil and industrial building, land and water instabilities (e.g. landslides, significant underground water level fluctuations), and the lack of public awareness regarding seismic hazard contribute to the increasing vulnerability of Baku (the capital city of the Republic of Azerbaijan) to earthquakes. In this study, we assess earthquake risk in the city, determined as a convolution of seismic hazard (in terms of the surface peak ground acceleration, PGA), vulnerability (due to building construction fragility, population features, the gross domestic product per capita, and landslide occurrence), and exposure of infrastructure and critical facilities. The earthquake risk assessment provides useful information to identify the factors influencing the risk. A deterministic seismic hazard for Baku is analysed for four earthquake scenarios: near, far, local, and extreme events. The seismic hazard models demonstrate the level of ground shaking in the city: high PGA values are predicted in the southern coastal and north-eastern parts of the city and in some parts of the downtown. The PGA attains its maximal values for the local and extreme earthquake scenarios. We show that the quality of buildings and the probability of their damage, the distribution of the urban population, exposure, and the pattern of peak ground acceleration contribute to the seismic risk, while the vulnerability factors play a more prominent role for all earthquake scenarios. Our results can inform strategic countermeasure plans for earthquake risk mitigation in Baku.
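    The risk-as-convolution idea above reduces, in its simplest cell-by-cell form, to combining a hazard value, a vulnerability factor, and an exposure factor per map cell. The grid values and weights below are invented for illustration and are not the Baku study data:

```python
# Minimal sketch of a cell-by-cell relative risk index: hazard (PGA)
# combined multiplicatively with vulnerability and exposure.

def risk_index(pga_g, vulnerability, exposure):
    """Combine the three factors into a single relative risk score."""
    return pga_g * vulnerability * exposure

# Three hypothetical city cells: (PGA in g, vulnerability 0-1, exposure 0-1).
cells = {
    "coastal_south": (0.40, 0.8, 0.9),
    "downtown":      (0.35, 0.6, 1.0),
    "outskirts":     (0.20, 0.5, 0.3),
}
ranked = sorted(cells, key=lambda c: risk_index(*cells[c]), reverse=True)
print(ranked)
```

    A real assessment would replace the scalar product with scenario-specific hazard maps and calibrated vulnerability functions, but the ranking logic is the same.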

  3. Diffusion of new technology, health services and information after a crisis: a focus group study of the Sichuan "5.12" Earthquake. (United States)

    Zhou, Hong; Shi, Lu; Mao, Yuping; Tang, Juan; Zeng, Yu


    The Sichuan "5.12" Earthquake in 2008 occurred in a relatively underdeveloped area in China. The rainy weather, the mountainous environment and the local languages all posed major challenges to the dissemination of information and services after the disaster. By adopting a communication perspective, this study applies the diffusion of innovations theory to investigate how healthcare professionals diffused health technologies, health information and services during the rescue and relief operation. The authors conducted three focus group sessions with the health professionals who had attended to the rescue and relief work of the Sichuan "5.12" Earthquake in 2008. A range of questions regarding the diffusion of innovations were asked during these sessions. The health professionals used their cell phones to communicate with other healthcare providers, disseminated knowledge of health risks and injuries to affected residents with pamphlets and posters and attended daily meetings at the local government offices. They reported on the shortage of maritime satellite cell phones and large-size tents for medical use, and the absence of fully equipped ambulances. Volunteers, local health professionals and local officials provided health information and services in different ways. However, the diffusion of health information and services was less likely to reach those living next to transportation centers, in remote areas and in disaster areas neglected by the media. New communication devices such as cell phones and the mobile Internet enabled medical professionals to coordinate the rescue and relief work after this major natural disaster, at a time when the country's emergency response system still had plenty of room for improvement. In future, the mobile Internet should be used as a means of collecting bottom-up disaster reports so that the media will not neglect any disaster areas as they did during the Sichuan Earthquake. Rescue relief work would have been substantially

  4. Satellite-based emergency mapping using optical imagery: experience and reflections from the 2015 Nepal earthquakes (United States)

    Williams, Jack G.; Rosser, Nick J.; Kincey, Mark E.; Benjamin, Jessica; Oven, Katie J.; Densmore, Alexander L.; Milledge, David G.; Robinson, Tom R.; Jordan, Colm A.; Dijkstra, Tom A.


    Landslides triggered by large earthquakes in mountainous regions contribute significantly to overall earthquake losses and pose a major secondary hazard that can persist for months or years. While scientific investigations of coseismic landsliding are increasingly common, there is no protocol for rapid (hours-to-days) humanitarian-facing landslide assessment and no published recognition of what is possible and what is useful to compile immediately after the event. Drawing on the 2015 Mw 7.8 Gorkha earthquake in Nepal, we consider how quickly a landslide assessment based upon manual satellite-based emergency mapping (SEM) can be realistically achieved and review the decisions taken by analysts to ascertain the timeliness and type of useful information that can be generated. We find that, at present, many forms of landslide assessment are too slow to generate relative to the speed of a humanitarian response, despite increasingly rapid access to high-quality imagery. Importantly, the value of information on landslides evolves rapidly as a disaster response develops, so identifying the purpose, timescales, and end users of a post-earthquake landslide assessment is essential to inform the approach taken. It is clear that discussions are needed on the form and timing of landslide assessments, and how best to present and share this information, before rather than after an earthquake strikes. In this paper, we share the lessons learned from the Gorkha earthquake, with the aim of informing the approach taken by scientists to understand the evolving landslide hazard in future events and the expectations of the humanitarian community involved in disaster response.

  5. The rapid use of gender information: evidence of the time course of pronoun resolution from eyetracking. (United States)

    Arnold, J E; Eisenband, J G; Brown-Schmidt, S; Trueswell, J C


    Eye movements of listeners were monitored to investigate how gender information and accessibility influence the initial processes of pronoun interpretation. Previous studies on this issue have produced mixed results, and several studies have concluded that gender cues are not automatically used during the early processes of pronoun interpretation (e.g. Garnham, A., Oakhill, J. & Cruttenden, H. (1992). The role of implicit causality and gender cue in the interpretation of pronouns. Language and Cognitive Processes, 7(3/4), 231-255; Greene, S. B., McKoon, G. & Ratcliff, R. (1992). Pronoun resolution and discourse models. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18(2), 266-283). In the two experiments presented here, participants viewed a picture with two familiar cartoon characters of either same or different gender. They listened to a text describing the picture, in which a pronoun referred to either the first, more accessible, character, or the second. (For example, Donald is bringing some mail to [Mickey/Minnie] while a violent storm is beginning. He's carrying an umbrella ...) The results of both experiments show rapid use of both gender and accessibility at approximately 200 ms after the pronoun offset.

  6. Incidental learning during rapid information processing on the symbol-digit modalities test. (United States)

    Denney, Douglas R; Hughes, Abbey J; Elliott, Jacquelyn K; Roth, Alexandra K; Lynch, Sharon G


    The Symbol-Digit Modalities Test (SDMT) is widely used to assess processing speed in MS patients. We developed a computerized version of the SDMT (c-SDMT) that scored participants' performance during subintervals over the course of the usual 90-s time period and also added an incidental learning test (c-ILT) to assess how well participants learned the symbol-digit associations while completing the c-SDMT. Patients with MS (n = 65) achieved lower scores than healthy controls (n = 38) on both the c-SDMT and c-ILT, and the scores on the two tests were correlated. However, no increase in the rate of item completion occurred for either group over the course of the c-SDMT, and the difference between groups was the same during each subinterval. Therefore, it seems implausible that controls completed more items on the c-SDMT because they were more adept at learning the symbol-digit associations as the test ensued. Instead, MS patients' poorer incidental learning performance appears to reflect the greater attentional burden that tasks requiring rapid serial processing of information impose upon them.

  7. The Healthcare Improvement Scotland evidence note rapid review process: providing timely, reliable evidence to inform imperative decisions on healthcare. (United States)

    McIntosh, Heather M; Calvert, Julie; Macpherson, Karen J; Thompson, Lorna


    Rapid review has become widely adopted by health technology assessment agencies in response to demand for evidence-based information to support imperative decisions. Concern about the credibility of rapid reviews and the reliability of their findings has prompted a call for wider publication of their methods. In publishing this overview of the accredited rapid review process developed by Healthcare Improvement Scotland, we aim to raise awareness of our methods and advance the discourse on best practice. Healthcare Improvement Scotland produces rapid reviews called evidence notes using a process that has achieved external accreditation through the National Institute for Health and Care Excellence. Key components include a structured approach to topic selection, initial scoping, considered stakeholder involvement, streamlined systematic review, internal quality assurance, external peer review and updating. The process was introduced in 2010 and continues to be refined over time in response to user feedback and operational experience. Decision-makers value the responsiveness of the process and perceive it as being a credible source of unbiased evidence-based information supporting advice for NHSScotland. Many agencies undertaking rapid reviews are striving to balance efficiency with methodological rigour. We agree that there is a need for methodological guidance and that it should be informed by better understanding of current approaches and the consequences of different approaches to streamlining systematic review methods. Greater transparency in the reporting of rapid review methods is essential to enable that to happen.

  8. Earthquake Facts (United States)

    ... North Dakota, and Wisconsin. The core of the earth was the first internal structural element to be identified. In 1906 R.D. Oldham discovered it from his studies of earthquake records. The inner core is solid, and the outer core is liquid and so does not transmit ...

  9. Comparison of patient comprehension of rapid HIV pre-test fundamentals by information delivery format in an emergency department setting

    Directory of Open Access Journals (Sweden)

    Clark Melissa A


    Background Two trials were conducted to compare emergency department patient comprehension of rapid HIV pre-test information using different methods to deliver this information. Methods Patients were enrolled for these two trials at a US emergency department between February 2005 and January 2006. In Trial One, patients were randomized to a no pre-test information arm or an in-person discussion arm. In Trial Two, a separate group of patients were randomized to an in-person discussion arm or a Tablet PC-based video arm. The video, "Do you know about rapid HIV testing?", and the in-person discussion contained identical Centers for Disease Control and Prevention-suggested pre-test information components as well as information on rapid HIV testing with OraQuick®. Participants were compared by information arm on their comprehension of the pre-test information by their score on a 26-item questionnaire using the Wilcoxon rank-sum test. Results In Trial One, 38 patients completed the no-information arm and 31 completed the in-person discussion arm. Of these 69 patients, 63.8% had twelve years or fewer of formal education and 66.7% had previously been tested for HIV. The mean score on the questionnaire for the in-person discussion arm was higher than for the no-information arm (18.7 vs. 13.3, p ≤ 0.0001). In Trial Two, 59 patients completed the in-person discussion arm and 55 completed the video arm. Of these 114 patients, 50.9% had twelve years or fewer of formal education and 68.4% had previously been tested for HIV. The mean score on the questionnaire for the video arm was similar to that for the in-person discussion arm (20.0 vs. 19.2; p ≤ 0.33). Conclusion The video "Do you know about rapid HIV testing?" appears to be an acceptable substitute for an in-person pre-test discussion on rapid HIV testing with OraQuick®. In terms of adequately informing ED patients about rapid HIV testing, either form of pre-test information is preferable than for patients
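    The Wilcoxon rank-sum comparison used in these trials can be sketched in pure Python with the usual normal approximation. The questionnaire scores below are invented to show the mechanics and are not the study data:

```python
import math

def average_ranks(values):
    # Rank all values 1..n, assigning tied values their average rank.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def rank_sum_test(a, b):
    """Two-sided Wilcoxon rank-sum test via the normal approximation
    (no tie correction in the variance, so the p-value is approximate)."""
    n1, n2 = len(a), len(b)
    all_ranks = average_ranks(list(a) + list(b))
    w = sum(all_ranks[:n1])                      # rank sum of sample a
    mean = n1 * (n1 + n2 + 1) / 2
    var = n1 * n2 * (n1 + n2 + 1) / 12
    z = (w - mean) / math.sqrt(var)
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return w, p

# Invented questionnaire scores: discussion arm vs. no-information arm.
discussion = [19, 18, 20, 17, 18]
no_info = [13, 14, 12, 15, 13]
w, p = rank_sum_test(discussion, no_info)
print(w, round(p, 4))
```

    With real sample sizes like those in the trials, an exact test or a library routine (e.g. a statistics package's rank-sum implementation) would normally be used instead of this hand-rolled approximation.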

  10. The Observation of Fault Finiteness and Rapid Velocity Variation in Pnl Waveforms for the Mw 6.5, San Simeon, California Earthquake (United States)

    Konca, A. O.; Ji, C.; Helmberger, D. V.


    We observed the effect of fault finiteness in the Pnl waveforms at regional distances (4° to 12°) for the Mw 6.5 San Simeon earthquake of 22 December 2003. We aimed to include more of the high frequencies (2 seconds and longer periods) than the studies that use regional data for focal solutions (5 to 8 seconds and longer periods). We calculated 1-D synthetic seismograms for the Pnl portion for both a point source and a finite fault solution. The comparison of the point source and finite fault waveforms with data shows that the first several seconds of the point source synthetics have considerably higher amplitude than the data, while the finite fault does not have a similar problem. This can be explained by reversely polarized depth phases overlapping with the P waves from the later portion of the fault, causing smaller amplitudes at the beginning of the seismogram. This is clearly a finite fault phenomenon and therefore cannot be explained by point source calculations. Moreover, the point source synthetics, which are calculated with a focal solution from a long-period regional inversion, overestimate the amplitude by three to four times relative to the data, while the finite fault waveforms have amplitudes similar to the data. Hence, a moment estimate based only on the point source solution of the regional data could have been wrong by half a magnitude unit. We have also calculated the shifts of synthetics relative to data to fit the seismograms. Our results reveal that the paths from Central California to the south are faster than the paths to the east and north. The P-wave arrival at the TUC station in Arizona is 4 seconds earlier than predicted by the Southern California model, while most stations to the east are delayed by around 1 second. The observed higher uppermost mantle velocities to the south are consistent with some recent tomographic models. Synthetics generated with these models significantly improve the fits and the

  11. Using Rapid Improvement Events for Disaster After-Action Reviews: Experience in a Hospital Information Technology Outage and Response. (United States)

    Little, Charles M; McStay, Christopher; Oeth, Justin; Koehler, April; Bookman, Kelly


    The use of after-action reviews (AARs) following major emergency events, such as a disaster, is common and mandated for hospitals and similar organizations. There is a recurrent challenge of identified problems not being resolved and repeated in subsequent events. A process improvement technique called a rapid improvement event (RIE) was used to conduct an AAR following a complete information technology (IT) outage at a large urban hospital. Using RIE methodology to conduct the AAR allowed for the rapid development and implementation of major process improvements to prepare for future IT downtime events. Thus, process improvement methodology, particularly the RIE, is suited for conducting AARs following disasters and holds promise for improving outcomes in emergency management. Little CM, McStay C, Oeth J, Koehler A, Bookman K. Using rapid improvement events for disaster after-action reviews: experience in a hospital information technology outage and response. Prehosp Disaster Med. 2018;33(1):98-100.

  12. What Automaticity Deficit? Activation of Lexical Information by Readers with Dyslexia in a Rapid Automatized Naming Stroop-Switch Task (United States)

    Jones, Manon W.; Snowling, Margaret J.; Moll, Kristina


    Reading fluency is often predicted by rapid automatized naming (RAN) speed, which, as the name implies, measures the automaticity with which familiar stimuli (e.g., letters) can be retrieved and named. Readers with dyslexia are considered to have less "automatized" access to lexical information, reflected in longer RAN times compared with…

  13. Proposal of resolution to create an inquiry commission on the reliability of French nuclear power plants in case of earthquakes and on the safety, information and warning procedures in case of incidents

    International Nuclear Information System (INIS)


    This short paper presents the reasons for creating a parliamentary inquiry commission of 30 members on the reliability of nuclear power plants in France in case of earthquakes and on the safety, information and warning procedures in case of accidents. (A.L.B.)

  14. Defeating Earthquakes (United States)

    Stein, R. S.


    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake.
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  15. Earthquake Early Warning Systems


    Pei-Yang Lin


    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from between 1901 and 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  16. Can mobile phone technology support a rapid sharing of information on novel psychoactive substances among health and other professionals internationally? (United States)

    Simonato, Pierluigi; Bersani, Francesco S; Santacroce, Rita; Cinosi, Eduardo; Schifano, Fabrizio; Bersani, Giuseppe; Martinotti, Giovanni; Corazza, Ornella


    The diffusion of novel psychoactive substances (NPSs), combined with the ability of the Internet to act as an online marketplace, has led to unprecedented challenges for governments, health agencies, and substance misuse services. Despite increasing research, there is a paucity of reliable information available to professionals working in the field. The paper will present the pilot results of the first mobile application (SMAIL) for rapid information sharing on NPSs among health professionals. The development of SMAIL was divided into 2 parts: (a) the creation of the application for registered users, enabling them to send an SMS or email with the name or "street name" of an NPS and receive within seconds emails or SMS with the information, when available and (b) the development of a database to support the incoming requests. One hundred twenty-two professionals based in 22 countries used the service over the pilot period of 16 months (from May 2012 to September 2013). Five hundred fifty-seven enquiries were made. Users received rapid information on NPSs, and 61% of them rated the service as excellent. This is the right time to use mobile phone technologies for rapid information sharing and prevention activities on NPSs. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Fiscal 2000 research report on the research on earthquake disaster prevention technology for industrial machinery systems; 2000 nendo sangyo kikai system no boshin bosai gijutsu no chosa kenkyu hokokusho

    Energy Technology Data Exchange (ETDEWEB)



    Available technologies were extracted and technical problems were discussed in detail in connection with the above-named technology, and a technology development scenario was prepared. The study covered the subjects of an earthquake-resistant disaster prevention system, ready for activation upon earthquake occurrence, with its structure designed to counter severe vibration; a real-time earthquake-resistant disaster prevention system capable of ensuring system safety upon receiving earthquake occurrence information and of instantly collecting information on damage incurred; and an early restoration system to operate after an earthquake ends. With the active utilization of rapidly advancing information technology in mind, the importance was stressed of a system under which earthquake information, disaster prevention networks, and information on the soil and geography would be linked to the database of the equipment involved. For research on the current state of earthquake disaster prevention technology, an on-site survey was conducted of the disaster prevention facilities now under construction in Hyogo Prefecture, and another survey was conducted of Shizuoka Prefecture's long-standing consideration of earthquake disaster prevention measures. Data were collected at the 5th Corporate Disaster Prevention Symposium held in San Jose, U.S. (NEDO)

  18. USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization (United States)

    Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.


    The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a
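    The core of the detector described above is a rate-spike test: flag a felt event when keyword-matching tweets in a short window far exceed the background rate. The sketch below illustrates that logic only; the window length, threshold factor, and baseline rate are illustrative guesses, not the USGS TED parameters:

```python
from collections import deque

class TweetSpikeDetector:
    """Sketch of a TED-style detector: declare a candidate felt event
    when the count of 'earthquake'-keyword tweets in a short window
    greatly exceeds the expected background count."""

    def __init__(self, window_s=60, threshold=5.0, baseline=3.0):
        self.window_s = window_s
        self.threshold = threshold     # spike factor over background
        self.baseline = baseline       # expected background tweets/window
        self.times = deque()

    def add_tweet(self, t):
        """Register one keyword-matching tweet at time t (seconds);
        return True when the current window looks like a felt event."""
        self.times.append(t)
        # Drop tweets that have fallen out of the sliding window.
        while self.times and self.times[0] < t - self.window_s:
            self.times.popleft()
        return len(self.times) > self.threshold * self.baseline

det = TweetSpikeDetector()
quiet = [det.add_tweet(t) for t in range(0, 300, 100)]     # background chatter
burst = [det.add_tweet(300 + k * 0.5) for k in range(40)]  # post-quake burst
print(any(quiet), any(burst))
```

    A production system would additionally localize the spike (e.g. by the most common tweet-origin city, as the abstract describes) and tune the threshold against the observed false-detection rate.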

  19. Rapid Response Fault Drilling Past, Present, and Future

    Directory of Open Access Journals (Sweden)

    Demian M. Saffer


    Full Text Available New information about large earthquakes can be acquired by drilling into the fault zone quickly following a large seismic event. Specifically, we can learn about the levels of friction and strength of the fault which determine the dynamic rupture, monitor the healing process of the fault, record the stress changes that trigger aftershocks, and capture important physical and chemical properties of the fault that control the rupture process. These scientific and associated technical issues were the focus of a three-day workshop on Rapid Response Fault Drilling: Past, Present, and Future, sponsored by the International Continental Scientific Drilling Program (ICDP) and the Southern California Earthquake Center (SCEC). The meeting drew together forty-four scientists representing ten countries in Tokyo, Japan during November 2008. The group discussed the scientific problems and how they could be addressed through rapid response drilling. Focused talks presented previous work on drilling after large earthquakes and in fault zones in general, as well as the state of the art of experimental techniques and measurement strategies. Detailed discussion weighed the tradeoffs between rapid drilling and the ability to satisfy a diverse range of scientific objectives. Plausible drilling sites and scenarios were evaluated. This is a shortened summary of the workshop report that discusses key scientific questions, measurement strategies, and recommendations. This report can provide a starting point for quickly mobilizing a drilling program following future large earthquakes. The full report can be seen at

  20. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake. (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi


    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  1. Initiation process of earthquakes and its implications for seismic hazard reduction strategy. (United States)

    Kanamori, H


    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.

  2. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    Directory of Open Access Journals (Sweden)

    M. Rebuffat


    Full Text Available A series of multimedia monographs concerning the main seismic events that have affected the Italian territory are being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and promote public earthquake education. Producing the monographs, developed in ARC INFO and working in UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: 1) its historical-cultural context (the first destructive seismic event after the unification of Italy); 2) its features (a volcanic earthquake); 3) the socioeconomic consequences it caused at such an important seaside resort.

  3. Application of τc*Pd in earthquake early warning (United States)

    Huang, Po-Lun; Lin, Ting-Li; Wu, Yih-Min


    Rapid onsite assessment of an earthquake's size and damage potential at a single station is in high demand for earthquake early warning. We study the use of the product τc*Pd to estimate earthquake size, using 123 events recorded by the borehole stations of KiK-net in Japan. The measure of earthquake size determined by τc*Pd is more closely related to damage potential. We find that τc*Pd provides another parameter for measuring the size of an earthquake, and a threshold on it can be used to warn of strong ground motion.
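For reference, τc and Pd are conventionally computed from the first few seconds of the P-wave displacement and velocity records; here is a minimal NumPy sketch under the standard onsite-EEW definitions (array names and the fixed window are illustrative, not the authors' code):

```python
import numpy as np

def tau_c_pd(disp, vel):
    """tau_c = 2*pi*sqrt(integral(u^2 dt) / integral(v^2 dt)) over the
    P-wave window; Pd = peak absolute displacement in the same window.
    dt cancels in the ratio when both records share the same sampling."""
    ratio = np.sum(disp ** 2) / np.sum(vel ** 2)
    tau_c = 2.0 * np.pi * np.sqrt(ratio)   # characteristic period, s
    pd = np.max(np.abs(disp))              # peak displacement
    return tau_c, pd, tau_c * pd

# Sanity check with a pure 1 Hz sinusoid: tau_c should come out ~1 s
# and Pd should equal the displacement amplitude.
t = np.arange(0.0, 3.0, 0.01)
u = 0.02 * np.sin(2.0 * np.pi * t)                 # displacement, m
v = 0.02 * 2.0 * np.pi * np.cos(2.0 * np.pi * t)   # velocity, m/s
```

Because τc tracks the dominant period (which grows with magnitude) and Pd tracks near-source amplitude, their product serves as a single scalar proxy for damage potential.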

  4. Earthquake forewarning in the Cascadia region (United States)

    Gomberg, Joan S.; Atwater, Brian F.; Beeler, Nicholas M.; Bodin, Paul; Davis, Earl; Frankel, Arthur; Hayes, Gavin P.; McConnell, Laura; Melbourne, Tim; Oppenheimer, David H.; Parrish, John G.; Roeloffs, Evelyn A.; Rogers, Gary D.; Sherrod, Brian; Vidale, John; Walsh, Timothy J.; Weaver, Craig S.; Whitmore, Paul M.


    This report, prepared for the National Earthquake Prediction Evaluation Council (NEPEC), is intended as a step toward improving communications about earthquake hazards between information providers and users who coordinate emergency-response activities in the Cascadia region of the Pacific Northwest. NEPEC charged a subcommittee of scientists with writing this report about forewarnings of increased probabilities of a damaging earthquake. We begin by clarifying some terminology; a “prediction” refers to a deterministic statement that a particular future earthquake will or will not occur. In contrast to the 0- or 100-percent likelihood of a deterministic prediction, a “forecast” describes the probability of an earthquake occurring, which may range from >0 to <100 percent, depending on observed processes or conditions, which may include increased rates of M>4 earthquakes on the plate interface north of the Mendocino region 

  5. Tweeting Earthquakes using TensorFlow (United States)

    Casarotti, E.; Comunello, F.; Magnoni, F.


    The use of social media is emerging as a powerful tool for disseminating trusted information about earthquakes. Since 2009, the Twitter account @INGVterremoti has provided constant and timely details about M2+ seismic events detected by the Italian National Seismic Network, directly connected with the seismologists on duty at Istituto Nazionale di Geofisica e Vulcanologia (INGV). Currently, it reaches more than 150,000 followers. Nevertheless, since it provides only manually revised seismic parameters, its timing (approximately 10 to 20 minutes after an event) has come under evaluation. Undeniably, mobile internet, social network sites, and Twitter in particular require a more rapid, "real-time" reaction. During the last 36 months, INGV has tested tweeting the automatic detections of M3+ earthquakes, studying the reliability of the information both in terms of seismological accuracy and from the point of view of communication and social research. A set of quality parameters (i.e., number of seismic stations, gap, relative error of the location) has been identified to reduce false alarms and the uncertainty of the automatic detection. We present an experiment to further improve the reliability of this process using TensorFlow™ (an open source software library originally developed by researchers and engineers working on the Google Brain Team within Google's Machine Intelligence research organization).
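The quality screening described above can be sketched as a simple rule-based filter; the thresholds below are invented placeholders, not INGV's actual criteria, and a learned model (such as the TensorFlow experiment mentioned) would replace such fixed cutoffs with a classifier trained on past detections:

```python
def passes_quality(n_stations, azimuthal_gap_deg, rel_location_error):
    """Decide whether an automatic detection is reliable enough to tweet.
    All three thresholds are illustrative placeholders only:
      - enough stations for a stable solution,
      - azimuthal gap small enough that the location is constrained,
      - relative location error within tolerance."""
    return (n_stations >= 10
            and azimuthal_gap_deg <= 200.0
            and rel_location_error <= 0.5)
```

A well-recorded event passes, while a poorly constrained automatic solution is held back for manual review rather than tweeted as a possible false alarm.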

  6. Thermal infrared anomalies of several strong earthquakes. (United States)

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying


    In the history of earthquake thermal infrared research, it is undeniable that significant thermal infrared anomalies occur before and after strong earthquakes; these have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes of magnitude Ms 7.0 and above by using satellite infrared remote sensing information. We used new types of data and a new method to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The anomalies evolve in two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies with the "time-frequency relative power spectrum" method. (2) Each case exhibits evident and distinct characteristic periods and magnitudes of anomalous thermal radiation. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation anomalies have distinctive characteristics in duration, range, and morphology. In summary, earthquake thermal infrared anomalies can serve as a useful precursor in earthquake prediction and forecasting.

  7. Functional Activation during the Rapid Visual Information Processing Task in a Middle Aged Cohort: An fMRI Study


    Neale, Chris; Johnston, Patrick; Hughes, Matthew; Scholey, Andrew


    The Rapid Visual Information Processing (RVIP) task, a serial discrimination task in which task performance is believed to reflect sustained attention capabilities, is widely used in behavioural research and increasingly in neuroimaging studies. To date, functional neuroimaging research into the RVIP has been undertaken using block analyses, reflecting the sustained processing involved in the task, but not necessarily the transient processes associated with individual trial performance. Furthermore...

  8. Modeling of a historical earthquake in Erzincan, Turkey (Ms 7.8, in 1939) using regional seismological information obtained from a recent event (United States)

    Karimzadeh, Shaghayegh; Askan, Aysegul


    Located within a basin structure, at the conjunction of the North East Anatolian, North Anatolian and Ovacik Faults, the Erzincan city center (Turkey) is one of the most hazardous regions in the world. The combination of the seismotectonic and geological settings of the region has resulted in a series of significant seismic activities, including the 1939 (Ms 7.8) as well as the 1992 (Mw = 6.6) earthquakes. The devastating 1939 earthquake occurred in the pre-instrumental era in the region, with no local seismograms available. Thus, only a limited number of studies exist on that earthquake. However, the 1992 event, despite the sparse local network at that time, has been studied extensively. This study aims to simulate the 1939 Erzincan earthquake using available regional seismic and geological parameters. Despite the several uncertainties involved, such an effort to quantitatively model the 1939 earthquake is promising, given the historical reports of extensive damage and fatalities in the area. The results of this study are expressed in terms of anticipated acceleration time histories at certain locations, the spatial distribution of selected ground motion parameters, and felt intensity maps in the region. Simulated motions are first compared against empirical ground motion prediction equations derived with both local and global datasets. Next, anticipated intensity maps of the 1939 earthquake are obtained using local correlations between peak ground motion parameters and felt intensity values. Comparisons of the estimated intensity distributions with the corresponding observed intensities indicate a reasonable modeling of the 1939 earthquake.

  9. Rapid Ethical Assessment on Informed Consent Content and Procedure in Hintalo-Wajirat, Northern Ethiopia: A Qualitative Study.

    Directory of Open Access Journals (Sweden)

    Serebe Abay

    Full Text Available Informed consent is a key component of bio-medical research involving human participants. However, obtaining informed consent is challenging in low-literacy and resource-limited settings. Rapid Ethical Assessment (REA) can be used to contextualize and simplify consent information within a given study community. The current study aimed to explore the effects of social, cultural, and religious factors on the informed consent process for a proposed HPV-serotype prevalence study. A qualitative community-based REA was conducted in Adigudom and Mynebri Kebeles, Northern Ethiopia, from July to August 2013. Data were collected by a multi-disciplinary team using open-ended questions concerning informed consent components in relation to the parent study. The team conducted one-to-one In-Depth Interviews (IDIs) and Focus Group Discussions (FGDs) with key informants and community members to collect data based on the themes of the study. Tape-recorded data were transcribed in Tigrigna and then translated into English. Data were categorized and thematically analyzed using open coding and content analysis based on pre-defined themes. The REA study revealed a number of socio-cultural issues relevant to the proposed study. Low community awareness about health research, participant rights and cervical cancer was documented. Giving a vaginal sample for testing was considered to be highly embarrassing, whereas giving a blood sample made participants worry that they might be given a result without the possibility of treatment. Verbal consent was preferred to written consent for the proposed study. This rapid ethical assessment disclosed important socio-cultural issues which might act as barriers to informed decision making. The findings were important for contextual modification of the Information Sheet, and to guide the best consent process for the proposed study. Both are likely to have enabled participants to understand the informed consent better and consequently to

  10. 77 FR 53225 - National Earthquake Prediction Evaluation Council (NEPEC) (United States)


    ... DEPARTMENT OF THE INTERIOR Geological Survey [USGS-GX12GG00995NP00] National Earthquake Prediction... meeting. SUMMARY: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council... National Earthquake Information Center (NEIC), 1711 Illinois Avenue, Golden, Colorado 80401. The Council is...

  11. Delays in using chromatic and luminance information to correct rapid reaches. (United States)

    Kane, Adam; Wade, Alex; Ma-Wyatt, Anna


    People can use feedback to make online corrections to movements but only if there is sufficient time to integrate the new information and make the correction. A key variable in this process is therefore the speed at which the new information about the target location is coded. Conduction velocities for chromatic signals are lower than for achromatic signals so it may take longer to correct reaches to chromatic stimuli. In addition to this delay, the sensorimotor system may prefer achromatic information over the chromatic information as delayed information may be less valuable when movements are made under time pressure. A down-weighting of chromatic information may result in additional latencies for chromatically directed reaches. In our study, participants made online corrections to reaches to achromatic, (L-M)-cone, and S-cone stimuli. Our chromatic stimuli were carefully adjusted to minimize stimulation of achromatic pathways, and we equated stimuli both in terms of detection thresholds and also by their estimated neural responses. Similar stimuli were used throughout the subjective adjustments and final reaching experiment. Using this paradigm, we found that responses to achromatic stimuli were only slightly faster than responses to (L-M)-cone and S-cone stimuli. We conclude that the sensorimotor system treats chromatic and achromatic information similarly and that the delayed chromatic responses primarily reflect early conduction delays.

  12. Invasive species information networks: Collaboration at multiple scales for prevention, early detection, and rapid response to invasive alien species (United States)

    Simpson, Annie; Jarnevich, Catherine S.; Madsen, John; Westbrooks, Randy G.; Fournier, Christine; Mehrhoff, Les; Browne, Michael; Graham, Jim; Sellers, Elizabeth A.


    Accurate analysis of present distributions and effective modeling of future distributions of invasive alien species (IAS) are both highly dependent on the availability and accessibility of occurrence data and natural history information about the species. Invasive alien species monitoring and detection networks (such as the Invasive Plant Atlas of New England and the Invasive Plant Atlas of the MidSouth) generate occurrence data at local and regional levels within the United States, which are shared through the US National Institute of Invasive Species Science. The Inter-American Biodiversity Information Network's Invasives Information Network (I3N), facilitates cooperation on sharing invasive species occurrence data throughout the Western Hemisphere. The I3N and other national and regional networks expose their data globally via the Global Invasive Species Information Network (GISIN). International and interdisciplinary cooperation on data sharing strengthens cooperation on strategies and responses to invasions. However, limitations to effective collaboration among invasive species networks leading to successful early detection and rapid response to invasive species include: lack of interoperability; data accessibility; funding; and technical expertise. This paper proposes various solutions to these obstacles at different geographic levels and briefly describes success stories from the invasive species information networks mentioned above. Using biological informatics to facilitate global information sharing is especially critical in invasive species science, as research has shown that one of the best indicators of the invasiveness of a species is whether it has been invasive elsewhere. Data must also be shared across disciplines because natural history information (e.g. diet, predators, habitat requirements, etc.) about a species in its native range is vital for effective prevention, detection, and rapid response to an invasion. Finally, it has been our

  13. Lessons of L'Aquila for Operational Earthquake Forecasting (United States)

    Jordan, T. H.


    and failures-to-predict. The best way to achieve this separation is to use probabilistic rather than deterministic statements in characterizing short-term changes in seismic hazards. The ICEF recommended establishing OEF systems that can provide the public with open, authoritative, and timely information about the short-term probabilities of future earthquakes. Because the public needs to be educated into the scientific conversation through repeated communication of probabilistic forecasts, this information should be made available at regular intervals, during periods of normal seismicity as well as during seismic crises. In an age of nearly instant information and high-bandwidth communication, public expectations regarding the availability of authoritative short-term forecasts are rapidly evolving, and there is a greater danger that information vacuums will spawn informal predictions and misinformation. L'Aquila demonstrates why the development of OEF capabilities is a requirement, not an option.

  14. Earthquake Early Warning: A Prospective User's Perspective (Invited) (United States)

    Nishenko, S. P.; Savage, W. U.; Johnson, T.


    With more than 25 million people at risk from high hazard faults in California alone, Earthquake Early Warning (EEW) presents a promising public safety and emergency response tool. EEW represents the real-time end of an earthquake information spectrum which also includes near real-time notifications of earthquake location, magnitude, and shaking levels; as well as geographic information system (GIS)-based products for compiling and visually displaying processed earthquake data such as ShakeMap and ShakeCast. Improvements to and increased multi-national implementation of EEW have stimulated interest in how such information products could be used in the future. Lifeline organizations, consisting of utilities and transportation systems, can use both onsite and regional EEW information as part of their risk management and public safety programs. Regional EEW information can provide improved situational awareness to system operators before automatic system protection devices activate, and allow trained personnel to take precautionary measures. On-site EEW is used for earthquake-actuated automatic gas shutoff valves, triggered garage door openers at fire stations, system controls, etc. While there is no public policy framework for preemptive, precautionary electricity or gas service shutdowns by utilities in the United States, gas shut-off devices are being required at the building owner level by some local governments. In the transportation sector, high-speed rail systems have already demonstrated the ‘proof of concept’ for EEW in several countries, and more EEW systems are being installed. Recently the Bay Area Rapid Transit District (BART) began collaborating with the California Integrated Seismic Network (CISN) and others to assess the potential benefits of EEW technology to mass transit operations and emergency response in the San Francisco Bay region. A key issue in this assessment is that significant earthquakes are likely to occur close to or within the BART

  15. GeoNet's `Felt Rapid': Collecting What Is Needed, When You Need It, No More, No Less. Rapid, Volumous Data For Response Versus Detailed, Precise Data For Research (United States)

    Little, C. L.; McBride, S.; Balfour, N.


    New Zealand's geohazard monitoring agency, GeoNet, recently implemented `Felt Rapid': earthquake felt reporting that is quick and simple. GeoNet locates 20,000 earthquakes each year, with hundreds of those reported as being felt. Since the late 1800s, the New Zealand public has become adept at completing felt reports, but feedback since the Canterbury Earthquake Sequence suggested that traditional felt reporting was not meeting researchers' or the public's needs. GeoNet required something rapid, adaptable and robust. The solution was Felt Rapid, a mobile app and website where respondents simply pick, from 6 cartoon images representing Modified Mercalli Intensity (MMI) 3-8, the one that best aligns with what they felt. For the last decade, felt reporting had been conducted via the GeoNet website, with additional targeted surveys after damaging earthquakes. The vast majority of the submitted felt reports were for earthquakes too small to cause damage, as these are by far the most frequent. Reports from small events are of little interest to researchers, who are concerned only with damaging shaking of MMI 6 and above. However, we found that when damaging earthquakes did occur, such as Christchurch's M6.3, they were only sparsely reported (3,776 reports). Understandably, sitting at a computer and completing a lengthy online form wasn't a priority for people after a devastating earthquake. With Felt Rapid, reporting has to be completed within an hour of an earthquake. The use of GeoNet's automatically compiled felt-reporting maps has also evolved; their main purpose now is immediate assessment of an earthquake's impact on populations, and they are used by Civil Defence agencies. Reports are immediately displayed on an interactive map via the website and mobile app. With over 250,000 users this provides rapid and robust information regarding the experienced shaking.
    When a damaging earthquake occurs and researchers want to collect important and rare damaging felt reports, a separate in-depth survey

  16. Small discussion of electromagnetic wave anomalies preceding earthquakes

    Energy Technology Data Exchange (ETDEWEB)


    Six brief pieces on various aspects of electromagnetic wave anomalies are presented. They cover: earthquake electromagnetic emanations; the use of magnetic induction information for earthquake forecasting; electromagnetic pulse emissions as pre-earthquake indicators; the use of magnetic sensors to determine medium-wavelength field strength for earthquake prediction purposes; magnetic deviation indicators inside reinforced-concrete buildings; and a discussion of the general physical principles involved.

  17. Evidence of Rapid Modulation by Social Information of Subjective, Physiological, and Neural Responses to Emotional Expressions. (United States)

    Mermillod, Martial; Grynberg, Delphine; Pio-Lopez, Léo; Rychlowska, Magdalena; Beffara, Brice; Harquel, Sylvain; Vermeulen, Nicolas; Niedenthal, Paula M; Dutheil, Frédéric; Droit-Volet, Sylvie


    Recent research suggests that conceptual or emotional factors could influence the perceptual processing of stimuli. In this article, we aimed to evaluate the effect of social information (positive, negative, or no information related to the character of the target) on subjective (perceived and felt valence and arousal), physiological (facial mimicry) as well as on neural (P100 and N170) responses to dynamic emotional facial expressions (EFE) that varied from neutral to one of the six basic emotions. Across three studies, the results showed reduced ratings of valence and arousal of EFE associated with incongruent social information (Study 1), increased electromyographical responses (Study 2), and significant modulation of P100 and N170 components (Study 3) when EFE were associated with social (positive and negative) information (vs. no information). These studies revealed that positive or negative social information reduces subjective responses to incongruent EFE and produces a similar neural and physiological boost of the early perceptual processing of EFE irrespective of their congruency. In conclusion, the article suggests that the presence of positive or negative social context modulates early physiological and neural activity preceding subsequent behavior.

  19. Determining Earthquake Susceptible Areas Southeast of Yogyakarta, Indonesia—Outcrop Analysis from Structure from Motion (SfM) and Geographic Information System (GIS)

    Directory of Open Access Journals (Sweden)

    Aditya Saputra


    Full Text Available Located approximately a hundred kilometres north of the Java Subduction Zone, Java Island has a complicated geology and geomorphology. The north zone is dominated by the folded area, the centre by the active volcanic arc, and the south of Java, including the study area (the southeast part of Yogyakarta City), by the uplifted southern mountains. In general, the study area is part of Bantul's Graben. Through the middle part of the study area flows the Opak River, which is often associated with the normal faults of the Opak Fault. The Opak Fault is a complex fault system with local faults that can cause severe local site effects when earthquakes occur. However, the geological map of Yogyakarta is the only available data characterizing the Opak Fault, and only roughly. Thus, an effort to identify the uncharted fault system needs to be made. The aims of this study are to conduct an outcrop study, to identify the micro faults, and to improve the understanding of the fault system to support earthquake hazard and risk assessment. An integrated method of remote sensing, structure from motion (SfM), geographic information system (GIS) analysis, and direct outcrop observation was applied in the study area. Remote sensing was applied to recognize outcrop locations and to extract natural lineament features which can be used as fault indicators. Structure from motion was used to help characterise the outcrops in the field, to identify fault evidence, and to measure fault displacement on the outcrops. Direct outcrop observation is very useful to reveal lithofacies characteristics and to reconstruct the lithostratigraphic correlation among the outcrops. Meanwhile, GIS was used to analyse all the data from remote sensing, SfM, and direct outcrop observation. The main findings of this study were as follows: the middle part of the study area has the most complicated geologic structure. At least 56 pieces of fault evidence with the maximum

  20. Toward real-time regional earthquake simulation of Taiwan earthquakes (United States)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.


    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism, within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulations by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica. Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real time.

  1. Earthquake likelihood model testing (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.


    A wide range of possible testing procedures exists. Jolliffe and Stephenson (2003) present different forecast verifications from atmospheric science, among them likelihood testing of probability forecasts and testing the occurrence of binary events. Testing binary events requires that, for each forecasted event, the spatial, temporal and magnitude limits be given. Although major earthquakes can be considered binary events, the models within the RELM project express their forecasts on a spatial grid and in 0.1 magnitude units; thus the results are a distribution of rates over space and magnitude. These forecasts can be tested with likelihood tests. In general, likelihood tests assume a valid null hypothesis against which a given hypothesis is tested. The outcome is either a rejection of the null hypothesis in favor of the test hypothesis or a nonrejection, meaning the test hypothesis cannot outperform the null hypothesis at a given significance level. Within RELM, there is no accepted null hypothesis, and thus the likelihood test needs to be expanded to allow comparable testing of equipollent hypotheses. To test models against one another, we require that forecasts be expressed in a standard format: the average rate of earthquake occurrence within pre-specified limits of hypocentral latitude, longitude, depth, magnitude, time period, and focal mechanism. Focal mechanisms should be described either as the inclination of the P-axis, declination of the P-axis, and inclination of the T-axis, or as strike, dip, and rake angles. Schorlemmer and Gerstenberger (2007, this issue) designed classes of these parameters such that similar models will be tested against each other. These classes make the forecasts comparable between models. Additionally, we are limited to testing only what is precisely defined and consistently reported in earthquake catalogs. Therefore it is currently not possible to test such information as fault rupture length or area, asperity location, etc. Also, to account
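The bin-wise likelihood testing of gridded rate forecasts described above can be sketched with a toy joint Poisson log-likelihood, the standard probability model for such forecasts. The bin rates and observed counts below are invented for illustration and are not RELM data:

```python
import math

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of observed earthquake counts over
    space-magnitude bins, assuming each bin is an independent Poisson
    variable whose expectation is the forecast rate for that bin.
    log P(n | lam) = -lam + n*log(lam) - log(n!)"""
    ll = 0.0
    for lam, n in zip(forecast_rates, observed_counts):
        ll += -lam + n * math.log(lam) - math.lgamma(n + 1)
    return ll

# Two equipollent hypothetical forecasts scored on the same observation:
model_a = [0.2, 0.05, 0.5]   # expected counts per bin (illustrative)
model_b = [0.1, 0.1, 0.4]
observed = [0, 0, 1]
print(poisson_log_likelihood(model_a, observed))
print(poisson_log_likelihood(model_b, observed))
```

A higher joint log-likelihood means the forecast assigned more probability to what actually occurred; comparing two such scores is the core of testing one model against another when no null hypothesis is privileged.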

  2. Present status and future development of the European Community rapid information system

    International Nuclear Information System (INIS)

    Fraser, G.


    Following the Chernobyl reactor accident it was rapidly appreciated that, in addition to upgrading national radiological monitoring systems, action was required to facilitate international communication of the results obtained. The first such system was established by the Vienna Convention, drawn up under the auspices of the IAEA, which came into force in September 1986. Subsequently, the EC Council of Ministers decided in December 1987 to set up a Community system which in many ways parallels that established by the Convention but differs significantly in certain aspects concerning its legal basis, initiation criteria, data provisions and communications requirements. This paper describes the current status of the Community system and foreseeable future developments. It is a matter of policy that, to avoid unnecessary complications, this system should be, to the maximum extent practicable, fully compatible with that established by the Convention. Where appropriate, therefore, reference is also made to the latter system.

  3. Designing Financial Instruments for Rapid Flood Response Using Remote Sensed and Archival Hazard and Exposure Information (United States)

    Lall, U.; Allaire, M.; Ceccato, P.; Haraguchi, M.; Cian, F.; Bavandi, A.


    Catastrophic floods can pose a significant challenge for response and recovery. A key bottleneck in the speed of response is the availability of funds to a country's or region's finance ministry to mobilize resources. Parametric instruments, where the release of funds is tied to the exceedance of a specified index or threshold rather than to loss verification, are well suited for this purpose. However, designing an appropriate index that is not subject to manipulation and accurately reflects the need is a challenge, especially in developing countries, which have short hydroclimatic and loss records and where rapid land use change has led to significant changes in exposure and hydrology over time. The use of long records of rainfall from climate re-analyses, together with flooded area and land use from remote sensing, to design and benchmark a parametric index, considering the uncertainty and representativeness of potential loss, is explored with applications to Bangladesh and Thailand. Prospects for broader applicability and limitations are discussed.
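The parametric-trigger idea above can be sketched as a simple linear payout structure: nothing below an attachment point, the full limit at an exhaustion point, and a linear ramp in between. The thresholds, index and limit below are hypothetical design parameters, not values from the study:

```python
def parametric_payout(index_value, attachment, exhaustion, limit):
    """Linear parametric payout tied to an index exceedance:
    zero below the attachment point, the full limit at or above the
    exhaustion point, and linear interpolation in between."""
    if index_value <= attachment:
        return 0.0
    if index_value >= exhaustion:
        return limit
    return limit * (index_value - attachment) / (exhaustion - attachment)

# e.g. a 3-day basin rainfall index (mm) with illustrative thresholds:
print(parametric_payout(180.0, attachment=150.0, exhaustion=250.0, limit=50e6))
```

Because payout depends only on the observed index, funds can be released within days of the event, without waiting for loss verification, which is exactly the speed advantage the abstract describes.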

  4. A rapid assessment scorecard to identify informal settlements at higher maternal and child health risk in Mumbai. (United States)

    Osrin, David; Das, Sushmita; Bapat, Ujwala; Alcock, Glyn A; Joshi, Wasundhara; More, Neena Shah


    The communities who live in urban informal settlements are diverse, as are their environmental conditions. Characteristics include inadequate access to safe water and sanitation, poor quality of housing, overcrowding, and insecure residential status. Interventions to improve health should be equity-driven and target those at higher risk, but it is not clear how to prioritise informal settlements for health action. In implementing a maternal and child health programme in Mumbai, India, we had conducted a detailed vulnerability assessment which, though important, was time-consuming and may have included collection of redundant information. Subsequent data collection allowed us to examine three issues: whether community environmental characteristics were associated with maternal and newborn healthcare and outcomes; whether it was possible to develop a triage scorecard to rank the health vulnerability of informal settlements based on a few rapidly observable characteristics; and whether the scorecard might be useful for future prioritisation. The City Initiative for Newborn Health documented births in 48 urban slum areas over 2 years. Information was collected on maternal and newborn care and mortality, and also on household and community environment. We selected three outcomes (fewer than three antenatal care visits, home delivery, and neonatal mortality) and used logistic regression and classification and regression tree analysis to test their association with rapidly observable environmental characteristics. We developed a simple triage scorecard and tested its utility as a means of assessing maternal and newborn health risk. In analyses of a sample of 10,754 births, we found associations of health vulnerability with inadequate access to water, toilets, and electricity; non-durable housing; hazardous location; and rental tenancy. A simple scorecard based on these had limited sensitivity and positive predictive value, but relatively high specificity and negative
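The scorecard evaluation described above can be illustrated with a toy additive score (one point per adverse characteristic observed) and the standard screening metrics the abstract reports. The scores, outcomes and threshold below are hypothetical, not the Mumbai data:

```python
def triage_metrics(scores, outcomes, threshold):
    """Screening metrics for a simple additive scorecard: settlements
    scoring at or above the threshold are flagged as high-risk, and the
    flags are compared against the true adverse outcomes."""
    tp = fp = fn = tn = 0
    for s, y in zip(scores, outcomes):
        flagged = s >= threshold
        if flagged and y:
            tp += 1
        elif flagged and not y:
            fp += 1
        elif not flagged and y:
            fn += 1
        else:
            tn += 1
    return {
        "sensitivity": tp / (tp + fn),  # high-risk settlements caught
        "specificity": tn / (tn + fp),  # low-risk settlements passed over
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative data: score = count of {no water, no toilet, no electricity,
# non-durable housing, hazardous location, rental tenancy} present.
scores = [5, 1, 4, 0, 3, 2, 6, 1]
outcomes = [True, False, True, False, False, False, True, True]
print(triage_metrics(scores, outcomes, threshold=3))
```

The trade-off the abstract notes (limited sensitivity and PPV, higher specificity and NPV) falls directly out of where the threshold is placed on such a score.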

  5. A rapid estimation of near field tsunami run-up (United States)

    Riquelme, Sebastian; Fuentes, Mauricio; Hayes, Gavin; Campos, Jaime


    Many efforts have been made to quickly estimate the maximum run-up height of tsunamis associated with large earthquakes. This is a difficult task because of the time it takes to construct a tsunami model using real-time data from the source. It is possible to construct a database of potential seismic sources and their corresponding tsunamis a priori. However, such models are generally based on uniform slip distributions and thus oversimplify the knowledge of the earthquake source. Here, we show how to predict tsunami run-up from any seismic source model using an analytic solution that was specifically designed for subduction zones with a well-defined geometry, e.g., Chile, Japan, Nicaragua, Alaska. The main idea of this work is to provide a tool for emergency response, trading off accuracy for speed. The solutions we present for large earthquakes appear promising. Here, run-up models are computed for the 1992 Mw 7.7 Nicaragua Earthquake, the 2001 Mw 8.4 Perú Earthquake, the 2003 Mw 8.3 Hokkaido Earthquake, the 2007 Mw 8.1 Perú Earthquake, the 2010 Mw 8.8 Maule Earthquake, the 2011 Mw 9.0 Tohoku Earthquake and the recent 2014 Mw 8.2 Iquique Earthquake. The maximum run-up estimates are consistent with measurements made inland after each event, with peaks of 9 m for Nicaragua, 8 m for Perú (2001), 32 m for Maule, 41 m for Tohoku, and 4.1 m for Iquique. Considering recent advances in the analysis of real-time GPS data and the ability to rapidly resolve the finiteness of a large earthquake close to existing GPS networks, it will be possible in the near future to perform these calculations within the first minutes after the occurrence of similar events. Such calculations will thus provide faster run-up information than is available from existing uniform-slip seismic source databases or from pre-modeled past events.

  6. A Method for Estimation of Death Tolls in Disastrous Earthquake (United States)

    Pai, C.; Tien, Y.; Teng, T.


    Fatality tolls caused by disastrous earthquakes are among the most important items of earthquake damage and loss. If we can precisely estimate the potential tolls and the distribution of fatalities in individual districts as soon as an earthquake occurs, it not only makes emergency programs and disaster management more effective, but also supplies critical information for planning and managing the disaster and for allotting rescue manpower and medical resources in a timely manner. In this study, we estimate the death tolls caused by the Chi-Chi earthquake in individual districts based on the Attributive Database of Victims, population data, digital maps and geographic information systems. In general, many factors are involved, including the characteristics of ground motions, geological conditions, types and usage habits of buildings, distribution of population and socio-economic situations, all of which are related to the damage and losses induced by a disastrous earthquake. The density of seismic stations in Taiwan is currently the greatest in the world. In the meantime, complete seismic data are easy to obtain from the Central Weather Bureau's earthquake rapid-reporting systems, mostly within about a minute or less after an earthquake. Therefore, it becomes possible to estimate earthquake death tolls in Taiwan based on this preliminary information. Firstly, we form the arithmetic mean of the three components of the Peak Ground Acceleration (PGA) to give the PGA Index for each individual seismic station, according to the mainshock data of the Chi-Chi earthquake. The Kriging Interpolation Method and GIS software are then used to supply the distribution of iso-seismic intensity contours in any district and to resolve the problem of districts containing no seismic station, through the PGA Index and the geographical coordinates of the individual seismic stations. The population density depends on
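The PGA Index described above is simply the arithmetic mean of the three ground-motion components at a station; a minimal sketch, with hypothetical station values:

```python
def pga_index(pga_ew, pga_ns, pga_ud):
    """PGA Index for one station: the arithmetic mean of the
    east-west, north-south and up-down peak ground accelerations
    (same units in, same units out, e.g. cm/s^2)."""
    return (pga_ew + pga_ns + pga_ud) / 3.0

# Hypothetical station record from a rapid report (cm/s^2):
print(pga_index(312.0, 285.0, 150.0))
```

In the workflow described, this per-station index is then interpolated (kriging in a GIS) to districts with no station of their own before being combined with population data.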

  7. Rapid, Robust Characterization of Subduction Zone Earthquakes (United States)

    Irwin, Tisha Christine

    Energy is an important factor in international relations and recently the global energy paradigm has been seen to be shifting towards the East. In light of such change, a comparative assessment of the role of energy in Qatar' East Asian foreign relations will be conducted by taking China, Japan and South Korea as case studies. The research aimed to assess each of the bilateral relationship in terms of their origin and development in the energy sector generating an interpretation of their growing interdependence, taking into consideration the various domestic, regional and international influencing factors. At this level, LNG development and trade was adopted to see the extent of energy cooperation. In general, energy cooperation played the leading role in the three relationships, but to different degrees. Furthermore, all three bilateral relationship pertain to the 'complex interdependence approach' that is supported by the use of institutionalism and soft power.

  8. Neurophysiological basis of rapid eye movement sleep behavior disorder: informing future drug development

    Directory of Open Access Journals (Sweden)

    Jennum P


    Full Text Available Poul Jennum, Julie AE Christensen, Marielle Zoetmulder Department of Clinical Neurophysiology, Faculty of Health Sciences, Danish Center for Sleep Medicine, Rigshospitalet, University of Copenhagen, Copenhagen, Denmark Abstract: Rapid eye movement (REM) sleep behavior disorder (RBD) is a parasomnia characterized by a history of recurrent nocturnal dream enactment behavior and loss of skeletal muscle atonia and increased phasic muscle activity during REM sleep: REM sleep without atonia. RBD and associated comorbidities have recently been identified as one of the most specific and potentially sensitive risk factors for later development of any of the alpha-synucleinopathies: Parkinson’s disease, dementia with Lewy bodies, and other atypical parkinsonian syndromes. Several other sleep-related abnormalities have recently been identified in patients with RBD/Parkinson’s disease who experience abnormalities in sleep electroencephalographic frequencies, sleep–wake transitions, wake and sleep stability, occurrence and morphology of sleep spindles, and electrooculography measures. These findings suggest a gradual involvement of the brainstem and other structures, which is in line with the gradual involvement known in these disorders. We propose that these findings may help identify biomarkers of individuals at high risk of subsequent conversion to parkinsonism. Keywords: motor control, brain stem, hypothalamus, hypocretin

  9. Neurophysiological basis of rapid eye movement sleep behavior disorder: informing future drug development (United States)

    Jennum, Poul; Christensen, Julie AE; Zoetmulder, Marielle


    Rapid eye movement (REM) sleep behavior disorder (RBD) is a parasomnia characterized by a history of recurrent nocturnal dream enactment behavior and loss of skeletal muscle atonia and increased phasic muscle activity during REM sleep: REM sleep without atonia. RBD and associated comorbidities have recently been identified as one of the most specific and potentially sensitive risk factors for later development of any of the alpha-synucleinopathies: Parkinson’s disease, dementia with Lewy bodies, and other atypical parkinsonian syndromes. Several other sleep-related abnormalities have recently been identified in patients with RBD/Parkinson’s disease who experience abnormalities in sleep electroencephalographic frequencies, sleep–wake transitions, wake and sleep stability, occurrence and morphology of sleep spindles, and electrooculography measures. These findings suggest a gradual involvement of the brainstem and other structures, which is in line with the gradual involvement known in these disorders. We propose that these findings may help identify biomarkers of individuals at high risk of subsequent conversion to parkinsonism. PMID:27186147

  10. A rapid evidence-based service by librarians provided information to answer primary care clinical questions. (United States)

    McGowan, Jessie; Hogg, William; Rader, Tamara; Salzwedel, Doug; Worster, Danielle; Cogo, Elise; Rowan, Margo


    A librarian consultation service was offered to 88 primary care clinicians during office hours, with a streamlined evidence-based process to answer questions in fewer than 20 min. The service used a contact centre accessed through a Web-based platform, with hand-held devices and computers with Web access. Librarians were given technical training in evidence-based medicine, including how to summarise evidence. The aim was to describe the process and lessons learned from developing and operating a rapid-response librarian consultation service for primary care clinicians. Evaluation included librarian interviews and a clinician exit satisfaction survey. Clinicians were positive about the service's impact on their clinical practice and decision making. The project revealed some important 'lessons learned' in the clinical use of hand-held devices, knowledge translation and training for clinicians and librarians. The Just-in-Time Librarian Consultation Service showed that it was possible to provide evidence-based answers to clinical questions in 15 min or less. The project overcame a number of barriers using innovative solutions. There are many opportunities to build on this experience in future joint projects of librarians and healthcare providers.

  11. Informing climate models with rapid chamber measurements of forest carbon uptake. (United States)

    Metcalfe, Daniel B; Ricciuto, Daniel; Palmroth, Sari; Campbell, Catherine; Hurry, Vaughan; Mao, Jiafu; Keel, Sonja G; Linder, Sune; Shi, Xiaoying; Näsholm, Torgny; Ohlsson, Klas E A; Blackburn, M; Thornton, Peter E; Oren, Ram


    Models predicting ecosystem carbon dioxide (CO2) exchange under future climate change rely on relatively few real-world tests of their assumptions and outputs. Here, we demonstrate a rapid and cost-effective method to estimate CO2 exchange from intact vegetation patches under varying atmospheric CO2 concentrations. We find that net ecosystem CO2 uptake (NEE) in a boreal forest rose linearly by 4.7 ± 0.2% of the current ambient rate for every 10 ppm CO2 increase, with no detectable influence of foliar biomass, season, or nitrogen (N) fertilization. The lack of any clear short-term NEE response to fertilization in such an N-limited system is inconsistent with the instantaneous downregulation of photosynthesis formalized in many global models. Incorporating an alternative mechanism with considerable empirical support (diversion of excess carbon to storage compounds) into an existing earth system model brings the model output into closer agreement with our field measurements. A global simulation incorporating this modified model reduces a long-standing mismatch between the modeled and observed seasonal amplitude of atmospheric CO2. Wider application of this chamber approach would provide critical data needed to further improve modeled projections of biosphere-atmosphere CO2 exchange in a changing climate. © 2016 John Wiley & Sons Ltd.
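The reported linear NEE response can be written as a one-line model. The 4.7% per 10 ppm slope comes from the abstract; the 400 ppm ambient reference concentration and the example uptake rate are assumptions for illustration:

```python
def nee_scaled(nee_ambient, co2_ppm, co2_ambient_ppm=400.0, slope=0.047):
    """Linear NEE response: uptake rises by `slope` (a fraction of the
    ambient rate) for every 10 ppm CO2 above the ambient concentration.
    The 400 ppm default reference is an assumed value, not from the study."""
    return nee_ambient * (1.0 + slope * (co2_ppm - co2_ambient_ppm) / 10.0)

# Hypothetical ambient uptake of 10 umol CO2 m^-2 s^-1, measured at 420 ppm:
print(nee_scaled(nee_ambient=10.0, co2_ppm=420.0))
```

Fitting such a slope from chamber measurements, and checking that it does not vary with foliar biomass, season or N fertilization, is the empirical test of model assumptions the abstract describes.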

  12. Contribution of radioisotope examination of kidneys to rapid informative assessment of causes of polyuria

    Energy Technology Data Exchange (ETDEWEB)

    Buchanec, J.; Kliment, J.; Belakova, S.; Antonyova, M. (Komenskeho Univ., Martin (Czechoslovakia). Lekarska Fakulta)


    The results of examinations carried out in 47 children aged between 2 months and 15 years are evaluated. The children were divided into two groups according to the cause of the disease. The first group included those with isolated tubulopathy or with disorders of the secretion of the antidiuretic hormone; the second group were children suffering from polyuria caused by structural changes of the kidney parenchyma. The patients of the first group were characterized by high and steep functional curves with shortened times of semiquantitative evaluation. In phase scintigraphy the kidneys of these patients show quick retention and excretion of the radiopharmaceutical; the bladder is large and fills intensively. In patients of the second group, with damaged kidney parenchyma, the functional curves are low and flat; phase scintigraphy shows the kidneys as a low-intensity shadow with blurred edges; the bladder fills slowly and varies in size. The examination may be used as a rapid test to distinguish the basic groups of patients, namely infants and patients at the onset of the disease.

  13. Contribution of radioisotope examination of kidneys to rapid informative assessment of causes of polyuria

    International Nuclear Information System (INIS)

    Buchanec, J.; Kliment, J.; Belakova, S.; Antonyova, M.


    The results of examinations carried out in 47 children aged between 2 months and 15 years are evaluated. The children were divided into two groups according to the cause of the disease. The first group included those with isolated tubulopathy or with disorders of the secretion of the antidiuretic hormone; the second group were children suffering from polyuria caused by structural changes of the kidney parenchyma. The patients of the first group were characterized by high and steep functional curves with shortened times of semiquantitative evaluation. In phase scintigraphy the kidneys of these patients show quick retention and excretion of the radiopharmaceutical; the bladder is large and fills intensively. In patients of the second group, with damaged kidney parenchyma, the functional curves are low and flat; phase scintigraphy shows the kidneys as a low-intensity shadow with blurred edges; the bladder fills slowly and varies in size. The examination may be used as a rapid test to distinguish the basic groups of patients, namely infants and patients at the onset of the disease.

  14. Information Technology Research Services: Powerful Tools to Keep Up with a Rapidly Moving Field (United States)

    Hunter, Paul


    Many firms offer Information Technology Research reports, analyst calls, conferences, seminars, tools, leadership development, etc. These entities include Gartner, Forrester Research, IDC, The Burton Group, Society for Information Management, InfoTech Research, The Corporate Executive Board, and so on. This talk will cover how a number of such services are being used at the Goddard Space Flight Center to improve our IT management practices, workforce skills, approach to innovation, and service delivery. These tools and services are used across the workforce, from the executive leadership to the IT worker. The presentation will cover the types of services each vendor provides and their primary engagement model. The use of these services at other NASA Centers and Headquarters will be included. In addition, I will explain how two of these services are available now to the entire NASA IT workforce through enterprise-wide subscriptions.

  15. Impaired encoding of rapid pitch information underlies perception and memory deficits in congenital amusia. (United States)

    Albouy, Philippe; Cousineau, Marion; Caclin, Anne; Tillmann, Barbara; Peretz, Isabelle


    Recent theories suggest that the basis of neurodevelopmental auditory disorders such as dyslexia or specific language impairment might be a low-level sensory dysfunction. In the present study we test this hypothesis in congenital amusia, a neurodevelopmental disorder characterized by severe deficits in the processing of pitch-based material. We manipulated the temporal characteristics of auditory stimuli and investigated the influence of the time given to encode pitch information on participants' performance in discrimination and short-term memory. Our results show that amusics' performance in such tasks scales with the duration available to encode acoustic information. This suggests that in auditory neurodevelopmental disorders, abnormalities in early steps of auditory processing can underlie the high-level deficits (here, musical disabilities). The observation that slowing the temporal dynamics improves amusics' pitch abilities suggests this approach as a potential tool for remediation in developmental auditory disorders.

  16. Impaired encoding of rapid pitch information underlies perception and memory deficits in congenital amusia


    Philippe Albouy; Marion Cousineau; Anne Caclin; Barbara Tillmann; Isabelle Peretz


    Recent theories suggest that the basis of neurodevelopmental auditory disorders such as dyslexia or specific language impairment might be a low-level sensory dysfunction. In the present study we test this hypothesis in congenital amusia, a neurodevelopmental disorder characterized by severe deficits in the processing of pitch-based material. We manipulated the temporal characteristics of auditory stimuli and investigated the influence of the time given to encode pitch information on participa...

  17. Rapid access to information resources in clinical biochemistry: medical applications of Personal Digital Assistants (PDA). (United States)

    Serdar, Muhittin A; Turan, Mustafa; Cihan, Murat


    Laboratory specialists currently need to access scientifically based information anytime and anywhere. Accessing this information through existing accumulated data requires considerable time and effort. Personal digital assistants (PDA) with commercial software are expected to provide an effective solution to this problem. In this study, 11 commercial software products (UpToDate, ePocrates, Inforetrive, Pepid, eMedicine, FIRST Consult, and 5 laboratory e-books released by Skyscape and/or Isilo) were selected and the benefits of their use were evaluated by seven laboratory specialists. The assessment was based on the number of tests included; the amount of detailed information for each test, such as process, method, interpretation of results, reference ranges, critical values, interferences, equations, and pathophysiology; supplementary technical details such as sample collection principles; and additional information such as linked references, evidence-based data, test cost, etc. In terms of technique, the following items were considered: the amount of memory required to run the software, whether the graphical user interface is user-friendly, and the frequency of new and/or updated releases. As we had anticipated, there is still no perfect program. Interpretation of laboratory results may require software with an integrated program. However, methodological data are mostly not included in the software evaluated. It seems that these shortcomings will be fixed in the near future, and PDAs and relevant medical applications will become indispensable for all physicians, including laboratory specialists, in training/education and in patient care.



    Wang, S.; Wang, X.; Dou, A.; Yuan, X.; Ding, L.; Ding, X.


    The rapid collection of Unmanned Aerial Vehicle (UAV) remote sensing images plays an important role in quickly submitting disaster information and monitoring seriously damaged objects after an earthquake. However, for the hundreds of UAV images collected in one flight sortie, the traditional data processing methods are image stitching and three-dimensional reconstruction, which take one to several hours and slow the disaster response. If the manual searching method is employed, we ...

  19. Motivation and challenges for use of malaria rapid diagnostic tests among informal providers in Myanmar: a qualitative study. (United States)

    Sudhinaraset, May; Briegleb, Christina; Aung, Moe; Khin, Hnin Su Su; Aung, Tin


    Rapid diagnostic tests (RDTs) for malaria enable proper diagnosis and have been shown to reduce overuse of artemisinin combination therapy. Few studies have evaluated the feasibility and use of RDTs in the private sector in Myanmar. The objectives of the study were to: 1) understand the acceptability of using RDTs in the informal sector in Myanmar; 2) examine motivations for use among informal providers; and, 3) highlight decision-making and knowledge of providers for diagnostic testing and treatment. Qualitative interviews were conducted with 30 informal providers. Purposeful sampling was used to enrol study participants in the Mon and Shan State in Myanmar. All interviews were conducted in Burmese, translated into English, and two researchers coded all interviews using Atlas ti. Major themes identified included: 1) informal provider and outlet characteristics, including demographic and background characteristics; 2) the benefits and challenges of using RDTs according to providers; 3) provider experiences with using RDTs, including motivations for using the RDT; 4) adherence to test results, either positive or negative; and, 5) recommendations from informal providers to promote increased use of RDTs in their communities. This study found that introducing RDTs to informal providers in Myanmar was feasible, resulting in improved provider empowerment and patient-provider relationships. Specific challenges included facility infrastructure to use and dispose RDTs and provider knowledge. This varied across the type of informal provider, with itinerant drug vendors more comfortable and knowledgeable about RDTs compared to general retail sellers and medical drug representatives. This study found informal providers in Myanmar found the introduction of RDTs to be highly acceptable. Providers discussed improvement in service quality including provider empowerment and patient-provider relationships. The study also highlighted a number of challenges that informal providers

  20. An asynchronous rapid single-flux-quantum demultiplexer based on dual-rail information coding

    International Nuclear Information System (INIS)

    Dimov, B; Khabipov, M; Balashov, D; Brandt, C M; Buchholz, F-Im; Niemeyer, J; Uhlmann, F H


    We present a novel asynchronous RSFQ demultiplexer based on dual-rail information coding. The electrical scheme of the circuit is designed and optimized to maximize the margins of its elements and to improve the fabrication yield. This optimized scheme has been fabricated with the 4 μm, 1 kA/cm² Nb/Al2O3-Al/Nb technology of PTB-Braunschweig. The demultiplexer has been tested with different samples of the low-speed incoming data stream, and in all cases correct circuit functionality has been observed.

  1. CISN ShakeAlert Earthquake Early Warning System Monitoring Tools (United States)

    Henson, I. H.; Allen, R. M.; Neuhauser, D. S.


    CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations, with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, of which only a small portion produce earthquake alerts. We have developed interactive web browser system-monitoring tools that display near-real-time state-of-health and performance information. This includes station availability, trigger statistics, and communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted and rejected events is also available. The Google Web Toolkit and Maps API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R statistics system linked to a PostgreSQL database.
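Latency summaries of the kind such monitoring pages display can be sketched as follows. The choice of statistics and the sample latencies are illustrative, not actual ShakeAlert figures:

```python
import statistics

def latency_summary(latencies_s):
    """Summary statistics for a set of alert latencies (seconds),
    e.g. one per alerted event over a chosen time period."""
    data = sorted(latencies_s)
    return {
        "n": len(data),
        "median": statistics.median(data),
        # simple nearest-rank 90th percentile
        "p90": data[min(len(data) - 1, int(0.9 * len(data)))],
        "max": data[-1],
    }

# Hypothetical per-event alert latencies for one day:
print(latency_summary([4.2, 5.1, 3.8, 6.0, 4.9, 12.3, 5.5, 4.4]))
```

Grouping such summaries by magnitude range, region and time period gives exactly the historical-performance views the abstract describes.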

  2. Solar eruptions - soil radon - earthquakes

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.


    For the first time, a new natural phenomenon has been established: a contrasting increase in the soil radon level under the influence of solar flares. Such an increase is one of the geochemical indicators of earthquakes, which most researchers consider a phenomenon of exclusively terrestrial processes. Investigations of the link between earthquakes and solar activity carried out during the last decade in different countries are based on the analysis of the statistical data ΣΕ(t) and W(t). It has been established that the overall seismicity of the Earth and of its separate regions depends on the 11-year cycle of solar activity. The data provided in the paper, based on experimental studies, serve as a first step toward revealing the cause-and-effect solar-terrestrial chain "solar eruption - lithosphere radon - earthquakes"; further collection of experimental data is needed. For the first time, through the radon constituent of terrestrial radiation, the elementary lattice of the Hartmann network, previously contoured by the biolocation method, has been objectified. Radon concentration variations in the Hartmann network nodes were found to determine the dynamics of solar-terrestrial relationships. Of the three types of rapidly running processes conditioned by solar-terrestrial bonds, earthquakes are attributed to rapidly running destructive processes that occur most intensely at the junctions of tectonic massifs, along transform and deep faults. The basic factors provoking earthquakes are both magnetic-structural effects and long-term (over 5 months) bombardment of the lithosphere surface by highly energetic particles of corpuscular solar flows, as confirmed by photometry. As a result of the solar flares that occurred from 29 October to 4 November 2003, a sharply contrasting increase in soil radon, an earthquake indicator, was established on the territory of Yerevan City. A month and a half later, earthquakes occurred in San Francisco, Iran, and Turkey.

  3. Characterization of the Virginia earthquake effects and source parameters from website traffic analysis (United States)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Roussel, F.


    of inhabitants than localities having experienced weak ground motion. In other words, we observe a higher proportion of visitors from localities where the earthquake was widely felt than from localities where it was scarcely felt. This opens the way to automatically mapping the relative level of shaking within minutes of an earthquake's occurrence. In conclusion, the study of the Virginia earthquake shows that eyewitnesses' visits to our website follow the arrival of the P waves at their location. This further demonstrates the public's real-time desire for information after felt earthquakes, a parameter that should be integrated into the definition of earthquake information services. It also reveals additional capabilities of the flashsourcing method. Earthquakes felt at large distances, i.e., where the propagation time to the most distant eyewitnesses exceeds a couple of minutes, can be located and their magnitude estimated in a time frame comparable to that of automatic locations by real-time seismic networks. Flashsourcing also provides very rapid indications of an earthquake's effects by mapping the felt area, detecting the localities affected by network disruption and mapping the relative level of shaking. Such information is essential to improve situational awareness, constrain real-time scenarios and, in turn, contribute to an improved earthquake response.
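    The flashcrowd detection idea above can be caricatured in a few lines: flag a felt event when per-minute visits jump far above a running baseline. The window length and trigger factor below are illustrative assumptions, not EMSC's actual parameters:

```python
# Toy flashcrowd detector: compare each minute's visit count to the
# mean of the preceding minutes and flag large excursions.
def detect_flashcrowd(visits_per_min, baseline_window=10, factor=5.0):
    """Return indices of minutes whose visit count exceeds `factor`
    times the mean of the preceding `baseline_window` minutes."""
    alerts = []
    for i in range(baseline_window, len(visits_per_min)):
        baseline = sum(visits_per_min[i - baseline_window:i]) / baseline_window
        if baseline > 0 and visits_per_min[i] > factor * baseline:
            alerts.append(i)
    return alerts

# Ten quiet minutes followed by a sudden surge of eyewitness visits:
traffic = [20, 22, 18, 25, 21, 19, 23, 20, 24, 22] + [400, 900]
print(detect_flashcrowd(traffic))
```

In practice the surge would also be localized by the visitors' IP geolocations to sketch the felt area.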

  4. Shared care in mental illness: A rapid review to inform implementation

    Directory of Open Access Journals (Sweden)

    Kelly Brian J


    Full Text Available Abstract Background While integrated primary healthcare for the management of depression has been well researched, appropriate models of primary care for people with severe and persistent psychotic disorders are poorly understood. In 2010 the NSW (Australia) Health Department commissioned a review of the evidence on "shared care" models of ambulatory mental health services. This focussed on critical factors in the implementation of these models in clinical practice, with a view to providing policy direction. The review excluded evidence about dementia, substance use and personality disorders. Methods A rapid review involving a search for systematic reviews on The Cochrane Database of Systematic Reviews and the Database of Abstracts of Reviews of Effects (DARE). This was followed by a search for papers published since these systematic reviews on Medline, supplemented by limited iterative searching from reference lists. Results Shared care trials report improved mental and physical health outcomes in some clinical settings, with improved social function, self-management skills, service acceptability and reduced hospitalisation. Other benefits include improved access to specialist care and better engagement with, and acceptability of, mental health services. Limited economic evaluation shows significant set-up costs, reduced patient costs and service savings often realised by other providers. Nevertheless, these findings are not evident across all clinical groups. Gains require substantial cross-organisational commitment and carefully designed, consistently delivered interventions, with attention to staff selection, training and supervision. Effective models incorporated linkages across various service levels, clinical monitoring within agreed treatment protocols, and improved continuity and comprehensiveness of services. Conclusions "Shared care" models of mental health service delivery require attention to multiple levels (from organisational to individual

  5. Vrancea earthquakes. Courses for specific actions to mitigate seismic risk

    International Nuclear Information System (INIS)

    Marmureanu, Gheorghe; Marmureanu, Alexandru


    Earthquakes in the Carpathian-Pannonian region are confined to the crust, except in the Vrancea zone, where earthquakes with focal depths down to 200 km occur. For example, the ruptured area migrated from 150 km to 180 km depth (November 10, 1940, Mw = 7.7), from 90 km to 110 km (March 4, 1977, Mw = 7.4), from 130 km to 150 km (August 30, 1986, Mw = 7.1) and from 70 km to 90 km (May 30, 1990, Mw = 6.9). The depth interval between 110 km and 130 km has not ruptured since October 26, 1802, when the strongest earthquake in this part of Central Europe occurred. Its magnitude is assumed to be Mw = 7.9-8.0, and this depth interval is a natural candidate for the next strong Vrancea event. While no country in the world is entirely safe, the lack of capacity to limit the impact of seismic hazards remains a major burden for all countries, and as the world has witnessed an exponential increase in human and material losses due to earthquakes, there is a need to reverse trends in seismic risk mitigation before future events. The main courses for specific actions to mitigate the seismic risk posed by strong deep Vrancea earthquakes should be considered key development actions: - An early warning system for industrial facilities. Early warning is more than a technological instrument to detect, monitor and submit warnings. It should become part of a management information system for decision-making in the context of national institutional frameworks for disaster management, and part of national and local strategies and programmes for risk mitigation; - A short- and long-term prediction program for strong Vrancea earthquakes; - A seismic hazard map of Romania. A wrong assessment of the seismic hazard can lead to dramatic situations such as those in Bucharest or Kobe. Before the 1977 Vrancea earthquake, the city of Bucharest was designed for intensity I = VII (MMI), while the real intensity was I = IX½-X (MMI); - Seismic microzonation of large populated

  6. U.S. Tsunami Information technology (TIM) Modernization:Developing a Maintainable and Extensible Open Source Earthquake and Tsunami Warning System (United States)

    Hellman, S. B.; Lisowski, S.; Baker, B.; Hagerty, M.; Lomax, A.; Leifer, J. M.; Thies, D. A.; Schnackenberg, A.; Barrows, J.


    Tsunami Information technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). While this project was funded by NOAA to solve a specific problem, the requirements that the delivered system be both open source and easily maintainable have resulted in the creation of a variety of open-source (OS) software packages. The open-source software is now complete, and this is a presentation of the OS software funded by NOAA for the benefit of the entire seismic community. The design architecture comprises three distinct components: (1) the user interface, (2) the real-time data acquisition and processing system and (3) the scientific algorithm library. The system follows a modular design with loose coupling between components. We now identify the major project constituents. The user interface, CAVE, is written in Java and is compatible with the existing National Weather Service (NWS) open-source graphical system AWIPS. The selected real-time seismic acquisition and processing system is the open-source SeisComp3 (sc3). The seismic library (libseismic) contains numerous custom-written and wrapped open-source seismic algorithms (e.g., ML/mb/Ms/Mwp, mantle magnitude (Mm), W-phase moment tensor, body-wave moment tensor, finite-fault inversion, array processing). The seismic library is organized in a way (function naming and usage) that will be familiar to users of Matlab. The seismic library extends sc3 so that it can be called by the real-time system, but it can also be driven and tested outside of sc3, for example, by ObsPy or Earthworm. To unify the three principal components we have developed a flexible and lightweight communication layer called SeismoEdex.
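    As an illustration of the Matlab-style organization the abstract describes, a magnitude routine in such a library might resemble the standard IASPEI (Prague) surface-wave formula below; this is a hedged sketch of the general technique, not libseismic's actual code:

```python
import math

# Prague (IASPEI) surface-wave magnitude:
#   Ms = log10(A/T) + 1.66 * log10(D) + 3.3
# with A the vertical surface-wave amplitude in micrometers, T the
# period in seconds, and D the epicentral distance in degrees
# (nominally valid for roughly 20-160 degrees and T near 20 s).
def ms(amplitude_um, period_s, distance_deg):
    """Surface-wave magnitude from amplitude, period, and distance."""
    return (math.log10(amplitude_um / period_s)
            + 1.66 * math.log10(distance_deg) + 3.3)

# A 10-micrometer, 20-s surface wave recorded at 60 degrees distance:
print(round(ms(10.0, 20.0, 60.0), 2))
```

The flat function signature mirrors the "familiar to users of Matlab" naming convention the abstract mentions.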

  7. Parallelization of the Coupled Earthquake Model (United States)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.


    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, predicting tsunamis on the Internet had never been done before. This new code directly couples the earthquake model and the ocean model on parallel computers, improving simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  8. Earthquakes: hydrogeochemical precursors (United States)

    Ingebritsen, Steven E.; Manga, Michael


    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  9. Ground water and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ts'ai, T. H.


    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water, as well as changes in color, bubbling, gas emission, noises and geysers, are also often observed before earthquakes. Analysis of these features can help predict earthquakes, although other factors unrelated to earthquakes can cause some of these changes too. As a first step, it is necessary to find sites that are sensitive to changes in ground stress to be used as sensor points for predicting earthquakes. The necessary features are described. Recording the seismic waves of earthquake aftershocks is also an important part of earthquake prediction.

  10. A Rapid Monitoring and Evaluation Method of Schistosomiasis Based on Spatial Information Technology. (United States)

    Wang, Yong; Zhuang, Dafang


    Thanks to Spatial Information Technologies (SITs) such as Remote Sensing (RS) and Geographical Information Systems (GIS) being quickly developed and updated, SITs are being used more widely in the public health field. The use of SITs to study the temporal and spatial distribution of Schistosoma japonicum and to assess the risk of infection provides methods for the control and prevention of schistosomiasis japonica, and has gradually become a hot topic in the field. The purpose of the present paper was to use RS and GIS technology to develop an efficient method for the prediction and assessment of the risk of schistosomiasis japonica. We chose the Yueyang region, close to the east of DongTing Lake (Hunan Province, China), as the study area, where a recent serious outbreak of schistosomiasis japonica took place. We monitored and evaluated the transmission risk of schistosomiasis japonica in the region using SITs. Water distribution data were extracted from RS images. The ground temperature, ground humidity and vegetation index were calculated from RS images. Additionally, the density of oncomelania snails, the intermediate host of Schistosoma japonicum, was calculated on the basis of RS data and field measurements. The spatial distribution of oncomelania snails was explored using SITs in order to estimate the area around residents exposed to transmission risk of schistosomiasis japonica. Our results demonstrated that: (1) the risk factors for the transmission of schistosomiasis japonica were closely related to the living environment of oncomelania snails; key factors such as water distribution, ground temperature, ground humidity and vegetation index can be quickly obtained and calculated from RS images; (2) using GIS technology and an RS deduction technique along with statistical regression models, the density distribution model of oncomelania snails could be quickly built; (3) using SITs and analysis with overlaying population
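    The statistical regression step in (2) can be illustrated with a toy least-squares fit of snail density against one RS-derived predictor; the single-predictor form and all numbers below are invented for illustration:

```python
# Ordinary least-squares fit of snail density on a vegetation index.
def linear_fit(x, y):
    """Return intercept a and slope b minimizing sum((y - (a + b*x))^2)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

ndvi    = [0.2, 0.3, 0.4, 0.5, 0.6]   # RS-derived vegetation index per cell
density = [1.0, 2.0, 3.0, 4.0, 5.0]   # snail counts per survey frame (hypothetical)

a, b = linear_fit(ndvi, density)
print(f"density ~ {a:.1f} + {b:.1f} * NDVI")
```

A real model would combine several predictors (water distribution, temperature, humidity) in a multiple-regression or similar GIS-layer overlay.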

  11. Rapid deterioration of pain sensory-discriminative information in short-term memory. (United States)

    Rainville, Pierre; Doucet, Jean-Charles; Fortin, Marie-Chantale; Duncan, Gary H


    The assessment of pain and analgesic efficacy sometimes relies on the retrospective evaluation of pain felt in the immediate, recent or distant past, yet we have a very limited understanding of the processes involved in the encoding, maintenance and intentional retrieval of pain. We examined the properties of short-term memory for thermal and pain sensation intensity with a delayed-discrimination task using pairs of heat pain, warm and cool stimulations in healthy volunteers. Performance decreased as a function of the inter-stimulus interval (ISI), indicating a robust deterioration of sensory information over the test period of 4-14 s. As expected, performance also decreased with smaller temperature differences (Delta-T) and shorter stimulus durations (6-2 s). The relation between performance and Delta-T was adequately described by a power function, the exponent of which increased linearly with longer ISI. Importantly, performance declined steadily with increasing ISI (from 6 to 14 s), but only for pairs of heat pain stimuli that were relatively difficult to discriminate (small Delta-T). These results suggest that short-term memory for pain and temperature sensation intensity relies on a transient analog representation that is quickly degraded and transformed into a more resistant but less precise categorical format. This implies that retrospective pain ratings obtained even after very short delays may be rather inaccurate but relatively reliable.
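    The reported power-function relation, with an exponent that grows linearly with ISI, can be sketched as a toy model; the coefficients below are invented for illustration, not the paper's fitted values:

```python
# Toy model of delayed-discrimination performance:
#   P(Delta-T, ISI) = Delta-T ** (b0 + b1 * ISI)
# For Delta-T expressed as a fraction < 1, a larger exponent (longer
# ISI) yields lower performance, mimicking memory decay.
def performance(delta_t, isi, b0=0.2, b1=0.03):
    """Power-function performance with an ISI-dependent exponent."""
    exponent = b0 + b1 * isi
    return min(1.0, delta_t ** exponent)

# A hard-to-discriminate pair (small Delta-T) degrades with longer ISI:
for isi in (6, 10, 14):
    print(isi, round(performance(0.5, isi), 3))
```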

  12. Earthquake Warning Performance in Vallejo for the South Napa Earthquake (United States)

    Wurman, G.; Price, M.


    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the Mw 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing firefighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the firefighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false-positive rejection and ground motion estimation. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.
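    The 1.5 to 2.5 s of warning quoted above is bounded by the S-minus-P travel-time lag at 16 km; a back-of-the-envelope check, assuming typical crustal velocities of Vp = 6.0 km/s and Vs = 3.5 km/s (values not given in the abstract):

```python
# S-P lag: time between P-wave and S-wave arrivals at a given distance.
#   t_S - t_P = d * (1/Vs - 1/Vp)
def s_minus_p(distance_km, vp=6.0, vs=3.5):
    """Approximate S-P arrival-time difference in seconds."""
    return distance_km * (1.0 / vs - 1.0 / vp)

print(round(s_minus_p(16.0), 2), "s")
```

The ~1.9 s result is consistent with the observed 1.5-2.5 s window, since detection and alerting consume part of the lag.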

  13. Ionospheric earthquake precursors

    International Nuclear Information System (INIS)

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.


    Results of experimental studies of ionospheric earthquake precursors, the development of a research program on processes in the earthquake focus, and the physical mechanisms of formation of various types of precursors are considered. The composition of an experimental space-based system for monitoring earthquake precursors is determined. 36 refs, 5 figs

  14. Children's Ideas about Earthquakes (United States)

    Simsek, Canan Lacin


    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and adapt their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes received in primary schools is considered…

  15. Mexican Earthquakes and Tsunamis Catalog Reviewed (United States)

    Ramirez-Herrera, M. T.; Castillo-Aja, R.


    Today the availability of information on the internet makes online catalogs very easy to access by both scholars and the public in general. The catalog in the "Significant Earthquake Database", managed by the National Center for Environmental Information (NCEI, formerly NCDC), NOAA, provides access to tabular and cartographic data on earthquakes and tsunamis contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in the places where a disaster occurred, and on descriptions being preserved in documents and oral tradition. In the case of instrumental data, availability depends on the distribution and quality of seismic stations. Therefore, the availability of information for the first half of the 20th century can be improved by careful analysis of the available information and by searching for and resolving inconsistencies. This study shows the advances we made in upgrading and refining data for the earthquake and tsunami catalog of Mexico from 1500 CE until today, presented in the format of table and map. Data analysis allowed us to identify the following sources of error in the location of the epicenters in existing catalogs: • Incorrect coordinate entry • Erroneous or mistaken place names • Data too general to locate the epicenter precisely, mainly for older earthquakes • Inconsistency between earthquake and tsunami occurrence: epicenters located too far inland reported as tsunamigenic. The process of completing the catalogs depends directly on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and

  16. Earthquake Early Warning Beta Users: Java, Modeling, and Mobile Apps (United States)

    Strauss, J. A.; Vinci, M.; Steele, W. P.; Allen, R. M.; Hellweg, M.


    Earthquake Early Warning (EEW) is a system that can provide a few to tens of seconds of warning prior to ground shaking at a user's location. The goal and purpose of such a system is to reduce, or minimize, the damage, costs, and casualties resulting from an earthquake. A demonstration earthquake early warning system (ShakeAlert) is undergoing testing in the United States by the UC Berkeley Seismological Laboratory, Caltech, ETH Zurich, the University of Washington, the USGS, and beta users in California and the Pacific Northwest. The beta users receive earthquake information very rapidly in real time and are providing feedback on their experiences of performance and potential uses within their organizations. Beta user interactions allow the ShakeAlert team to discern which alert delivery options are most effective, what changes would make the UserDisplay more useful in a pre-disaster situation, and, most importantly, what actions users plan to take for various scenarios. Actions could include: personal safety approaches, such as drop, cover, and hold on; automated processes and procedures, such as opening elevator or fire station doors; or situational awareness. Users are beginning to determine which policy and technological changes may need to be enacted, and the funding required to implement their automated controls. The use of models and mobile apps is beginning to augment the basic Java desktop applet. Modeling allows beta users to test their early warning responses against various scenarios without having to wait for a real event. Mobile apps are also changing the possible response landscape, providing other avenues for people to receive information. All of these combine to improve business continuity and resiliency.

  17. A rapid estimation of tsunami run-up based on finite fault models (United States)

    Campos, J.; Fuentes, M. A.; Hayes, G. P.; Barrientos, S. E.; Riquelme, S.


    Many efforts have been made to estimate the maximum run-up height of tsunamis associated with large earthquakes. This is a difficult task because of the time it takes to construct a tsunami model using real-time data from the source. It is possible to construct a database of potential seismic sources and their corresponding tsunamis a priori. However, such models are generally based on uniform slip distributions and thus oversimplify our knowledge of the earthquake source. Instead, we can use finite fault models of earthquakes to give a more accurate prediction of the tsunami run-up. Here we show how to accurately predict tsunami run-up from any seismic source model using an analytic solution found by Fuentes et al. (2013) that was calculated especially for zones with a very well defined strike, e.g., Chile, Japan, Alaska, etc. The main idea of this work is to produce a tool for emergency response, trading off accuracy for quickness. Our solutions for three large earthquakes are promising. Here we compute run-up models for the 2010 Mw 8.8 Maule earthquake, the 2011 Mw 9.0 Tohoku earthquake, and the recent 2014 Mw 8.2 Iquique earthquake. Our maximum run-up predictions are consistent with measurements made inland after each event, with a peak of 15 to 20 m for Maule, 40 m for Tohoku, and 2.1 m for the Iquique earthquake. Considering recent advances made in the analysis of real-time GPS data and the ability to rapidly resolve the finiteness of a large earthquake close to existing GPS networks, it will be possible in the near future to perform these calculations within the first five minutes after the occurrence of any such event. Such calculations will thus provide more accurate run-up information than is otherwise available from existing uniform-slip seismic source databases.
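    The abstract does not reproduce the Fuentes et al. (2013) analytic solution, so it is not sketched here. As a much cruder stand-in, a first-order upper bound on local run-up is sometimes taken as roughly twice the peak fault slip (Plafker's empirical rule of thumb); the factor and the slip value below are illustrative assumptions, not the paper's method:

```python
# Plafker-style rule of thumb: local tsunami run-up rarely exceeds
# about twice the peak coseismic fault slip.
def runup_upper_bound(peak_slip_m, factor=2.0):
    """Crude first-order upper bound on local run-up, in meters."""
    return factor * peak_slip_m

# Assuming a peak slip near 20 m (of the order estimated for Maule 2010):
print(runup_upper_bound(20.0), "m")  # upper bound; observed peak was 15-20 m
```

A finite-fault-driven solution like the one in the paper would replace this single scalar bound with a strike-resolved run-up profile.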

  18. Cochrane Rapid Reviews Methods Group to play a leading role in guiding the production of informed high-quality, timely research evidence syntheses. (United States)

    Garritty, Chantelle; Stevens, Adrienne; Gartlehner, Gerald; King, Valerie; Kamel, Chris


    Policymakers and healthcare stakeholders are increasingly seeking evidence to inform the policymaking process, and often use existing or commissioned systematic reviews to inform decisions. However, the methodologies that make systematic reviews authoritative take time, typically 1 to 2 years to complete. Outside the traditional systematic review timeline, "rapid reviews" have emerged as an efficient tool to get evidence to decision-makers more quickly. However, the use of rapid reviews does present challenges. To date, there has been limited published empirical information about this approach to compiling evidence. Thus, it remains a poorly understood and ill-defined set of diverse methodologies with various labels. In recent years, the need to further explore rapid review methods, characteristics, and their use has been recognized by a growing network of healthcare researchers, policymakers, and organizations, several with ties to Cochrane, which is recognized as representing an international gold standard for high-quality systematic reviews. In this commentary, we introduce the newly established Cochrane Rapid Reviews Methods Group, developed to play a leading role in guiding the production of rapid reviews given that they are increasingly employed as a research synthesis tool to support timely evidence-informed decision-making. We discuss how the group was formed and outline the group's structure and remit. We also discuss the need to establish a more robust evidence base for rapid reviews in the published literature, and the importance of promoting registration of rapid review protocols in an effort to promote efficiency and transparency in research. As with standard systematic reviews, the core principles of evidence-based synthesis should apply to rapid reviews in order to minimize bias to the extent possible. The Cochrane Rapid Reviews Methods Group will serve to establish a network of rapid review stakeholders and provide a forum for discussion and training. By facilitating

  19. Earthquake outlook for the San Francisco Bay region 2014–2043 (United States)

    Aagaard, Brad T.; Blair, James Luke; Boatwright, John; Garcia, Susan H.; Harris, Ruth A.; Michael, Andrew J.; Schwartz, David P.; DiLeo, Jeanne S.; Jacques, Kate; Donlin, Carolyn


    Using information from recent earthquakes, improved mapping of active faults, and a new model for estimating earthquake probabilities, the 2014 Working Group on California Earthquake Probabilities updated the 30-year earthquake forecast for California. They concluded that there is a 72 percent probability (or likelihood) of at least one earthquake of magnitude 6.7 or greater striking somewhere in the San Francisco Bay region before 2043. Earthquakes this large are capable of causing widespread damage; therefore, communities in the region should take simple steps to help reduce injuries, damage, and disruption, as well as accelerate recovery from these earthquakes.
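    The 72 percent / 30-year figure can be unpacked with the standard time-independent Poisson reading, P = 1 − exp(−λt); this is a sketch of that arithmetic only, not the Working Group's actual time-dependent forecast model:

```python
import math

# Convert a 30-year exceedance probability into an equivalent
# annual Poisson rate, then verify the round trip.
p, t = 0.72, 30.0                      # probability, years
rate = -math.log(1.0 - p) / t          # events per year
print(round(rate, 4))                  # equivalent annual rate
print(round(1.0 - math.exp(-rate * t), 2))  # recovers the 30-yr probability
```

The implied rate of about 0.042/yr corresponds to one M ≥ 6.7 event every ~24 years on average under this simplified reading.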

  20. CE-PA: A user's manual for determination of controlling earthquakes and development of seismic hazard information data base for the central and eastern United States

    International Nuclear Information System (INIS)

    Short, C.


    CE-PA, Controlling Earthquake(s) through Probabilistic Analysis, is a software package developed at Lawrence Livermore National Laboratory (LLNL) as part of a study performed for the US Office of Nuclear Regulatory Research, Division of Engineering, project on geosciences issues in the revision of geological siting criteria. The objectives of this study were to explore ways of using results from probabilistic seismic hazard characterization (PSHC) to determine hazard-consistent scenario earthquakes and to develop design ground motion. The purpose of this document is to describe the CE-PA software to users. The software includes two operating-system and process controllers plus several Fortran routines and input decks. Section I of this manual gives an overview of the methodology for estimating controlling earthquakes. Section II provides a descriptive overview of the procedures and the organization of the program modules used in CE-PA. Section III contains four example executions with comments and a graphical display of each execution path, plus an overview of the directory/file structure. Section IV provides some general observations regarding the model

  1. Crowdsourced earthquake early warning (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.


    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.
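    A P-wave onset detector of the kind such crowdsourced systems typically rely on can be sketched as a short-term-average / long-term-average (STA/LTA) trigger; the window lengths and threshold below are illustrative, not the study's values:

```python
# Minimal STA/LTA trigger on a 1-D acceleration trace: fire when the
# short-term average signal energy jumps well above the long-term
# background average.
def sta_lta_trigger(samples, sta_n=3, lta_n=10, threshold=4.0):
    """Return the first index where STA/LTA energy ratio exceeds
    `threshold`, or None if it never does."""
    energy = [s * s for s in samples]
    for i in range(lta_n, len(energy)):
        lta = sum(energy[i - lta_n:i]) / lta_n
        sta = sum(energy[max(0, i - sta_n + 1):i + 1]) / sta_n
        if lta > 0 and sta / lta > threshold:
            return i
    return None

noise = [0.1, -0.1] * 5                 # quiescent background
onset = noise + [0.1, 2.0, -2.5, 2.2]   # sudden strong shaking
print(sta_lta_trigger(onset))
```

Consumer-device EEW additionally requires network-level association of many noisy triggers to reject the false positives a single phone would produce.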

  2. Lncident: A Tool for Rapid Identification of Long Noncoding RNAs Utilizing Sequence Intrinsic Composition and Open Reading Frame Information

    Directory of Open Access Journals (Sweden)

    Siyu Han


    Full Text Available More and more studies have demonstrated that long noncoding RNAs (lncRNAs) play critical roles in a diversity of biological processes and are also associated with various types of disease. Rapidly distinguishing lncRNAs from messenger RNAs is the fundamental first step in uncovering lncRNA function. Here, we present a novel method for the rapid identification of lncRNAs utilizing sequence intrinsic composition features and open reading frame information based on a support vector machine model, named Lncident (LncRNA identification). Ten-fold cross-validation and ROC curves are used to evaluate the performance of Lncident. The main advantage of Lncident is its high speed without loss of accuracy. Compared with existing popular tools, Lncident outperforms Coding-Potential Calculator, Coding-Potential Assessment Tool, Coding-Noncoding Index, and PLEK. Lncident is also much faster than Coding-Potential Calculator and Coding-Noncoding Index. Lncident presents an outstanding performance on microorganisms, which offers great application prospects for the analysis of microbial sequences. In addition, Lncident can be trained on users' own collected data. Furthermore, an R package and a web server have been developed to maximize convenience for the users. The R package "Lncident" can be easily installed on multiple operating system platforms, as long as R is supported.
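    One of the open-reading-frame features such classifiers draw on can be illustrated with a frame-aware longest-ORF scan; the real method is an SVM over many sequence-composition features, so this sketch shows only the feature extraction, not Lncident's actual code:

```python
# Longest open reading frame (ATG ... stop) over the three forward
# frames; long ORFs are a classic signal of coding potential.
def longest_orf(seq):
    """Length in nucleotides of the longest ORF in the forward frames."""
    stops = {"TAA", "TAG", "TGA"}
    best = 0
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i                      # open an ORF at first ATG
            elif codon in stops and start is not None:
                best = max(best, i + 3 - start)  # close it at the stop
                start = None
    return best

coding_like = "ATG" + "GCT" * 40 + "TAA"   # a clean 126-nt ORF
print(longest_orf(coding_like))
```

In a full pipeline this value would be combined with k-mer composition features and fed to the SVM.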

  3. New geological perspectives on earthquake recurrence models

    International Nuclear Information System (INIS)

    Schwartz, D.P.


    In most areas of the world the record of historical seismicity is too short or uncertain to accurately characterize the future distribution of earthquakes of different sizes in time and space. Most faults have not ruptured even once during the historical record, let alone repeatedly. Ultimately, the ability to correctly forecast the magnitude, location, and probability of future earthquakes depends on how well one can quantify the past behavior of earthquake sources. Paleoseismological trenching of active faults, historical surface ruptures, liquefaction features, and shaking-induced ground deformation structures provides fundamental information on the past behavior of earthquake sources. These studies quantify (a) the timing of individual past earthquakes and fault slip rates, which lead to estimates of recurrence intervals and the development of recurrence models, and (b) the amount of displacement during individual events, which allows estimates of the sizes of past earthquakes on a fault. When timing and slip per event are combined with information on fault zone geometry and structure, models that define individual rupture segments can be developed. Paleoseismicity data, in the form of timing and size of past events, provide a window into the driving mechanism of the earthquake engine: the cycle of stress build-up and release.

  4. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.


    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all with magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  5. Earthquake Education in Prime Time (United States)

    de Groot, R.; Abbott, P.; Benthien, M.


    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. 
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  6. Modeling fast and slow earthquakes at various scales. (United States)

    Ide, Satoshi


    Earthquake sources represent dynamic rupture within rocky materials at depth and often can be modeled as propagating shear slip controlled by friction laws. These laws provide boundary conditions on fault planes embedded in elastic media. Recent developments in observation networks, laboratory experiments, and methods of data analysis have expanded our knowledge of the physics of earthquakes. Newly discovered slow earthquakes are qualitatively different phenomena from ordinary fast earthquakes and provide independent information on slow deformation at depth. Many numerical simulations have been carried out to model both fast and slow earthquakes, but problems remain, especially with scaling laws. Some mechanisms are required to explain the power-law nature of earthquake rupture and the lack of characteristic length. Conceptual models that include a hierarchical structure over a wide range of scales would be helpful for characterizing diverse behavior in different seismic regions and for improving probabilistic forecasts of earthquakes.

  7. Marmara Island earthquakes, of 1265 and 1935; Turkey

    Directory of Open Access Journals (Sweden)

    Y. Altınok


    Full Text Available The long-term seismicity of the Marmara Sea region in northwestern Turkey is relatively well recorded. Some large events and some of the smaller ones are clearly associated with fault zones known to be seismically active, which have distinct morphological expressions and have generated damaging earthquakes both before and since. Less common, moderate-size earthquakes have occurred in the vicinity of the Marmara Islands in the western Marmara Sea. This paper presents an extended summary of the most important of these, the earthquakes of 1265 and 1935, known since as the Marmara Island earthquakes. The data and approaches used therefore have the potential to document earthquake ruptures of fault segments and to extend the earthquake record far beyond known history, including the rock falls and abnormal sea waves observed during these events, thus improving hazard evaluations and the fundamental understanding of the earthquake process.

  8. The costs and benefits of reconstruction options in Nepal using the CEDIM FDA modelled and empirical analysis following the 2015 earthquake (United States)

    Daniell, James; Schaefer, Andreas; Wenzel, Friedemann; Khazai, Bijan; Girard, Trevor; Kunz-Plapp, Tina; Kunz, Michael; Muehr, Bernhard


    Over the days following the 2015 Nepal earthquake, rapid estimates of deaths, economic loss, and reconstruction cost were undertaken by our research group in conjunction with the World Bank. This modelling relied on historic losses from other Nepal earthquakes as well as detailed socioeconomic data and earthquake loss information via CATDAT. The modelled results were very close to the final figures for the 2015 earthquake of around 9000 deaths and a direct building loss of ca. 3 billion (a). We describe the process undertaken to produce these loss estimates and its potential for analysing reconstruction costs from future Nepal earthquakes in rapid time post-event. The reconstruction cost and death toll model is then used as the base model for examining the effect of spending money on earthquake retrofitting of buildings versus complete reconstruction of buildings. This is undertaken for future events using empirical statistics from past events along with further analytical modelling; the effect of investment versus the timing of a future event is also explored. Preliminary low-cost retrofitting options (b) along the lines of studies in other countries (ca. 100) are examined against the option of different building typologies in Nepal as well as investment in various sectors of construction. The effect of public vs. private capital expenditure post-earthquake is also explored as part of this analysis, as well as spending on components outside of earthquakes.

  9. Cost-effectiveness analysis of malaria rapid diagnostic test incentive schemes for informal private healthcare providers in Myanmar. (United States)

    Chen, Ingrid T; Aung, Tin; Thant, Hnin Nwe Nwe; Sudhinaraset, May; Kahn, James G


    The emergence of artemisinin-resistant Plasmodium falciparum parasites in Southeast Asia threatens global malaria control efforts. One strategy to counter this problem is a subsidy of malaria rapid diagnostic tests (RDTs) and artemisinin-based combination therapy (ACT) within the informal private sector, where the majority of malaria care in Myanmar is provided. A study in Myanmar evaluated the effectiveness of financial incentives vs information, education and counselling (IEC) in driving the proper use of subsidized malaria RDTs among informal private providers. This cost-effectiveness analysis compares intervention options. A decision tree was constructed in a spreadsheet to estimate the incremental cost-effectiveness ratios (ICERs) among four strategies: no intervention, simple subsidy, subsidy with financial incentives, and subsidy with IEC. Model inputs included programmatic costs (in dollars), malaria epidemiology and observed study outcomes. Data sources included expenditure records, study data and scientific literature. Model outcomes included the proportion of properly and improperly treated individuals with and without P. falciparum malaria, and associated disability-adjusted life years (DALYs). Results are reported as ICERs in US dollars per DALY averted. One-way sensitivity analysis assessed how outcomes depend on uncertainty in inputs. ICERs from the least to most expensive intervention are: $1,169/DALY averted for simple subsidy vs no intervention, $185/DALY averted for subsidy with financial incentives vs simple subsidy, and $200/DALY averted for a subsidy with IEC vs subsidy with financial incentives. Due to decreasing ICERs, each strategy was also compared to no intervention. The subsidy with IEC was the most favourable, costing $639/DALY averted compared with no intervention. One-way sensitivity analysis shows that ICERs are most affected by programme costs, RDT uptake, treatment-seeking behaviour, and the prevalence and virulence of non
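    The ICERs reported above follow the standard definition: the extra cost of a strategy divided by the extra DALYs it averts, relative to the next-cheapest comparator. The sketch below shows that arithmetic; the costs and DALY figures in it are illustrative placeholders, not the study's data, and the strategy list simply mirrors the four options named in the abstract.

```python
# Incremental cost-effectiveness ratio (ICER): extra cost per extra DALY
# averted, comparing each strategy with the next-cheapest alternative.

def icer(cost_a, dalys_averted_a, cost_b, dalys_averted_b):
    """ICER of strategy B relative to cheaper strategy A ($/DALY averted)."""
    return (cost_b - cost_a) / (dalys_averted_b - dalys_averted_a)

# (name, programme cost in $, DALYs averted) -- hypothetical numbers
strategies = [
    ("no intervention",      0,      0.0),
    ("simple subsidy",       50_000, 40.0),
    ("subsidy + incentives", 70_000, 120.0),
    ("subsidy + IEC",        95_000, 230.0),
]

for (n0, c0, e0), (n1, c1, e1) in zip(strategies, strategies[1:]):
    print(f"{n1} vs {n0}: ${icer(c0, e0, c1, e1):,.0f}/DALY averted")
```

Decreasing ICERs down the list, as in the study, are why each strategy is also compared directly against no intervention.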

  10. Frictional melt generated by the 2008 Mw 7.9 Wenchuan earthquake and its faulting mechanisms (United States)

    Wang, H.; Li, H.; Si, J.; Sun, Z.; Zhang, L.; He, X.


    Fault-related pseudotachylytes are considered fossil earthquakes, conveying significant information that provides improved insight into fault behavior and mechanical properties. The WFSD project was carried out right after the 2008 Wenchuan earthquake, and detailed research was conducted on the drilling cores. A 2 mm rigid black layer with fresh slickenlines was observed at 732.6 m in the WFSD-1 cores drilled through the southern Yingxiu-Beichuan fault (YBF). Evidence from optical microscopy, FESEM, and FIB-TEM shows that it is frictional melt (pseudotachylyte). In the northern part of the YBF, a 4 mm fresh melt layer with similar structures was found at 1084 m in the WFSD-4S cores. The melts contain numerous microcracks. Considering (1) the highly unstable nature of frictional melt (easily altered or devitrified) under geological conditions, (2) the unfilled microcracks, (3) the fresh slickenlines, and (4) the recent large earthquake in this area, we believe the 2-4 mm melt was produced by the 2008 Wenchuan earthquake. This is the first report of fresh pseudotachylyte with slickenlines in a natural fault generated by a modern earthquake. Geochemical analyses show that fault rocks at 732.6 m are enriched in CaO, Fe2O3, FeO, H2O+ and LOI, and depleted in SiO2. XRF results show that Ca and Fe are obviously enriched in the 2.5 cm fine-grained fault rocks and Ba is enriched in the slip surface. The melt has a higher magnetic susceptibility value, which may be due to neoformed magnetite and metallic iron formed in the frictional melt. Frictional melt visible in both the southern and northern parts of the YBF reveals that frictional melt lubrication played a major role in the Wenchuan earthquake. Instead of vesicles and microlites, the melt contains numerous randomly oriented microcracks and exhibits a quenching texture, which suggests the frictional melt was generated under rapid heat-dissipation conditions, implying vigorous fluid circulation during the earthquake. We surmise that during

  11. [Comparative analysis of the clinical characteristics of orthopedic inpatients in Lushan and Wenchuan earthquakes]. (United States)

    Shi, Xiao-Jun; Wang, Guang-Lin; Pei, Fu-Xing; Song, Yue-Ming; Yang, Tian-Fu; Tu, Chong-Qi; Huang, Fu-Guo; Liu, Hao; Lin, Wei


    To systematically analyze and compare the clinical characteristics of orthopedic inpatients in the Lushan and Wenchuan earthquakes, so as to provide useful references for future earthquake injury rescue. Based on the orthopedic inpatients in the Lushan and Wenchuan earthquakes, data on age, gender, injury causes, body parts injured, and speed of transport were classified and compared. In the Wenchuan earthquake the admission period lasted long and its peak appeared late, the opposite of the Lushan earthquake. There was no significant difference in patient age and gender between the two earthquakes. However, the occurrence rates of crush syndrome, amputation, gas gangrene, vascular injury, and multiple organ dysfunction syndrome (MODS) in the Wenchuan earthquake were much higher than in the Lushan earthquake. Blunt traumas or crush-related injuries (79.6%) were the major injury cause in the Wenchuan earthquake, whereas in the Lushan earthquake high-falling injuries and falls (56.8%) considerably exceeded blunt traumas or crush-related injuries (39.2%). The incidence rates of foot fractures, spine fractures, and multiple fractures in the Lushan earthquake were higher than in the Wenchuan earthquake, but those of open fractures and lower-limb fractures were lower. Rapid rescue at the scene is the cornerstone of successful treatment; early rescue and transport obviously reduce the incidence of wound infection, crush syndrome, MODS, and amputation. Popularizing correct knowledge of emergency shelters will help reduce the damage caused by blindly jumping or escaping when an earthquake happens.

  12. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake? (United States)

    Madariaga, R.


    The 27 February 2010 Maule earthquake occurred in a well-identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we review the information gathered so far. The event broke a region that was much longer along strike than the gap left by the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when it occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously been broken by a similar earthquake in 1751, but several events in the magnitude 8 range also occurred in the area, principally in 1835 as already mentioned and, more recently, on 1 December 1928 to the north and on 21 May 1960 (1 1/2 days before the great Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north are undergoing a very large increase in seismicity, with numerous clusters that migrate along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730, the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions with M 8 range earthquakes, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but the region has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough stress from the 1730 rupture zone. 
Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  13. Earthquake at 40 feet (United States)

    Miller, G. J.


    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of Richter magnitude 6.25 while at a depth of 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography, when the earthquake struck. 

  14. Earthquakes and economic growth


    Fisker, Peter Simonsen


    This study explores the economic consequences of earthquakes. In particular, it investigates how exposure to earthquakes affects economic growth both across and within countries. The key result of the empirical analysis is that while there are no observable effects at the country level, earthquake exposure significantly decreases 5-year economic growth at the local level. Areas at lower stages of economic development suffer more in terms of economic growth than richer areas. In addition,...

  15. California Earthquake Clearinghouse Crisis Information-Sharing Strategy in Support of Situational Awareness, Understanding Interdependencies of Critical Infrastructure, Regional Resilience, Preparedness, Risk Assessment/mitigation, Decision-Making and Everyday Operational Needs (United States)

    Rosinski, A.; Morentz, J.; Beilin, P.


    The principal function of the California Earthquake Clearinghouse is to provide State and Federal disaster response managers, and the scientific and engineering communities, with prompt information on ground failure, structural damage, and other consequences of significant seismic events such as earthquakes and tsunamis. The overarching problem highlighted in discussions with Clearinghouse partners is the confusion and frustration of many of the Operational Area representatives, and some regional utilities throughout the state, over which software applications they should be using and maintaining to meet State, Federal, and local requirements, for what purposes, and how to deal with the limitations of these applications. This problem is getting in the way of meaningful progress on developing multi-application interoperability and the necessary supporting cross-sector information-sharing procedures, and on dialogue about the essential common operational information that entities need to share for different all-hazards missions and related operational activities associated with continuity, security, and resilience. The XchangeCore-based system the Clearinghouse is evolving helps address this problem without compounding it by introducing yet another end-user application: there is no end-user interface for viewing XchangeCore; all viewing of data provided through XchangeCore occurs in and on existing, third-party operational applications. The Clearinghouse efforts with XchangeCore are compatible with FEMA, which is currently using XchangeCore-provided data for regional and National Business Emergency Operations Center (the source of business information sharing during emergencies) response. Also important, and worth emphasizing, is that information-sharing is not just for response, but for preparedness, risk assessment/mitigation decision-making, and everyday operational needs for situational awareness. In other words, the benefits of the Clearinghouse

  16. Matching time and spatial scales of rapid solidification: dynamic TEM experiments coupled to CALPHAD-informed phase-field simulations (United States)

    Perron, Aurelien; Roehling, John D.; Turchi, Patrice E. A.; Fattebert, Jean-Luc; McKeown, Joseph T.


    A combination of dynamic transmission electron microscopy (DTEM) experiments and CALPHAD-informed phase-field simulations was used to study rapid solidification in Cu-Ni thin-film alloys. Experiments, conducted in the DTEM, consisted of in situ laser melting and determination of the solidification kinetics by monitoring the solid-liquid interface and the overall microstructure evolution (time-resolved measurements) during the solidification process. Modelling of the Cu-Ni alloy microstructure evolution was based on a phase-field model that included realistic Gibbs energies and diffusion coefficients from the CALPHAD framework (thermodynamic and mobility databases). DTEM and post mortem experiments highlighted the formation of microsegregation-free columnar grains with interface velocities varying from ~0.1 to ~0.6 m s-1. After an 'incubation' time, the velocity of the planar solid-liquid interface accelerated until solidification was complete. In addition, a decrease of the temperature gradient induced a decrease in the interface velocity. The modelling strategy permitted the simulation (in 1D and 2D) of the solidification process from the initially diffusion-controlled to the nearly partitionless regimes. Finally, results of DTEM experiments and phase-field simulations (grain morphology, solute distribution, and solid-liquid interface velocity) were consistent at similar time (μs) and spatial scales (μm).

  17. Report on the 2010 Chilean earthquake and tsunami response (United States)



    disaster response strategies and operations of Chilean agencies, including perceived or actual failures in disaster preparation that impacted the medical disaster response; post-disaster health and medical interventions to save lives and limit suffering; and the lessons learned by public health and medical personnel as a result of their experiences. Despite devastating damage to the health care and civic infrastructure, the health care response to the Chilean earthquake appeared highly successful due to several factors. Like other first responders, the medical community had the ability and resourcefulness to respond without centralized control in the early response phase. The health care community maintained patient care under austere conditions, despite many obstacles that could have prevented such care. National and international resources were rapidly mobilized to support the medical response. The Emergency Services Team sought to collect information on all phases of emergency management (preparedness, mitigation, response, and recovery) and determine what worked well and what could be improved upon. The Chileans reported being surprised that they were not as ready for this event as they thought they were. The use of mass care sheltering was limited, given the scope of the disaster, because of the resiliency of the population. The impacts of the earthquake and the tsunami were quite different, as were the needs of urban and rural dwellers, necessitating different response activities. The Volunteer Services Team examined the challenges faced in mobilizing a large number of volunteers to assist in the aftermath of a disaster of this scale. One of the greatest challenges expressed was difficulty in communication; the need for redundancy in communication mechanisms was cited. The flexibility and ability to work autonomously by the frontline volunteers was a significant factor in effective response. 
It was also important for volunteer leadership to know the emergency plans

  18. Prediction of earthquakes: a data evaluation and exchange problem

    Energy Technology Data Exchange (ETDEWEB)

    Melchior, Paul


    Recent experiences in earthquake prediction are recalled. Precursor information seems to be available from geodetic measurements, hydrological and geochemical measurements, electric and magnetic measurements, purely seismic phenomena, and zoological phenomena; some new methods are proposed. A list of possible earthquake triggers is given. The dilatancy model is contrasted with a dry model; they seem to be equally successful. In conclusion, the space and time range of the precursors is discussed in relation to the magnitude of earthquakes. (RWR)

  19. Experimental study of structural response to earthquakes

    International Nuclear Information System (INIS)

    Clough, R.W.; Bertero, V.V.; Bouwkamp, J.G.; Popov, E.P.


    The objectives, methods, and some of the principal results obtained from experimental studies of the behavior of structures subjected to earthquakes are described. Although such investigations are being conducted in many laboratories throughout the world, the information presented deals specifically with projects being carried out at the Earthquake Engineering Research Center (EERC) of the University of California, Berkeley. A primary purpose of these investigations is to obtain detailed information on the inelastic response mechanisms in typical structural systems so that the experimentally observed performance can be compared with computer-generated analytical predictions. Only by such comparisons can the mathematical models used in dynamic nonlinear analyses be verified and improved. Two experimental procedures for investigating earthquake structural response are discussed: the earthquake simulator facility, which subjects the base of the test structure to acceleration histories similar to those recorded in actual earthquakes, and systems of hydraulic rams, which impose specified displacement histories on test components, equivalent to motions developed in structures subjected to actual earthquakes. The general concept and performance of the 20-ft-square EERC earthquake simulator is described, and the testing of a two-story concrete frame building is outlined. Correlation of the experimental results with analytical predictions demonstrates that satisfactory agreement can be obtained only if the mathematical model incorporates a stiffness-deterioration mechanism that simulates the cracking and other damage suffered by the structure.

  20. Development of earthquake early warning system using real time signal of broadband seismogram

    International Nuclear Information System (INIS)

    Gunawan, Hendar; Puspito, Nanang T.; Ibrahim, Gunawan; Harjadi, Prih


    Earthquakes pose a serious threat to lives and property in urban areas near offshore subduction zones and active faults on land. Jakarta and Bandung are examples of big cities that have no earthquake early warning (EEW) system despite very high urbanization and much important infrastructure; the capital city is potentially at high risk from ground shaking. EEW can be a useful tool for reducing earthquake hazard if the spatial relation between cities and earthquake sources is favorable for such warning and citizens are properly trained to respond to early warning messages. An EEW and rapid response system can provide the critical information needed to minimize loss of life and property and to direct rescue. Ground shaking from an earthquake with magnitude M > 6.0 in the megathrust zone south of West Java could cause damage in the West Java area, especially in Bandung and Jakarta. This research develops EEW parameters such as the displacement amplitude (Pd), rapid magnitude determination (M), and peak ground velocity (PGV), exploring a practical approach to EEW using real-time broadband seismogram signals. For an event with epicenter in the megathrust zone, a timely EEW could reach Jakarta roughly 60 seconds before the first ground shaking and 118 seconds before strong shaking, counted from the alarm at the CISI station. EEW notification of potential damage in the West Java area can be triggered by the characteristics Pd > 0.5 cm, M > 6, and PGV > 10 cm/s. GIS is used as a tool for presenting hazard maps of the affected area.
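    The record above reports concrete alert criteria (Pd > 0.5 cm, M > 6, PGV > 10 cm/s). A minimal sketch of that threshold logic, assuming an alert requires all three criteria simultaneously (the abstract does not state how the criteria are combined), could look like:

```python
# Hedged sketch: issue an EEW alert only when the P-wave displacement
# amplitude Pd, the rapid magnitude estimate M, and the predicted peak
# ground velocity PGV all exceed the reported damage thresholds.
PD_THRESHOLD_CM = 0.5
MAG_THRESHOLD = 6.0
PGV_THRESHOLD_CM_S = 10.0

def should_alert(pd_cm, magnitude, pgv_cm_s):
    """Return True if all three EEW damage criteria are met."""
    return (pd_cm > PD_THRESHOLD_CM
            and magnitude > MAG_THRESHOLD
            and pgv_cm_s > PGV_THRESHOLD_CM_S)

print(should_alert(0.8, 6.5, 15.0))  # potentially damaging event
print(should_alert(0.3, 6.5, 15.0))  # Pd below threshold, no alert
```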

  1. Implications of fault constitutive properties for earthquake prediction. (United States)

    Dieterich, J H; Kilgore, B


    The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, scaling of Dc is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs in time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.
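    The rate- and state-dependent formulation referenced above is conventionally written as mu = mu0 + a ln(V/V0) + b ln(V0*theta/Dc), with the "aging" evolution law d(theta)/dt = 1 - V*theta/Dc; at steady state theta = Dc/V, so mu_ss = mu0 + (a - b) ln(V/V0) and a - b < 0 gives velocity weakening. A minimal numeric sketch, with illustrative (not laboratory) parameter values:

```python
import math

# Rate- and state-dependent friction (aging law); parameters are illustrative.
MU0, A, B = 0.6, 0.010, 0.015   # reference friction, rate and state coefficients
V0, DC = 1e-6, 1e-5             # reference slip rate (m/s), characteristic distance Dc (m)

def friction(v, theta):
    """mu = mu0 + a ln(V/V0) + b ln(V0 * theta / Dc)."""
    return MU0 + A * math.log(v / V0) + B * math.log(V0 * theta / DC)

def theta_dot(v, theta):
    """Aging law: d(theta)/dt = 1 - V * theta / Dc (state heals as V -> 0)."""
    return 1.0 - v * theta / DC

# At steady state theta = Dc/V, so mu_ss = mu0 + (a - b) ln(V/V0):
# with a - b < 0 the fault is velocity weakening (faster slip, lower friction).
v = 1e-4
print(round(friction(v, DC / v), 4))
```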

  2. Web Services and Other Enhancements at the Northern California Earthquake Data Center (United States)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.


    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These
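    The REST-style queries described above follow the FDSN web service conventions, so a station-metadata request is just a URL with query parameters. The sketch below builds such a request; the host path (service.ncedc.org) and the BK/BKS network/station pair are assumptions for illustration, so check the current NCEDC documentation for the actual endpoint.

```python
# Hedged example of an FDSN-style station query against the NCEDC
# web services; only the URL is constructed (no network access needed).
from urllib.parse import urlencode

BASE = "https://service.ncedc.org/fdsnws/station/1/query"  # assumed endpoint

params = {
    "network": "BK",      # hypothetical: Berkeley Digital Seismic Network
    "station": "BKS",
    "level": "response",  # include channel response information
    "format": "xml",      # StationXML output
}
url = BASE + "?" + urlencode(params)
print(url)
# The URL could then be fetched with urllib.request.urlopen(url).
```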

  3. Clustered and transient earthquake sequences in mid-continents (United States)

    Liu, M.; Stein, S. A.; Wang, H.; Luo, G.


    Earthquakes result from the sudden release of strain energy on faults. On plate boundary faults, strain energy accumulates steadily from relatively rapid relative plate motion, so large earthquakes continue to occur as long as motion continues on the boundary. In contrast, no such steady accumulation of strain energy occurs on faults in mid-continents, because the far-field tectonic loading is not steadily distributed between faults, and because stress perturbations from complex fault interactions and other stress triggers can be significant relative to the slow tectonic stressing. Consequently, mid-continental earthquakes are often temporally clustered and transient, and spatially migrating. This behavior is well illustrated by large earthquakes in North China over the past two millennia, during which no large earthquake repeated on the same fault segment, but moment release between large fault systems was complementary. Slow tectonic loading in mid-continents also causes long aftershock sequences. We show that the recent small earthquakes in the Tangshan region of North China are aftershocks of the 1976 Tangshan earthquake (M 7.5), rather than indicators of a new phase of seismic activity in North China, as many fear. Understanding the transient behavior of mid-continental earthquakes has important implications for assessing earthquake hazards. The sequence of large earthquakes in the New Madrid Seismic Zone (NMSZ) in the central US, which includes a cluster of M~7 events in 1811-1812 and perhaps a few similar ones in the past millennium, is likely a transient process, releasing previously accumulated elastic strain on recently activated faults. If so, this earthquake sequence will eventually end. Using simple analysis and numerical modeling, we show that the large NMSZ earthquakes may be ending now or in the near future.

  4. Temporal stress changes caused by earthquakes: A review (United States)

    Hardebeck, Jeanne L.; Okada, Tomomi


    Earthquakes can change the stress field in the Earth’s lithosphere as they relieve and redistribute stress. Earthquake-induced stress changes have been observed as temporal rotations of the principal stress axes following major earthquakes in a variety of tectonic settings. The stress changes due to the 2011 Mw9.0 Tohoku-Oki, Japan, earthquake were particularly well documented. Earthquake stress rotations can inform our understanding of earthquake physics, most notably addressing the long-standing problem of whether the Earth’s crust at plate boundaries is “strong” or “weak.” Many of the observed stress rotations, including that due to the Tohoku-Oki earthquake, indicate near-complete stress drop in the mainshock. This implies low background differential stress, on the order of earthquake stress drop, supporting the weak crust model. Earthquake stress rotations can also be used to address other important geophysical questions, such as the level of crustal stress heterogeneity and the mechanisms of postseismic stress reloading. The quantitative interpretation of stress rotations is evolving from those based on simple analytical methods to those based on more sophisticated numerical modeling that can capture the spatial-temporal complexity of the earthquake stress changes.

  5. Earthquakes and Schools (United States)

    National Clearinghouse for Educational Facilities, 2008


    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  6. Bam Earthquake in Iran

    CERN Multimedia


    Following their request for help from members of international organisations, the permanent Mission of the Islamic Republic of Iran has given the following bank account number, where you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  7. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne


    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  8. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.


    Nuclear power plants licensed in Canada have been designed to resist earthquakes; not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding the nature of strong ground motion near the source of an earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times, field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities, and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that for future construction of NPPs, near-field strong motion must be explicitly considered in design

  9. Interpretation of earthquake-induced landslides triggered by the 12 May 2008, M7.9 Wenchuan earthquake in the Beichuan area, Sichuan Province, China using satellite imagery and Google Earth (United States)

    Sato, H.P.; Harp, E.L.


    The 12 May 2008 M7.9 Wenchuan earthquake in the People's Republic of China represented a unique opportunity for the international community to use commonly available GIS (Geographic Information System) tools, like Google Earth (GE), to rapidly evaluate and assess landslide hazards triggered by the destructive earthquake and its aftershocks. In order to map earthquake-triggered landslides, we provide details on the applicability and limitations of publicly available 3-day-post- and pre-earthquake imagery provided by GE from the FORMOSAT-2 (formerly ROCSAT-2; Republic of China Satellite 2). We interpreted landslides on the 8-m-resolution FORMOSAT-2 image in GE; as a result, 257 large landslides were mapped, with the highest concentration along the Beichuan fault. An estimated density of 0.3 landslides/km2 represents a minimum bound on density given the resolution of available imagery; higher resolution data would have identified more landslides. This is a preliminary study, and further study is needed to understand the landslide characteristics in detail. Although it is best to obtain landslide locations and measurements from satellite imagery having high resolution, it was found that GE is an effective and rapid reconnaissance tool. © 2009 Springer-Verlag.

  10. Functional Activation during the Rapid Visual Information Processing Task in a Middle Aged Cohort: An fMRI Study. (United States)

    Neale, Chris; Johnston, Patrick; Hughes, Matthew; Scholey, Andrew


    The Rapid Visual Information Processing (RVIP) task, a serial discrimination task in which performance is believed to reflect sustained attention capabilities, is widely used in behavioural research and increasingly in neuroimaging studies. To date, functional neuroimaging research into the RVIP has been undertaken using block analyses, reflecting the sustained processing involved in the task, but not necessarily the transient processes associated with individual trial performance. Furthermore, this research has been limited to young cohorts. This study assessed the behavioural and functional magnetic resonance imaging (fMRI) outcomes of the RVIP task using both block and event-related analyses in a healthy middle-aged cohort (mean age = 53.56 years, n = 16). The results show that the version of the RVIP used here is sensitive to changes in attentional demand, with participants achieving a 43% accuracy hit rate in the experimental task compared with 96% accuracy in the control task. As shown by previous research, the block analysis revealed an increase in activation in a network of frontal, parietal, occipital and cerebellar regions. The event-related analysis showed a similar network of activation, seemingly omitting regions involved in the processing of the task (as shown in the block analysis), such as occipital areas and the thalamus, providing an indication of a network of regions involved in correct trial performance. Frontal (superior and inferior frontal gyri), parietal (precuneus, inferior parietal lobe) and cerebellar regions were shown to be active in both the block and event-related analyses, suggesting their importance in sustained attention/vigilance. These networks and the differences between them are discussed in detail, as well as implications for future research in middle-aged cohorts.

  12. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.


    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake, which has an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the building losses incurred in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140-300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses would not be indemnified but would be calculated directly on the basis of indexed ground motion levels and damage. The immediate improvement of a parametric insurance model over the existing one would be the elimination of claim processing
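    The parametric payout idea described in this record can be sketched as a tiered payout indexed to a measured ground-motion level; the thresholds and payout fractions below are purely illustrative assumptions, not values from the study:

```python
def parametric_payout(pga_g, tiers=((0.4, 1.00), (0.25, 0.50), (0.15, 0.20))):
    """Return the payout as a fraction of the sum insured for a measured
    peak ground acceleration (in g). Tiers are (threshold, fraction)
    pairs ordered highest first; the first threshold met determines the
    payout, and no claim adjustment is needed. Thresholds are illustrative.
    """
    for threshold, fraction in tiers:
        if pga_g >= threshold:
            return fraction
    return 0.0
```

    Because the payout depends only on an instrumental index, settlement can be immediate, which is exactly the claim-processing advantage the abstract points to.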

  13. Land-Ocean-Atmospheric Coupling Associated with Earthquakes (United States)

    Prasad, A. K.; Singh, R. P.; Kumar, S.; Cervone, G.; Kafatos, M.; Zlotnicki, J.


    Earthquakes are well known to occur along plate boundaries and also on stable shields. Recent studies have shown the existence of strong coupling between land-ocean-atmospheric parameters associated with earthquakes. We have carried out a detailed analysis of multi-sensor data (optical and microwave remote sensing) to show the existence of strong coupling between land, ocean and atmospheric parameters associated with earthquakes with focal depths up to 30 km and magnitudes greater than 5.5. The complementary nature of various land, ocean and atmospheric parameters in providing early warning information about an impending earthquake will be demonstrated.

  14. Real-time earthquake monitoring using a search engine method. (United States)

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong


    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a fast computer search method to a large seismogram database to find the waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.
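    The search-engine idea can be illustrated with a brute-force baseline: score the query seismogram against every stored waveform and return the best fit. The misfit measure, database layout and keys below are illustrative assumptions; the actual system replaces this exact scan with an indexed search several thousand times faster:

```python
def l2_misfit(a, b):
    """Sum of squared differences between two equal-length waveforms."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def best_match(query, database):
    """Exact (brute-force) version of the waveform lookup: return the key
    of the stored seismogram that best fits the query. Keys carry the
    source parameters, so the lookup doubles as parameter estimation."""
    return min(database, key=lambda k: l2_misfit(query, database[k]))

# Toy database of precomputed waveforms keyed by (mechanism, depth).
db = {
    ("strike-slip", "10km"): [0.0, 1.0, 0.5, -0.5],
    ("thrust", "20km"):      [0.0, 0.2, 0.9,  0.4],
}
print(best_match([0.0, 0.9, 0.6, -0.4], db))  # -> ('strike-slip', '10km')
```

    The speedup in the paper comes from replacing this linear scan with an approximate nearest-neighbour index over the seismogram database.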

  15. Seismic-electromagnetic precursors of Romania's Vrancea earthquakes

    International Nuclear Information System (INIS)

    Enescu, B.D.; Enescu, C.; Constantin, A. P.


    Diagrams were plotted from electromagnetic data that were recorded at Muntele Rosu Observatory during December 1996 to January 1997, and December 1997 to September 1998. The times when Vrancea earthquakes of magnitudes M ≥ 3.9 occurred within these periods are marked on the diagrams. The parameters of the earthquakes are given in a table, which also includes information on the magnetic and electric anomalies (perturbations) preceding these earthquakes. The magnetic data prove that Vrancea earthquakes are preceded by magnetic perturbations that may be regarded as their short-term precursors. Perturbations, which could likewise be seen as short-term precursors of Vrancea earthquakes, are also noticed in the electric records. Still, a number of the electric data cast doubt on their precursory nature. Some suggestions are made at the end of the paper on how electromagnetic research should proceed in order to be of use for Vrancea earthquake prediction. (authors)

  16. The plan to coordinate NEHRP post-earthquake investigations (United States)

    Holzer, Thomas L.; Borcherdt, Roger D.; Comartin, Craig D.; Hanson, Robert D.; Scawthorn, Charles R.; Tierney, Kathleen; Youd, T. Leslie


    This is the plan to coordinate domestic and foreign post-earthquake investigations supported by the National Earthquake Hazards Reduction Program (NEHRP). The plan addresses coordination of both the NEHRP agencies—Federal Emergency Management Agency (FEMA), National Institute of Standards and Technology (NIST), National Science Foundation (NSF), and U. S. Geological Survey (USGS)—and their partners. The plan is a framework for both coordinating what is going to be done and identifying responsibilities for post-earthquake investigations. It does not specify what will be done. Coordination is addressed in various time frames ranging from hours to years after an earthquake. The plan includes measures for (1) gaining rapid and general agreement on high-priority research opportunities, and (2) conducting the data gathering and field studies in a coordinated manner. It deals with identification, collection, processing, documentation, archiving, and dissemination of the results of post-earthquake work in a timely manner and easily accessible format.

  17. Earthquake risk assessment of Alexandria, Egypt (United States)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza


    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in much the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that in the second (African continental margin). Running the first scenario, an expected 2.27 % of Alexandria's total constructions (12.9 million, 2006 census) will be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at high seismic risk, two districts (Gharb and Wasat) at moderate risk, and two districts (Al-Gomrok and Burg El-Arab) at a low seismic risk level. Moreover, the building damage estimates show that Al-Montazah is the most vulnerable district, with 73 % of the expected damage reported there. The analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated earthquake risks (building damage) are concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.
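    The quoted percentages translate into absolute counts as follows, a simple check using the census figures given in the abstract:

```python
# Census figures quoted in the abstract (2006 census).
constructions = 12_900_000
population = 4_100_000

# First-scenario (inland dislocation) estimates.
affected = 0.0227 * constructions   # ~292,830 constructions affected
injuries = 0.0019 * population      # ~7,790 people injured
deaths   = 0.0001 * population      # ~410 deaths
```

    Small percentages of a large city still imply hundreds of thousands of affected constructions, which is the point the risk profile makes about densely populated districts.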

  18. Napa earthquake: An earthquake in a highly connected world (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.


    The Napa earthquake recently occurred close to Silicon Valley. This makes it a good candidate for studying what social networks, wearable objects and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the proportion of people publishing tweets and the proportion of people visiting the EMSC (European-Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question to evaluate is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves and that 2 minutes of website traffic is sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare these with the publication times of messages on Twitter. Finally, we check whether the numbers of tweets and of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together these results will tell us whether the reaction of eyewitnesses to ground shaking as observed through Twitter and the EMSC website analysis is tool-specific (i.e. specific to Twitter or the EMSC website) or whether it reflects people's actual reactions.

  19. Investigating landslides caused by earthquakes - A historical review (United States)

    Keefer, D.K.


    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and the relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  20. The earthquake/seismic risk, vulnerability and capacity profile for ...

    African Journals Online (AJOL)

    The study was carried out to understand the risks posed by earthquakes in Karonga based on roles and perception of stakeholders. Information was collected from several stakeholders who were found responding to earthquakes impacts in Karonga Town. The study found that several stakeholders, governmental and ...

  1. Earthquakes, November-December 1977 (United States)

    Person, W.J.


    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500 people.

  2. Earthquakes, September-October 1986 (United States)

    Person, W.J.


    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  3. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake (United States)

    Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten


    In recent decades, great efforts have been expended in real-time seismology aiming at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on rapid estimates of P-wave magnitude, which generally contain large uncertainties and suffer from the known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. Subsequent updates of the magnitude even decreased to M6.3-6.6. Finally, the magnitude estimate stabilized at M8.1 after about two minutes, which consequently led to underestimated tsunami heights. By using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test of the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that it would have been theoretically possible to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after, or even during, the rupture process. In general, what had happened on the fault could be robustly imaged with a time delay of about 30 s using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique can help reduce false and missing warnings, and therefore should play an important role in future tsunami early warning and earthquake rapid response systems.

  4. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.


    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue that has received increased attention over the past few years. In probabilistic studies, sensitivity analyses showed that the choice of the lower-bound magnitude used in hazard calculations can have a larger-than-expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower-bound magnitude of 3.75 while the EPRI study assumed a lower-bound magnitude of 5.0. The magnitudes used were assumed to be body-wave magnitudes or their equivalents. In deterministic studies, recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past, particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of the lower-bound magnitude on probabilistic hazard calculations and to the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed
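    The sensitivity to the lower-bound magnitude can be illustrated with a Gutenberg-Richter recurrence law. The a- and b-values below are illustrative assumptions, and the ratio only counts events; the effect on computed hazard further depends on how strongly small events are weighted by the ground-motion model:

```python
def gr_rate(m_min, a=4.0, b=1.0):
    """Cumulative annual rate of events with magnitude >= m_min under a
    Gutenberg-Richter law, log10 N = a - b*m. The a and b values here
    are illustrative, not taken from the LLNL or EPRI studies."""
    return 10 ** (a - b * m_min)

# How many more events enter the hazard integral with the LLNL lower
# bound (3.75) than with the EPRI lower bound (5.0)?
ratio = gr_rate(3.75) / gr_rate(5.0)  # ~17.8x more events counted
```

    With b = 1, lowering the cutoff from 5.0 to 3.75 admits roughly 10^1.25 ≈ 18 times as many events, which is why the choice can move the calculated hazard noticeably.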

  5. Future of Earthquake Early Warning: Quantifying Uncertainty and Making Fast Automated Decisions for Applications (United States)

    Wu, Stephen

    Earthquake early warning (EEW) systems have been rapidly developing over the past decade. The Japan Meteorological Agency (JMA) has an EEW system that was operating during the 2011 M9 Tohoku earthquake in Japan, and this increased the awareness of EEW systems around the world. While longer-time earthquake prediction still faces many challenges to become practical, the availability of shorter-time EEW opens a new door for earthquake loss mitigation. After an earthquake fault begins rupturing, an EEW system utilizes the first few seconds of recorded seismic waveform data to quickly predict the hypocenter location, magnitude, origin time and the expected shaking intensity level around the region. This early warning information is broadcast to different sites before the strong shaking arrives. The warning lead time of such a system is short, typically a few seconds to a minute or so, and the information is uncertain. These factors limit human intervention to activate mitigation actions, and this must be addressed for engineering applications of EEW. This study applies a Bayesian probabilistic approach along with machine learning techniques and decision theories from economics to improve different aspects of EEW operation, including extending it to engineering applications. Existing EEW systems are often based on a deterministic approach. Often, they assume that only a single event occurs within a short period of time, which led to many false alarms after the Tohoku earthquake in Japan. This study develops a probability-based EEW algorithm, based on an existing deterministic model, that extends the EEW system to the case of concurrent events, which are often observed during the aftershock sequence after a large earthquake. To overcome the challenge of uncertain information and the short lead time of EEW, this study also develops an earthquake probability-based automated decision-making (ePAD) framework to make robust decisions for EEW mitigation applications.
A cost-benefit model that
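    The automated decision idea can be sketched as a one-line expected-cost comparison; this is a minimal illustration of the cost-benefit logic, not the ePAD framework itself, and the numbers are hypothetical:

```python
def should_act(p_damaging, loss_if_unmitigated, loss_if_mitigated, action_cost):
    """Trigger a mitigation action (e.g. stopping a train) when the
    expected loss with the action, plus its cost, is lower than the
    expected loss without it. p_damaging is the EEW system's probability
    that damaging shaking will occur at this site."""
    expected_no_action = p_damaging * loss_if_unmitigated
    expected_action = p_damaging * loss_if_mitigated + action_cost
    return expected_action < expected_no_action

# High shaking probability: acting is worth the cost.
print(should_act(0.30, 100.0, 10.0, 5.0))  # -> True
# Low shaking probability: the action cost dominates.
print(should_act(0.01, 100.0, 10.0, 5.0))  # -> False
```

    Automating this comparison is what removes the need for human intervention within the few-second lead time the abstract describes.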

  6. U.S. Geological Survey (USGS) Earthquake Web Applications (United States)

    Fee, J.; Martinez, E.


    USGS Earthquake web applications provide access to earthquake information from the USGS and other Advanced National Seismic System (ANSS) contributors. One of the primary goals of these applications is to provide a consistent experience for accessing both near-real-time information as soon as it is available and historic information after it is thoroughly reviewed. Millions of people use these applications every month, including people who feel an earthquake, emergency responders looking for the latest information about a recent event, and scientists researching historic earthquakes and their effects. Information from multiple catalogs and contributors is combined by the ANSS Comprehensive Catalog into one composite catalog, identifying the most preferred information from any source for each event. A web service and near-real-time feeds provide access to all contributed data, and are used by a number of users and software packages. The Latest Earthquakes application displays summaries of many events, from either near-real-time feeds or custom searches, and the Event Page application shows detailed information for each event. Because all data are accessed through the web service, they can also be downloaded by users. The applications are maintained as open-source projects on GitHub, and use mobile-first and responsive-web-design approaches to work well on both mobile devices and desktop computers.
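    The feeds these applications expose are GeoJSON feature collections. A minimal sketch of pulling magnitudes and places out of one follows; the sample payload is a fabricated stand-in that mimics the feed layout (each feature's `properties` carrying `mag` and `place`):

```python
import json

# A minimal fabricated sample in the GeoJSON layout used by the feeds.
feed = json.loads("""{
  "features": [
    {"properties": {"mag": 5.9, "place": "Xinjiang, China"},
     "geometry": {"coordinates": [85.3, 40.0, 10.0]}}
  ]
}""")

# Extract (magnitude, place) for each event in the collection.
events = [(f["properties"]["mag"], f["properties"]["place"])
          for f in feed["features"]]
print(events)  # -> [(5.9, 'Xinjiang, China')]
```

    A real client would fetch the feed over HTTP instead of parsing a literal string, but the extraction step is the same.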

  7. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo


    The recent commemoration of the centennial of the 1906 San Francisco and Valparaiso earthquakes has provided an opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, strongly influenced the birth and development of earthquake engineering. The study of the seismic performance of some still-existing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods currently in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study, only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since accelerograms of the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings considers instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. It is therefore recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  8. Sun, Moon and Earthquakes (United States)

    Kolvankar, V. G.


    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes in small areas [typically 1000 km x 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] is linked to the sum of EMD [longitude of the earthquake location minus longitude of the Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper details this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes in these regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00 to 24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively demonstrates how the Sun and the Moon govern all these earthquakes. Fig. 12 [A+B]: the left-hand panel is a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.98, Mb 9.0, earthquake count 376); the right-hand panel plots (EMD+SEM) vs. GMT timings for the same data. All 376 events, including the main event, faithfully follow the straight line.

  9. Focal Mechanism of a Catastrophic Earthquake of the Last Rococo Period (1783) in Southern Italy Retrieved by Inverting Historical Information on Damage (United States)

    Sirovich, L.; Pettenati, F.


    Using geophysical inversion to discover the fault source of a blind earthquake that occurred before the invention of the seismograph once seemed impossible. We demonstrated that it is sometimes possible using our simplified KF model (Sirovich, 1996) through automatic genetic inversion (Gentile et al., 2004 in BSSA; Sirovich and Pettenati, 2004 in JGR), and confirmed this conclusively by treating the Coalinga 1983, Loma Prieta 1989, and Northridge 1994 earthquakes (Pettenati and Sirovich, 2007 in BSSA). KF is able to simulate the body-wave radiation from a linear source, and eleven source parameters are retrieved: the three nucleation coordinates, the fault-plane solution, the seismic moment, the rupture velocities and lengths along-strike and anti-strike, and the shear-wave velocity in the half-space. To find the minima on the hypersurface of the residuals in the multi-parameter model space, we use a genetic process with niching, since we have already shown that the problem is bimodal for pure dip-slip mechanisms. The objective function of the nonlinear inversion is the sum of the squared residuals (calculated minus observed intensity at all sites). Here, we use the very good intensity data provided in the MCS scale by the INGV of Italy for the M 6.9 earthquake of Feb. 5, 1783 (see the Italian intensity data bank online). The data for 1783 were created by seismologists and historians who interpreted the reports of the time and many other historical sources. Given the limitations of the KF approach, we restricted our inversion to a square area of 200 by 200 km around the most heavily damaged zone. 341 surveyed towns and hamlets were assigned intensity degrees by INGV (we discarded 6 of them as statistical outliers according to the classical Chauvenet method); thus, 335 data points were inverted. The match between experimental and synthetic isoseismals is noteworthy. The mechanism found is almost pure dip-slip and, thus, the problem is

  10. The impacts of climate change on poverty in 2030, and the potential from rapid, inclusive and climate-informed development (United States)

    Rozenberg, J.; Hallegatte, S.


    There is consensus that poor people are more vulnerable to climate change than the rest of the population but, until recently, few quantified estimates had been proposed and few frameworks existed for designing policies to address the issue. In this paper, we analyze the impacts of climate change on poverty using micro-simulation approaches. We start from household surveys that describe the current distribution of income and occupations, project these households into the future, and examine the impacts of climate change on people's income. To project households into the future, we explore a large range of assumptions on future demographic changes (including education), technological changes, and socio-economic trends (including redistribution policies). This approach allows us to identify the main combinations of factors that lead to fast poverty reduction, and those that lead to high climate change impacts on the poor. Identifying these factors is critical for designing efficient policies to protect the poorest from climate change impacts and for making economic growth more inclusive. Conclusions are twofold. First, by 2030 climate change can have a large impact on poverty, with between 3 and 122 million more people in poverty, but climate change remains a secondary driver of poverty trends within this time horizon. Climate change impacts do not only affect the poorest: in 2030, the bottom 40 percent lose more than 4 percent of income in many countries. The regional hotspots are Sub-Saharan Africa and, to a lesser extent, India and the rest of South Asia. The most important channels through which climate change increases poverty are agricultural income and food prices. Second, by 2030 and in the absence of surprises on climate impacts, inclusive climate-informed development can prevent most (but not all) of the impacts on poverty. In a scenario with rapid, inclusive and climate-proof development, climate change impact on poverty is

  11. Toward tsunami early warning system in Indonesia by using rapid rupture durations estimation

    International Nuclear Information System (INIS)



    Indonesia has operated the Indonesian Tsunami Early Warning System (Ina-TEWS) since 2008. Ina-TEWS uses automatic processing of the hypocenter and of Mwp, Mw (mB) and Mj. If an earthquake occurs in the ocean and meets the depth and magnitude criteria, Ina-TEWS announces an early warning that the earthquake may generate a tsunami. However, these announcements are still not accurate. The purpose of this research is to estimate the rupture duration of large Indonesian earthquakes that occurred in the Indian Ocean, the Java and Timor seas, the Banda and Arafura seas, and the Pacific Ocean. We analyzed at least 330 vertical seismograms recorded by the IRIS-DMC network, using a direct procedure for rapid assessment of earthquake tsunami potential based on simple measures of P-wave vertical velocity records, in particular the high-frequency apparent rupture duration, Tdur. Tdur can be related to the critical parameters rupture length (L), depth (z) and shear modulus (mu), and may also be related to fault width (W), slip (D), z or mu. Our analysis shows that rupture duration has a stronger influence on tsunami generation than Mw or depth, and gives more information on tsunami impact, Mo/mu, depth and size than Mw and other currently used discriminants. The longer the rupture duration, the shallower the earthquake source: for rupture durations greater than 50 s, the depth is less than 50 km and Mw is greater than 7, and the rupture length is longer, because Tdur is proportional to L, as is Mo/mu. Rupture duration alone thus conveys information on all four parameters. We also suggest that tsunami potential is not directly related to the faulting type of the source, and that events with rupture durations greater than 50 s generated tsunamis. With available real-time seismogram data and rapid calculation, the rupture-duration discriminant can be completed within 4-5 min after an earthquake.
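
    The duration discriminant described above can be sketched in a few lines. The 50 s threshold comes from the abstract; the envelope-based duration measure is one simple operationalization, not the authors' exact procedure:

```python
import numpy as np

# Threshold from the abstract: rupture durations above 50 s flag tsunami potential.
TDUR_THRESHOLD_S = 50.0

def apparent_duration(hf_envelope, dt, frac=0.25):
    """Apparent rupture duration: time the high-frequency P-wave envelope
    stays above `frac` of its peak. One simple operationalization; the
    published procedure differs in detail."""
    env = np.asarray(hf_envelope, dtype=float)
    return float(np.count_nonzero(env >= frac * env.max()) * dt)

def tsunami_flag(tdur_seconds):
    """True when the apparent rupture duration suggests tsunami potential."""
    return tdur_seconds > TDUR_THRESHOLD_S
```

In use, the envelope would be built from band-pass filtered vertical velocity records immediately after the P arrival, which is what makes the 4-5 min warning latency feasible.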

  12. Using remote sensing to predict earthquake impacts (United States)

    Fylaktos, Asimakis; Yfantidou, Anastasia


    Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), leading to substantial loss of life and extensive damage to property, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel-1A and optical imagery from Landsat 8 to examine differences in topography with the aid of the multi-temporal monitoring technique, which is well suited to observing surface deformation of any kind. The database is a cluster of information on the consequences of the earthquakes, grouped into categories such as property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformation, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of such earthquakes. In the future, we can enrich this database with more regions and enhance the variety of its applications. For instance, we could predict the future impacts of any type of earthquake in several areas and design a preliminary emergency model for immediate evacuation and quick recovery response. It is important to know how the surface moves, particularly in geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.

  13. Toward standardization of slow earthquake catalog -Development of database website- (United States)

    Kano, M.; Aso, N.; Annoura, S.; Arai, R.; Ito, Y.; Kamaya, N.; Maury, J.; Nakamura, M.; Nishimura, T.; Obana, K.; Sugioka, H.; Takagi, R.; Takahashi, T.; Takeo, A.; Yamashita, Y.; Matsuzawa, T.; Ide, S.; Obara, K.


    Slow earthquakes have now been widely discovered around the world thanks to the recent development of geodetic and seismic observations. Many researchers detect slow earthquakes across a wide frequency range, including low frequency tremors, low frequency earthquakes, very low frequency earthquakes and slow slip events, using various methods. Catalogs of the detected slow earthquakes are available in different formats from each referring paper or through websites (e.g., Wech 2010; Idehara et al. 2014). However, we need to download catalogs from different sources, deal with unformatted catalogs and understand the characteristics of the different catalogs, which may be somewhat complex, especially for those who are not familiar with slow earthquakes. In order to standardize slow earthquake catalogs and simplify this complicated work, the Scientific Research on Innovative Areas project "Science of Slow Earthquakes" has been developing a slow earthquake catalog website. On the website, we can plot the locations of various slow earthquakes via Google Maps by compiling a variety of slow earthquake catalogs, including slow slip events. This enables us to clearly visualize spatial relations among slow earthquakes at a glance and to compare the regional activities of slow earthquakes or the locations reported in different catalogs. In addition, we can download catalogs in a unified format and consult the information on each catalog on a single website. Such standardization will make it more convenient for users to build on previous achievements and will promote research on slow earthquakes, eventually leading to collaborations with researchers in various fields and further understanding of the mechanisms, environmental conditions, and underlying physics of slow earthquakes. Furthermore, we expect the website to play a leading role in the international standardization of slow earthquake catalogs. We report the overview of the website and the progress of its construction.
Acknowledgment: This

  14. Real-time earthquake data feasible (United States)

    Bush, Susan

    Scientists agree that early warning devices and monitoring of both Hurricane Hugo and the Mt. Pinatubo volcanic eruption saved thousands of lives. What would it take to develop this sort of early warning and monitoring system for earthquake activity? Not all that much, claims a panel assigned to study the feasibility, costs, and technology needed to establish a real-time earthquake monitoring (RTEM) system. The panel, convened by the National Academy of Sciences' Committee on Seismology, has presented its findings in Real-Time Earthquake Monitoring. The recently released report states that "present technology is entirely capable of recording and processing data so as to provide real-time information, enabling people to mitigate somewhat the earthquake disaster." RTEM systems would consist of two parts: an early warning system that would give a few seconds' warning before severe shaking, and immediate post-quake information within minutes of the quake that would give actual measurements of the magnitude. At this time, however, this type of warning system has not been addressed at the national level for the United States and is not included in the National Earthquake Hazard Reduction Program, according to the report.

  15. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes (United States)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh


    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses centroid moment tensor solutions of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point-source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism, within 2 min of the occurrence of an earthquake. All of the source parameters are then automatically forwarded to the ROS to perform an earthquake simulation based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model was developed for the whole of Taiwan in this study. We improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible: users can conduct their own earthquake simulations by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica. Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real time.
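
    The RMT-to-ROS handoff amounts to passing a point-source parameter set. A hypothetical container for those parameters might look like the following (the field names and values are illustrative assumptions, not the actual ROS interface):

```python
from dataclasses import dataclass, asdict

@dataclass
class PointSource:
    """Hypothetical container for the point-source parameters the abstract
    lists as the RMT-to-ROS handoff. Field names are illustrative only."""
    origin_time: str   # UTC, ISO 8601
    longitude: float   # degrees east
    latitude: float    # degrees north
    depth_km: float
    mw: float          # moment magnitude
    strike: float      # focal mechanism, degrees
    dip: float
    rake: float

# Generic example values, not a real event.
src = PointSource("2023-01-01T00:00:00Z", 121.0, 23.5, 10.0, 6.0,
                  strike=5.0, dip=30.0, rake=60.0)
payload = asdict(src)  # the kind of dictionary a simulation request might carry
```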

  16. Prospective testing of Coulomb short-term earthquake forecasts (United States)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.


    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models that estimate these stresses and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary responses. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each "perpetrator" earthquake but before the triggered earthquakes, or "victims". The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily and allows daily updates of the models; however, a lot can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough, and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be computed without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for "censoring" of early aftershock data, and a quantitative model for the detection threshold as a function of
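
    One standard CSEP-style consistency check is the number (N-) test, which asks whether the observed earthquake count is plausible under the forecast's expected rate. Below is a simplified sketch (the operational CSEP tests are more elaborate):

```python
import math

def poisson_cdf(k, lam):
    """P(N <= k) for N ~ Poisson(lam)."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k + 1))

def n_test(forecast_rate, observed_count, alpha=0.05):
    """Simplified CSEP-style number test: the forecast is consistent with the
    observation when the observed count falls in neither extreme tail of the
    Poisson distribution implied by the forecast rate."""
    p_low = poisson_cdf(observed_count, forecast_rate)  # P(N <= observed)
    if observed_count > 0:
        p_high = 1.0 - poisson_cdf(observed_count - 1, forecast_rate)  # P(N >= observed)
    else:
        p_high = 1.0
    consistent = p_low > alpha / 2.0 and p_high > alpha / 2.0
    return consistent, p_low, p_high
```

For example, a model forecasting 10 events passes when 9 are observed but fails when 25 are observed, which is the kind of automatic, parameter-free comparison prospective testing requires.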

  17. Earthquake Early Warning in Japan - Result of recent two years - (United States)

    Shimoyama, T.; Doi, K.; Kiyomoto, M.; Hoshiba, M.


    The Japan Meteorological Agency (JMA) started to provide Earthquake Early Warning (EEW) to the general public in October 2007. This followed the provision of EEW, from August 2006, to a limited number of users who understood the technical limits of EEW and could utilize it for automatic control. Earthquake Early Warning in Japan specifically means information on the estimated amplitude and arrival time of strong ground motion after a fault rupture has occurred. In other words, the EEW provided by JMA is a forecast of strong ground motion issued before the strong motion arrives. JMA's EEW enables advance countermeasures against disasters caused by strong ground motion by providing a warning message before the S-wave arrival. However, because the available time period is very short, measures and ideas are needed to deliver EEW rapidly and utilize it properly. - EEW is issued to the general public when a maximum seismic intensity of 5 lower (JMA scale) or greater is expected. - The EEW message contains the origin time, the epicentral region name, and the names of areas (each unit is about 1/3 to 1/4 of a prefecture) where seismic intensity 4 or greater is expected; the expected arrival time is not included because it differs substantially even within one unit area. - EEW is broadcast through the broadcasting media (TV, radio and City Administrative Disaster Management Radio) and is delivered to cellular phones through a cell broadcast system. For those who would like more precise estimates, and information on smaller earthquakes, at the locations of their own properties, JMA allows designated private companies to provide forecasts of strong ground motion, containing the estimated seismic intensity as well as the S-wave arrival time at arbitrary places, under JMA's technical assurance. From October 2007 to August 2009, JMA issued 11 warnings to the general public expecting seismic intensity "5 lower" or greater, including the M=7.2 inland

  18. Earthquake Ground Motion Selection (United States)


    Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

  19. 1988 Spitak Earthquake Database (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 1988 Spitak Earthquake database is an extensive collection of geophysical and geological data, maps, charts, images and descriptive text pertaining to the...

  20. Limitation of the Predominant-Period Estimator for Earthquake Early Warning and the Initial Rupture of Earthquakes (United States)

    Yamada, T.; Ide, S.


    Earthquake early warning is an important and challenging issue for the reduction of seismic damage, especially for the mitigation of human suffering. One of the most important problems in earthquake early warning systems is how quickly we can estimate the final size of an earthquake after we observe the ground motion. This is related to the question of whether the initial rupture of an earthquake carries information about its final size. Nakamura (1988) developed the Urgent Earthquake Detection and Alarm System (UrEDAS), which calculates the predominant period of the P wave (τp) and estimates the magnitude of an earthquake immediately after the P-wave arrival from the value of τpmax, the maximum value of τp. A similar approach has been adopted by other earthquake alarm systems (e.g., Allen and Kanamori (2003)). To investigate the characteristics of the parameter τp and the effect of the length of the time window (TW) in the τpmax calculation, we analyze high-frequency recordings of earthquakes at very close distances in the Mponeng mine in South Africa. We find that values of τpmax have upper and lower limits. For larger earthquakes, whose source durations are longer than TW, the values of τpmax have an upper limit that depends on TW. On the other hand, the values for smaller earthquakes have a lower limit that is proportional to the sampling interval. For intermediate earthquakes, the values of τpmax are close to their typical source durations. These two limits and the slope for intermediate earthquakes yield an artificial final-size dependence of τpmax over a wide size range. The parameter τpmax is useful for detecting large earthquakes and broadcasting earthquake early warnings. However, its dependence on the final size of earthquakes does not imply that the earthquake rupture is deterministic, because τpmax does not always have a direct relation to the physical quantities of an earthquake.
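
    The recursive τp estimator referred to above (Nakamura 1988; Allen and Kanamori 2003) can be sketched as follows; the smoothing constant chosen here is illustrative:

```python
import numpy as np

def tau_p_max(velocity, dt, alpha=0.99):
    """Recursive predominant-period estimator tau_p (in the spirit of
    Nakamura 1988 and Allen & Kanamori 2003) and its running maximum.

    velocity -- vertical-component velocity seismogram
    dt       -- sampling interval in seconds
    alpha    -- one-sample exponential smoothing constant (illustrative value)
    """
    v = np.asarray(velocity, dtype=float)
    dv = np.gradient(v, dt)              # time derivative of velocity
    X = D = 0.0
    tp = np.zeros_like(v)
    for i in range(len(v)):
        X = alpha * X + v[i] ** 2        # smoothed velocity power
        D = alpha * D + dv[i] ** 2       # smoothed derivative power
        tp[i] = 2.0 * np.pi * np.sqrt(X / D) if D > 0 else 0.0
    return tp, float(tp.max())
```

Fed a narrow-band signal, the estimator converges on the signal period, which is exactly why its value saturates once the time window TW is shorter than the source duration.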

  1. Matrix model of information support of the leadership and functional services of the section of rapid information. Matrichnaya model' informatsionnogo obespecheniya rukovodstva i funktsional'nykh sluzhb razreza operativnoy informatsiey

    Energy Technology Data Exchange (ETDEWEB)

    Zevina, V V; Katrichenko, A N


    A mathematical description is given of the matrix model for the rapid presentation of information in the automated control system of the production process of the open-pit section, using methods from the theory of relations. The system of rapid control of the open pit is examined, in its informational aspect, as a set of numerous objects, aggregated indicators and indicator states. The sets are described as frameworks, and the zones of the matrix as a framework composed of the aforementioned frameworks. Properties of the matrix, the formation of zones and the periodicity of presentation are described.

  2. Electromagnetic Manifestation of Earthquakes


    Uvarov Vladimir


    In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.


  4. Earthquake early warning using P-waves that appear after initial S-waves (United States)

    Kodera, Y.


    To address underprediction for large earthquakes with finite faults and overprediction for multiple simultaneous earthquakes, Hoshiba (2013), Hoshiba and Aoki (2015), and Kodera et al. (2016) proposed earthquake early warning (EEW) methods that directly predict ground motion by computing the wave propagation of observed ground motion. These methods are expected to predict ground motion with high accuracy even in complicated scenarios, because they do not need source parameter estimation. On the other hand, there is room for improvement in their rapidity, because they base their predictions mainly on the observation of S-waves and do not explicitly use the P-wave information available before the S-waves. In this research, we propose a real-time P-wave detector to incorporate P-wave information into these wavefield-estimation approaches. P-waves within a few seconds of the P-onsets are commonly used in many existing EEW methods. In addition, we focus on P-waves that may appear in the later part of the seismic waves. Kurahashi and Irikura (2013) noted that P-waves radiated from strong motion generation areas (SMGAs) were recognizable after the S-waves from the initial rupture point in the 2011 off the Pacific coast of Tohoku earthquake (Mw 9.0) (the Tohoku-oki earthquake). Detecting these P-waves would enhance the rapidity of prediction of the peak ground motion generated by SMGAs. We constructed a real-time P-wave detector that uses a polarity analysis. Using acceleration records in boreholes of KiK-net (band-pass filtered around 0.5-10 Hz with site amplification correction), the P-wave detector performs principal component analysis with a sliding window of 4 s and calculates P-filter values (e.g., Ross and Ben-Zion, 2014). Application to the Tohoku-oki earthquake (Mw 9.0) showed that (1) peaks of the P-filter corresponding to SMGAs appeared at several stations located near the SMGAs and (2) real-time seismic intensities (Kunugi et al
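
    A sliding-window polarization measure of the kind described can be sketched as below. This is a simplified stand-in for the P-filter of Ross and Ben-Zion (2014), not the authors' exact implementation:

```python
import numpy as np

def p_filter(z, n, e, win_samples):
    """Sliding-window polarization measure (simplified sketch).

    For each window, the covariance matrix of the three components
    (vertical, north, east) is eigen-decomposed; the returned value
    combines rectilinearity with how vertical the dominant eigenvector is,
    so near-vertical, linearly polarized (P-like) motion scores near 1.
    """
    data = np.vstack([z, n, e])
    out = np.zeros(data.shape[1])
    for i in range(win_samples, data.shape[1]):
        w = data[:, i - win_samples:i]
        w = w - w.mean(axis=1, keepdims=True)
        cov = w @ w.T / win_samples
        vals, vecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
        if vals[2] <= 0:
            continue
        rect = 1.0 - (vals[1] + vals[0]) / (2.0 * vals[2])  # rectilinearity
        vertical = abs(vecs[0, 2])  # |z-component| of dominant eigenvector
        out[i] = rect * vertical
    return out
```

Peaks in such a trace during the S-wave coda are the signature the detector looks for: late, near-vertical, linearly polarized arrivals radiated by SMGAs.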

  5. Study of Earthquake Disaster Prediction System of Langfang city Based on GIS (United States)

    Huang, Meng; Zhang, Dian; Li, Pan; Zhang, YunHui; Zhang, RuoFei


    Given the need to improve China's earthquake disaster prevention capability, this paper puts forward an implementation plan for an earthquake disaster prediction system for Langfang city based on GIS. Building on a GIS spatial database and using coordinate transformation technology, GIS spatial analysis technology and PHP development technology, a seismic damage factor algorithm is used to predict the damage to the city under earthquake disasters of different intensities. The earthquake disaster prediction system of Langfang city adopts a B/S system architecture and provides damage-degree and spatial-distribution display with two-dimensional visualization, comprehensive query and analysis, and efficient auxiliary decision-making functions to identify the seismically weak parts of the city and issue rapid warnings. The system has transformed the city's earthquake disaster reduction work from static planning to dynamic management, and improved the city's earthquake and disaster prevention capability.

  6. Investigating Landslides Caused by Earthquakes A Historical Review (United States)

    Keefer, David K.

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on them are of limited usefulness; for example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and the relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  7. Real-Time Earthquake Intensity Estimation Using Streaming Data Analysis of Social and Physical Sensors (United States)

    Kropivnitskaya, Yelena; Tiampo, Kristy F.; Qin, Jinhui; Bauer, Michael A.


    Earthquake intensity is one of the key components of the decision-making process for disaster response and emergency services. Accurate and rapid intensity calculations can help to reduce total loss and the number of casualties after an earthquake. Modern intensity assessment procedures handle a variety of information sources, which can be divided into two main categories. The first type of data is that derived from physical sensors, such as seismographs and accelerometers, while the second type consists of data obtained from social sensors, such as witness observations of the consequences of the earthquake itself. Estimation approaches that use additional data sources or combine sources from both data types tend to increase intensity uncertainty due to human factors and inadequate procedures for temporal and spatial estimation, resulting in precision errors in both time and space. Here we present a processing approach for the real-time analysis of streams of data from both source types. The physical sensor data are acquired from the U.S. Geological Survey (USGS) seismic network in California, and the social sensor data are based on Twitter user observations. First, empirical relationships between tweet rate and observed Modified Mercalli Intensity (MMI) are developed using data from the M6.0 South Napa, CA earthquake that occurred on August 24, 2014. Second, the streams of both data types are analyzed together in simulated real-time to produce one intensity map. The implementation is based on IBM InfoSphere Streams, a cloud platform for real-time analytics of big data. To handle large processing workloads for data from various sources, it is deployed and run on a cloud-based cluster of virtual machines. We compare the quality and evolution of intensity maps from different data sources over 10-min time intervals immediately following the earthquake. Results from the joint analysis show that it provides more complete coverage, with better accuracy and higher
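    The first step above, an empirical tweet-rate-to-MMI relationship, can be sketched as a simple log-linear fit. The functional form and the calibration pairs below are illustrative assumptions, not the authors' actual regression:

```python
import math

def fit_rate_to_mmi(rates, mmis):
    """Ordinary least squares fit of MMI = a + b * log10(tweet rate).

    `rates` are tweets per minute observed where a Modified Mercalli
    Intensity value in `mmis` was independently reported.
    """
    xs = [math.log10(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(mmis) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, mmis))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def estimate_mmi(rate, a, b):
    """Map an observed tweet rate to an intensity estimate."""
    return a + b * math.log10(rate)

# Hypothetical calibration pairs (tweets/min, observed MMI):
a, b = fit_rate_to_mmi([1, 10, 100], [3.0, 5.0, 7.0])
```

    In a streaming deployment, the fitted relation would be applied per geographic cell and merged with instrumental intensities to produce the joint map.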

  8. ShakeMapple : tapping laptop motion sensors to map the felt extents of an earthquake (United States)

    Bossu, Remy; McGilvary, Gary; Kamb, Linus


    There is a significant pool of untapped sensor resources in the motion sensors embedded in portable computers. Included primarily to detect sudden strong motion so that the disk heads can be parked before a fall or other severe movement damages the disks, these sensors may also be tapped for other uses. We have developed a system that takes advantage of the embedded Sudden Motion Sensors in Apple Macintosh laptops to record earthquake strong motion data and rapidly build maps of where, and to what extent, an earthquake has been felt. After an earthquake, it is vital to understand the damage caused, especially in urban environments, where damage is often concentrated. Gathering information on these impacts to determine which areas are likely to be most affected can aid in distributing emergency services effectively. The ShakeMapple system operates in the background, continuously saving the most recent data from the motion sensors. After an earthquake has occurred, the ShakeMapple system calculates the peak acceleration within a time window around the expected arrival and sends it to servers at the EMSC. A map plotting the felt responses is then generated and presented on the web. Because large-scale testing of such an application is inherently difficult, we propose to organize a broadly distributed "simulated event" test. The software will be available for download in April, after which we plan to organize a large-scale test by the summer. At a specified time, participating testers will be asked to create their own strong motion to be registered and submitted by the ShakeMapple client. From these responses, a felt map will be produced representing the broadly-felt effects of the simulated event.
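    The client-side step described above, extracting the peak acceleration in a window around the expected wave arrival, might look like the following sketch; the window length and data layout are assumptions:

```python
def peak_acceleration(samples, times, t_arrival, half_window=30.0):
    """Peak absolute acceleration within +/- half_window seconds of the
    expected wave arrival. Returns None if no samples fall in the window."""
    window = [a for a, t in zip(samples, times)
              if t_arrival - half_window <= t <= t_arrival + half_window]
    return max(abs(a) for a in window) if window else None

# Hypothetical sensor readings (in g) with their timestamps in seconds:
times = [0.0, 10.0, 20.0, 30.0, 40.0, 50.0]
samples = [0.01, -0.02, 0.15, -0.30, 0.05, 0.01]
pga = peak_acceleration(samples, times, t_arrival=25.0, half_window=15.0)
```

    Only this single peak value needs to be uploaded, which keeps the client's network footprint small.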

  9. Flashsourcing or Real-Time Mapping of Earthquake Effects from Instantaneous Analysis of the EMSC Website Traffic (United States)

    Bossu, R.; Gilles, S.; Roussel, F.


    Earthquake response efforts are often hampered by the lack of timely and reliable information on an earthquake's impact. Rapid detection of damaging events and production of actionable information for emergency response personnel within minutes of their occurrence are essential to mitigate the human impacts of earthquakes. Economically developed countries deploy dense real-time accelerometric networks in regions of high seismic hazard to constrain scenarios from in-situ data. A cheaper alternative, named flashsourcing, is based on implicit data derived from analysing the visits of eyewitnesses, the first informed persons, to websites offering real-time earthquake information. We demonstrated in 2004 that widely felt earthquakes generate a surge of traffic, known as a flashcrowd, caused by people rushing to websites such as the EMSC's to find information about the shaking they have just felt. With detailed traffic analysis and metrics, widely felt earthquakes can be detected within one minute of the earthquake's occurrence. In addition, the geographical area where the earthquake has been felt is automatically mapped within 5 minutes by statistically analysing the IP locations of the eyewitnesses, without using any seismological data. These results have been validated on more than 150 earthquakes by comparing the automatic felt maps with the felt area derived from macroseismic questionnaires. In practice, the felt maps are available before the first location is published by the EMSC. We have also demonstrated the capacity to rapidly detect and map areas of widespread damage by detecting when visitors suddenly end their sessions on the website en masse. This has been successfully applied to time and map the massive power failure which plunged a large part of Chile into darkness in March 2010. If damage to power and communication lines cannot be discriminated from damage to buildings, the absence of sudden session closures precludes the possibility of heavy
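    A minimal sketch of the flashcrowd trigger described above, assuming a simple ratio-to-baseline rule over per-minute hit counts (the EMSC's production traffic metrics are more elaborate than this):

```python
from collections import deque

class FlashcrowdDetector:
    """Flags a felt earthquake when website hits in the current minute
    exceed a multiple of the recent baseline (parameters are assumptions)."""

    def __init__(self, baseline_minutes=30, surge_factor=5.0):
        self.history = deque(maxlen=baseline_minutes)
        self.surge_factor = surge_factor

    def observe(self, hits_this_minute):
        """Record one minute of traffic; return True on a surge."""
        baseline = sum(self.history) / len(self.history) if self.history else 0.0
        self.history.append(hits_this_minute)
        return baseline > 0 and hits_this_minute > self.surge_factor * baseline

# Four quiet minutes, then a surge typical of a widely felt event:
detector = FlashcrowdDetector()
flags = [detector.observe(h) for h in [100, 110, 95, 105, 900]]
```

    The same per-minute visitor records carry IP geolocations, which can then be clustered to outline the felt area without any seismological data.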

  10. Charles Darwin's earthquake reports (United States)

    Galiev, Shamil


    As 2009 is the 200th anniversary of Darwin's birth, it has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage, Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked; volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after it. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and its dynamics. These ideas set the tone for the plate tectonic theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  11. Development of the U.S. Geological Survey's PAGER system (Prompt Assessment of Global Earthquakes for Response) (United States)

    Wald, D.J.; Earle, P.S.; Allen, T.I.; Jaiswal, K.; Porter, K.; Hearne, M.


    The Prompt Assessment of Global Earthquakes for Response (PAGER) System plays a primary alerting role for global earthquake disasters as part of the U.S. Geological Survey’s (USGS) response protocol. We provide an overview of the PAGER system, both of its current capabilities and our ongoing research and development. PAGER monitors the USGS’s near real-time U.S. and global earthquake origins and automatically identifies events that are of societal importance, well in advance of ground-truth or news accounts. Current PAGER notifications and Web pages estimate the population exposed to each seismic intensity level. In addition to being a useful indicator of potential impact, PAGER’s intensity/exposure display provides a new standard in the dissemination of rapid earthquake information. We are currently developing and testing a more comprehensive alert system that will include casualty estimates. This is motivated by the idea that an estimated range of the possible number of deaths will aid in decisions regarding humanitarian response. Underlying the PAGER exposure and loss models are global earthquake ShakeMap shaking estimates, constrained as quickly as possible by finite-fault modeling and observed ground motions and intensities, when available. Loss modeling is being developed comprehensively with a suite of candidate models that range from fully empirical to largely analytical approaches. Which of these models is most appropriate for use in a particular earthquake depends on how much is known about local building stocks and their vulnerabilities. A first-order country-specific global building inventory has been developed, as have corresponding vulnerability functions. For calibrating PAGER loss models, we have systematically generated an Atlas of 5,000 ShakeMaps for significant global earthquakes during the last 36 years. For many of these, auxiliary earthquake source and shaking intensity data are also available. Refinements to the loss models are ongoing.
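    The intensity/exposure display described above reduces to summing population over ShakeMap grid cells by shaking level; a minimal sketch, assuming integer MMI binning (PAGER's actual aggregation is more detailed):

```python
def exposure_by_intensity(cells):
    """Total population exposed at each (rounded) MMI level.

    `cells` holds (predicted_mmi, population) pairs for ShakeMap grid
    cells; rounding to integer MMI is a simplifying assumption.
    """
    exposure = {}
    for mmi, population in cells:
        level = int(round(mmi))
        exposure[level] = exposure.get(level, 0) + population
    return exposure

# Hypothetical grid cells: (predicted MMI, population)
summary = exposure_by_intensity(
    [(4.2, 10000), (5.6, 2500), (6.1, 800), (5.9, 1200)])
```

    A casualty model would then apply a per-intensity fatality rate to each bin rather than reporting raw exposure.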

  12. Fault roughness and strength heterogeneity control earthquake size and stress drop

    KAUST Repository

    Zielke, Olaf


    An earthquake's stress drop is related to the frictional breakdown during sliding and constitutes a fundamental quantity of the rupture process. High-speed laboratory friction experiments that emulate the rupture process imply stress drop values that greatly exceed those commonly reported for natural earthquakes. We hypothesize that this stress drop discrepancy is due to fault-surface roughness and strength heterogeneity: an earthquake's moment release and its recurrence probability depend not only on stress drop and rupture dimension but also on the geometric roughness of the ruptured fault and the location of failing strength asperities along it. Using large-scale numerical simulations for earthquake ruptures under varying roughness and strength conditions, we verify our hypothesis, showing that smoother faults may generate larger earthquakes than rougher faults under identical tectonic loading conditions. We further discuss the potential impact of fault roughness on earthquake recurrence probability. This finding provides important information, also for seismic hazard analysis.

  13. Nowcasting Earthquakes and Tsunamis (United States)

    Rundle, J. B.; Turcotte, D. L.


    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults and its current level of progress through the earthquake cycle. Advantages of our nowcasting method over forecasting models include: 1) nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) we use only earthquake catalog data, which generally have known errors and characteristics; and 3) we use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk.
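    The core of the method, applying the statistics of the large region to the small one, can be illustrated as an empirical-CDF score over counts of small earthquakes between successive large ones; the counts below are hypothetical:

```python
def nowcast_score(interevent_counts, current_count):
    """Earthquake potential score in [0, 1]: the fraction of historical
    large-earthquake cycles (taken from the surrounding large region)
    that completed with no more small earthquakes than have occurred
    locally since the last large event.
    """
    matches = sum(1 for c in interevent_counts if c <= current_count)
    return matches / len(interevent_counts)

# Hypothetical small-quake counts between successive large quakes in the
# large region, and the count observed locally since the last large one:
score = nowcast_score([120, 300, 80, 500, 240], current_count=260)
```

    A score near 1 means the local cycle has already outlasted most historical cycles, i.e. relatively high current hazard; this is the quantity used to rank cities.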

  14. Scientists Engage South Carolina Community in Earthquake Education and Preparedness (United States)

    Hall, C.; Beutel, E.; Jaume', S.; Levine, N.; Doyle, B.


    Scientists at the College of Charleston are working with the state of South Carolina's Emergency Management Division to increase awareness and understanding of earthquake hazards throughout South Carolina. As part of this mission, the SCEEP (South Carolina Earthquake Education and Preparedness) program was formed at the College of Charleston to promote earthquake research, outreach, and education in the state of South Carolina. Working with local, regional, state and federal offices, SCEEP has developed education programs for everyone from professional hazard management teams to formal and informal educators. SCEEP also works with the media to ensure accurate reporting of earthquake and other hazard information and to increase the public's understanding of earthquake science and earthquake seismology. As part of this program, we have developed a series of activities that can be checked out by educators for use in their classrooms and in informal education venues. These activities are designed to provide educators with the information and tools they need to teach earthquake and earth science accurately, informatively, and enjoyably. The toolkits contain seven activities meeting a variety of National Education Standards, not only in Science, but also in Geography, Math, Social Studies, Arts Education, History and Language Arts, providing a truly multidisciplinary toolkit for educators. The activities provide information on earthquake myths, seismic waves, elastic rebound, vectors, liquefaction, location of an epicenter, and finally South Carolina earthquakes. The activities are engaging and inquiry based, implementing strategies proven effective at piquing learners' interest in scientific phenomena. All materials are provided within the toolkit, so it is truly check-and-go. While the SCEEP team has provided instructions and grade-level suggestions for implementing the activities in an educational setting, the educator has free rein over what to showcase

  15. System for Capturing/Storage/Retrieval/Sharing of Toxicological Information Required for Rapid Assessment of Risks Posed By Release of CBRN Materials in the Environment

    International Nuclear Information System (INIS)

    Taylor, M. L.; Ritondo, M.; Earp Singer, L.; Rogers, J. V.; Price, J. A.; Fleming, E. J.; Chappie, D.; McGonigle, D.; Nichols, T. L.; Sonich-Mullin, C.


    The Threat and Consequence Assessment Division (TCAD) of the U.S. Environmental Protection Agency's (EPA) National Homeland Security Research Center (NHSRC) is developing methodology for performing rapid risk assessments needed for incident management, cleanup, and mitigation of hazards in the aftermath of a terrorist event. TCAD, working with the Department of Defense's Chemical and Biological Defense Information Analysis Center (CBIAC, operated by Battelle), has developed SERRA - Support for Environmental Rapid Risk Assessment. This paper describes the methodology utilized to formulate SERRA, presents current contents of the SERRA database (information derived from assessments of over 3,000 publications selected from 10,000 citations), and describes SERRA implementation. The paper also discusses how an Internet-accessible version of the SERRA database could be utilized by a country or countries to prepare for and respond to the intentional release of chemical, biological or radiological materials. (author)

  16. GEM - The Global Earthquake Model (United States)

    Smolka, A.


    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments on the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public-at-large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  17. Simulating Earthquakes for Science and Society: Earthquake Visualizations Ideal for use in Science Communication and Education (United States)

    de Groot, R.


    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  18. The USGS Earthquake Notification Service (ENS): Customizable notifications of earthquakes around the globe (United States)

    Wald, Lisa A.; Wald, David J.; Schwarz, Stan; Presgrave, Bruce; Earle, Paul S.; Martinez, Eric; Oppenheimer, David


    At the beginning of 2006, the U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) introduced a new automated Earthquake Notification Service (ENS) to take the place of the National Earthquake Information Center (NEIC) "Bigquake" system and the various other individual EHP e-mail list-servers for separate regions in the United States. These included northern California, southern California, and the central and eastern United States. ENS is a "one-stop shopping" system that allows Internet users to subscribe to flexible and customizable notifications for earthquakes anywhere in the world. The customization capability allows users to define the what (magnitude threshold), the when (day and night thresholds), and the where (specific regions) for their notifications. Customization is achieved by employing a per-user based request profile, allowing the notifications to be tailored for each individual's requirements. Such earthquake-parameter-specific custom delivery was not possible with simple e-mail list-servers. Now that event and user profiles are in a structured query language (SQL) database, additional flexibility is possible. At the time of this writing, ENS had more than 114,000 subscribers, with more than 200,000 separate user profiles. On a typical day, more than 188,000 messages get sent to a variety of widely distributed users for a wide range of earthquake locations and magnitudes. The purpose of this article is to describe how ENS works, highlight the features it offers, and summarize plans for future developments.
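    The per-user request profile described above reduces to a filter over magnitude threshold, day/night, and region. A minimal sketch with assumed field names (the production system stores these profiles in an SQL database and matches them against each incoming event):

```python
from dataclasses import dataclass

@dataclass
class Profile:
    """One per-user notification profile (field names are assumptions)."""
    min_mag_day: float
    min_mag_night: float
    region: tuple  # (lat_min, lat_max, lon_min, lon_max)

def should_notify(profile, magnitude, lat, lon, is_night):
    """True if an event matches the user's what/when/where criteria."""
    threshold = profile.min_mag_night if is_night else profile.min_mag_day
    lat_min, lat_max, lon_min, lon_max = profile.region
    in_region = lat_min <= lat <= lat_max and lon_min <= lon <= lon_max
    return in_region and magnitude >= threshold

# Example: a user watching California, with a higher nighttime threshold.
california_user = Profile(3.5, 5.0, (32.0, 42.0, -125.0, -114.0))
```

    Because each event is matched against every stored profile, the same origin can trigger very different notification sets across the subscriber base.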

  19. Quantification of social contributions to earthquake mortality (United States)

    Main, I. G.; NicBhloscaidh, M.; McCloskey, J.; Pelling, M.; Naylor, M.


    Death tolls in earthquakes, which continue to grow rapidly, are the result of complex interactions between physical effects, such as strong shaking, and the resilience of exposed populations and supporting critical infrastructures and institutions. While it is clear that the social context in which the earthquake occurs has a strong effect on the outcome, the influence of this context can only be exposed if we first decouple, as much as we can, the physical causes of mortality from our consideration. (Our modelling assumes that building resilience to shaking is a social factor governed by national wealth, legislation and enforcement and governance leading to reduced levels of corruption.) Here we attempt to remove these causes by statistically modelling published mortality, shaking intensity and population exposure data; unexplained variance from this physical model illuminates the contribution of socio-economic factors to increasing earthquake mortality. We find that this variance partitions countries in terms of basic socio-economic measures and allows the definition of a national vulnerability index identifying both anomalously resilient and anomalously vulnerable countries. In many cases resilience is well correlated with GDP; people in the richest countries are unsurprisingly safe from even the worst shaking. However some low-GDP countries rival even the richest in resilience, showing that relatively low cost interventions can have a positive impact on earthquake resilience and that social learning between these countries might facilitate resilience building in the absence of expensive engineering interventions.
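    The decoupling step, fitting a physical mortality model and reading the residual as a social signal, can be sketched with a deliberately simplified one-parameter model (deaths proportional to exposed population); the paper's statistical model is richer, and the data below are invented:

```python
import math

def vulnerability_index(events):
    """Residuals of a one-parameter physical model:
    log10(deaths) = c + log10(population exposed).

    Positive residuals mark anomalously vulnerable countries, negative
    ones anomalously resilient countries. `events` holds
    (deaths, exposed_population, country) triples.
    """
    logs = [math.log10(d) - math.log10(e) for d, e, _ in events]
    c = sum(logs) / len(logs)  # best-fit intercept
    return {name: math.log10(d) - math.log10(e) - c
            for d, e, name in events}

# Invented illustrative data: same exposure, tenfold different death tolls.
index = vulnerability_index([(100, 1_000_000, "A"), (1000, 1_000_000, "B")])
```

    In this toy example country B is an order of magnitude more deadly than the fitted model predicts, so its residual is positive.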

  20. Rapid Information and Communication Technology Assessment Team (RTAT): Enabling the Hands and Feet to Win the Hearts and Minds (United States)


    information and communication technology (ICT), information, communication, infrastructure, mobile, data collection, UN, emergency telecommunication...on the developed mobile data collection tool with automated backend server integration with the Pacific Disaster Center's (PDC's) DisasterAWARE web...infrastructure. This negatively impacts responders' ability to communicate and collaborate with one another. As a result, humanitarian assistance (HA

  1. New characteristics of intensity assessment of Sichuan Lushan "4.20" M s7.0 earthquake (United States)

    Sun, Baitao; Yan, Peilei; Chen, Xiangzhao


    The rapid and accurate post-earthquake assessment of the macroscopic influence of seismic ground motion is of significance for earthquake emergency relief, post-earthquake reconstruction and scientific research. The seismic intensity distribution map released by the Lushan earthquake field team of the China Earthquake Administration (CEA) five days after the strong earthquake (M7.0) that occurred in Lushan County, Ya'an City, Sichuan at 8:02 on April 20, 2013 provides a scientific basis for emergency relief, economic loss assessment and post-earthquake reconstruction. In this paper, the means for blind estimation of macroscopic intensity, field estimation of macroscopic intensity, and review of intensity, as well as corresponding problems, are discussed in detail, and the intensity distribution characteristics of the Lushan "4.20" M7.0 earthquake and its influencing factors are analyzed, providing a reference for future seismic intensity assessments.

  2. Rupture, waves and earthquakes. (United States)

    Uenishi, Koji


    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much former research in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain with this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would therefore be indispensable.

  3. a Collaborative Cyberinfrastructure for Earthquake Seismology (United States)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Lefebvre, S.; Steed, R.


    One of the challenges in real-time seismology is the prediction of an earthquake's impact. This is particularly true for moderate earthquakes (around magnitude 6) located close to urbanised areas, where the slightest uncertainty in event location, depth, or magnitude estimates, and/or misevaluation of propagation characteristics, site effects and building vulnerability, can dramatically change the impact scenario. The Euro-Med Seismological Centre (EMSC) has developed a cyberinfrastructure to collect observations from eyewitnesses in order to provide in-situ constraints on actual damage. This cyberinfrastructure takes advantage of the natural convergence of earthquake eyewitnesses on the EMSC website, the second global earthquake information website, within tens of seconds of the occurrence of a felt event. It includes classical crowdsourcing tools such as online questionnaires available in 39 languages, and tools to collect geolocated pictures. It also comprises information derived from the real-time analysis of the traffic on the EMSC website, a method named flashsourcing. In the case of a felt earthquake, eyewitnesses reach the EMSC website within tens of seconds to find out the cause of the shaking they have just been through. By analysing their geographical origin through their IP addresses, we automatically detect felt earthquakes and in some cases map the damaged areas through the loss of Internet visitors. We recently implemented a Quake Catcher Network (QCN) server in collaboration with Stanford University and the USGS to collect ground motion records from volunteers, and are also involved in a project to detect earthquakes from ground motion sensors in smartphones. Strategies have been developed for several social media (Facebook, Twitter...) not only to distribute earthquake information, but also to engage with citizens and optimise data collection. A smartphone application is currently under development. We will present an overview of this

  4. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet. (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the (1) study of earthquakes; (2) origin, propagation, and energy of seismic phenomena; (3) prediction of these phenomena; and (4) investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  5. Towards real-time regional earthquake simulation I: real-time moment tensor monitoring (RMT) for regional events in Taiwan (United States)

    Lee, Shiann-Jong; Liang, Wen-Tzong; Cheng, Hui-Wen; Tu, Feng-Shan; Ma, Kuo-Fong; Tsuruoka, Hiroshi; Kawakatsu, Hitoshi; Huang, Bor-Shouh; Liu, Chun-Chi


    The long-term goal of this system is to provide real-time source information for rapid seismic hazard assessment during large earthquakes.

  6. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel


    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  7. Earthquakes; May-June 1982 (United States)

    Person, W.J.


    There were four major earthquakes (7.0-7.9) during this reporting period: two struck in Mexico, one in El Salvador, and one in the Kuril Islands. Mexico, El Salvador, and China experienced fatalities from earthquakes.

  8. Seismic methodology in determining basis earthquake for nuclear installation

    International Nuclear Information System (INIS)

    Ameli Zamani, Sh.


    Design basis earthquake ground motions for nuclear installations should be determined to assure the design purpose of reactor safety: that reactors should be built and operated to pose no undue risk to public health and safety from earthquakes and other hazards. Regarding the influence of seismic hazard at a site, large numbers of earthquake ground motions can be predicted considering possible variability among the source, path, and site parameters. However, seismic safety design using all predicted ground motions is practically impossible. In the determination of design basis earthquake ground motions it is therefore important to represent the influences of the large numbers of earthquake ground motions derived from the seismic ground motion prediction methods for the surrounding seismic sources. Viewing the relations between current design basis earthquake ground motion determination and modern earthquake ground motion estimation, a development of risk-informed design basis earthquake ground motion methodology is discussed for insight into the ongoing modernization of the Examination Guide for Seismic Design on NPP.

  9. Thermal IR satellite data application for earthquake research in Pakistan (United States)

    Barkat, Adnan; Ali, Aamir; Rehman, Khaista; Awais, Muhammad; Riaz, Muhammad Shahid; Iqbal, Talat


    The scientific progress in space research indicates earthquake-related processes of surface temperature growth, gas/aerosol exhalation and electromagnetic disturbances in the ionosphere prior to seismic activity. Among them, surface temperature growth calculated using satellite thermal infrared images carries valuable earthquake precursory information for near/distant earthquakes. Previous studies have concluded that such information can appear a few days before the occurrence of an earthquake. The objective of this study is to use MODIS thermal imagery data for precursory analysis of the Kashmir (Oct 8, 2005; Mw 7.6; 26 km), Ziarat (Oct 28, 2008; Mw 6.4; 13 km) and Dalbandin (Jan 18, 2011; Mw 7.2; 69 km) earthquakes. Our results suggest that there exists an evident correlation of Land Surface Temperature (LST) anomalies with seismic activity. In particular, a rise of 3-10 °C in LST is observed 6, 4 and 14 days prior to the Kashmir, Ziarat and Dalbandin earthquakes, respectively. In order to further elaborate our findings, we have presented a comparative and percentile analysis of daily and five-year averaged LST for a selected time window with respect to the month of earthquake occurrence. Our comparative analyses of daily and five-year averaged LST show a significant change of 6.5-7.9 °C for the Kashmir, 8.0-8.1 °C for the Ziarat and 2.7-5.4 °C for the Dalbandin earthquakes. This significant change has high percentile values for the selected events, i.e. 70-100% for the Kashmir, 87-100% for the Ziarat and 84-100% for the Dalbandin earthquakes. We expect that such consistent results may help in devising an optimal earthquake forecasting strategy and in mitigating the effect of associated seismic hazards.
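    The anomaly comparison described above, daily LST against a multi-year mean for the same calendar day, can be sketched as follows; the 3 °C flag threshold matches the lower bound of the anomaly range the abstract reports, and the values are invented:

```python
def lst_anomaly(daily_lst, climatology, threshold=3.0):
    """Per-day LST anomaly (daily value minus the multi-year mean for the
    same calendar day) and the indices of days at or above the threshold."""
    anomalies = [d - c for d, c in zip(daily_lst, climatology)]
    flagged = [i for i, a in enumerate(anomalies) if a >= threshold]
    return anomalies, flagged

# Hypothetical temperatures (degrees C) for three days before an event,
# paired with the five-year mean for the same calendar days:
anomalies, flagged = lst_anomaly([20.0, 25.0, 30.0], [19.0, 20.0, 21.0])
```

    A percentile analysis would then rank each daily anomaly against the distribution of anomalies in the reference years, as the abstract's 70-100% figures do.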

  10. Sensing the earthquake (United States)

    Bichisao, Marta; Stallone, Angela


    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of scientific content by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what principles underlie that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" earthquakes. We try to implement these results in a choreographic model with the aim of converting earthquake sound into a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience in a multisensory exploration of the earthquake phenomenon, through the stimulation of hearing, eyesight and the perception of movement (the neuromotor system). In essence, the main goal of this work is to develop a method for the simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  11. Geomorphic legacy of medieval Himalayan earthquakes in the Pokhara Valley (United States)

    Schwanghart, Wolfgang; Bernhardt, Anne; Stolle, Amelie; Hoelzmann, Philipp; Adhikari, Basanta R.; Andermann, Christoff; Tofelde, Stefanie; Merchel, Silke; Rugel, Georg; Fort, Monique; Korup, Oliver


    The Himalayas and their foreland belong to the world's most earthquake-prone regions. With millions of people at risk from severe ground shaking and associated damages, reliable data on the spatial and temporal occurrence of past major earthquakes is urgently needed to inform seismic risk analysis. Beyond the instrumental record such information has been largely based on historical accounts and trench studies. Written records provide evidence for damages and fatalities, yet are difficult to interpret when derived from the far-field. Trench studies, in turn, offer information on rupture histories, lengths and displacements along faults but involve high chronological uncertainties and fail to record earthquakes that do not rupture the surface. Thus, additional and independent information is required for developing reliable earthquake histories. Here, we present exceptionally well-dated evidence of catastrophic valley infill in the Pokhara Valley, Nepal. Bayesian calibration of radiocarbon dates from peat beds, plant macrofossils, and humic silts in fine-grained tributary sediments yields a robust age distribution that matches the timing of nearby M>8 earthquakes in ~1100, 1255, and 1344 AD. The upstream dip of tributary valley fills and X-ray fluorescence spectrometry of their provenance rule out local sediment sources. Instead, geomorphic and sedimentary evidence is consistent with catastrophic fluvial aggradation and debris flows that had plugged several tributaries with tens of meters of calcareous sediment from the Annapurna Massif >60 km away. The landscape-changing consequences of past large Himalayan earthquakes have so far been elusive. Catastrophic aggradation in the wake of two historically documented medieval earthquakes and one inferred from trench studies underscores that Himalayan valley fills should be considered as potential archives of past earthquakes. Such valley fills are pervasive in the Lesser Himalaya though high erosion rates reduce

  12. Turkish Children's Ideas about Earthquakes (United States)

    Simsek, Canan Lacin


    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  13. Earthquakes, May-June 1991 (United States)

    Person, W.J.


    One major earthquake occurred during this reporting period: a magnitude 7.1 in Indonesia (Minahassa Peninsula) on June 20. Earthquake-related deaths were reported in the Western Caucasus (Georgia, USSR) on May 3 and June 15. One earthquake-related death was also reported in El Salvador on June 21.

  14. Organizational changes at Earthquakes & Volcanoes (United States)

    Gordon, David W.


    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  15. The 1976 Tangshan earthquake (United States)

    Fang, Wang


    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m., Beijing (Peking) local time, and had a magnitude of 7.8, a focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt.

  16. [Earthquakes in El Salvador]. (United States)

    de Ville de Goyet, C


    The Pan American Health Organization (PAHO) has 25 years of experience dealing with major natural disasters. This piece provides a preliminary review of the events taking place in the weeks following the major earthquakes in El Salvador on 13 January and 13 February 2001. It also describes the lessons that have been learned over the last 25 years and the impact that the El Salvador earthquakes and other disasters have had on the health of the affected populations. Topics covered include mass-casualties management, communicable diseases, water supply, managing donations and international assistance, damages to the health-facilities infrastructure, mental health, and PAHO's role in disasters.

  17. Can a rapid measure of self-exposure to drugs of abuse provide dimensional information on depression comorbidity? (United States)

    Butelman, Eduardo Roque; Bacciardi, Silvia; Maremmani, Angelo Giovanni Icro; Darst-Campbell, Maya; Correa da Rosa, Joel; Kreek, Mary Jeanne


    Addictions to heroin or to cocaine are associated with substantial psychiatric comorbidity, including depression. Poly-drug self-exposure (eg, to heroin, cocaine, cannabis, or alcohol) is also common, and may further affect depression comorbidity. This case-control study examined the relationship of exposure to the above drugs and depression comorbidity. Participants were recruited from methadone maintenance clinics, and from the community. Adult male and female participants (n = 1,201) were ascertained consecutively by experienced licensed clinicians. The instruments used were the SCID-I, and Kreek-McHugh-Schluger-Kellogg (KMSK) scales, which provide a rapid dimensional measure of maximal lifetime self-exposure to each of the above drugs. This measure ranges from no exposure to high unit dose, high frequency, and long duration of exposure. A multiple logistic regression with stepwise variable selection revealed that increasing exposure to heroin or to cocaine was associated with greater odds of depression, with all cases and controls combined. In cases with an opioid dependence diagnosis, increasing cocaine exposure was associated with a further increase in odds of depression. However, in cases with a cocaine dependence diagnosis, increasing exposure to either cannabis or alcohol, as well as heroin, was associated with a further increase in odds of depression. This dimensional analysis of exposure to specific drugs provides insights into depression comorbidity with addictive diseases, and the impact of poly-drug exposure. A rapid analysis of exposure to drugs of abuse reveals how specific patterns of drug and poly-drug exposure are associated with increasing odds of depression. This approach detected quantitatively how different patterns of poly-drug exposure can result in increased odds of depression comorbidity, in cases diagnosed with opioid versus cocaine dependence. (Am J Addict 2017;26:632-639). © 2017 American Academy of Addiction Psychiatry.
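    As a hedged illustration of the "odds of depression" language above, the sketch below computes an odds ratio from a 2x2 exposure-by-diagnosis table. The counts are invented; the study itself fitted a stepwise multiple logistic regression to SCID-I and KMSK data rather than a single 2x2 table.

    ```python
    # Sketch: odds ratio of depression by exposure level from a 2x2 table.
    # All counts are hypothetical, for illustration only.

    def odds(cases, controls):
        """Odds of being a depression case rather than a control."""
        return cases / controls

    def odds_ratio(exposed_cases, exposed_controls,
                   unexposed_cases, unexposed_controls):
        """OR > 1 means exposure is associated with greater odds of depression."""
        return (odds(exposed_cases, exposed_controls)
                / odds(unexposed_cases, unexposed_controls))

    # Hypothetical counts: depression (cases) vs no depression (controls),
    # split by high vs low lifetime exposure to one drug.
    or_high = odds_ratio(60, 40, 30, 70)
    print(f"odds ratio = {or_high:.2f}")  # 1.5 / (3/7) = 3.50
    ```

    A logistic regression generalizes this idea: each coefficient is the log of such an odds ratio per unit increase in the (dimensional) exposure score.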

  18. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management


    Ibrion, Mihaela


    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing an earthquake culture. Iran was taken as a research case study, and fifteen large earthquake disasters in Iran over a period of more than a century were investigated and analyzed. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  19. Building the Southern California Earthquake Center (United States)

    Jordan, T. H.; Henyey, T.; McRaney, J. K.


    Kei Aki was the founding director of the Southern California Earthquake Center (SCEC), a multi-institutional collaboration formed in 1991 as a Science and Technology Center (STC) under the National Science Foundation (NSF) and the U. S. Geological Survey (USGS). Aki and his colleagues articulated a system-level vision for the Center: investigations by disciplinary working groups would be woven together into a "Master Model" for Southern California. In this presentation, we will outline how the Master-Model concept has evolved and how SCEC's structure has adapted to meet scientific challenges of system-level earthquake science. In its first decade, SCEC conducted two regional imaging experiments (LARSE I & II); published the "Phase-N" reports on (1) the Landers earthquake, (2) a new earthquake rupture forecast for Southern California, and (3) new models for seismic attenuation and site effects; it developed two prototype "Community Models" (the Crustal Motion Map and Community Velocity Model) and, perhaps most important, sustained a long-term, multi-institutional, interdisciplinary collaboration. The latter fostered pioneering numerical simulations of earthquake ruptures, fault interactions, and wave propagation. These accomplishments provided the impetus for a successful proposal in 2000 to reestablish SCEC as a "stand alone" center under NSF/USGS auspices. SCEC remains consistent with the founders' vision: it continues to advance seismic hazard analysis through a system-level synthesis that is based on community models and an ever expanding array of information technology. SCEC now represents a fully articulated "collaboratory" for earthquake science, and many of its features are extensible to other active-fault systems and other system-level collaborations. We will discuss the implications of the SCEC experience for EarthScope, the USGS's program in seismic hazard analysis, NSF's nascent Cyberinfrastructure Initiative, and other large collaboratory programs.

  20. What is the earthquake fracture energy? (United States)

    Di Toro, G.; Nielsen, S. B.; Passelegue, F. X.; Spagnuolo, E.; Bistacchi, A.; Fondriest, M.; Murphy, S.; Aretusini, S.; Demurtas, M.


    The energy budget of an earthquake is one of the main open questions in earthquake physics. During seismic rupture propagation, the elastic strain energy stored in the rock volume that bounds the fault is converted into (1) gravitational work (relative movement of the wall rocks bounding the fault), (2) in- and off-fault damage of the fault zone rocks (due to rupture propagation and frictional sliding), (3) frictional heating and, of course, (4) seismic radiated energy. The difficulty in determining the budget arises from the measurement of some parameters (e.g., the temperature increase in the slipping zone, which constrains the frictional heat), from the poorly constrained size of the energy sinks (e.g., how large is the rock volume involved in off-fault damage?) and from the continuous exchange of energy between different sinks (for instance, fragmentation and grain size reduction may result from both the passage of the rupture front and frictional heating). Field geology studies, microstructural investigations, experiments and modelling may yield some hints. Here we discuss (1) the discrepancies arising from the comparison of the fracture energy measured in experiments reproducing seismic slip with the one estimated from seismic inversion for natural earthquakes and (2) the off-fault damage induced by the diffusion of frictional heat during simulated seismic slip in the laboratory. Our analysis suggests, for instance, that the so-called earthquake fracture energy (1) is mainly frictional heat for small slips and (2), with increasing slip, is controlled by the geometrical complexity and other plastic processes occurring in the damage zone. As a consequence, because faults are rapidly and efficiently lubricated upon fast slip initiation, the dominant dissipation mechanism in large earthquakes may not be friction but off-fault damage due to fault segmentation and stress concentrations in a growing region around the fracture tip.

  1. The mechanism of earthquake (United States)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing


    The physical mechanism of earthquake remains a challenging issue to be clarified. Seismologists used to attribute shallow earthquakes to the elastic rebound of crustal rocks. The seismic energy calculated following the elastic rebound theory and with data from experiments on rocks, however, shows a large discrepancy with measurement, a fact that has been dubbed "the heat flow paradox". For intermediate-focus and deep-focus earthquakes, both occurring in the region of the mantle, there is no reasonable explanation either. This paper will discuss the physical mechanism of earthquake from a new perspective, starting from the fact that both the crust and the mantle are discrete collective systems of matter with slow dynamics, as well as from the basic principles of physics, especially some new concepts of condensed matter physics that have emerged in recent years. (1) Stress distribution in the earth's crust: without taking the tectonic force into account, according to the rheological principle that "everything flows", the normal stress and transverse stress must be balanced due to the effect of gravitational pressure over a long period of time, so no differential stress in the original crustal rocks is to be expected. The tectonic force is successively transferred and accumulated via stick-slip motions of rock blocks to squeeze the fault gouge and is then exerted upon other rock blocks. The superposition of such additional lateral tectonic force and the original stress gives rise to the real-time stress in crustal rocks. The mechanical characteristics of fault gouge differ from those of rocks, as it consists of granular matter. The elastic moduli of fault gouges are much smaller than those of rocks, and they become larger with increasing pressure. This peculiarity of the fault gouge leads to a tectonic force that increases with depth in a nonlinear fashion. The distribution and variation of the tectonic stress in the crust are specified. (2) The

  2. The 2012 Mw5.6 earthquake in Sofia seismogenic zone - is it a slow earthquake (United States)

    Raykova, Plamena; Solakov, Dimcho; Slavcheva, Krasimira; Simeonova, Stela; Aleksandrova, Irena


    very low rupture velocity. The low rupture velocity can mean slow faulting, which leads to a slow release of the accumulated seismic energy. Such a slow energy release principally causes little to moderate damage. Additionally, the waveform of the earthquake shows low-frequency content in the P-waves (the P-wave maximum is at 1.19 Hz), and the P-wave displacement spectrum is characterized by a poorly expressed spectral plateau and corner frequency. These and other signs lead us to the conclusion that the 2012 Mw5.6 earthquake can be considered a type of slow earthquake, like a low-frequency quake. The study is based on data from the Bulgarian seismological network (NOTSSI), the local network (LSN) deployed around the Kozloduy NPP, and the System of Accelerographs for Seismic Monitoring of Equipment and Structures (SASMES) installed in the Kozloduy NPP. NOTSSI jointly with LSN and SASMES provide reliable information for multiple studies on seismicity at regional scale.

  3. Human casualties in earthquakes: Modelling and mitigation (United States)

    Spence, R.J.S.; So, E.K.M.


    Earthquake risk modelling is needed for the planning of post-event emergency operations, for the development of insurance schemes, for the planning of mitigation measures in the existing building stock, and for the development of appropriate building regulations; in all of these applications estimates of casualty numbers are essential. But there are many questions about casualty estimation which are still poorly understood. These questions relate to the causes and nature of the injuries and deaths, and the extent to which they can be quantified. This paper looks at the evidence on these questions from recent studies. It then reviews casualty estimation models available, and finally compares the performance of some casualty models in making rapid post-event casualty estimates in recent earthquakes.

  4. Improving PAGER's real-time earthquake casualty and loss estimation toolkit: a challenge (United States)

    Jaiswal, K.S.; Wald, D.J.


    We describe the on-going developments of PAGER’s loss estimation models, and discuss value-added web content that can be generated related to exposure, damage and loss outputs for a variety of PAGER users. These developments include identifying vulnerable building types in any given area, estimating earthquake-induced damage and loss statistics by building type, and developing visualization aids that help locate areas of concern for improving post-earthquake response efforts. While detailed exposure and damage information is highly useful and desirable, significant improvements are still necessary in order to improve underlying building stock and vulnerability data at a global scale. Existing efforts with the GEM’s GED4GEM and GVC consortia will help achieve some of these objectives. This will benefit PAGER especially in regions where PAGER’s empirical model is less-well constrained; there, the semi-empirical and analytical models will provide robust estimates of damage and losses. Finally, we outline some of the challenges associated with rapid casualty and loss estimation that we experienced while responding to recent large earthquakes worldwide.

  5. Rapid survey protocol that provides dynamic information on reef condition to managers of the Great Barrier Reef. (United States)

    Beeden, R J; Turner, M A; Dryden, J; Merida, F; Goudkamp, K; Malone, C; Marshall, P A; Birtles, A; Maynard, J A


    Managing to support coral reef resilience as the climate changes requires strategic and responsive actions that reduce anthropogenic stress. Managers can only target and tailor these actions if they regularly receive information on system condition and impact severity. In large coral reef areas like the Great Barrier Reef Marine Park (GBRMP), acquiring condition and impact data with good spatial and temporal coverage requires using a large network of observers. Here, we describe the result of ~10 years of evolving and refining participatory monitoring programs used in the GBR that have rangers, tourism operators and members of the public as observers. Participants complete Reef Health and Impact Surveys (RHIS) using a protocol that meets coral reef managers' needs for up-to-date information on the following: benthic community composition, reef condition and impacts including coral diseases, damage, predation and the presence of rubbish. Training programs ensure that the information gathered is sufficiently precise to inform management decisions. Participants regularly report because the demands of the survey methodology have been matched to their time availability. Undertaking the RHIS protocol we describe involves three ~20 min surveys at each site. Participants enter data into an online data management system that can create reports for managers and participants within minutes of data being submitted. Since 2009, 211 participants have completed a total of more than 10,415 surveys at more than 625 different reefs. The two-way exchange of information between managers and participants increases the capacity to manage reefs adaptively, meets education and outreach objectives and can increase stewardship. The general approach used and the survey methodology are both sufficiently adaptable to be used in all reef regions.

  6. Rapid microsatellite marker development using next generation pyrosequencing to inform invasive Burmese python -- Python molurus bivittatus -- management (United States)

    Hunter, Margaret E.; Hart, Kristen M.


    Invasive species represent an increasing threat to native ecosystems, harming indigenous taxa through predation, habitat modification, cross-species hybridization and alteration of ecosystem processes. Additionally, high economic costs are associated with environmental damage, restoration and control measures. The Burmese python, Python molurus bivittatus, is one of the most notable invasive species in the US, due to the threat it poses to imperiled species and the Greater Everglades ecosystem. To address population structure and relatedness, next generation sequencing was used to rapidly produce species-specific microsatellite loci. The Roche 454 GS-FLX Titanium platform provided 6616 di-, tri- and tetra-nucleotide repeats in 117,516 sequences. Using stringent criteria, 24 of 26 selected tri- and tetra-nucleotide loci were polymerase chain reaction (PCR) amplified and 18 were polymorphic. An additional six cross-species loci were amplified, and the resulting 24 loci were incorporated into eight PCR multiplexes. Multi-locus genotypes yielded an average of 61% (39%–77%) heterozygosity and 3.7 (2–6) alleles per locus. Population-level studies using the developed microsatellites will track the invasion front and monitor population-suppression dynamics. Additionally, cross-species amplification was detected in the invasive Ball, P. regius, and Northern African python, P. sebae. These markers can be used to address the hybridization potential of Burmese pythons and the larger, more aggressive P. sebae.
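    The per-locus heterozygosity reported above can be illustrated with the standard expected-heterozygosity formula, He = 1 - Σ p_i². The allele frequencies below are hypothetical, not values from the python dataset.

    ```python
    # Sketch: expected heterozygosity for one microsatellite locus from its
    # allele frequencies. Frequencies are invented for illustration.

    def expected_heterozygosity(allele_freqs):
        """He = 1 - sum(p_i^2); the chance two random alleles differ."""
        assert abs(sum(allele_freqs) - 1.0) < 1e-9, "frequencies must sum to 1"
        return 1.0 - sum(p * p for p in allele_freqs)

    # A locus with four alleles (cf. the reported mean of 3.7 alleles/locus).
    he = expected_heterozygosity([0.4, 0.3, 0.2, 0.1])
    print(f"He = {he:.2f}")  # 0.70, within the reported 39%-77% range
    ```

    More, and more evenly distributed, alleles push He upward, which is why loci with higher allele counts tend to be the most informative for population structure.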

  7. Earthquake simulations with time-dependent nucleation and long-range interactions

    Directory of Open Access Journals (Sweden)

    J. H. Dieterich


    A model for rapid simulation of earthquake sequences is introduced which incorporates long-range elastic interactions among fault elements and time-dependent earthquake nucleation inferred from experimentally derived rate- and state-dependent fault constitutive properties. The model consists of a planar two-dimensional fault surface which is periodic in both the x- and y-directions. Elastic interactions among fault elements are represented by an array of elastic dislocations. Approximate solutions for earthquake nucleation and dynamics of earthquake slip are introduced which permit computations to proceed in steps that are determined by the transitions from one sliding state to the next. The transition-driven time stepping and avoidance of systems of simultaneous equations permit rapid simulation of large sequences of earthquake events on computers of modest capacity, while preserving characteristics of the nucleation and rupture propagation processes evident in more detailed models. Earthquakes simulated with this model reproduce many of the observed spatial and temporal characteristics of clustering phenomena including foreshock and aftershock sequences. Clustering arises because the time dependence of the nucleation process is highly sensitive to stress perturbations caused by nearby earthquakes. Rate of earthquake activity following a prior earthquake decays according to Omori's aftershock decay law and falls off with distance.
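    The Omori decay that the simulated sequences reproduce can be sketched with the modified Omori law, n(t) = K / (c + t)^p. The parameter values below are illustrative, not taken from the model runs described above.

    ```python
    # Sketch: modified Omori law for aftershock rate after a mainshock.
    # K, c, p are illustrative; p is near 1 for many observed sequences.

    def omori_rate(t, K=100.0, c=0.1, p=1.0):
        """Aftershock rate (events/day) t days after the mainshock."""
        return K / (c + t) ** p

    # With p = 1 the rate falls off roughly as 1/t:
    # ~90.9 events/day at t=1, ~9.9 at t=10, ~1.0 at t=100.
    rates = [omori_rate(t) for t in (1, 10, 100)]
    print(rates)
    ```

    In the simulation model itself this decay is not imposed; it emerges because the rate- and state-dependent nucleation times are highly sensitive to the stress perturbations from prior events.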

  8. The EM Earthquake Precursor (United States)

    Jones, K. B., II; Saxton, P. T.


    Many attempts have been made to determine a sound forecasting method for earthquakes and, in turn, to warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta Earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate with an identifiable precursor. Secondly, there is neither a networked array for finding any epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances have gained. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and the laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February 2013 detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  9. Earthquake Source Spectral Study beyond the Omega-Square Model (United States)

    Uchide, T.; Imanishi, K.


    Earthquake source spectra have been used to characterize earthquake source processes quantitatively and, at the same time, simply, so that we can analyze the source spectra of many earthquakes, especially small earthquakes, at once and compare them with each other. A standard model for the source spectra is the omega-square model, which has a flat spectrum at low frequencies and a falloff inversely proportional to the square of frequency at high frequencies, the two regimes bordered by a corner frequency. The corner frequency has often been converted to the stress drop under the assumption of circular crack models. However, recent studies claimed the existence of another corner frequency [Denolle and Shearer, 2016; Uchide and Imanishi, 2016] thanks to the recent development of seismic networks. We have found that many earthquakes in areas other than the area studied by Uchide and Imanishi [2016] also have source spectra deviating from the omega-square model. Another part of the earthquake spectra we now focus on is the falloff rate at high frequencies, which affects the seismic energy estimation [e.g., Hirano and Yagi, 2017]. In June 2016, we deployed seven velocity seismometers in the northern Ibaraki prefecture, where shallow crustal seismicity, mainly with normal-faulting events, was activated by the 2011 Tohoku-oki earthquake. We have recorded seismograms at 1000 samples per second and at a short distance from the source, so that we can investigate the high-frequency components of the earthquake source spectra. Although we are still in the stage of discovering and confirming the deviation from the standard omega-square model, the update of the earthquake source spectrum model will help us systematically extract more information on the earthquake source process.
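    The omega-square baseline the abstract refers to can be written down directly as Ω(f) = Ω₀ / (1 + (f/fc)²). This is a minimal sketch with illustrative values for the plateau level and corner frequency, not parameters from the study.

    ```python
    # Sketch: omega-square displacement source spectrum -- flat below the
    # corner frequency fc, decaying as f^-2 well above it. Values invented.

    def omega_square(f, omega0=1.0, fc=5.0):
        """Displacement spectral amplitude at frequency f (Hz)."""
        return omega0 / (1.0 + (f / fc) ** 2)

    # Below fc the spectrum is nearly flat; well above fc, doubling the
    # frequency divides the amplitude by ~4 (the f^-2 falloff).
    low = omega_square(0.5)                     # close to the plateau omega0
    ratio = omega_square(50.0) / omega_square(100.0)
    print(low, ratio)
    ```

    Deviations from this shape, such as a second corner frequency or a high-frequency falloff steeper or shallower than f^-2, are exactly what the deployment described above is designed to resolve.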

  10. Deterioration of the informal tank institution in Tamil Nadu: caste-based rural society and rapid economic development in India


    JEGADEESAN, Muniandi; FUJITA, Koichi


    The informal tank institution seems to have functioned relatively well in Tamil Nadu, India, at least until the early 1970s. The institution had been supported by three layers of irrigation functionaries at village level. Especially important was the role of the lower irrigation functionaries such as the water-turner (Neerkatti), who had been conducting important tasks such as sluice operation, field water management and others. Based on the authors’ recent field survey in seven tank-benefitt...

  11. A rapid assessment of drinking water quality in informal settlements after a cholera outbreak in Nairobi, Kenya. (United States)

    Blanton, Elizabeth; Wilhelm, Natalie; O'Reilly, Ciara; Muhonja, Everline; Karoki, Solomon; Ope, Maurice; Langat, Daniel; Omolo, Jared; Wamola, Newton; Oundo, Joseph; Hoekstra, Robert; Ayers, Tracy; De Cock, Kevin; Breiman, Robert; Mintz, Eric; Lantagne, Daniele


    Populations living in informal settlements with inadequate water and sanitation infrastructure are at risk of epidemic disease. In 2010, we conducted 398 household surveys in two informal settlements in Nairobi, Kenya with isolated cholera cases. We tested source and household water for free chlorine residual (FCR) and Escherichia coli in approximately 200 households. International guidelines are ≥0.5 mg/L FCR at source and ≥0.2 mg/L at household. In the two settlements, 82% and 38% of water sources met FCR guidelines, and 7% and 8% were contaminated with E. coli, respectively. In household stored water, 82% and 35% met FCR guidelines and 11% and 32% were contaminated with E. coli, respectively. Source water FCR ≥0.5 mg/L (p=0.003) and reported purchase of a household water treatment product (p=0.002) were associated with an increased likelihood that household stored water had ≥0.2 mg/L FCR, which in turn was associated with a lower likelihood of E. coli contamination. These results counter the assumption that water quality in informal settlements is universally poor, shed light on the route of disease transmission, and highlight that providing centralized water with ≥0.5 mg/L FCR or (if not feasible) household water treatment technologies reduces the risk of waterborne cholera transmission in informal settlements.
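    The FCR guideline checks described above can be sketched as a simple classifier. The thresholds follow the guidelines cited in the abstract (≥0.5 mg/L at source, ≥0.2 mg/L in household stored water); the sample values are hypothetical.

    ```python
    # Sketch: classify water samples against free chlorine residual (FCR)
    # guidelines. Sample values are invented for illustration.

    SOURCE_FCR_MIN = 0.5      # mg/L, guideline for source water
    HOUSEHOLD_FCR_MIN = 0.2   # mg/L, guideline for household stored water

    def meets_guideline(fcr_mg_per_l, sample_type):
        """True if the sample's FCR meets the guideline for its type."""
        threshold = SOURCE_FCR_MIN if sample_type == "source" else HOUSEHOLD_FCR_MIN
        return fcr_mg_per_l >= threshold

    samples = [(0.6, "source"), (0.3, "source"),
               (0.25, "household"), (0.0, "household")]
    passing = sum(meets_guideline(f, t) for f, t in samples)
    print(f"{passing}/{len(samples)} samples meet guidelines")  # 2/4
    ```

    Aggregating such per-sample results by settlement is what yields compliance percentages like the 82% and 38% reported above.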

  12. Earthquakes and faults in southern California (1970-2010) (United States)

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.


    The map depicts both active and inactive faults and earthquakes of magnitude 1.5 to 7.3 in southern California (1970–2010). The bathymetry was generated from digital files from the California Department of Fish and Game, Marine Region, Coastal Bathymetry Project. Elevation data are from the U.S. Geological Survey National Elevation Database. The Landsat satellite image is composed of fourteen Landsat 5 Thematic Mapper scenes collected between 2009 and 2010. Fault data are reproduced with permission from 2006 California Geological Survey and U.S. Geological Survey data. The earthquake data are from the U.S. Geological Survey National Earthquake Information Center.

  13. Earthquake Safety Tips in the Classroom (United States)

    Melo, M. O.; Maciel, B. A. P. C.; Neto, R. P.; Hartmann, R. P.; Marques, G.; Gonçalves, M.; Rocha, F. L.; Silveira, G. M.


    The catastrophes induced by earthquakes are among the most devastating, causing great human losses and economic damage. But we have to keep in mind that earthquakes don't kill people; buildings do. Earthquakes can't be predicted, and the only way of dealing with their effects is to teach society how to prepare for them and how to cope with their consequences. In spite of being exposed to moderate and large earthquakes, most Portuguese are little aware of seismic risk, mainly because of the long recurrence intervals between strong events. Acquiring safe and correct attitudes before, during and after an earthquake is relevant for human security. Children play a determinant role in establishing a real and long-lasting "culture of prevention", both through action and new attitudes. On the other hand, when children adopt correct behaviors, their relatives often change their own incorrect behaviors to mimic those of their kids. In the framework of a Parents-in-Science initiative, we started with bi-monthly sessions for children aged 5-6 and 9-10 years old. These sessions, in which parents, teachers and high-school students participate, became part of the school's permanent activities. We start with a short introduction to the Earth and to earthquakes through storytelling, using simple science activities to trigger the children's curiosity. For safety purposes, we focus on how crucial it is for children to know basic information about themselves and to define, with their families, an emergency communications plan in case family members are separated. Using a shaking table, we teach them how to protect themselves during an earthquake. We then finish with the preparation of an individual emergency kit. This presentation will highlight the importance of encouraging preventive actions in order to reduce the impact of earthquakes on society. This project is developed by science high-school students and teachers, in

  14. Fighting and preventing post-earthquake fires in nuclear power plant

    International Nuclear Information System (INIS)

    Lu Xuefeng; Zhang Xin


    Post-earthquake fires at nuclear power plants can cause not only personnel injury and severe economic loss, but also serious environmental pollution. At present, nuclear power is developing rapidly in China. Considering the earthquake-prone character of our country, it is of great engineering importance to investigate nuclear power plant post-earthquake fires. This article analyzes the causes, influential factors and development characteristics of nuclear power plant post-earthquake fires in detail, and summarizes three principles that should be followed in fighting and preventing them: solving problems in order of importance and urgency, isolation prior to prevention, and immediate repair with regular patrols. Three aspects that deserve particular attention in fighting and preventing post-earthquake fires are also pointed out. (authors)

  15. Simulated earthquake ground motions

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.; Gasparini, D.A.


    The paper reviews current methods for generating synthetic earthquake ground motions. Emphasis is on the special requirements demanded of procedures that generate motions for use in nuclear power plant seismic response analysis. Specifically, very close agreement is usually sought between the response spectra of the simulated motions and prescribed, smooth design response spectra. The features and capabilities of the computer program SIMQKE, which has been widely used in power plant seismic work, are described. Problems and pitfalls associated with the use of synthetic ground motions in seismic safety assessment are also pointed out. The paucity and limitations of recorded accelerograms, together with the widespread use of time-history dynamic analysis for obtaining structural and secondary systems' response, have motivated the development of earthquake simulation capabilities. A common model for synthesizing earthquakes is the superposition of sinusoidal components with random phase angles. The input parameters for such a model are the amplitudes and phase angles of the contributing sinusoids, as well as the characteristics of the variation of motion intensity with time, especially the duration of the motion. The amplitudes are determined from estimates of the Fourier spectrum or the spectral density function of the ground motion, and may be assumed to vary in time or to remain constant for the duration of the earthquake. In the nuclear industry, the common procedure is to specify a set of smooth response spectra for use in aseismic design. This practice, and the need for time histories, have generated much practical interest in synthesizing earthquakes whose response spectra 'match', or are compatible with, a set of specified smooth response spectra.
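
    The superposition model described in this abstract can be sketched in a few lines. This is a minimal illustration, not SIMQKE itself: the flat target PSD, the 1-10 Hz band, and the trapezoidal intensity envelope below are all assumptions chosen for the example.

    ```python
    import math
    import random

    def envelope(t, duration, rise=0.1, decay=0.3):
        """Assumed trapezoidal intensity envelope: linear ramp up, flat
        strong-motion phase, linear taper to zero."""
        t1, t2 = rise * duration, (1.0 - decay) * duration
        if t < t1:
            return t / t1
        if t <= t2:
            return 1.0
        return max(0.0, (duration - t) / (duration - t2))

    def simulate_ground_motion(duration=20.0, dt=0.01, freqs=None, psd=None, seed=1):
        """Spectral-representation method: superpose sinusoids with random
        phase angles, amplitudes fixed by a target one-sided power spectral
        density G(f), then shape the sum with a deterministic envelope."""
        random.seed(seed)
        if freqs is None:  # illustrative target: flat PSD over 1-10 Hz
            freqs = [1.0 + 0.5 * k for k in range(19)]
            psd = [1.0] * len(freqs)
        df = freqs[1] - freqs[0]
        # sinusoid amplitude from the PSD: A_k = sqrt(2 G(f_k) df)
        amps = [math.sqrt(2.0 * g * df) for g in psd]
        phases = [random.uniform(0.0, 2.0 * math.pi) for _ in freqs]
        accel = []
        for i in range(int(duration / dt)):
            t = i * dt
            a = sum(A * math.cos(2.0 * math.pi * f * t + p)
                    for A, f, p in zip(amps, freqs, phases))
            accel.append(envelope(t, duration) * a)
        return accel
    ```

    In an iterative spectrum-matching scheme such as the one the abstract describes, the amplitudes would then be adjusted until the response spectrum of the simulated motion matches the smooth design spectrum.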

  16. Keeping focus on earthquakes at school for seismic risk mitigation of the next generations (United States)

    Saraò, Angela; Barnaba, Carla; Peruzza, Laura


    Knowledge of the seismic history of one's own territory, understanding of the physical phenomena behind an earthquake, awareness of the changes to cultural heritage that follow a strong earthquake, and learning the actions to be taken during and after an earthquake are all pieces of information that help keep the focus on seismic hazard and support strategies for seismic risk mitigation. Training the next generations, today more than ever prone to rapidly forgetting past events, therefore becomes a key element in strengthening the perception that earthquakes have happened and can happen at any time, and that mitigation actions are the only means to ensure safety and to reduce damage and human losses. For several years our institute (OGS) has been involved in activities to raise awareness of earthquake education. We aim to implement education programs that take a critical approach to seismic hazard reduction, differentiating the types of activities according to the age of the students. However, since this kind of activity is unfunded, we can currently reach only a very limited number of schools per year. To be effective, the inclusion of seismic risk issues in school curricula requires dedicated time and appropriate approaches when planning activities. For this reason, we also involve the teachers as proponents of activities, and we encourage them to keep memories and discussion of earthquakes alive in their classes. In past years we acted mainly in the schools of the Friuli Venezia Giulia area (NE Italy), an earthquake-prone area struck in 1976 by a destructive seismic event (Ms=6.5). We organized short training courses for teachers, lectured classes, and led laboratory activities with students. Indeed, since it is well known that students enjoy classes more when visual and active learning are combined, we propose a program composed of seminars, demonstrations and hands-on activities in the classrooms; for high school students

  17. A moment-tensor catalog for intermediate magnitude earthquakes in Mexico (United States)

    Rodríguez Cardozo, Félix; Hjörleifsdóttir, Vala; Martínez-Peláez, Liliana; Franco, Sara; Iglesias Mendoza, Arturo


    Located among five tectonic plates, Mexico is one of the world's most seismically active regions, and earthquake focal mechanisms provide important information on its active tectonics. A widespread technique for estimating an earthquake's magnitude and focal mechanism is inversion for the moment tensor, obtained by minimizing a misfit function that measures the difference between synthetic and observed seismograms. An important element in the estimation of the moment tensor is an appropriate velocity model, which allows the calculation of accurate Green's functions, so that the differences between observed and synthetic seismograms are due to the source of the earthquake rather than to the velocity model. However, calculating accurate synthetic seismograms becomes progressively more difficult as the magnitude of the earthquakes decreases. Large earthquakes (M>5.0) excite waves of longer periods that interact weakly with lateral heterogeneities in the crust. For these events, 1D velocity models work well for computing Green's functions, and such events are well characterized by the seismic moment tensors reported in global catalogs (e.g., USGS fast moment tensor solutions and GCMT). The opposite occurs for small and intermediate-sized events, whose relatively shorter periods interact strongly with lateral heterogeneities in the crust and upper mantle. Accurately modeling the Green's functions for the smaller events in a large heterogeneous area requires 3D or regionalized 1D models. To obtain a rapid estimate of earthquake magnitude, the National Seismological Survey in Mexico (Servicio Sismológico Nacional, SSN) automatically calculates seismic moment tensors for events in Mexican territory (Franco et al., 2002; Nolasco-Carteño, 2006). However, for intermediate-magnitude and small earthquakes the signal-to-noise ratio is low at many of the seismic stations, and without careful selection and filtering of the data, obtaining a stable focal mechanism
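
    The core of the inversion the abstract describes is linear: once Green's functions are fixed, minimizing the waveform misfit reduces to least squares. A minimal sketch under that assumption (the tiny Green's-function matrix here is invented for illustration and stands in for real precomputed seismograms, one column per moment-tensor component):

    ```python
    def solve(A, b):
        """Gaussian elimination with partial pivoting for small dense systems."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
        return x

    def invert_moment_tensor(G, d):
        """Least-squares moment tensor: minimize ||G m - d||^2 via the
        normal equations m = (G^T G)^{-1} G^T d.  Each column of G is the
        Green's-function seismogram for one moment-tensor component;
        d is the observed waveform, sample by sample."""
        ncomp = len(G[0])
        GtG = [[sum(G[i][a] * G[i][b] for i in range(len(G)))
                for b in range(ncomp)] for a in range(ncomp)]
        Gtd = [sum(G[i][a] * d[i] for i in range(len(G))) for a in range(ncomp)]
        return solve(GtG, Gtd)
    ```

    With noise-free synthetic data the true source is recovered exactly; the difficulty the abstract emphasizes is that for small events the real G is inaccurate unless the velocity model captures crustal heterogeneity.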

  18. The USGS plan for short-term prediction of the anticipated Parkfield earthquake (United States)

    Bakun, W.H.


    Aside from the goal of better understanding the Parkfield earthquake cycle, it is the intention of the U.S. Geological Survey to attempt to issue a warning shortly before the anticipated earthquake. Although short-term earthquake warnings are not yet generally feasible, the wealth of information available for the previous significant Parkfield earthquakes suggests that if the next earthquake follows the pattern of "characteristic" Parkfield shocks, such a warning might be possible. Focusing on earthquake precursors reported for the previous "characteristic" shocks, particularly the 1934 and 1966 events, the USGS developed a plan in late 1985 on which to base earthquake warnings for Parkfield, and has assisted State, county, and local officials in the Parkfield area in preparing a coordinated, reasonable response to a warning, should one be issued.

  19. Memory effect in M ≥ 6 earthquakes of South-North Seismic Belt, Mainland China (United States)

    Wang, Jeen-Hwa


    The M ≥ 6 earthquakes that occurred in the South-North Seismic Belt, Mainland China, during 1901-2008 are used to study the possible existence of a memory effect in large earthquakes. The fluctuation analysis technique is applied to the sequences of earthquake magnitude and inter-event time represented in the natural time domain. The calculated results show that the exponents of the scaling law of fluctuation versus window length are less than 0.5 for both the magnitude and inter-event time sequences. The migration of the earthquakes under study is used to discuss the possible correlation between events. Phase portraits of two consecutive magnitudes and of two consecutive inter-event times are also examined to explore whether large (or small) earthquakes are followed by large (or small) events. Taking all of this information together, we conclude that the earthquakes under study are short-term correlated and that a short-term memory effect is therefore operative.
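
    The fluctuation-versus-window-length scaling invoked here can be illustrated with a simple detrended fluctuation analysis (DFA). This is a generic sketch, not the authors' exact procedure: the window sizes and the first-order (linear) detrending are illustrative choices. An exponent near 0.5 indicates an uncorrelated series; values below 0.5 indicate anti-correlation, values above 0.5 persistence.

    ```python
    import math

    def dfa_exponent(series, win_sizes=(4, 8, 16, 32)):
        """First-order DFA: integrate the demeaned series, detrend it in
        non-overlapping windows, and return the slope of log F(w) vs log w."""
        mean = sum(series) / len(series)
        profile, s = [], 0.0
        for x in series:
            s += x - mean
            profile.append(s)
        logw, logf = [], []
        for w in win_sizes:
            rms = []
            for start in range(0, len(profile) - w + 1, w):
                seg = profile[start:start + w]
                # least-squares linear detrend of the segment
                t = list(range(w))
                tm, ym = (w - 1) / 2.0, sum(seg) / w
                beta = (sum((ti - tm) * (yi - ym) for ti, yi in zip(t, seg))
                        / sum((ti - tm) ** 2 for ti in t))
                resid = [yi - (ym + beta * (ti - tm)) for ti, yi in zip(t, seg)]
                rms.append(math.sqrt(sum(r * r for r in resid) / w))
            logw.append(math.log(w))
            logf.append(math.log(math.sqrt(sum(r * r for r in rms) / len(rms))))
        # slope of the log-log fluctuation curve = scaling exponent
        n = len(logw)
        xm, ym = sum(logw) / n, sum(logf) / n
        return (sum((x - xm) * (y - ym) for x, y in zip(logw, logf))
                / sum((x - xm) ** 2 for x in logw))
    ```

    Applied to magnitude or inter-event-time sequences in natural time, an exponent below 0.5, as the abstract reports, is the signature of short-term (anti-)correlation.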

  20. Expanding Horizons in Mitigating Earthquake Related Disasters in Urban Areas: Global Development of Real-Time Seismology


    Utkucu, Murat; Küyük, Hüseyin Serdar; Demir, İsmail Hakkı


    Real-time seismology is a newly developing alternative approach in seismology to mitigating earthquake hazard. It exploits up-to-date advances in seismic instrument technology, data acquisition, digital communications and computer systems to quickly transform data into earthquake information in real time, in order to reduce earthquake losses and their impact on social and economic life in earthquake-prone, densely populated urban and industrial areas. Real-time seismology systems are not o...

  1. Global Omori law decay of triggered earthquakes: Large aftershocks outside the classical aftershock zone (United States)

    Parsons, Tom


    Triggered earthquakes can be large, damaging, and lethal, as evidenced by the 1999 shocks in Turkey and the 2001 earthquakes in El Salvador. In this study, earthquakes with Ms ≥ 7.0 from the Harvard centroid moment tensor (CMT) catalog are modeled as dislocations to calculate shear stress changes on subsequent earthquake rupture planes near enough to be affected. About 61% of earthquakes that occurred near the Ms ≥ 7.0 shocks (defined as having shear stress change |Δτ| ≥ 0.01 MPa) are associated with calculated shear stress increases, while ~39% are associated with shear stress decreases. If earthquakes associated with calculated shear stress increases are interpreted as triggered, then such events make up at least 8% of the CMT catalog. Globally, these triggered earthquakes obey an Omori law rate decay that lasts between ~7 and 11 years after the main shock. Earthquakes associated with calculated shear stress increases occur at rates higher than background up to 240 km away from the main shock centroid. Omori's law is one of the few time-predictable patterns evident in the global occurrence of earthquakes. If large triggered earthquakes habitually obey Omori's law, then their hazard can be more readily assessed. The characteristic rate change with time and spatial distribution can be used to rapidly assess the likelihood of triggered earthquakes following events of Ms ≥ 7.0. I show an example application to the M = 7.7, 13 January 2001 El Salvador earthquake, where the use of global statistics appears to provide a better rapid hazard estimate than Coulomb stress change calculations.
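
    The Omori law rate decay invoked here has a simple closed form that makes rapid hazard estimates straightforward. A minimal sketch: the modified Omori rate and its time integral, which gives the expected number of triggered events in any window after the main shock. The parameter values (K, c, p) are arbitrary illustrations, not fitted values from this study.

    ```python
    import math

    def omori_rate(t, K=10.0, c=0.1, p=1.0):
        """Modified Omori law: aftershock rate K / (t + c)^p at time t
        after the main shock (t, c in the same time units)."""
        return K / (t + c) ** p

    def expected_count(t1, t2, K=10.0, c=0.1, p=1.0):
        """Expected number of triggered events between t1 and t2:
        the closed-form integral of the Omori rate."""
        if p == 1.0:
            return K * (math.log(t2 + c) - math.log(t1 + c))
        return K * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
    ```

    Fitting K, c, and p to a global catalog of triggered events, as the study does, then lets one quote the expected triggered-event count for any time window following a new Ms ≥ 7.0 shock.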

  2. Statistical aspects and risks of human-caused earthquakes (United States)

    Klose, C. D.


    The seismological community invests ample human capital and financial resources to research and predict the risks associated with earthquakes. Industries such as the insurance and re-insurance sector are equally interested in using probabilistic risk models developed by the scientific community to transfer risk. These models are used to predict expected losses due to naturally occurring earthquakes. But what about the risks associated with human-caused earthquakes? Such risk models are largely absent from both industry and academic discourse. In countries around the world, informed citizens are becoming increasingly aware and concerned that this economic bias is not sustainable for long-term economic growth and environmental and human security. Ultimately, citizens look to their government officials to hold industry accountable. In the Netherlands, for example, the hydrocarbon industry is held accountable for causing earthquakes near Groningen. In Switzerland, geothermal power plants were shut down or suspended because they caused earthquakes in the cantons of Basel and St. Gallen. The public and the private non-extractive industry need access to information about earthquake risks connected with sub/urban geoengineering activities, including natural gas production through fracking, geothermal energy production, carbon sequestration, mining and water irrigation. This presentation illuminates statistical aspects of human-caused earthquakes in different geologic environments. The statistical findings are based on the first catalog of human-caused earthquakes (Klose, 2013). Findings discussed include the odds of dying during a medium-size earthquake set off by geomechanical pollution. Any kind of geoengineering activity causes this type of pollution and increases the likelihood of triggering nearby faults to rupture.

  3. Large earthquake rates from geologic, geodetic, and seismological perspectives (United States)

    Jackson, D. D.


    Earthquake rate and recurrence information comes primarily from geology, geodesy, and seismology. Geology gives the longest temporal perspective, but it reveals only surface deformation, relatable to earthquakes only with many assumptions. Geodesy is also limited to surface observations, but it detects evidence of the processes leading to earthquakes, again subject to important assumptions. Seismology reveals actual earthquakes, but its history is too short to capture important properties of very large ones. Unfortunately, the ranges of these observation types barely overlap, so that integrating them into a consistent picture adequate to infer future prospects requires a great deal of trust. Perhaps the most important boundary is the temporal one at the beginning of the instrumental seismic era, about a century ago. We have virtually no seismological or geodetic information on large earthquakes before then, and little geological information after. Virtually all modern forecasts of large earthquakes assume some form of equivalence between tectonic and seismic moment rates as functions of location, time, and magnitude threshold. That assumption links geology, geodesy, and seismology, but it invokes a host of other assumptions and incurs very significant uncertainties. Open questions include the temporal behavior of seismic and tectonic moment rates; the shape of the earthquake magnitude distribution; the upper magnitude limit; the scaling between rupture length, width, and displacement; the depth dependence of stress coupling; the value of crustal rigidity; and the relation between faults at depth and their surface traces, to name just a few. In this report I estimate the quantitative implications for estimating large earthquake rates. Global studies like the GEAR1 project suggest that surface deformation from geology and geodesy best shows the geography of very large, rare earthquakes in the long term, while seismological observations of small earthquakes best forecast moderate earthquakes

  4. Earthquake Preparedness Among Japanese Hemodialysis Patients in Prefectures Heavily Damaged by the 2011 Great East Japan Earthquake. (United States)

    Sugisawa, Hidehiro; Shimizu, Yumiko; Kumagai, Tamaki; Sugisaki, Hiroaki; Ohira, Seiji; Shinoda, Toshio


    The purpose of this study was to explore the factors related to earthquake preparedness in Japanese hemodialysis patients. We focused on three aspects of related factors: health condition factors, social factors, and the experience of disasters. A mail survey of all members of the Japan Association of Kidney Disease Patients in three Japanese prefectures (N = 4085) was conducted in March 2013. We obtained 1841 valid responses for analysis. The health factors covered were activities of daily living (ADL), mental distress, primary renal diseases, and the duration of dialysis. The social factors were socioeconomic status, family structure, informational social support, and the provision of information on earthquake preparedness by dialysis facilities. The results show that the average percentage of participants meeting each criterion of earthquake preparedness in 2013 was 53%. Hemodialysis patients without disabled ADL, without mental distress, and on longer durations of dialysis were likely to meet more of the earthquake preparedness criteria. Hemodialysis patients who had received informational social support from family or friends, lived with a spouse and children rather than alone, or had obtained information on earthquake preparedness from dialysis facilities were also likely to meet more of the earthquake preparedness criteria. © 2017 International Society for Apheresis, Japanese Society for Apheresis, and Japanese Society for Dialysis Therapy.

  5. Romanian earthquakes analysis using BURAR seismic array

    International Nuclear Information System (INIS)

    Borleanu, Felix; Rogozea, Maria; Nica, Daniela; Popescu, Emilia; Popa, Mihaela; Radulian, Mircea


    The Bucovina seismic array (BURAR) is a medium-aperture array installed in 2002 in the northern part of Romania (47.6148° N latitude, 25.2168° E longitude, 1150 m altitude) as a result of cooperation between the Air Force Technical Applications Center, USA, and the National Institute for Earth Physics, Romania. The array consists of ten elements located in boreholes and distributed over a 5 × 5 km² area: nine with short-period vertical sensors and one with a broadband three-component sensor. Since the new station began operating, earthquake monitoring of Romania's territory has improved significantly. Data recorded by BURAR during the 01.01.2005-12.31.2005 interval were first processed and analyzed in order to establish the array's detection capability for local earthquakes occurring in the different Romanian seismic zones. Subsequently, a spectral ratio technique was applied to determine calibration relationships for magnitude, using only the information gathered by the BURAR station. The spectral ratios are computed relative to a reference event considered representative of each seismic zone. This method has the advantage of eliminating path effects. The new calibration procedure was tested on Vrancea intermediate-depth earthquakes and proved very efficient in constraining the size of these earthquakes. (authors)
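
    The idea behind the spectral ratio calibration can be sketched simply. Because the target event and the reference event share essentially the same path to the array, path and site terms cancel in the ratio, and (assuming a local-magnitude-style log-amplitude scale, which is an assumption of this sketch rather than the authors' exact calibration) the mean log10 spectral ratio gives the magnitude offset from the reference event:

    ```python
    import math

    def relative_magnitude(spec_event, spec_ref, mag_ref):
        """Spectral-ratio magnitude sketch: average the log10 ratio of the
        event spectrum to the reference-event spectrum over the usable band,
        and add it to the reference magnitude.  Path/site effects cancel in
        the ratio because both events are recorded at the same array."""
        logs = [math.log10(a / b) for a, b in zip(spec_event, spec_ref)]
        return mag_ref + sum(logs) / len(logs)
    ```

    For example, an event whose spectral amplitudes are uniformly ten times those of a reference event of magnitude 4.0 would be assigned magnitude 5.0 under this scale.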

  6. Real-time risk assessment in seismic early warning and rapid response: a feasibility study in Bishkek (Kyrgyzstan) (United States)

    Picozzi, M.; Bindi, D.; Pittore, M.; Kieling, K.; Parolai, S.


    Earthquake early warning systems (EEWS) are considered to be an effective, pragmatic, and viable tool for seismic risk reduction in cities. While standard EEWS approaches focus on the real-time estimation of an earthquake's location and magnitude, innovative developments in EEWS include the capacity for rapid assessment of damage. Clearly, for all public authorities engaged in coordinating emergency activities during and soon after earthquakes, real-time information about the potential damage distribution within a city is invaluable. In this work, we present a first attempt to design an early warning and rapid response procedure for real-time risk assessment. In particular, the procedure uses typical real-time information (i.e., P-wave arrival times and early waveforms) from a regional seismic network to locate and evaluate the size of an earthquake; this information is in turn exploited to extract a risk map, representing the potential distribution of damage, from a dataset of predicted scenarios compiled for the target city. A feasibility study of the procedure is presented for the city of Bishkek, the capital of Kyrgyzstan, which is surrounded by the Kyrgyz seismic network, by mimicking the ground motion associated with two historical events that occurred close to Bishkek, namely the 1911 Kemin (M = 8.2 ± 0.2) and the 1885 Belovodsk (M = 6.9 ± 0.5) earthquakes. Several methodologies from previous studies were considered when planning the implementation of the early warning and rapid response procedure for real-time risk assessment: the Satriano et al. (Bull Seismol Soc Am 98(3):1482-1494, 2008) approach to real-time earthquake location; the Caprio et al. (Geophys Res Lett 38:L02301, 2011) approach for estimating moment magnitude in real time; the EXSIM method for ground motion simulation (Motazedian and Atkinson, Bull Seismol Soc Am 95:995-1010, 2005); and the Sokolov (Earthquake Spectra 161:679-694, 2002) approach for estimating
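
    The final step of such a procedure, mapping a real-time (location, magnitude) estimate onto a precomputed damage scenario, can be illustrated with a toy lookup. This is not the authors' implementation: the scenario records, the coordinates, and the distance/magnitude weighting are all invented for illustration.

    ```python
    import math

    def nearest_scenario(est_lat, est_lon, est_mag, scenarios):
        """Select the precomputed damage scenario closest to the real-time
        estimate.  The cost mixes epicentral offset (in degrees) with the
        magnitude difference using an assumed trade-off weight."""
        def cost(s):
            d = math.hypot(s["lat"] - est_lat, s["lon"] - est_lon)
            return d + 2.0 * abs(s["mag"] - est_mag)  # assumed weighting
        return min(scenarios, key=cost)
    ```

    In a real system the scenario database would hold full damage maps for the target city, so that the selected record can be pushed to emergency coordinators as soon as the first P-wave-based estimates stabilize.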

  7. Fire and earthquake counter measures in radiation handling facilities

    International Nuclear Information System (INIS)


    'Fire countermeasures in radiation handling facilities', published in 1961 and revised in 1972, has long been widely used as a valuable guideline for those handling radiation. However, science and technology have advanced rapidly and the relevant laws have been revised since its publication, so many points no longer reflect the present state of affairs. It was therefore decided to rewrite the book, and the new edition has been completed. The title was changed to 'Fire and earthquake countermeasures in radiation handling facilities', and countermeasures against earthquakes were added. Moreover, care was taken to make the book useful not only for those handling radiation but also for those concerned with fire fighting. The book describes the approach to countermeasures against fires and earthquakes; the countermeasures to take in the normal state, when a fire or an earthquake occurs, and when a warning declaration has been announced; and data on fires, earthquakes, the risks of radioisotopes, fire fighting equipment, earthquake countermeasures for equipment, protectors and radiation measuring instruments, first aid, an example of an emergency system in radiation handling facilities, the activities of fire fighters, examples of accidents, and so on. (Kako, I.)

  8. Historical earthquake research in Austria (United States)

    Hammerl, Christa


    Austria has moderate seismicity; on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average recurrence period is about 75 years. For this reason, historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies and, last but not least, as an example of a recently carried out case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. Research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  9. Survey of damage to 602 MR scanners after the Great East Japan Earthquake

    International Nuclear Information System (INIS)

    Nakai, Toshiharu; Yamaguchi-Sekino, Sachiko; Tsuchihashi, Toshio


    An earthquake of magnitude 9.0, the largest in modern Japanese history, struck east Japan on March 11, 2011. We investigated hazards and observations related to magnetic resonance (MR) scanners in this earthquake to evaluate potential risks and to consider how to further prevent or minimize equipment damage and patient injury in such large earthquakes. The investigation team, funded by MHLW, sent questionnaires to the 984 facilities with installed MR scanners in 7 prefectures of east Japan (Iwate, Miyagi, Fukushima, Ibaraki, Chiba, Tokyo, Saitama) and collected 458 responses (46.6%) with information on 602 MR scanners (144 units ≤0.5 T; 31 1-T units; 371 1.5-T units; and 56 units ≥3 T). Significant differences in damage were observed between seismic intensity 5 and 6 (χ² test, P<0.001 for all items of damage checked). The most frequent types of damage were displacement of magnets (12.4%), failure of the chiller or air conditioning (9.6%), rapid decrease in liquid helium (8.4%), damage to the magnet enclosure and its equipment (7.6%), damage to the shielding of the MR scanner room (6.1%), damage to the quench duct (4.5%), breakage of devices anchoring system cabinets (4.4%), damage to the magnet base (3.9%), and metal components becoming projectiles (1.5%). Twelve facilities reported flooding by the subsequent tsunami, and quench was confirmed at 19 facilities. No fire damage was reported. It was confirmed that no one was severely injured in an MR scanner, and base isolation of the building was very effective, completely preventing damage even at seismic intensity 7. In the future, training for evacuation, establishment of a standard protocol for emergency shutdown of MR scanners, onsite checking by MR operators, and emergency power equipment to maintain the chillers for MR scanners will further ensure MR safety in an earthquake. (author)
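
    The intensity-5 versus intensity-6 comparison reported here uses the standard Pearson χ² statistic for a contingency table. A minimal sketch (the counts in the example are invented for illustration, not the survey's data):

    ```python
    def chi_square(table):
        """Pearson chi-square statistic for an r x c contingency table,
        e.g. rows = seismic intensity 5 vs 6, columns = damaged vs undamaged.
        Expected counts come from the row and column marginals."""
        rows = [sum(r) for r in table]
        cols = [sum(c) for c in zip(*table)]
        total = sum(rows)
        stat = 0.0
        for i, row in enumerate(table):
            for j, obs in enumerate(row):
                exp = rows[i] * cols[j] / total
                stat += (obs - exp) ** 2 / exp
        return stat
    ```

    Comparing the statistic against the χ² distribution with (r-1)(c-1) degrees of freedom then yields the P-value; for the survey's items, P<0.001 throughout.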

  10. Posttraumatic stress disorder: a serious post-earthquake complication. (United States)

    Farooqui, Mudassir; Quadri, Syed A; Suriya, Sajid S; Khan, Muhammad Adnan; Ovais, Muhammad; Sohail, Zohaib; Shoaib, Samra; Tohid, Hassaan; Hassan, Muhammad


    Earthquakes are unpredictable and devastating natural disasters. They can cause massive destruction and loss of life, and survivors may suffer psychological symptoms of severe intensity. Our goal in this article is to review studies published in the last 20 years to compile what is known about posttraumatic stress disorder (PTSD) occurring after earthquakes. The review also describes other psychiatric complications that can be associated with earthquakes, to provide readers with a better overall understanding, and discusses several sociodemographic factors that can be associated with post-earthquake PTSD. A search for literature was conducted on major databases such as MEDLINE, PubMed, EMBASE, and PsycINFO, and in neurology, psychiatry, and other medical journals. Terms used for electronic searches included, but were not limited to, posttraumatic stress disorder (PTSD), posttraumatic symptoms, anxiety, depression, major depressive disorder, earthquake, and natural disaster. The relevant information was then utilized to determine the relationships between earthquakes and posttraumatic stress symptoms. It was found that PTSD is the most commonly occurring mental health condition among earthquake survivors. Major depressive disorder, generalized anxiety disorder, obsessive compulsive disorder, social phobia, and specific phobias were also listed. The PTSD prevalence rate varied widely, depending on multiple risk factors in target populations and on the interval of time that had elapsed between exposure to the deadly incident and measurement. Females seemed to be the most widely affected group, while elderly people and young children exhibited considerable psychosocial impact.

  11. Posttraumatic stress disorder: a serious post-earthquake complication

    Directory of Open Access Journals (Sweden)

    Mudassir Farooqui

    Full Text Available Objectives: Earthquakes are unpredictable and devastating natural disasters. They can cause massive destruction and loss of life, and survivors may suffer psychological symptoms of severe intensity. Our goal in this article is to review studies published in the last 20 years to compile what is known about posttraumatic stress disorder (PTSD) occurring after earthquakes. The review also describes other psychiatric complications that can be associated with earthquakes, to provide readers with a better overall understanding, and discusses several sociodemographic factors that can be associated with post-earthquake PTSD. Method: A search for literature was conducted on major databases such as MEDLINE, PubMed, EMBASE, and PsycINFO, and in neurology, psychiatry, and other medical journals. Terms used for electronic searches included, but were not limited to, posttraumatic stress disorder (PTSD), posttraumatic symptoms, anxiety, depression, major depressive disorder, earthquake, and natural disaster. The relevant information was then utilized to determine the relationships between earthquakes and posttraumatic stress symptoms. Results: It was found that PTSD is the most commonly occurring mental health condition among earthquake survivors. Major depressive disorder, generalized anxiety disorder, obsessive compulsive disorder, social phobia, and specific phobias were also listed. Conclusion: The PTSD prevalence rate varied widely. It was dependent on multiple risk factors in target populations and also on the interval of time that had elapsed between the exposure to the deadly incident and measurement. Females seemed to be the most widely affected group, while elderly people and young children exhibited considerable psychosocial impact.

  12. Engineering aspects of earthquake risk mitigation: Lessons from management of recent earthquakes, and consequential mudflows and landslides

    International Nuclear Information System (INIS)


    The Proceedings contain 30 selected presentations given at the Second and Third UNDRO/USSR Training Seminars: Engineering Aspects of Earthquake Risk Assessment and Mitigation of Losses, held in Dushanbe, October 1988; and Lessons from Management of Recent Earthquakes, and Consequential Mudflows and Landslides, held in Moscow, October 1989. The annexes to the document provide information on the participants, the work programme and the resolution adopted at each of the seminars. Refs, figs and tabs

  13. Identified EM Earthquake Precursors (United States)

    Jones, Kenneth, II; Saxton, Patrick


    Many attempts have been made to find a sound earthquake forecasting method and thereby warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed; the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been operated in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem remains determining exactly what is forecastable and in which direction to investigate EM signals. After a number of custom rock experiments, two hypotheses were formed that could support the EM wave model. The first concerned sufficient and continuous electron movement, either by surface or penetrative flow; the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate for creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because rock was found to absorb and confine electrons to a very thin skin depth. Radio-wave transmission and detection worked in every test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field, occurring with associated phases, using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February 2013. The antennae are mobile, and observations were noted for

  14. Pain after earthquake

    Directory of Open Access Journals (Sweden)

    Angeletti Chiara


    Introduction: On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people, and significantly damaged more than 10,000 buildings in the L'Aquila region. Objectives: This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods: 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results: About a third of patients reported pain (prevalence 34.6%), and more than half of those reported severe pain (58.8%). Analgesic agents were limited to available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth weeks, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions: This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and that current drug treatments in this region may be adequate for emergency situations.

  15. Earthquake Risk Mitigation in the Tokyo Metropolitan area (United States)

    Hirata, N.; Sakai, S.; Kasahara, K.; Nakagawa, S.; Nanjo, K.; Panayotopoulos, Y.; Tsuruoka, H.


    Seismic disaster risk mitigation in urban areas is a challenge that requires collaboration across scientific, engineering, and social-science fields. Examples of collaborative efforts include research on detailed plate structure with identification of all significant faults; developing dense seismic networks; strong ground-motion prediction, which uses information on near-surface seismic site effects and fault models; earthquake-resistant and earthquake-proof structures; and cross-discipline infrastructure for effective risk mitigation just after catastrophic events. Risk mitigation strategy for the next great earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern, because this subduction zone produced past mega-thrust earthquakes such as the 1703 Genroku earthquake (magnitude M8.0) and the 1923 Kanto earthquake (M7.9), which caused 105,000 fatalities. An M7 or greater (M7+) earthquake in this area at present has high potential to produce devastating loss of life and property, with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that such an M7+ earthquake would cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) in economic loss. This earthquake is evaluated by the Earthquake Research Committee of Japan to have a 70% probability of occurrence within 30 years. In order to mitigate disaster for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area (2007-2011) was launched in collaboration with scientists, engineers, and social scientists at nationwide institutions. The results obtained in the respective fields will be integrated by project termination to improve the strategy assessment for seismic risk mitigation in the Tokyo metropolitan area. In this talk, we give an outline of our project as an example of collaborative research on earthquake risk mitigation. Discussion is extended to our effort in progress and

  16. Fault lubrication during earthquakes. (United States)

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T


    The determination of rock friction at seismic slip rates (about 1 m s(-1)) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, constraints must instead be derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s(-1). The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  17. Housing Damage Following Earthquake (United States)


    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  18. Ion torrent personal genome machine sequencing for genomic typing of Neisseria meningitidis for rapid determination of multiple layers of typing information. (United States)

    Vogel, Ulrich; Szczepanowski, Rafael; Claus, Heike; Jünemann, Sebastian; Prior, Karola; Harmsen, Dag


    Neisseria meningitidis causes invasive meningococcal disease in infants, toddlers, and adolescents worldwide. DNA sequence-based typing, including multilocus sequence typing, analysis of genetic determinants of antibiotic resistance, and sequence typing of vaccine antigens, has become the standard for molecular epidemiology of the organism. However, PCR of multiple targets and consecutive Sanger sequencing provide logistic constraints to reference laboratories. Taking advantage of the recent development of benchtop next-generation sequencers (NGSs) and of BIGSdb, a database accommodating and analyzing genome sequence data, we therefore explored the feasibility and accuracy of Ion Torrent Personal Genome Machine (PGM) sequencing for genomic typing of meningococci. Three strains from a previous meningococcus serogroup B community outbreak were selected to compare conventional typing results with data generated by semiconductor chip-based sequencing. In addition, sequencing of the meningococcal type strain MC58 provided information about the general performance of the technology. The PGM technology generated sequence information for all target genes addressed. The results were 100% concordant with conventional typing results, with no further editing being necessary. In addition, the amount of typing information, i.e., nucleotides and target genes analyzed, could be substantially increased by the combined use of genome sequencing and BIGSdb compared to conventional methods. In the near future, affordable and fast benchtop NGS machines like the PGM might enable reference laboratories to switch to genomic typing on a routine basis. This will reduce workloads and rapidly provide information for laboratory surveillance, outbreak investigation, assessment of vaccine preventability, and antibiotic resistance gene monitoring.

  19. Do Earthquakes Shake Stock Markets? (United States)

    Ferreira, Susana; Karali, Berna


    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  20. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya


    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  1. Inversion of GPS-measured coseismic displacements for source parameters of Taiwan earthquake (United States)

    Lin, J. T.; Chang, W. L.; Hung, H. K.; Yu, W. C.


    We developed a method of determining earthquake location, focal mechanism, and centroid moment tensor from coseismic surface displacements measured by daily and high-rate GPS. Unlike the commonly used dislocation model, in which fault geometry is calculated nonlinearly, our method takes a point-source approach that evaluates these parameters robustly and efficiently without a priori fault information, and can thus provide constraints on subsequent finite-source modeling of fault slip. In this study, we focus on the resolving ability of GPS data for moderate (Mw = 6.0-7.0) earthquakes in Taiwan, and four earthquakes were investigated in detail: the March 27, 2013 Nantou (Mw 6.0), the June 2, 2013 Nantou (Mw 6.3), the October 31, 2013 Ruisui (Mw 6.3), and the March 31, 2002 Hualien (ML 6.8) earthquakes. All these events were recorded by the Taiwan continuous GPS network at sampling rates of one sample per 30 seconds and 1 Hz; the Mw 6.3 Ruisui earthquake was additionally recorded by a local GPS network with a sampling rate of 20 Hz. Our inverted focal mechanisms for all these earthquakes are consistent with the GCMT and USGS results, which evaluate source parameters from dynamic seismic-wave information. We also successfully resolved source parameters of the Mw 6.3 Ruisui earthquake within only 10 seconds of the earthquake occurrence, demonstrating the potential of high-rate GPS data for earthquake early warning and real-time determination of earthquake source parameters.
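For a point source, the inversion described above is linear: the observed static displacements d relate to the six independent moment-tensor components m through Green's functions, d = G·m. The following is a minimal sketch of that least-squares step, using randomly generated stand-in values for G rather than real elastostatic kernels:

```python
import numpy as np

# Hypothetical point-source inversion sketch: d = G @ m, solved by least
# squares.  G is random stand-in data, not real Green's functions.
rng = np.random.default_rng(0)
n_obs, n_mt = 30, 6                       # 10 three-component GPS stations
G = rng.normal(size=(n_obs, n_mt))        # stand-in elastostatic kernels
m_true = np.array([1.0, -0.5, -0.5, 0.3, 0.1, 0.2])   # synthetic source
d = G @ m_true + rng.normal(scale=1e-3, size=n_obs)   # noisy displacements

# Least-squares solution for the moment tensor; no fault geometry assumed
m_est, *_ = np.linalg.lstsq(G, d, rcond=None)
print(np.round(m_est, 3))
```

Because the problem is linear in m, the solve is a single matrix operation, which is what makes second-scale source determination from high-rate GPS feasible.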

  2. Earthquake resistant design of structures

    International Nuclear Information System (INIS)

    Choi, Chang Geun; Kim, Gyu Seok; Lee, Dong Geun


    This book covers the occurrence of earthquakes and the analysis of earthquake damage; the equivalent static analysis method and its application; dynamic analysis methods such as time-history analysis by mode superposition and by direct integration; and design-spectrum analysis for earthquake-resistant design in Korea, including the analysis model and vibration modes, calculation of base shear, calculation of story seismic loads, and combination of analysis results.
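The equivalent static analysis mentioned above reduces to simple closed-form expressions: a total base shear proportional to the building weight, distributed over the stories by weight and height. This is a generic textbook-style sketch; the seismic coefficient Cs and the distribution exponent k are assumed example values, not taken from the Korean design code the book discusses.

```python
def base_shear(weights, cs=0.1):
    """Total base shear V = Cs * W for the given story weights (kN)."""
    return cs * sum(weights)

def story_forces(weights, heights, cs=0.1, k=1.0):
    """Distribute V over stories: Fx = V * (wx * hx^k) / sum(wi * hi^k)."""
    v = base_shear(weights, cs)
    wh = [w * h ** k for w, h in zip(weights, heights)]
    total = sum(wh)
    return [v * x / total for x in wh]

# Three-story example: weights in kN, story heights in m above base
forces = story_forces([2000.0, 2000.0, 1500.0], [3.0, 6.0, 9.0], cs=0.12)
print([round(f, 1) for f in forces])  # story forces sum to the base shear
```

The distributed forces always sum back to V, and higher stories receive proportionally larger forces, which is the intent of the height-weighted distribution.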

  3. Hotspots, Lifelines, and the Safrr Haywired Earthquake Sequence (United States)

    Ratliff, J. L.; Porter, K.


    Though California has experienced many large earthquakes (San Francisco, 1906; Loma Prieta, 1989; Northridge, 1994), the San Francisco Bay Area has not had a damaging earthquake for 25 years. Earthquake risk and surging reliance on smartphones and the Internet to handle everyday tasks raise the question: is an increasingly technology-reliant Bay Area prepared for potential infrastructure impacts caused by a major earthquake? How will a major earthquake on the Hayward Fault affect lifelines (roads, power, water, communication, etc.)? The U.S. Geological Survey Science Application for Risk Reduction (SAFRR) program's Haywired disaster scenario, a hypothetical two-year earthquake sequence triggered by a M7.05 mainshock on the Hayward Fault, addresses these and other questions. We explore four geographic aspects of lifeline damage from earthquakes: (1) geographic lifeline concentrations, (2) areas where lifelines pass through high shaking or potential ground-failure zones, (3) areas with diminished lifeline service demand due to severe building damage, and (4) areas with increased lifeline service demand due to displaced residents and businesses. Potential mainshock lifeline vulnerability and spatial demand changes will be discerned by superimposing earthquake shaking, liquefaction probability, and landslide probability damage thresholds with lifeline concentrations and with large-capacity shelters. Intersecting high hazard levels and lifeline clusters represent potential lifeline susceptibility hotspots. We will also analyze possible temporal vulnerability and demand changes using an aftershock shaking threshold. The results of this analysis will inform regional lifeline resilience initiatives and response and recovery planning, as well as reveal potential redundancies and weaknesses for Bay Area lifelines. Identified spatial and temporal hotspots can provide stakeholders with a reference for possible systemic vulnerability resulting from an earthquake sequence.
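The hotspot identification described above amounts to overlaying a hazard field with a lifeline-concentration field and flagging locations that exceed thresholds in both. A toy gridded illustration, with invented random values standing in for real shaking and lifeline layers:

```python
import numpy as np

# Toy hotspot overlay: flag grid cells where both normalized shaking and
# lifeline concentration exceed a threshold.  All values are invented.
rng = np.random.default_rng(2)
shaking = rng.uniform(0.0, 1.0, size=(50, 50))     # e.g. normalized PGA
lifelines = rng.uniform(0.0, 1.0, size=(50, 50))   # lifeline concentration

hotspots = (shaking > 0.8) & (lifelines > 0.8)     # boolean hotspot mask
print(int(hotspots.sum()), "candidate hotspot cells")
```

In a real analysis the grids would come from ShakeMap-style hazard layers and GIS lifeline inventories, and separate masks (liquefaction, landslide, shelter demand) would be intersected the same way.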

  4. Dynamic strains for earthquake source characterization (United States)

    Barbour, Andrew J.; Crowell, Brendan W


    Strainmeters measure elastodynamic deformation associated with earthquakes over a broad frequency band, with detection characteristics that complement traditional instrumentation, but they are commonly used to study slow transient deformation along active faults and at subduction zones, for example. Here, we analyze dynamic strains at Plate Boundary Observatory (PBO) borehole strainmeters (BSM) associated with 146 local and regional earthquakes from 2004–2014, with magnitudes from M 4.5 to 7.2. We find that peak values in seismic strain can be predicted from a general regression against distance and magnitude, with improvements in accuracy gained by accounting for biases associated with site–station effects and source–path effects, the latter exhibiting the strongest influence on the regression coefficients. To account for the influence of these biases in a general way, we include crustal‐type classifications from the CRUST1.0 global velocity model, which demonstrates that high‐frequency strain data from the PBO BSM network carry information on crustal structure and fault mechanics: earthquakes nucleating offshore on the Blanco fracture zone, for example, generate consistently lower dynamic strains than earthquakes around the Sierra Nevada microplate and in the Salton trough. Finally, we test our dynamic strain prediction equations on the 2011 M 9 Tohoku‐Oki earthquake, specifically continuous strain records derived from triangulation of 137 high‐rate Global Navigation Satellite System Earth Observation Network stations in Japan. Moment magnitudes inferred from these data and the strain model are in agreement when Global Positioning System subnetworks are unaffected by spatial aliasing.
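The regression described above can be sketched as an ordinary least-squares fit of log peak strain against magnitude and log distance. The functional form and coefficients below are illustrative assumptions applied to synthetic data, not the study's fitted values:

```python
import numpy as np

# Assumed illustrative form: log10(peak strain) = a + b*M + c*log10(R)
rng = np.random.default_rng(1)
M = rng.uniform(4.5, 7.2, size=200)            # magnitudes
R = rng.uniform(20.0, 500.0, size=200)         # hypocentral distance, km
a, b, c = -10.0, 1.0, -1.5                     # assumed "true" coefficients
log_strain = a + b * M + c * np.log10(R) + rng.normal(scale=0.1, size=200)

# Design matrix [1, M, log10(R)] and least-squares fit of (a, b, c)
X = np.column_stack([np.ones_like(M), M, np.log10(R)])
coef, *_ = np.linalg.lstsq(X, log_strain, rcond=None)
print(np.round(coef, 2))
```

The site- and path-effect biases mentioned above would enter as additional indicator columns in the design matrix, one per station or crustal-type class.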

  5. The 2016 Central Italy Earthquake: an Overview (United States)

    Amato, A.


    The M6 central Italy earthquake occurred on the seismic backbone of Italy, in the middle of the highest-hazard belt. The shock hit suddenly during the night of August 24, when people were asleep; no foreshocks occurred before the main event. The earthquake ruptured from 10 km depth to the surface and produced more than 17,000 aftershocks (as of Oct. 19) spread over a 40x20 km2 area elongated NW-SE. It is geologically very similar to previous recent events in the Apennines: both the 2009 L'Aquila earthquake to the south and the 1997 Colfiorito earthquake to the north were characterized by the activation of adjacent fault segments. Despite its magnitude and the well-known seismic hazard of the region, the earthquake produced extensive damage and 297 fatalities. The town of Amatrice, which paid the highest toll, had been classified in zone 1 (the highest) since 1915, but the buildings in this and other villages proved highly vulnerable. In contrast, in the town of Norcia, which also experienced strong ground shaking, no collapses occurred, most likely due to the retrofitting carried out after an earthquake in 1979. Soon after the quake, the INGV Crisis Unit convened at night in the Rome headquarters in order to coordinate the activities. The first field teams reached the epicentral area at 7 a.m. with portable seismic stations installed to monitor the aftershocks; other teams followed to map surface faults and damage, measure GPS sites, install instruments for site response studies, and so on. The INGV Crisis Unit includes the Press office and the INGVterremoti team, in order to manage and coordinate communication towards the Civil Protection Dept. (DPC), the media, and the web. Several tens of reports and updates were delivered to DPC in the first month of the sequence. Also owing to the controversial situation arising from the L'Aquila earthquake and trials, particular attention was given to communication: continuous and timely information has been released to

  6. The severity of an earthquake (United States)



    The severity of an earthquake can be expressed in terms of both intensity and magnitude. However, the two terms are quite different, and they are often confused. Intensity is based on the observed effects of ground shaking on people, buildings, and natural features. It varies from place to place within the disturbed region depending on the location of the observer with respect to the earthquake epicenter. Magnitude is related to the amount of seismic energy released at the hypocenter of the earthquake. It is based on the amplitude of the earthquake waves recorded on instruments.
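As one concrete example of amplitude-based magnitude, the Hutton and Boore (1987) local-magnitude scale for Southern California combines the logarithm of the recorded amplitude with a distance correction; it is used here purely to illustrate the idea, not as the definition implied by the passage:

```python
import math

def local_magnitude(amplitude_mm, distance_km):
    """Hutton & Boore (1987) ML: A is the peak Wood-Anderson trace
    amplitude in mm, r the hypocentral distance in km."""
    return (math.log10(amplitude_mm)
            + 1.110 * math.log10(distance_km / 100.0)
            + 0.00189 * (distance_km - 100.0)
            + 3.0)

# By construction, a 1 mm amplitude at 100 km gives ML = 3.0
print(local_magnitude(1.0, 100.0))  # → 3.0
```

Because the scale is logarithmic, each tenfold increase in recorded amplitude at a fixed distance adds one magnitude unit.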

  7. A 'new generation' earthquake catalogue

    Directory of Open Access Journals (Sweden)

    E. Boschi


    In 1995, we published the first release of the Catalogo dei Forti Terremoti in Italia, 461 a.C. - 1980, in Italian (Boschi et al., 1995). Two years later this was followed by a second release, again in Italian, that included more earthquakes, more accurate research and a longer time span (461 B.C. to 1990) (Boschi et al., 1997). Aware that the record of Italian historical seismicity is probably the most extensive in the whole world, and hence that our catalogue could be of interest to a wider international readership, Italian was clearly not the appropriate language to share this experience with colleagues from foreign countries. Three years after publication of the second release, therefore, and after much additional research and fine tuning of methodologies and algorithms, I am proud to introduce this third release in English. All the tools and accessories have been translated, along with the texts describing the development of the underlying research strategies and current contents. The English title is Catalogue of Strong Italian Earthquakes, 461 B.C. to 1997. This Preface briefly describes the scientific context within which the Catalogue of Strong Italian Earthquakes was conceived and progressively developed. The catalogue is perhaps the most important outcome of a well-established joint project between the Istituto Nazionale di Geofisica, the leading Italian institute for basic and applied research in seismology and solid earth geophysics, and SGA (Storia Geofisica Ambiente), a private firm specialising in the historical investigation and systematisation of natural phenomena. In her contribution "Method of investigation, typology and taxonomy of the basic data: navigating between seismic effects and historical contexts", Emanuela Guidoboni outlines the general framework of modern historical seismology and its complex relation with instrumental seismology on the one hand and historical research on the other. This presentation also highlights

  8. Quasi real-time estimation of the moment magnitude of large earthquake from static strain changes (United States)

    Itaba, S.


    The 2011 Tohoku-Oki (off the Pacific coast of Tohoku) earthquake, of moment magnitude 9.0, was accompanied by large static strain changes (of order 10^-7), as measured by borehole strainmeters operated by the Geological Survey of Japan in the Tokai, Kii Peninsula, and Shikoku regions. A fault model for the earthquake on the boundary between the Pacific and North American plates, based on these borehole strainmeter data, yielded a moment magnitude of 8.7. By contrast, the prompt magnitude based on seismic waves, which the Japan Meteorological Agency (JMA) announced just after earthquake occurrence, was 7.9. Such geodetic moment magnitudes, derived from static strain changes, can be estimated almost as rapidly as determinations using seismic waves. The validity of this method can be checked in other cases. For this earthquake's largest aftershock, which occurred 29 minutes after the mainshock, the prompt report issued by JMA assigned a magnitude of 7.3, whereas the moment magnitude derived from borehole strain data is 7.6, much closer to the actual moment magnitude of 7.7. Several methods are now being proposed to estimate the magnitude of a great earthquake more quickly and so reduce earthquake disasters, including tsunami losses. Our simple method using static strain changes is a robust approach for rapid estimation of the magnitude of large earthquakes, and is useful for improving the accuracy of Earthquake Early Warning.
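The moment magnitudes quoted above follow the standard Hanks-Kanamori relation between seismic moment M0 (in N·m) and Mw, so once a geodetic fault model yields M0, the magnitude is a one-line conversion:

```python
import math

def moment_magnitude(m0_nm):
    """Hanks-Kanamori relation: Mw = (2/3) * (log10(M0) - 9.1),
    with the seismic moment M0 in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

# A seismic moment of ~4e22 N*m corresponds to roughly Mw 9.0,
# the scale of the Tohoku-Oki mainshock
print(round(moment_magnitude(4.0e22), 1))  # → 9.0
```

The logarithmic scale means the Mw 9.0 mainshock released roughly 30 times the moment of its Mw 8.7 strain-derived estimate would imply per unit step; even a 0.3-unit difference in Mw corresponds to about a factor of 2.8 in M0.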

  9. SU-E-T-23: A Developing Australian Network for Datamining and Modelling Routine Radiotherapy Clinical Data and Radiomics Information for Rapid Learning and Clinical Decision Support

    Energy Technology Data Exchange (ETDEWEB)

    Thwaites, D [University of Sydney, Camperdown, Sydney (Australia); Holloway, L [Ingham Institute, Sydney, NSW (Australia); Bailey, M; Carolan, M; Miller, A [Illawarra Cancer Care Centre, Wollongong, NSW (Australia); Barakat, S; Field, M [University of Sydney, Sydney, NSW (Australia); Delaney, G; Vinod, S [Liverpool Hospital, Liverpool, NSW (Australia); Dekker, A [Maastro Clinic, Maastricht (Netherlands); Lustberg, T; Soest, J van; Walsh, S [MAASTRO Clinic, Maastricht (Netherlands)


    Purpose: Large amounts of routine radiotherapy (RT) data are available, which can potentially add clinical evidence to support better decisions. A developing collaborative Australian network, with a leading European partner, aims to validate, implement and extend European predictive models (PMs) for Australian practice and assess their impact on future patient decisions. Wider objectives include: developing multi-institutional rapid learning, using distributed learning approaches; and assessing and incorporating radiomics information into PMs. Methods: Two initial standalone pilots were conducted; one on NSCLC, the other on larynx, patient datasets in two different centres. Open-source rapid learning systems were installed, for data extraction and mining to collect relevant clinical parameters from the centres’ databases. The European DSSs were learned (“training cohort”) and validated against local data sets (“clinical cohort”). Further NSCLC studies are underway in three more centres to pilot a wider distributed learning network. Initial radiomics work is underway. Results: For the NSCLC pilot, 159/419 patient datasets were identified meeting the PM criteria, and hence eligible for inclusion in the curative clinical cohort (for the larynx pilot, 109/125). Some missing data were imputed using Bayesian methods. For both, the European PMs successfully predicted prognosis groups, but with some differences in practice reflected. For example, the PM-predicted good prognosis NSCLC group was differentiated from a combined medium/poor prognosis group (2YOS 69% vs. 27%, p<0.001). Stage was less discriminatory in identifying prognostic groups. In the good prognosis group two-year overall survival was 65% in curatively and 18% in palliatively treated patients. Conclusion: The technical infrastructure and basic European PMs support prognosis prediction for these Australian patient groups, showing promise for supporting future personalized treatment decisions

  10. SU-E-T-23: A Developing Australian Network for Datamining and Modelling Routine Radiotherapy Clinical Data and Radiomics Information for Rapid Learning and Clinical Decision Support

    International Nuclear Information System (INIS)

    Thwaites, D; Holloway, L; Bailey, M; Carolan, M; Miller, A; Barakat, S; Field, M; Delaney, G; Vinod, S; Dekker, A; Lustberg, T; Soest, J van; Walsh, S


    Purpose: Large amounts of routine radiotherapy (RT) data are available, which can potentially add clinical evidence to support better decisions. A developing collaborative Australian network, with a leading European partner, aims to validate, implement and extend European predictive models (PMs) for Australian practice and assess their impact on future patient decisions. Wider objectives include: developing multi-institutional rapid learning, using distributed learning approaches; and assessing and incorporating radiomics information into PMs. Methods: Two initial standalone pilots were conducted; one on NSCLC, the other on larynx, patient datasets in two different centres. Open-source rapid learning systems were installed, for data extraction and mining to collect relevant clinical parameters from the centres’ databases. The European DSSs were learned (“training cohort”) and validated against local data sets (“clinical cohort”). Further NSCLC studies are underway in three more centres to pilot a wider distributed learning network. Initial radiomics work is underway. Results: For the NSCLC pilot, 159/419 patient datasets were identified meeting the PM criteria, and hence eligible for inclusion in the curative clinical cohort (for the larynx pilot, 109/125). Some missing data were imputed using Bayesian methods. For both, the European PMs successfully predicted prognosis groups, but with some differences in practice reflected. For example, the PM-predicted good prognosis NSCLC group was differentiated from a combined medium/poor prognosis group (2YOS 69% vs. 27%, p<0.001). Stage was less discriminatory in identifying prognostic groups. In the good prognosis group two-year overall survival was 65% in curatively and 18% in palliatively treated patients. Conclusion: The technical infrastructure and basic European PMs support prognosis prediction for these Australian patient groups, showing promise for supporting future personalized treatment decisions

  11. Amplitude of foreshocks as a possible seismic precursor to earthquakes (United States)

    Lindh, A.G.


    In recent years, we have made significant progress in being able to recognize the long-range pattern of events that precede large earthquakes. For example, in a recent issue of the Earthquake Information Bulletin, we saw how the pioneering work of S.A. Fedotov of the U.S.S.R. in the Kamchatka-Kurile Islands region has been applied worldwide to forecast where large, shallow earthquakes might occur in the next decades. Indeed, such a "seismic gap" off the coast of Alaska was filled by the 1972 Sitka earthquake. Promising results are slowly accumulating from other techniques suggesting that intermediate-term precursors might also be seen: among these are tilt and geomagnetic anomalies and anomalous land uplift. But the crucial point remains that short-term precursors (days to hours) will be needed in many cases if there is to be a significant saving of lives.

  12. FEMA's Earthquake Incident Journal: A Web-Based Data Integration and Decision Support Tool for Emergency Management (United States)

    Jones, M.; Pitts, R.


    For emergency managers, government officials, and others who must respond to rapidly changing natural disasters, timely access to detailed information on the affected terrain, population and infrastructure is critical for planning, response and recovery operations. Accessing, analyzing and disseminating such disparate information in near real time are critical decision support components. However, finding a way to handle a variety of informative yet complex datasets poses a challenge when preparing for and responding to disasters. Here, we discuss the implementation of a web-based data integration and decision support tool for earthquakes developed by the Federal Emergency Management Agency (FEMA) as a solution to some of these challenges. While earthquakes are among the best-monitored and most thoroughly measured of natural hazards, the spatially broad impacts of shaking, ground deformation, landslides, liquefaction, and even tsunamis are extremely difficult to quantify without accelerated access to data, modeling, and analytics. This web-based application, dubbed the "Earthquake Incident Journal", provides real-time access to authoritative and event-specific data from external (e.g. US Geological Survey, NASA, state and local governments) and internal (FEMA) data sources. The journal includes a GIS-based model for exposure analytics, allowing FEMA to assess the severity of an event, estimate impacts to structures and population in near real time, and then apply planning factors to exposure estimates to answer questions such as: What geographic areas are impacted? Will federal support be needed? What resources are needed to support survivors? And which infrastructure elements or essential facilities are threatened? This presentation reviews the development of the Earthquake Incident Journal, detailing the data integration solutions, the methodology behind the GIS-based automated exposure model, and the planning factors as well as other analytical advances that
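    The core of an exposure model of the kind described is simple: intersect shaking severity with population, then multiply by planning factors. A toy sketch under stated assumptions; the area names, populations, MMI values and shelter factors below are all placeholders, not FEMA's actual data or factors:

```python
# Hypothetical exposure table: (area name, population, peak shaking in MMI).
exposure = [
    ("Downtown", 120_000, 8.0),
    ("Suburbs", 340_000, 6.5),
    ("Rural county", 55_000, 5.0),
]

# Assumed planning factors: fraction of exposed population needing
# shelter in each integer MMI band (illustrative values only).
SHELTER_FACTOR = {5: 0.00, 6: 0.01, 7: 0.05, 8: 0.15, 9: 0.30}

def shelter_demand(areas):
    """Estimate people needing shelter by applying a per-MMI-band
    planning factor to each area's exposed population."""
    total = 0.0
    for _, pop, mmi in areas:
        band = min(9, max(5, int(mmi)))  # clamp to the table's bands
        total += pop * SHELTER_FACTOR[band]
    return total
```

    The same pattern (exposure table × planning factor) answers the other questions in the abstract, such as resource or infrastructure needs.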

  13. Performances and recent evolutions of EMSC Real Time Information services (United States)

    Mazet-Roux, G.; Godey, S.; Bossu, R.


    The EMSC operates Real Time Earthquake Information services for the public and the scientific community which aim at providing rapid and reliable information on the seismicity of the Euro-Mediterranean region and on significant earthquakes worldwide. These services are based on parametric data rapidly provided by 66 seismological networks, which are automatically merged and processed at EMSC. A web page updated every minute displays a list and a map of the latest earthquakes as well as additional information such as location maps, moment tensor solutions or past regional seismicity. Since 2004, the performance and the popularity of these services have dramatically increased. The number of messages received from the contributors and the number of published events have been multiplied by 2 since 2004 and by 1.6 since 2005, respectively. The web traffic and the number of users of the Earthquake Notification Service (ENS) have been multiplied by 15 and 7, respectively. In terms of ENS performance, the median dissemination time for Euro-Med events is minutes in 2008. In order to further improve its performance, and especially the speed and robustness of the reception of real time data, EMSC has recently implemented software named QWIDS (Quake Watch Information Distribution System), which provides a quick and robust data exchange system through permanent TCP connections. Unlike emails, which can sometimes be delayed or lost, QWIDS is a true real-time communication system that ensures data delivery. In terms of hardware, EMSC implemented a highly available, dynamically load-balanced, redundant and scalable web server infrastructure, composed of two SUN T2000 servers and one F5 BIG-IP switch. This will allow coping with constantly increasing web traffic and the occurrence of huge peaks of traffic after widely felt earthquakes.
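    A persistent TCP feed such as the one described must reassemble discrete messages from a continuous byte stream. QWIDS internals are not documented here; the following is a generic sketch of the length-prefixed framing such real-time feeds typically use, not QWIDS's actual wire protocol:

```python
import struct

def frame(payload: bytes) -> bytes:
    """Prefix a message with its 4-byte big-endian length so the receiver
    can recover message boundaries from a TCP byte stream."""
    return struct.pack(">I", len(payload)) + payload

def deframe(buffer: bytes):
    """Extract all complete messages from a receive buffer; return the
    list of messages and the unconsumed remainder (a partial frame)."""
    msgs = []
    while len(buffer) >= 4:
        (n,) = struct.unpack(">I", buffer[:4])
        if len(buffer) < 4 + n:
            break  # frame not fully received yet
        msgs.append(buffer[4:4 + n])
        buffer = buffer[4 + n:]
    return msgs, buffer
```

    On top of such framing, a real system adds acknowledgements and reconnect-with-replay to guarantee delivery across connection drops.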

  14. Latin American contributions to the GEM’s Earthquake Consequences Database


    Cardona Arboleda, Omar Dario; Ordaz Schroeder, Mario Gustavo; Salgado Gálvez, Mario Andrés; Carreño Tibaduiza, Martha Liliana; Barbat Barbat, Horia Alejandro


    One of the projects of the Global Earthquake Model (GEM) was to develop a global earthquake consequences database (GEMECD), which serves both as an open, public repository of damage and loss data for different types of elements at a global level and as a benchmark for the development of vulnerability models that can capture specific characteristics of the affected countries. The online earthquake consequences database has information on 71 events where 14 correspond to events that occ...

  15. Introduction: seismology and earthquake engineering in Mexico and Central and South America. (United States)

    Espinosa, A.F.


    The results from seismological studies that are used by the engineering community are just one of the benefits obtained from research aimed at mitigating the earthquake hazard. In this issue of the Earthquake Information Bulletin, current programs in seismology and earthquake engineering, seismic networks, future plans and some of the cooperative programs with different international organizations are described by Latin-American seismologists. The article describes the development of seismology in Latin America and the seismological interest of the OAS. -P.N.Chroston

  16. Estimating shaking-induced casualties and building damage for global earthquake events: a proposed modelling approach (United States)

    So, Emily; Spence, Robin


    Recent earthquakes such as the Haiti earthquake of 12 January 2010 and the Qinghai earthquake of 14 April 2010 have highlighted the importance of rapid estimation of casualties after the event for humanitarian response. Both of these events resulted in surprisingly high numbers of deaths, injuries and survivors made homeless. In the Mw = 7.0 Haiti earthquake, over 200,000 people perished, with more than 300,000 reported injuries and 2 million made homeless. The Mw = 6.9 earthquake in Qinghai resulted in over 2,000 deaths, with a further 11,000 people suffering serious or moderate injuries and 100,000 people left homeless in this mountainous region of China. In such events relief efforts can benefit significantly from the rapid estimation and mapping of expected casualties. This paper contributes to ongoing global efforts to estimate probable earthquake casualties very rapidly after an earthquake has taken place. The analysis uses the assembled empirical damage and casualty data in the Cambridge Earthquake Impacts Database (CEQID) and explores data by event and across events to test the relationships of building and fatality distributions to the main explanatory variables of building type, building damage level and earthquake intensity. The prototype global casualty estimation model described here uses a semi-empirical approach that estimates damage rates for different classes of buildings present in the local building stock, and then relates fatality rates to the damage rates of each class of buildings. This approach accounts for the effect of the very different types of buildings (by climatic zone, urban or rural location, culture, income level, etc.) on casualties. The resulting casualty parameters were tested against the overall casualty data from several historical earthquakes in CEQID; a reasonable fit was found.
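    The semi-empirical structure described (stock share × damage rate × fatality rate per building class) can be sketched as follows. All building shares, fragility coefficients and fatality rates below are placeholders for illustration, not CEQID-derived values:

```python
# Illustrative building stock shares for a hypothetical town.
STOCK = {"adobe": 0.30, "unreinforced_masonry": 0.45, "rc_frame": 0.25}

def damage_rate(btype, intensity):
    """Assumed fragility: collapse probability rising with intensity above
    MMI 6; the quadratic shape and coefficients are placeholders."""
    fragility = {"adobe": 0.10, "unreinforced_masonry": 0.05, "rc_frame": 0.02}
    return min(1.0, fragility[btype] * max(0.0, intensity - 6.0) ** 2)

# Assumed probability of death given collapse of each class.
FATALITY_GIVEN_COLLAPSE = {"adobe": 0.06,
                           "unreinforced_masonry": 0.10,
                           "rc_frame": 0.20}

def expected_fatality_rate(intensity):
    """Sum over classes: stock share x collapse rate x fatality rate
    given collapse, giving the expected population fatality fraction."""
    return sum(share * damage_rate(b, intensity) * FATALITY_GIVEN_COLLAPSE[b]
               for b, share in STOCK.items())
```

    Multiplying the resulting rate by the exposed population yields the rapid casualty estimate the abstract describes.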

  17. «Unknown» earthquakes: a growing contribution to the Catalogue of Strong Italian Earthquakes

    Directory of Open Access Journals (Sweden)

    E. Guidoboni


    The particular structure of the research into historical seismology found in this catalogue has allowed a great deal of information about unknown seismic events to be traced. This new contribution to seismological knowledge mainly consists of: (i) the retrieval and organisation within a coherent framework of documentary evidence of earthquakes that took place between the Middle Ages and the sixteenth century; (ii) the improved knowledge of seismic events, even destructive events, which in the past had been "obscured" by large earthquakes; (iii) the identification of earthquakes in "silent" seismic areas. The complex elements to be taken into account when dealing with unknown seismic events have been outlined; much "new" information often falls into one of the following categories: simple chronological errors relative to other well-known events; descriptions of other natural phenomena, though defined in texts as "earthquakes" (landslides, hurricanes, tornadoes, etc.); unknown tremors belonging to known seismic periods; tremors that may be connected with events which have been catalogued under incorrect dates and with very approximate estimates of location and intensity. This proves that this was not a real seismic "silence" but a research vacuum.

  18. Global volcanic earthquake swarm database and preliminary analysis of volcanic earthquake swarm duration

    Directory of Open Access Journals (Sweden)

    S. R. McNutt


    Global data from 1979 to 1989 pertaining to volcanic earthquake swarms have been compiled into a custom-designed relational database. The database is composed of three sections: (1) a section containing general information on volcanoes, (2) a section containing earthquake swarm data (such as dates of swarm occurrence and durations), and (3) a section containing eruption information. The most abundant and reliable parameter, duration of volcanic earthquake swarms, was chosen for preliminary analysis. The distribution of all swarm durations was found to have a geometric mean of 5.5 days. Precursory swarms were then separated from those not associated with eruptions. The geometric mean precursory swarm duration was 8 days, whereas the geometric mean duration of swarms not associated with eruptive activity was 3.5 days. Two groups of precursory swarms are apparent when duration is compared with the eruption repose time. Swarms with durations shorter than 4 months showed no clear relationship with the eruption repose time. However, the second group, lasting longer than 4 months, showed a significant positive correlation with the log10 of the eruption repose period. The two groups suggest that different suites of physical processes are involved in the generation of volcanic earthquake swarms.
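    The geometric mean used for the swarm-duration statistics above is simply the exponential of the mean of the logarithms, which is appropriate for right-skewed quantities spanning orders of magnitude. A minimal implementation:

```python
import math

def geometric_mean(durations):
    """Geometric mean: exp of the arithmetic mean of the logs. Suited to
    right-skewed data such as swarm durations, where a few very long
    swarms would dominate an ordinary arithmetic mean."""
    return math.exp(sum(math.log(d) for d in durations) / len(durations))
```

    For example, durations of 1 day and 100 days have an arithmetic mean of 50.5 days but a geometric mean of 10 days, which better represents the typical order of magnitude.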

  19. Earthquake Loss Scenarios: Warnings about the Extent of Disasters (United States)

    Wyss, M.; Tolis, S.; Rosset, P.


    It is imperative that losses expected due to future earthquakes be estimated. Officials and the public need to be aware of what disaster is likely in store for them in order to reduce fatalities and efficiently help the injured. Scenarios for earthquake parameters can be constructed to reasonable accuracy in highly active earthquake belts, based on knowledge of seismotectonics and history. Because of the inherent uncertainties of loss estimates, however, it would be desirable that more than one group calculate an estimate for the same area. By discussing these estimates, one may find a consensus on the range of the potential disasters and persuade officials and residents of the reality of the earthquake threat. To model a scenario and estimate earthquake losses requires sufficiently accurate data sets on the number of people present, the built environment, and, if possible, the transmission of seismic waves. As examples we use loss estimates for possible repeats of historic earthquakes in Greece that occurred between 464 B.C. and A.D. 700. We model future large Greek earthquakes as having M6.8 and rupture lengths of 60 km. In four locations where historic earthquakes with serious losses have occurred, we estimate that 1,000 to 1,500 people might perish, with roughly four times as many injured. Defining the area of influence of these earthquakes as that with shaking intensities greater than or equal to V, we estimate that 1.0 to 2.2 million people in about 2,000 settlements may be affected. We calibrate the QLARM tool for calculating intensities and losses in Greece using the M6 1999 Athens earthquake and matching the isoseismal information for six earthquakes which occurred in Greece during the last 140 years. Comparing fatality numbers that would occur theoretically today with the numbers reported, and correcting for the increase in population, we estimate that the improvement of the building stock has reduced the mortality and injury rate in Greek
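    Counting the settlements shaken at or above intensity V, as done above, requires an intensity attenuation relation. A toy sketch of the calculation; the attenuation form and coefficients here are illustrative assumptions, not QLARM's calibrated relations:

```python
import math

def intensity_at(i0, r_km, a=3.0, r0=10.0):
    """Toy attenuation: epicentral intensity minus a log-distance decay
    term, held flat inside a near-field radius r0. Coefficients are
    illustrative, not QLARM's calibration for Greece."""
    return i0 - a * math.log10(max(r_km, r0) / r0)

def affected(settlements, i0, threshold=5.0):
    """Return (name, population) for settlements shaken at or above the
    threshold intensity (MMI V here), given (name, pop, distance_km)."""
    return [(name, pop) for name, pop, r in settlements
            if intensity_at(i0, r) >= threshold]
```

    Summing the populations of the returned settlements gives the "affected population" figure quoted in such scenarios.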

  20. A mixed-methods study on perceptions towards use of Rapid Ethical Assessment to improve informed consent processes for health research in a low-income setting. (United States)

    Addissie, Adamu; Davey, Gail; Newport, Melanie J; Addissie, Thomas; MacGregor, Hayley; Feleke, Yeweyenhareg; Farsides, Bobbie


    Rapid Ethical Assessment (REA) is a form of rapid ethnographic assessment conducted at the beginning of a research project to guide the consent process, with the objective of reconciling universal ethical guidance with specific research contexts. The current study was conducted to assess the perceived relevance of introducing REA as a mainstream tool in Ethiopia. Mixed-methods research using a sequential explanatory approach was conducted from July to September 2012, including 241 cross-sectional, self-administered and 19 qualitative, in-depth interviews among health researchers and regulators, including ethics committee members, in Ethiopian health research institutions and universities. In their evaluation of the consent process, only 40.2% thought that the consent process and information given were adequately understood by study participants; 84.6% claimed they were not satisfied with the current consent process and 85.5% thought the best interests of study participants were not adequately considered. Commonly mentioned consent-related problems included lack of clarity (48.1%), inadequate information (34%), language barriers (28.2%), cultural differences (27.4%), undue expectations (26.6%) and power imbalances (20.7%). About 95.4% believed that consent should be contextualized to the study setting and 39.4% thought REA would be an appropriate approach to improve the perceived problems. Qualitative findings helped to further explore the gaps identified in the quantitative findings and to map out concerns related to the current research consent process in Ethiopia. Suggestions included conducting REA during the pre-test (pilot) phase of studies when applicable. The need for clear guidance for researchers on issues such as when and how to apply the REA tools was stressed. The study findings clearly indicated that there are perceived to be correctable gaps in the consent process of medical research in Ethiopia. REA is considered relevant by researchers and stakeholders.

  1. Near Real-Time Georeference of Umanned Aerial Vehicle Images for Post-Earthquake Response (United States)

    Wang, S.; Wang, X.; Dou, A.; Yuan, X.; Ding, L.; Ding, X.


    The rapid collection of Unmanned Aerial Vehicle (UAV) remote sensing images plays an important role in quickly submitting disaster information and monitoring seriously damaged objects after an earthquake. However, for the hundreds of UAV images collected in one flight sortie, the traditional data processing methods are image stitching and three-dimensional reconstruction, which take one to several hours and slow disaster response. If manual searching is employed instead, much more time is spent selecting images, and the images found have no spatial reference. Therefore, a near-real-time rapid georeference method for UAV remote sensing disaster data is proposed in this paper. The UAV images are georeferenced using the position and attitude data collected by the UAV flight control system, and the georeferenced data are organized by means of the world file format developed by ESRI. The C# language was used, together with the Geospatial Data Abstraction Library (GDAL), to build the rapid UAV image georeference software. The results show that the method can georeference up to one thousand UAV remote sensing disaster images within one minute, meeting the demand of rapid disaster response, which is of great value in disaster emergency applications.
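    The ESRI world file mentioned above is a six-line sidecar text file holding the affine transform that places an image in map coordinates. The paper's software derives these values from UAV position and attitude; the sketch below (in Python rather than the paper's C#) only shows the file format itself, with a north-up image and no rotation assumed:

```python
def world_file(pixel_size, ulx, uly, rotation=0.0):
    """Build the six-line ESRI world file content that georeferences an
    image: x pixel size, row rotation, column rotation, negative y pixel
    size, then the map coordinates of the centre of the upper-left pixel."""
    lines = [pixel_size, rotation, rotation, -pixel_size, ulx, uly]
    return "\n".join(f"{v:.10f}" for v in lines) + "\n"
```

    Saved next to `IMG_0001.jpg` as `IMG_0001.jgw`, such a file lets GIS software display the image in place with no further processing, which is why the approach is so fast.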



  3. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake (United States)

    Jones, Lucile M.


    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
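    The dilution effect described can be illustrated numerically: the foreshock probability is the foreshock rate divided by the total rate of candidate events, where the aftershock contribution decays following the modified Omori law. This is a toy illustration with invented rate parameters, not the paper's actual equation:

```python
def foreshock_probability(rate_foreshock, rate_background,
                          k=10.0, c=0.1, p=1.0, t_days=1.0):
    """Chance that a small event near the fault is a foreshock, diluted
    by Omori-decaying aftershocks of a previous mainshock. All rates are
    per day; k, c, p are toy modified-Omori parameters."""
    rate_aftershock = k / (t_days + c) ** p   # modified Omori law
    return rate_foreshock / (rate_foreshock + rate_background + rate_aftershock)
```

    As `t_days` grows, the aftershock rate decays and the foreshock probability recovers toward its background value, matching the behaviour described in the abstract.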

  4. Key stakeholder perceptions about consent to participate in acute illness research: a rapid, systematic review to inform epi/pandemic research preparedness. (United States)

    Gobat, Nina H; Gal, Micaela; Francis, Nick A; Hood, Kerenza; Watkins, Angela; Turner, Jill; Moore, Ronald; Webb, Steve A R; Butler, Christopher C; Nichol, Alistair


    A rigorous research response is required to inform clinical and public health decision-making during an epi/pandemic. However, the ethical conduct of such research, which often involves critically ill patients, may be complicated by diminished capacity to consent and an imperative to initiate trial therapies within short time frames. Alternative approaches to taking prospective informed consent may therefore be used. We aimed to rapidly review evidence on key stakeholder (patients, their proxy decision-makers, clinicians and regulators) views concerning the acceptability of various approaches for obtaining consent relevant to pandemic-related acute illness research. We conducted a rapid evidence review, using Internet, database and hand-searches for English-language empirical publications from 1996 to 2014 on stakeholder opinions of consent models (prospective informed, third-party, deferred, or waived) used in acute illness research. We excluded research on consent to treatment, screening, or other such procedures, non-emergency research and secondary studies. Papers were categorised, and data summarised using narrative synthesis. We screened 689 citations, reviewed 104 full-text articles and included 52. Just one paper related specifically to pandemic research. In other emergency research contexts, potential research participants, clinicians and research staff found third-party, deferred, and waived consent to be acceptable means to feasibly conduct such research. Acceptability to potential participants was motivated by altruism, trust in the medical community, and perceived value in medical research, and decreased as the perceived risks associated with participation increased. Discrepancies were observed in the acceptability of the concept versus the application or experience of alternative consent models. Patients accepted clinicians acting as proxy decision-makers, with a preference for two decision-makers as the invasiveness of interventions increased.

  5. The SCEC/USGS dynamic earthquake rupture code verification exercise (United States)

    Harris, R.A.; Barall, M.; Archuleta, R.; Dunham, E.; Aagaard, Brad T.; Ampuero, J.-P.; Bhat, H.; Cruz-Atienza, Victor M.; Dalguer, L.; Dawson, P.; Day, S.; Duan, B.; Ely, G.; Kaneko, Y.; Kase, Y.; Lapusta, N.; Liu, Yajing; Ma, S.; Oglesby, D.; Olsen, K.; Pitarka, A.; Song, S.; Templeton, E.


    Numerical simulations of earthquake rupture dynamics are now common, yet it has been difficult to test the validity of these simulations because there have been few field observations and no analytic solutions with which to compare the results. This paper describes the Southern California Earthquake Center/U.S. Geological Survey (SCEC/USGS) Dynamic Earthquake Rupture Code Verification Exercise, where codes that simulate spontaneous rupture dynamics in three dimensions are evaluated and the results produced by these codes are compared using Web-based tools. This is the first time that a broad and rigorous examination of numerous spontaneous rupture codes has been performed, a significant advance in this science. The automated process developed to attain this achievement provides for a future where testing of codes is easily accomplished. Scientists who use computer simulations to understand earthquakes utilize a range of techniques. Most of these assume that earthquakes are caused by slip at depth on faults in the Earth, but thereafter the strategies vary. Among the methods used in earthquake mechanics studies are kinematic approaches and dynamic approaches. The kinematic approach uses a computer code that prescribes the spatial and temporal evolution of slip on the causative fault (or faults). These types of simulations are very helpful, especially since they can be used in seismic data inversions to relate the ground motions recorded in the field to slip on the fault(s) at depth. However, these kinematic solutions generally provide no insight into the physics driving the fault slip or information about why the involved fault(s) slipped that much (or that little). In other words, these kinematic solutions may lack information about the physical dynamics of earthquake rupture that will be most helpful in forecasting future events. To help address this issue, some researchers use computer codes to numerically simulate earthquakes and construct dynamic, spontaneous

  6. Generation of earthquake signals

    International Nuclear Information System (INIS)

    Kjell, G.


    Seismic verification can be performed either as a full-scale test on a shaker table or as numerical calculations. In both cases it is necessary to have an earthquake acceleration time history. This report describes the generation of such time histories by filtering white noise. Analogue and digital filtering methods are compared. Different methods of predicting the response spectrum of a white noise signal filtered by a band-pass filter are discussed. Prediction of both the average response level and the statistical variation around this level is considered. Examples with both the IEEE 301 standard response spectrum and a ground spectrum suggested for Swedish nuclear power stations are included in the report.
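    The basic filtered-white-noise construction described above can be sketched in a few lines: band-limit Gaussian noise in the frequency domain, then apply a rise-and-decay envelope. This is a minimal illustration, not the report's method; a spectrum-compatible signal would additionally be iterated against a target response spectrum:

```python
import numpy as np

def synthetic_accelerogram(n=4096, fs=100.0, band=(1.0, 10.0), seed=0):
    """White noise band-pass filtered in the frequency domain, shaped by a
    build-up-then-decay envelope to mimic an earthquake time history."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(n)
    spec = np.fft.rfft(noise)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spec[(freqs < band[0]) | (freqs > band[1])] = 0.0  # ideal band-pass
    filtered = np.fft.irfft(spec, n)
    t = np.arange(n) / fs
    envelope = t * np.exp(-t / 5.0)  # rise then exponential decay
    return t, filtered * envelope
```

    Analogue filtering, as compared in the report, would instead shape the noise with a physical band-pass circuit; the statistical character of the result is similar.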

  7. Earthquakes Threaten Many American Schools (United States)

    Bailey, Nancy E.


    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  8. Make an Earthquake: Ground Shaking! (United States)

    Savasci, Funda


    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  9. Earthquake Catalogue of the Caucasus (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.


    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain unified magnitude
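    Unifying magnitudes around the moment magnitude, as done for the catalogue, ultimately rests on the standard relation between Mw and the seismic moment M0. The conversion itself is shown below; the catalogue's actual recalculation (estimating M0 from waveforms) is of course more involved:

```python
import math

def moment_magnitude(m0_newton_metres):
    """IASPEI standard moment magnitude: Mw = (2/3) * (log10(M0) - 9.1),
    with the seismic moment M0 in newton-metres."""
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)
```

    Because Mw is defined from the physical moment rather than a particular instrument response, it does not saturate for large events, which is what makes it the natural common scale for a merged catalogue.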

  10. Seismic resistance of equipment and building service systems: review of earthquake damage design requirements, and research applications in the USA

    International Nuclear Information System (INIS)

    Skjei, R.E.; Chakravartula, B.C.; Yanev, P.I.


    The history of earthquake damage and the resulting code design requirements for earthquake hazard mitigation for equipment in the USA are reviewed. Earthquake damage to essential service systems is summarized; observations from the 1964 Alaska and the 1971 San Fernando, California, earthquakes are stressed, and information from other events is included. USA building codes that reflect lessons learned from these earthquakes are discussed; brief summaries of widely used codes are presented. In conclusion, there is a discussion of the desirability of adapting advanced technological concepts from the nuclear industry to equipment in conventional structures. (author)

  11. The Quake-Catcher Network: Improving Earthquake Strong Motion Observations Through Community Engagement (United States)

    Cochran, E. S.; Lawrence, J. F.; Christensen, C. M.; Chung, A. I.; Neighbors, C.; Saltzman, J.


    The Quake-Catcher Network (QCN) involves the community in strong motion data collection by utilizing volunteer computing techniques and low-cost MEMS accelerometers. Volunteer computing provides a mechanism to expand strong-motion seismology with minimal infrastructure costs, while promoting community participation in science. Micro-Electro-Mechanical Systems (MEMS) triaxial accelerometers can be attached to a desktop computer via USB and are internal to many laptops. Preliminary shake table tests show the MEMS accelerometers can record high-quality seismic data with instrument response similar to research-grade strong-motion sensors. QCN began distributing sensors and software to K-12 schools and the general public in April 2008 and has grown to roughly 1500 stations worldwide. We also recently tested whether sensors could be quickly deployed as part of a Rapid Aftershock Mobilization Program (RAMP) following the 2010 M8.8 Maule, Chile earthquake. Volunteers are recruited through media reports, web-based sensor request forms, as well as social networking sites. Using data collected to date, we examine whether a distributed sensing network can provide valuable seismic data for earthquake detection and characterization while promoting community participation in earthquake science. We utilize client-side triggering algorithms to determine when significant ground shaking occurs and this metadata is sent to the main QCN server. On average, trigger metadata are received within 1-10 seconds from the observation of a trigger; the larger data latencies are correlated with greater server-station distances. When triggers are detected, we determine if the triggers correlate to others in the network using spatial and temporal clustering of incoming trigger information. If a minimum number of triggers are detected then a QCN-event is declared and an initial earthquake location and magnitude is estimated. Initial analysis suggests that the estimated locations and magnitudes are
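    The server-side event declaration described above amounts to spatio-temporal clustering of incoming triggers. A greedy toy version is sketched below; the window, radius and trigger-count thresholds are illustrative, not QCN's operational values:

```python
def declare_events(triggers, window_s=10.0, radius_deg=2.0, min_triggers=4):
    """Greedy spatio-temporal clustering of (time_s, lat, lon) triggers:
    an event is declared when enough stations trigger close together in
    time and space. Thresholds here are illustrative placeholders."""
    triggers = sorted(triggers)
    events, used = [], set()
    for i, (t0, la0, lo0) in enumerate(triggers):
        if i in used:
            continue
        cluster = [j for j, (t, la, lo) in enumerate(triggers)
                   if j not in used and abs(t - t0) <= window_s
                   and abs(la - la0) <= radius_deg
                   and abs(lo - lo0) <= radius_deg]
        if len(cluster) >= min_triggers:
            used.update(cluster)
            events.append([triggers[j] for j in cluster])
    return events
```

    Requiring several correlated triggers is what suppresses the false alarms that a single bumped laptop would otherwise produce.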

  12. Proceedings of the 11th United States-Japan natural resources panel for earthquake research, Napa Valley, California, November 16–18, 2016 (United States)

    Detweiler, Shane; Pollitz, Fred


    The UJNR Panel on Earthquake Research promotes advanced research toward a more fundamental understanding of the earthquake process and hazard estimation. The Eleventh Joint meeting was extremely beneficial in furthering cooperation and deepening understanding of problems common to both Japan and the United States. The meeting included productive exchanges of information on approaches to systematic observation and modeling of earthquake processes. Regarding the earthquake and tsunami of March 2011 off the Pacific coast of Tohoku and the 2016 Kumamoto earthquake sequence, the Panel recognizes that further efforts are necessary to achieve our common goal of reducing earthquake risk through close collaboration and focused discussions at the 12th UJNR meeting.

  13. Structural performance of the DOE's Idaho National Engineering Laboratory during the 1983 Borah Peak Earthquake

    International Nuclear Information System (INIS)

    Guenzler, R.C.; Gorman, V.W.


    The 1983 Borah Peak Earthquake (Richter magnitude 7.3) was the largest earthquake ever experienced by the DOE's Idaho National Engineering Laboratory (INEL). Reactor and plant facilities are generally located about 90 to 110 km (55 to 70 miles) from the epicenter. Several reactors were operating normally at the time of the earthquake. Based on detailed inspections, comparisons of measured accelerations with design levels, and instrumental seismograph information, it was concluded that the 1983 Borah Peak Earthquake created no safety problems for INEL reactors or other facilities. 10 references, 16 figures, 2 tables

  14. Improving the extraction of crisis information in the context of flood, fire, and landslide rapid mapping using SAR and optical remote sensing data (United States)

    Martinis, Sandro; Clandillon, Stephen; Twele, André; Huber, Claire; Plank, Simon; Maxant, Jérôme; Cao, Wenxi; Caspard, Mathilde; May, Stéphane


    Optical and radar satellite remote sensing have proven, in a growing number of cases, to provide essential crisis information for natural disasters, humanitarian relief activities and civil security issues through mechanisms such as the Copernicus Emergency Management Service (EMS) of the European Commission or the International Charter 'Space and Major Disasters'. The aforementioned programs and initiatives make use of satellite-based rapid mapping services aimed at delivering reliable and accurate crisis information after natural hazards. Although these services are increasingly operational, they need to be continuously updated and improved through research and development (R&D) activities. The principal objective of ASAPTERRA (Advancing SAR and Optical Methods for Rapid Mapping), the ESA-funded R&D project described here, is to improve, automate and, hence, speed up geo-information extraction procedures in the context of natural hazards response. This is performed through the development, implementation, testing and validation of novel image processing methods using optical and Synthetic Aperture Radar (SAR) data. The methods are mainly developed based on data of the German radar satellites TerraSAR-X and TanDEM-X, the French satellite missions Pléiades-1A/1B as well as the ESA missions Sentinel-1/2, with the aim to better characterize the potential and limitations of these sensors and their synergy. The resulting algorithms and techniques are evaluated in real case applications during rapid mapping activities. The project is focused on three types of natural hazards: floods, landslides and fires. This presentation gives an overview of the main methodological developments for each hazard type, demonstrated in selected test areas.
The following developments are presented in the context of flood mapping: a fully automated Sentinel-1 based processing chain for detecting open flood surfaces, a method for the improved detection of flooded vegetation
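A common building block of automated SAR flood chains like the one mentioned above is histogram thresholding of the backscatter image, since open water appears dark in SAR amplitude data. The sketch below uses Otsu's method as one illustrative choice of automatic threshold; it is not the project's actual algorithm, and real chains add terrain masking, tiling, and post-classification refinement.

```python
import numpy as np

def otsu_threshold(values, nbins=256):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=nbins)
    hist = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(hist)                  # weight of class below threshold
    w1 = 1.0 - w0                         # weight of class above threshold
    m = np.cumsum(hist * centers)         # cumulative mean
    mt = m[-1]                            # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        var_between = (mt * w0 - m) ** 2 / (w0 * w1)
    var_between[~np.isfinite(var_between)] = 0.0
    return centers[np.argmax(var_between)]

def flood_mask(backscatter_db):
    """Classify pixels darker than the automatic threshold as open water."""
    t = otsu_threshold(backscatter_db.ravel())
    return backscatter_db < t
```

On a scene with a clear bimodal histogram (dark water, brighter land) the threshold falls between the two modes and the mask isolates the low-backscatter class.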

  15. Assessment of earthquake-induced tsunami hazard at a power plant site

    International Nuclear Information System (INIS)

    Ghosh, A.K.


    This paper presents a study of the tsunami hazard due to submarine earthquakes at a power plant site on the east coast of India. The paper considers various sources of earthquakes from the tectonic information and from records of past earthquakes and tsunamis. A magnitude-frequency relationship for the earthquake occurrence rate and a simplified model for tsunami run-up height as a function of earthquake magnitude and the distance between the source and site have been developed. Finally, considering equal likelihood of generation of earthquakes anywhere on each of the faults, the tsunami hazard has been evaluated and presented as a relationship between tsunami height and its mean recurrence interval (MRI). The probability of exceedance of a certain wave height in a given period of time is also presented. These studies will be helpful in making an estimate of the tsunami-induced flooding potential at the site.
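The two quantities this abstract relates — a magnitude-frequency relationship and the exceedance probability implied by a mean recurrence interval — can be made concrete under standard assumptions: a Gutenberg-Richter occurrence law and a Poisson arrival model, for which P(at least one exceedance in t years) = 1 - exp(-t/MRI). The coefficient values below are hypothetical, not the paper's.

```python
import math

def gr_rate(m, a=4.0, b=1.0):
    """Annual rate of earthquakes with magnitude >= m under a
    Gutenberg-Richter relation log10 N = a - b*m (a, b hypothetical)."""
    return 10.0 ** (a - b * m)

def exceedance_probability(mri_years, exposure_years):
    """Poisson model: probability of at least one exceedance, within the
    exposure time, of the wave height whose mean recurrence interval
    is mri_years."""
    rate = 1.0 / mri_years
    return 1.0 - math.exp(-rate * exposure_years)
```

For example, a wave height with a 1000-year MRI has roughly a 5% chance of being exceeded at least once during a 50-year plant life.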

  16. Overview of the critical disaster management challenges faced during Van 2011 earthquakes. (United States)

    Tolon, Mert; Yazgan, Ufuk; Ural, Derin N; Goss, Kay C


    On October 23, 2011, a M7.2 earthquake caused damage in a widespread area in the Van province located in eastern Turkey. This strong earthquake was followed by a M5.7 earthquake on November 9, 2011. This sequence of damaging earthquakes led to 644 fatalities. Managing the response during and after these earthquake disasters posed many critical challenges. In this article, an overview of these challenges is presented based on the observations by the authors in the aftermath of this disaster. This article presents the characteristics of the 2011 Van earthquakes. Afterward, the key information related to the four main phases (i.e., preparedness, mitigation, response, and recovery) of the disaster in Van is presented. The potential strategies that can be taken to improve disaster management practice are identified, and a set of recommendations is proposed to improve the existing situation.

  17. Areas prone to slow slip events impede earthquake rupture propagation and promote afterslip (United States)

    Rolandone, Frederique; Nocquet, Jean-Mathieu; Mothes, Patricia A.; Jarrin, Paul; Vallée, Martin; Cubas, Nadaya; Hernandez, Stephen; Plain, Morgan; Vaca, Sandro; Font, Yvonne


    At subduction zones, transient aseismic slip occurs either as afterslip following a large earthquake or as episodic slow slip events during the interseismic period. Afterslip and slow slip events are usually considered as distinct processes occurring on separate fault areas governed by different frictional properties. Continuous GPS (Global Positioning System) measurements following the 2016 Mw (moment magnitude) 7.8 Ecuador earthquake reveal that large and rapid afterslip developed at discrete areas of the megathrust that had previously hosted slow slip events. Regardless of whether they were locked or not before the earthquake, these areas appear to persistently release stress by aseismic slip throughout the earthquake cycle and outline the seismic rupture, an observation potentially leading to a better anticipation of future large earthquakes. PMID:29404404

  18. Impacts of the 2010 Haitian earthquake in the diaspora: findings from Little Haiti, Miami, FL. (United States)

    Kobetz, Erin; Menard, Janelle; Kish, Jonathan; Bishop, Ian; Hazan, Gabrielle; Nicolas, Guerda


    In January 2010, a massive earthquake struck Haiti, resulting in unprecedented damage. Little attention, however, has focused on the earthquake's mental health impact in the Haitian diaspora community. As part of an established community-based participatory research initiative in Little Haiti, the predominately Haitian neighborhood in Miami, FL, USA, community health workers conducted surveys with neighborhood residents about earthquake-related losses, coping strategies, and depressive/traumatic symptomology. Findings reveal the earthquake strongly impacted the diaspora community and highlight prominent coping strategies. Following the earthquake, only a small percentage of participants self-reported engaging in any negative health behaviors. Instead, a majority relied on their social networks for support. This study contributes to the discourse on designing culturally responsive mental health initiatives for the Haitian diaspora and the ability of existing community-academic partnerships to rapidly adapt to community needs.

  19. Toward a comprehensive areal model of earthquake-induced landslides (United States)

    Miles, S.B.; Keefer, D.K.


    This paper provides a review of regional-scale modeling of earthquake-induced landslide hazard with respect to the needs for disaster risk reduction and sustainable development. Based on this review, it sets out important research themes and suggests computing with words (CW), a methodology that includes fuzzy logic systems, as a fruitful modeling methodology for addressing many of these research themes. A range of research, reviewed here, has been conducted applying CW to various aspects of earthquake-induced landslide hazard zonation, but none facilitate comprehensive modeling of all types of earthquake-induced landslides. A new comprehensive areal model of earthquake-induced landslides (CAMEL) is introduced here that was developed using fuzzy logic systems. CAMEL provides an integrated framework for modeling all types of earthquake-induced landslides using geographic information systems. CAMEL is designed to facilitate quantitative and qualitative representation of terrain conditions and knowledge about these conditions on the likely areal concentration of each landslide type. CAMEL is highly modifiable and adaptable; new knowledge can be easily added, while existing knowledge can be changed to better match local knowledge and conditions. As such, CAMEL should not be viewed as a complete alternative to other earthquake-induced landslide models. CAMEL provides an open framework for incorporating other models, such as Newmark's displacement method, together with previously incompatible empirical and local knowledge. © 2009 ASCE.

  20. Guidelines for nuclear plant response to an earthquake

    International Nuclear Information System (INIS)


    Guidelines have been developed to assist nuclear plant personnel in the preparation of earthquake response procedures for nuclear power plants. The objectives of the earthquake response procedures are to determine (1) the immediate effects of an earthquake on the physical condition of the nuclear power plant, (2) whether shutdown of the plant is appropriate based on the observed damage to the plant or because the operating basis earthquake (OBE) has been exceeded, and (3) the readiness of the plant to resume operation following shutdown due to an earthquake. Readiness of a nuclear power plant to restart is determined on the basis of visual inspections of nuclear plant equipment and structures, and the successful completion of surveillance tests which demonstrate that the limiting conditions for operation as defined in the plant Technical Specifications are met. The guidelines are based on information obtained from a review of earthquake response procedures from numerous US and foreign nuclear power plants, interviews with nuclear plant operations personnel, and a review of reports of damage to industrial equipment and structures in actual earthquakes. 7 refs., 4 figs., 4 tabs

  1. Electrical streaming potential precursors to catastrophic earthquakes in China

    Directory of Open Access Journals (Sweden)

    F. Qian


    The majority of self-potential anomalies at 7 stations within 160 km of the epicentre showed a similar pattern of rapid onset and slow decay before and during the M 7.8 Tangshan earthquake of 1976. Given that some of these anomalies were associated with episodic spouting from boreholes or with increases in pore pressure in wells, the observed anomalies are interpreted as streaming potentials generated by local episodes of sudden movement and the diffusion of high-pressure fluid in parallel faults. These transient events, triggered by tidal forces, exhibited a periodic nature and the statis