WorldWideScience

Sample records for rapid earthquake information

  1. A digital social network for rapid collection of earthquake disaster information

    Science.gov (United States)

    Xu, J. H.; Nie, G. Z.; Xu, X.

    2013-02-01

    Acquiring disaster information quickly after an earthquake is crucial for disaster and emergency rescue management. This study examines a digital social network - an earthquake disaster information reporting network - for the rapid collection of earthquake disaster information, and expounds the collection method based on this network. The structure and components of the reporting network are introduced, followed by its working principles: disaster information is reported rapidly via Global System for Mobile Communications (GSM) messages and analysed with a Geographic Information System (GIS) to extract useful disaster information. The study introduces key supporting technologies, including methods for the mass sending and receiving of SMS for disaster management, the grouping management of the reporting network, brief disaster-information codes, and GIS modelling of the reporting network. Finally, a city earthquake disaster information quick-reporting system was developed; supported by this system, the reporting network obtained good results in a real earthquake and in earthquake drills. The method is a semi-real-time disaster information collection approach that extends current SMS-based methods and meets the needs of small and some moderate earthquakes.
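    A hypothetical sketch of how such a "brief disaster information code" arriving by SMS might be parsed before GIS analysis. The field layout (reporter id; map grid cell; damage level) and the `parse_sms` helper are invented for illustration, not the paper's actual format.

```python
# Hypothetical sketch: parsing a brief disaster-information code arriving as
# a GSM/SMS message. The semicolon-separated layout below is an assumption.
from dataclasses import dataclass

@dataclass
class DisasterReport:
    reporter_id: str   # registered member of the reporting network
    grid_cell: str     # pre-agreed grid reference, resolvable in the GIS
    damage_level: int  # assumed scale: 0 = none ... 4 = severe

def parse_sms(text: str) -> DisasterReport:
    """Parse a semicolon-separated report such as 'R012;G-34;3'."""
    reporter, cell, level = text.strip().split(";")
    return DisasterReport(reporter, cell, int(level))

report = parse_sms("R012;G-34;3")
print(report.grid_cell, report.damage_level)  # -> G-34 3
```

    A fixed, compact code keeps each report inside a single SMS and makes bulk ingestion trivial.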

  2. How citizen seismology is transforming rapid public earthquake information and interactions between seismologists and society

    Science.gov (United States)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Fréderic; Etivant, Caroline

    2015-04-01

    Historical earthquakes are only known to us through written recollections, and so seismologists have long experience of interpreting the reports of eyewitnesses, which probably explains why seismology has been a pioneer in crowdsourcing and citizen science. Today, the Internet is transforming this situation; it can be considered a digital nervous system of digital veins and intertwined sensors that capture the pulse of our planet in near real time. How can both seismology and the public benefit from this new monitoring system? This paper presents the strategy implemented at the Euro-Mediterranean Seismological Centre (EMSC) to leverage this new nervous system to detect and diagnose the impact of earthquakes within minutes rather than hours, and how it has transformed information systems and interactions with the public. We show how social network monitoring and flashcrowds (massive traffic increases on the EMSC website) are used to automatically detect felt earthquakes before seismic detections, how damaged areas can be mapped through the concomitant loss of Internet sessions (visitors being disconnected), and the benefit of collecting felt reports and geolocated pictures to further constrain rapid impact assessment of global earthquakes. We also describe how public expectations within tens of seconds of ground shaking underpin improved, diversified information tools that integrate this user-generated content. Special attention is given to LastQuake, the most complex and sophisticated Twitter QuakeBot, smartphone application and browser add-on, which deals with the only earthquakes that matter for the public: felt and damaging earthquakes. In conclusion, we demonstrate that eyewitnesses are today real-time earthquake sensors and active actors in rapid earthquake information.

  3. Twitter as Information Source for Rapid Damage Estimation after Major Earthquakes

    Science.gov (United States)

    Eggert, Silke; Fohringer, Joachim

    2014-05-01

    Natural disasters like earthquakes require a fast response from local authorities. Well-trained rescue teams have to be available, equipment and technology have to be set up and ready, and information has to be directed to the right positions so that headquarters can manage the operation precisely. The main goal is to reach the most affected areas in a minimum of time. But even with the best preparation, there will always be uncertainty about what really happened in the affected area. Modern geophysical sensor networks provide high-quality data; these measurements, however, only map disjoint values at their respective locations for a limited number of parameters. Using the observations of witnesses is one approach to complement measured sensor values ("humans as sensors"). Such observations are increasingly disseminated via social media platforms. These "social sensors" offer several advantages over conventional sensors, e.g. high mobility, high versatility of captured parameters, and rapid distribution of information. Moreover, the amount of data offered by social media platforms is quite extensive. We analyze messages distributed via Twitter after major earthquakes to get rapid information on what eyewitnesses report from the epicentral area. We use this information to (a) quickly learn about damage and losses to support fast disaster response and (b) densify geophysical networks in areas of sparse information to gain more detailed insight into felt intensities. We present a case study from the Mw 7.1 Philippines (Bohol) earthquake of Oct. 15, 2013. We extract Twitter messages (so-called tweets) containing one or more specified keywords from the semantic field of "earthquake" and use them for further analysis. For the time frame of Oct. 15 to Oct. 18 we obtain a database of 50,000 tweets in total, of which 2,900 are geo-localized and 470 have a photo attached.
Analyses for both national level and locally for
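    The keyword-filtering step described above can be sketched as follows. The tweet record shape, the keyword list, and the `filter_tweets` helper are assumptions for illustration; a production system would use the Twitter API's own filtering.

```python
# Illustrative sketch: keep tweets containing a keyword from the semantic
# field of "earthquake" and separate out those carrying coordinates.
EARTHQUAKE_KEYWORDS = ("earthquake", "quake", "lindol")  # "lindol": Filipino

def filter_tweets(tweets):
    """Split tweets into keyword matches and the geolocated subset."""
    matched, geolocated = [], []
    for tw in tweets:
        text = tw["text"].lower()
        if any(kw in text for kw in EARTHQUAKE_KEYWORDS):
            matched.append(tw)
            if tw.get("coordinates") is not None:
                geolocated.append(tw)
    return matched, geolocated

sample = [
    {"text": "Huge earthquake in Bohol!", "coordinates": (9.86, 124.1)},
    {"text": "May lindol!", "coordinates": None},
    {"text": "Good morning Manila", "coordinates": (14.6, 121.0)},
]
matched, geo = filter_tweets(sample)
print(len(matched), len(geo))  # -> 2 1
```

    The small geolocated fraction in the sample mirrors the case study's ratio (2,900 of 50,000 tweets).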

  4. PAGER--Rapid assessment of an earthquake's impact

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.

  5. How citizen seismology is transforming rapid public earthquake information: the example of LastQuake smartphone application and Twitter QuakeBot

    Science.gov (United States)

    Bossu, R.; Etivant, C.; Roussel, F.; Mazet-Roux, G.; Steed, R.

    2014-12-01

    Smartphone applications have swiftly become one of the most popular tools for the public to rapidly receive earthquake information. Wherever they are, users can be automatically informed when an earthquake has struck, just by setting a magnitude threshold and an area of interest. No need to browse the Internet: the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? A while after a damaging earthquake, many eyewitnesses discard the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones of societal importance, even when of small magnitude. The LastQuake app and Twitter feed (QuakeBot) focus on these earthquakes that matter for the public by collating different information threads covering tsunamigenic, damaging and felt earthquakes. Non-seismic detections and macroseismic questionnaires collected online are combined to identify felt earthquakes regardless of their magnitude. Non-seismic detections include Twitter earthquake detection, developed by the USGS, where the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking they have just felt. We will present the identification process for felt earthquakes, the smartphone application and the 27 automatically generated tweets, and how, by providing better public services, we collect more data from citizens.

  6. Rapid telemetry and earthquake early warning

    Science.gov (United States)

    Allen, R.; Bose, M.; Brown, H.; Cua, G.; Given, D.; Hauksson, E.; Heaton, T.; Hellweg, M.; Jordan, T.; Kireev, A.; Maechling, P.; Neuhauser, D.; Oppenheimer, D.; Solanki, K.; Zeleznik, M.

    2008-05-01

    The California Integrated Seismic Network (CISN) is currently testing algorithms for earthquake early warning on the real-time seismic systems in the state. An earthquake warning system rapidly detects the initiation of earthquakes and assesses the associated hazard. The goal is to provide warning of potentially damaging ground motion in a target region prior to the arrival of seismic waves. The network-based approach to early warning requires station data to be gathered at a central site for joint processing. ElarmS, one network-based approach being tested, currently runs 15 sec behind real time in order to gather ~90% of station data before processing. Even with this delay, the recent Mw 5.4 Alum Rock earthquake near San Jose was detected and an accurate hazard assessment was available before ground shaking reached San Francisco. The Virtual Seismologist (VS) method, another network-based approach, is a Bayesian method that incorporates information such as network topology, previously observed seismicity, and the Gutenberg-Richter relationship into magnitude and location estimation. The VS method is currently being transitioned from off-line to real-time testing and will soon also run 15 sec behind real time, as ElarmS does. We are also testing an on-site warning approach, which is based on single-station observations. On-site systems can deliver earthquake information faster than regional systems, and the warning could possibly reach potential users at much closer epicentral distances before the damaging shaking starts. By definition, on-site systems do not require a central processing facility or the delivery of data from a distant seismic station, but they are less robust than network-based systems and need fast and reliable telemetry to deliver warnings to local users. The range of possible warning times is typically seconds to tens of seconds, and every second of data latency translates into an equal reduction in the available warning time.
Minimal latency
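    The warning-time budget described above can be sketched back-of-the-envelope: the warning time is roughly the S-wave travel time to the target minus the detection delay and the telemetry latency. The shear-wave speed and the example numbers are assumptions, not values from the abstract.

```python
# Back-of-the-envelope sketch of the early-warning time budget.
# Assumption: a typical crustal shear-wave speed of 3.5 km/s.
VS_KM_S = 3.5

def warning_time_s(epicentral_distance_km: float,
                   detection_delay_s: float,
                   telemetry_latency_s: float) -> float:
    """Seconds of warning before S-wave arrival at the target site."""
    s_arrival = epicentral_distance_km / VS_KM_S
    return s_arrival - detection_delay_s - telemetry_latency_s

# e.g. a site ~60 km away, with a 15 s processing delay and 2 s telemetry:
print(round(warning_time_s(60, 15, 2), 1))   # -> 0.1  (barely any warning)
print(round(warning_time_s(350, 15, 2), 1))  # -> 83.0 (a distant target)
```

    The near-zero result for the 60 km case illustrates why every second of data latency matters for nearby targets.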

  7. Rapid Estimation of Tsunami Impact Following the Samoa Earthquake

    Science.gov (United States)

    Thio, H. K.; Polet, J.

    2009-12-01

    Rapid estimation of the tsunami waveheight after a large earthquake can significantly aid in disaster recovery efforts, planning of post-tsunami surveys and even early warning for more distant regions. We are exploring methods for refining these estimates by addressing variability due to uncertainties in the source parameters. After the Samoa earthquake, we used the solution from the near real-time Research CMT system at the National Earthquake Information Center to compute the tsunami wavefield. Given the close proximity to Samoa and American Samoa, details of the rupture geometry are very important for the character of the tsunami wavefield and we computed tsunami waveforms for several different geometries that are consistent with the rCMT solution. We will evaluate these results by comparing them with observed runups and explore ways to express the uncertainties in the simulated runup maps. We will also evaluate other real-time source estimates for use in rapid tsunami impact simulation.

  8. Seismogeodesy for rapid earthquake and tsunami characterization

    Science.gov (United States)

    Bock, Y.

    2016-12-01

    Rapid estimation of earthquake magnitude and fault mechanism is critical for earthquake and tsunami warning systems. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. These methods are well developed for ocean basin-wide warnings but are not timely enough to protect vulnerable populations and infrastructure from the effects of local tsunamis, where waves may arrive within 15-30 minutes of earthquake onset time. Direct measurements of displacements by GPS networks at subduction zones allow rapid magnitude and slip estimates in the near-source region that are not affected by the instrumental limitations and magnitude saturation experienced by local seismic networks. However, GPS displacements by themselves are too noisy for strict earthquake early warning (P-wave detection). Optimally combining high-rate GPS and seismic data (in particular, accelerometers that do not clip), referred to as seismogeodesy, provides a broadband instrument that does not clip in the near field, is impervious to magnitude saturation, and provides accurate static and dynamic displacements and velocities in real time. Here we describe a NASA-funded effort to integrate GPS and seismogeodetic observations as part of NOAA's Tsunami Warning Centers in Alaska and Hawaii. It consists of a series of plug-in modules that allow for a hierarchy of rapid seismogeodetic products, including automatic P-wave picking, hypocenter estimation, S-wave prediction, magnitude scaling relationships based on P-wave amplitude (Pd) and peak ground displacement (PGD), and finite-source CMT solutions and fault slip models as input for tsunami warnings and models. For the NOAA/NASA project, the modules are being integrated into an existing USGS Earthworm environment, currently limited to traditional seismic data. We are focused on a network of

  9. Rapid Earthquake Magnitude Estimation for Early Warning Applications

    Science.gov (United States)

    Goldberg, Dara; Bock, Yehuda; Melgar, Diego

    2017-04-01

    Earthquake magnitude is a concise metric that provides invaluable information about the destructive potential of a seismic event. Rapid estimation of magnitude for earthquake and tsunami early warning purposes requires reliance on near-field instrumentation. For large magnitude events, ground motions can exceed the dynamic range of near-field broadband seismic instrumentation (clipping). Strong motion accelerometers are designed with low gains to better capture strong shaking. Estimating earthquake magnitude rapidly from near-source strong-motion data requires integration of acceleration waveforms to displacement. However, integration amplifies small errors, creating unphysical drift that must be eliminated with a high pass filter. The loss of the long period information due to filtering is an impediment to magnitude estimation in real-time; the relation between ground motion measured with strong-motion instrumentation and magnitude saturates, leading to underestimation of earthquake magnitude. Using station displacements from Global Navigation Satellite System (GNSS) observations, we can supplement the high frequency information recorded by traditional seismic systems with long-period observations to better inform rapid response. Unlike seismic-only instrumentation, ground motions measured with GNSS scale with magnitude without saturation [Crowell et al., 2013; Melgar et al., 2015]. We refine the current magnitude scaling relations using peak ground displacement (PGD) by adding a large GNSS dataset of earthquakes in Japan. Because it does not suffer from saturation, GNSS alone has significant advantages over seismic-only instrumentation for rapid magnitude estimation of large events. The earthquake's magnitude can be estimated within 2-3 minutes of earthquake onset time [Melgar et al., 2013]. We demonstrate that seismogeodesy, the optimal combination of GNSS and seismic data at collocated stations, provides the added benefit of improving the sensitivity of
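    PGD magnitude scaling laws of the kind refined above typically take the form log10(PGD) = A + B·M + C·M·log10(R), which can be inverted for magnitude given an observed peak displacement and hypocentral distance. The coefficients below are illustrative placeholders in the spirit of the cited regressions [Crowell et al., 2013; Melgar et al., 2015], not the papers' exact values.

```python
import math

# Sketch of inverting a PGD scaling law for magnitude.
# log10(PGD) = A + B*M + C*M*log10(R)  =>  M = (log10(PGD) - A) / (B + C*log10(R))
# Coefficients are illustrative, not the published regression values.
A, B, C = -4.434, 1.047, -0.138

def magnitude_from_pgd(pgd_cm: float, hypo_dist_km: float) -> float:
    """Magnitude implied by peak ground displacement (cm) at distance (km)."""
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(hypo_dist_km))

# A 100 cm peak displacement observed 100 km from the source:
print(round(magnitude_from_pgd(100.0, 100.0), 2))
```

    Because PGD grows with magnitude without saturating, the inversion remains informative even for the largest events, unlike strong-motion-only proxies.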

  10. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    Science.gov (United States)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second most visited global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, this collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly appeared that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing the expected earthquake information and services, collect their observations, and collate them into improved earthquake information services that attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated into our services is derived from real-time analysis of the traffic on our website in the first minutes following an earthquake, an approach named flashsourcing. Using the example of the Mineral, Virginia earthquake, we show that the arrival times of eyewitnesses on our website follow the propagation of the generated seismic waves, and thus that eyewitnesses can be considered ground-motion sensors.
Flashsourcing discriminates felt

  11. Rapid Inventory of Earthquake Damage (RIED)

    NARCIS (Netherlands)

    Duque, Adriana; Hack, Robert; Montoya, L.; Scarpas, Tom; Slob, Siefko; Soeters, Rob; van Westen, Cees

    2001-01-01

    The 25 January 1999 Quindío earthquake in Colombia was a major disaster for the coffee-growing region in Colombia. Most of the damage occurred in the city of Armenia and surrounding villages. Damage due to earthquakes is strongly related to topographic and subsurface geotechnical conditions

  12. Real Time Earthquake Information System in Japan

    Science.gov (United States)

    Doi, K.; Kato, T.

    2003-12-01

    An early earthquake notification system in Japan was developed by the Japan Meteorological Agency (JMA), the governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for the prompt provision of tsunami forecasts to the public, locating an earthquake and estimating its magnitude as quickly as possible. Later, a system for the prompt provision of seismic intensity information, as an index of the degree of disaster caused by strong ground motion, was also developed so that the governmental organizations concerned can decide whether to launch an emergency response. At present, JMA issues the following kinds of information successively when a large earthquake occurs: 1) a prompt report of the occurrence of a large earthquake and the major seismic intensities it caused, about two minutes after the earthquake; 2) a tsunami forecast in around three minutes; 3) information on expected arrival times and maximum heights of tsunami waves in around five minutes; 4) information on the hypocenter and magnitude of the earthquake, the seismic intensity at each observation station, and the times of high tides in addition to the expected tsunami arrival times, in 5-7 minutes. To issue this information, JMA has established: an advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation, including about 2,800 seismic intensity stations maintained by local governments; data telemetry networks via landlines and partly via a satellite communication link; real-time data processing techniques, for example the automatic calculation of earthquake location and magnitude and the database-driven method for quantitative tsunami estimation; and dissemination networks via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally

  13. Rapid landslide risk assessment of transport infrastructure following the 13 November 2016 Kaikoura, New Zealand, earthquake

    Science.gov (United States)

    Robinson, Tom; Rosser, Nick

    2017-04-01

    Earthquake-generated landslides pose a significant risk to critical infrastructure, especially transport networks. For post-earthquake emergency response, identifying where landslides have affected transport networks is vital for understanding the ground access available to affected locations. However, post-earthquake landslide mapping is a difficult and time-consuming task, hindered by issues relating to the collection and processing of satellite images, cloud cover, and manual mapping. The development of rapid landslide modelling techniques for post-earthquake application can allow landslide hazard and risk to be modelled within hours of the earthquake occurring, leading to faster understanding of the likely losses to transport infrastructure. This study presents the results of efforts to rapidly model the extent of and losses related to landsliding following the 13 November 2016 Kaikoura earthquake in New Zealand. Using previously published data on landslide pre-disposing factors, the landslide hazard resulting from this earthquake was modelled in order to identify locations where landslides were most likely. This was combined with a simple horizon-scanning method along critical transport lines to identify zones in which landslides could potentially impact the networks. Landslide hazard in these zones was subsequently weighted by the reach angle to the respective network and averaged for the entire zone. The results show the relative risk of landslides impacting different sections of the transport networks and were derived within 48 hours of the earthquake occurring. These models rapidly and correctly highlighted the numerous blockages along the vital State Highway 1 link between Christchurch and Kaikoura, as well as those on the only alternative inland route. 
This demonstrates that accurate and rapid analysis of landslide losses can be undertaken immediately post-earthquake, with results potentially available within hours of the event, far sooner than current
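    The zone-weighting step described above can be illustrated with a toy calculation: weight each hillslope cell's landslide hazard by its reach angle to the road, then average over the zone. The geometry, the normalisation by 90°, and all values below are invented for illustration and are not the authors' actual model.

```python
import math

# Hypothetical sketch of weighting landslide hazard by reach angle.
def reach_angle_deg(drop_m: float, horiz_dist_m: float) -> float:
    """Angle of the line from a potential source cell down to the network."""
    return math.degrees(math.atan2(drop_m, horiz_dist_m))

def zone_risk(cells) -> float:
    """cells: (hazard 0-1, vertical drop m, horizontal distance m) tuples.
    Steeper reach angles mean debris is more likely to reach the road."""
    weighted = [h * reach_angle_deg(d, x) / 90.0 for h, d, x in cells]
    return sum(weighted) / len(weighted)

zone = [(0.8, 300, 400), (0.4, 100, 600), (0.6, 250, 250)]
print(round(zone_risk(zone), 3))
```

    Averaging over the zone yields a single relative-risk score per road section, which is what allows sections to be ranked within hours of the event.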

  14. Rapid and robust characterization of the earthquake source for tsunami early-warning

    Science.gov (United States)

    Lomax, Anthony; Michelini, Alberto; Bernardi, Fabrizio; Lauciani, Valentino

    2015-04-01

    Effective tsunami early warning after an earthquake is difficult when the distances and tsunami travel times between the earthquake/tsunami source regions and the coastlines at risk are small, especially since the density of seismic and other monitoring stations is very low in most regions at risk. For tsunami warning worldwide, seismic monitoring and analysis currently provide the majority of the information available within the first tens of minutes after an earthquake. This information is used for direct tsunami hazard assessment and as basic input to real-time tsunami hazard modeling. It is thus crucial that key earthquake parameters are determined as rapidly and reliably as possible, in a probabilistic, time-evolving manner, along with full uncertainties. Early-est (EArthquake Rapid Location sYstem with EStimation of Tsunamigenesis) is the module for rapid earthquake detection, location and analysis at the INGV tsunami alert center (CAT, "Centro di Allerta Tsunami"), part of the Italian candidate Tsunami Watch Provider. Here we present the information produced by Early-est within the first 10 min after an earthquake to characterize the location, depth, magnitude, mechanism and tsunami potential of an earthquake. We discuss the key algorithms in Early-est that produce fully automatic, robust results and their uncertainties in the shortest possible time using sparse observations. For example, a broadband picker and a robust, probabilistic, global-search detector/associator/locator component of Early-est can detect and locate a seismic event with as few as 4 to 5 P onset observations. We also discuss how these algorithms may be further evolved to provide even earlier and more robust results.
Finally, we illustrate how the real-time, evolutionary and probabilistic earthquake information produced by Early-est, along with prior and non-seismic information and later seismic information (e.g., full-waveform moment-tensors), may be used within time-evolving, decision and modeling

  15. The Key Role of Eyewitnesses in Rapid Impact Assessment of Global Earthquake

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.; Etivant, C.; Frobert, L.; Godey, S.

    2014-12-01

    Uncertainties in rapid impact assessments of global earthquakes are intrinsically large because they rely on three main elements (ground motion prediction models, building stock inventory and the related vulnerability) whose values and/or spatial variations are poorly constrained. Furthermore, variations of hypocentral location and magnitude within their respective uncertainty domains can lead to significantly different shaking levels for centers of population and change the scope of the disaster. We present the strategy and methods implemented at the Euro-Med Seismological Centre (EMSC) to rapidly collect in-situ observations on earthquake effects from eyewitnesses in order to reduce the uncertainties of rapid earthquake impact assessment. It comprises crowdsourced information (online questionnaires, pictures) as well as information derived from real-time analysis of web traffic (the flashsourcing technique), and more recently the deployment of low-cost Quake Catcher Network (QCN) sensors. We underline the importance of merging the results of different methods to improve the performance and reliability of the collected data. We try to better understand and respond to public demands and expectations after earthquakes through improved information services and a diversification of information tools (social networks, smartphone apps, browser add-ons…), which, in turn, drive more eyewitnesses to our services and improve data collection. We will notably present our LastQuake Twitter feed (QuakeBot) and smartphone applications (iOS and Android), which only report earthquakes that matter for the public and authorities, i.e. felt and damaging earthquakes identified thanks to citizen-generated information.

  16. Rapid estimation of the economic consequences of global earthquakes

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

    The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid-2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensity. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system it is now possible to produce rapid estimates of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S. 
Geological Survey's PAGER system is
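    The loss-estimation step described above can be sketched conceptually: population exposed at each shaking-intensity level is multiplied by an intensity-dependent fatality rate, following the empirical lognormal form of Jaiswal and others (2009). This is a conceptual sketch, not the USGS implementation; the parameters `THETA` and `BETA` and the exposure table are made-up illustrative values.

```python
import math

# Conceptual sketch of an empirical fatality model: a lognormal fatality
# rate as a function of intensity, with invented country parameters.
THETA, BETA = 12.0, 0.2  # hypothetical country-specific parameters

def fatality_rate(intensity: float) -> float:
    """nu(s) = Phi(ln(s/theta)/beta), the lognormal CDF of intensity s."""
    z = math.log(intensity / THETA) / BETA
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_fatalities(exposure: dict) -> float:
    """exposure maps shaking intensity (MMI) -> population exposed."""
    return sum(pop * fatality_rate(mmi) for mmi, pop in exposure.items())

# Hypothetical exposure table (MMI -> people exposed):
print(round(expected_fatalities({6: 500_000, 7: 120_000, 8: 20_000})))
```

    Summing over intensity bins is what lets the system convert a ShakeMap-style exposure table into a single headline loss estimate within minutes.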

  17. Flash-sourcing or the rapid detection and characterisation of earthquake effects through clickstream data analysis

    Science.gov (United States)

    Bossu, R.; Mazet-Roux, G.; Roussel, F.; Frobert, L.

    2011-12-01

    Rapid characterisation of earthquake effects is essential for a timely and appropriate response in favour of victims and/or eyewitnesses. In the case of damaging earthquakes, any field observations that can fill the information gap characterising the immediate aftermath can contribute to more efficient rescue operations. This paper presents the latest developments of a method called "flash-sourcing" that addresses these issues. It relies on eyewitnesses, the first informed and the first concerned by an earthquake's occurrence. More precisely, their use of the EMSC earthquake information website (www.emsc-csem.org) is analysed in real time to map the area where the earthquake was felt and to identify, at least under certain circumstances, zones of widespread damage. The approach is based on the natural and immediate convergence on the website of eyewitnesses, who rush to the Internet to investigate the cause of the shaking they just felt, causing our traffic to increase. The area where an earthquake was felt is mapped simply by locating the Internet Protocol (IP) addresses during traffic surges. In addition, the presence of eyewitnesses browsing our website within minutes of an earthquake excludes the possibility of widespread damage in the localities they originate from: in case of severe damage, the networks would be down. The validity of the information derived from this clickstream analysis is confirmed by comparison with EMS98 macroseismic maps obtained from online questionnaires. The name of this approach, "flash-sourcing", is a combination of "flash-crowd" and "crowdsourcing", intended to reflect the rapidity of data collation from the public. For computer scientists, a flash-crowd is a traffic surge on a website. Crowdsourcing means work being done by a "crowd" of people; it also characterises Internet and mobile applications collecting information from the public, such as online macroseismic questionnaires. Like crowdsourcing techniques, flash-sourcing is a
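    A minimal sketch of the surge-detection idea behind flash-sourcing: count website hits per geolocated region over a short window and flag regions whose traffic jumps well above their baseline. The region codes, the hit format, and the `SURGE_FACTOR` threshold are assumptions for illustration, not EMSC's actual parameters.

```python
from collections import Counter

# Minimal flash-sourcing sketch: flag regions with anomalous traffic.
SURGE_FACTOR = 5.0  # assumed: a surge is >= 5x the baseline hit rate

def detect_felt_regions(hits, baseline):
    """hits: region codes (from IP geolocation) seen in the current minute;
    baseline: region -> typical hits per minute. Returns surging regions."""
    counts = Counter(hits)
    return sorted(r for r, n in counts.items()
                  if n >= SURGE_FACTOR * baseline.get(r, 1.0))

minute = ["IT-RM"] * 40 + ["IT-MI"] * 3 + ["FR-75"] * 2
print(detect_felt_regions(minute, {"IT-RM": 4, "IT-MI": 3, "FR-75": 2}))
# -> ['IT-RM']
```

    In practice the baseline would be estimated from historical traffic per region and time of day; the surge threshold trades detection speed against false alarms.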

  18. Rapid characterisation of large earthquakes by multiple seismic broadband arrays

    Directory of Open Access Journals (Sweden)

    D. Roessler

    2010-04-01

    Full Text Available An automatic procedure is presented to retrieve rupture parameters for large earthquakes along the Sunda arc subduction zone. The method is based on standard array analysis and broadband seismograms recorded within 30°–100° epicentral distance. No assumptions on the source mechanism are required. By means of semblance, the coherency of P waveforms is analysed at separate large-aperture arrays. Waveforms are migrated to a 10°×10° wide source region to study the spatio-temporal evolution of earthquakes at each array. Multiplying the semblance source maps obtained at each array increases resolution. Rupture start, duration, extent, direction, and propagation velocity are obtained and published within 25 min after the onset of the event; first preliminary results can be obtained even within 16 min. Their rapid determination may improve the mitigation of earthquake and tsunami hazard. Real-time application will provide rupture parameters to the GITEWS project (German Indonesian Tsunami Early Warning System). The method is applied to the two M8.0 Sumatra earthquakes of 12 September 2007, to the M7.4 Java earthquake of 2 September 2009, and to major subduction earthquakes that have occurred along Sumatra and Java since 2000. The obtained rupture parameters are most robust for the largest earthquakes, with magnitudes M≥8. The results indicate that almost the entire seismogenic part of the subduction zone off the coast of Sumatra has ruptured. Only the great Sumatra event of 2004 and the M7.7 Java event of 17 July 2006 appear to have ruptured to, or close to, the surface at the trench; otherwise, rupture was apparently confined to depths below 25 km. Major seismic gaps seem to remain off the coast of Padang and the southern tip of Sumatra.
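
    The semblance measure at the core of this array procedure can be illustrated with a minimal implementation; the function name and the integer-sample-shift parametrisation below are ours, not the authors'.

```python
def semblance(traces, shifts):
    """Semblance of time-shifted traces (lists of equal length).

    shifts[i] is the integer sample delay applied to trace i before
    stacking, i.e. the predicted move-out for a trial source location.
    Semblance is 1 for perfectly coherent, correctly aligned traces
    and near 0 for incoherent ones.
    """
    n = len(traces)
    length = len(traces[0]) - max(shifts)
    num = den = 0.0
    for k in range(length):
        stack = sum(traces[i][k + shifts[i]] for i in range(n))
        num += stack * stack
        den += n * sum(traces[i][k + shifts[i]] ** 2 for i in range(n))
    return num / den if den else 0.0
```

    In the published procedure, a semblance map of this kind is computed independently at each large-aperture array over a grid of trial source points, and the per-array maps are then multiplied to sharpen the source image.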

  19. Mapping Natech risk due to earthquakes using RAPID-N

    Science.gov (United States)

    Girgin, Serkan; Krausmann, Elisabeth

    2013-04-01

    Natural hazard-triggered technological accidents (so-called Natech accidents) at hazardous installations are an emerging risk with possibly serious consequences due to the potential release of hazardous materials, fires or explosions. For the reduction of Natech risk, one of the highest-priority needs is the identification of Natech-prone areas and the systematic assessment of Natech risks. With hardly any Natech risk maps existing within the EU, the European Commission's Joint Research Centre has developed a Natech risk analysis and mapping tool called RAPID-N, which estimates the overall risk of natural-hazard impact on industrial installations and its possible consequences. The results are presented as risk summary reports and interactive risk maps which can be used for decision making. Currently, RAPID-N focuses on Natech risk due to earthquakes at industrial installations; it will be extended to also analyse and map Natech risk due to floods in the near future. The RAPID-N methodology is based on the estimation of on-site natural hazard parameters, the use of fragility curves to determine damage probabilities of plant units for various damage states, and the calculation of the spatial extent, severity, and probability of Natech events potentially triggered by the natural hazard. The methodology was implemented as a web-based risk assessment and mapping software tool which allows easy data entry and rapid local or regional risk assessment and mapping. RAPID-N features an innovative property estimation framework to calculate on-site natural hazard parameters, industrial plant and plant unit characteristics, and hazardous substance properties. Custom damage states and fragility curves can be defined for different types of plant units. Conditional relationships can be specified between damage states and Natech risk states, which describe probable Natech event scenarios. Natech consequences are assessed using a custom implementation of U.S. EPA's Risk Management
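
    The fragility-curve step, converting an on-site hazard parameter such as peak ground acceleration into a damage-state probability, is conventionally expressed as a lognormal CDF. The sketch below uses that standard form; the median and β values shown are placeholders, not RAPID-N defaults.

```python
import math

def fragility(pga, median, beta):
    """Probability of reaching or exceeding a damage state at ground
    motion `pga`, for a lognormal fragility curve with median capacity
    `median` (same units as pga) and log-standard deviation `beta`.
    """
    z = math.log(pga / median) / beta
    # standard normal CDF expressed via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

    By construction the curve passes through probability 0.5 at the median capacity and rises monotonically with the hazard parameter.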

  20. Rapid post-earthquake modelling of coseismic landslide intensity and distribution for emergency response decision support

    Directory of Open Access Journals (Sweden)

    T. R. Robinson

    2017-09-01

    Full Text Available Current methods to identify coseismic landslides immediately after an earthquake using optical imagery are too slow to effectively inform emergency response activities. Issues with cloud cover, data collection and processing, and manual landslide identification mean even the most rapid mapping exercises are often incomplete when the emergency response ends. In this study, we demonstrate how traditional empirical methods for modelling the total distribution and relative intensity (in terms of point density) of coseismic landsliding can be successfully undertaken in the hours and days immediately after an earthquake, allowing the results to effectively inform stakeholders during the response. The method uses fuzzy logic in a GIS (Geographic Information Systems) to quickly assess and identify the location-specific relationships between predisposing factors and landslide occurrence during the earthquake, based on small initial samples of identified landslides. We show that this approach can accurately model both the spatial pattern and the number density of landsliding from the event based on just several hundred mapped landslides, provided they have sufficiently wide spatial coverage, improving upon previous methods. This suggests that systematic high-fidelity mapping of landslides following an earthquake is not necessary for informing rapid modelling attempts. Instead, mapping should focus on rapid sampling from the entire affected area to generate results that can inform the modelling. This method is therefore suited to conditions in which imagery is affected by partial cloud cover or in which the total number of landslides is so large that mapping requires significant time to complete. The method therefore has the potential to provide a quick assessment of landslide hazard after an earthquake and may therefore inform emergency operations more effectively compared to current practice.

  1. Rapid post-earthquake modelling of coseismic landslide intensity and distribution for emergency response decision support

    Science.gov (United States)

    Robinson, Tom R.; Rosser, Nicholas J.; Densmore, Alexander L.; Williams, Jack G.; Kincey, Mark E.; Benjamin, Jessica; Bell, Heather J. A.

    2017-09-01

    Current methods to identify coseismic landslides immediately after an earthquake using optical imagery are too slow to effectively inform emergency response activities. Issues with cloud cover, data collection and processing, and manual landslide identification mean even the most rapid mapping exercises are often incomplete when the emergency response ends. In this study, we demonstrate how traditional empirical methods for modelling the total distribution and relative intensity (in terms of point density) of coseismic landsliding can be successfully undertaken in the hours and days immediately after an earthquake, allowing the results to effectively inform stakeholders during the response. The method uses fuzzy logic in a GIS (Geographic Information Systems) to quickly assess and identify the location-specific relationships between predisposing factors and landslide occurrence during the earthquake, based on small initial samples of identified landslides. We show that this approach can accurately model both the spatial pattern and the number density of landsliding from the event based on just several hundred mapped landslides, provided they have sufficiently wide spatial coverage, improving upon previous methods. This suggests that systematic high-fidelity mapping of landslides following an earthquake is not necessary for informing rapid modelling attempts. Instead, mapping should focus on rapid sampling from the entire affected area to generate results that can inform the modelling. This method is therefore suited to conditions in which imagery is affected by partial cloud cover or in which the total number of landslides is so large that mapping requires significant time to complete. The method therefore has the potential to provide a quick assessment of landslide hazard after an earthquake and may therefore inform emergency operations more effectively compared to current practice.
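
    A common way to combine predisposing-factor memberships in fuzzy-logic GIS overlays of this kind is the fuzzy gamma operator, which interpolates between the fuzzy algebraic product and the fuzzy algebraic sum. The sketch below is a generic illustration of that operator, not necessarily the exact combination rule used by the authors.

```python
def fuzzy_gamma(memberships, gamma=0.9):
    """Combine per-factor fuzzy memberships (each in [0, 1]) into a
    single susceptibility score using the fuzzy gamma operator:
    (algebraic sum)**gamma * (algebraic product)**(1 - gamma).
    """
    prod = 1.0
    complement = 1.0
    for m in memberships:
        prod *= m                 # fuzzy algebraic product term
        complement *= (1.0 - m)   # builds the algebraic sum
    alg_sum = 1.0 - complement
    return (alg_sum ** gamma) * (prod ** (1.0 - gamma))
```

    In a susceptibility workflow each raster cell would supply one membership per factor (slope, lithology, shaking proxy, and so on), and the combined score would be calibrated against the initial sample of mapped landslides.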

  2. Rapid estimate of earthquake source duration: application to tsunami warning.

    Science.gov (United States)

    Reymond, Dominique; Jamelot, Anthony; Hyvernaud, Olivier

    2016-04-01

    We present a method for estimating the source duration of the fault rupture, based on the high-frequency envelope of teleseismic P waves, inspired by the original work of Ni et al. (2005). The main interest of this seismic parameter is to detect abnormally low rupture velocities, which are characteristic of the so-called 'tsunami earthquakes' (Kanamori, 1972). The source durations estimated by this method are validated by comparison with two other independent methods: the duration obtained by W-phase inversion (Kanamori and Rivera, 2008; Duputel et al., 2012) and the duration calculated by the SCARDEC process that determines the source time function (Vallée et al., 2011). The estimated source duration is also confronted with the slowness discriminant defined by Newman and Okal (1998), which is calculated routinely for all earthquakes detected by our tsunami warning process (named PDFM2, Preliminary Determination of Focal Mechanism; Clément and Reymond, 2014). From the point of view of operational tsunami warning, numerical simulations of tsunamis depend strongly on the source estimation: the better the source estimation, the better the tsunami forecast. The source duration is not directly injected into the numerical tsunami simulations, because the kinematics of the source are presently ignored (Jamelot and Reymond, 2015). But in the case of a tsunami earthquake occurring in the shallower part of the subduction zone, we have to consider a source in a medium of low rigidity modulus; consequently, for a given seismic moment, the source dimensions will decrease while the slip increases, like a 'compact' source (Okal and Hébert, 2007). Conversely, a rapid 'snappy' earthquake with poor tsunami excitation power will be characterized by a higher rigidity modulus and will produce weaker displacement and smaller source dimensions than a 'normal' earthquake. References: Clément, J
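
    The envelope-thresholding idea behind such duration estimates can be sketched as follows. The 20% threshold and the assumption that the high-frequency envelope has already been computed (e.g. by band-passing and rectifying the P-wave record) are illustrative choices, not the paper's calibrated values.

```python
def source_duration(envelope, dt, threshold=0.2):
    """Estimate rupture duration as the time span during which a
    high-frequency P-wave envelope stays above `threshold` times its
    peak amplitude; `dt` is the sample interval in seconds.
    """
    peak = max(envelope)
    if peak <= 0:
        return 0.0
    above = [i for i, a in enumerate(envelope) if a >= threshold * peak]
    return (above[-1] - above[0]) * dt if above else 0.0
```

    An anomalously long duration relative to the seismic moment is the signature flagged for slow 'tsunami earthquakes'.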

  3. Rapid Assessment of Earthquakes with Radar and Optical Geodetic Imaging and Finite Fault Models (Invited)

    Science.gov (United States)

    Fielding, E. J.; Sladen, A.; Simons, M.; Rosen, P. A.; Yun, S.; Li, Z.; Avouac, J.; Leprince, S.

    2010-12-01

    Earthquake responders need to know where the earthquake has caused damage and what is the likely intensity of damage. The earliest information comes from global and regional seismic networks, which provide the magnitude and locations of the main earthquake hypocenter and moment tensor centroid and also the locations of aftershocks. Location accuracy depends on the availability of seismic data close to the earthquake source. Finite fault models of the earthquake slip can be derived from analysis of seismic waveforms alone, but the results can have large errors in the location of the fault ruptures and spatial distribution of slip, which are critical for estimating the distribution of shaking and damage. Geodetic measurements of ground displacements with GPS, LiDAR, or radar and optical imagery provide key spatial constraints on the location of the fault ruptures and distribution of slip. Here we describe the analysis of interferometric synthetic aperture radar (InSAR) and sub-pixel correlation (or pixel offset tracking) of radar and optical imagery to measure ground coseismic displacements for recent large earthquakes, and lessons learned for rapid assessment of future events. These geodetic imaging techniques have been applied to the 2010 Leogane, Haiti; 2010 Maule, Chile; 2010 Baja California, Mexico; 2008 Wenchuan, China; 2007 Tocopilla, Chile; 2007 Pisco, Peru; 2005 Kashmir; and 2003 Bam, Iran earthquakes, using data from ESA Envisat ASAR, JAXA ALOS PALSAR, NASA Terra ASTER and CNES SPOT5 satellite instruments and the NASA/JPL UAVSAR airborne system. For these events, the geodetic data provided unique information on the location of the fault or faults that ruptured and the distribution of slip that was not available from the seismic data and allowed the creation of accurate finite fault source models. In many of these cases, the fault ruptures were on previously unknown faults or faults not believed to be at high risk of earthquakes, so the area and degree of

  4. Urban MEMS based seismic network for post-earthquakes rapid disaster assessment

    Science.gov (United States)

    D'Alessandro, Antonino; Luzio, Dario; D'Anna, Giuseppe

    2014-05-01

    Life losses following a disastrous earthquake depend mainly on building vulnerability, the intensity of shaking, and the timeliness of rescue operations. In recent decades, the increase in population and industrial density has significantly increased the exposure of urban areas to earthquakes. The potential impact of a strong earthquake on a town centre can be reduced by timely and correct actions of the emergency management centres. A real-time urban seismic network can drastically reduce casualties immediately following a strong earthquake by promptly providing information about the distribution of the ground-shaking level. Emergency management centres, with functions in the immediate post-earthquake period, could use this information to allocate and prioritize resources to minimize loss of human life. However, due to the high cost of seismological instrumentation, the realization of an urban seismic network, which could allow reducing the rate of fatalities, has not been achieved. Recent technological developments in MEMS (Micro Electro-Mechanical Systems) could today allow the realization of a high-density urban seismic network for rapid post-earthquake disaster assessment, suitable for earthquake-effects mitigation. In the 1990s, MEMS accelerometers revolutionized the automotive airbag industry and are today widely used in laptops, game controllers and mobile phones. Due to their great commercial success, research into and development of MEMS accelerometers are actively pursued around the world. Nowadays, the sensitivity and dynamic range of these sensors are such as to allow accurate recording of earthquakes of moderate to strong magnitude. Due to their low cost and small size, MEMS accelerometers may be employed for the realization of high-density seismic networks. The MEMS accelerometers could be installed inside sensitive places (high vulnerability and exposure), such as schools, hospitals, public buildings and places of

  5. An Efficient Rapid Warning System For Earthquakes In The European-mediterranean Region

    Science.gov (United States)

    Bossu, R.; Mazet-Roux, G.; di Giovambattista, R.; Tome, M.

    Every year a few damaging earthquakes occur in the European-Mediterranean region. It is therefore indispensable to operate a real-time warning system in order to provide rapid, reliable estimates of the location, depth and magnitude of these seismic events. In order to provide this information in a timely manner both to the scientific community and to the European and national authorities dealing with natural hazards and relief organisation, the European-Mediterranean Seismological Centre (EMSC) has federated a network of seismic networks exchanging their data in quasi real time. Today, thanks to the Internet, the EMSC receives real-time information about earthquakes from about thirty seismological institutes. As soon as data reach the EMSC, they are displayed on the EMSC Web pages (www.emsc-csem.org). A seismic alert is generated for any potentially damaging earthquake in the European-Mediterranean region, potentially damaging earthquakes being defined as seismic events of magnitude 5 or more. The warning system automatically issues a message to the duty seismologist's mobile phone and pager. The seismologist logs in to the EMSC computers using a laptop PC and relocates the earthquake by processing together all information provided by the networks. The new location and magnitude are then sent, by fax, telex, and email, within one hour of the earthquake occurrence, to national and international organisations whose activities are related to seismic risks, and to the EMSC members. The EMSC rapid warning system has been fully operational for more than 4 years. Its distributed architecture has proved to be an efficient and reliable way to monitor potentially damaging earthquakes. Furthermore, if a major problem disrupts the operational system for more than 30 minutes, the duty is taken over either by the Instituto Geografico Nacional in Spain or by the Istituto Nazionale di Geofisica in Italy. The EMSC operational centre, located at the

  6. Rapid Source Characterization of the 2011 Mw 9.0 off the Pacific coast of Tohoku Earthquake

    Science.gov (United States)

    Hayes, Gavin P.

    2011-01-01

    On March 11th, 2011, a moment magnitude 9.0 earthquake struck off the coast of northeast Honshu, Japan, generating what may well turn out to be the most costly natural disaster ever. In the hours following the event, the U.S. Geological Survey National Earthquake Information Center led a rapid response to characterize the earthquake in terms of its location, size, faulting source, shaking and slip distributions, and population exposure, in order to place the disaster in a framework necessary for timely humanitarian response. As part of this effort, fast finite-fault inversions using globally distributed body- and surface-wave data were used to estimate the slip distribution of the earthquake rupture. Models generated within 7 hours of the earthquake origin time indicated that the event ruptured a fault up to 300 km long, roughly centered on the earthquake hypocenter, and involved peak slips of 20 m or more. Updates since this preliminary solution improve the details of this inversion solution and thus our understanding of the rupture process. However, significant observations such as the up-dip nature of rupture propagation and the along-strike length of faulting did not significantly change, demonstrating the usefulness of rapid source characterization for understanding the first order characteristics of major earthquakes.

  7. RICHTER: A Smartphone Application for Rapid Collection of Geo-Tagged Pictures of Earthquake Damage

    Science.gov (United States)

    Skinnemoen, H.; Bossu, R.; Furuheim, K.; Bjorgo, E.

    2010-12-01

    RICHTER (Rapid geo-Images for Collaborative Help Targeting Earthquake Response) is a smartphone version of a professional application developed to provide high quality geo-tagged image communication over challenging network links, such as satellites and poor mobile links. Developed for Android mobile phones, it allows eyewitnesses to share their pictures of earthquake damage easily and without cost with the Euro-Mediterranean Seismological Centre (EMSC). The goal is to engage citizens in the collection of the most up-to-date visual information on local damage for improved rapid impact assessment. RICHTER integrates the innovative and award winning ASIGN protocol initially developed for satellite communication between cameras / computers / satcom terminals and servers at HQ. ASIGN is a robust and optimal image and video communication management solution for bandwidth-limited communication networks which was developed for use particularly in emergency and disaster situations. Contrary to a simple Multimedia Messaging System (MMS), RICHTER allows access to high definition images with embedded location information. Location is automatically assigned from either the internal GPS, derived from the mobile network (triangulation) or the current Wi-Fi domain, in that order, as this corresponds to the expected positioning accuracy. Pictures are compressed to 20-30KB of data typically for fast transfer and to avoid network overload. Full size images can be requested by the EMSC either fully automatically, or on a case-by-case basis, depending on the user preferences. ASIGN was initially developed in coordination with INMARSAT and the European Space Agency. It was used by the Rapid Mapping Unit of the United Nations notably for the damage assessment of the January 12, 2010 Haiti earthquake where more than 700 photos were collected. RICHTER will be freely distributed on the EMSC website to eyewitnesses in the event of significantly damaging earthquakes. 
The EMSC is the second

  8. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    Science.gov (United States)

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
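
    The core exposure tabulation described above, population per shaking-intensity level, reduces to an aggregation over grid cells. The toy version below is illustrative only; the operational PAGER system derives its cells from ShakeMap intensity grids and gridded population data.

```python
from collections import defaultdict

def exposure_by_intensity(cells):
    """Aggregate population exposure per macroseismic intensity level.

    `cells` is an iterable of (population, intensity) pairs, one per
    grid cell; intensities are binned to the nearest integer level.
    """
    table = defaultdict(int)
    for pop, intensity in cells:
        table[round(intensity)] += pop
    return dict(table)
```

    The resulting table is the kind of summary that feeds a one-page report: how many people sit in each band of estimated shaking.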

  9. Seismogeodetic monitoring techniques for tsunami and earthquake early warning and rapid assessment of structural damage

    Science.gov (United States)

    Haase, J. S.; Bock, Y.; Saunders, J. K.; Goldberg, D.; Restrepo, J. I.

    2016-12-01

    As part of an effort to promote the use of NASA-sponsored Earth science information for disaster risk reduction, real-time high-rate seismogeodetic data are being incorporated into early warning and structural monitoring systems. Seismogeodesy combines seismic acceleration and GPS displacement measurements using a tightly-coupled Kalman filter to provide absolute estimates of seismic acceleration, velocity and displacement. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. Real-time seismogeodetic observations at subduction zones allow for more robust and rapid magnitude and slip estimation that increase warning time in the near-source region. A NASA-funded effort to utilize GPS and seismogeodesy in NOAA's Tsunami Warning Centers in Alaska and Hawaii integrates new modules for picking, locating, and estimating magnitudes and moment tensors for earthquakes into the USGS earthworm environment at the TWCs. In a related project, NASA supports the transition of this research to seismogeodetic tools for disaster preparedness, specifically by implementing GPS and low-cost MEMS accelerometers for structural monitoring in partnership with earthquake engineers. Real-time high-rate seismogeodetic structural monitoring has been implemented on two structures. The first is a parking garage at the Autonomous University of Baja California Faculty of Medicine in Mexicali, not far from the rupture of the 2010 Mw 7.2 El Mayor-Cucapah earthquake, enabled through a UCMexus collaboration. The second is the 8-story Geisel Library at the University of California, San Diego (UCSD). The system has also been installed for several proof-of-concept experiments at the UCSD Network for Earthquake Engineering Simulation (NEES) Large High Performance Outdoor Shake Table. 
We present MEMS-based seismogeodetic observations from the 10 June

  10. Integrated detection and analysis of earthquake disaster information using airborne data

    Directory of Open Access Journals (Sweden)

    Chunxiang Cao

    2016-05-01

    Full Text Available The primary goal of this paper is to discuss an integrated approach to efficiently obtaining earthquake damage information. We developed a framework to rapidly obtain earthquake damage information using post-earthquake airborne optical images. The framework is a standard process that includes data selection, preprocessing, damage factor identification, damage factor evaluation and the development of an earthquake damage information map. We can obtain damage information on severely affected regions using this framework, which will aid in planning rescue and rehabilitation efforts following disasters. We used the integrated approach to obtain damage information for the Lushan earthquake (magnitude 7.0, 20 April 2013) as a case study. The results were as follows: (1) 644 collapsed buildings and 4599 damaged buildings accounted for 13.90% and 96.24%, respectively, of the total number of buildings in the study area; (2) 334 landslides (total area of 691,674.5 m2) were detected, found with greater probability at elevations of 1400–1500 m and on steeper slopes; (3) no secondary disasters, such as barrier lakes, were detected; (4) 15 damaged sections (totalling 306 m) were detected in the lifelines, and road sections at high risk of damage (totalling 2.4 km) were identified; and (5) key structures, including the Yuxi River Dam and three bridges, were intact. Integrating the earthquake damage factor information generated a comprehensive Lushan earthquake damage information map. The integrated approach was proven effective using the Lushan earthquake as a case study and can be applied to assess earthquake damage to facilitate efficient rescue efforts.

  11. Automated rapid finite fault inversion for megathrust earthquakes: Application to the Maule (2010), Iquique (2014) and Illapel (2015) great earthquakes

    Science.gov (United States)

    Benavente, Roberto; Cummins, Phil; Dettmer, Jan

    2016-04-01

    Rapid estimation of the spatial and temporal rupture characteristics of large megathrust earthquakes by finite fault inversion is important for disaster mitigation. For example, estimates of the spatio-temporal evolution of rupture can be used to evaluate population exposure to tsunami waves and ground shaking soon after the event by providing more accurate predictions than possible with point-source approximations. In addition, rapid inversion results can reveal seismic source complexity to guide additional, more detailed subsequent studies. This work develops a method to rapidly estimate the slip distribution of megathrust events while reducing subjective parameter choices by automation. The method is simple yet robust, and we show that it provides excellent preliminary rupture models in as little as 30 minutes for three great earthquakes in the South American subduction zone. This time may vary for other regions depending on seismic station coverage, but the method can be applied to any subduction region. The inversion is based on W-phase data, since these are rapidly and widely available and of low amplitude, which avoids clipping at close stations for large events. In addition, prior knowledge of the slab geometry (e.g. SLAB 1.0) is applied, and rapid W-phase point source information (time delay and centroid location) is used to constrain the fault geometry and extent. Since the linearization by multiple time window (MTW) parametrization requires regularization, objective smoothing is achieved by the discrepancy principle in two fully automated steps. First, the residuals are estimated assuming unknown noise levels, and second, a subsequent solution is sought which fits the data to the noise level. The MTW scheme is applied with positivity constraints and a solution is obtained by an efficient non-negative least squares solver. 
Systematic application of the algorithm to the Maule (2010), Iquique (2014) and Illapel (2015) events illustrates that rapid finite fault inversion with
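
    The positivity-constrained solve at the heart of the MTW inversion can be illustrated with a tiny projected-gradient non-negative least-squares routine. The operational code uses an efficient dedicated NNLS solver; the matrix layout, step size and iteration count below are illustrative assumptions for a toy problem.

```python
def nnls_projected_gradient(A, b, steps=2000):
    """Solve min ||Ax - b|| subject to x >= 0 by projected gradient
    descent; A is a list of rows, b the data vector. Suitable only
    for small demonstration problems.
    """
    m, n = len(A), len(A[0])
    x = [0.0] * n
    # step size from trace(A^T A), a cheap upper bound on the largest
    # eigenvalue, which guarantees convergence of the iteration
    lip = sum(sum(a * a for a in row) for row in A) or 1.0
    lr = 1.0 / lip
    for _ in range(steps):
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # gradient step followed by projection onto the non-negative orthant
        x = [max(0.0, x[j] - lr * g[j]) for j in range(n)]
    return x
```

    In the finite-fault setting, each column of A would hold the synthetic W-phase waveform of one subfault time window, and the non-negativity constraint keeps every slip-rate contribution physically one-signed.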

  12. Resource-Constrained Information Management: Providing Governments with Information for Earthquake Preparedness

    Science.gov (United States)

    Vatenmacher, Michael; Isaac, Shabtai; Svoray, Tal

    2017-05-01

    This study seeks to attain a better understanding of the information that is required by governments to prepare for earthquakes, and of the constraints they face in obtaining this information. The contributions of the study are two-fold. A survey that was conducted among those responsible for earthquake preparedness actions in different governmental agencies and at different levels revealed on the one hand a desire for information on a broad range of topics, but on the other hand that no resources were allocated in practice to gather this information. A Geographic Information System-based process that was developed following the survey, allowed the required information on seismic hazards and loss and damage risks to be rapidly collected, mapped and integrated. This supported the identification of high-priority areas, for which a more detailed analysis could be initiated. An implementation of the process showed promise, and confirmed its feasibility. Its relative simplicity may ensure that an earthquake preparedness process is initiated by governments that are otherwise reluctant to allocate resources for this purpose.

  13. Thumbnail‐based questionnaires for the rapid and efficient collection of macroseismic data from global earthquakes

    Science.gov (United States)

    Bossu, Remy; Landes, Matthieu; Roussel, Frederic; Steed, Robert; Mazet-Roux, Gilles; Martin, Stacey S.; Hough, Susan E.

    2017-01-01

    The collection of earthquake testimonies (i.e., qualitative descriptions of felt shaking) is essential for macroseismic studies (i.e., studies gathering information on how strongly an earthquake was felt in different places), and when done rapidly and systematically, improves situational awareness and in turn can contribute to efficient emergency response. In this study, we present advances made in the collection of testimonies following earthquakes around the world using a thumbnail‐based questionnaire implemented on the European‐Mediterranean Seismological Centre (EMSC) smartphone app and its mobile-device-compatible website. In both instances, the questionnaire consists of a selection of thumbnails, each representing an intensity level of the European Macroseismic Scale 1998. We find that testimonies are collected faster, and in larger numbers, by way of thumbnail‐based questionnaires than by more traditional online questionnaires. Responses were received from all seismically active regions of our planet, suggesting that thumbnails overcome language barriers. We also observed that the app is not sufficient on its own, because the websites are the main source of testimonies when an earthquake strikes a region for the first time in a while; it is only for subsequent shocks that the app is widely used. Notably though, the speed of the collection of testimonies increases significantly when the app is used. We find that automated EMSC intensities as assigned by user‐specified thumbnails are, on average, well correlated with “Did You Feel It?” (DYFI) responses and with the three independently and manually derived macroseismic datasets, but there is a tendency for EMSC to be biased low with respect to DYFI at moderate and large intensities. We address this by proposing a simple adjustment that will be verified in future earthquakes.

  14. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake

    Science.gov (United States)

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.

    2014-01-01

    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
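    The central-server step described above — correlating incoming station triggers to decide that an earthquake has occurred — can be sketched as a sliding time window over trigger reports. This is a minimal illustration with made-up thresholds, not the QCN production logic:

```python
from dataclasses import dataclass

@dataclass
class Trigger:
    station: str
    time: float  # trigger time in seconds since some epoch

def associate(triggers, min_stations=4, window=3.0):
    """Declare a detection when at least `min_stations` distinct
    stations trigger within a `window`-second time span; return the
    earliest trigger time of the detecting window, else None."""
    trig = sorted(triggers, key=lambda t: t.time)
    i = 0
    for j in range(len(trig)):
        # shrink the window from the left until it spans <= `window` s
        while trig[j].time - trig[i].time > window:
            i += 1
        stations = {t.station for t in trig[i:j + 1]}
        if len(stations) >= min_stations:
            return trig[i].time
    return None
```

    Requiring several distinct stations suppresses single-host false triggers (a laptop being bumped), which is why volunteer-hosted sensors can still yield reliable detections.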

  15. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    Science.gov (United States)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-03-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask layer by layer all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. As for the object-oriented method, a multi-scale segment algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three variables: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects." With an overall accuracy of 90.38%, the method achieves more scientific and accurate results than the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing.
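    The salt-and-pepper noise that plagues per-pixel classification can be illustrated with a toy example: threshold a grayscale image to a candidate "rubble" mask, then clean it with a 3x3 majority filter. The brightness band and filter are illustrative stand-ins, far simpler than the paper's actual rule system:

```python
import numpy as np

def majority_filter(mask):
    """3x3 majority vote over a binary mask: a pixel survives only if
    at least 5 of the 9 cells in its neighbourhood are set, which
    removes isolated (salt-and-pepper) false positives."""
    h, w = mask.shape
    padded = np.pad(mask.astype(int), 1)
    votes = sum(padded[dy:dy + h, dx:dx + w]
                for dy in range(3) for dx in range(3))
    return votes >= 5

def collapsed_mask(gray, low, high):
    """Per-pixel candidate mask for collapsed-building rubble, taken
    here as a simple brightness band (thresholds are hypothetical),
    then cleaned with the majority filter."""
    return majority_filter((gray >= low) & (gray <= high))
```

    A lone bright pixel gathers only one vote and is dropped, while the interior of a contiguous bright block survives — the same intuition that leads object-oriented methods to classify segments rather than pixels.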

  16. The Technical Efficiency of Earthquake Medical Rapid Response Teams Following Disasters: The Case of the 2010 Yushu Earthquake in China.

    Science.gov (United States)

    Liu, Xu; Tang, Bihan; Yang, Hongyang; Liu, Yuan; Xue, Chen; Zhang, Lulu

    2015-12-04

    Performance assessments of earthquake medical rapid response teams (EMRRTs), particularly the first responders deployed to the hardest hit areas following major earthquakes, should consider efficient and effective use of resources. This study assesses the daily technical efficiency of EMRRTs in the emergency period immediately following the 2010 Yushu earthquake in China. Data on EMRRTs were obtained from official daily reports of the general headquarters for Yushu earthquake relief, the emergency office of the National Ministry of Health, and the Health Department of Qinghai Province, for a sample of data on 15 EMRRTs over 62 days. Data envelopment analysis was used to examine the technical efficiency in a constant returns to scale model, a variable returns to scale model, and the scale efficiency of EMRRTs. Tobit regression was applied to analyze the effects of corresponding influencing factors. The average technical efficiency scores under constant returns to scale, variable returns to scale, and the scale efficiency scores of the 62 units of analysis were 77.95%, 89.00%, and 87.47%, respectively. The staff-to-bed ratio was significantly related to global technical efficiency. The date of rescue was significantly related to pure technical efficiency. The type of institution to which an EMRRT belonged and the staff-to-bed ratio were significantly related to scale efficiency. This study provides evidence that supports improvements to EMRRT efficiency and serves as a reference for earthquake emergency medical rapid assistance leaders and teams.
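    The constant-returns-to-scale (CCR) technical efficiency used in such studies is obtained by solving one small linear program per decision-making unit. A compact input-oriented sketch with SciPy follows; the data shapes and values are illustrative, not the paper's EMRRT dataset:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Input-oriented CCR (constant returns to scale) DEA efficiency
    for each decision-making unit. X: (m_inputs, n_units),
    Y: (s_outputs, n_units). Returns one score in (0, 1] per unit."""
    m, n = X.shape
    s = Y.shape[0]
    scores = []
    for o in range(n):
        # decision vector z = [theta, lambda_1..lambda_n]
        c = np.r_[1.0, np.zeros(n)]                 # minimise theta
        A_in = np.hstack([-X[:, [o]], X])           # X @ lam <= theta * x_o
        A_out = np.hstack([np.zeros((s, 1)), -Y])   # Y @ lam >= y_o
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[:, o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.fun)
    return np.array(scores)
```

    A score of 1 means the unit lies on the efficient frontier; a score of 0.5 means the same output could, in principle, be produced with half the inputs.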

  17. The Technical Efficiency of Earthquake Medical Rapid Response Teams Following Disasters: The Case of the 2010 Yushu Earthquake in China

    Directory of Open Access Journals (Sweden)

    Xu Liu

    2015-12-01

    Purpose: Performance assessments of earthquake medical rapid response teams (EMRRTs), particularly the first responders deployed to the hardest hit areas following major earthquakes, should consider efficient and effective use of resources. This study assesses the daily technical efficiency of EMRRTs in the emergency period immediately following the 2010 Yushu earthquake in China. Methods: Data on EMRRTs were obtained from official daily reports of the general headquarters for Yushu earthquake relief, the emergency office of the National Ministry of Health, and the Health Department of Qinghai Province, for a sample of data on 15 EMRRTs over 62 days. Data envelopment analysis was used to examine the technical efficiency in a constant returns to scale model, a variable returns to scale model, and the scale efficiency of EMRRTs. Tobit regression was applied to analyze the effects of corresponding influencing factors. Results: The average technical efficiency scores under constant returns to scale, variable returns to scale, and the scale efficiency scores of the 62 units of analysis were 77.95%, 89.00%, and 87.47%, respectively. The staff-to-bed ratio was significantly related to global technical efficiency. The date of rescue was significantly related to pure technical efficiency. The type of institution to which an EMRRT belonged and the staff-to-bed ratio were significantly related to scale efficiency. Conclusions: This study provides evidence that supports improvements to EMRRT efficiency and serves as a reference for earthquake emergency medical rapid assistance leaders and teams.

  18. EDIM - An Earthquake Disaster Information System for the Marmara Region, Turkey

    Science.gov (United States)

    Köhler, N.; Wenzel, F.; Erdik, M. O.; Zschau, J.; Milkereit, C.; Picozzi, M.; Fischer, J.; Redlich, J.; Kühnlenz, F.; Lichtblau, B.; Eveslage, I.; Christ, I.; Lessing, R.; Kiehle, C.

    2009-12-01

    The research project EDIM (Earthquake Disaster Information system for the Marmara region), part of the GEOTECHNOLOGIEN programme of the German Federal Ministry for Education and Research, is a consortium of German research and commercial organizations together with the Turkish partner KOERI, the Kandilli Observatory and Earthquake Research Institute of the Bogazici University in Istanbul. The main objective of EDIM is to enhance the existing Istanbul earthquake early warning system with a number of scientific and technological developments that, in the end, provide a tool set for earthquake early warning with wide applicability. Innovations focus on three areas: (1) Analysis and options for improvement of the current Istanbul early warning system. (2) Development of a self-organizing seismic early warning information network (SOSEWIN), a decentralized, wireless mesh sensor network of low-cost components supporting earthquake early warning and rapid response tasks. (3) Development of a geoinformation infrastructure and geoinformation system tuned to earthquake early warning purposes, including rapid damage estimates, visualization of earthquake information and damage estimates, and access to information and real-time sensor data. Development within the framework of the Istanbul early warning system, set up and operated by KOERI, allows testing of our novel methods and techniques in an operational system environment. The integration of strong-motion seismology, sensor system hardware and software development, and geoinformation real-time management tools proves a successful concept in making seismic early warning a novel technology with high potential for scientific and technological innovation, disaster mitigation, and many spin-offs for other fields. EDIM can serve as a model for further developments in the field of early warning on a global scale.

  19. Rapid decision tool to predict earthquake destruction in Sumatra by using first motion study

    Science.gov (United States)

    Bhakta, Shardul Sanjay

    The main idea of this project is to build an interactive and smart Geographic Information System (GIS) tool which can help predict the intensity of real-time earthquakes on the island of Sumatra, Indonesia. The tool has an underlying intelligence to predict the intensity of an earthquake based on an analysis of similar earthquakes in the past in that specific region. Whenever an earthquake takes place in Sumatra, a first motion study is conducted; this determines its type, depth, latitude and longitude. When the user inputs this information into the input string, the tool will try to find past earthquakes with a similar first motion survey and depth. It will do a survey of similar earthquakes and predict whether this real-time earthquake can be disastrous or not. This tool has been developed in JAVA. I have used MOJO (Map Objects JAVA Objects) to show the map of Indonesia and earthquake locations in the form of points. ESRI has created MOJO, which is a set of JAVA APIs. The Indonesia map, earthquake location points and their co-relation were all designed using MOJO. MOJO is a powerful tool which made it easy to design the tool. This tool is easy to use and the user has to input only a few parameters for the end result. I hope this tool justifies its use in the prediction of earthquakes and helps save lives in Sumatra.
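    The look-up the thesis describes — find past events with a similar first-motion mechanism and depth, then judge from their outcomes — amounts to a nearest-neighbour vote. A sketch of that idea follows, in Python rather than the thesis's Java/MOJO, over a hypothetical toy catalogue (the mechanisms, depths and labels are invented for illustration):

```python
import math

# Hypothetical catalogue: (strike, dip, rake, depth_km, damaging?)
CATALOGUE = [
    (300, 15, 90, 30, True),    # shallow thrust -> damaging
    (310, 20, 95, 35, True),
    (120, 80, 0, 150, False),   # deep strike-slip -> not damaging
    (130, 85, 5, 140, False),
]

def predict_damaging(strike, dip, rake, depth_km, k=3):
    """Majority vote among the k catalogue events closest in
    mechanism (angles in degrees) and depth."""
    def dist(ev):
        s, d, r, z, _ = ev
        # wrap periodic angle differences onto [0, 180]
        ds = min(abs(strike - s), 360 - abs(strike - s))
        dr = min(abs(rake - r), 360 - abs(rake - r))
        # crude normalisation so each term is of order 1
        return math.hypot(ds / 180, (dip - d) / 90, dr / 180,
                          (depth_km - z) / 100)
    nearest = sorted(CATALOGUE, key=dist)[:k]
    votes = sum(1 for ev in nearest if ev[4])
    return votes > k // 2
```

    A real system would draw its catalogue from an earthquake database and weight by magnitude and distance; the point here is only the retrieve-similar-then-vote structure.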

  20. LastQuake: a comprehensive strategy for rapid engagement of earthquake eyewitnesses, massive crowdsourcing and risk reduction

    Science.gov (United States)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Steed, R.; Frobert, L.

    2015-12-01

    LastQuake is a smartphone app, browser add-on and the most sophisticated Twitter robot (quakebot) for earthquakes currently in operation. It fulfills eyewitnesses' needs by offering information on felt earthquakes and their effects within tens of seconds of their occurrence. Associated with an active presence on Facebook, Pinterest and on websites, this proves a very efficient engagement strategy. For example, the app was installed thousands of times after the Ghorka earthquake in Nepal. Language barriers have been erased by using visual communication; for example, felt reports are collected through a set of cartoons representing different shaking levels. Within 3 weeks of the magnitude 7.8 Ghorka earthquake, 7,000 felt reports with thousands of comments were collected relating to the mainshock and tens of its aftershocks, as well as 100 informative geo-located pictures. The QuakeBot was essential in allowing us to be identified so well and to interact with those affected. LastQuake is also a risk reduction tool since it provides rapid information. Rapid information is akin to prevention: when it does not exist, disasters can happen. When no information is available after a felt earthquake, the public blocks emergency lines trying to find out the cause of the shaking, crowds form, potentially leading to unpredictable crowd movements, and rumors spread. In its next release LastQuake will also provide people with guidance immediately after a shaking through a number of pop-up cartoons illustrating "do/don't do" items (go to open places, do not phone emergency services except if people are injured…). LastQuake's app design is simple and intuitive and has a global audience. It benefited from a crowdfunding campaign (and the support of the Fondation MAIF) and more improvements have been planned after an online feedback campaign organized in early June with the Ghorka earthquake eyewitnesses. LastQuake is also a seismic risk reduction tool thanks to its very rapid

  1. Earthquakes in British Columbia

    National Research Council Canada - National Science Library

    1991-01-01

    This pamphlet provides information about the causes of earthquakes, where earthquakes occur, British Columbia plate tectonics, earthquake patterns, earthquake intensity, geology and earthquake impact...

  2. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses

    Science.gov (United States)

    Marano, K.D.; Wald, D.J.; Allen, T.I.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire, for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is: how dominant are losses due to secondary effects, under what conditions, and in which regions? Thus, which of these effects should receive higher-priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address the potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  3. Rapid earthquake detection through GPU-Based template matching

    Science.gov (United States)

    Mu, Dawei; Lee, En-Jui; Chen, Po

    2017-12-01

    The template-matching algorithm (TMA) has been widely adopted for improving the reliability of earthquake detection. The TMA is based on calculating the normalized cross-correlation coefficient (NCC) between a collection of selected template waveforms and the continuous waveform recordings of seismic instruments. In realistic applications, the computational cost of the TMA is much higher than that of traditional techniques. In this study, we provide an analysis of the TMA and show how the GPU architecture provides an almost ideal environment for accelerating the TMA and NCC-based pattern recognition algorithms in general. So far, our best-performing GPU code has achieved a speedup factor of more than 800 with respect to a common sequential CPU code. We demonstrate the performance of our GPU code using seismic waveform recordings from the ML 6.6 Meinong earthquake sequence in Taiwan.
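    The NCC at the heart of the TMA is simple to state: for each lag, correlate the demeaned, variance-normalized template against the matching window of continuous data. A plain, unoptimized CPU reference version — the kind of baseline a GPU kernel would be checked against — can be written in a few lines:

```python
import numpy as np

def ncc(template, trace):
    """Normalized cross-correlation of a template against a longer
    continuous trace; returns one coefficient in [-1, 1] per lag."""
    m = len(template)
    # pre-normalize the template once: subtract mean, divide by m*std
    t = (template - template.mean()) / (template.std() * m)
    out = np.empty(len(trace) - m + 1)
    for lag in range(len(out)):
        win = trace[lag:lag + m]
        sd = win.std()
        # a flat window has zero variance; define its NCC as 0
        out[lag] = 0.0 if sd == 0 else np.dot(t, (win - win.mean()) / sd)
    return out
```

    The per-lag mean/std work is what makes the method expensive on long continuous streams, and why the computation maps so well onto GPUs: every lag is independent.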

  4. Rapid response seismic networks in Europe: lessons learnt from the L'Aquila earthquake emergency

    Directory of Open Access Journals (Sweden)

    Angelo Strollo

    2011-08-01


    The largest dataset ever recorded during a normal-fault seismic sequence was acquired during the 2009 seismic emergency triggered by the damaging earthquake in L'Aquila (Italy). This was possible through the coordination of different rapid-response seismic networks in Italy, France and Germany. A seismic network of more than 60 stations recorded up to 70,000 earthquakes. Here, we describe the different open-data archives where it is possible to find this unique set of data for studies related to hazard, seismotectonics and earthquake physics. Moreover, we briefly describe some immediate and direct applications of emergency seismic networks. At the same time, we note the absence of communication platforms between the different European networks. Rapid-response networks need to agree on common strategies for network operations. Hopefully, over the next few years, the European Rapid-Response Seismic Network will become a reality.

  5. U.S. Tsunami Information technology (TIM) Modernization: Performance Assessment of Tsunamigenic Earthquake Discrimination System

    Science.gov (United States)

    Hagerty, M. T.; Lomax, A.; Hellman, S. B.; Whitmore, P.; Weinstein, S.; Hirshorn, B. F.; Knight, W. R.

    2015-12-01

    Tsunami warning centers must rapidly decide whether an earthquake is likely to generate a destructive tsunami in order to issue a tsunami warning quickly after a large event. For very large events (Mw > 8 or so), magnitude and location alone are sufficient to warrant an alert. However, for events of smaller magnitude (e.g., Mw ~ 7.5), particularly for so-called "tsunami earthquakes", magnitude alone is insufficient to issue an alert, and other measurements must be rapidly made and used to assess tsunamigenic potential. The Tsunami Information Technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). We (ISTI) are responsible for implementing the seismic monitoring components in this new system, including real-time seismic data collection and seismic processing. The seismic data processor includes a variety of methods aimed at real-time discrimination of tsunamigenic events, including: Mwp, Me, slowness (Theta), W-phase, mantle magnitude (Mm), array processing and finite-fault inversion. In addition, it contains the ability to designate earthquake scenarios and play the resulting synthetic seismograms through the processing system. Thus, it is also a convenient tool that integrates research and monitoring and may be used to calibrate and tune the real-time monitoring system. Here we show results of the automated processing system for a large dataset of subduction-zone earthquakes containing recent tsunami earthquakes, examine the accuracy of the various discrimination methods, and discuss issues related to their successful real-time application.

  6. An Earthquake Information Service with Free and Open Source Tools

    Science.gov (United States)

    Jüngling, Sebastian; Schroeder, Matthias; Lühr, Birger-Gottfried; Woith, Heiko; Wächter, Joachim

    2016-04-01

    At the GFZ German Research Centre for Geosciences in Potsdam, the working group Earthquakes and Volcano Physics examines the spatiotemporal behavior of earthquakes. In this context, the hazards of volcanic eruptions and tsunamis are also explored. The aim is to collect related event parameters after the occurrence of extreme events and make them available to science and the public as quickly as possible. The overall objective of this research is to reduce the geological risks that emanate from such natural hazards. In order to meet the stated objectives, to get a quick overview of the seismicity of a particular region, and to compare the situation to historical and current events, a comprehensive visualization is necessary. Based on web-accessible data from the well-known GFZ GEOFON network, a user-friendly interactive web-mapping application was realized. Further, this web service integrates historical and current earthquake information from the USGS earthquake database NEIC, and more historical events from various other catalogues such as Pacheco, the International Seismological Centre (ISC) and others. This compilation of data sources is unique in the Earth sciences. Additionally, information about historical and current occurrences of volcanic eruptions and tsunamis is retrievable too. Another special feature of the application is the limitation of time spans via a time-shifting tool: users can interactively vary the visualization by moving the time slider. In addition, the events can be narrowed down based on the magnitude, the wave height of tsunamis or the volcanic explosivity index. Furthermore, the use of the latest JavaScript libraries makes it possible to display the application on all screen sizes and devices. With this application, information on current and historical earthquakes and other extreme events can be obtained in a spatio-temporal context, such as the concomitant visualization of the seismicity of a particular region.

  7. Scientific Information Platform for the 2008 Great Wenchuan Earthquake

    Science.gov (United States)

    Liang, C.

    2012-12-01

    The 2008 MS 8.0 Wenchuan earthquake is one of the deadliest in recent human history. This earthquake has not just united the whole world to help local people through the difficult time; it has also fostered significant global cooperation to study this event from various aspects, including pre-seismic phenomena (such as seismicity, gravity, electro-magnetic fields, well water levels, radon levels in water, etc.), co-seismic phenomena (fault slip, landslides, damage to man-made structures, etc.) and post-seismic phenomena (such as aftershocks, well water level changes, etc.), as well as the disaster relief efforts. In the last four years, more than 300 scientific articles have been published in peer-reviewed journals; among them about 50% are published in Chinese, 30% in English, and about 20% in both languages. These studies have advanced our understanding of earthquake science in general. They have also sparked open debates on many aspects. Notably, the role of the Zipingpu reservoir (built not long before the earthquake) in the triggering of this monstrous earthquake is still one of many continuing debates. Given that all these articles are sporadically spread out over different journals, numerous issues and different languages, it can be very inefficient, sometimes impossible, to dig out the information that is needed. The Earthquake Research Group in the Chengdu University of Technology (ERGCDUT) has initiated an effort to develop an information platform to collect and analyze scientific research on or related to this earthquake, the hosting faults and the surrounding tectonic regions. A preliminary website has been set up for this purpose: http://www.wenchuaneqresearch.org. Up to this point (July 2012), articles published in 6 Chinese journals and 7 international journals have been collected. Articles are listed journal by journal, and also grouped by content into four major categories, including pre-seismic events, co-seismic events, post

  8. Conceptualizing the Abstractions of Earthquakes Through an Instructional Sequence Using SeisMac and the Rapid Earthquake Viewer

    Science.gov (United States)

    Taber, J.; Hubenthal, M.; Wysession, M.

    2007-12-01

    Newsworthy earthquakes provide an engaging hook for students in Earth science classes, particularly when discussing their effects on people and the landscape. However, engaging students in an analysis of earthquakes that extends beyond death and damage is frequently hampered by the abstraction of recorded ground motion data in the form of raw seismograms and the inability of most students to personally relate to ground accelerations. To overcome these challenges, an educational sequence has been developed using two software tools: SeisMac by Daniel Griscom, and the Rapid Earthquake Viewer (REV) developed by the University of South Carolina in collaboration with IRIS and DLESE. This sequence presents a unique opportunity for Earth science teachers to "create" foundational experiences for students as they construct a framework for understanding abstract concepts. The first activity is designed to introduce the concept of a three-component seismogram and to directly address the very abstract nature of seismograms through a kinesthetic experience. Students first learn to take the pulse of their classroom through a guided exploration of SeisMac, which displays the output of the laptop's built-in Sudden Motion Sensor (a 3-component accelerometer). This exploration allows students to view a 3-component seismogram as they move or tap the laptop and encourages them to propose and carry out experiments to explain the meaning of the 3-component seismogram. Once this is completed, students are asked to apply the new knowledge to a real 3-component seismogram printed from REV. Next, the activity guides students through the process of identifying P and S waves, using SeisMac to connect the physical motion of the laptop to the "wiggles" they see on the SeisMac display, and then comparing those to the "wiggles" they see on their seismogram. At this point students are more fully prepared to engage in an S-P location exercise such as those included in many state standards
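    The S-P location exercise rests on a single relation: S waves lag P waves more the farther both have travelled, so the S-minus-P time fixes the epicentral distance. Assuming straight paths through a uniform crust (the 6.0 and 3.5 km/s defaults below are typical illustrative crustal values):

```python
def sp_distance_km(sp_seconds, vp=6.0, vs=3.5):
    """Epicentral distance from the S-minus-P arrival-time difference.

    With travel time d/v for each phase, dt = d/vs - d/vp, so
    d = dt / (1/vs - 1/vp). Velocities in km/s, dt in seconds."""
    return sp_seconds / (1.0 / vs - 1.0 / vp)
```

    With these defaults, each second of S-P delay corresponds to 8.4 km of distance; circles of that radius drawn around three stations intersect at the epicenter, which is exactly the classroom exercise described above.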

  9. Volunteered Geographic Information for Disaster Management with Application to Earthquake Disaster Databank & Sharing Platform

    Science.gov (United States)

    Chen, H.; Zhang, W. C.; Deng, C.; Nie, N.; Yi, L.

    2017-02-01

    All phases of disaster management require up-to-date and accurate information. Different in-situ and remote sensor systems help to monitor dynamic properties such as air quality, water level or inundated areas. The rapid emergence of web-based services has facilitated the collection, dissemination, and cartographic representation of spatial information from the public, giving rise to the idea of using Volunteered Geographic Information (VGI) to aid disaster management. In this study, with a brief review of the concept and development of disaster management, opportunities and challenges for applying VGI in disaster management were explored. The challenges, including data availability, data quality, data management and legal issues of using VGI for disaster management, were discussed in detail, with particular emphasis on the actual needs of disaster management practice in China. Three different approaches to assuring VGI data quality, namely the classification and authority design of volunteers, a government-led VGI data acquisition framework for disaster management, and a quality assessment system for VGI, were presented and discussed. As a case study, a prototype VGI-oriented earthquake disaster databank and sharing platform, an open WebGIS system in which volunteers and other interested individuals collaboratively create and manage earthquake-disaster-related information, was proposed, to provide a reference for improving the level of earthquake emergency response and disaster mitigation in China.

  10. Software Toolbox Development for Rapid Earthquake Source Optimisation Combining InSAR Data and Seismic Waveforms

    Science.gov (United States)

    Isken, Marius P.; Sudhaus, Henriette; Heimann, Sebastian; Steinberg, Andreas; Bathke, Hannes M.

    2017-04-01

    We present a modular open-source software framework (pyrocko, kite, grond; http://pyrocko.org) for rapid InSAR data post-processing and modelling of tectonic and volcanic displacement fields derived from satellite data. Our aim is to ease and streamline the joint optimisation of earthquake observations from InSAR and GPS data together with seismological waveforms for an improved estimation of the ruptures' parameters. Through this approach we can provide finite models of earthquake ruptures and therefore contribute to a timely and better understanding of earthquake kinematics. The new kite module enables fast processing of unwrapped InSAR scenes for source modelling: the spatial sub-sampling and data error/noise estimation for the interferogram are evaluated automatically and interactively. The rupture's near-field surface displacement data are then combined with seismic far-field waveforms and jointly modelled using the pyrocko.gf framework, which allows for fast forward modelling based on pre-calculated elastodynamic and elastostatic Green's functions. Lastly, the grond module supplies a bootstrap-based probabilistic (Monte Carlo) joint optimisation to estimate the parameters and uncertainties of a finite-source earthquake rupture model. We describe the developed and applied methods as an effort to establish a semi-automatic processing and modelling chain. The framework is applied to Sentinel-1 data from the 2016 Central Italy earthquake sequence, for which we present the earthquake mechanism and rupture model, from which we derive regions of increased Coulomb stress. The open-source software framework is developed at GFZ Potsdam and at the University of Kiel, Germany; it is written in the Python and C programming languages. The toolbox architecture is modular and independent, and can be utilized flexibly for a variety of geophysical problems. This work is conducted within the BridGeS project (http://www.bridges.uni-kiel.de) funded by the German Research Foundation DFG
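    The spatial sub-sampling step applied to unwrapped interferograms is commonly a quadtree reduction: split the scene into quadrants until each tile's displacement variance is small, then keep one mean value per leaf, so millions of pixels shrink to a few hundred data points for the inversion. A bare-bones sketch of that idea (not the kite implementation; the thresholds are illustrative):

```python
import numpy as np

def quadtree(displ, x0=0, y0=0, max_var=1e-4, min_size=4):
    """Recursively split a 2-D displacement grid into quadrants until
    each leaf's variance falls below `max_var` (or the tile reaches
    `min_size` pixels); return (x, y, width, height, mean) per leaf."""
    leaves = []
    def split(tile, x, y):
        h, w = tile.shape
        if np.nanvar(tile) <= max_var or min(h, w) <= min_size:
            leaves.append((x, y, w, h, float(np.nanmean(tile))))
            return
        h2, w2 = h // 2, w // 2
        split(tile[:h2, :w2], x, y)
        split(tile[:h2, w2:], x + w2, y)
        split(tile[h2:, :w2], x, y + h2)
        split(tile[h2:, w2:], x + w2, y + h2)
    split(displ, x0, y0)
    return leaves
```

    Smooth far-field areas collapse into a few large leaves, while the high-gradient near-field around the rupture keeps fine resolution, which is precisely where the source model needs data density.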

  11. Earthquakes

    Science.gov (United States)

    ... earthquake occurs in a populated area, it may cause property damage, injuries, and even deaths. If you live in a coastal area, there is the possibility of a tsunami. Damage from earthquakes can also lead to floods or fires. Although there are no guarantees of ...

  12. Insight into the Earthquake Risk Information Seeking Behavior of the Victims: Evidence from Songyuan, China

    Directory of Open Access Journals (Sweden)

    Shasha Li

    2017-03-01

    Efficient risk communication is a vital way to reduce the vulnerability of individuals when facing emergency risks, especially regarding earthquakes. Efficient risk communication aims at improving the supply of risk information and fulfilling the need for risk information by individuals. Therefore, an investigation into individual-level information seeking behavior within earthquake risk contexts is very important for improved earthquake risk communication. However, at present there are very few studies that have explored the behavior of individuals seeking earthquake risk information. Under the guidance of the Risk Information Seeking and Processing model as well as relevant practical findings using the structural equation model, this study attempts to explore the main determinants of an individual’s earthquake risk information seeking behavior, and to validate the mediator effect of information need during the seeking process. A questionnaire-based survey of 918 valid respondents in Songyuan, China, who had been hit by a small earthquake swarm, was used to provide practical evidence for this study. Results indicated that information need played a noteworthy role in the earthquake risk information seeking process, and was detected both as an immediate predictor and as a mediator. Informational subjective norms drive the seeking behavior on earthquake risk information through both direct and indirect approaches. Perceived information gathering capacity, negative affective responses and risk perception have an indirect effect on earthquake risk information seeking behavior via information need. The implications for theory and practice regarding risk communication are discussed and concluded.

  13. Insight into the Earthquake Risk Information Seeking Behavior of the Victims: Evidence from Songyuan, China.

    Science.gov (United States)

    Li, Shasha; Zhai, Guofang; Zhou, Shutian; Fan, Chenjing; Wu, Yunqing; Ren, Chongqiang

    2017-03-07

    Efficient risk communication is a vital way to reduce the vulnerability of individuals when facing emergency risks, especially regarding earthquakes. Efficient risk communication aims at improving the supply of risk information and fulfilling the need for risk information by individuals. Therefore, an investigation into individual-level information seeking behavior within earthquake risk contexts is very important for improved earthquake risk communication. However, at present there are very few studies that have explored the behavior of individuals seeking earthquake risk information. Under the guidance of the Risk Information Seeking and Processing model as well as relevant practical findings using the structural equation model, this study attempts to explore the main determinants of an individual's earthquake risk information seeking behavior, and to validate the mediator effect of information need during the seeking process. A questionnaire-based survey of 918 valid respondents in Songyuan, China, who had been hit by a small earthquake swarm, was used to provide practical evidence for this study. Results indicated that information need played a noteworthy role in the earthquake risk information seeking process, and was detected both as an immediate predictor and as a mediator. Informational subjective norms drive the seeking behavior on earthquake risk information through both direct and indirect approaches. Perceived information gathering capacity, negative affective responses and risk perception have an indirect effect on earthquake risk information seeking behavior via information need. The implications for theory and practice regarding risk communication are discussed and concluded.

  15. Utilizing Information Technology to Facilitate Rapid Acquisition

    Science.gov (United States)

    2006-06-01

    This thesis examines ordering systems to facilitate streamlined commercial item acquisitions that reap the benefits of improved efficiency, reduced overall costs, and timeliness. Subject terms: Rapid Acquisition, eCommerce, eProcurement, Information Technology, Contracting, Global Information Network.

  16. Information for action? Analysis of 2005 South Asian earthquake reports posted on Reliefweb.

    Science.gov (United States)

    von Schreeb, Johan; Legha, Jaswinder K; Karlsson, Niklas; Garfield, Richard

    2013-06-01

    Following a sudden-onset disaster (SOD), rapid information is needed. We assessed the relevance of information available for relief planning on a main Internet portal following a major SOD. We reviewed all information posted on the Reliefweb Web site in the first 7 days following the 2005 South Asian earthquake using a predeveloped registration form focusing on essential indicators. These data were compared with Pakistani government figures posted by the Centre for Research on the Epidemiology of Disasters. A total of 820 reports were reviewed. More reports came from nongovernmental organizations (NGOs; 35%) than any other source. A total of 42% of reports addressed only national level information, while 32% specified information at the provincial level. Fewer than 12% of all reports discussed the earthquake at the more local division and district levels. Only 13 reports provided pre-earthquake estimates of the number of people living in the affected areas. A third of all reports cited a common figure of 2.5 million made homeless. These were lower than official figures of 5 million homeless. A total of 43% reported on the estimated number of deaths. The estimated number peaked on day 4 at 40 000. All of these reports were lower than official data, which reported 73 000 deaths in total. Early reports heavily underestimated the number of affected, homeless, injured, and dead. Many reports repeated information provided from previous unnamed sources rather than providing unique contributions from eyewitness reports or from contextual information based on previous work in the area. Better information on predisaster essential indicators should be available and used in combination with post-SOD information to better adapt humanitarian relief and funding according to needs.

  17. Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information

    Science.gov (United States)

    Thompson, K. J.; Krantz, D. H.

    2010-12-01

    The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D & Tversky, A (1979). Prospect

  18. Earthquakes

    Science.gov (United States)

    Shedlock, Kaye M.; Pakiser, Louis Charles

    1998-01-01

    One of the most frightening and destructive phenomena of nature is a severe earthquake and its terrible aftereffects. An earthquake is a sudden movement of the Earth, caused by the abrupt release of strain that has accumulated over a long time. For hundreds of millions of years, the forces of plate tectonics have shaped the Earth as the huge plates that form the Earth's surface slowly move over, under, and past each other. Sometimes the movement is gradual. At other times, the plates are locked together, unable to release the accumulating energy. When the accumulated energy grows strong enough, the plates break free. If the earthquake occurs in a populated area, it may cause many deaths and injuries and extensive property damage. Today we are challenging the assumption that earthquakes must present an uncontrollable and unpredictable hazard to life and property. Scientists have begun to estimate the locations and likelihoods of future damaging earthquakes. Sites of greatest hazard are being identified, and definite progress is being made in designing structures that will withstand the effects of earthquakes.

  19. Rapid repair techniques for severely earthquake-damaged circular bridge piers with flexural failure mode

    Science.gov (United States)

    Sun, Zhiguo; Li, Hongnan; Bi, Kaiming; Si, Bingjun; Wang, Dongsheng

    2017-04-01

    In this study, three rapid repair techniques are proposed to retrofit circular bridge piers that are severely damaged by the flexural failure mode in major earthquakes. The quasi-static tests on three 1:2.5 scaled circular pier specimens are conducted to evaluate the efficiency of the proposed repair techniques. For the purpose of rapid repair, the repair procedure for all the specimens is conducted within four days, and the behavior of the repaired specimens is evaluated and compared with the original ones. A finite element model is developed to predict the cyclic behavior of the repaired specimens and the numerical results are compared with the test data. It is found that all the repaired specimens exhibit similar or larger lateral strength and deformation capacity than the original ones. The initial lateral stiffness of all the repaired specimens is lower than that of the original ones, while they show a higher lateral stiffness at the later stage of the test. No noticeable difference is observed for the energy dissipation capacity between the original and repaired pier specimens. It is suggested that the repair technique using the early-strength concrete jacket confined by carbon fiber reinforced polymer (CFRP) sheets can be an optimal method for the rapid repair of severely earthquake-damaged circular bridge piers with flexural failure mode.

  20. EDIM - Earthquake Disaster Information System for the Marmara Region, Turkey

    Science.gov (United States)

    Wenzel, Friedemann; Erdik, Mustafa; Zschau, Jochen; Fischer, Joachim; Christ, Ingrid; Kiehle, Christian

    2010-05-01

    The main objectives of EDIM (www.cedim.de/EDIM.php) are to enhance the Istanbul earthquake early warning (EEW) system with a number of scientific and technological developments that - in the end - provide a tool set for EEW with wide applicability. Innovations focus on three areas. (1) Analysis and options for improvement of the current system; (2) development of a new type of self-organising sensor system and its application to early warning; (3) development of a geoinformation infrastructure and geoinformation system tuned to early warning purposes. Development in the frame of the Istanbul system, set up and operated by KOERI, allows testing our novel methods and techniques in an operational system environment and working in a partnership with a long-standing tradition of success. EDIM is a consortium of Karlsruhe University (TH), GeoForschungsZentrum (GFZ) Potsdam, Humboldt University (HU) Berlin, lat/lon GmbH Bonn, DELPHI Informations Muster Management GmbH Potsdam, and Kandilli Observatory and Earthquake Research Institute (KOERI) of the Bogazici University in Istanbul. The integration of strong motion seismology, sensor system hard- and software development, and geoinformation real-time management tools proves a successful concept in making seismic early warning a novel technology with high potential for scientific and technological innovation, disaster mitigation, and many spin-offs for other fields. EDIM can serve as a model for further developments in the field of early warning on a global scale.

  1. Turning the rumor of the May 11, 2011, earthquake prediction in Rome, Italy, into an information day on earthquake hazard

    Directory of Open Access Journals (Sweden)

    Concetta Nostro

    2012-07-01

    Full Text Available A devastating earthquake was predicted to hit Rome on May 11, 2011. This prediction was never officially released, but it grew on the internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions and related them to earthquakes. Indeed, around May 11, 2011, there was a planetary alignment, and this fed the credibility of the earthquake prediction. During the months preceding May 2011, the Istituto Nazionale di Geofisica e Vulcanologia (INGV was overwhelmed with requests for information about this prediction, by the inhabitants of Rome and by tourists. Given the echo of this earthquake prediction, on May 11, 2011, the INGV decided to organize an Open Day at its headquarters in Rome, to inform the public about Italian seismicity and earthquake physics. The Open Day was preceded by a press conference two days before, to talk with journalists about this prediction, and to present the Open Day. During this ‘Day’, 13 new videos were also posted on our YouTube/INGVterremoti channel to explain earthquake processes and hazards, and to provide periodic updates on seismicity in Italy from the seismicity monitoring room. On May 11, 2011, the INGV headquarters was peacefully invaded by over 3,000 visitors, from 10:00 am to 9:00 pm: families, students with and without teachers, civil protection groups, and many journalists. This initiative, organized in only a few weeks, drew a very large response and was a great opportunity to talk with journalists and the public about earthquake prediction and, more generally, about seismic risk in Italy.

  2. Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT): Towards the Next Generation of Internship

    Science.gov (United States)

    Perry, S.; Benthien, M.; Jordan, T. H.

    2005-12-01

    The SCEC/UseIT internship program is training the next generation of earthquake scientist, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer, 2005, were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), a 3D visualization software that was prototyped by interns last year, using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

  3. The California Post-Earthquake Information Clearinghouse: A Plan to Learn From the Next Large California Earthquake

    Science.gov (United States)

    Loyd, R.; Walter, S.; Fenton, J.; Tubbesing, S.; Greene, M.

    2008-12-01

    In the rush to remove debris after a damaging earthquake, perishable data related to a wide range of impacts on the physical, built and social environments can be lost. The California Post-Earthquake Information Clearinghouse is intended to prevent this data loss by supporting the earth scientists, engineers, and social and policy researchers who will conduct fieldwork in the affected areas in the hours and days following the earthquake to study these effects. First called for by Governor Ronald Reagan following the destructive M6.5 San Fernando earthquake in 1971, the concept of the Clearinghouse has since been incorporated into the response plans of the National Earthquake Hazard Reduction Program (USGS Circular 1242). This presentation is intended to acquaint scientists with the purpose, functions, and services of the Clearinghouse. Typically, the Clearinghouse is set up in the vicinity of the earthquake within 24 hours of the mainshock and is maintained for several days to several weeks. It provides a location where field researchers can assemble to share and discuss their observations, plan and coordinate subsequent field work, and communicate significant findings directly to the emergency responders and to the public through press conferences. As the immediate response effort winds down, the Clearinghouse will ensure that collected data are archived and made available through "lessons learned" reports and publications that follow significant earthquakes. Participants in the quarterly meetings of the Clearinghouse include representatives from state and federal agencies, universities, NGOs and other private groups. Overall management of the Clearinghouse is delegated to the agencies represented by the authors above.

  4. Preparing for an Earthquake: Information for Schools and Families

    Science.gov (United States)

    Heath, Melissa Allen; Dean, Brenda

    2008-01-01

    Over the past decade, catastrophic earthquakes have garnered international attention regarding the need for improving immediate and ongoing support services for disrupted communities. Following the December 26, 2004 Indonesian earthquake, the Indian Ocean tsunami was responsible for displacing millions and taking the lives of an estimated 320,000…

  5. Letter to the Editor : Rapidly-deployed small tent hospitals: lessons from the earthquake in Haiti.

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, Y.; Gurman, P.; Verna, E.; Elman, N.; Labor, E. (Materials Science Division); (Superior NanoBioSystems LLC); (Fast Israeli Rescue & Search Team); (Clinique Adonai); (Mass. Inst. Tech.); (Univ. Haifa)

    2012-06-01

    The damage to medical facilities resulting from the January 2010 earthquake in Haiti necessitated the establishment of field tent hospitals. Much of the local medical infrastructure was destroyed or limited operationally when the Fast Israeli Rescue and Search Team (FIRST) arrived in Haiti shortly after the January 2010 earthquake. The FIRST deployed small tent hospitals in Port-au-Prince and in 11 remote areas outside of the city. Each tent was set up in less than half an hour. The tents were staffed with an orthopedic surgeon, gynecologists, primary care and emergency care physicians, a physician with previous experience in tropical medicine, nurses, paramedics, medics, and psychologists. The rapidly deployable and temporary nature of the effort allowed the team to treat and educate, as well as provide supplies for, thousands of refugees throughout Haiti. In addition, a local Haitian physician and his team created a small tent hospital to serve the Petion Refugee Camp and its environs. FIRST personnel also took shifts at this hospital.

  6. Time-series analysis of earthquake sequences by means of information recognizer

    Science.gov (United States)

    Vogel, E. E.; Saravia, G.; Pastén, D.; Muñoz, V.

    2017-08-01

    Three seismic sequences of several thousand earthquakes each are analyzed by means of a tunable information recognizer known as wlzip. These sequences are different both in the geographical coverage and the time span, including earthquakes of magnitude larger than 8.0. The main variable under scrutiny here is the time interval between consecutive events. Two parameters (mutability and interval dilation) are defined for each sequence, which relate to the information contained in it. In this way it is possible to characterize different regimes in the seismic activity. For instance, mutability increases before large earthquakes and decreases sharply immediately after each of these events. On the other hand, interval dilation reaches a clear maximum several months before major earthquakes, while it decreases to its lowest possible value after such earthquakes during the aftershock regime. Extensions of the application of this new method to other problems in seismicity are mentioned.
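
    The wlzip recognizer itself is not reproduced in the abstract, so the sketch below only illustrates the underlying idea: tracking the information content of the inter-event interval sequence with a sliding window. Here zlib's compressed size stands in as a crude compressibility measure; the discretization, window length, and the `mutability_proxy` name are assumptions for illustration, not values or code from the paper.

```python
import random
import zlib

def interevent_times(event_times):
    """Intervals between consecutive events (the variable the paper studies)."""
    return [t2 - t1 for t1, t2 in zip(event_times, event_times[1:])]

def mutability_proxy(intervals, window=50):
    """Sliding-window information content of discretized intervals.

    wlzip is replaced by a crude stand-in: the zlib-compressed size of a
    text encoding of each window, normalized by the raw size. Regular
    activity compresses well (low values); irregular activity does not.
    """
    out = []
    for i in range(len(intervals) - window + 1):
        chunk = intervals[i:i + window]
        # Discretize to integer seconds so similar intervals compress well.
        encoded = ",".join(str(int(round(dt))) for dt in chunk).encode()
        out.append(len(zlib.compress(encoded)) / len(encoded))
    return out
```

    A strictly periodic catalog then yields a markedly lower proxy value than an irregular one, loosely mirroring the paper's observation that mutability rises when seismic activity becomes less regular.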

  7. Interactive terrain visualization enables virtual field work during rapid scientific response to the 2010 Haiti earthquake

    Science.gov (United States)

    Cowgill, Eric; Bernardin, Tony S.; Oskin, Michael E.; Bowles, Christopher; Yikilmaz, M. Burak; Kreylos, Oliver; Elliott, Austin J.; Bishop, Scott; Gold, Ryan D.; Morelan, Alexander; Bawden, Gerald W.; Hamann, Bernd; Kellogg, Louise

    2012-01-01

    The moment magnitude (Mw) 7.0 12 January 2010 Haiti earthquake is the first major earthquake for which a large-footprint LiDAR (light detection and ranging) survey was acquired within several weeks of the event. Here, we describe the use of virtual reality data visualization to analyze massive amounts (67 GB on disk) of multiresolution terrain data during the rapid scientific response to a major natural disaster. In particular, we describe a method for conducting virtual field work using both desktop computers and a 4-sided, 22 m3 CAVE immersive virtual reality environment, along with KeckCAVES (Keck Center for Active Visualization in the Earth Sciences) software tools LiDAR Viewer, to analyze LiDAR point-cloud data, and Crusta, for 2.5 dimensional surficial geologic mapping on a bare-earth digital elevation model. This system enabled virtual field work that yielded remote observations of the topographic expression of active faulting within an ∼75-km-long section of the eastern Enriquillo–Plantain Garden fault spanning the 2010 epicenter. Virtual field observations indicated that the geomorphic evidence of active faulting and ancient surface rupture varies along strike. Landform offsets of 6–50 m along the Enriquillo–Plantain Garden fault east of the 2010 epicenter and closest to Port-au-Prince attest to repeated recent surface-rupturing earthquakes there. In the west, the fault trace is well defined by displaced landforms, but it is not as clear as in the east. The 2010 epicenter is within a transition zone between these sections that extends from Grand Goâve in the west to Fayette in the east. Within this transition, between L'Acul (lat 72°40′W) and the Rouillone River (lat 72°35′W), the Enriquillo–Plantain Garden fault is undefined along an embayed low-relief range front, with little evidence of recent surface rupture. Based on the geometry of the eastern and western faults that show evidence of recent surface rupture, we propose that the 2010

  8. An overview of the National Earthquake Information Center acquisition software system, Edge/Continuous Waveform Buffer

    Science.gov (United States)

    Patton, John M.; Ketchum, David C.; Guy, Michelle R.

    2015-11-02

    This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.

  9. Twitter earthquake detection: earthquake monitoring in a social world

    National Research Council Canada - National Science Library

    Daniel C. Bowden; Paul S. Earle; Michelle Guy

    2011-01-01

    ... messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages...

  10. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    Science.gov (United States)

    Zollo, Aldo

    2016-04-01

    of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic wave coda duration. Starting from the magnitude estimates, other relevant pieces of information are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps are produced on a Google map, for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, the automatic discrimination between local earthquakes occurred within the network and regional/teleseismic events occurred outside the network is performed. Finally, for the largest events, if a sufficient number of P-wave polarity readings are available, the focal mechanism is also computed. For each event, all of the available pieces of information are stored in a local database and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with event location and stations, as well as a table listing all the events, with the associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded in the database and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level seismic data analysis, and it may represent a relevant tool not only for seismologists, but also for non
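
    Duration magnitude, mentioned above, is conventionally computed from coda length with a relation of the form Md = A + B·log10(τ) + C·R. The sketch below uses that standard functional form with placeholder coefficients; the values of A, B, C and the simple station averaging are illustrative assumptions, not the calibration used by the bulletin software described in the abstract.

```python
import math

# Placeholder coefficients for illustration only; operational values are
# calibrated per seismic network.
A, B, C = -0.87, 2.0, 0.0035

def duration_magnitude(coda_s, dist_km):
    """Duration magnitude from coda length (s) and epicentral distance (km),
    using the standard functional form Md = A + B*log10(tau) + C*R."""
    return A + B * math.log10(coda_s) + C * dist_km

def network_md(station_measurements):
    """Combine per-station (coda, distance) pairs into a network Md by
    simple averaging, a common bulletin convention (an assumption here)."""
    vals = [duration_magnitude(tau, r) for tau, r in station_measurements]
    return sum(vals) / len(vals)
```

    With these placeholder coefficients a 100 s coda at zero epicentral distance maps to Md ≈ 3.1, and longer codas monotonically increase the estimate, which is the behavior any calibrated Md relation shares.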

  11. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    Science.gov (United States)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

    A devastating earthquake had been predicted for May 11, 2011 in Rome. This prediction was never released officially by anyone, but it grew on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions. Indeed, a planetary alignment was expected around May 11, 2011, and this lent the earthquake prediction credibility among the public. During the previous months, INGV was overwhelmed with requests for information about this supposed prediction by Roman inhabitants and tourists. Given the considerable media impact of this expected earthquake, INGV decided to organize an Open Day at its headquarters in Rome for people who wanted to learn more about Italian seismicity and earthquakes as a natural phenomenon. The Open Day was preceded by a press conference two days before, in which we talked about this prediction, we presented the Open Day, and we had a scientific discussion with journalists about earthquake prediction and, more generally, about the real problem of seismic risk in Italy. About 40 journalists from newspapers, local and national TV stations, press agencies and web news outlets attended the Press Conference, and hundreds of articles appeared in the following days, advertising the 11 May Open Day. The INGV opened to the public all day long (9am - 9pm) with the following program: i) meetings with INGV researchers to discuss scientific issues; ii) visits to the seismic monitoring room, open 24h/7 all year; iii) guided tours through interactive exhibitions on earthquakes and Earth's deep structure; iv) lectures on general topics from the social impact of rumors to seismic risk reduction; v) 13 new videos on channel YouTube.com/INGVterremoti to explain the earthquake process and give updates on various aspects of seismic monitoring in Italy; vi) distribution of books and brochures. Surprisingly, more than 3000 visitors came to visit INGV

  12. Multi-scenario analysis: a new hybrid approach to inform earthquake disaster risk planning

    Science.gov (United States)

    Robinson, Tom; Rosser, Nick

    2017-04-01

    Current earthquake risk assessments take one of two approaches: deterministic (scenario) or probabilistic, but both have notable limitations. Deterministic approaches are limited by a focus on a single scenario, as the results of the analysis are only relevant to the scenario selected, which is unlikely to represent the earthquake that occurs next, nor its impacts. Alternatively, probabilistic approaches are sensitive to the completeness of evidence of past earthquakes, which is inadequate in most seismically-active parts of the world. Consequently, earthquake risk assessments have failed to inform planning prior to major earthquakes such as the 2005 Kashmir and 2008 Wenchuan disasters. This study presents a new hybrid approach for earthquake risk assessments that maintains the high detail of deterministic approaches but considers numerous scenarios simultaneously, similar to probabilistic approaches. The aim of such an approach is to identify impacts that recur in multiple scenarios, or impacts that occur irrespective of the given scenario. Such recurring impacts can be considered the most likely consequences to occur in the next earthquake, despite the precise details of the next earthquake remaining unknown. To demonstrate this, we apply the method to Nepal, one of the most seismically at-risk nations in the world. We model 30 different potential earthquake scenarios throughout the country with magnitude ranges 8.6 to 7.0 for three different times of day (night-time, mid-week day-time, weekend day-time) for a total of 90 different scenarios. By combining the results from each scenario for individual districts, we are able to assess which districts are most at risk of losses in the next earthquake. By focussing on fatalities as a percentage of total population, we rank each district by its: (a) median modelled fatalities; (b) percentage of scenarios with >0 fatalities; (c) inter-quartile range of modelled fatalities; and (d) maximum modelled fatalities. 
Combining
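The district-ranking step described in the abstract can be sketched in a few lines; the district names, fatality rates, and the crude quartile computation below are illustrative placeholders, not values or code from the study.

```python
from statistics import median

# Hypothetical fatality rates (% of district population) across modelled
# scenarios; district names and numbers are illustrative, not study data.
scenario_results = {
    "DistrictA": [0.0, 0.2, 1.5, 0.1, 3.0],
    "DistrictB": [0.0, 0.0, 0.0, 0.1, 0.4],
}

def summarise(rates):
    """The four ranking statistics named in the abstract: median, share of
    scenarios with any fatalities, inter-quartile range, and maximum."""
    s = sorted(rates)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]  # crude quartiles, fine for a sketch
    return {
        "median": median(s),
        "pct_nonzero": 100.0 * sum(1 for r in s if r > 0) / n,
        "iqr": q3 - q1,
        "max": s[-1],
    }

# Rank districts by median modelled fatality rate, worst first:
ranking = sorted(scenario_results,
                 key=lambda d: summarise(scenario_results[d])["median"],
                 reverse=True)
```

Impacts that recur across many scenarios surface as a high `pct_nonzero` together with a non-trivial median, which is exactly the signal the hybrid approach looks for.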

  13. Rapid monitoring in vaccination campaigns during emergencies: the post-earthquake campaign in Haiti.

    Science.gov (United States)

    Rainey, Jeanette J; Sugerman, David; Brennan, Muireann; Cadet, Jean Ronald; Ernsly, Jackson; Lacapère, François; Danovaro-Holliday, M Carolina; Mubalama, Jean-Claude; Nandy, Robin

    2013-12-01

    The earthquake that struck Haiti in January 2010 caused 1.5 million people to be displaced to temporary camps. The Haitian Ministry of Public Health and Population and global immunization partners developed a plan to deliver vaccines to those residing in these camps. A strategy was needed to determine whether the immunization targets set for the campaign were achieved. Following the vaccination campaign, staff from the Ministry of Public Health and Population interviewed convenience samples of households - in specific predetermined locations in each of the camps - regarding receipt of the emergency vaccinations. A camp was targeted for "mop-up vaccination" - i.e. repeat mass vaccination - if more than 25% of the children aged 9 months to 7 years in the sample were found not to have received the emergency vaccinations. Rapid monitoring was implemented in camps located in the Port-au-Prince metropolitan area. Camps that housed more than 5000 people were monitored first. By the end of March 2010, 72 (23%) of the 310 vaccinated camps had been monitored. Although 32 (44%) of the monitored camps were targeted for mop-up vaccination, only six of them had received such repeat mass vaccination when checked several weeks after monitoring. Rapid monitoring was only marginally beneficial in achieving immunization targets in the temporary camps in Port-au-Prince. More research is needed to evaluate the utility of conventional rapid monitoring, as well as other strategies, during post-disaster vaccination campaigns that involve mobile populations, particularly when there is little capacity to conduct repeat mass vaccination.

  14. Rapid assessment survey of earthquake affected Bhuj block of Kachchh District, Gujarat, India

    OpenAIRE

    Pawar A; Shelke S; Kakrani V

    2005-01-01

    RESEARCH QUESTIONS: How much human loss was caused by the earthquake in Bhuj block? What is the environmental sanitation status? OBJECTIVES: (1) To assess human loss and injuries after the earthquake in Bhuj block. (2) To study the status of some relief activities. (3) To study the environmental sanitation status of the earthquake-affected Bhuj block. STUDY DESIGN: Cross-sectional study. SETTINGS: Bhuj block. PARTICIPANTS: All villages excluding Bhuj city of Bhuj block. Statist...

  15. Earthquake Magnitude and Shaking Intensity Dependent Fragility Functions for Rapid Risk Assessment of Buildings

    Directory of Open Access Journals (Sweden)

    Marie-José Nollet

    2018-01-01

    Full Text Available An integrated web application, referred to as ER2 for rapid risk evaluator, is under development for user-friendly seismic risk assessment by the non-expert public safety community. The assessment of likely negative consequences is based on pre-populated databases of seismic, building inventory, and vulnerability parameters. To further accelerate computation for near real-time analyses, implicit building fragility curves were developed as functions of the magnitude and the intensity of the seismic shaking, defined with a single intensity measure, the input spectral acceleration at 1.0 s, implicitly accounting for epicentral distance and local soil conditions. Damage probabilities were compared with those obtained with standard fragility functions that explicitly consider epicentral distances and local site classes in addition to earthquake magnitudes and the respective intensity of seismic shaking. Different seismic scenarios were considered, first for 53 building classes common in Eastern Canada, and then for a proposed reduced set of 24 combined building classes. Comparison of the results indicates that damage predictions with implicit fragility functions for short (M ≤ 5.5) and medium (5.5 < M ≤ 7.5) strong-motion durations show low variation with distance and soil class, with an average error of less than 3.6%.
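Fragility curves of the kind described above are conventionally written in the lognormal form P(damage | Sa) = Φ(ln(Sa/θ)/β). A minimal sketch follows; the median capacity θ and dispersion β used here are illustrative placeholders, not ER2 parameters.

```python
import math

def fragility(sa, theta, beta):
    """Probability of reaching or exceeding a damage state given the
    spectral acceleration Sa(1.0 s), using the standard lognormal form
    P = Phi(ln(sa / theta) / beta). theta is the median capacity (in g)
    and beta the lognormal dispersion."""
    z = math.log(sa / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# At Sa equal to the median capacity, the damage probability is 50%:
p_at_median = fragility(0.3, theta=0.3, beta=0.6)
```

An "implicit" curve in the paper's sense would fold distance and site effects into θ and β per magnitude band, so that only Sa(1.0 s) is needed at run time.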

  16. ShakeMap implementation for Pyrenees in France-Spain border: regional adaptation and earthquake rapid response process.

    OpenAIRE

    Bertil, Didier; Roviró, Jordi; Antonio Jara, Jose; Susagna, Teresa; Nus, Eduard; Goula, Xavier; Colas, Bastien; Dumont, Guillaume; Cabañas, Luis; Anton, Resurección; Calvet, Marie

    2012-01-01

    International audience; The USGS ShakeMap package is used with a regional adaptation to provide automatic shake maps in rapid response to Pyrenean earthquakes. The near real-time system relies on servers designed for data exchange between the transborder organizations involved in the Sispyr project. First maps are provided as soon as possible after the shock and are updated with observed macroseismic intensities over the following hours. Regional predictive equations of Tapia (2006) and Goula et al. ...

  17. Equilibrium Decision Method for Earthquake First-Aid Medicine Allocation Based on Demand Information Updating

    Directory of Open Access Journals (Sweden)

    Yong Ye

    2017-01-01

    Full Text Available The allocation of rescue resources after an earthquake has become a popular research topic in the field of emergency management. The allocation of first-aid medicine for earthquake rescue is more time-sensitive than that of general rescue materials. This study focuses on the problem of first-aid medicine allocation in earthquake response. First, we consider the incompleteness and renewal of decision information in an emergency environment, as well as the balance between the risk of decision error and the cost of delay. Second, we propose an equilibrium decision method for the allocation of first-aid medicine in earthquake rescue based on information updating. The method attempts to realize a fair allocation to all disaster sites while minimizing total transport time loss. Third, a simulation analysis is performed in which the proposed method is applied to the first-aid medicine allocation problem in the Wenchuan earthquake response. The results show that the method can produce a good allocation plan in an earthquake rescue situation.
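The abstract does not specify the equilibrium model in detail. As a toy illustration of the fairness idea only (every disaster site receives the same fraction of its demand when supply is short), deliberately omitting the transport-time term of the paper's model:

```python
def allocate(supply, demands):
    """Toy fair allocation: if supply covers total demand, every site gets
    its full demand; otherwise each site receives the same fraction of its
    demand, so no site is favoured. This is an illustrative stand-in, not
    the paper's equilibrium decision method."""
    total = sum(demands.values())
    if total <= supply:
        return dict(demands)
    ratio = supply / total
    return {site: round(d * ratio, 2) for site, d in demands.items()}

# Shortage case: 100 units against 200 units of demand -> each site
# receives half of what it asked for.
plan = allocate(100, {"siteA": 80, "siteB": 120})
```

In the paper's setting this fairness rule would be re-run as demand information is updated, trading off the risk of allocating on stale data against the cost of waiting.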

  18. Today and Tomorrow of the Real-time Earthquake Information Equipment

    Science.gov (United States)

    Nakamura, Y.

    2003-12-01

    UrEDAS, the Urgent Earthquake Detection and Alarm System, realizes real-time early earthquake detection and alarm. Although the system mainly serves railway applications, such as the Shinkansen and subway lines, it is not limited to the railway field; for example, one local government has built a tsunami warning system using the real-time earthquake parameters (magnitude and location) distributed by UrEDAS. UrEDAS is characterized by serial processing that does not store seismic waveforms. For this reason, the data-processing procedure hardly changes between normal operation and an earthquake, so the system is unlikely to fail during an event. UrEDAS also does not require a network: it is an autonomous, distributed system robust against natural disasters and cyber-terrorism. On 26 May 2003, the Sanriku-Minami earthquake of Mj 7.0 occurred. It was so large that a maximum acceleration of about 600 Gal was observed along the Shinkansen line, and 22 columns of the rigid-frame viaducts (RC) were severely cracked. The earthquake occurred during Shinkansen operating hours. As expected, the coastline "Compact UrEDAS" issued the early P-wave alarm before the destructive earthquake motion, and the validity of the system was proved for the first time. UrEDAS deployed where many faults exist has an accuracy problem, especially in the epicentral azimuth; observations have continued under such unfavourable conditions in order to shorten the calculation time and improve accuracy. In parallel, distribution of earthquake information via the Internet has been examined. At the time of the Colima, Mexico earthquake of January 2003, UrEDAS in Mexico City detected the earthquake over one minute before the large motion and sent information to the persons concerned. 
The above systems are large
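The abstract does not describe UrEDAS's internal detection algorithm. As a generic illustration of how a P-wave onset can be detected in a serial, low-memory fashion, here is a classic STA/LTA (short-term average over long-term average) trigger, a common stand-in rather than the UrEDAS method; window lengths and the threshold are arbitrary.

```python
def sta_lta(samples, n_sta, n_lta):
    """Ratio of short-term to long-term average absolute amplitude.
    A storage-light trigger: only the last n_lta samples are ever needed,
    loosely echoing the serial, no-waveform-storage design described in
    the abstract (parameters here are arbitrary, not UrEDAS values)."""
    ratios = []
    for i in range(n_lta, len(samples) + 1):
        sta = sum(abs(x) for x in samples[i - n_sta:i]) / n_sta
        lta = sum(abs(x) for x in samples[i - n_lta:i]) / n_lta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

def first_trigger(ratios, threshold=3.0):
    """Index of the first window whose STA/LTA exceeds the threshold."""
    for i, r in enumerate(ratios):
        if r >= threshold:
            return i
    return None

# Quiet background noise followed by a sudden P-wave-like onset:
signal = [1.0] * 100 + [10.0] * 20
trigger = first_trigger(sta_lta(signal, n_sta=5, n_lta=50))
```

The short window reacts to the onset while the long window still reflects background noise, so the ratio jumps at the arrival and stays near 1 beforehand.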

  19. Earthquake ethics through scientific knowledge, historical memory and societal awareness: the experience of direct internet information.

    Science.gov (United States)

    de Rubeis, Valerio; Sbarra, Paola; Sebaste, Beppe; Tosi, Patrizia

    2013-04-01

    The experience of collecting data on earthquake effects and disseminating information to the public, carried out through the site "haisentitoilterremoto.it" ("did you feel it") managed by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), has shown constantly growing interest among Italian citizens. Started in 2007, the site has collected more than 520,000 completed intensity questionnaires, producing intensity maps of almost 6,000 earthquakes. One of the most distinctive features of this experience is its bi-directional information exchange: every person can report the observed effects of an earthquake and, at the same time, view the generated maps. Seismologists, on the other side, can see each earthquake described in real time through its effects across the whole territory. In this way people, by giving local information, receive global information from the community, mediated and interpreted through seismological knowledge. The relationship among seismologists, mass media, and civil society is thus deep and rich. The presence of almost 20,000 permanent subscribers distributed across the Italian territory, alerted in case of an earthquake, has reinforced participation: subscribers are regularly informed by the seismologists, via e-mail, about events that occur in their area, even those of very small magnitude. The "alert" service serves as a reminder that earthquakes are an ever-present phenomenon, while also showing that high-magnitude events are very rare. This kind of information is helpful as it fully complements that given by the media. We analyze the effects of our activity on society and the mass media. An awareness of seismic phenomena is present in everyone, rooted in fear and in ideas of death and destruction, often with the deep belief that occurrence is very rare. This position feeds refusal and repression. When a strong earthquake occurs, surprise immediately changes into shock and desperation. A

  20. The Canterbury Tales: Lessons from the Canterbury Earthquake Sequence to Inform Better Public Communication Models

    Science.gov (United States)

    McBride, S.; Tilley, E. N.; Johnston, D. M.; Becker, J.; Orchiston, C.

    2015-12-01

    This research evaluates the public earthquake education information produced prior to the Canterbury earthquake sequence (2010-present) and examines communication lessons to create recommendations for improving the implementation of these types of campaigns in future. The research comes from the practitioner perspective of someone who worked on these campaigns in Canterbury prior to the earthquake sequence and who was also the Public Information Manager, Second in Command, during the earthquake response in February 2011. Documents created prior to the earthquake sequence, specifically those addressing seismic risk, were analyzed for how closely they aligned with best-practice academic research, using a "best practice matrix" created by the researcher. Readability tests and word counts were also employed to assist with triangulation of the data, as was practitioner involvement. The research also outlines the lessons learned by practitioners and explores their experiences in creating these materials and how they perceive them now, given all that has happened since the inception of the booklets. The findings showed that these documents lacked many of the attributes of best practice. The overly long, jargon-filled text contained few positive outcome-expectancy messages and probably failed to persuade anyone that earthquakes were a real threat in Canterbury. Paradoxically, the booklets may well have created fatalism in the publics who read them. While the overall intention was positive (for scientists to explain earthquakes, tsunami, landslides, and other risks and thereby encourage the public to prepare), the implementation could be greatly improved. The final component of the research highlights points of improvement for more successful campaigns in future. 
The importance of preparedness and science information campaigns can be not only in preparing the population but also into development of

  1. The impact of Bam earthquake on substance users in the first 2 weeks: a rapid assessment.

    Science.gov (United States)

    Movaghar, Afarin Rahimi; Goodarzi, Reza Rad; Izadian, Elaheh; Mohammadi, Mohammad Reza; Hosseini, Mehdi; Vazirian, Mohsen

    2005-09-01

    In the final days of the year 2003, an earthquake in the city of Bam, Iran, led to the death of some 35,000 of its inhabitants. The rate of opium abuse, which had been high among the male population in this city, caused problems after the earthquake. The aim of the following study was to examine the status of substance abusers during the first 2 weeks after the earthquake. The study was carried out in the city of Bam, one of its nearby villages and eight hospitals admitting earthquake victims. One hundred and sixty-three individuals were interviewed, including substance abusers, their family members, people living in Bam, service providers, and the authorities. During the first 2 weeks after the earthquake, about half of drug-dependent interviewees suffered from withdrawal symptoms. About half reported their problems to health care providers and asked for morphine or other analgesics. Around one third had used opium on the first day and two thirds in the course of the second day to the end of the second week after the earthquake. Although smoking had been the most common means of abuse before the earthquake, oral intake has become the most prevalent route after the disaster. Almost all obtained their opium from inhabitants of other cities as gifts. Members of rescue and health delivery systems had a lot of encounters with opium abusers, especially in the first 3 days after the earthquake, and had prescribed morphine and other analgesics. In societies with a considerable prevalence of substance abuse, this issue becomes a matter of utmost health care and social importance at times of disasters, and the necessary arrangements to deal with it should be present beforehand.

  2. Emergency mapping and information management during Nepal Earthquake 2015 - Challenges and lesson learned

    Science.gov (United States)

    Joshi, G.; Gurung, D. R.

    2016-12-01

    A powerful 7.8-magnitude earthquake struck Nepal at 06:11 UTC on 25 April 2015, followed by several strong aftershocks; it was the deadliest earthquake in Nepal's recent history. In total about 9,000 people died and 22,300 were injured, and the lives of eight million people, almost one-third of the population of Nepal, were affected. The event led to a massive campaign to gather data and information on damage and loss using remote sensing, field inspection, and community surveys. Information on the distribution of relief materials is another important domain of information, necessary for equitable relief distribution. Pre- and post-earthquake high-resolution satellite images helped in damage-area assessment and mapping. Many national and international agencies became active in generating information to fill the vacuum. The challenges included data-access bottlenecks due to a lack of good IT infrastructure; inconsistent products due to the absence of standard mapping guidelines; and dissemination challenges due to the absence of Standard Operating Protocols and a single information gateway. These challenges negated the opportunities offered by improved earth-observation data availability, the increasing engagement of volunteers in emergency mapping, and centralized emergency coordination practice. This paper highlights critical practical challenges encountered during emergency mapping and information management during the earthquake in Nepal. There is a great need to address such challenges in order to make effective use of the leverage that recent advances in space science, IT, and mapping provide.

  3. Rapid assessment survey of earthquake affected Bhuj block of Kachchh District, Gujarat, India.

    Science.gov (United States)

    Pawar, A T; Shelke, S; Kakrani, V A

    2005-11-01

    How much human loss was caused by the earthquake in Bhuj block? What is the environmental sanitation status? (1) To assess human loss and injuries after the earthquake in Bhuj block. (2) To study the status of some relief activities. (3) To study the environmental sanitation status of the earthquake-affected Bhuj block. Cross-sectional study. Bhuj block. All villages excluding Bhuj city of Bhuj block. Proportions, chi-square test, chi-square for trend. The survey covered 144 villages; there were 541 deaths in total, a death rate of 3.18 per 1000 population. The death rate was significantly associated with the distance of the village from the epicenter (chi-square for trend significant, P …). Open-air defecation was practiced, and diseases such as URTIs, diarrheal diseases, fever, and conjunctivitis were commonly observed in the field area.

  4. Seeking Information after the 2010 Haiti Earthquake: A Case Study in Mass-Fatality Management

    Science.gov (United States)

    Gupta, Kailash

    2013-01-01

    The 2010 earthquake in Haiti, which killed an estimated 316,000 people, offered many lessons in mass-fatality management (MFM). The dissertation defined MFM in seeking information and in recovery, preservation, identification, and disposition of human remains. Specifically, it examined how mass fatalities were managed in Haiti, how affected…

  5. K12 Education Program Lessons Learned at the Center for Earthquake Research and Information

    Science.gov (United States)

    Patterson, G. L.; Dry, M.

    2003-12-01

    The Center for Earthquake Research and Information at the University of Memphis has been committed to increasing awareness of seismic hazard, earthquake engineering, and earth science among Mid-America's policy-makers, engineers, emergency managers, the general public, and K-12 teachers and students for nearly three decades. During that time we have learned many lessons about providing effective education and outreach programs, especially for K-12 students. The lessons learned from these activities may be particularly applicable to other regions where large earthquakes occur infrequently but have disproportionately large consequence areas due to low attenuation of seismic waves. Effective education programs in these settings must deliver a consistent message across many states to a wide variety of socio-economic groups and professional communities through the leveraged resources of various groups and agencies. It is also beneficial to hire and train staff with K-12 teaching experience to work directly with K-12 education organizations and science curriculum coordinators.

  6. Application and evaluation of a rapid response earthquake-triggered landslide model to the 25 April 2015 Mw 7.8 Gorkha earthquake, Nepal

    Science.gov (United States)

    Gallen, Sean F.; Clark, Marin K.; Godt, Jonathan W.; Roback, Kevin; Niemi, Nathan A.

    2017-09-01

    The 25 April 2015 Mw 7.8 Gorkha earthquake produced strong ground motions across an approximately 250 km by 100 km swath of central Nepal. To assist disaster response activities, we modified an existing earthquake-triggered landslide model based on a Newmark sliding-block analysis to estimate the extent and intensity of landsliding and the landslide dam hazard. Landslide hazard maps were produced using Shuttle Radar Topography Mission (SRTM) digital topography, peak ground acceleration (PGA) information from the U.S. Geological Survey (USGS) ShakeMap program, and assumptions about regional rock strength based on end-member values from previous studies. The instrumental record of seismicity in Nepal is poor, so PGA estimates were based on empirical Ground Motion Prediction Equations (GMPEs) constrained by teleseismic data and felt reports. We demonstrate a non-linear dependence of modeled landsliding on aggregate rock strength, where the number of landslides decreases exponentially with increasing rock strength. Model estimates are less sensitive to PGA on steep slopes (> 60°) than on moderate slopes (30-60°). We compare forward model results to an inventory of landslides triggered by the Gorkha earthquake. We show that moderate rock-strength inputs overestimate landsliding in regions beyond the main slip patch, which may in part be related to poorly constrained PGA estimates for this event at far distances from the source area. Directly above the main slip patch, however, the moderate-strength model accurately estimates the total number of landslides within the resolution of the model (landslides ≥ 0.0162 km²; observed n = 2214, modeled n = 2987), but the pattern of landsliding differs from observations. This discrepancy is likely due to the unaccounted-for effects of variable material strength and local topographic amplification of strong ground motion, as well as other simplifying assumptions about source characteristics and their relationship to
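Newmark sliding-block screening of the kind described above starts from the critical acceleration a_c = (FS − 1)·g·sin α, where FS is the static factor of safety and α the slope angle; map cells whose predicted PGA exceeds a_c are flagged as potentially unstable. A minimal sketch of that screening step (illustrative, not the authors' code, which also estimates displacement and dam hazard):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def critical_acceleration(fs, slope_deg):
    """Newmark critical acceleration a_c = (FS - 1) * g * sin(alpha):
    the ground acceleration at which a block on a slope begins to slide.
    fs is the static factor of safety, slope_deg the slope angle."""
    return (fs - 1.0) * G * math.sin(math.radians(slope_deg))

def may_fail(pga, fs, slope_deg):
    """Flag a map cell as potentially unstable when the predicted PGA
    (in m/s^2) exceeds the critical acceleration."""
    return pga > critical_acceleration(fs, slope_deg)
```

The non-linear dependence on rock strength reported in the abstract enters through FS: stronger rock raises FS, which raises a_c and rapidly shrinks the set of cells that any given PGA can destabilize.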

  7. Medical Information & Technology: Rapidly Expanding Vast Horizons

    Science.gov (United States)

    Sahni, Anil K.

    2012-12-01

    During the "Medical Council of India" Platinum Jubilee Year (1933-2008) celebrations, in 2008, several scientific meetings, seminars, and symposia on various topics of contemporary importance and relevance in the field of medical education and ethics were organized by different medical colleges at local, state, and national levels. The present discussion is a comprehensive summary of the various aspects of "Medical Information & Communication Technology", especially useful for the audience stratum of amateur medical and paramedical staff with no previous working knowledge of computer applications. It outlines: (i) administrative applications: medical records, etc.; (ii) clinical applications: the prospective scope of telemedicine applicability, etc.; (iii) other applications: efforts to improve medical education, medical presentations, and medical research, etc. Medical transcription and related recent fields of study (e.g. modern pharmaceuticals, bio-engineering, bio-mechanics, bio-technology), along with important aspects of computers in general and computer ergonomics, are assembled to summarize awareness of the basic fundamentals of medical computronics and its practical utilities.

  8. Using JavaScript and the FDSN web service to create an interactive earthquake information system

    Science.gov (United States)

    Fischer, Kasper D.

    2015-04-01

    The FDSN web service provides a web interface for accessing earthquake metadata (e.g. event or station information) and waveform data over the internet. Requests are sent to a server as URLs, and the output is either XML or miniSEED. This makes it hard for humans to read but easy to process with software. Several data centers already support the FDSN web service, e.g. USGS, IRIS, and ORFEUS, and it is also part of the SeisComP3 (http://www.seiscomp3.org) software. The Seismological Observatory of the Ruhr-University switched to SeisComP3 as its standard software for the analysis of mining-induced earthquakes at the beginning of 2014. This made it necessary to create a new web-based earthquake information service for publishing results to the general public. This has been done by processing the output of an FDSN web service query with JavaScript running in a standard browser. The result is an interactive map presenting the observed events, together with further event and station information, on a single web page as a table and on a map. In addition, the user can download event information, waveform data, and station data in different formats such as miniSEED, QuakeML, or FDSN StationXML. The developed code and all libraries used are open source and freely available.
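As a sketch of the kind of processing such a page performs (the service described above is implemented in JavaScript; Python is used here for brevity), an fdsnws-event query URL can be assembled and the `format=text` response, a `#`-prefixed header line followed by pipe-separated rows, parsed without any XML handling. The sample values in the usage below are hypothetical.

```python
from urllib.parse import urlencode

def build_event_query(base_url, **params):
    """Assemble an fdsnws-event query URL. 'format=text' requests the
    pipe-separated text output, which is easy to parse without XML."""
    params.setdefault("format", "text")
    return base_url.rstrip("/") + "/fdsnws/event/1/query?" + urlencode(params)

def parse_text_events(text):
    """Parse the FDSN 'text' event format: a '#'-prefixed header line
    followed by rows of EventID|Time|Latitude|Longitude|Depth/km|Author|
    Catalog|Contributor|ContributorID|MagType|Magnitude|MagAuthor|
    EventLocationName."""
    events = []
    for line in text.strip().splitlines():
        if not line.strip() or line.startswith("#"):
            continue
        f = line.split("|")
        events.append({
            "id": f[0], "time": f[1],
            "lat": float(f[2]), "lon": float(f[3]),
            "depth_km": float(f[4]), "mag": float(f[10]),
            "region": f[12],
        })
    return events

# Hypothetical query against the IRIS endpoint:
url = build_event_query("https://service.iris.edu", minmagnitude=5)
```

A browser client would fetch `url` and hand the parsed event list to the map and table widgets; the same two steps also work server-side.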

  9. APhoRISM FP7 project: the A Priori information for Earthquake damage mapping method

    Science.gov (United States)

    Bignami, Christian; Stramondo, Salvatore; Pierdicca, Nazzareno

    2014-05-01

    The APhoRISM (Advanced PRocedure for volcanIc and Seismic Monitoring) project is an FP7-funded project that aims at developing and testing two new methods for combining Earth Observation satellite data from different sensors with ground data for seismic and volcanic risk management. The objective is to demonstrate that these two types of data, appropriately managed and integrated, can provide new, improved products useful for seismic and volcanic crisis management. One of the two methods deals with earthquakes and concerns the generation of maps for detecting and estimating the damage caused by an earthquake. The method is named APE: A Priori information for Earthquake damage mapping. The use of satellite data to investigate earthquake damage is not in itself new; a wide literature and many projects have addressed the issue, but the proposed approaches are usually based only on change-detection techniques and/or classification algorithms. The novelty of APhoRISM-APE lies in the exploitation of a priori information derived from: InSAR time series measuring surface movements; shakemaps obtained from seismological data; and vulnerability information. This a priori information is then integrated with a change-detection map from earth observation satellite sensors (either optical or Synthetic Aperture Radar) to improve accuracy and limit false alarms.

  10. A Notation for Rapid Specification of Information Visualization

    Science.gov (United States)

    Lee, Sang Yun

    2013-01-01

    This thesis describes a notation for rapid specification of information visualization, which can be used as a theoretical framework of integrating various types of information visualization, and its applications at a conceptual level. The notation is devised to codify the major characteristics of data/visual structures in conventionally-used…

  11. Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California

    Science.gov (United States)

    Degroot, R. M.; Springer, K.; Brooks, C. J.; Schuman, L.; Dalton, D.; Benthien, M. L.

    2009-12-01

    In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at the Fingerprints Youth Museum in Hemet. These projects, and involvement with the San Bernardino County Museum in Redlands beginning in 2007, led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for developing the network was to broaden participation in The Great Southern California ShakeOut. By 2009 it had grown more comprehensive in scope, evolving into a statewide network. EPIcenters constitute a variety of free-choice learning institutions: museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities, from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year round. Network members at Kidspace Museum in Pasadena and the San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and on safety and security. 
This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this

  12. Mitigation of Bias in Inversion of Complex Earthquake without Prior Information of Detailed Fault Geometry

    Science.gov (United States)

    Kasahara, A.; Yagi, Y.

    2014-12-01

    The rupture process of an earthquake derived from geophysical observations is important for understanding the nature of earthquakes and assessing seismic hazard. Finite fault inversion is a commonly applied method for constructing a seismic source model. In conventional inversions, the fault is approximated by a simple surface even when the rupture of the real earthquake propagates along a non-planar, complex fault, so complex rupture kinematics are approximated by limited model parameters that represent only slip on a simple fault surface. This oversimplification may produce biased and hence misleading solutions. The MW 7.7 left-lateral strike-slip earthquake that occurred in southwestern Pakistan on 2013-09-24 may be an exemplary event for demonstrating this bias. For this earthquake, northeastward rupture propagation was suggested by a finite fault inversion of teleseismic body and long-period surface waves with a single planar fault (USGS). However, the surface displacement field measured from cross-correlation of optical satellite images, together with back-projection imaging, revealed that the rupture propagated unilaterally toward the southwest on a non-planar fault (Avouac et al., 2014). To mitigate the bias, a more flexible source parameterization should be employed. We extended the multi-time-window finite fault method to represent rupture kinematics on a complex fault. Each spatio-temporal knot has five degrees of freedom and is able to represent arbitrary strike, dip, rake, moment release rate, and CLVD component. Detailed fault geometry for the source fault is not required in our method. The method considers a data covariance matrix with uncertainty of the Green's function (Yagi and Fukahata, 2011) to obtain a stable solution. Preliminary results show southwestward rupture propagation and a focal mechanism change that is consistent with the fault trace. The result suggests the usefulness of flexible source parameterization for the inversion of complex events.

  13. Regulatory and Permitting Information Desktop (RAPID) Toolkit (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Young, K. R.; Levine, A.

    2014-09-01

    The Regulatory and Permitting Information Desktop (RAPID) Toolkit combines the former Geothermal Regulatory Roadmap, National Environmental Policy Act (NEPA) Database, and other resources into a Web-based tool that gives the regulatory and utility-scale geothermal developer communities rapid and easy access to permitting information. RAPID currently comprises five tools - Permitting Atlas, Regulatory Roadmap, Resource Library, NEPA Database, and Best Practices. A beta release of an additional tool, the Permitting Wizard, is scheduled for late 2014. Because of the huge amount of information involved, RAPID was developed in a wiki platform to allow industry and regulatory agencies to maintain the content in the future so that it continues to provide relevant and accurate information to users. In 2014, the content was expanded to include regulatory requirements for utility-scale solar and bulk transmission development projects. Going forward, development of the RAPID Toolkit will focus on expanding the capabilities of current tools, developing additional tools, including additional technologies, and continuing to increase stakeholder involvement.

  14. Rapid automated W-phase slip inversion for the Illapel great earthquake (2015, Mw = 8.3)

    Science.gov (United States)

    Benavente, Roberto; Cummins, Phil R.; Dettmer, Jan

    2016-03-01

    We perform rapid W-phase finite fault inversion for the 2015 Illapel great earthquake (Mw = 8.3). To evaluate the performance of the inversion in a near real time context, we divide seismic stations into four groups. The groups consider stations up to epicentral distances of 30°, 50°, 75°, and 90°, respectively. The results for the first group could have been available within 25 min after the origin time and the results for the last group within 1 h. The four results consistently show a peak slip of ~10 m near the trench with trench-perpendicular rake, which is consistent with the tsunami genesis of the event. The slip location is similar to that in the preliminary U.S. Geological Survey solution. The inversion is automated and provides meaningful results within 25 min after the event. This makes the method particularly suited to emergency management and early warning at regional and teletsunami distances.
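
    The distance-based station grouping described above can be sketched as follows; the station names and the spherical-law-of-cosines distance are illustrative choices, not the authors' implementation:

```python
import math

def epicentral_distance_deg(lat1, lon1, lat2, lon2):
    """Great-circle distance in degrees (spherical law of cosines)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    cosd = (math.sin(p1) * math.sin(p2)
            + math.cos(p1) * math.cos(p2) * math.cos(dlon))
    return math.degrees(math.acos(max(-1.0, min(1.0, cosd))))

def station_groups(epicenter, stations, cutoffs=(30.0, 50.0, 75.0, 90.0)):
    """Cumulative groups of stations within each epicentral-distance cutoff."""
    dist = {name: epicentral_distance_deg(*epicenter, *loc)
            for name, loc in stations.items()}
    return [[name for name, d in dist.items() if d <= c] for c in cutoffs]
```

    Each successive group is a superset of the previous one, mirroring the trade-off in the abstract between waiting for more distant data and issuing a faster solution.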

  15. Progress Towards Near-Realtime Seismic Moment Tensors at the Alaska Earthquake Information Center

    Science.gov (United States)

    Ratchkovski, N.; Hansen, R.

    2004-12-01

    A near-realtime seismic moment tensor inversion routine has been operational at the Alaska Earthquake Information Center (AEIC) in a test mode for over a year. The AEIC real-time earthquake detection system, based on the Antelope software package, triggers the automatic moment tensor inversion routine, which is based on a software package developed at the Berkeley Seismological Laboratory and performs a time-domain inversion of three-component seismic data for the seismic moment tensor. We use a library of precalculated Green's functions for a suite of regional velocity models and a range of source depths (from 5 to 200 km at 5 km intervals) to compute synthetic seismograms. The resulting moment tensor inversion information is distributed via the web. The Alaska seismic network in its current configuration includes 45 broadband sites. Stable inversion results can be obtained for events of magnitude 4.0 and greater in the network core area (southern and central Alaska) and 4.5 and greater in the rest of the state, including the Aleutian Islands. We will present a catalog of nearly 200 regional moment tensor solutions for Alaska and the Aleutian Islands from October 2002 through the present, including the 2002 Denali Fault earthquake sequence.
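
    A minimal sketch of the kind of grid search such a routine performs: pair a library of precomputed Green's functions (random stand-ins here; a real library holds synthetics per velocity model and depth) with a time-domain least-squares inversion, and select the trial depth by variance reduction. The numbers below are synthetic assumptions, not AEIC data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 600                      # concatenated 3-component waveform samples
depths = range(5, 205, 5)            # trial source depths, km (5-200, step 5)
m_true = np.array([1.2, -0.7, -0.5, 0.9, 0.0, 0.3])  # 6 MT elements
true_depth = 45

# Green's function library: one matrix per trial depth (random stand-ins).
library = {z: rng.standard_normal((n_samples, 6)) for z in depths}
d = library[true_depth] @ m_true + 0.01 * rng.standard_normal(n_samples)

best = None
for z, G in library.items():
    m, *_ = np.linalg.lstsq(G, d, rcond=None)      # time-domain least squares
    vr = 1.0 - np.sum((d - G @ m) ** 2) / np.sum(d ** 2)  # variance reduction
    if best is None or vr > best[2]:
        best = (z, m, vr)

depth_est, m_est, vr_best = best
```

    The depth whose synthetics best explain the data (highest variance reduction) is reported along with the corresponding moment tensor.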

  16. Rapid uncertainty estimation in finite fault inversion: Case study for the 2015, Mw 8.3 Illapel earthquake

    Science.gov (United States)

    Cummins, P. R.; Benavente, R. F.; Dettmer, J.; Williamson, A.

    2016-12-01

    Rapid estimation of the slip distribution for large earthquakes can be useful for the early phases of emergency response, in rapid impact assessment and tsunami early warning. Model parameter uncertainties can be crucial for meaningful interpretation of such slip models, but they are often ignored. However, estimation of uncertainty in linear finite fault inversion is difficult because of the positivity constraints that are almost always applied. We have shown in previous work that positivity can be realized by imposing a prior such that the logarithms of the subfault scalar moments are smoothly distributed on the fault surface; each scalar moment is then intrinsically non-negative while the posterior PDF can still be approximated as Gaussian. The inversion is nonlinear, but we showed that the most probable solution can be found by iterative methods that are not computationally demanding. In addition, the posterior covariance matrix (which provides uncertainties) can be estimated from the most probable solution, using an analytic expression for the Hessian of the cost function. We have studied this approach previously for synthetic W-phase data and showed that a first-order estimate of the uncertainty in the slip model can be obtained. Here we apply this method to seismic W-phase data recorded following the 2015, Mw 8.3 Illapel earthquake. Our results show a slip distribution with maximum slip near the subduction zone trench axis, with uncertainties that scale roughly with the slip value. We also consider application of this method to multiple data types: seismic W-phase, geodetic, and tsunami.
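
    Under the assumptions in this abstract (non-negativity via a log parameterization, a Gaussian-approximated posterior, uncertainties from the Hessian at the most probable solution), the approach can be sketched on a toy linear problem. The operators, sizes, and tuning values below are illustrative stand-ins, not the authors' actual W-phase setup:

```python
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_sub = 60, 6
G = rng.standard_normal((n_obs, n_sub))            # stand-in Green's functions
m_true = np.array([0.5, 1.0, 2.0, 2.0, 1.0, 0.5])  # positive subfault moments
d = G @ m_true + 0.01 * rng.standard_normal(n_obs)

# Smoothness prior on log-moments via a second-difference operator.
L = np.diff(np.eye(n_sub), n=2, axis=0)
lam = 1e-3

def cost(theta):
    r = d - G @ np.exp(theta)
    return 0.5 * r @ r + 0.5 * lam * np.sum((L @ theta) ** 2)

theta = np.zeros(n_sub)                # start at unit moments
for _ in range(50):                    # damped Gauss-Newton iterations
    m = np.exp(theta)
    r = d - G @ m
    J = G * m                          # Jacobian of G exp(theta) wrt theta
    H = J.T @ J + lam * L.T @ L        # Gauss-Newton Hessian of the cost
    g = J.T @ r - lam * L.T @ (L @ theta)
    step = np.linalg.solve(H, g)
    t = 1.0
    while cost(theta + t * step) > cost(theta) and t > 1e-6:
        t *= 0.5                       # backtrack if the full step overshoots
    theta = theta + t * step

m_map = np.exp(theta)                  # most probable (positive) moments
J = G * m_map
H_map = J.T @ J + lam * L.T @ L
cov_theta = np.linalg.inv(H_map)       # Laplace (Gaussian) posterior covariance
moment_sigma = m_map * np.sqrt(np.diag(cov_theta))  # delta method on exp
```

    The final two lines reproduce the key idea: the covariance comes analytically from the Hessian at the most probable solution, so no sampling is needed, and mapping the log-space uncertainty back through the exponential naturally yields uncertainties that grow with the slip value.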

  17. Rapid establishment of an internally displaced persons disease surveillance system after an earthquake --- Haiti, 2010.

    Science.gov (United States)

    2010-08-06

    On January 12, 2010, a 7.0-magnitude earthquake in Haiti disrupted infrastructure and displaced approximately 2 million persons, causing increased risk for communicable diseases from overcrowding and poor living conditions. Hundreds of nongovernmental organizations (NGOs) established health-care clinics in camps of internally displaced persons (IDPs). To monitor conditions of outbreak potential identified at NGO camp clinics, on February 18, the Haiti Ministry of Public Health and Population (MSPP), the Pan-American Health Organization (PAHO), and CDC implemented the IDP Surveillance System (IDPSS). The Inter-Agency Standing Committee (IASC) "cluster approach" was used to coordinate the Haiti humanitarian response. One of 11 clusters, the Global Health Cluster (GHC), builds global capacity, whereas the country-level cluster (in this case, the Haitian Health Cluster [HHC], led by PAHO) responds locally. During the Haiti response, HHC engaged NGOs serving large camps, established IDPSS, followed trends of reportable conditions, undertook epidemiologic and laboratory investigations, and fostered implementation of control measures. This report describes the design and implementation of IDPSS in the post-earthquake period. The primary challenges to implementing IDPSS were communication difficulties with an ever-changing group of NGO partners and limitations to the utility of IDPSS data because of lack of reliable camp population denominator estimates. The IDPSS experience reinforces the need to improve local communication and coordination strategies. Improving future humanitarian response requires advance development and distribution of easily adaptable standard surveillance tools, development of an interdisciplinary strategy for an early and reliable population census, and development of communication strategies using locally available Internet and cellular networks.

  18. Real time earthquake information and tsunami estimation system for Indonesia, Philippines and Central-South American regions

    Science.gov (United States)

    Pulido Hernandez, N. E.; Inazu, D.; Saito, T.; Senda, J.; Fukuyama, E.; Kumagai, H.

    2015-12-01

    Southeast Asia and the Central-South American regions are among the most seismically active regions in the world. To contribute to the understanding of the source processes of earthquakes, the National Research Institute for Earth Science and Disaster Prevention (NIED) has maintained the International Seismic Network (ISN) since 2007. Continuous seismic waveforms from 294 broadband seismic stations in Indonesia, the Philippines, and the Central-South American regions are received in real time at NIED and used for automatic location of seismic events. Using these data, we perform automatic and manual estimation of moment tensors of seismic events (Mw>4.5) using the SWIFT program developed at NIED. We simulate the propagation of local tsunamis in these regions using a tsunami simulation code and visualization system developed at NIED, combined with CMT parameters estimated by SWIFT. The goals of the system are to provide rapid and reliable earthquake and tsunami information, in particular for large seismic events, and to produce a database of earthquake source parameters and tsunami simulations for research. The system uses the hypocenter location and magnitude of earthquakes automatically determined at NIED by the SeisComP3 system (GFZ) from the continuous seismic waveforms in the region to perform the automated calculation of moment tensors by SWIFT, and then carries out the automatic simulation and visualization of tsunami. The system generates maps of maximum tsunami heights within the target regions and along the coasts and displays them with the fault model parameters used for the tsunami simulations. Tsunami calculations are performed for all events with available automatic SWIFT/CMT solutions, and are re-computed using SWIFT manual solutions for events with Mw>5.5 and centroid depths shallower than 100 km. Revised maximum tsunami heights as well as animations of tsunami propagation are also calculated and displayed for the two double-couple solutions by SWIFT.
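
    The re-computation rule described here can be sketched as a small dispatch function; the Mw > 5.5 and depth < 100 km thresholds come from the abstract, while the function and action names are hypothetical:

```python
def tsunami_action(mw, centroid_depth_km, solution="automatic"):
    """Dispatch sketch: automatic CMT solutions always trigger a tsunami
    run; manual SWIFT solutions trigger re-computation (with animation)
    only for shallow, larger events."""
    if solution == "automatic":
        return "simulate"
    if mw > 5.5 and centroid_depth_km < 100.0:
        return "recompute_and_animate"
    return "skip"
```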

  19. Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Levine, Aaron L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-19

    Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit presentation from the WPTO FY14-FY16 Peer Review. The toolkit is aimed at regulatory agencies, consultants, project developers, the public, and any other party interested in learning more about the hydropower regulatory process.

  20. Developing Collective Learning Extension for Rapidly Evolving Information System Courses

    Science.gov (United States)

    Agarwal, Nitin; Ahmed, Faysal

    2017-01-01

    Due to rapidly evolving Information System (IS) technologies, instructors find themselves stuck in a constant game of catching up. At the same time, students find their skills obsolete almost as soon as they graduate. As part of IS curriculum and education, we need to place more emphasis on teaching students "how to learn" while keeping…

  1. A smartphone application for earthquakes that matter!

    Science.gov (United States)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever users are located, they can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? What are the earthquakes that really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses delete the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications are related to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter the most for the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging, and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre), while potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment), developed and operated at EMSC, which rapidly assesses earthquake impact by comparing the population exposed to each expected

  2. Multispectral, hyperspectral, and LiDAR remote sensing and geographic information fusion for improved earthquake response

    Science.gov (United States)

    Kruse, F. A.; Kim, A. M.; Runyon, S. C.; Carlisle, Sarah C.; Clasen, C. C.; Esterline, C. H.; Jalobeanu, A.; Metcalf, J. P.; Basgall, P. L.; Trask, D. M.; Olsen, R. C.

    2014-06-01

    The Naval Postgraduate School (NPS) Remote Sensing Center (RSC) and research partners have completed a remote sensing pilot project in support of California post-earthquake-event emergency response. The project goals were to dovetail emergency management requirements with remote sensing capabilities to develop prototype map products for improved earthquake response. NPS coordinated with emergency management services and first responders to compile information about essential elements of information (EEI) requirements. A wide variety of remote sensing datasets, including multispectral imagery (MSI), hyperspectral imagery (HSI), and LiDAR, were assembled by NPS for the purpose of building imagery baseline data and to demonstrate the use of remote sensing to derive ground surface information for use in planning, conducting, and monitoring post-earthquake emergency response. Worldview-2 data were converted to reflectance, orthorectified, and mosaicked for most of Monterey County, CA. Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data acquired at two spatial resolutions were atmospherically corrected and analyzed in conjunction with the MSI data. LiDAR data at point densities from 1.4 pts/m2 to over 40 pts/m2 were analyzed to determine digital surface models. The multimodal data were then used to develop change detection approaches and products and other supporting information. Analysis results from these data, along with other geographic information, were used to identify and generate multi-tiered products tied to the level of post-event communications infrastructure (internet access + cell, cell only, no internet/cell). Technology transfer of these capabilities to local and state emergency response organizations gives emergency responders new tools in support of post-disaster operational scenarios.

  3. 88 hours: The U.S. Geological Survey National Earthquake Information Center response to the 11 March 2011 Mw 9.0 Tohoku earthquake

    Science.gov (United States)

    Hayes, G.P.; Earle, P.S.; Benz, H.M.; Wald, D.J.; Briggs, R.W.

    2011-01-01

    This article presents a timeline of NEIC response to a major global earthquake for the first time in a formal journal publication. We outline the key observations of the earthquake made by the NEIC and its partner agencies, discuss how these analyses evolved, and outline when and how this information was released to the public and to other internal and external parties. Our goal in the presentation of this material is to provide a detailed explanation of the issues faced in the response to a rare, giant earthquake. We envisage that the timeline format of this presentation can highlight technical and procedural successes and shortcomings, which may in turn help prompt research by our academic partners and further improvements to our future response efforts. We have shown how NEIC response efforts have significantly improved over the past six years since the great 2004 Sumatra-Andaman earthquake. We are optimistic that the research spawned from this disaster, and the unparalleled dense and diverse data sets that have been recorded, can lead to similar-and necessary-improvements in the future.

  4. Earthquake impact scale

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    also be both specific (although allowably uncertain) and actionable. In this analysis, an attempt is made at both simple and intuitive color-coded alerting criteria; yet the necessary uncertainty measures by which one can gauge the likelihood for the alert to be over- or underestimated are preserved. The essence of the proposed impact scale and alerting is that actionable loss information is now available in the immediate aftermath of significant earthquakes worldwide on the basis of quantifiable loss estimates. Utilizing EIS, PAGER's rapid loss estimates can adequately recommend alert levels and suggest appropriate response protocols, despite the uncertainties; demanding or awaiting observations or loss estimates with a high level of accuracy may increase the losses. © 2011 American Society of Civil Engineers.

  5. Integrating landslide and liquefaction hazard and loss estimates with existing USGS real-time earthquake information products

    Science.gov (United States)

    Allstadt, Kate E.; Thompson, Eric M.; Hearne, Mike; Nowicki Jessee, M. Anna; Zhu, J.; Wald, David J.; Tanyas, Hakan

    2017-01-01

    The U.S. Geological Survey (USGS) has made significant progress toward the rapid estimation of shaking and shaking-related losses through their Did You Feel It? (DYFI), ShakeMap, ShakeCast, and PAGER products. However, quantitative estimates of the extent and severity of secondary hazards (e.g., landsliding, liquefaction) are not currently included in scenarios and real-time post-earthquake products despite their significant contributions to hazard and losses for many events worldwide. We are currently running parallel global statistical models for landslides and liquefaction developed with our collaborators in testing mode, but much work remains in order to operationalize these systems. We are expanding our efforts in this area by not only improving the existing statistical models, but also by (1) exploring more sophisticated, physics-based models where feasible; (2) incorporating uncertainties; and (3) identifying and undertaking research and product development to provide useful landslide and liquefaction estimates and their uncertainties. Although our existing models use standard predictor variables that are accessible globally or regionally, including peak ground motions, topographic slope, and distance to water bodies, we continue to explore readily available proxies for rock and soil strength as well as other susceptibility terms. This work is based on the foundation of an expanding, openly available, case-history database we are compiling along with historical ShakeMaps for each event. The expected outcome of our efforts is a robust set of real-time secondary hazards products that meet the needs of a wide variety of earthquake information users. We describe the available datasets and models, developments currently underway, and anticipated products.
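
    Statistical models of this kind typically combine ground-motion and susceptibility predictors in a logistic regression; the sketch below is a hedged illustration of the general form only, with entirely hypothetical coefficients (the published USGS models use their own fitted predictors and values):

```python
import math

def landslide_probability(pgv_cm_s, slope_deg, dist_to_river_km,
                          coeffs=(-6.0, 1.5, 0.08, -0.02)):
    """Illustrative logistic ground-failure model combining peak ground
    velocity, topographic slope, and distance to water. The coefficients
    are hypothetical placeholders, NOT published model values."""
    b0, b_pgv, b_slope, b_dist = coeffs
    z = (b0 + b_pgv * math.log(pgv_cm_s)
         + b_slope * slope_deg + b_dist * dist_to_river_km)
    return 1.0 / (1.0 + math.exp(-z))
```

    Evaluated on a ShakeMap grid cell by cell, such a model yields the spatial probability maps that would accompany the real-time products described above.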

  6. A rapid review of consumer health information needs and preferences.

    Science.gov (United States)

    Ramsey, Imogen; Corsini, Nadia; Peters, Micah D J; Eckert, Marion

    2017-09-01

    This rapid review summarizes best available evidence on consumers' needs and preferences for information about healthcare, with a focus on the Australian context. Three questions are addressed: 1) Where do consumers find and what platform do they use to access information about healthcare? 2) How do consumers use the healthcare information that they find? 3) About which topics or subjects do consumers need healthcare information? A hierarchical approach was adopted with evidence first sought from reviews then high quality studies using Medline (via PubMed), CINAHL, Embase, the JBI Database of Systematic Reviews and Implementation Reports, the Campbell Collaboration Library of Systematic Reviews, EPPI-Centre, and Epistemonikos. Twenty-eight articles were included; four systematic reviews, three literature reviews, thirteen quantitative studies, six qualitative studies, and two mixed methods studies. Consumers seek health information at varying times along the healthcare journey and through various modes of delivery. Complacency with historical health information modes is no longer appropriate and flexibility is essential to suit growing consumer demands. Health information should be readily available in different formats and not exclusive to any single medium. Copyright © 2017. Published by Elsevier B.V.

  7. Use of information and communication technologies in the formal and informal health system responses to the 2015 Nepal earthquakes.

    Science.gov (United States)

    Crane, Olivia; Balen, Julie; Devkota, Bhimsen; Ghimire, Sudha; Rushton, Simon

    2017-11-01

    Information and Communication Technologies (ICTs) are increasingly recognized for their potential contributions to health service delivery in Low-and Middle-Income Countries (LMICs). As well as playing a role in improving the provision of health services under everyday 'normal' circumstances, ICTs can also be important in preparing for, mitigating, responding to and recovering from disasters. This research explores the use of ICTs in a natural disaster situation in Nepal, a country affected by a series of strong earthquakes in 2015. In March and April 2016, in-depth semi-structured interviews (n = 24) and focus group discussions (n = 4) were conducted with key informants: those affected by the earthquake, and those forming part of the formal or informal health system responses. Data were collected and analysed across three levels, from the bottom 'upwards', namely: (1) village level; (2) district level and (3) central/national level. Perceptions of the role and value of ICTs varied greatly-as did patterns of use. While access and capability were found to be key barriers to use rurally, ICTs were nevertheless an important part of the informal response, helping people to gather information, express needs and cope emotionally. They also helped relief agencies in allowing for networking and coordination among actors. Use of ICTs in the formal health system response, however, was severely lacking in many areas, relying more on traditional methods of disaster management. This reflects a general deficiency in, and underuse of, ICTs in the pre-earthquake Nepali healthcare system. We conclude by calling for a redoubling of efforts to improve and increase the adoption, diffusion, integration and regular use of ICTs within the Nepali health system-an approach that will assist with day-to-day service delivery but also provide a crucial platform upon which to build during future crises. © The Author 2017. Published by Oxford University Press in association with The

  8. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  9. Incorporating Real-time Earthquake Information into Large Enrollment Natural Disaster Course Learning

    Science.gov (United States)

    Furlong, K. P.; Benz, H.; Hayes, G. P.; Villasenor, A.

    2010-12-01

    Although most would agree that the occurrence of natural disaster events such as earthquakes, volcanic eruptions, and floods can provide effective learning opportunities for natural hazards-based courses, implementing compelling materials into the large-enrollment classroom environment can be difficult. These natural hazard events derive much of their learning potential from their real-time nature, and in the modern 24/7 news-cycle where all but the most devastating events are quickly out of the public eye, the shelf life for an event is quite limited. To maximize the learning potential of these events requires that both authoritative information be available and course materials be generated as the event unfolds. Although many events such as hurricanes, flooding, and volcanic eruptions provide some precursory warnings, and thus one can prepare background materials to place the main event into context, earthquakes present a particularly confounding situation of providing no warning, but where context is critical to student learning. Attempting to implement real-time materials into large enrollment classes faces the additional hindrance of limited internet access (for students) in most lecture classrooms. In Earth 101 Natural Disasters: Hollywood vs Reality, taught as a large enrollment (150+ students) general education course at Penn State, we are collaborating with the USGS’s National Earthquake Information Center (NEIC) to develop efficient means to incorporate their real-time products into learning activities in the lecture hall environment. Over time (and numerous events) we have developed a template for presenting USGS-produced real-time information in lecture mode. The event-specific materials can be quickly incorporated and updated, along with key contextual materials, to provide students with up-to-the-minute current information. In addition, we have also developed in-class activities, such as student determination of population exposure to severe ground

  10. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J; Cowley, Wendy E; Crow, Vernon L; Cramer, Nicholas O [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
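
    The scoring pipeline described here (candidate phrases delimited by stop words and punctuation, word scores from co-occurrence degree and frequency, phrase scores as sums of word scores) can be sketched as a minimal RAKE implementation; the stop-word list is a tiny illustrative subset, not the one used in the patented system:

```python
import re
from collections import defaultdict

# Tiny illustrative stop-word list (a real system uses a full list).
STOPWORDS = {"a", "an", "the", "of", "and", "or", "in", "on", "to",
             "is", "are", "be", "for", "by", "with", "as", "at"}

def rake(text, stopwords=STOPWORDS):
    """Minimal RAKE: candidate phrases are maximal runs of non-stop
    words between stop words or punctuation; each word is scored by
    degree/frequency and each phrase by the sum of its word scores."""
    phrases = []
    for segment in re.split(r"[^\w\s']+", text.lower()):  # punctuation breaks
        current = []
        for word in re.findall(r"[a-z']+", segment):
            if word in stopwords:                          # stop-word breaks
                if current:
                    phrases.append(tuple(current))
                current = []
            else:
                current.append(word)
        if current:
            phrases.append(tuple(current))
    freq, degree = defaultdict(int), defaultdict(int)
    for phrase in phrases:
        for word in phrase:
            freq[word] += 1
            degree[word] += len(phrase)   # co-occurrence degree
    scores = {p: sum(degree[w] / freq[w] for w in p) for p in phrases}
    return sorted(scores, key=scores.get, reverse=True)  # best first
```

    Longer phrases built from frequently co-occurring words score highest, which is why multi-word technical terms tend to surface at the top of the ranking.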

  11. An exploratory study for rapid estimation of critical source parameters of great subduction-zone earthquakes in Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Singh, S. K; Perez-Campos, X, Iglesias, A; Pacheco, J. F [Instituto de Geofisica, Universidad Nacional Autonoma de Mexico, Mexico D.F. (Mexico)

    2008-10-15

    The rapid and reliable estimation of moment magnitude Mw, location and size of rupture area, and radiated energy Es of great Mexican subduction zone earthquakes is critical for a quick assessment of the tsunami and/or damage potential of the event and for issuing an early tsunami alert. To accomplish this goal, the Mexican broadband seismic network needs to be supplemented by permanent GPS stations along the Pacific coast, spaced about 65 km apart or less. The data from the GPS network must be transmitted to a central location and processed in near-real time to track the positions of the stations. Assuming that this can be implemented, we develop a procedure for near-real-time estimation of the critical source parameters. We demonstrate the viability of the procedure by processing near-source GPS data and regional seismograms for the 1995 Colima-Jalisco (Mw = 8.0) and 2004 Sumatra-Andaman (Mw = 9.0-9.3) earthquakes. The procedure yields estimates of Mw and Es in excellent agreement with those reported from earlier solutions. In the case of the Colima-Jalisco earthquake, the estimated location and size of the rupture area agree with those mapped from aftershock locations. Presently, there are 13 permanent GPS stations along the Pacific coast of Mexico with an average spacing of ~200 km, which operate in an autonomous mode. It is urgent to increase the number of stations to ≥28, thus decreasing the station spacing to ≤65 km. Data must be transmitted in near-real time to a central station to track the positions of the stations, preferably every second.
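
    Once GPS tracking constrains the average slip and rupture area, the moment magnitude follows from the standard definitions (M0 = mu * A * D and the Hanks & Kanamori relation Mw = (2/3)(log10 M0 - 9.1)); a minimal sketch, with a conventional crustal rigidity as the only assumed value:

```python
import math

def moment_magnitude(avg_slip_m, rupture_area_km2, rigidity_pa=3.0e10):
    """Mw from the seismic moment M0 = mu * A * D, with
    Mw = (2/3) * (log10(M0) - 9.1) and M0 in N*m."""
    m0 = rigidity_pa * (rupture_area_km2 * 1.0e6) * avg_slip_m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)
```

    For example, 3 m of average slip over a 170 km x 80 km rupture gives Mw close to 8, roughly the size of the 1995 Colima-Jalisco event.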

  12. Incorporating indel information into phylogeny estimation for rapidly emerging pathogens

    Directory of Open Access Journals (Sweden)

    Suchard Marc A

    2007-03-01

    Background: Phylogenies of rapidly evolving pathogens can be difficult to resolve because of the small number of substitutions that accumulate in the short times since divergence. To improve resolution of such phylogenies we propose using insertion and deletion (indel) information in addition to substitution information. We accomplish this through joint estimation of alignment and phylogeny in a Bayesian framework, drawing inference using Markov chain Monte Carlo. Joint estimation of alignment and phylogeny sidesteps biases that stem from conditioning on a single alignment by taking into account the ensemble of near-optimal alignments. Results: We introduce a novel Markov chain transition kernel that improves computational efficiency by proposing non-local topology rearrangements and by block sampling alignment and topology parameters. In addition, we extend our previous indel model to increase biological realism by placing indels preferentially on longer branches. We demonstrate the ability of indel information to increase phylogenetic resolution in examples drawn from within-host viral sequence samples. We also demonstrate the importance of taking alignment uncertainty into account when using such information. Finally, we show that codon-based substitution models can significantly affect alignment quality and phylogenetic inference by unrealistically forcing indels to begin and end between codons. Conclusion: These results indicate that indel information can improve phylogenetic resolution of recently diverged pathogens and that alignment uncertainty should be considered in such analyses.

  13. Real-time damage estimations of 2016 Kumamoto earthquakes extrapolated by the Japan Real-time Information System for earthquake (J-RISQ)

    Science.gov (United States)

    Shohei, N.; Nakamura, H.; Takahashi, I.; Fujiwara, H.

    2016-12-01

    It is crucial to develop methods for grasping the situation soon after an earthquake, both to support initial reactions and to make social systems more resilient. For those reasons, we have been developing J-RISQ. Promptly after an earthquake, it estimates damage by combining methods for predicting ground motion using subsurface data, information about population and buildings, damage assessment methods for buildings using different fragility functions, and real-time observation data obtained by NIED, municipalities, and JMA. In this study, we describe estimates for the 2016 Kumamoto earthquakes produced by J-RISQ. In 2016, Kumamoto experienced two large jolts: the foreshock (M6.5) on April 14 and the main shock (M7.3) on April 16. J-RISQ published a first report 29 seconds after the foreshock and generated a total of seven reports within 10 minutes, finally estimating the number of completely collapsed buildings at between 5,000 and 14,000. For the main shock, it issued a first report in 29 seconds and eight reports within 11 minutes, finally estimating between 15,000 and 38,000 completely collapsed buildings. The count of completely collapsed residences is approximately 8,300 according to the announcement by FDMA on July 19. In this regard, J-RISQ appears to have overestimated; however, the spatial distribution of the estimates indicates a belt of destruction adjacent to Mashiki town, and this corresponds approximately to the actual damaged area. For verification, we have performed field investigations of building damage in Kumamoto. On the other hand, the damage after the main shock includes the effect of the foreshock, so we are going to develop estimation methods that account for the weakening of buildings caused by successive earthquakes. *This work was supported by the CSTI through the Cross-ministerial Strategic Innovation Promotion Program (SIP), titled "Enhancement of societal resiliency against natural
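
    The fragility-function step of such a damage estimate can be illustrated with a lognormal collapse curve applied over a grid of building counts and predicted ground motion; the median PGV and dispersion below are hypothetical placeholders, not calibrated J-RISQ values:

```python
import math

def collapse_probability(pgv_cm_s, median_pgv=120.0, beta=0.6):
    """Lognormal fragility curve P(collapse | PGV). Median and
    dispersion are hypothetical, not calibrated J-RISQ values."""
    z = (math.log(pgv_cm_s) - math.log(median_pgv)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_collapses(cells):
    """Expected collapsed buildings over (n_buildings, pgv) grid cells."""
    return sum(n * collapse_probability(pgv) for n, pgv in cells)
```

    Summing expected collapses over all grid cells, for low and high fragility-curve variants, yields the kind of range (e.g. 15,000 to 38,000) that J-RISQ reports.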

  14. Generalization of prior information for rapid Bayesian time estimation.

    Science.gov (United States)

    Roach, Neil W; McGraw, Paul V; Whitaker, David J; Heron, James

    2017-01-10

    To enable effective interaction with the environment, the brain combines noisy sensory information with expectations based on prior experience. There is ample evidence showing that humans can learn statistical regularities in sensory input and exploit this knowledge to improve perceptual decisions and actions. However, fundamental questions remain regarding how priors are learned and how they generalize to different sensory and behavioral contexts. In principle, maintaining a large set of highly specific priors may be inefficient and restrict the speed at which expectations can be formed and updated in response to changes in the environment. However, priors formed by generalizing across varying contexts may not be accurate. Here, we exploit rapidly induced contextual biases in duration reproduction to reveal how these competing demands are resolved during the early stages of prior acquisition. We show that observers initially form a single prior by generalizing across duration distributions coupled with distinct sensory signals. In contrast, they form multiple priors if distributions are coupled with distinct motor outputs. Together, our findings suggest that rapid prior acquisition is facilitated by generalization across experiences of different sensory inputs but organized according to how that sensory information is acted on.

  15. Tsunami early warning using earthquake rupture duration (Invited)

    Science.gov (United States)

    Michelini, A.; Lomax, A.

    2009-12-01

    Real-time seismological data collected through the International Monitoring System and acquired at the International Data Center can be used for civil applications including rapid earthquake location and magnitude determination, and tsunami early warning. Effective tsunami early warning for coastlines near a tsunamigenic earthquake requires notification within 5-15 minutes. We have shown recently that the high-frequency, apparent rupture duration, T0, allows rapid estimation of moment magnitude, Mw, for large earthquakes; that tsunamigenic earthquakes have T0 greater than about 50 s; and that T0 gives more information on tsunami importance than Mw. A “duration-exceedance” procedure that uses seismograms recorded near an earthquake to rapidly determine whether T0 is likely to exceed 50 s correctly identifies, within about 10 min of occurrence, most recent earthquakes that produced large or devastating tsunamis. This identification complements initial estimates of the location, depth and magnitude of an earthquake to improve the reliability of tsunami early warning and, in some cases, may make such warning possible. The IMS global seismological network appears well suited for rapid determination of all of these earthquake parameters.
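    The duration-exceedance idea can be sketched in a few lines. This is a hedged illustration, not the published procedure: the envelope input, the 20% threshold fraction, and the synthetic record are all assumptions; only the ~50 s tsunamigenic cutoff comes from the abstract.

```python
import numpy as np

def apparent_duration(envelope, dt=1.0, frac=0.2):
    """Estimate apparent rupture duration T0 as the total time the
    high-frequency envelope stays above a fraction of its peak.
    `frac` is an illustrative threshold, not the published value."""
    threshold = frac * envelope.max()
    return dt * np.count_nonzero(envelope >= threshold)

def tsunami_flag(envelope, dt=1.0, cutoff=50.0):
    """Flag a potentially tsunamigenic event when T0 exceeds ~50 s."""
    return apparent_duration(envelope, dt) >= cutoff

# Synthetic example: a 70 s "rupture" plateau in a 300 s record
# sampled at 1 Hz.
env = np.zeros(300)
env[10:80] = 1.0
print(apparent_duration(env))  # 70.0
print(tsunami_flag(env))       # True
```

    In a real implementation the envelope would come from band-pass-filtered near-source seismograms, and the threshold logic would have to be far more robust to noise than this sketch.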

  16. Development and utilization of USGS ShakeCast for rapid post-earthquake assessment of critical facilities and infrastructure

    Science.gov (United States)

    Wald, David J.; Lin, Kuo-wan; Kircher, C.A.; Jaiswal, Kishor; Luco, Nicolas; Turner, L.; Slosky, Daniel

    2017-01-01

    The ShakeCast system is an openly available, near real-time post-earthquake information management system. ShakeCast is widely used by public and private emergency planners and responders, lifeline utility operators and transportation engineers to automatically receive and process ShakeMap products for situational awareness, inspection prioritization, or damage assessment of their own infrastructure or building portfolios. The success of ShakeCast to date and its broad, critical user base mandate improved software usability and functionality, including improved engineering-based damage and loss functions. To make the software more accessible to novice users, while still drawing on advanced users’ technical and engineering backgrounds, we have developed a “ShakeCast Workbook”, a well-documented, Excel spreadsheet-based user interface that allows users to input notification and inventory data and export the XML files required to operate the ShakeCast system. Users are able to select structure types based on a minimum set of user-specified facility attributes (building location, size, height, use, construction age, etc.). “Expert” users can import user-modified structural response properties into the facility inventory, associated with the HAZUS Advanced Engineering Building Modules (AEBM). The goal of the ShakeCast system is to provide simplified real-time potential-impact and inspection metrics (i.e., green, yellow, orange and red priority ratings) that allow users to institute customized earthquake response protocols. Previously, fragilities were approximated using individual ShakeMap intensity measures (IMs, specifically PGA and 0.3 and 1 s spectral accelerations) for each facility, but we are now performing capacity-spectrum damage state calculations using a more robust characterization of spectral demand. We are also developing methods for the direct import of ShakeMap’s multi-period spectra in lieu of the assumed three-domain design spectrum (at 0.3s for
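    As a toy illustration of the green/yellow/orange/red priority ratings described above, the sketch below maps a single ShakeMap intensity measure (PGA) to an inspection color via fixed thresholds. The threshold values and function name are invented for illustration; actual ShakeCast fragilities are facility-specific and, as the abstract notes, now use capacity-spectrum calculations rather than simple IM cutoffs.

```python
def inspection_priority(pga_g, thresholds=(0.1, 0.2, 0.4)):
    """Map peak ground acceleration (in g) at a facility to an
    inspection-priority color. Threshold values are illustrative
    placeholders, not ShakeCast's actual fragility settings."""
    yellow, orange, red = thresholds
    if pga_g >= red:
        return "red"
    if pga_g >= orange:
        return "orange"
    if pga_g >= yellow:
        return "yellow"
    return "green"

# Hypothetical facility portfolio with PGA sampled from a ShakeMap grid.
facilities = {"Bridge A": 0.05, "Hospital B": 0.25, "Depot C": 0.55}
for name, pga in facilities.items():
    print(name, inspection_priority(pga))
# Bridge A green / Hospital B orange / Depot C red
```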

  17. Rapid sampling of molecular motions with prior information constraints.

    Directory of Open Access Journals (Sweden)

    Barak Raveh

    2009-02-01

    Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy, clash-free conformations that satisfy an arbitrary number of prior-information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to a dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems and show that a limited set of experimentally motivated constraints may effectively bias the simulations in a straightforward manner toward diverse predicates, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.

  18. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
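    The short-term-average/long-term-average detector described above can be sketched on a tweet-count time series. The window lengths, trigger ratio, and synthetic data below are illustrative assumptions, not the USGS operational settings.

```python
import numpy as np

def sta_lta_detect(counts, sta_win=60, lta_win=600, ratio=5.0):
    """Return the sample indices where the short-term average of
    tweet counts exceeds `ratio` times the long-term average.
    Window lengths and ratio are illustrative choices only."""
    counts = np.asarray(counts, dtype=float)
    detections = []
    for i in range(lta_win, len(counts)):
        sta = counts[i - sta_win:i].mean()
        lta = counts[i - lta_win:i].mean()
        if lta > 0 and sta / lta >= ratio:
            detections.append(i)
    return detections

# Synthetic per-second tweet counts: Poisson background noise plus a
# 60 s burst of "earthquake" tweets starting at t = 1500.
rng = np.random.default_rng(0)
counts = rng.poisson(1.0, 2000)
counts[1500:1560] += 30
hits = sta_lta_detect(counts)
print(hits[0] if hits else None)  # first trigger shortly after t=1500
```

    The loop-based implementation keeps the algorithm explicit; a production version would use running sums (or an existing STA/LTA routine) for efficiency.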

  19. An evaluation of Health of the Nation Outcome Scales data to inform psychiatric morbidity following the Canterbury earthquakes.

    Science.gov (United States)

    Beaglehole, Ben; Frampton, Chris M; Boden, Joseph M; Mulder, Roger T; Bell, Caroline J

    2017-11-01

    Following the onset of the Canterbury, New Zealand earthquakes, there were widespread concerns that mental health services were under severe strain as a result of the earthquakes' adverse consequences for mental health. We therefore examined Health of the Nation Outcome Scales data to see whether these data could inform our understanding of the impact of the Canterbury earthquakes on patients attending local specialist mental health services. Health of the Nation Outcome Scales admission data were analysed for Canterbury mental health services prior to and following the Canterbury earthquakes. These findings were compared to Health of the Nation Outcome Scales admission data from seven other large District Health Boards to delineate local from national trends. Percentage changes in admission numbers were also calculated before and after the earthquakes for Canterbury and the seven other large district health boards. Admission Health of the Nation Outcome Scales scores in Canterbury increased after the earthquakes for adult inpatient and community services, old age inpatient and community services, and Child and Adolescent inpatient services compared to the seven other large district health boards. Admission Health of the Nation Outcome Scales scores for Child and Adolescent community services did not change significantly, while admission Health of the Nation Outcome Scales scores for Alcohol and Drug services in Canterbury fell compared to other large district health boards. Subscale analysis showed that the majority of Health of the Nation Outcome Scales subscales contributed to the overall increases found. Percentage changes in admission numbers for the Canterbury District Health Board and the seven other large district health boards before and after the earthquakes were largely comparable, with the exception of admissions to inpatient services for the group aged 4-17 years, which showed a large increase. The Canterbury earthquakes were followed by an increase in Health of the Nation

  20. Risk Informed Design Using Integrated Vehicle Rapid Assessment Tools Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A successful proof of concept was performed in FY 2012 integrating the Envision tool for parametric estimates of vehicle mass and the Rapid Response Risk Assessment...

  1. Rapid Estimation of Macroseismic Intensity for On-site Earthquake Early Warning in Italy from Early Radiated Energy

    Science.gov (United States)

    Emolo, A.; Zollo, A.; Brondi, P.; Picozzi, M.; Mucciarelli, M.

    2015-12-01

    Earthquake Early Warning Systems (EEWS) are effective tools for risk mitigation in active seismic regions. Recently, a feasibility study of a nation-wide earthquake early warning system was conducted for Italy using the RAN Network and the EEW software platform PRESTo. This work showed that reliable estimates of magnitude and epicentral location would be available within 3-4 seconds after the first P-wave arrival. On the other hand, given the RAN's density, a regional EEWS approach would result in a Blind Zone (BZ) of 25-30 km on average. Such a BZ would provide lead times greater than zero only for events with magnitude larger than 6.5. Considering that in Italy smaller events are also capable of generating great losses, both human and economic, as dramatically experienced during the recent 2009 L'Aquila (ML 5.9) and 2012 Emilia (ML 5.9) earthquakes, it has become urgent to develop and test on-site approaches. The present study focuses on the development of a new on-site EEW methodology for estimating macroseismic intensity at a target site or area. In this analysis we used a few thousand accelerometric traces recorded by the RAN for the largest earthquakes (ML>4) that occurred in Italy in the period 1997-2013. The work focuses on the integral early-warning parameter Squared Velocity Integral (IV2) and its capability to predict the peak ground velocity (PGV) and the Housner Intensity (IH); from these we parameterized a new relation between IV2 and macroseismic intensity. To assess the performance of the developed on-site EEW relation, we used data from the largest events of the last 6 years in Italy recorded by the Osservatorio Sismico delle Strutture, as well as recordings of moderate earthquakes reported by INGV Strong Motion Data. The results show that the macroseismic intensity values predicted by IV2 and those estimated from PGV and IH are in good agreement.

  2. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.

  3. New information on earthquake history of the Aksehir-Afyon Graben System, Turkey, since the second half of 18th century

    Science.gov (United States)

    Ozer, N.

    2006-12-01

    Research aimed at enriching the documentary sources available on earthquakes plays an important role in seismology. To this end, this paper documents the history of prominent earthquakes associated with the NW-SE trending Sultandag-Aksehir Fault and the Aksehir-Afyon graben system in western-central Anatolia from 1766 onward. This work also combines the earthquake data for both the historical and instrumental periods, previously listed in various catalogues and resources, for the studied area. Documents from the Ottoman archives and libraries, as well as Ottoman and Turkish newspapers, were scrutinized, revealing eight previously unreported earthquakes in the latter half of the nineteenth century and four new earthquakes in the period 1900-1931. Thanks to this document search, the total number of known earthquakes in the area under investigation for the period 1766-1931 increased from eighteen to thirty. Furthermore, the existing information on eleven previously reported earthquakes is updated for the period from 1862 to 1946. Earthquakes from 1946 to 1964 are compiled from the catalogues for data completeness.

  4. Earthquake induced rapid landslide in Nikawa, Nishinomiya-city, Japan; Nishinomiyashi Nikawa de hasseishita jishinji kosoku jisuberi

    Energy Technology Data Exchange (ETDEWEB)

    Sassa, K. [Kyoto Univ., Kyoto (Japan). Disaster Prevention Research Inst.

    1996-02-01

    A landslide occurred in the Nikawa area of Nishinomiya-city during the Southern Hyogo Earthquake of January 17, 1995. The slide was estimated at 110,000-120,000 m{sup 3} in soil volume, not large in scale, yet it became a major disaster in which 11 houses were smashed and 34 people were killed. The average moving speed of the landslide was estimated at several m/sec, although the exact figure is unknown. Kyoto University photographed the landslide site from the air on January 21 and also collected various earth clods at the site. The sampled soil was a bluish-gray sandy soil mixed with clay, distributed widely in the landslide area. The collected samples were saturated with water and tested with a ring-shear tester. The identified causes of the landslide are as follows: although the slope gradient was only about 20{degree} and its safety factor was high, the earthquake forces produced shear failure in the slope soil layers; saturated water-bearing layers existed in the slope despite the dry season; and the strength of the Nikawa soil decreased greatly when saturated. 5 refs., 7 figs.

  5. Condensation of earthquake location distributions: Optimal spatial information encoding and application to multifractal analysis of south Californian seismicity.

    Science.gov (United States)

    Kamer, Yavor; Ouillon, Guy; Sornette, Didier; Wössner, Jochen

    2015-08-01

    We present the "condensation" method, which exploits the heterogeneity of the probability distribution functions (PDFs) of event locations to improve the spatial information content of seismic catalogs. As its name indicates, the method reduces the size of a catalog while improving access to its spatial information content. The PDFs of events are first ranked by decreasing location error and then successively condensed onto better-located, lower-variance event PDFs. The condensed catalog differs from the initial catalog by attributing a different weight to each event, the set of weights providing an optimal spatial representation with respect to the spatially varying location capability of the seismic network. Synthetic tests on fractal distributions perturbed with realistic location errors show that condensation improves the spatial information content of the original catalog, quantified by the likelihood gain per event. Applied to Southern California seismicity, the condensed catalog highlights major mapped fault traces and reveals possible additional structures while reducing the catalog length by ∼25%. The condensation method allows us to account for location-error information within a point-based spatial analysis. We demonstrate this by comparing the multifractal properties of the condensed catalog locations with those of the original catalog, finding distinct spatial scaling regimes characterized by distinct multifractal spectra and separated by transition scales. We interpret the upper scale as agreeing with the thickness of the brittle crust, while the lower scale (2.5 km) might depend on the relocation procedure. Accounting for these new results, the epidemic-type aftershock model formulation suggests that, contrary to previous studies, large earthquakes dominate the earthquake triggering process. This implies that the limited capability of detecting small magnitude events cannot be used

  6. A framework for rapid post-earthquake assessment of bridges and restoration of transportation network functionality using structural health monitoring

    Science.gov (United States)

    Omenzetter, Piotr; Ramhormozian, Shahab; Mangabhai, Poonam; Singh, Ravikash; Orense, Rolando

    2013-04-01

    Quick and reliable assessment of the condition of bridges in a transportation network after an earthquake can greatly assist immediate post-disaster response and long-term recovery. However, experience shows that available resources, such as qualified inspectors and engineers, will typically be stretched for such tasks. Structural health monitoring (SHM) systems can therefore make a real difference in this context. SHM, however, needs to be deployed in a strategic manner and integrated into the overall disaster response plans and actions to maximize its benefits. This study presents, in its first part, a framework of how this can be achieved. Since it will not be feasible, or indeed necessary, to use SHM on every bridge, it is necessary to prioritize bridges within individual networks for SHM deployment. A methodology for such prioritization, based on the structural and geotechnical seismic risks affecting bridges and their importance within a network, is proposed in the second part. An example applying the methodology to selected bridges in the medium-sized transportation network of Wellington, New Zealand, is provided. The third part of the paper is concerned with using monitoring data for quick assessment of bridge condition and damage after an earthquake. Depending on the bridge risk profile, it is envisaged that data will be obtained from either local or national seismic monitoring arrays or SHM systems installed on bridges. A method using artificial neural networks is proposed for using data from a seismic array to infer key ground motion parameters at an arbitrary bridge site. The methodology is applied to seismic data collected in Christchurch, New Zealand. Finally, how such ground motion parameters can be used in bridge damage and condition assessment is outlined.

  7. The INGVterremoti blog: a new communication tool to improve earthquake information during the Po Plain seismic sequence

    Directory of Open Access Journals (Sweden)

    Maurizio Pignone

    2012-10-01

    During a seismic sequence, it is extremely important that the population shaken by the earthquakes has continuous and timely information about the ongoing seismic activity. For this reason, a few days after the beginning of the Po Plain sequence, we opened a new informative channel, namely a blog (http://ingvterremoti.wordpress.com), through which we released tens of updates and in-depth scientific explanations of the earthquake sequence, in addition to the classical information provided on the INGV web sites. Through the collaboration of Istituto Nazionale di Geofisica e Vulcanologia (INGV) researchers and technicians, we published more than 80 posts about the sequence, describing ongoing activities during the emergency and the first scientific results obtained from preliminary data processing. The response of the public was immediately encouraging, and reached its maximum on June 3, 2012, when a Ml 5.1 earthquake was strongly felt in the region previously shaken by the May 20, 2012, Ml 5.9 and the May 29, 2012, Ml 5.8 events. We recorded more than 850,000 views on that particular day, and had more than six million contacts (6,403,843) in the first two months (May 29 to July 29), mostly concentrated in the first 15 days. Even if the number of contacts is surely the main indicator of the effectiveness of a blog, it is important to carry out an in-depth analysis of the audience to obtain useful material for further improvement. A broader research project could also include cross-media comparisons, taking into account the whole set of social media used by the INGV (YouTube, Twitter, Facebook, Flickr, Apple). Future developments include the creation of new categories and new articles for other regions of Italy, and in-depth scientific articles explaining the results of the research activities. We also plan to implement interactions with the public, answering questions and responding to comments. The INGVterremoti blog is an original

  8. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on basic design processes important to both non-specialists and engineers, so that readers become suitably well-informed without needing to deal with the details of specialist understanding. The content of this encyclopedia informs technically inclined readers about the ways in which earthquakes can affect our infrastructure and how engineers go about designing against, mitigating and remediating these effects. The coverage spans buildings, foundations, underground construction, lifelines and bridges, roads, embankments and slopes. The encycl...

  9. Reviewing information support during the Great East Japan Earthquake disaster : From the perspective of a hospital library that received support

    Science.gov (United States)

    Terasawa, Motoko

    The Great East Japan Earthquake of March 11, 2011 caused extensive damage over a widespread area. Our hospital library, which is located in the affected area, was no exception. A large collection of books was lost, and some web content was inaccessible due to damage to the network environment. This greatly hindered our efforts to continue providing post-disaster medical information services. Information support, such as free access to databases, journals, and other online content related to the disaster areas, helped us immensely during this time. We were fortunate to have the cooperation of various medical employees and library members via social networks such as Twitter while obtaining this information support.

  10. Synthesis of body-wave information from global earthquake coda correlation: A numerical evaluation

    Science.gov (United States)

    Huang, H. H.; Tsai, V. C.; Lin, F. C.; Wang, W.; Chaput, J. A.

    2016-12-01

    Retrieval of body waves from noise or coda correlations, which provides sampling at greater depth than laterally propagating surface waves, has advanced our ability to probe and monitor the deep Earth. In contrast to the fruitful discussion of the accuracy and limitations of surface-wave retrieval, the understanding of successful body-wave retrieval and its possible bias is relatively limited but of great importance for further applications. Here, using a numerical approach with the actual configuration of global large earthquakes and USArray stations, we validate the path sensitivity of recently reported core phases (e.g. PKIKP2) and examine various parameters, such as source duration, source distribution, and Q structure, that may affect the travel time and spectral characteristics of retrieved body-wave signals. Simulation results based on 1-D models show that, at least for earthquake coda correlations, using only reverberations (without scattering) can replicate most of the body-wave signals in correlation functions seen in real observations. The lower frequency content of coda correlations compared to noise correlations, shown in previous studies, is likely caused by the long source time functions of large earthquakes. Furthermore, the travel times extracted from autocorrelation functions (zero offset) are relatively robust, with insignificant bias, as long as the selected coda windows are sufficiently late (after 10000 s); on the other hand, travel times in cross-correlation functions are biased and require careful window selection based upon azimuth and epicentral distance criteria. With a ray-based analytical model, we can explain the majority of this bias and show how the stationary phase approximation can be fulfilled by (ballistic) multiples that reverberate at different times in the coda, and how spurious phases emerge from the cross-terms of multiples.
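    The basic idea of recovering a body-wave reverberation time from an autocorrelation function can be illustrated with a toy synthetic: a random wavefield plus a delayed echo, whose autocorrelation peaks at the echo lag. All parameters below (echo lag, amplitude, record length) are invented for illustration and are unrelated to the actual simulations in the study.

```python
import numpy as np

def autocorr(x):
    """One-sided autocorrelation, normalized by the zero-lag value."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    full = np.correlate(x, x, mode="full")
    ac = full[len(x) - 1:]          # keep non-negative lags
    return ac / ac[0]

# Synthetic "coda": a random wavefield plus a reverberation (echo)
# arriving 120 samples later, mimicking a two-way body-wave multiple.
rng = np.random.default_rng(2)
noise = rng.standard_normal(5000)
lag = 120
coda = noise.copy()
coda[lag:] += 0.5 * noise[:-lag]

ac = autocorr(coda)
recovered = int(np.argmax(ac[1:])) + 1  # skip the zero-lag peak
print(recovered)  # 120
```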

  11. Earthquake science in resilient societies

    Science.gov (United States)

    Stahl, T.; Clark, M. K.; Zekkos, D.; Athanasopoulos-Zekkos, A.; Willis, M.; Medwedeff, William; Knoper, Logan; Townsend, Kirk; Jin, Jonson

    2017-04-01

    Earthquake science is critical in reducing vulnerability to a broad range of seismic hazards. Evidence-based studies drawing from several branches of the Earth sciences and engineering can effectively mitigate losses experienced in earthquakes. Societies that invest in this research have lower fatality rates in earthquakes and can recover more rapidly. This commentary explores the scientific pathways through which earthquake-resilient societies are developed. We highlight recent case studies of evidence-based decision making and how modern research is improving the way societies respond to earthquakes.

  12. An atlas of ShakeMaps for selected global earthquakes

    Science.gov (United States)

    Allen, Trevor I.; Wald, David J.; Hotovec, Alicia J.; Lin, Kuo-Wan; Earle, Paul S.; Marano, Kristin D.

    2008-01-01

    An atlas of maps of peak ground motions and intensity 'ShakeMaps' has been developed for almost 5,000 recent and historical global earthquakes. These maps are produced using established ShakeMap methodology (Wald and others, 1999c; Wald and others, 2005) and constraints from macroseismic intensity data, instrumental ground motions, regional topographically-based site amplifications, and published earthquake-rupture models. Applying the ShakeMap methodology allows a consistent approach to combine point observations with ground-motion predictions to produce descriptions of peak ground motions and intensity for each event. We also calculate an estimated ground-motion uncertainty grid for each earthquake. The Atlas of ShakeMaps provides a consistent and quantitative description of the distribution and intensity of shaking for recent global earthquakes (1973-2007) as well as selected historic events. As such, the Atlas was developed specifically for calibrating global earthquake loss estimation methodologies to be used in the U.S. Geological Survey Prompt Assessment of Global Earthquakes for Response (PAGER) Project. PAGER will employ these loss models to rapidly estimate the impact of global earthquakes as part of the USGS National Earthquake Information Center's earthquake-response protocol. The development of the Atlas of ShakeMaps has also led to several key improvements to the Global ShakeMap system. The key upgrades include: addition of uncertainties in the ground motion mapping, introduction of modern ground-motion prediction equations, improved estimates of global seismic-site conditions (VS30), and improved definition of stable continental region polygons. Finally, we have merged all of the ShakeMaps in the Atlas to provide a global perspective of earthquake ground shaking for the past 35 years, allowing comparison with probabilistic hazard maps. 
The online Atlas and supporting databases can be found at http://earthquake.usgs.gov/eqcenter/shakemap/atlas.php/.
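
The core ShakeMap idea, combining point observations with ground-motion predictions, can be sketched in one dimension as follows. The attenuation relation, the exponential distance weighting and the `prior_weight` damping are all invented for this illustration; real ShakeMaps interpolate residuals on a 2-D grid using published ground-motion prediction equations:

```python
import math

def predicted_intensity(mag, dist_km):
    # Hypothetical attenuation relation (illustrative only, not a published GMPE):
    # intensity grows with magnitude and decays with log distance.
    return 1.5 * mag - 3.0 * math.log10(dist_km + 10.0)

def shakemap_blend(mag, observations, grid, decay_km=30.0, prior_weight=1.0):
    """Blend point observations with model predictions, ShakeMap-style:
    interpolate observation-minus-prediction residuals with distance weights,
    damped toward the model by prior_weight where data are sparse.
    observations: (distance_km, observed_intensity) pairs; grid: distances (km)."""
    blended = []
    for d in grid:
        pred = predicted_intensity(mag, d)
        num = den = 0.0
        for od, oi in observations:
            w = math.exp(-abs(d - od) / decay_km)
            num += w * (oi - predicted_intensity(mag, od))
            den += w
        blended.append(pred + num / (den + prior_weight))
    return blended
```

With a single observation at 20 km, the estimate is pulled toward the observed value nearby and relaxes back to the model prediction far away.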

  13. Information entropy of earthquake populations in northeastern Italy and western Slovenia

    Science.gov (United States)

    Bressan, G.; Barnaba, C.; Gentili, S.; Rossi, G.

    2017-10-01

    The spatio-temporal evolution of eight seismicity populations, preceding and following moderate earthquake sequences that occurred in NE Italy and W Slovenia, is investigated by means of the normalized Shannon entropy and the fractal dimension. Three phases are recognized in the temporal seismic series. The period preceding the mainshock is characterized by oscillations of the Shannon entropy around a nearly constant level and by fluctuations of the fractal dimension. The mainshock and aftershock sequences are characterized by a significant decrease of the Shannon entropy; a simultaneous marked decrease of the fractal dimension is observed in five cases. After each sequence, the entropy recovers the nearly constant trend it showed before the mainshock, while the fractal dimension continues to fluctuate. We interpret the fluctuations of the normalized Shannon entropy and the fractal dimension as caused by the coupling between the stress field and the mechanical heterogeneities of the crust, which results in spatial and temporal fluctuations of the strain energy.
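
The normalized Shannon entropy used here as a clustering measure can be sketched in a few lines. Binning events into labelled spatial cells, and normalizing by the maximum entropy of the occupied cells, are simplifying assumptions of this illustration, not the authors' exact procedure:

```python
import math
from collections import Counter

def normalized_shannon_entropy(cell_ids):
    """Normalized Shannon entropy of earthquake epicenters over spatial cells.
    cell_ids: one grid-cell label per event. Returns H / H_max in [0, 1]:
    1 means events are spread evenly; lower values indicate clustering."""
    counts = Counter(cell_ids)
    if len(counts) <= 1:
        return 0.0  # all events in one cell: maximal clustering
    n = len(cell_ids)
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h / math.log(len(counts))
```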

  14. Tectonic summaries of magnitude 7 and greater earthquakes from 2000 to 2015

    Science.gov (United States)

    Hayes, Gavin P.; Meyers, Emma K.; Dewey, James W.; Briggs, Richard W.; Earle, Paul S.; Benz, Harley M.; Smoczyk, Gregory M.; Flamme, Hanna E.; Barnhart, William D.; Gold, Ryan D.; Furlong, Kevin P.

    2017-01-11

    This paper describes the tectonic summaries for all magnitude 7 and larger earthquakes in the period 2000–2015, as produced by the U.S. Geological Survey National Earthquake Information Center during their routine response operations to global earthquakes. The goal of such summaries is to provide important event-specific information to the public rapidly and concisely, such that recent earthquakes can be understood within a global and regional seismotectonic framework. We compile these summaries here to provide a long-term archive for this information, and so that the variability in tectonic setting and earthquake history from region to region, and sometimes within a given region, can be more clearly understood.

  15. Remote Sensing and Geographic Information Systems (GIS) Contribution to the Inventory of Infrastructure Susceptible to Earthquake and Flooding Hazards in North-Eastern Greece

    Directory of Open Access Journals (Sweden)

    Ioanna Papadopoulou

    2012-09-01

    For civil protection purposes there is a strong need to improve the inventory of areas that are more vulnerable to earthquake ground motions or to earthquake-related secondary effects, such as landslides, liquefaction or soil amplification. The use of remote sensing and Geographic Information Systems (GIS) methods, along with the related geo-databases, can help local and national authorities to be better prepared and organized. Remote sensing and GIS techniques are investigated in north-eastern Greece in order to contribute to a systematic, standardized inventory of the areas that are most susceptible to earthquake ground motions, to earthquake-related secondary effects and to tsunami waves. Knowledge of areas with an aggregated occurrence of causal ("negative") factors influencing earthquake shock, and thus damage intensity, can be integrated into disaster preparedness and mitigation measures. The evaluation of satellite imagery, digital topographic data and open-source geodata contributes to characterizing the specific tectonic, geologic and geomorphologic settings that influence local site conditions in an area and, thus, to estimating the possible damage to be suffered.

  16. Prioritizing earthquake and tsunami alerting efforts

    Science.gov (United States)

    Allen, R. M.; Allen, S.; Aranha, M. A.; Chung, A. I.; Hellweg, M.; Henson, I. H.; Melgar, D.; Neuhauser, D. S.; Nof, R. N.; Strauss, J. A.

    2015-12-01

    The timeline of hazards associated with earthquakes ranges from seconds for the strong shaking at the epicenter, to minutes for strong shaking at more distant locations in big quakes, to tens of minutes for a local tsunami. Earthquake and tsunami warning systems must therefore include very fast initial alerts, while also taking advantage of the available time in bigger and tsunami-generating quakes. At the UC Berkeley Seismological Laboratory we are developing a suite of algorithms to provide the fullest possible information about earthquake shaking and tsunami inundation from seconds to minutes after a quake. The E-larmS algorithm uses the P-wave to rapidly detect an earthquake and issue a warning. It is currently issuing alerts to test users in as little as 3 sec after the origin time. Development of a new waveform detector may lead to even faster alerts. G-larmS uses permanent deformation estimates from GNSS stations to estimate the geometry and extent of the rupture underway, providing more accurate ground shaking estimates in big (M>~7) earthquakes. It performed well in the M6.0 2014 Napa earthquake. T-larmS is a new algorithm designed to extend alert capabilities to tsunami inundation. Rapid estimates of source characteristics for subduction zone events can be used to warn not only of the shaking hazard but also of the local tsunami inundation hazard. These algorithms are being developed, implemented and tested with a focus on the western US, but are also now being tested in other parts of the world, including Israel, Turkey, Korea and Chile. Beta users in the Bay Area are receiving the alerts and beginning to implement automated actions. They also provide feedback on users' needs, which has led to the development of the MyEEW smartphone app, which allows beta users to receive the alerts on their cell phones. All these efforts feed into our ongoing assessment of directions and priorities for future development and implementation efforts.
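
A classic short-term/long-term average (STA/LTA) ratio picker gives the flavor of the P-wave detection step behind alerts like E-larmS; the actual detectors are more sophisticated, and the window lengths and trigger threshold below are illustrative:

```python
def sta_lta_trigger(samples, sta_n=5, lta_n=50, threshold=4.0):
    """Simplified STA/LTA picker: flag the first sample where the short-term
    average of signal energy exceeds `threshold` times the long-term average.
    Returns the trigger index, or None if no trigger occurs."""
    energy = [s * s for s in samples]
    for i in range(lta_n, len(energy)):
        sta = sum(energy[i - sta_n:i]) / sta_n
        lta = sum(energy[i - lta_n:i]) / lta_n
        if lta > 0 and sta / lta >= threshold:
            return i
    return None
```

On a trace of low-amplitude noise followed by a sudden P-wave onset, the trigger fires within a few samples of the onset; on steady noise it never fires.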

  17. Undergraduate Studies in Earthquake Information Technology (UseIT): Preparing Students for the Twenty-First Century Work Force via a Multidisciplinary and Collaborative Learning Experience

    Science.gov (United States)

    Degroot, R. M.; Jordan, T. H.; Benthien, M. L.; Ihrig, M.; Berti, R.

    2009-12-01

    UseIT is one of the three undergraduate research programs sponsored by the Southern California Earthquake Center (SCEC). The program allows students to work in multi-disciplinary collaborative teams to tackle a scientific “Grand Challenge.” The topic varies each year but it always entails performing computer science research that is needed by earthquake scientists, educators, and other target audiences. The program allows undergraduates to use the advanced tools of information technology to solve important problems in interdisciplinary earthquake research. Since the program began in 2002, 145 students have participated in UseIT. The program stresses problem solving and interdisciplinary cross training. A key aspect of the UseIT program is its flexible, yet structured, team approach. Students share their diverse skills and interests, creating a powerful synergy through this peer mentoring. The majority of UseIT interns have considerable computer science skill or aptitude, but successful UseIT interns have hailed from nearly three-dozen disciplines, all class levels, and all skill levels. Successful UseIT interns have in common a willingness to step outside their comfort zones and try new things. During the 2009 internship the focus of the program was to deliver SCEC Virtual Display of Objects (VDO) images and animations of faults and earthquake sequences to SCEC, the Earthquake Country Alliance, and other virtual organizations via a content management system that captures the metadata and guides the user. SCEC-VDO is the SCEC intern-developed visualization software that allows the user to see earthquake related phenomena in three and four dimensions. The 2009 Grand Challenge had special relevance for the interns because the products they created were used for The Great California ShakeOut. This talk will discuss lessons learned from this program, how it addresses the needs of the 21st century STEM work force, and highlights of the 2009 internship.

  18. Oregon Hazard Explorer for Lifelines Program (OHELP): A web-based geographic information system tool for assessing potential Cascadia earthquake hazard

    Science.gov (United States)

    Sharifi Mood, M.; Olsen, M. J.; Gillins, D. T.; Javadnejad, F.

    2016-12-01

    The Cascadia Subduction Zone (CSZ) is capable of generating earthquakes as powerful as moment magnitude 9, creating great amounts of damage to structures and facilities in Oregon. A series of deterministic earthquake analyses was performed for M9.0, M8.7, M8.4 and M8.1 scenarios featuring persistent, long-lasting shaking, together with other geological threats such as landslides, liquefaction-induced ground deformation, fault-rupture vertical displacement and tsunamis. These ground deformations endanger urban structures, foundations, bridges, roadways, pipelines and other lifelines. Lifeline providers in Oregon, including the private and public organizations responsible for transportation, electric and gas utilities, water and wastewater, fuel, airports, and harbors, face an aging infrastructure that was built prior to a full understanding of this extreme seismic risk. As recently experienced in Chile and Japan, the three-to-five-minute-long shaking expected in Oregon necessitates a very different method of risk mitigation for these major lifelines than those developed for the shorter shaking of crustal earthquakes. A web-based geographic information system tool has been developed to assess the potential hazard from the multiple threats posed by Cascadia subduction zone earthquakes in the region. The purpose of this website is to provide easy access over the web to the latest and best available hazard information, including work completed for the recent Oregon Resilience Plan (ORP) (OSSPAC, 2013) and other work completed by the Department of Geology and Mineral Industries (DOGAMI) and the United States Geological Survey (USGS). The tool is designed for engineers, planners, geologists, and others who need this information to make appropriate decisions, and requires only minimal knowledge of GIS.

  19. Nowcasting Earthquakes

    Science.gov (United States)

    Rundle, J. B.; Donnellan, A.; Grant Ludwig, L.; Turcotte, D. L.; Luginbuhl, M.; Gail, G.

    2016-12-01

    Nowcasting is a term originating from economics and finance. It refers to the process of determining the uncertain state of the economy or markets at the current time by indirect means. We apply this idea to seismically active regions, where the goal is to determine the current state of the fault system, and its current level of progress through the earthquake cycle. In our implementation of this idea, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. Our method does not involve any model other than the idea of an earthquake cycle. Rather, we define a specific region and a specific large earthquake magnitude of interest, ensuring that we have enough data to span at least 20 or more large earthquake cycles in the region. We then compute the earthquake potential score (EPS), which is defined from the cumulative probability distribution P(n < N) for the number of small earthquakes n that occur between successive large earthquakes in the region. From the count of small earthquakes since the last large earthquake, we determine the current value of the EPS, a measure of the current level of progress through the earthquake cycle in the defined region at the current time.
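
The EPS is effectively an empirical cumulative distribution over the small-earthquake counts of past cycles; a minimal sketch (the function name and toy counts are ours, not the authors'):

```python
def earthquake_potential_score(interevent_counts, current_count):
    """Earthquake potential score in the spirit of nowcasting: the fraction
    of completed large-earthquake cycles in which fewer small earthquakes
    occurred than have accumulated so far in the current cycle.
    interevent_counts: number of small quakes in each completed cycle."""
    below = sum(1 for c in interevent_counts if c < current_count)
    return below / len(interevent_counts)
```

An EPS near 1 means the current count of small earthquakes already exceeds that of most past cycles, suggesting the region is late in its cycle.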

  20. Insight into the Earthquake Risk Information Seeking Behavior of the Victims: Evidence from Songyuan, China

    National Research Council Canada - National Science Library

    Shasha Li; Guofang Zhai; Shutian Zhou; Chenjing Fan; Yunqing Wu; Chongqiang Ren

    2017-01-01

    .... Under the guidance of the Risk Information Seeking and Processing model as well as relevant practical findings using the structural equation model, this study attempts to explore the main determinants...

  1. Earthquake engineering in Peru

    Science.gov (United States)

    Vargas, N.J

    1983-01-01

    During the last decade, earthquake engineering research in Peru has been carried out at the Catholic University of Peru and at the Universidad Nacional de Ingeniería (UNI). The Geophysical Institute (IGP), under the auspices of the Organization of American States (OAS), has initiated other efforts in Peru in regional seismic hazard assessment programs with direct impact on the earthquake engineering program. Further details on these programs have been reported by L. Ocola in the Earthquake Information Bulletin, January-February 1982, vol. 14, no. 1, pp. 33-38.

  2. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru

    Science.gov (United States)

    Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan

    2015-01-01

    Background Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity. Methods We used childhood stunting as a proxy for public health effects. Data on stunting were obtained from Demographic and Health Surveys. Earthquake data were obtained from U.S. Geological Survey’s ShakeMaps, geographic information system-based maps that divide earthquake affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case – an 8.4 magnitude earthquake that hit southern Peru in 2001. Results and conclusions Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters. PMID:26090999

  3. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru.

    Science.gov (United States)

    Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan

    2015-01-01

    Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were obtained from Demographic and Health Surveys. Earthquake data were obtained from U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case, an 8.4 magnitude earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.
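
The exposure-grouping step, assigning each survey cluster an earthquake-exposure category from a gridded intensity map, can be sketched as follows. The grid lookup, cell size and intensity cut-offs are illustrative assumptions, not the study's exact values:

```python
def exposure_group(mmi):
    """Bin a ShakeMap-style intensity value into coarse exposure groups
    (thresholds are invented for illustration)."""
    if mmi < 4.0:
        return "unexposed"
    if mmi < 6.0:
        return "moderate"
    return "severe"

def classify_clusters(clusters, mmi_grid, cell_deg=0.5):
    """clusters: {cluster_id: (lat, lon)}; mmi_grid: {(lat_idx, lon_idx): mmi}
    on a regular grid with cell_deg spacing. Returns cluster_id -> group;
    clusters outside the mapped area default to 'unexposed'."""
    out = {}
    for cid, (lat, lon) in clusters.items():
        key = (round(lat / cell_deg), round(lon / cell_deg))
        out[cid] = exposure_group(mmi_grid.get(key, 0.0))
    return out
```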

  4. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    Science.gov (United States)

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  5. MyShake: A smartphone seismic network for earthquake early warning and beyond.

    Science.gov (United States)

    Kong, Qingkai; Allen, Richard M; Schreier, Louis; Kwon, Young-Woo

    2016-02-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics.
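
A toy version of the on-phone screening idea, that earthquake shaking is sustained while everyday bumps are impulsive, might look like this. MyShake's real classifier is trained on features such as interquartile range and cumulative absolute velocity; every threshold below is invented for illustration:

```python
def looks_like_earthquake(accel, rate_hz=25, window_s=2.0,
                          min_rms=0.05, max_peak_to_rms=6.0):
    """Screen a window of accelerometer samples (in g). Earthquake shaking is
    sustained, so its peak should not dwarf the RMS; a single bump (phone
    knocked over) has a large peak-to-RMS ratio."""
    n = int(rate_hz * window_s)
    window = accel[-n:]
    rms = (sum(a * a for a in window) / len(window)) ** 0.5
    if rms < min_rms:
        return False  # too weak to be shaking of interest
    peak = max(abs(a) for a in window)
    return peak / rms <= max_peak_to_rms
```

A sustained sinusoidal "shaking" window passes the screen, while a single sharp spike on a quiet background is rejected.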

  6. Progress towards Rapid Detection of Measles Vaccine Strains: a Tool To Inform Public Health Interventions.

    Science.gov (United States)

    Hacker, Jill K

    2017-03-01

    Rapid differentiation of vaccine from wild-type strains in suspect measles cases is a valuable epidemiological tool that informs the public health response to this highly infectious disease. Few public health laboratories sequence measles virus-positive specimens to determine genotype, and the vaccine-specific real-time reverse transcriptase PCR (rRT-PCR) assay described by F. Roy et al. (J. Clin. Microbiol. 55:735-743, 2017, https://doi.org/10.1128/JCM.01879-16) offers a rapid, easily adoptable method to identify measles vaccine strains in suspect cases. Copyright © 2017 American Society for Microbiology.

  7. Rapid Detection of Land Cover Changes Using Crowdsourced Geographic Information: A Case Study of Beijing, China

    Directory of Open Access Journals (Sweden)

    Yuan Meng

    2017-08-01

    Land cover change (LCC) detection is a significant component of sustainability research, including ecological economics and climate change. Due to the rapid variability of the natural environment, effective LCC detection is required to capture sufficient change-related information. Although such information has been available through remotely sensed images, the complicated image processing and classification make their use time consuming and labour intensive. In contrast, freely available crowdsourced geographic information (CGI) contains easily interpreted textual information, and thus has the potential to be applied for capturing effective change-related information. Therefore, this paper presents and evaluates a method using CGI for rapid LCC detection. As a case study, Beijing is chosen as the study area, and CGI is applied to monitor LCC information. As one kind of CGI generated from commercial Internet maps, points of interest (POIs) with detailed textual information are utilised to detect land cover in 2016. These POIs are first classified into a land cover nomenclature based on their textual information. Then, a kernel density approach is proposed to effectively generate land cover regions in 2016. Finally, with GlobeLand30 in 2010 as the baseline map, LCC is detected using the post-classification method for the period 2010–2016 in Beijing. The result shows that an accuracy of 89.20% is achieved with land cover regions generated by POIs, indicating that POIs are reliable for rapid LCC detection. Additionally, a comparison between remotely sensed images and CGI for LCC detection reveals the advantages of POIs in terms of efficiency. However, due to their uneven distribution, remotely sensed images are still required in areas with few POIs.
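
The kernel-density step, turning classified POIs into land-cover labels, can be sketched as follows. The Gaussian kernel, degree coordinates and bandwidth are illustrative choices, not the paper's parameters:

```python
import math

def kernel_density(pois, point, bandwidth=0.01):
    """Gaussian kernel density of one land-cover class's POIs at a point.
    pois: list of (lon, lat); point: (lon, lat); coordinates in degrees."""
    gx, gy = point
    return sum(
        math.exp(-((gx - px) ** 2 + (gy - py) ** 2) / (2 * bandwidth ** 2))
        for px, py in pois
    )

def classify_cells(pois_by_class, grid_points, bandwidth=0.01):
    """Label each grid point with the land-cover class whose POIs yield the
    highest kernel density there (None where all densities are zero)."""
    labels = []
    for gp in grid_points:
        best, best_d = None, 0.0
        for cls, pois in pois_by_class.items():
            d = kernel_density(pois, gp, bandwidth)
            if d > best_d:
                best, best_d = cls, d
        labels.append(best)
    return labels
```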

  8. Social media as an information source for rapid flood inundation mapping

    Science.gov (United States)

    Fohringer, J.; Dransch, D.; Kreibich, H.; Schröter, K.

    2015-12-01

    During and shortly after a disaster, data about the hazard and its consequences are scarce and not readily available. Information provided by eyewitnesses via social media is a valuable information source, which should be explored in a more effective way. This research proposes a methodology that leverages social media content to support rapid inundation mapping, including inundation extent and water depth in the case of floods. The novelty of this approach is the utilization of quantitative data that are derived from photos from eyewitnesses extracted from social media posts and their integration with established data. Due to the rapid availability of these posts compared to traditional data sources such as remote sensing data, areas affected by a flood, for example, can be determined quickly. The challenge is to filter the large number of posts to a manageable amount of potentially useful inundation-related information, as well as to interpret and integrate the posts into mapping procedures in a timely manner. To support rapid inundation mapping we propose a methodology and develop "PostDistiller", a tool to filter geolocated posts from social media services which include links to photos. This spatial distributed contextualized in situ information is further explored manually. In an application case study during the June 2013 flood in central Europe we evaluate the utilization of this approach to infer spatial flood patterns and inundation depths in the city of Dresden.
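
The filtering step, reducing the stream of posts to geolocated, photo-bearing, flood-related candidates, can be sketched as below; the keyword list and field names are illustrative assumptions, not PostDistiller's actual interface:

```python
FLOOD_TERMS = ("flood", "inundation", "hochwasser", "water level")

def filter_flood_posts(posts):
    """Keep posts that are geolocated, link a photo, and mention flooding.
    posts: dicts with 'text', 'lat', 'lon' and 'photo_url' keys (hypothetical
    schema); returns the subset worth manual inspection for depth estimation."""
    kept = []
    for p in posts:
        if p.get("lat") is None or p.get("lon") is None:
            continue  # cannot be mapped without a location
        if not p.get("photo_url"):
            continue  # depth estimation needs a photo
        text = p.get("text", "").lower()
        if any(term in text for term in FLOOD_TERMS):
            kept.append(p)
    return kept
```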

  9. Rapid scenarios and observed intensities

    OpenAIRE

    Franco Pettenati; Livio Sirovich

    2012-01-01

    After a destructive earthquake, national Governments need to know the approximate amount of damage, the number of casualties, and the financial losses as soon as possible. Rapid scenarios are also used to inform the general public; see the widely used Shakemap package [Wald et al. 1999, 2006] of the US Geological Survey (USGS) and the one modified by the Istituto Nazionale di Geofisica e Vulcanologia (INGV; National Institute of Geophysics and Volcanology), which is reproduced for Figure 1. T...

  10. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992) and fatality rates are likely to continue to rise with increased population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that large fractions of the world's population still reside in informal, poorly constructed, non-engineered dwellings that are highly susceptible to collapse during earthquakes. Moreover, with increasing urbanization, half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use for other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the
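
Steps (2) and (3) of the casualty chain described above can be sketched as a sum over building types; all rates here are placeholders, not PAGER's calibrated values:

```python
def expected_fatalities(exposure, collapse_rate, fatality_given_collapse=0.1):
    """Aggregate exposed population per building type at a given shaking
    level, then apply collapse and fatality rates.
    exposure: {building_type: people exposed};
    collapse_rate: {building_type: collapse probability at that shaking}."""
    total = 0.0
    for btype, people in exposure.items():
        total += people * collapse_rate.get(btype, 0.0) * fatality_given_collapse
    return total
```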

  11. Chance findings about early holocene tidal marshes of Grays Harbor, Washington, in relation to rapidly rising seas and great subduction earthquakes

    Science.gov (United States)

    Phipps, James B.; Hemphill-Haley, Eileen; Atwater, Brian F.

    2015-06-18

    Tidal marshes commonly build upward apace with gradual rise in the level of the sea. It is expected, however, that few tidal marshes will keep up with accelerated sea-level rise later in this century. Tidal marshes have been drowned, moreover, after subsiding during earthquakes.

  12. On the involvement of the Citizens for an improved earthquake response

    Science.gov (United States)

    Bossu, R.; Gilles, S.; Roussel, F.

    2009-04-01

    Because of their key role in earthquake response, Civil Protection agencies are more interested in the effects of an earthquake than in the phenomenon itself. An earthquake's effects are not easy to predict from seismological data alone. Firstly, the many elements required to estimate these effects are either affected by non-negligible uncertainties (e.g. earthquake depth and location, site effects, seismic attenuation) and/or poorly constrained (e.g. building vulnerability, distribution of the population with time…). Secondly, the reaction of the population is not always easy to predict, especially in areas of low seismic hazard where there is little or no experience of felt earthquakes. In order to complement its rapid earthquake information services, the Euro-Med Seismological Centre (EMSC) developed a number of tools to rapidly collect in-situ observations of earthquake effects and to better evaluate the reaction of the population. The first tool is an original EMSC development. It uses the observed surge of traffic on the EMSC website (www.emsc-csem.org) to rapidly (within 5 to 10 minutes of the earthquake's occurrence) map the area where an earthquake was felt and to determine whether there has been significant widespread damage. When an earthquake is felt, people rush to the Internet to find out the cause of the shaking, generating a sudden surge of traffic on our website. The area where the earthquake was felt is determined by locating the IP addresses and identifying the localities which exhibit a significant increase in visitors. Damaged areas are characterised by a lack or an absence of connections. This approach, which is the fastest way to collect in-situ observations of earthquake effects, is being implemented in several institutes in Europe. Online macroseismic questionnaires in more than 20 languages complement this first approach. They provide a refined description of the effects and shaking levels through a quantitative
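
The traffic-surge idea can be sketched as a per-locality ratio test; the surge factor, minimum-hit cut-off and zero-baseline fallback below are invented for illustration:

```python
def felt_localities(baseline_hits, current_hits, surge_factor=5.0, min_hits=10):
    """Flag localities (resolved from visitor IP addresses) whose website hit
    rate jumps well above baseline in the minutes after an earthquake.
    Both arguments map locality -> hits per interval."""
    felt = []
    for loc, hits in current_hits.items():
        base = baseline_hits.get(loc, 0.5)  # small fallback avoids divide-by-zero
        if hits >= min_hits and hits / base >= surge_factor:
            felt.append(loc)
    return sorted(felt)
```

The complementary damage signal, localities that go silent, would be flagged by the opposite test (current hits far below baseline), which is omitted here.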

  13. The HayWired earthquake scenario—Earthquake hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-01-01

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake, The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  14. Toward automated directivity estimates in earthquake moment tensor inversion

    OpenAIRE

    Huang, Hsin-Hua; Aso, Naofumi; Tsai, Victor C.

    2017-01-01

    Rapid estimates of earthquake rupture properties are useful for both scientific characterization of earthquakes and emergency response to earthquake hazards. Rupture directivity is a particularly important property to constrain since seismic waves radiated in the direction of rupture can be greatly amplified, and even moderate magnitude earthquakes can sometimes cause serious damage. Knowing the directivity of earthquakes is important for ground shaking prediction and hazard mitigation, and i...

  15. Exploring the Use of Historic Earthquake Information to Differentiate Between Deposit Triggers for the High-resolution Stratigraphy from Squaw Lakes, Oregon, USA

    Science.gov (United States)

    Morey, A. E.; Gavin, D. G.; Goldfinger, C.; Nelson, A. R.

    2014-12-01

    The unique setting and high-resolution stratigraphy at Squaw Lakes, Oregon provide an opportunity to apply lake paleoseismology to southern Cascadia forearc lakes. These lakes were formed when a landslide dammed Squaw Creek, located ~100 km from the Oregon coast at the Oregon/California border, separating the drainages at the confluence of Squaw and Slickear Creeks. The upper lake contains evidence of disturbance events much too frequent to be the result of earthquakes alone. A link to historic events provides information that may be used to differentiate between deposit triggers and improve the interpretation of the prehistoric portion of the sedimentary record. Regional newspapers published historic accounts of earthquakes experienced by the local people, the most notable of which is the November 23 (or 22nd), 1873 Crescent City, CA earthquake. Although the 1906 San Francisco earthquake was also felt in this region, reports indicate that shaking was much stronger near Jacksonville, Oregon (only 25 miles to the north of Squaw Lakes) as a result of the 1873 earthquake. The depth range that most likely contains sediment deposited within a few years of 1873 can be determined using a new high-resolution age model for the Upper Squaw Lake sediment core (Gavin et al., in prep). This depth range in the core contains a thick deposit that is similar in structure to deposits deeper in the core that have been proposed to correlate with the marine record of Cascadia great earthquakes. These disturbance-event deposits are thicker, graded deposits in which grading is dominated by the percentage of organic content, as compared to those interpreted to be a result of watershed disturbances. Recently acquired radiocarbon ages for the Lower Squaw Lake core suggest the thicker Upper Squaw Lake deposits correlate with those recorded in the lower-resolution sedimentary record at Lower Squaw Lake. The character of the likely contemporaneous deposits from the lower lake shows grading more

  16. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays, earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  17. Analog earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Hofmann, R.B. [Center for Nuclear Waste Regulatory Analyses, San Antonio, TX (United States)

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  18. Methodology to determine the parameters of historical earthquakes in China

    Science.gov (United States)

    Wang, Jian; Lin, Guoliang; Zhang, Zhe

    2017-12-01

    China is one of the countries with the longest cultural traditions, and it has suffered very heavy earthquake disasters; consequently, there are abundant earthquake records. In this paper, we try to sketch out historical earthquake sources and research achievements in China. We introduce some basic information about the collections of historical earthquake sources, the establishment of an intensity scale, and the editions of historical earthquake catalogues. Spatial-temporal and magnitude distributions of historical earthquakes are analyzed briefly. Besides traditional methods, we also illustrate a new approach to amend the parameters of historical earthquakes or even identify candidate zones for large historical or palaeo-earthquakes. In the new method, a relationship between instrumentally recorded small earthquakes and strong historical earthquakes is built up. The abundant historical earthquake sources and the achievements of historical earthquake research in China are a valuable cultural heritage for the world.

  19. Future Earth: Reducing Loss By Automating Response to Earthquake Shaking

    Science.gov (United States)

    Allen, R. M.

    2014-12-01

    Earthquakes pose a significant threat to society in the U.S. and around the world. The risk is easily forgotten given the infrequent recurrence of major damaging events, yet the likelihood of a major earthquake in California in the next 30 years is greater than 99%. As our societal infrastructure becomes ever more interconnected, the potential impacts of these future events are difficult to predict. Yet, the same inter-connected infrastructure also allows us to rapidly detect earthquakes as they begin, and provide seconds, tens of seconds, or a few minutes of warning. A demonstration earthquake early warning system is now operating in California and is being expanded to the west coast (www.ShakeAlert.org). In recent earthquakes in the Los Angeles region, alerts were generated that could have provided warning to the vast majority of Angelenos who experienced the shaking. Efforts are underway to build a public system. Smartphone technology will be used not only to issue the alerts, but also to collect data and improve the warnings. The MyShake project at UC Berkeley is currently testing an app that attempts to turn millions of smartphones into earthquake detectors. As our development of the technology continues, we can anticipate ever-more automated responses to earthquake alerts. Already, the BART system in the San Francisco Bay Area automatically stops trains based on the alerts. In the future, elevators will stop, machinery will pause, hazardous materials will be isolated, and self-driving cars will pull over to the side of the road. In this presentation we will review the current status of the earthquake early warning system in the US. We will illustrate how smartphones can contribute to the system. Finally, we will review applications of the information to reduce future losses.

  20. Mitigating the consequences of future earthquakes in historical centres: what perspectives from the joined use of past information and geological-geophysical surveys?

    Science.gov (United States)

    Terenzio Gizzi, Fabrizio; Moscatelli, Massimiliano; Potenza, Maria Rosaria; Zotta, Cinzia; Simionato, Maurizio; Pileggi, Domenico; Castenetto, Sergio

    2015-04-01

    Mitigating the damaging effects of earthquakes in urban areas, and particularly in historical centres prone to high seismic hazard, is an important task to pursue. As a matter of fact, seismic history throughout the world informs us that earthquakes have caused deep changes in ancient urban conglomerations due to their high building vulnerability. Furthermore, some quarters can be exposed to increased seismic actions compared with adjacent areas due to the geological and/or topographical features of the sites on which the historical centres lie. Usually, the strategies aimed at estimating the local seismic hazard make use only of geological-geophysical surveys. Through this approach, we do not draw any lesson from what happened as a consequence of past earthquakes. With this in mind, we present the results of a joint use of historical data and the traditional geological-geophysical approach to analyse the effects of possible future earthquakes in historical centres. The research activity discussed here is a joint collaboration between the Department of Civil Protection of the Presidency of the Council of Ministers, the Institute of Environmental Geology and Geoengineering, and the Institute of Archaeological and Monumental Heritage of the National (Italian) Research Council. To show the results, we discuss the preliminary achievements of the integrated study carried out on two historical towns located in the Southern Apennines, a portion of the Italian peninsula exposed to high seismic hazard. Taking advantage of these two test sites, we also discuss some methodological implications that could be taken as a reference in seismic microzonation studies.

  1. Rapid and highly informative diagnostic assay for H5N1 influenza viruses.

    Directory of Open Access Journals (Sweden)

    Nader Pourmand

    Full Text Available A highly discriminative and information-rich diagnostic assay for H5N1 avian influenza would meet immediate patient care needs and provide valuable information for public health interventions, e.g., tracking of new and more dangerous variants by geographic area as well as avian-to-human or human-to-human transmission. In the present study, we have designed a rapid assay based on multilocus nucleic acid sequencing that focuses on the biologically significant regions of the H5N1 hemagglutinin gene. This allows the prediction of viral strain, clade, receptor binding properties, low- or high-pathogenicity cleavage site and glycosylation status. H5 HA genes were selected from nine known high-pathogenicity avian influenza subtype H5N1 viruses, based on their diversity in biologically significant regions of hemagglutinin and/or their ability to cause infection in humans. We devised a consensus pre-programmed pyrosequencing strategy, which may be used as a faster, more accurate alternative to de novo sequencing. The available data suggest that the assay described here is a reliable, rapid, information-rich and cost-effective approach for definitive diagnosis of H5N1 avian influenza. Knowledge of the predicted functional sequences of the HA will enhance H5N1 avian influenza surveillance efforts.

  2. Episodic Memory Retrieval Functionally Relies on Very Rapid Reactivation of Sensory Information.

    Science.gov (United States)

    Waldhauser, Gerd T; Braun, Verena; Hanslmayr, Simon

    2016-01-06

    Episodic memory retrieval is assumed to rely on the rapid reactivation of sensory information that was present during encoding, a process termed "ecphory." We investigated the functional relevance of this scarcely understood process in two experiments in human participants. We presented stimuli to the left or right of fixation at encoding, followed by an episodic memory test with centrally presented retrieval cues. This allowed us to track the reactivation of lateralized sensory memory traces during retrieval. Successful episodic retrieval led to a very early (∼100-200 ms) reactivation of lateralized alpha/beta (10-25 Hz) electroencephalographic (EEG) power decreases in the visual cortex contralateral to the visual field at encoding. Applying rhythmic transcranial magnetic stimulation to interfere with early retrieval processing in the visual cortex led to decreased episodic memory performance specifically for items encoded in the visual field contralateral to the site of stimulation. These results demonstrate, for the first time, that episodic memory functionally relies on very rapid reactivation of sensory information. Remembering personal experiences requires a "mental time travel" to revisit sensory information perceived in the past. This process is typically described as a controlled, relatively slow process. However, by using electroencephalography to measure neural activity with a high time resolution, we show that such episodic retrieval entails a very rapid reactivation of sensory brain areas. Using transcranial magnetic stimulation to alter brain function during retrieval revealed that this early sensory reactivation is causally relevant for conscious remembering. These results give first neural evidence for a functional, preconscious component of episodic remembering. This provides new insight into the nature of human memory and may help in the understanding of psychiatric conditions that involve the automatic intrusion of unwanted memories.

  3. Thermal Infrared Anomalies of Several Strong Earthquakes

    OpenAIRE

    Congxin Wei; Yuansheng Zhang; Xiao Guo; Shaoxing Hui; Manzhong Qin; Ying Zhang

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies, which have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after the 8 great earthquakes with magnitudes up to Ms 7.0 by using satellite infrared remote sensing information. We used new types of data and method...

  4. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Directory of Open Access Journals (Sweden)

    Mustafa ULAS

    2013-01-01

    Full Text Available A lot of people die because of earthquakes every year. Therefore, it is crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are targeted to guide new research in order to find novel prediction methods.

  5. The electric and magnetic fields research and public information dissemination (EMF-RAPID) program.

    Science.gov (United States)

    Moulder, J E

    2000-05-01

    In the United States, public concern that exposure to power-line fields was linked to cancer led to the establishment of a Congressionally mandated program, the Electric and Magnetic Fields Research and Public Information Dissemination (EMF-RAPID) Program. A major goal of the program was to "determine whether or not exposures to electric and magnetic fields produced by the generation, transmission, and use of electrical energy affect human health". Between 1994 and 1998, the EMF-RAPID program spent approximately $41 million on biological research. Much of the work funded by the EMF-RAPID program has not yet been published in the peer-reviewed literature. The U.S. National Institute of Environmental Health Sciences (NIEHS) asked that Radiation Research publish this special issue in an attempt to remedy this publication gap. The issue includes reviews of studies that were done to assess the biological plausibility of claims that power-frequency fields caused leukemia and breast cancer. The issue continues with two teratology studies and one immunology study. The section of the issue covering in vitro studies begins with an overview of the efforts NIEHS made to replicate a wide range of reported effects of power-frequency fields and continues with four papers reporting the absence of effects of power-frequency fields on the expression of stress-response genes and oncogenes. Other reports of in vitro studies and studies of mechanisms cover cytotoxicity, gap junction intracellular communication, calcium ion transport across the plasma membrane, and intracellular electric fields.

  6. Assessment of earthquake effects - contribution from online communication

    Science.gov (United States)

    D'Amico, Sebastiano; Agius, Matthew; Galea, Pauline

    2014-05-01

    The rapid increase of social media and online newspapers in recent years has given us the opportunity to conduct a national investigation of macroseismic effects on the Maltese Islands based on felt earthquake reports. A magnitude 4.1 earthquake struck close to Malta on Sunday 24th April 2011 at 13:02 GMT. The earthquake was preceded and followed by a series of smaller magnitude quakes throughout the day, most of which were felt by locals on the island. The continuous news media coverage during the day and the extensive sharing of the news item on social media resulted in a strong public response to fill in the 'Did you feel it?' online form on the website of the Seismic Monitoring and Research Unit (SMRU) at the University of Malta (http://seismic.research.um.edu.mt/). The results yield interesting information about the demographics of the island, and the different felt experiences possibly relating to geological settings and diverse structural and age-classified buildings. Based on this case study, the SMRU is in the process of developing a mobile phone application dedicated to sharing earthquake information with the local community. The application will automatically prompt users to fill in a simplified 'Did you feel it?' report for potentially felt earthquakes. Automatic location using Global Positioning Systems can be incorporated to provide a 'real-time' intensity map that can be used by the Civil Protection Department.

  7. iOS and OS X Apps for Exploring Earthquake Activity

    Science.gov (United States)

    Ammon, C. J.

    2015-12-01

    The U.S. Geological Survey and many other agencies rapidly provide information following earthquakes. This timely information garners great public interest and provides a rich opportunity to engage students in discussion and analysis of earthquakes and tectonics. In this presentation I will describe a suite of iOS and Mac OS X apps that I use for teaching and that Penn State employs in outreach efforts in a small museum run by the College of Earth and Mineral Sciences. The iOS apps include a simple, global overview of earthquake activity, epicentral, designed for a quick review or event lookup. A more full-featured iPad app, epicentral-plus, includes a simple global overview along with views that allow a more detailed exploration of geographic regions of interest. In addition, epicentral-plus allows the user to monitor ground motions using seismic channel lists compatible with the IRIS web services. Some limited seismogram processing features are included to allow focus on appropriate signal bandwidths. A companion web site, which includes background material on earthquakes, and a blog that includes sample images and channel lists appropriate for monitoring earthquakes in regions of recent earthquake activity can be accessed through a third panel in the app. I use epicentral-plus at the beginning of each earthquake seismology class to review recent earthquake activity and to stimulate students to formulate and ask questions that lead to discussions of earthquake and tectonic processes. Less interactive OS X versions of the apps are used to display a global map of earthquake activity and seismograms in near real time in a small museum on the ground floor of the building hosting Penn State's Geoscience Department.

  8. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    and kurtosis both tend to zero for large earthquake rates: for the Gaussian law, these values are identically zero. A calculation of the NBD skewness and kurtosis levels, based on the values of the first two statistical moments of the distribution, shows a rapid increase in these upper moments. However, the observed catalogue values of skewness and kurtosis rise even faster. This means that for small time intervals, the earthquake number distribution is even more heavy-tailed than the NBD predicts. Therefore, for small time intervals, we propose using empirical number distributions, appropriately smoothed, for testing forecasted earthquake numbers.
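
    The moment-based shape calculation described above can be sketched in a few lines. This is a minimal illustration, not the paper's code: the (p, r) parameterization of the negative binomial distribution (NBD) and the sample moments in the example are assumptions for demonstration.

```python
import math

def nbd_shape_from_moments(mean, var):
    """Fit NBD parameters to the first two sample moments of earthquake
    counts and return the implied skewness and excess kurtosis.
    Requires over-dispersion (variance > mean); as var -> mean the
    distribution approaches the Poisson/Gaussian limit, where both
    shape values tend to zero."""
    if var <= mean:
        raise ValueError("NBD requires variance > mean")
    p = mean / var                      # NBD success probability
    r = mean ** 2 / (var - mean)        # NBD dispersion parameter
    skew = (2 - p) / math.sqrt(r * (1 - p))
    excess_kurtosis = 6 / r + p ** 2 / (r * (1 - p))
    return p, r, skew, excess_kurtosis

# Illustrative annual counts: mean 100 events, variance 400
p, r, skew, exkurt = nbd_shape_from_moments(100.0, 400.0)
```

    Shorter counting intervals reduce the mean count and hence the fitted dispersion r, so both shape values grow, which is the rapid increase in the upper moments noted above.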

  9. Incorporating rapid neocortical learning of new schema-consistent information into complementary learning systems theory.

    Science.gov (United States)

    McClelland, James L

    2013-11-01

    The complementary learning systems theory of the roles of hippocampus and neocortex (McClelland, McNaughton, & O'Reilly, 1995) holds that the rapid integration of arbitrary new information into neocortical structures is avoided to prevent catastrophic interference with structured knowledge representations stored in synaptic connections among neocortical neurons. Recent studies (Tse et al., 2007, 2011) showed that neocortical circuits can rapidly acquire new associations that are consistent with prior knowledge. The findings challenge the complementary learning systems theory as previously presented. However, new simulations extending those reported in McClelland et al. (1995) show that new information that is consistent with knowledge previously acquired by a putatively cortexlike artificial neural network can be learned rapidly and without interfering with existing knowledge; it is when inconsistent new knowledge is acquired quickly that catastrophic interference ensues. Several important features of the findings of Tse et al. (2007, 2011) are captured in these simulations, indicating that the neural network model used in McClelland et al. has characteristics in common with neocortical learning mechanisms. An additional simulation generalizes beyond the network model previously used, showing how the rate of change of cortical connections can depend on prior knowledge in an arguably more biologically plausible network architecture. In sum, the findings of Tse et al. are fully consistent with the idea that hippocampus and neocortex are complementary learning systems. Taken together, these findings and the simulations reported here advance our knowledge by bringing out the role of consistency of new experience with existing knowledge and demonstrating that the rate of change of connections in real and artificial neural networks can be strongly prior-knowledge dependent. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  10. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    Science.gov (United States)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally-distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of
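
    The tweet-based STA/LTA trigger described in this abstract can be sketched as follows. The window lengths, threshold, and synthetic tweet series are illustrative assumptions, not the tuned USGS parameters.

```python
import numpy as np

def sta_lta_detect(counts, n_sta=3, n_lta=60, threshold=5.0):
    """Flag indices where the short-term average (STA) of per-bin tweet
    counts exceeds `threshold` times the long-term average (LTA).

    `counts` is a 1-D array of tweets per time bin (e.g. per 5 s);
    window lengths and threshold here are illustrative choices."""
    counts = np.asarray(counts, dtype=float)
    triggers = []
    for i in range(n_lta, len(counts)):
        sta = counts[i - n_sta + 1 : i + 1].mean()  # includes current bin
        lta = counts[i - n_lta : i].mean()          # trailing background
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Synthetic series: quiet background with a burst of "earthquake" tweets
rng = np.random.default_rng(0)
series = rng.poisson(2, 200)
series[150:156] += 40    # sudden spike in tweet rate at bin 150
onsets = sta_lta_detect(series)
```

    On this synthetic series the detector fires at the onset of the burst and stays quiet during background chatter; lowering `threshold` catches more events at the cost of more false triggers, mirroring the tuning trade-off the abstract describes.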

  11. Chapter D. The Loma Prieta, California, Earthquake of October 17, 1989 - Aftershocks and Postseismic Effects

    Science.gov (United States)

    Reasenberg, Paul A.

    1997-01-01

    While the damaging effects of the earthquake represent a significant social setback and economic loss, the geophysical effects have produced a wealth of data that have provided important insights into the structure and mechanics of the San Andreas Fault system. Generally, the period after a large earthquake is vitally important to monitor. During this part of the seismic cycle, the primary fault and the surrounding faults, rock bodies, and crustal fluids rapidly readjust in response to the earthquake's sudden movement. Geophysical measurements made at this time can provide unique information about fundamental properties of the fault zone, including its state of stress and the geometry and frictional/rheological properties of the faults within it. Because postseismic readjustments are rapid compared with corresponding changes occurring in the preseismic period, the amount and rate of information that is available during the postseismic period is relatively high. From a geophysical viewpoint, the occurrence of the Loma Prieta earthquake in a section of the San Andreas fault zone that is surrounded by multiple and extensive geophysical monitoring networks has produced nothing less than a scientific bonanza. The reports assembled in this chapter collectively examine available geophysical observations made before and after the earthquake and model the earthquake's principal postseismic effects. The chapter covers four broad categories of postseismic effect: (1) aftershocks; (2) postseismic fault movements; (3) postseismic surface deformation; and (4) changes in electrical conductivity and crustal fluids.

  12. Earthquake Facts

    Science.gov (United States)

    ... to the Atlantic Ocean, around Africa, Asia, and Australia, and under the Pacific Ocean to the west ... are similar to earthquakes, but occur within the ice sheet itself instead of the land underneath the ...

  13. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

    Full Text Available The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars), compared with the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan, and 1995 Kobe earthquakes, shows the increasing concern for economic loss in urban areas, as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global

  14. Earthquake technology fights crime

    Science.gov (United States)

    Lahr, John C.; Ward, Peter L.; Stauffer, Peter H.; Hendley, James W.

    1996-01-01

    Scientists with the U.S. Geological Survey have adapted their methods for quickly finding the exact source of an earthquake to the problem of locating gunshots. On the basis of this work, a private company is now testing an automated gunshot-locating system in a San Francisco Bay area community. This system allows police to rapidly pinpoint and respond to illegal gunfire, helping to reduce crime in our neighborhoods.

  15. Rapid exposure and loss estimates for the May 12, 2008 Mw 7.9 Wenchuan earthquake provided by the U.S. Geological Survey's PAGER system

    Science.gov (United States)

    Earle, P.S.; Wald, D.J.; Allen, T.I.; Jaiswal, K.S.; Porter, K.A.; Hearne, M.G.

    2008-01-01

    One half-hour after the May 12th Mw 7.9 Wenchuan, China earthquake, the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) system distributed an automatically generated alert stating that 1.2 million people were exposed to severe-to-extreme shaking (Modified Mercalli Intensity VIII or greater). It was immediately clear that a large-scale disaster had occurred. These alerts were widely distributed and referenced by the major media outlets and used by government, scientific, and relief agencies to guide their responses. The PAGER alerts and Web pages included predictive ShakeMaps showing estimates of ground shaking, maps of population density, and a list of estimated intensities at impacted cities. Manual, revised alerts were issued in the following hours that included the dimensions of the fault rupture. Within a half-day, PAGER’s estimates of the population exposed to strong shaking levels stabilized at 5.2 million people. A coordinated research effort is underway to extend PAGER’s capability to include estimates of the number of casualties. We are pursuing loss models that will allow PAGER the flexibility to use detailed inventory and engineering results in regions where these data are available, while also calculating loss estimates in regions where little is known about the type and strength of the built infrastructure. Prototype PAGER fatality estimates are currently implemented and can be manually triggered. In the hours following the Wenchuan earthquake, these models predicted fatalities in the tens of thousands.
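
    The core of such an exposure alert is a tally of population against predicted shaking intensity. The sketch below uses two invented, co-registered toy grids; real PAGER exposure combines ShakeMap intensity with gridded population data, not these values.

```python
import numpy as np

# Hypothetical co-registered grids: predicted Modified Mercalli
# Intensity and population for each map cell (toy values only).
mmi = np.array([[5.0, 6.5, 8.2],
                [7.1, 8.9, 9.4],
                [6.0, 7.8, 8.0]])
pop = np.array([[1000, 5000, 20000],
                [8000, 60000, 45000],
                [3000, 12000, 9000]])

# Population exposed to severe-to-extreme shaking (MMI VIII or greater)
exposed_severe = int(pop[mmi >= 8.0].sum())
```

    Repeating the boolean-mask sum at each intensity threshold yields the per-intensity exposure table that a PAGER-style alert reports.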

  16. Earthquakes for Kids

    Science.gov (United States)

    ... dug across a fault to learn about past earthquakes. Science Fair Projects A GPS instrument measures slow movements of the ground. Become an Earthquake Scientist Cool Earthquake Facts Today in Earthquake History ...

  17. Earthquake Hazards Program: Earthquake Scenarios

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A scenario represents one realization of a potential future earthquake by assuming a particular magnitude, location, and fault-rupture geometry and estimating...

  18. Use of basic biological information for rapid prediction of the response of species to habitat loss.

    Science.gov (United States)

    Hockey, Philip A R; Curtis, Odette E

    2009-02-01

    Much research has focused on identifying traits that can act as useful indicators of how habitat loss affects the extinction risk of species, and the results are mixed. We developed 2 simple, rapid-assessment models of the susceptibility of species to habitat loss. We based both on an index of range size, but one also incorporated an index of body mass and the other an index combining habitat and dietary specialization. We applied the models to samples of birds (Accipitridae and Bucerotidae) and to the lemurs of Madagascar and compared the models' classifications of risk with the IUCN's global threat status of each species. The model derived from ecological attributes was much more robust than the one derived from body mass. Ecological attributes identified threatened birds and lemurs with an average of 80% accuracy and endangered and critically endangered species with 100% accuracy and identified some species not currently listed as threatened that almost certainly warrant conservation consideration. Appropriate analysis of even fairly crude biological information can help raise early-warning flags to the relative susceptibilities of species to habitat loss and thus provide a useful and rapid technique for highlighting potential species-level conservation issues. Advantages of this approach to classifying risk include flexibility in the specialization parameters used as well as its applicability at a range of spatial scales.

  19. Lymphatic transport of exosomes as a rapid route of information dissemination to the lymph node.

    Science.gov (United States)

    Srinivasan, Swetha; Vannberg, Fredrik O; Dixon, J Brandon

    2016-04-18

    It is well documented that cells secrete exosomes, which can transfer biomolecules that impact recipient cells' functionality in a variety of physiologic and disease processes. The role of lymphatic drainage and transport of exosomes is as yet unknown, although the lymphatics play critical roles in immunity and exosomes are in the ideal size-range for lymphatic transport. Through in vivo near-infrared (NIR) imaging we have shown that exosomes are rapidly transported within minutes from the periphery to the lymph node by lymphatics. Using an in vitro model of lymphatic uptake, we have shown that lymphatic endothelial cells actively enhanced lymphatic uptake and transport of exosomes to the luminal side of the vessel. Furthermore, we have demonstrated a differential distribution of exosomes in the draining lymph nodes that is dependent on the lymphatic flow. Lastly, through endpoint analysis of cellular distribution of exosomes in the node, we identified macrophages and B-cells as key players in exosome uptake. Together these results suggest that exosome transfer by lymphatic flow from the periphery to the lymph node could provide a mechanism for rapid exchange of infection-specific information that precedes the arrival of migrating cells, thus priming the node for a more effective immune response.

  20. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  1. Early Earthquakes of the Americas

    Science.gov (United States)

    Ni, James

    2004-11-01

    Robert Kovach's second book looks at the interplay of earthquake and volcanic events, archeology, and history in the Americas. Throughout history, major earthquakes have caused the deaths of millions of people and have damaged countless cities. Earthquakes undoubtedly damaged prehistoric cities in the Americas, and evidence of these events could be preserved in archeological records. Kovach asks, Did indigenous native cultures (Indians of the Pacific Northwest, Aztecs, Mayas, and Incas) document their natural history? Some events have been explicitly documented, for example, in Mayan codices, but many may have been recorded as myth and legend. Kovach's discussions of how early cultures dealt with fearful events such as earthquakes and volcanic eruptions are colorful, informative, and entertaining, and include, for example, a depiction of how the Maya would talk to maize plants in their fields during earthquakes to reassure them.

  2. Automated Determination of Magnitude and Source Extent of Large Earthquakes

    Science.gov (United States)

    Wang, Dun

    2017-04-01

    Rapid determination of earthquake magnitude is important for estimating shaking damage and tsunami hazards. However, due to the complexity of the source process, accurately estimating the magnitude of great earthquakes within minutes of origin time is still a challenge. Mw is an accurate measure for large earthquakes, but calculating Mw requires the whole wave train, including P, S, and surface phases, which takes tens of minutes to reach stations at teleseismic distances. To speed up the calculation, methods using the W phase and body waves have been developed for fast estimation of earthquake size. Besides these methods, which involve Green's functions and inversions, there are other approaches that use empirically calibrated relations to estimate earthquake magnitudes, usually for large earthquakes. Their simple implementation and straightforward calculation have led to these approaches being widely applied at institutions such as the Pacific Tsunami Warning Center, the Japan Meteorological Agency, and the USGS. Here we developed an approach, originating from Hara [2007], that estimates magnitude by considering P-wave displacement and source duration. We instead introduced a back-projection technique [Wang et al., 2016] to estimate source duration using array data from a high-sensitivity seismograph network (Hi-net). The introduction of back-projection improves the method in two ways. First, the source duration can be accurately determined by the seismic array. Second, the results can be calculated more rapidly, and data from farther stations are not required. We propose to develop an automated system for determining fast and reliable source information for large shallow seismic events based on real-time data from a dense regional array and global data, for earthquakes that occur at distances of roughly 30°-85° from the array center. This system can offer fast and robust estimates of the magnitudes and rupture extents of large earthquakes in 6 to 13 min (plus
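    The Hara [2007] family of methods regresses magnitude against the logarithms of peak P-wave displacement, epicentral distance, and source duration. A minimal sketch of that functional form follows; the coefficients a-d below are illustrative placeholders, not Hara's published regression values.

```python
import math

def mag_from_p_and_duration(p_disp_m, distance_km, duration_s,
                            a=0.79, b=0.83, c=0.69, d=6.47):
    """M = a*log10(Pd) + b*log10(delta) + c*log10(t) + d
    Coefficients a-d are placeholders for illustration; real values are
    obtained by regression against a catalog of known-magnitude events."""
    return (a * math.log10(p_disp_m)
            + b * math.log10(distance_km)
            + c * math.log10(duration_s) + d)

# e.g. 0.1 mm peak P displacement at 5000 km with a 100 s source duration
print(round(mag_from_p_and_duration(1e-4, 5000.0, 100.0), 2))  # ~7.76
```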

  3. Earthquake Risk Reduction to Istanbul Natural Gas Distribution Network

    Science.gov (United States)

    Zulfikar, Can; Kariptas, Cagatay; Biyikoglu, Hikmet; Ozarpa, Cevat

    2017-04-01

    Earthquake Risk Reduction to Istanbul Natural Gas Distribution Network Istanbul Natural Gas Distribution Corporation (IGDAS) is one of the end users of the Istanbul Earthquake Early Warning (EEW) signal. IGDAS, the primary natural gas provider in Istanbul, operates an extensive system of 9,867 km of gas lines with 750 district regulators and 474,000 service boxes. The natural gas reaches the Istanbul city borders at 70 bar in a 30-inch-diameter steel pipeline. The gas pressure is reduced to 20 bar at RMS stations and distributed to district regulators inside the city. 110 of the 750 district regulators are instrumented with strong-motion accelerometers in order to cut the gas flow during an earthquake if ground-motion parameters exceed certain threshold levels. In addition, state-of-the-art protection systems automatically cut the natural gas flow when breaks in the gas pipelines are detected. IGDAS uses a sophisticated SCADA (supervisory control and data acquisition) system to monitor the state of health of its pipeline network. This system provides real-time information on quantities related to pipeline monitoring, including input and output pressure, drawing information, positions of station and RTU (remote terminal unit) gates, and slam-shut mechanism status at the 750 district regulator sites. The IGDAS real-time earthquake risk reduction algorithm follows 4 stages: 1) Real-time ground-motion data are transmitted from 110 IGDAS and 110 KOERI (Kandilli Observatory and Earthquake Research Institute) acceleration stations to the IGDAS SCADA center and the KOERI data center. 2) During an earthquake, EEW information is sent from the IGDAS SCADA center to the IGDAS stations. 3) Automatic shut-off is applied at IGDAS district regulators, and calculated parameters are sent from the stations to the IGDAS SCADA center and KOERI. 4) Integrated building and gas pipeline damage maps are prepared immediately after the earthquake. Today's technology allows to rapidly estimate the
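    The threshold check at an instrumented district regulator (the automatic shut-off stage described above) can be sketched as below. The two-parameter test and the threshold values are assumptions for illustration only, not IGDAS's actual settings.

```python
def should_shut_off(pga_g, pgv_cm_s, pga_threshold_g=0.25, pgv_threshold_cm_s=20.0):
    """Trip the slam-shut valve if either recorded ground-motion
    parameter exceeds its threshold (threshold values are illustrative)."""
    return pga_g >= pga_threshold_g or pgv_cm_s >= pgv_threshold_cm_s

# Readings from three hypothetical regulator stations:
# (station id, PGA in g, PGV in cm/s)
stations = [("R-041", 0.31, 24.0), ("R-118", 0.08, 5.5), ("R-402", 0.12, 22.3)]
tripped = [sid for sid, pga, pgv in stations if should_shut_off(pga, pgv)]
print(tripped)  # -> ['R-041', 'R-402']
```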

  4. Scenario-based earthquake hazard and risk assessment for Baku (Azerbaijan)

    Directory of Open Access Journals (Sweden)

    G. Babayev

    2010-12-01

    Full Text Available A rapid growth of population, intensive civil and industrial building, land and water instabilities (e.g. landslides, significant underground water level fluctuations), and the lack of public awareness regarding seismic hazard contribute to the increased vulnerability of Baku (the capital city of the Republic of Azerbaijan) to earthquakes. In this study, we assess earthquake risk in the city, determined as a convolution of seismic hazard (in terms of the surface peak ground acceleration, PGA), vulnerability (due to building construction fragility, population features, the gross domestic product per capita, and landslide occurrence), and exposure of infrastructure and critical facilities. The earthquake risk assessment provides useful information to identify the factors influencing the risk. A deterministic seismic hazard for Baku is analysed for four earthquake scenarios: near, far, local, and extreme events. The seismic hazard models demonstrate the level of ground shaking in the city: high PGA values are predicted in the southern coastal and north-eastern parts of the city and in some parts of the downtown. The PGA attains its maximal values for the local and extreme earthquake scenarios. We show that the quality of buildings and the probability of their damage, the distribution of the urban population, exposure, and the pattern of peak ground acceleration contribute to the seismic risk, while the vulnerability factors play a more prominent role for all earthquake scenarios. Our results can support strategic countermeasure plans for earthquake risk mitigation in the city of Baku.
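    In its simplest form, the hazard-vulnerability-exposure "convolution" described above can be approximated cell by cell as a product of normalized indices. The sketch below is a toy simplification; the district names and index values are hypothetical, not those of the Baku study.

```python
def risk_index(hazard, vulnerability, exposure):
    """Cell risk as the product of indices, each pre-scaled to [0, 1].
    A deliberate simplification of the hazard * vulnerability * exposure
    convolution; real studies weight and combine many sub-factors."""
    return hazard * vulnerability * exposure

# Hypothetical districts: (name, hazard, vulnerability, exposure)
districts = [("coastal-south", 0.9, 0.7, 0.8),
             ("downtown", 0.6, 0.8, 0.9),
             ("north-east", 0.8, 0.5, 0.4)]
ranked = sorted(districts, key=lambda d: risk_index(*d[1:]), reverse=True)
print([name for name, *_ in ranked])  # highest-risk district first
```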

  5. LastQuake app: a tool for risk reduction that focuses on earthquakes that really matter to the public!

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.; Frobert, L.

    2015-12-01

    Many seismic events are only picked up by seismometers, but the only earthquakes that really interest the public (and the authorities) are those which are felt by the population. It is not only a magnitude issue; even a small-magnitude earthquake, if widely felt, can create a public desire for information. In LastQuake, felt events are automatically discriminated through the reactions of the population on the Internet, using three different and complementary methods. Twitter earthquake detection, initially developed by the USGS, detects surges in the number of tweets containing the word "earthquake" in different languages. Flashsourcing, developed by EMSC, detects traffic surges caused by eyewitnesses on its website, one of the top global earthquake information websites. Both detections typically happen within 2 minutes of an event's occurrence. Finally, an earthquake is also confirmed as being felt when at least 3 independent felt reports (questionnaires) are collected. LastQuake automatically merges seismic data, direct (crowdsourced) and indirect eyewitness contributions, damage scenarios, and tsunami alerts to provide information on felt earthquakes and their effects in a time ranging from a few tens of seconds to 90 minutes. It is based on visual communication to erase language hurdles; for instance, it crowdsources felt reports through simple cartoons as well as geolocated pictures. It was massively adopted in Nepal within hours of the Gorkha earthquake and collected thousands of felt reports and more than 100 informative pictures. LastQuake is also a seismic risk reduction tool thanks to its very rapid information. When such information does not exist, people tend to call emergency services, crowds emerge, and rumors spread. In its next release, LastQuake will also have "do/don't do" cartoons popping up after an earthquake to encourage appropriate behavior.

  6. Rapid identification of paragonimiasis foci by lay informants in Lao People's Democratic Republic.

    Directory of Open Access Journals (Sweden)

    Peter Odermatt

    Full Text Available BACKGROUND: Paragonimiasis is a food-borne trematodiasis leading to lung disease. Worldwide, an estimated 21 million people are infected. Foci of ongoing transmission often remain unnoticed. We evaluated a simple questionnaire approach using lay informants at the village level to identify paragonimiasis foci and suspected paragonimiasis cases. METHODOLOGY/PRINCIPAL FINDINGS: The study was carried out in an endemic area of the Lao People's Democratic Republic. Leaders of 49 remote villages in northern Vientiane Province were asked to notify suspected paragonimiasis patients using a four-item questionnaire sent through administrative channels: persons responding positively for having chronic cough (more than 3 weeks) and/or blood in sputum, with or without fever. We validated the village leaders' reports in ten representative villages with a door-to-door survey. We examined three sputa from each suspected patient for the presence of Paragonimus eggs and acid-fast bacilli. 91.8% of village leaders participated and notified a total of 220 suspected patients; 76.2% were eventually confirmed; an additional 138 suspected cases were found in the survey. The sensitivity of the village leaders' notice for "chronic cough" and "blood in sputum" was 100%; "blood in sputum" alone reached a sensitivity of 85.7%. SIGNIFICANCE: Our approach led to the identification of three previously unknown foci of transmission. A rapid and simple lay-informant questionnaire approach is a promising low-cost community diagnostic tool for paragonimiasis control programs.

  7. Diffusion of new technology, health services and information after a crisis: a focus group study of the Sichuan "5.12" Earthquake.

    Science.gov (United States)

    Zhou, Hong; Shi, Lu; Mao, Yuping; Tang, Juan; Zeng, Yu

    2014-01-01

    The Sichuan "5.12" Earthquake in 2008 occurred in a relatively underdeveloped area in China. The rainy weather, the mountainous environment and the local languages all posed major challenges to the dissemination of information and services after the disaster. By adopting a communication perspective, this study applies the diffusion of innovations theory to investigate how healthcare professionals diffused health technologies, health information and services during the rescue and relief operation. The authors conducted three focus group sessions with the health professionals who had attended to the rescue and relief work of the Sichuan "5.12" Earthquake in 2008. A range of questions regarding the diffusion of innovations were asked during these sessions. The health professionals used their cell phones to communicate with other healthcare providers, disseminated knowledge of health risks and injuries to affected residents with pamphlets and posters and attended daily meetings at the local government offices. They reported on the shortage of maritime satellite cell phones and large-size tents for medical use, and the absence of fully equipped ambulances. Volunteers, local health professionals and local officials provided health information and services in different ways. However, the diffusion of health information and services was less likely to reach those living next to transportation centers, in remote areas and in disaster areas neglected by the media. New communication devices such as cell phones and the mobile Internet enabled medical professionals to coordinate the rescue and relief work after this major natural disaster, at a time when the country's emergency response system still had plenty of room for improvement. In future, the mobile Internet should be used as a means of collecting bottom-up disaster reports so that the media will not neglect any disaster areas as they did during the Sichuan Earthquake. Rescue relief work would have been substantially

  8. Satellite-based emergency mapping using optical imagery: experience and reflections from the 2015 Nepal earthquakes

    Science.gov (United States)

    Williams, Jack G.; Rosser, Nick J.; Kincey, Mark E.; Benjamin, Jessica; Oven, Katie J.; Densmore, Alexander L.; Milledge, David G.; Robinson, Tom R.; Jordan, Colm A.; Dijkstra, Tom A.

    2018-01-01

    Landslides triggered by large earthquakes in mountainous regions contribute significantly to overall earthquake losses and pose a major secondary hazard that can persist for months or years. While scientific investigations of coseismic landsliding are increasingly common, there is no protocol for rapid (hours-to-days) humanitarian-facing landslide assessment and no published recognition of what is possible and what is useful to compile immediately after the event. Drawing on the 2015 Mw 7.8 Gorkha earthquake in Nepal, we consider how quickly a landslide assessment based upon manual satellite-based emergency mapping (SEM) can be realistically achieved and review the decisions taken by analysts to ascertain the timeliness and type of useful information that can be generated. We find that, at present, many forms of landslide assessment are too slow to generate relative to the speed of a humanitarian response, despite increasingly rapid access to high-quality imagery. Importantly, the value of information on landslides evolves rapidly as a disaster response develops, so identifying the purpose, timescales, and end users of a post-earthquake landslide assessment is essential to inform the approach taken. It is clear that discussions are needed on the form and timing of landslide assessments, and how best to present and share this information, before rather than after an earthquake strikes. In this paper, we share the lessons learned from the Gorkha earthquake, with the aim of informing the approach taken by scientists to understand the evolving landslide hazard in future events and the expectations of the humanitarian community involved in disaster response.

  9. Transcranial Random Noise Stimulation (tRNS) Shapes the Processing of Rapidly Changing Auditory Information

    Directory of Open Access Journals (Sweden)

    Katharina S. Rufener

    2017-06-01

    Full Text Available Neural oscillations in the gamma range are the dominant rhythmic activation pattern in the human auditory cortex. These gamma oscillations are functionally relevant for the processing of rapidly changing acoustic information in both speech and non-speech sounds. Accordingly, there is a tight link between the temporal resolution ability of the auditory system and inherent neural gamma oscillations. Transcranial random noise stimulation (tRNS has been demonstrated to specifically increase gamma oscillation in the human auditory cortex. However, neither the physiological mechanisms of tRNS nor the behavioral consequences of this intervention are completely understood. In the present study we stimulated the human auditory cortex bilaterally with tRNS while EEG was continuously measured. Modulations in the participants’ temporal and spectral resolution ability were investigated by means of a gap detection task and a pitch discrimination task. Compared to sham, auditory tRNS increased the detection rate for near-threshold stimuli in the temporal domain only, while no such effect was present for the discrimination of spectral features. Behavioral findings were paralleled by reduced peak latencies of the P50 and N1 component of the auditory event-related potentials (ERP indicating an impact on early sensory processing. The facilitating effect of tRNS was limited to the processing of near-threshold stimuli while stimuli clearly below and above the individual perception threshold were not affected by tRNS. This non-linear relationship between the signal-to-noise level of the presented stimuli and the effect of stimulation further qualifies stochastic resonance (SR as the underlying mechanism of tRNS on auditory processing. Our results demonstrate a tRNS related improvement in acoustic perception of time critical auditory information and, thus, provide further indices that auditory tRNS can amplify the resonance frequency of the auditory system.

  10. Rapid Integration of Tactile and Visual Information by a Newly Sighted Child.

    Science.gov (United States)

    Chen, Jie; Wu, En-De; Chen, Xin; Zhu, Lu-He; Li, Xiaoman; Thorn, Frank; Ostrovsky, Yuri; Qu, Jia

    2016-04-25

    How we learn to interact with and understand our environment for the first time is an age-old philosophical question. Scientists have long sought to understand what is the origin of egocentric spatial localization and the perceptual integration of touch and visual information. It is difficult to study the beginnings of intermodal visual-motor and visual-tactile linkages in early infancy since infants' muscular strength and control cannot accurately guide visual-motor behavior and they do not concentrate well [1-6]. Alternatively, one can examine young children who have a restored congenital sensory modality loss. They are the best infant substitute if they are old enough for good muscle control and young enough to be within the classic critical period for neuroplasticity [7, 8]. Recovery studies after removal of dense congenital cataracts are examples of this, but most are performed on older subjects [9-14]. We report here the results of video-recorded experiments on a congenitally blind child, beginning immediately after surgical restoration of vision. Her remarkably rapid development of accurate reaching and grasping showed that egocentric spatial localization requires neural circuitry needing less than a half hour of spatially informative experience to be calibrated. 32 hr after first sight, she visually recognized an object that she had simultaneously looked at and held, even though she could not use single senses alone (vision to vision; touch to touch) to perform this recognition until the following day. Then she also performed intersensory transfer of tactile object experience to visual object recognition, demonstrating that the two senses are prearranged to immediately become calibrated to one another. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. The rapid use of gender information: evidence of the time course of pronoun resolution from eyetracking.

    Science.gov (United States)

    Arnold, J E; Eisenband, J G; Brown-Schmidt, S; Trueswell, J C

    2000-07-14

    Eye movements of listeners were monitored to investigate how gender information and accessibility influence the initial processes of pronoun interpretation. Previous studies on this issue have produced mixed results, and several studies have concluded that gender cues are not automatically used during the early processes of pronoun interpretation (e.g. Garnham, A., Oakhill, J. & Cruttenden, H. (1992). The role of implicit causality and gender cue in the interpretation of pronouns. Language and Cognitive Processes, 7 (3/4), 231-255; Greene, S. B., McKoon, G. & Ratcliff, R. (1992). Pronoun resolution and discourse models. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18 (2), 266-283). In the two experiments presented here, participants viewed a picture with two familiar cartoon characters of either the same or different gender. They listened to a text describing the picture, in which a pronoun referred to either the first, more accessible, character or the second. (For example, Donald is bringing some mail to [Mickey/Minnie] while a violent storm is beginning. He's carrying an umbrella….) The results of both experiments show rapid use of both gender and accessibility at approximately 200 ms after the pronoun offset.

  12. A proteomic approach for the rapid, multi-informative and reliable identification of blood.

    Science.gov (United States)

    Patel, E; Cicatiello, P; Deininger, L; Clench, M R; Marino, G; Giardina, P; Langenburg, G; West, A; Marshall, P; Sears, V; Francese, S

    2016-01-07

    Blood evidence is frequently encountered at the scene of violent crimes and can provide valuable intelligence in the forensic investigation of serious offences. Because many of the current enhancement methods used by crime scene investigators are presumptive, the visualisation of blood is not always reliable nor does it bear additional information. In the work presented here, two methods employing a shotgun bottom up proteomic approach for the detection of blood are reported; the developed protocols employ both an in solution digestion method and a recently proposed procedure involving immobilization of trypsin on hydrophobin Vmh2 coated MALDI sample plate. The methods are complementary as whilst one yields more identifiable proteins (as biomolecular signatures), the other is extremely rapid (5 minutes). Additionally, data demonstrate the opportunity to discriminate blood provenance even when two different blood sources are present in a mixture. This approach is also suitable for old bloodstains which had been previously chemically enhanced, as experiments conducted on a 9-year-old bloodstain deposited on a ceramic tile demonstrate.

  13. Rapid identification information and its influence on the perceived clues at a crime scene: An experimental study.

    Science.gov (United States)

    de Gruijter, Madeleine; Nee, Claire; de Poot, Christianne J

    2017-11-01

    Crime scenes can always be explained in multiple ways. Traces alone do not provide enough information to infer the whole series of events that has taken place; they only provide clues for these inferences. CSIs need additional information to be able to interpret observed traces. In the near future, a new source of information that could help to interpret a crime scene and test hypotheses will become available with the advent of rapid identification techniques. A previous study with CSIs demonstrated that this information influenced the interpretation of the crime scene, yet it is still unknown exactly what information was used for this interpretation and for the construction of their scenario. The present study builds on that work and gains more insight into (1) the exact investigative and forensic information that was used by CSIs to construct their scenario, (2) the inferences drawn from this information, and (3) the kind of evidence that was selected at the crime scene to (dis)prove this scenario. We asked 48 CSIs to investigate a potential murder crime scene on the computer and explicate what information they used to construct a scenario and to select traces for analysis. The results show that the introduction of rapid ID information at the start of an investigation contributes to the recognition of different clues at the crime scene, but also to different interpretations of identical information, depending on the kind of information available and the scenario one has in mind. Furthermore, not all relevant traces were recognized, showing that important information can be missed during the investigation. In this study, accurate crime scenarios were mainly built with forensic information, but we should be aware of the fact that crime scenes are always contaminated with unrelated traces and should thus be cautious about the power of rapid ID at the crime scene. Copyright © 2017 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights

  14. Inferring Peak Ground Acceleration (PGA) from observed building damage and EO-derived exposure development to develop rapid loss estimates following the April 2015 Nepal earthquake.

    Science.gov (United States)

    Huyck, C. K.

    2016-12-01

    The April 25th Mw 7.8 Gorkha earthquake in Nepal occurred in an area with very few seismic stations. Ground motions were estimated primarily by ground motion prediction equations (GMPEs) over a very large region, with a very high degree of uncertainty. Accordingly, initial fatality estimates and their distribution were highly uncertain, with a 65% chance of fatalities ranging from 1,000 to 100,000. With the aim of developing estimates of 1) the number of buildings damaged by category (slight, moderate, extensive, complete), 2) fatalities and their distribution, and 3) rebuilding costs, researchers at ImageCat have developed a preliminary inferred peak ground acceleration (PGA) product, in %g. The inferred PGA is determined by using observations of building collapse from the National Geospatial-Intelligence Agency and building exposure estimates derived from EO data to determine the percentage of buildings collapsed at key locations. The percentage of building collapse is adjusted for accuracy and cross-referenced with composite building damage functions for 4 development patterns in Nepal: 1) sparsely populated, 2) rural, 3) dense development, and 4) urban development, to yield an inferred PGA. Composite damage functions are derived from USGS PAGER collapse fragility functions (Jaiswal et al., 2011) and are weighted by building-type frequencies developed by ImageCat. The PGA is interpolated to yield a surface. An initial estimate of the fatalities based on ATC-13 (Rojahn and Sharpe, 1985) using these PGA values yields: extensively damaged or destroyed buildings: 225,000 to 450,000; fatalities: 8,700 to 22,000, with a mean estimate of 15,700. The total number of displaced persons is estimated at between 1 and 2 million. Rebuilding costs for building damage alone are estimated at between 2 and 3 billion USD. The inferred PGA product is recommended for use solely in loss estimation processes.
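    The core inference step, recovering PGA from an observed collapse fraction, amounts to inverting a fragility curve. The sketch below assumes a lognormal collapse fragility with hypothetical median and dispersion parameters; the composite damage functions described above are more elaborate.

```python
import math
from statistics import NormalDist

def infer_pga(collapse_fraction, median_g=0.6, beta=0.6):
    """Invert a lognormal fragility P(collapse | PGA) = Phi(ln(PGA/median)/beta)
    to get the PGA implied by an observed collapse fraction.
    median_g and beta are illustrative fragility parameters, not fitted values."""
    z = NormalDist().inv_cdf(collapse_fraction)
    return median_g * math.exp(beta * z)

# If 20% of buildings of this composite type collapsed in a grid cell:
print(round(infer_pga(0.20), 3))  # ~0.362 g
```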

  15. PAGER - Prompt Assessment of Global Earthquakes for Response

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that estimates the impact of significant earthquakes around the world, informing...

  16. Using a Novel Spatial Tool to Inform Invasive Species Early Detection and Rapid Response Efforts

    Science.gov (United States)

    Davidson, Alisha D.; Fusaro, Abigail J.; Kashian, Donna R.

    2015-07-01

    Management of invasive species has increasingly emphasized the importance of early detection and rapid response (EDRR) programs in limiting introductions, establishment, and impacts. These programs require an understanding of vector and species spatial dynamics to prioritize monitoring sites and efficiently allocate resources. Yet managers often lack the empirical data necessary to make these decisions. We developed an empirical mapping tool that can facilitate development of EDRR programs through identifying high-risk locations, particularly within the recreational boating vector. We demonstrated the utility of this tool in the Great Lakes watershed. We surveyed boaters to identify trips among water bodies and to quantify behaviors associated with high likelihood of species transfer (e.g., not removing organic materials from boat trailers) during that trip. We mapped water bodies with high-risk inbound and outbound boater movements using ArcGIS. We also tested for differences in high-risk behaviors based on demographic variables to understand risk differences among boater groups. Incorporation of boater behavior led to identification of additional high-risk water bodies compared to using the number of trips alone. Therefore, the number of trips itself may not fully reflect the likelihood of invasion. This tool can be broadly applied in other geographic contexts and with different taxa, and can be adjusted according to varying levels of information concerning the vector or species of interest. The methodology is straightforward and can be followed after a basic introduction to ArcGIS software. The visual nature of the mapping tool will facilitate site prioritization by managers and stakeholders from diverse backgrounds.

  17. Risk assessment of people trapped in earthquake based on km grid: a case study of the 2014 Ludian earthquake

    Science.gov (United States)

    Wei, Ben-Yong; Nie, Gao-Zhong; Su, Gui-Wu; Sun, Lei

    2017-04-01

    China is one of the most earthquake-prone countries in the world. The priority during earthquake emergency response is saving lives and minimizing casualties. Rapid judgment of where people are trapped is an important basis for the government to reasonably deploy emergency rescue forces and resources after an earthquake. By analyzing the key factors that cause people to become trapped, we constructed an assessment model of people trapped (PTED) in buildings collapsed by earthquake disaster. Then, taking the 2014 Ludian earthquake as a case, this study evaluated the distribution of trapped people during this earthquake using the assessment model based on km-grid data. Results showed that there are two prerequisites for people to be trapped by building collapse in an earthquake: the earthquake causes buildings to collapse, and people are inside the buildings when they collapse; the PTED model is suitable for assessing people trapped in buildings collapsed by earthquake. The distribution of people trapped by building collapse in the Ludian earthquake assessed by the model is basically the same as that obtained by the actual survey. Assessment of people trapped in an earthquake based on a km grid can meet the requirements of search-and-rescue zone identification and rescue-force allocation in the early stage of earthquake emergency. In the future, as the basic data become more complete, km-grid-based assessment of trapped people should provide more accurate and valid suggestions for earthquake emergency search and rescue.
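A minimal sketch of the kind of km-grid calculation the PTED model implies: for each grid cell, the expected number of trapped people scales with population, building collapse ratio, the share of people indoors at origin time, and a trapped rate. All coefficients below are illustrative placeholders, not the paper's calibrated values:

```python
# Sketch of a km-grid trapped-population estimate (PTED-style).
# Coefficients are placeholders for illustration only.

def trapped_in_cell(population, collapse_ratio, indoor_fraction, trapped_rate):
    """Expected people trapped in one 1-km grid cell."""
    return population * collapse_ratio * indoor_fraction * trapped_rate

# Three hypothetical grid cells: (population, building collapse ratio)
cells = [(1200, 0.20), (800, 0.05), (3000, 0.10)]
indoor_fraction = 0.8   # assumed share of people indoors at origin time
trapped_rate = 0.3      # assumed share of collapsed-building occupants trapped

estimates = [trapped_in_cell(p, c, indoor_fraction, trapped_rate)
             for p, c in cells]
print(estimates)
```

Cells with the highest estimates would be flagged first for search-and-rescue zoning, matching the paper's intended use in the early emergency stage.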

  18. Using Rapid Improvement Events for Disaster After-Action Reviews: Experience in a Hospital Information Technology Outage and Response.

    Science.gov (United States)

    Little, Charles M; McStay, Christopher; Oeth, Justin; Koehler, April; Bookman, Kelly

    2018-02-01

    The use of after-action reviews (AARs) following major emergency events, such as a disaster, is common and mandated for hospitals and similar organizations. There is a recurrent challenge of identified problems not being resolved and being repeated in subsequent events. A process improvement technique called a rapid improvement event (RIE) was used to conduct an AAR following a complete information technology (IT) outage at a large urban hospital. Using RIE methodology to conduct the AAR allowed for the rapid development and implementation of major process improvements to prepare for future IT downtime events. Thus, process improvement methodology, particularly the RIE, is suited for conducting AARs following disasters and holds promise for improving outcomes in emergency management. Little CM, McStay C, Oeth J, Koehler A, Bookman K. Using rapid improvement events for disaster after-action reviews: experience in a hospital information technology outage and response. Prehosp Disaster Med. 2018;33(1):98-100.

  19. What Automaticity Deficit? Activation of Lexical Information by Readers with Dyslexia in a Rapid Automatized Naming Stroop-Switch Task

    Science.gov (United States)

    Jones, Manon W.; Snowling, Margaret J.; Moll, Kristina

    2016-01-01

    Reading fluency is often predicted by rapid automatized naming (RAN) speed, which as the name implies, measures the automaticity with which familiar stimuli (e.g., letters) can be retrieved and named. Readers with dyslexia are considered to have less "automatized" access to lexical information, reflected in longer RAN times compared with…

  20. USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization

    Science.gov (United States)

    Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.

    2012-12-01

    The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a
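The detector described above, a scan for significant increases in "earthquake" tweets relative to background, can be illustrated with a toy sliding-window version (the window length, ratio, and minimum count here are invented thresholds, not the USGS algorithm's actual parameters):

```python
# Toy TED-style detector: flag a minute whose "earthquake" tweet count
# greatly exceeds the recent background rate. Thresholds are illustrative.

from collections import deque

def make_detector(window=10, ratio=5.0, min_count=20):
    history = deque(maxlen=window)  # recent per-minute counts
    def check(count_this_minute):
        background = sum(history) / len(history) if history else 0.0
        history.append(count_this_minute)
        # Require both an absolute floor and a large jump over background.
        return (count_this_minute >= min_count
                and count_this_minute > ratio * max(background, 1.0))
    return check

detect = make_detector()
counts = [2, 3, 1, 2, 4, 180, 90]  # tweets/minute mentioning "earthquake"
alerts = [minute for minute, c in enumerate(counts) if detect(c)]
print(alerts)
```

Note that only the onset minute fires: once the spike enters the background window, the elevated rate no longer looks anomalous, which keeps one felt event from producing repeated alerts.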

  1. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the Century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. 
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  2. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  3. Rapid Response Fault Drilling Past, Present, and Future

    Directory of Open Access Journals (Sweden)

    Demian M. Saffer

    2009-09-01

    New information about large earthquakes can be acquired by drilling into the fault zone quickly following a large seismic event. Specifically, we can learn about the levels of friction and strength of the fault which determine the dynamic rupture, monitor the healing process of the fault, record the stress changes that trigger aftershocks, and capture important physical and chemical properties of the fault that control the rupture process. These scientific and associated technical issues were the focus of a three-day workshop on Rapid Response Fault Drilling: Past, Present, and Future, sponsored by the International Continental Scientific Drilling Program (ICDP) and the Southern California Earthquake Center (SCEC). The meeting drew together forty-four scientists representing ten countries in Tokyo, Japan during November 2008. The group discussed the scientific problems and how they could be addressed through rapid response drilling. Focused talks presented previous work on drilling after large earthquakes and in fault zones in general, as well as the state of the art of experimental techniques and measurement strategies. Detailed discussion weighed the tradeoffs between rapid drilling and the ability to satisfy a diverse range of scientific objectives. Plausible drilling sites and scenarios were evaluated. This is a shortened summary of the workshop report that discusses key scientific questions, measurement strategies, and recommendations. This report can provide a starting point for quickly mobilizing a drilling program following future large earthquakes. The full report can be seen at http://www.pmc.ucsc.edu/~rapid/.

  4. Can mobile phone technology support a rapid sharing of information on novel psychoactive substances among health and other professionals internationally?

    Science.gov (United States)

    Simonato, Pierluigi; Bersani, Francesco S; Santacroce, Rita; Cinosi, Eduardo; Schifano, Fabrizio; Bersani, Giuseppe; Martinotti, Giovanni; Corazza, Ornella

    2017-05-01

    The diffusion of novel psychoactive substances (NPSs), combined with the ability of the Internet to act as an online marketplace, has led to unprecedented challenges for governments, health agencies, and substance misuse services. Despite increasing research, there is a paucity of reliable information available to professionals working in the field. The paper will present the pilot results of the first mobile application (SMAIL) for rapid information sharing on NPSs among health professionals. The development of SMAIL was divided into 2 parts: (a) the creation of the application for registered users, enabling them to send an SMS or email with the name or "street name" of an NPS and receive within seconds emails or SMS with the information, when available; and (b) the development of a database to support the incoming requests. One hundred twenty-two professionals based in 22 countries used the service over the pilot period of 16 months (from May 2012 to September 2013). Five hundred fifty-seven enquiries were made. Users received rapid information on NPSs, and 61% of them rated the service as excellent. This is the right time to use mobile phone technologies for rapid information sharing and prevention activities on NPSs. Copyright © 2017 John Wiley & Sons, Ltd.

  5. Putting down roots in earthquake country-Your handbook for earthquakes in the Central United States

    Science.gov (United States)

    Contributors: Dart, Richard; McCarthy, Jill; McCallister, Natasha; Williams, Robert A.

    2011-01-01

    This handbook provides information to residents of the Central United States about the threat of earthquakes in that area, particularly along the New Madrid seismic zone, and explains how to prepare for, survive, and recover from such events. It explains the need for concern about earthquakes for those residents and describes what one can expect during and after an earthquake. Much is known about the threat of earthquakes in the Central United States, including where they are likely to occur and what can be done to reduce losses from future earthquakes, but not enough has been done to prepare for future earthquakes. The handbook describes such preparations that can be taken by individual residents before an earthquake to be safe and protect property.

  6. Earthquake forewarning in the Cascadia region

    Science.gov (United States)

    Gomberg, Joan S.; Atwater, Brian F.; Beeler, Nicholas M.; Bodin, Paul; Davis, Earl; Frankel, Arthur; Hayes, Gavin P.; McConnell, Laura; Melbourne, Tim; Oppenheimer, David H.; Parrish, John G.; Roeloffs, Evelyn A.; Rogers, Gary D.; Sherrod, Brian; Vidale, John; Walsh, Timothy J.; Weaver, Craig S.; Whitmore, Paul M.

    2015-08-10

    This report, prepared for the National Earthquake Prediction Evaluation Council (NEPEC), is intended as a step toward improving communications about earthquake hazards between information providers and users who coordinate emergency-response activities in the Cascadia region of the Pacific Northwest. NEPEC charged a subcommittee of scientists with writing this report about forewarnings of increased probabilities of a damaging earthquake. We begin by clarifying some terminology; a “prediction” refers to a deterministic statement that a particular future earthquake will or will not occur. In contrast to the 0- or 100-percent likelihood of a deterministic prediction, a “forecast” describes the probability of an earthquake occurring, which may range from >0 to <100 percent. Forewarnings would be based on changes in geologic processes or conditions, which may include increased rates of M>4 earthquakes on the plate interface north of the Mendocino region 

  7. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    Directory of Open Access Journals (Sweden)

    M. Rebuffat

    1995-06-01

    A series of multimedia monographs concerning the main seismic events that have affected the Italian territory are in the process of being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and promote earthquake public education. Producing the monographs, developed in ARC INFO and working in UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: 1) its historical-cultural context (the first destructive seismic event after the unification of Italy); 2) its features (a volcanic earthquake); 3) the socioeconomic consequences it caused at such an important seaside resort.

  8. Thermal Infrared Anomalies of Several Strong Earthquakes

    Directory of Open Access Journals (Sweden)

    Congxin Wei

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies which have been interpreted as preseismic precursor in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after the 8 great earthquakes with magnitude up to Ms7.0 by using the satellite infrared remote sensing information. We used new types of data and method to extract the useful anomaly information. Based on the analyses of 8 earthquakes, we got the results as follows. (1) There are significant thermal radiation anomalies before and after earthquakes for all cases. The overall performance of anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by method of “time-frequency relative power spectrum.” (2) There exist evident and different characteristic periods and magnitudes of thermal abnormal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation has obvious characteristics in abnormal duration, range, and morphology. In summary, we should be sure that earthquake thermal infrared anomalies as useful earthquake precursor can be used in earthquake prediction and forecasting.

  9. Thermal infrared anomalies of several strong earthquakes.

    Science.gov (United States)

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies which have been interpreted as preseismic precursor in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after the 8 great earthquakes with magnitude up to Ms7.0 by using the satellite infrared remote sensing information. We used new types of data and method to extract the useful anomaly information. Based on the analyses of 8 earthquakes, we got the results as follows. (1) There are significant thermal radiation anomalies before and after earthquakes for all cases. The overall performance of anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by method of "time-frequency relative power spectrum." (2) There exist evident and different characteristic periods and magnitudes of thermal abnormal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation has obvious characteristics in abnormal duration, range, and morphology. In summary, we should be sure that earthquake thermal infrared anomalies as useful earthquake precursor can be used in earthquake prediction and forecasting.

  10. Tweeting Earthquakes using TensorFlow

    Science.gov (United States)

    Casarotti, E.; Comunello, F.; Magnoni, F.

    2016-12-01

    The use of social media is emerging as a powerful tool for disseminating trusted information about earthquakes. Since 2009, the Twitter account @INGVterremoti has provided constant and timely details about M2+ seismic events detected by the Italian National Seismic Network, directly connected with the seismologists on duty at Istituto Nazionale di Geofisica e Vulcanologia (INGV). It currently reaches more than 150,000 followers. Nevertheless, since it publishes only manually revised seismic parameters, its timing (approximately 10 to 20 minutes after an event) has come under evaluation. Undeniably, mobile internet, social network sites, and Twitter in particular require a more rapid and "real-time" reaction. During the last 36 months, INGV tested the tweeting of automatic detections of M3+ earthquakes, studying the reliability of the information both in terms of seismological accuracy and from the point of view of communication and social research. A set of quality parameters (i.e., number of seismic stations, gap, relative error of the location) was identified to reduce false alarms and the uncertainty of the automatic detection. We present an experiment to further improve the reliability of this process using TensorFlow™ (an open source software library originally developed by researchers and engineers working on the Google Brain Team within Google's Machine Intelligence research organization).
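The quality-parameter gating described above, publishing an automatic solution only when station count, azimuthal gap, and location error pass thresholds, reduces false alarms independently of any machine-learning layer. A plain-Python sketch with assumed field names and threshold values (not INGV's actual parameters):

```python
# Sketch of quality-gating for auto-published earthquake solutions:
# tweet an automatic location only if its quality parameters pass thresholds.
# Field names and threshold values are assumptions for illustration.

def publishable(event, min_stations=10, max_gap_deg=180.0, max_loc_err_km=5.0):
    """True if an automatic solution is reliable enough to broadcast."""
    return (event["stations"] >= min_stations
            and event["azimuthal_gap"] <= max_gap_deg
            and event["location_error_km"] <= max_loc_err_km)

good = {"stations": 24, "azimuthal_gap": 95.0, "location_error_km": 1.2}
poor = {"stations": 6, "azimuthal_gap": 240.0, "location_error_km": 9.8}
print(publishable(good), publishable(poor))
```

A learned classifier, as in the TensorFlow experiment, would replace these hard thresholds with a decision boundary fitted to past automatic-versus-revised solutions, but the inputs are the same quality parameters.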

  11. Rapid Evidence Assessments of Research to Inform Social Policy: Taking Stock and Moving Forward

    Science.gov (United States)

    Thomas, James; Newman, Mark; Oliver, Sandy

    2013-01-01

    There is a tension between conducting comprehensive systematic reviews and completing them in time to meet policy-making deadlines. The "rapid evidence assessment" has been proposed as a solution to this; offering rigorous reviews in a condensed timescale. While used frequently in healthcare, this mode of reviewing presents considerable…

  12. Transit Reliability Information Program : Reliability Verification Demonstration Plan for Rapid Rail Vehicles

    Science.gov (United States)

    1981-08-01

    The Transit Reliability Information Program (TRIP) is a government-initiated program to assist the transit industry in satisfying its need for transit reliability information. TRIP provides this assistance through the operation of a national Data Ban...

  13. April 25, 2015, Gorkha Earthquake, Nepal and Sequence of Aftershocks: Key Lessons

    Science.gov (United States)

    Guragain, R.; Dixit, A. M.; Shrestha, S. N.

    2015-12-01

    The Gorkha Earthquake of M7.8 hit Nepal on April 25, 2015 at 11:56 am local time. The epicenter of this earthquake was Barpak, Gorkha, 80 km northwest of Kathmandu Valley. The main shock was followed by hundreds of aftershocks, including M6.6 and M6.7 within 48 hours and M7.3 on May 12, 2015. According to the Government of Nepal, a total of 8,686 people lost their lives, 16,808 people were injured, over 500,000 buildings completely collapsed, and more than 250,000 buildings were partially damaged. The National Society for Earthquake Technology - Nepal (NSET), a not-for-profit civil society organization that has focused on earthquake risk reduction in Nepal for the past 21 years, conducted various activities to support people and the government in responding to the earthquake disaster. The activities included: i) assisting people and critical facility institutions to conduct rapid visual building damage assessment, including training; ii) an information campaign to provide proper information regarding earthquake safety; iii) supporting rescue organizations in search and rescue operations; and iv) providing technical support to common people on repair and retrofit of damaged houses. NSET is also involved in carrying out studies related to earthquake damage, geotechnical problems, and causes of building damage. Additionally, NSET has done post-earthquake detailed damage assessment of buildings throughout the affected areas. Prior to the earthquake, NSET had been working with several institutions to improve the seismic performance of school buildings, private residential houses, and other critical structures. Such activities implemented during the past decade have shown the effectiveness of risk reduction. Retrofitted school buildings performed very well during the earthquake. Preparedness activities implemented at community levels have helped communities to respond immediately and save lives. Higher level of earthquake awareness achieved including safe behavior, better understanding of

  14. 75 FR 50749 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Science.gov (United States)

    2010-08-17

    ... National Institute of Standards and Technology Advisory Committee on Earthquake Hazards Reduction Meeting... meeting. SUMMARY: The Advisory Committee on Earthquake Hazards Reduction (ACEHR or Committee), will meet....m. The primary purpose of this meeting is to receive information on NEHRP earthquake related...

  15. Lessons of L'Aquila for Operational Earthquake Forecasting

    Science.gov (United States)

    Jordan, T. H.

    2012-12-01

    and failures-to-predict. The best way to achieve this separation is to use probabilistic rather than deterministic statements in characterizing short-term changes in seismic hazards. The ICEF recommended establishing OEF systems that can provide the public with open, authoritative, and timely information about the short-term probabilities of future earthquakes. Because the public needs to be educated into the scientific conversation through repeated communication of probabilistic forecasts, this information should be made available at regular intervals, during periods of normal seismicity as well as during seismic crises. In an age of nearly instant information and high-bandwidth communication, public expectations regarding the availability of authoritative short-term forecasts are rapidly evolving, and there is a greater danger that information vacuums will spawn informal predictions and misinformation. L'Aquila demonstrates why the development of OEF capabilities is a requirement, not an option.

  16. USGS remote sensing coordination for the 2010 Haiti earthquake

    Science.gov (United States)

    Duda, Kenneth A.; Jones, Brenda

    2011-01-01

    In response to the devastating 12 January 2010, earthquake in Haiti, the US Geological Survey (USGS) provided essential coordinating services for remote sensing activities. Communication was rapidly established between the widely distributed response teams and data providers to define imaging requirements and sensor tasking opportunities. Data acquired from a variety of sources were received and archived by the USGS, and these products were subsequently distributed using the Hazards Data Distribution System (HDDS) and other mechanisms. Within six weeks after the earthquake, over 600,000 files representing 54 terabytes of data were provided to the response community. The USGS directly supported a wide variety of groups in their use of these data to characterize post-earthquake conditions and to make comparisons with pre-event imagery. The rapid and continuing response achieved was enabled by existing imaging and ground systems, and skilled personnel adept in all aspects of satellite data acquisition, processing, distribution and analysis. The information derived from image interpretation assisted senior planners and on-site teams to direct assistance where it was most needed.

  17. Patterns of fault interactions triggered by micro earthquake activity

    OpenAIRE

    V. Mouslopoulou; D. Hristopulos

    2010-01-01

    Historical earthquakes are often strongly clustered in space and time. This clustering has been attributed to static stress triggering associated with tectonic fault interactions and/or fluid migration. Discrimination between these two models requires detailed information on the timing, location and size of earthquakes. The Matata earthquake sequence, which occurred within the active Taupo Rift in New Zealand, provides a unique opportunity to chart spatial and temporal patterns of earthquakes...

  18. Rapid Ethical Assessment on Informed Consent Content and Procedure in Hintalo-Wajirat, Northern Ethiopia: A Qualitative Study.

    Science.gov (United States)

    Abay, Serebe; Addissie, Adamu; Davey, Gail; Farsides, Bobbie; Addissie, Thomas

    2016-01-01

    Informed consent is a key component of bio-medical research involving human participants. However, obtaining informed consent is challenging in low literacy and resource limited settings. Rapid Ethical Assessment (REA) can be used to contextualize and simplify consent information within a given study community. The current study aimed to explore the effects of social, cultural, and religious factors during informed consent process on a proposed HPV-serotype prevalence study. A qualitative community-based REA was conducted in Adigudom and Mynebri Kebeles, Northern Ethiopia, from July to August 2013. Data were collected by a multi-disciplinary team using open ended questions concerning informed consent components in relation to the parent study. The team conducted one-to-one In-Depth Interviews (IDI) and Focus Group Discussions (FGDs) with key informants and community members to collect data based on the themes of the study. Tape recorded data were transcribed in Tigrigna and then translated into English. Data were categorized and thematically analyzed using open coding and content analysis based on pre-defined themes. The REA study revealed a number of socio-cultural issues relevant to the proposed study. Low community awareness about health research, participant rights and cervical cancer were documented. Giving a vaginal sample for testing was considered to be highly embarrassing, whereas giving a blood sample made participants worry that they might be given a result without the possibility of treatment. Verbal consent was preferred to written consent for the proposed study. This rapid ethical assessment disclosed important socio-cultural issues which might act as barriers to informed decision making. The findings were important for contextual modification of the Information Sheet, and to guide the best consent process for the proposed study. 
Both are likely to have enabled participants to understand the informed consent better and consequently to comply with the

  19. Rapid Ethical Assessment on Informed Consent Content and Procedure in Hintalo-Wajirat, Northern Ethiopia: A Qualitative Study.

    Directory of Open Access Journals (Sweden)

    Serebe Abay

    Informed consent is a key component of bio-medical research involving human participants. However, obtaining informed consent is challenging in low literacy and resource limited settings. Rapid Ethical Assessment (REA) can be used to contextualize and simplify consent information within a given study community. The current study aimed to explore the effects of social, cultural, and religious factors during informed consent process on a proposed HPV-serotype prevalence study. A qualitative community-based REA was conducted in Adigudom and Mynebri Kebeles, Northern Ethiopia, from July to August 2013. Data were collected by a multi-disciplinary team using open ended questions concerning informed consent components in relation to the parent study. The team conducted one-to-one In-Depth Interviews (IDI) and Focus Group Discussions (FGDs) with key informants and community members to collect data based on the themes of the study. Tape recorded data were transcribed in Tigrigna and then translated into English. Data were categorized and thematically analyzed using open coding and content analysis based on pre-defined themes. The REA study revealed a number of socio-cultural issues relevant to the proposed study. Low community awareness about health research, participant rights and cervical cancer were documented. Giving a vaginal sample for testing was considered to be highly embarrassing, whereas giving a blood sample made participants worry that they might be given a result without the possibility of treatment. Verbal consent was preferred to written consent for the proposed study. This rapid ethical assessment disclosed important socio-cultural issues which might act as barriers to informed decision making. The findings were important for contextual modification of the Information Sheet, and to guide the best consent process for the proposed study. Both are likely to have enabled participants to understand the informed consent better and consequently to

  20. Performance of a low-cost earthquake early warning system (P-alert) during the 2016 ML 6.4 Meinong (Taiwan) earthquake

    Science.gov (United States)

    Wu, Y. M.

    2016-12-01

    On February 5, 2016, a moderate earthquake occurred in Southwestern Taiwan with ML 6.4 and a focal depth of 16.7 km. This earthquake caused damage to a few buildings and 117 casualties. A low-cost earthquake early warning (EEW) system (P-alert) is in operation for the purpose of EEW and providing near real-time shake maps. During this event, a detailed shaking map was generated by the P-alert system within two minutes after the earthquake occurrence, and high shaking regions strongly correlated with the locations in which the damage and casualties occurred. In the field, individual P-alert devices also serve as onsite EEW systems using P-wave information. The individual P-alert provided a 4 to 8 s lead time before the arrival of violent shaking in damaged regions. For regional EEW, both the Central Weather Bureau (CWB, official agency) and the P-alert system responded very well. Currently, regional warnings in Taiwan are only provided to cities at epicentral distances of 50 km or more by the CWB. For cities within a 50 km epicentral distance, the P-alert system could be useful for providing onsite EEW. The performance of the P-alert network during this earthquake proves the efficiency of this real-time, low-cost network in terms of early warning (regional and onsite), near real-time shake maps, rapid reports and strong motion data for research purposes.
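The onsite lead time reported above comes from the gap between P-wave detection and the later arrival of strong S-wave shaking. A minimal sketch of that arithmetic, with illustrative wave speeds and processing delay (not values from the P-alert system):

```python
# Hedged sketch: estimating the onsite warning lead time before strong
# shaking arrives, as in the P-alert onsite EEW described above.
# Velocities and processing delay are illustrative assumptions.

def warning_lead_time(hypocentral_km, vp=6.0, vs=3.5, processing_s=1.0):
    """Seconds between a P-wave-based alert and the S-wave arrival."""
    t_p = hypocentral_km / vp          # P-wave travel time (s)
    t_s = hypocentral_km / vs          # S-wave travel time (s)
    return t_s - t_p - processing_s    # lead time after processing delay

# A site roughly 40 km from the hypocenter:
lead = warning_lead_time(40.0)
```

Under these assumed values a 40 km site gets a few seconds of warning, broadly consistent with the 4 to 8 s lead times quoted in the abstract.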

  1. GeoNet's `Felt Rapid': Collecting What Is Needed, When You Need It, No More, No Less. Rapid, Voluminous Data For Response Versus Detailed, Precise Data For Research

    Science.gov (United States)

    Little, C. L.; McBride, S.; Balfour, N.

    2016-12-01

    New Zealand's geohazard monitoring agency, GeoNet, recently implemented `Felt Rapid': earthquake felt reporting that is quick and simple. GeoNet locates 20,000 earthquakes each year, with hundreds of those reported as felt. Since the late 1800s, the New Zealand public has become adept at completing felt reports, but feedback since the Canterbury Earthquake Sequence suggested that traditional felt reporting was not meeting researchers' or the public's needs. GeoNet required something rapid, adaptable and robust. The solution was Felt Rapid, a mobile app and website where respondents simply pick the one of 6 cartoon images - representing Modified Mercalli Intensity (MMI) 3-8 - that best aligns with what they felt. For the last decade, felt reporting had been conducted via the GeoNet website, with additional targeted surveys after damaging earthquakes. The vast majority of the submitted felt reports were for earthquakes too small to cause damage, as these are by far the most frequent. Reports from small events are of little interest to researchers, who are only concerned with damaging shaking of MMI 6 and above. However, we found that when damaging earthquakes did occur, such as Christchurch's M6.3, they were only sparsely reported (3,776 reports). Understandably, sitting at a computer and completing a lengthy online form wasn't a priority for people after a devastating earthquake. Because Felt Rapid reporting has to be completed within an hour of an earthquake, the use of GeoNet's automatically compiled felt-reporting maps has evolved: their main purpose is now immediate assessment of an earthquake's impact on populations, and they are used by Civil Defence agencies. Reports are immediately displayed on an interactive map via the website and mobile app. With over 250,000 users, this provides rapid and robust information regarding the experienced shaking. 
When a damaging earthquake occurs and researchers want to collect important and rare damaging felt reports, a separate in-depth survey
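The cartoon-based picker described in this record maps 6 images onto MMI 3-8 and feeds an automatically compiled shaking map. A minimal sketch of that pipeline; the function and field names are illustrative, not GeoNet's actual API:

```python
# Hedged sketch: convert a cartoon choice (1-6) to MMI (3-8) and
# aggregate reports into a per-cell median intensity for a rapid map.
from collections import defaultdict
from statistics import median

def cartoon_to_mmi(choice):
    """Map a cartoon index (1-6) to Modified Mercalli Intensity (3-8)."""
    if not 1 <= choice <= 6:
        raise ValueError("choice must be 1-6")
    return choice + 2

def grid_intensity(reports):
    """Median reported MMI per map cell; reports are (cell_id, choice)."""
    cells = defaultdict(list)
    for cell, choice in reports:
        cells[cell].append(cartoon_to_mmi(choice))
    return {cell: median(v) for cell, v in cells.items()}
```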

  2. Construction of cryptographic information protection in automated control systems for rapid reaction military forces

    Directory of Open Access Journals (Sweden)

    Sergey Petrovich Evseev

    2012-04-01

    Full Text Available New approaches to the conduct of military operations are analyzed. The main factors that directly affect the construction and operation of information security subsystems in prospective automated command and control military systems are described. Possible ways of constructing cryptographic information protection subsystems in automated operation management systems for united military force groups are investigated.

  3. Small and large earthquakes: evidence for a different rupture beginning

    Science.gov (United States)

    Festa, G.; Colombelli, S.; Zollo, A.; Picozzi, M.

    2014-12-01

    The process of earthquake rupture nucleation and propagation has been investigated through laboratory experiments and theoretical modelling, but a limited number of observations exist at the scale of earthquake fault zones. Distinct models have been proposed, and whether the magnitude can be predicted while the rupture is ongoing remains an unsolved question. The ability to correctly distinguish a small shock from a large event through analysis of the first P-wave observation is crucial for risk mitigation actions triggered by earthquake early warning systems. Here we show that the evolution of the P-wave peak displacement with time is informative regarding the early stage of the rupture process and can be used as a proxy for the final size of the rupture. In the present study, we measure the peak displacement amplitude of filtered P-wave signals over a progressively expanding P-wave time window, starting from the P-wave onset time and expanding the window until the expected arrival of the S-waves. We use a large, high-quality dataset of 43 moderate-to-strong Japanese events in the magnitude range between 4 and 9. We analyzed more than 7,000 three-component waveforms recorded at 1,208 stations, spanning a wide distance range (0-500 km). We study the relationship between the time evolution of the peak displacement and the earthquake magnitude itself to investigate possible different scaling for small and large events. For the analyzed earthquake set, we found that the initial evolution of peak displacement differs between small and large earthquakes. In particular, we show a rapid initial increase of the peak displacement for small events and a slower growth for larger ones. The figure shows the average values of P-wave peak displacement for some representative events, while the inset box shows the expected initial slope of the curves for different magnitudes. This result suggests that the evolution of P-wave peak displacement holds information
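The expanding-window measurement described above can be sketched in a few lines. This is a hedged illustration of the general technique, not the authors' processing code; the sampling rate and window steps are assumptions:

```python
# Hedged sketch: peak absolute displacement over a progressively
# expanding window starting at the P-wave onset.
import numpy as np

def expanding_peak_displacement(disp, p_onset_idx, fs=100.0,
                                max_s=3.0, step_s=0.5):
    """Return (window_lengths_s, peak_disp) arrays from the P onset."""
    windows = np.arange(step_s, max_s + step_s / 2, step_s)
    peaks = [np.max(np.abs(disp[p_onset_idx:p_onset_idx + int(w * fs)]))
             for w in windows]
    return windows, np.array(peaks)
```

By construction the peak is non-decreasing as the window grows; it is the *rate* of that growth that the study relates to magnitude.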

  4. Invasive species information networks: Collaboration at multiple scales for prevention, early detection, and rapid response to invasive alien species

    Science.gov (United States)

    Simpson, Annie; Jarnevich, Catherine S.; Madsen, John; Westbrooks, Randy G.; Fournier, Christine; Mehrhoff, Les; Browne, Michael; Graham, Jim; Sellers, Elizabeth A.

    2009-01-01

    Accurate analysis of present distributions and effective modeling of future distributions of invasive alien species (IAS) are both highly dependent on the availability and accessibility of occurrence data and natural history information about the species. Invasive alien species monitoring and detection networks (such as the Invasive Plant Atlas of New England and the Invasive Plant Atlas of the MidSouth) generate occurrence data at local and regional levels within the United States, which are shared through the US National Institute of Invasive Species Science. The Inter-American Biodiversity Information Network's Invasives Information Network (I3N), facilitates cooperation on sharing invasive species occurrence data throughout the Western Hemisphere. The I3N and other national and regional networks expose their data globally via the Global Invasive Species Information Network (GISIN). International and interdisciplinary cooperation on data sharing strengthens cooperation on strategies and responses to invasions. However, limitations to effective collaboration among invasive species networks leading to successful early detection and rapid response to invasive species include: lack of interoperability; data accessibility; funding; and technical expertise. This paper proposes various solutions to these obstacles at different geographic levels and briefly describes success stories from the invasive species information networks mentioned above. Using biological informatics to facilitate global information sharing is especially critical in invasive species science, as research has shown that one of the best indicators of the invasiveness of a species is whether it has been invasive elsewhere. Data must also be shared across disciplines because natural history information (e.g. diet, predators, habitat requirements, etc.) about a species in its native range is vital for effective prevention, detection, and rapid response to an invasion. Finally, it has been our

  5. Rapid detection of a norovirus pseudo-outbreak by using real-time sequence based information

    NARCIS (Netherlands)

    Rahamat-Langendoen, J. C.; Lokate, M.; Scholvinck, E. H.; Friedrich, A. W.; Niesters, H. G. M.

    Background: Sequence-based information is increasingly used to study the epidemiology of viruses, not only to provide insight into viral evolution, but also to understand transmission patterns during outbreaks. However, sequence analysis is not yet routinely performed by diagnostic laboratories,

  6. Evidence of Rapid Modulation by Social Information of Subjective, Physiological, and Neural Responses to Emotional Expressions

    Directory of Open Access Journals (Sweden)

    Martial Mermillod

    2018-01-01

    Full Text Available Recent research suggests that conceptual or emotional factors can influence the perceptual processing of stimuli. In this article, we aimed to evaluate the effect of social information (positive, negative, or no information related to the character of the target) on subjective (perceived and felt valence and arousal), physiological (facial mimicry), and neural (P100 and N170) responses to dynamic emotional facial expressions (EFE) that varied from neutral to one of the six basic emotions. Across three studies, the results showed reduced ratings of valence and arousal of EFE associated with incongruent social information (Study 1), increased electromyographical responses (Study 2), and significant modulation of P100 and N170 components (Study 3) when EFE were associated with social (positive and negative) information (vs. no information). These studies revealed that positive or negative social information reduces subjective responses to incongruent EFE and produces a similar neural and physiological boost of the early perceptual processing of EFE irrespective of their congruency. In conclusion, the article suggests that the presence of a positive or negative social context modulates early physiological and neural activity preceding subsequent behavior.

  7. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    Science.gov (United States)

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.
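A brief worked aside on the probabilistic maps mentioned above: a statement like "ground motion with 2% probability of exceedance in 50 years" corresponds to a mean return period, under the standard Poisson-occurrence assumption. A minimal sketch of the conversion:

```python
# Convert "probability of exceedance in t years" to a mean return
# period T, assuming Poisson occurrence: P = 1 - exp(-t / T).
import math

def return_period_years(p_exceed, t_years):
    """Mean return period for a given exceedance probability."""
    return -t_years / math.log(1.0 - p_exceed)

# The common "2% in 50 years" hazard level:
T = return_period_years(0.02, 50.0)   # about 2,475 years
```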

  8. @INGVterremoti: Tweeting the Automatic Detection of Earthquakes

    Science.gov (United States)

    Casarotti, E.; Amato, A.; Comunello, F.; Lauciani, V.; Nostro, C.; Polidoro, P.

    2014-12-01

    The use of social media is emerging as a powerful tool for disseminating trusted information about earthquakes. Since 2009, the Twitter account @INGVterremoti has provided constant and timely details about M2+ seismic events detected by the Italian National Seismic Network, directly connected with the seismologists on duty at Istituto Nazionale di Geofisica e Vulcanologia (INGV). After the 2012 seismic sequence, the account was awarded a national prize as the "most useful Twitter account". Currently, it updates more than 110,000 followers (one of the first 50 Italian Twitter accounts by number of followers). Nevertheless, since it provides only the manual revision of seismic parameters, the timing (approximately between 10 and 20 minutes after an event) has come under evaluation. Undeniably, mobile internet, social network sites, and Twitter in particular require a more rapid and "real-time" reaction. During the last 18 months, INGV tested the tweeting of the automatic detection of M3+ earthquakes, obtaining results reliable enough to be released openly 1 or 2 minutes after a seismic event. During the summer of 2014, INGV, with the collaboration of CORIS (Department of Communication and Social Research, Sapienza University of Rome), involved the followers of @INGVterremoti and citizens, carrying out a quali-quantitative study (through in-depth interviews and a web survey) in order to evaluate the best format to deliver such information. In this presentation we will illustrate the results of the reliability test and the analysis of the survey.
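The release policy described above (automatic solutions tweeted quickly only above a magnitude threshold, smaller events waiting for manual revision) can be sketched as a simple decision rule. Thresholds and names are illustrative assumptions, not INGV's actual implementation:

```python
# Hedged sketch of a tiered tweet-release policy for detections.

def release_channel(magnitude, reviewed):
    """Decide how a detection is published."""
    if reviewed:
        return "tweet_revised"       # manual solution, M2+, ~10-20 min
    if magnitude >= 3.0:
        return "tweet_automatic"     # automatic solution, ~1-2 min
    return "hold_for_review"         # small events wait for analysts
```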

  9. A rapid estimation of near field tsunami run-up

    Science.gov (United States)

    Riquelme, Sebastian; Fuentes, Mauricio; Hayes, Gavin; Campos, Jaime

    2015-01-01

    Many efforts have been made to quickly estimate the maximum run-up height of tsunamis associated with large earthquakes. This is a difficult task because of the time it takes to construct a tsunami model using real-time data from the source. It is possible to construct a database of potential seismic sources and their corresponding tsunamis a priori. However, such models are generally based on uniform slip distributions and thus oversimplify the knowledge of the earthquake source. Here, we show how to predict tsunami run-up from any seismic source model using an analytic solution that was specifically designed for subduction zones with a well-defined geometry, i.e., Chile, Japan, Nicaragua, Alaska. The main idea of this work is to provide a tool for emergency response, trading off accuracy for speed. The solutions we present for large earthquakes appear promising. Here, run-up models are computed for: the 1992 Mw 7.7 Nicaragua Earthquake, the 2001 Mw 8.4 Perú Earthquake, the 2003 Mw 8.3 Hokkaido Earthquake, the 2007 Mw 8.1 Perú Earthquake, the 2010 Mw 8.8 Maule Earthquake, the 2011 Mw 9.0 Tohoku Earthquake and the recent 2014 Mw 8.2 Iquique Earthquake. The maximum run-up estimations are consistent with measurements made inland after each event, with peaks of 9 m for Nicaragua, 8 m for Perú (2001), 32 m for Maule, 41 m for Tohoku, and 4.1 m for Iquique. Considering recent advances made in the analysis of real-time GPS data and the ability to rapidly resolve the finiteness of a large earthquake close to existing GPS networks, it will be possible in the near future to perform these calculations within the first minutes after the occurrence of similar events. Thus, such calculations will provide faster run-up information than is available from existing uniform-slip seismic source databases or past events of pre-modeled seismic sources.
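The paper's analytic run-up solution is not reproduced here. As a generic illustration of the kind of fast amplitude estimate such tools rely on, Green's law gives the shoaling amplification of a long wave moving from depth h1 to shallower depth h2; the values below are illustrative assumptions:

```python
# Green's law: long-wave height grows as (h1 / h2) ** 0.25 when
# shoaling from depth h1 to depth h2. A generic rapid-estimation aside,
# not the analytic solution of the paper above.

def greens_law_height(offshore_height_m, h1_m, h2_m):
    """Shoaled wave height from depth h1 to shallower depth h2."""
    return offshore_height_m * (h1_m / h2_m) ** 0.25

# A 1 m wave over 4,000 m depth reaching 10 m depth:
h = greens_law_height(1.0, 4000.0, 10.0)   # ~4.5 m
```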

  10. Tohoku earthquake: a surprise?

    CERN Document Server

    Kagan, Yan Y

    2011-01-01

    We consider three issues related to the 2011 Tohoku mega-earthquake: (1) how to evaluate the earthquake maximum size in subduction zones, (2) what is the repeat time for the largest earthquakes in Tohoku area, and (3) what are the possibilities of short-term forecasts during the 2011 sequence. There are two quantitative methods which can be applied to estimate the maximum earthquake size: a statistical analysis of the available earthquake record and the moment conservation principle. The latter technique studies how much of the tectonic deformation rate is released by earthquakes. For the subduction zones, the seismic or historical record is not sufficient to provide a reliable statistical measure of the maximum earthquake. The moment conservation principle yields consistent estimates of maximum earthquake size: for all the subduction zones the magnitude is of the order 9.0--9.7, and for major subduction zones the maximum earthquake size is statistically indistinguishable. Starting in 1999 we have carried out...
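The moment conservation principle mentioned above can be illustrated with a back-of-the-envelope calculation: compare the tectonic moment accumulation rate on a subduction interface with the moment of one great earthquake. All parameter values below (rigidity, fault dimensions, convergence rate, coupling) are illustrative assumptions, not the paper's estimates:

```python
# Hedged sketch of the moment conservation principle: years needed to
# accumulate the moment of one Mw event on a coupled interface.

def seismic_moment(mw):
    """Scalar seismic moment (N*m) from moment magnitude."""
    return 10 ** (1.5 * mw + 9.1)

def recurrence_years(mw, mu=4e10, length_m=5e5, width_m=2e5,
                     rate_m_per_yr=0.08, coupling=0.5):
    """Years to accumulate the moment of one Mw event (assumed values)."""
    moment_rate = coupling * mu * length_m * width_m * rate_m_per_yr
    return seismic_moment(mw) / moment_rate

t = recurrence_years(9.0)   # a few hundred years under these assumptions
```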

  11. 7th U.S. / Japan Natural Resources (UJNR) Panel on Earthquake Research: Abstract Volume and Technical Program

    Science.gov (United States)

    Detweiler, Shane T.; Ellsworth, William L.

    2008-01-01

    The U.S. / Japan Natural Resources (UJNR) Panel on Earthquake Research promotes advanced study toward a more fundamental understanding of the earthquake process and hazard estimation. The Panel promotes basic and applied research to improve our understanding of the causes and effects of earthquakes and to facilitate the transmission of research results to those who implement hazard reduction measures on both sides of the Pacific and around the world. Meetings are held every other year and alternate between the two countries, with short presentations on current research and local field trips as the highlights. The 5th Joint Panel meeting was held at Asilomar, California in October 2004. The technical sessions featured reports on the September 28, 2004 Parkfield, California earthquake, progress on earthquake early warning and rapid post-event assessment technology, probabilistic earthquake forecasting, and the newly discovered phenomenon of nonvolcanic tremor. The Panel visited the epicentral region of the M 6.0 Parkfield earthquake and viewed the surface ruptures along the San Andreas Fault. They also visited the San Andreas Fault Observatory at Depth (SAFOD), which had just completed the first phase of drilling into the fault. The 6th Joint Panel meeting was held in Tokushima, Japan in November 2006. The meeting included very productive exchanges of information on approaches to systematic observation of earthquake processes. Sixty-eight technical papers were presented during the meeting on a wide range of subjects, including interplate earthquakes in subduction zones, slow slip and nonvolcanic tremor, crustal deformation, recent earthquake activity and hazard mapping. Through our discussions, we reaffirmed the benefits of working together to achieve our common goal of reducing earthquake hazard, and continued cooperation on issues involving densification of observation networks and the open exchange of data among scientific communities. 
We also reaffirmed the importance of

  12. ELER software – a new tool for urban earthquake loss assessment

    Directory of Open Access Journals (Sweden)

    U. Hancilar

    2010-12-01

    Full Text Available Rapid loss estimation after potentially damaging earthquakes is critical for effective emergency response and public information. A methodology and software package, ELER (Earthquake Loss Estimation Routine), for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region was developed under the Joint Research Activity-3 (JRA3) of the EC FP6 project entitled "Network of Research Infrastructures for European Seismology" (NERIES). Recently, a new version (v2.0) of the ELER software has been released. The multi-level methodology developed is capable of incorporating regional variability and uncertainty originating from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. Although primarily intended for quasi-real-time estimation of earthquake shaking and losses, the routine is equally capable of scenario-based earthquake loss assessments.

    This paper introduces the urban earthquake loss assessment module (Level 2) of the ELER software, which makes use of the most detailed inventory databases of physical and social elements at risk in combination with analytical vulnerability relationships and building damage-related casualty vulnerability models for the estimation of building damage and casualty distributions, respectively. The spectral capacity-based loss assessment methodology and its vital components are presented. The analysis methods of the Level 2 module, i.e. the Capacity Spectrum Method (ATC-40, 1996), the Modified Acceleration-Displacement Response Spectrum Method (FEMA 440, 2005), the Reduction Factor Method (Fajfar, 2000) and the Coefficient Method (ASCE 41-06, 2006), are applied to the selected building types for validation and verification purposes. The damage estimates are compared to the results obtained from other studies available in the literature, i.e. SELENA v4
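A small worked aside on the spectral capacity-based methods listed above: capacity and demand are compared in spectral displacement (Sd) versus spectral acceleration (Sa) coordinates, using the standard conversion Sd = (T / 2π)² · Sa. The numbers below are illustrative:

```python
# Standard Sa -> Sd conversion used in capacity-spectrum-type methods.
import math

def sa_to_sd(sa_g, period_s, g=9.81):
    """Spectral displacement (m) from spectral acceleration (in g)."""
    return (period_s / (2.0 * math.pi)) ** 2 * sa_g * g

sd = sa_to_sd(0.5, 1.0)   # ~0.124 m for Sa = 0.5 g at T = 1 s
```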

  13. Earthquake and Tsunami booklet based on two Indonesia earthquakes

    Science.gov (United States)

    Hayashi, Y.; Aci, M.

    2014-12-01

    Many destructive earthquakes occurred during the last decade in Indonesia. These experiences are very important precepts for people worldwide who live in earthquake and tsunami countries. We are collecting the testimonies of tsunami survivors to clarify successful evacuation processes and to make clear the characteristic physical behaviors of tsunamis near the coast. We researched two tsunami events: the 2004 Indian Ocean tsunami and the 2010 Mentawai slow earthquake tsunami. Many videos and photographs were taken by people at some places during the 2004 Indian Ocean tsunami disaster; nevertheless, these came from only a few restricted points, and tsunami behavior elsewhere was unknown. In this study, we tried to collect extensive information about tsunami behavior, not only at many places but also over a wide time range after the strong shaking. In the Mentawai case, the earthquake occurred at night, so there are no impressive photos. To collect detailed information about the process of evacuating from tsunamis, we devised an interview method. This method involves making pictures of the tsunami experience from the scenes described in victims' stories. In the 2004 Aceh case, survivors did not know about tsunami phenomena. Because no big tsunamigenic earthquakes had occurred in the Sumatra region for a hundred years, the public had no knowledge of tsunamis. This situation was greatly improved by the time of the 2010 Mentawai case: TV programs and NGO or governmental public education programs about tsunami evacuation are now widespread in Indonesia, and many people know the fundamentals of earthquake and tsunami disasters. We made a drill book based on victims' stories and painted impressive scenes from the two events. We used the drill book in a disaster education event for a school committee in west Java. About 80% of students and teachers evaluated the contents of the drill book as useful for correct understanding.

  14. CISN ShakeAlert Earthquake Early Warning System Monitoring Tools

    Science.gov (United States)

    Henson, I. H.; Allen, R. M.; Neuhauser, D. S.

    2015-12-01

    CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, from which a small portion produce earthquake alerts. We have developed interactive web browser system-monitoring tools that display near real time state-of-health and performance information. This includes station availability, trigger statistics, communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted events and rejected events is also available. The Google Web Toolkit and Map API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.
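The latency reporting described above boils down to aggregating per-station trigger statistics. A minimal sketch of that aggregation; field names are illustrative, and the real system uses a PostgreSQL database with R for statistics:

```python
# Hedged sketch: per-station median trigger latency from
# (station, latency_seconds) records.
from collections import defaultdict
from statistics import median

def station_latency_medians(records):
    """records: iterable of (station, latency_s) -> {station: median_s}."""
    by_station = defaultdict(list)
    for station, latency in records:
        by_station[station].append(latency)
    return {s: median(v) for s, v in by_station.items()}
```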

  15. Rapid RNA-ligand interaction analysis through high-information content conformational and stability landscapes

    Energy Technology Data Exchange (ETDEWEB)

    Baird, Nathan J. [National Inst. of Health (NIH), Bethesda, MD (United States); Inglese, James [National Inst. of Health (NIH), Bethesda, MD (United States); Ferré-D’Amaré, Adrian R. [National Inst. of Health (NIH), Bethesda, MD (United States)

    2015-12-07

    The structure and biological properties of RNAs are a function of changing cellular conditions, but comprehensive, simultaneous investigation of the effect of multiple interacting environmental variables is not easily achieved. We have developed an efficient, high-throughput method to characterize RNA structure and thermodynamic stability as a function of multiplexed solution conditions using Förster resonance energy transfer (FRET). In a single FRET experiment using conventional quantitative PCR instrumentation, 19,400 conditions of MgCl2, ligand and temperature are analysed to generate detailed empirical conformational and stability landscapes of the cyclic diguanylate (c-di-GMP) riboswitch. This method allows rapid comparison of RNA structure modulation by cognate and non-cognate ligands. Landscape analysis reveals that kanamycin B stabilizes a non-native, idiosyncratic conformation of the riboswitch that inhibits c-di-GMP binding. Our research demonstrates that allosteric control of folding, rather than direct competition with cognate effectors, is a viable approach for pharmacologically targeting riboswitches and other structured RNA molecules.
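A brief aside on the FRET readout used above: transfer efficiency falls off with the sixth power of the donor-acceptor distance r relative to the Förster radius R0, which is what makes FRET a sensitive conformational probe. The values below are illustrative, not the paper's:

```python
# Standard FRET efficiency relation: E = 1 / (1 + (r / R0) ** 6).

def fret_efficiency(r_nm, r0_nm):
    """Transfer efficiency for donor-acceptor distance r (same units as R0)."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

e = fret_efficiency(5.0, 5.0)   # 0.5 exactly at r = R0
```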

  16. A rapid evidence-based service by librarians provided information to answer primary care clinical questions.

    Science.gov (United States)

    McGowan, Jessie; Hogg, William; Rader, Tamara; Salzwedel, Doug; Worster, Danielle; Cogo, Elise; Rowan, Margo

    2010-03-01

    A librarian consultation service was offered to 88 primary care clinicians during office hours. It included a streamlined evidence-based process to answer questions in fewer than 20 min, with a contact centre accessed through a Web-based platform using hand-held devices and computers with Web access. Librarians were given technical training in evidence-based medicine, including how to summarise evidence. The aim was to describe the process and lessons learned from developing and operating a rapid-response librarian consultation service for primary care clinicians. Evaluation included librarian interviews and a clinician exit satisfaction survey. Clinicians were positive about the service's impact on their clinical practice and decision making. The project revealed some important 'lessons learned' in the clinical use of hand-held devices, knowledge translation and training for clinicians and librarians. The Just-in-Time Librarian Consultation Service showed that it was possible to provide evidence-based answers to clinical questions in 15 min or less. The project overcame a number of barriers using innovative solutions. There are many opportunities to build on this experience for future joint projects of librarians and healthcare providers.

  17. Neurophysiological basis of rapid eye movement sleep behavior disorder: informing future drug development

    Directory of Open Access Journals (Sweden)

    Jennum P

    2016-04-01

    Full Text Available Poul Jennum, Julie AE Christensen, Marielle Zoetmulder Department of Clinical Neurophysiology, Faculty of Health Sciences, Danish Center for Sleep Medicine, Rigshospitalet, University of Copenhagen, Copenhagen, Denmark Abstract: Rapid eye movement (REM) sleep behavior disorder (RBD) is a parasomnia characterized by a history of recurrent nocturnal dream enactment behavior, loss of skeletal muscle atonia, and increased phasic muscle activity during REM sleep: REM sleep without atonia. RBD and associated comorbidities have recently been identified as one of the most specific and potentially sensitive risk factors for later development of any of the alpha-synucleinopathies: Parkinson’s disease, dementia with Lewy bodies, and other atypical parkinsonian syndromes. Several other sleep-related abnormalities have recently been identified in patients with RBD/Parkinson’s disease, who experience abnormalities in sleep electroencephalographic frequencies, sleep–wake transitions, wake and sleep stability, occurrence and morphology of sleep spindles, and electrooculography measures. These findings suggest a gradual involvement of the brainstem and other structures, which is in line with the gradual progression known in these disorders. We propose that these findings may help identify biomarkers of individuals at high risk of subsequent conversion to parkinsonism. Keywords: motor control, brain stem, hypothalamus, hypocretin

  18. Impaired encoding of rapid pitch information underlies perception and memory deficits in congenital amusia.

    Science.gov (United States)

    Albouy, Philippe; Cousineau, Marion; Caclin, Anne; Tillmann, Barbara; Peretz, Isabelle

    2016-01-06

    Recent theories suggest that the basis of neurodevelopmental auditory disorders such as dyslexia or specific language impairment might be a low-level sensory dysfunction. In the present study we test this hypothesis in congenital amusia, a neurodevelopmental disorder characterized by severe deficits in the processing of pitch-based material. We manipulated the temporal characteristics of auditory stimuli and investigated the influence of the time given to encode pitch information on participants' performance in discrimination and short-term memory. Our results show that amusics' performance in such tasks scales with the duration available to encode acoustic information. This suggests that in auditory neuro-developmental disorders, abnormalities in early steps of the auditory processing can underlie the high-level deficits (here musical disabilities). Observing that the slowing down of temporal dynamics improves amusics' pitch abilities allows considering this approach as a potential tool for remediation in developmental auditory disorders.

  19. Information Technology Research Services: Powerful Tools to Keep Up with a Rapidly Moving Field

    Science.gov (United States)

    Hunter, Paul

    2010-01-01

    Many firms offer Information Technology Research reports, analyst calls, conferences, seminars, tools, leadership development, etc. These entities include Gartner, Forrester Research, IDC, The Burton Group, Society for Information Management, InfoTech Research, The Corporate Executive Board, and so on. This talk will cover how a number of such services are being used at the Goddard Space Flight Center to improve our IT management practices, workforce skills, approach to innovation, and service delivery. These tools and services are used across the workforce, from the executive leadership to the IT worker. The presentation will cover the types of services each vendor provides and their primary engagement model. The use of these services at other NASA Centers and Headquarters will be included. In addition, I will explain how two of these services are available now to the entire NASA IT workforce through enterprise-wide subscriptions.

  20. U.S. Tsunami Information Technology (TIM) Modernization: Developing a Maintainable and Extensible Open Source Earthquake and Tsunami Warning System

    Science.gov (United States)

    Hellman, S. B.; Lisowski, S.; Baker, B.; Hagerty, M.; Lomax, A.; Leifer, J. M.; Thies, D. A.; Schnackenberg, A.; Barrows, J.

    2015-12-01

    Tsunami Information Technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). While this project was funded by NOAA to solve a specific problem, the requirements that the delivered system be both open source and easily maintainable have resulted in the creation of a variety of open source (OS) software packages. The open source software is now complete, and this is a presentation of the OS software funded by NOAA for the benefit of the entire seismic community. The design architecture comprises three distinct components: (1) the user interface, (2) the real-time data acquisition and processing system, and (3) the scientific algorithm library. The system follows a modular design with loose coupling between components. We now identify the major project constituents. The user interface, CAVE, is written in Java and is compatible with the existing National Weather Service (NWS) open source graphical system AWIPS. The selected real-time seismic acquisition and processing system is open source SeisComp3 (sc3). The seismic library (libseismic) contains numerous custom written and wrapped open source seismic algorithms (e.g., ML/mb/Ms/Mwp, mantle magnitude (Mm), w-phase moment tensor, bodywave moment tensor, finite-fault inversion, array processing). The seismic library is organized in a way (function naming and usage) that will be familiar to users of Matlab. The seismic library extends sc3 so that it can be called by the real-time system, but it can also be driven and tested outside of sc3, for example, by ObsPy or Earthworm. To unify the three principal components we have developed a flexible and lightweight communication layer called SeismoEdex.
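The scientific algorithm library described above exposes magnitude routines (ML/mb/Ms/Mwp) that can be driven and tested outside the real-time system. As a minimal illustration of the kind of routine such a library wraps — not libseismic's actual API — the sketch below computes a local magnitude from a simulated Wood-Anderson amplitude using the Hutton and Boore (1987) Southern California relation; the function name and the choice of relation are assumptions here.

```python
import math

def local_magnitude(amp_mm: float, dist_km: float) -> float:
    """Hutton & Boore (1987) ML from a simulated Wood-Anderson amplitude.

    amp_mm  -- peak horizontal amplitude on a simulated Wood-Anderson
               seismograph, in millimetres
    dist_km -- hypocentral distance in kilometres
    """
    return (math.log10(amp_mm)
            + 1.110 * math.log10(dist_km / 100.0)
            + 0.00189 * (dist_km - 100.0)
            + 3.0)

# By construction, a 1 mm Wood-Anderson amplitude at 100 km gives ML 3.0.
print(round(local_magnitude(1.0, 100.0), 2))  # 3.0
```

A routine like this is easy to exercise standalone (e.g. fed by ObsPy-processed waveforms) before being wired into the real-time system, which is the testability property the abstract emphasizes.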

  1. Motivation and challenges for use of malaria rapid diagnostic tests among informal providers in Myanmar: a qualitative study.

    Science.gov (United States)

    Sudhinaraset, May; Briegleb, Christina; Aung, Moe; Khin, Hnin Su Su; Aung, Tin

    2015-02-06

    Rapid diagnostic tests (RDTs) for malaria enable proper diagnosis and have been shown to reduce overuse of artemisinin combination therapy. Few studies have evaluated the feasibility and use of RDTs in the private sector in Myanmar. The objectives of the study were to: 1) understand the acceptability of using RDTs in the informal sector in Myanmar; 2) examine motivations for use among informal providers; and, 3) highlight decision-making and knowledge of providers for diagnostic testing and treatment. Qualitative interviews were conducted with 30 informal providers. Purposeful sampling was used to enrol study participants in the Mon and Shan State in Myanmar. All interviews were conducted in Burmese, translated into English, and two researchers coded all interviews using ATLAS.ti. Major themes identified included: 1) informal provider and outlet characteristics, including demographic and background characteristics; 2) the benefits and challenges of using RDTs according to providers; 3) provider experiences with using RDTs, including motivations for using the RDT; 4) adherence to test results, either positive or negative; and, 5) recommendations from informal providers to promote increased use of RDTs in their communities. This study found that introducing RDTs to informal providers in Myanmar was feasible, resulting in improved provider empowerment and patient-provider relationships. Specific challenges included facility infrastructure to use and dispose of RDTs, and provider knowledge. This varied across the type of informal provider, with itinerant drug vendors more comfortable and knowledgeable about RDTs compared to general retail sellers and medical drug representatives. Informal providers found the introduction of RDTs highly acceptable, and discussed improvements in service quality including provider empowerment and patient-provider relationships. The study also highlighted a number of challenges that informal providers

  2. Sensitivity of Detecting Shallow Slip in Tsunami Earthquakes Using High-Rate Seismogeodetic Displacement and Velocity Waveforms

    Science.gov (United States)

    Saunders, J. K.; Haase, J. S.; Bock, Y.

    2016-12-01

    The shallowest portion of the subduction zone, from the trench to 15 km depth, can facilitate long-duration, moderately sized earthquakes that have anomalously low seismic moment and short-period energy release accompanying large coseismic offsets. Such events have been dubbed 'tsunami earthquakes' due to their disproportionately large tsunamis compared to their magnitude. The central portion of the subduction zone (~15-50 km depth) is associated with generally higher rigidity and is where the majority of great subduction earthquakes nucleate. With increasing earthquake magnitude, the long-period and permanent deformations dominate motions in the near field, making displacement observations ideal for near-field earthquake characterization. In this study, we determine the sensitivity of seismogeodetic (combined high-rate GPS and acceleration) waveforms to slip location (shallow versus deeper slip) for the Cascadia subduction zone in order to assess their usefulness in rapidly determining the potential hazard of tsunami earthquakes with relatively small moment. We generate simple kinematic slip models that exhibit source properties according to the different subduction zone domains: a shallow Mw8 earthquake rupture, representing a tsunami earthquake, and a deeper earthquake of the same magnitude and slip distribution that does not rupture into the shallow portion. We choose rupture speed and slip duration for each simulation based on compilations of kinematic rupture models from past earthquakes. Specifically, the rupture speed is lowered and the slip duration is extended in the shallow earthquake to represent slip in the weaker, sediment-rich region near the trench. We generate synthetic waveforms and compare these displacements and velocities among scenarios to determine where there are differences that constrain the region and amount of slip. We then assess how well the current collocated seismogeodetic network samples the ground motions in critical areas for
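The seismogeodetic waveforms referenced above combine GPS displacement (unbiased at long periods but noisy) with accelerometer data (precise at short periods but drift-prone when integrated). As a loose illustration of that idea — a simple complementary blend, not the Kalman-filter combination used in actual seismogeodetic processing — the hypothetical `fuse` routine below propagates displacement from acceleration and continuously corrects it toward GPS:

```python
def fuse(gps, acc, dt, alpha=0.98):
    """Complementary blend of GPS displacement and acceleration.

    gps   -- displacement samples (m), trusted at low frequency
    acc   -- acceleration samples (m/s^2), trusted at high frequency
    dt    -- sample interval (s)
    alpha -- weight on the inertial prediction (illustrative value)
    """
    d, v = 0.0, 0.0
    out = []
    for g, a in zip(gps, acc):
        v += a * dt                           # integrate accel -> velocity
        d_pred = d + v * dt                   # inertial displacement step
        d = alpha * d_pred + (1 - alpha) * g  # nudge toward GPS
        out.append(d)
    return out

# With zero acceleration and a constant 5 cm GPS static offset, the
# estimate converges to the offset instead of drifting.
est = fuse([0.05] * 1000, [0.0] * 1000, dt=0.01)
```

The point of such a combination for tsunami earthquakes is that the permanent (static) offset, which dominates near-field motion for large slow events, is preserved rather than lost to integration drift.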

  3. The Information Technology Program Manager’s Dilemma: Rapidly Evolving Technology and Stagnant Processes

    Science.gov (United States)

    2010-08-01

    information technology systems. The current DoDI 5000.02 leaves IT project and program managers wondering how the current process applies to them, as the guidance is fairly rigid and does not allow for the flexibility required to appropriately manage IT programs. Until very recently, in comparison to the development of a traditional weapons system, IT programs seemed to have been viewed as a utility or service instead of a critical component to national security. Perhaps that is because data passing through cables cannot be observed with the naked senses and therefore an

  4. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  5. Earthquake Damage - General

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — An earthquake is the motion or trembling of the ground produced by sudden displacement of rock in the Earth's crust. Earthquakes result from crustal strain,...

  6. Earthquakes in Southern California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — There have been many earthquake occurrences in Southern California. This set of slides shows earthquake damage from the following events: Imperial Valley, 1979,...

  7. Earthquake Notification Service

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The Earthquake Notification Service (ENS) is a free service that sends you automated notifications to your email or cell phone when earthquakes happen.

  8. Shared care in mental illness: A rapid review to inform implementation

    Directory of Open Access Journals (Sweden)

    Kelly Brian J

    2011-11-01

    Full Text Available Abstract Background While integrated primary healthcare for the management of depression has been well researched, appropriate models of primary care for people with severe and persistent psychotic disorders are poorly understood. In 2010 the NSW (Australia) Health Department commissioned a review of the evidence on "shared care" models of ambulatory mental health services. This focussed on critical factors in the implementation of these models in clinical practice, with a view to providing policy direction. The review excluded evidence about dementia, substance use and personality disorders. Methods A rapid review involving a search for systematic reviews on The Cochrane Database of Systematic Reviews and Database of Abstracts of Reviews of Effects (DARE). This was followed by a search for papers published since these systematic reviews on Medline and supplemented by limited iterative searching from reference lists. Results Shared care trials report improved mental and physical health outcomes in some clinical settings with improved social function, self management skills, service acceptability and reduced hospitalisation. Other benefits include improved access to specialist care, better engagement with and acceptability of mental health services. Limited economic evaluation shows significant set up costs, reduced patient costs and service savings often realised by other providers. Nevertheless these findings are not evident across all clinical groups. Gains require substantial cross-organisational commitment, carefully designed and consistently delivered interventions, with attention to staff selection, training and supervision. Effective models incorporated linkages across various service levels, clinical monitoring within agreed treatment protocols, improved continuity and comprehensiveness of services. Conclusions "Shared Care" models of mental health service delivery require attention to multiple levels (from organisational to individual

  9. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    Science.gov (United States)

    Wurman, G.; Price, M.

    2014-12-01

    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the Mw 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.
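The 1.5 to 2.5 seconds of warning reported above is consistent with the S-minus-P travel-time budget at roughly 16 km distance. A back-of-the-envelope sketch, assuming generic crustal velocities (Vp ≈ 6 km/s, Vs ≈ 3.5 km/s) and instantaneous P-wave detection — both simplifying assumptions, since real systems add processing latency:

```python
def warning_seconds(dist_km, vp=6.0, vs=3.5):
    """Time between P-wave arrival and S-wave arrival at distance dist_km.

    vp, vs are illustrative crustal velocities in km/s; actual warning
    time is shorter by the detection and alerting latency.
    """
    return dist_km / vs - dist_km / vp

print(round(warning_seconds(16.0), 1))  # 1.9
```

At ~16 km the physics allows about 1.9 s between P and S arrivals, so the observed 1.5 to 2.5 s window implies the devices issued alerts within a fraction of a second of P-wave onset.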

  10. Redefining Earthquakes and the Earthquake Machine

    Science.gov (United States)

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  11. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  12. Using INGRES as a rapid prototyping device during development of management information applications

    Energy Technology Data Exchange (ETDEWEB)

    Brice, L.; Connell, J.; Shafer, D.

    1983-01-01

    This paper presents case studies from the Administrative Data Processing Division of the Los Alamos National Laboratory where a prototyping tool, the INGRES relational database system, has been used to develop management information systems. The tool has proved valuable in satisfying user requirements and expectations, and in aiding data processing in the analysis and specification phases of the system life cycle. The prototype approach helps enormously in bridging the developer-user communication gap and has been found to add a negligible amount of cost to the entire software development project. Presented here are four case studies of how INGRES has been employed in prototyping. Also presented are examples of specific INGRES features and how they were used in one of the case studies and further examples involving another similar case. Special considerations and cautions are required when using INGRES for prototyping, but the overall conclusion is that it is a tool which has tremendously benefited our organization. Whether the final implemented system is INGRES-based or not, prototyping greatly enhances the possibility of complete, correct and unambiguous specifications prior to final software product development.

  13. "Breaking Ground" in the Use of Social Media: A Case Study of a University Earthquake Response to Inform Educational Design with Facebook

    Science.gov (United States)

    Dabner, Nicki

    2012-01-01

    On September 4 2010, a massive 7.1 magnitude earthquake struck the Canterbury region in the South Island of New Zealand. The response from the University of Canterbury was immediate and carefully co-ordinated, with the university's web-based environment and a responsive site developed on the social media platform "Facebook" becoming…

  14. New geological perspectives on earthquake recurrence models

    Energy Technology Data Exchange (ETDEWEB)

    Schwartz, D.P. [Geological Survey, Menlo Park, CA (United States)

    1997-02-01

    In most areas of the world the record of historical seismicity is too short or uncertain to accurately characterize the future distribution of earthquakes of different sizes in time and space. Most faults have not ruptured once, let alone repeatedly. Ultimately, the ability to correctly forecast the magnitude, location, and probability of future earthquakes depends on how well one can quantify the past behavior of earthquake sources. Paleoseismological trenching of active faults, historical surface ruptures, liquefaction features, and shaking-induced ground deformation structures provides fundamental information on the past behavior of earthquake sources. These studies quantify (a) the timing of individual past earthquakes and fault slip rates, which lead to estimates of recurrence intervals and the development of recurrence models and (b) the amount of displacement during individual events, which allows estimates of the sizes of past earthquakes on a fault. When timing and slip per event are combined with information on fault zone geometry and structure, models that define individual rupture segments can be developed. Paleoseismicity data, in the form of timing and size of past events, provide a window into the driving mechanism of the earthquake engine--the cycle of stress build-up and release.

  15. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.
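Detecting earthquake onsets on consumer-grade sensors is commonly done with a short-term-average/long-term-average (STA/LTA) trigger. The abstract does not specify the detection algorithm used, so the sketch below is a generic illustration of the technique, with arbitrary window lengths and threshold:

```python
def sta_lta_trigger(samples, dt, sta_win=0.5, lta_win=5.0, threshold=4.0):
    """Return the index of the first STA/LTA exceedance, or None.

    Classic short-term-average / long-term-average detector used for
    P-wave picking; window lengths (s) and threshold are illustrative.
    """
    n_sta = int(sta_win / dt)
    n_lta = int(lta_win / dt)
    energy = [x * x for x in samples]        # use signal energy
    for i in range(n_lta, len(samples)):
        sta = sum(energy[i - n_sta:i]) / n_sta   # recent average
        lta = sum(energy[i - n_lta:i]) / n_lta   # background average
        if lta > 0 and sta / lta > threshold:
            return i
    return None

# Quiet background followed by a sudden strong arrival triggers promptly:
signal = [0.01] * 600 + [1.0] * 100   # synthetic trace, dt = 0.01 s
onset = sta_lta_trigger(signal, dt=0.01)
```

For a crowdsourced system the per-device trigger is deliberately noisy and cheap; false positives are suppressed by requiring many devices in an area to trigger nearly simultaneously.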

  16. Patterns of fault interactions triggered by micro earthquake activity

    Science.gov (United States)

    Mouslopoulou, Vasiliki; Hristopulos, Dionisis

    2010-05-01

    Historical earthquakes are often strongly clustered in space and time. This clustering has been attributed to static stress triggering associated with tectonic fault interactions and/or fluid migration. Discrimination between these two models requires detailed information on the timing, location and size of earthquakes. The Matata earthquake sequence, which occurred within the active Taupo Rift in New Zealand, provides a unique opportunity to chart spatial and temporal patterns of earthquakes along individual faults, between neighbouring faults and within the entire fault system over timescales of days to years. This is due to the accurate relocation of 2563 small to moderate size earthquakes (15.5) that occurred over thousand-year timescales (e.g. <60 kyr). Therefore, if the origin of small-to-moderate-sized earthquakes in the rift is indeed tectonic and fault interaction is a scale invariant process, it may be possible to constrain better the occurrence of large earthquakes by analysing widely available microearthquake data.

  17. Modeling fast and slow earthquakes at various scales.

    Science.gov (United States)

    Ide, Satoshi

    2014-01-01

    Earthquake sources represent dynamic rupture within rocky materials at depth and often can be modeled as propagating shear slip controlled by friction laws. These laws provide boundary conditions on fault planes embedded in elastic media. Recent developments in observation networks, laboratory experiments, and methods of data analysis have expanded our knowledge of the physics of earthquakes. Newly discovered slow earthquakes are qualitatively different phenomena from ordinary fast earthquakes and provide independent information on slow deformation at depth. Many numerical simulations have been carried out to model both fast and slow earthquakes, but problems remain, especially with scaling laws. Some mechanisms are required to explain the power-law nature of earthquake rupture and the lack of characteristic length. Conceptual models that include a hierarchical structure over a wide range of scales would be helpful for characterizing diverse behavior in different seismic regions and for improving probabilistic forecasts of earthquakes.

  18. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. 
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  19. High Attenuation Rate for Shallow, Small Earthquakes in Japan

    Science.gov (United States)

    Si, Hongjun; Koketsu, Kazuki; Miyake, Hiroe

    2017-09-01

    We compared the attenuation characteristics of peak ground accelerations (PGAs) and velocities (PGVs) of strong motion from shallow, small earthquakes that occurred in Japan with those predicted by the equations of Si and Midorikawa (J Struct Constr Eng 523:63-70, 1999). The observed PGAs and PGVs at stations far from the seismic source decayed more rapidly than the predicted ones. The same tendencies have been reported for deep, moderate, and large earthquakes, but not for shallow, moderate, and large earthquakes. This indicates that the peak values of ground motion from shallow, small earthquakes attenuate more steeply than those from shallow, moderate or large earthquakes. To investigate the reason for this difference, we numerically simulated strong ground motion for point sources of Mw 4 and 6 earthquakes using a 2D finite difference method. The analyses of the synthetic waveforms suggested that the above differences are caused by surface waves, which are predominant at stations far from the seismic source for shallow, moderate earthquakes but not for shallow, small earthquakes. Thus, although loss due to reflection at the boundaries of the discontinuous Earth structure occurs in all shallow earthquakes, the apparent attenuation rate for a moderate or large earthquake is essentially the same as that of body waves propagating in a homogeneous medium due to the dominance of surface waves.
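The surface-wave explanation above can be illustrated with geometric spreading alone: body waves decay roughly as 1/r while surface waves decay roughly as 1/√r, so where surface waves dominate, peak amplitudes fall off more slowly with distance. A toy comparison (spreading only, ignoring anelastic attenuation and scattering):

```python
import math

# Geometric spreading: body waves ~ 1/r, surface waves ~ 1/sqrt(r).
distances = (50.0, 100.0, 200.0)              # km, illustrative
body = [1.0 / r for r in distances]
surf = [1.0 / math.sqrt(r) for r in distances]

# Amplitude ratio between 50 km and 200 km:
ratio_body = body[0] / body[-1]               # ~4x drop: faster decay
ratio_surf = surf[0] / surf[-1]               # ~2x drop: slower decay
```

Quadrupling the distance costs body waves a factor of four in amplitude but surface waves only a factor of two, which is why small shallow events (lacking strong surface waves) show the steeper apparent attenuation reported in the abstract.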

  20. Cochrane Rapid Reviews Methods Group to play a leading role in guiding the production of informed high-quality, timely research evidence syntheses

    Directory of Open Access Journals (Sweden)

    Chantelle Garritty

    2016-10-01

    Full Text Available Abstract Background Policymakers and healthcare stakeholders are increasingly seeking evidence to inform the policymaking process, and often use existing or commissioned systematic reviews to inform decisions. However, the methodologies that make systematic reviews authoritative take time, typically 1 to 2 years to complete. Outside the traditional systematic review timeline, “rapid reviews” have emerged as an efficient tool to get evidence to decision-makers more quickly. However, the use of rapid reviews does present challenges. To date, there has been limited published empirical information about this approach to compiling evidence. Thus, it remains a poorly understood and ill-defined set of diverse methodologies with various labels. In recent years, the need to further explore rapid review methods, characteristics, and their use has been recognized by a growing network of healthcare researchers, policymakers, and organizations, several with ties to Cochrane, which is recognized as representing an international gold standard for high-quality, systematic reviews. Purpose In this commentary, we introduce the newly established Cochrane Rapid Reviews Methods Group developed to play a leading role in guiding the production of rapid reviews given they are increasingly employed as a research synthesis tool to support timely evidence-informed decision-making. We discuss how the group was formed and outline the group’s structure and remit. We also discuss the need to establish a more robust evidence base for rapid reviews in the published literature, and the importance of promoting registration of rapid review protocols in an effort to promote efficiency and transparency in research. Conclusion As with standard systematic reviews, the core principles of evidence-based synthesis should apply to rapid reviews in order to minimize bias to the extent possible. The Cochrane Rapid Reviews Methods Group will serve to establish a network of rapid review stakeholders

  1. Sichuan Earthquake in China

    Science.gov (United States)

    2008-01-01

    The Sichuan earthquake in China occurred on May 12, 2008, along faults within the mountains, but near and almost parallel to the mountain front, northwest of the city of Chengdu. This major quake caused immediate and severe damage to many villages and cities in the area. Aftershocks pose a continuing danger, but another continuing hazard is the widespread occurrence of landslides that have formed new natural dams and consequently new lakes. These lakes are submerging roads and flooding previously developed lands. But an even greater concern is the possible rapid release of water as the lakes eventually overflow the new dams. The dams are generally composed of disintegrated rock debris that may easily erode, leading to greater release of water, which may then cause faster erosion and an even greater release of water. This possible 'positive feedback' between increasing erosion and increasing water release could result in catastrophic debris flows and/or flooding. The danger is well known to the Chinese earthquake response teams, which have been building spillways over some of the new natural dams. This ASTER image, acquired on June 1, 2008, shows two of the new large landslide dams and lakes upstream from the town of Chi-Kua-Kan at 32°12'N latitude and 104°50'E longitude. Vegetation is green, water is blue, and soil is grayish brown in this enhanced color view. New landslides appear bright off-white. The northern (top) lake is upstream from the southern lake. Close inspection shows a series of much smaller lakes in an elongated 'S' pattern along the original stream path. Note especially the large landslides that created the dams. Some other landslides in this area, such as the large one in the northeast corner of the image, occur only on the mountain slopes, so do not block streams, and do not form lakes.

  2. Earthquake Protection Measures for People with Disabilities

    Science.gov (United States)

    Gountromichou, C.; Kourou, A.; Kerpelis, P.

    2009-04-01

    The problem of seismic safety for people with disabilities not only exists but is also urgent and of primary importance. Working towards disability equality, the Earthquake Planning and Protection Organization of Greece (E.P.P.O.) has developed an educational scheme for people with disabilities in order to guide them to develop skills to protect themselves as well as to take the appropriate safety measures before, during and after an earthquake. The framework of this initiative includes a number of actions that have already been undertaken, including the following: a. Recently, the main guidelines have been published to help people who have physical, cognitive, visual, or auditory disabilities to cope with a destructive earthquake. Of great importance, in the case of people with disabilities, is to be prepared for the disaster, with several measures that must be taken starting today. In the pre-earthquake period, it is important that these people, in addition to other measures, do the following: - Create a Personal Support Network The Personal Support Network should be a group of at least three trustful people that can assist the disabled person to prepare for a disastrous event and to recover after it. - Complete a Personal Assessment The environment may change after a destructive earthquake. People with disabilities are encouraged to make a list of their personal needs and their resources for meeting them in a disaster environment. b. Lectures and training seminars on earthquake protection are given for students, teachers and educators in Special Schools for disabled people, mainly for informing and familiarizing them with earthquakes and with safety measures. c. Many earthquake drills have already taken place, for each disability, in order to share good practices and lessons learned to further disaster reduction and to identify gaps and challenges. 
The final aim of this action is all people with disabilities to be well informed and motivated towards a culture of earthquake

  3. Lncident: A Tool for Rapid Identification of Long Noncoding RNAs Utilizing Sequence Intrinsic Composition and Open Reading Frame Information

    Directory of Open Access Journals (Sweden)

    Siyu Han

    2016-01-01

    Full Text Available More and more studies have demonstrated that long noncoding RNAs (lncRNAs) play critical roles in a diversity of biological processes and are also associated with various types of disease. Rapidly distinguishing lncRNAs from messenger RNAs is the fundamental first step in uncovering lncRNA function. Here, we present a novel method for rapid identification of lncRNAs that utilizes sequence-intrinsic composition features and open reading frame information in a support vector machine model, named Lncident (LncRNA identification). Ten-fold cross-validation and ROC curves are used to evaluate the performance of Lncident. The main advantage of Lncident is its high speed without loss of accuracy. Compared with existing popular tools, Lncident outperforms Coding-Potential Calculator, Coding-Potential Assessment Tool, Coding-Noncoding Index, and PLEK, and is also much faster than Coding-Potential Calculator and Coding-Noncoding Index. Lncident performs particularly well on microorganism sequences, which offers great promise for the analysis of microorganisms. In addition, Lncident can be trained on users' own data. Furthermore, an R package and a web server have been developed to maximize convenience for users. The R package "Lncident" can be easily installed on multiple operating system platforms, as long as R is supported.
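The general recipe the abstract describes, sequence-derived features plus open reading frame information feeding a support vector machine, can be sketched in a few lines. This is a minimal illustration, not the Lncident implementation: the two features (longest-ORF fraction and GC content), the toy sequences, and the use of scikit-learn's SVC are assumptions made for the example.

```python
# Hypothetical two-feature coding/noncoding classifier; NOT the Lncident code.
from sklearn.svm import SVC

def longest_orf(seq):
    """Length of the longest ATG..stop open reading frame on the forward strand."""
    best = 0
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i
            elif codon in ("TAA", "TAG", "TGA") and start is not None:
                best = max(best, i + 3 - start)
                start = None
    return best

def features(seq):
    # Feature 1: longest-ORF fraction (the "ORF information").
    # Feature 2: GC content (a crude stand-in for sequence-intrinsic composition).
    return [longest_orf(seq) / len(seq),
            (seq.count("G") + seq.count("C")) / len(seq)]

# Toy training data: label 1 = protein-coding mRNA, 0 = noncoding.
coding = ["ATGGCCGCTGCCGCCTAA" * 4, "ATGCCGGGCGCGCCCTGA" * 4]
noncoding = ["ATATATTTATATTATTAA" * 4, "TTTATATATTTTAATATA" * 4]
X = [features(s) for s in coding + noncoding]
y = [1, 1, 0, 0]

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
print(clf.predict([features("ATGGCTGCCGGCTGCTGA" * 4)]))
```

A real tool of this kind would use many more composition features (k-mer frequencies, codon bias) and train on thousands of annotated transcripts; the pipeline shape, however, is the same.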

  4. The research and application of earthquake disaster comprehensive evaluation

    Science.gov (United States)

    Guo, Hongmei; Chen, Weifeng

    2017-04-01

    All disaster relief operations of the government after a destructive earthquake depend on earthquake disaster information, including command decisions, rescue force deployment, dispatch of relief supplies, etc. Earthquake disaster information is the most important requirement during the earthquake emergency response and emergency disposal period. Macro disaster information, including the distribution of the disaster area, the scale of casualties, etc., determines the disaster relief scale and response level. Specific disaster information determines the process and details of specific rescue operations. In view of the importance of earthquake disaster information, experts have devoted themselves to the study of seismic hazard assessment and acquisition, mainly from two aspects: improving the pre-assessment accuracy of the disaster and enriching the means of disaster information acquisition. The problem is that experts have carried out in-depth research on particular aspects, usually focusing on optimizing pre-evaluation methods, refining and updating basic data, or establishing new disaster information access channels, while ignoring the comprehensive use of various methods and means. Drawing on the emergency disposal experience of several devastating earthquakes in Sichuan Province in recent years, this paper presents a new earthquake disaster comprehensive evaluation technology, in which multi-source disaster information coordination, complementary coordination among experts in multiple research fields, rear and on-site coordination, and multi-sectoral, multi-regional coordination are taken into account. On this basis, an earthquake disaster comprehensive evaluation system incorporating expert experience has been established. Based on the pre-assessment, the system can combine background information on the disaster area, such as the seismic geological background and socioeconomic background, with disaster information from various sources to realize the fusion and mining of multi

  5. Earthquake detection through computationally efficient similarity search.

    Science.gov (United States)

    Yoon, Clara E; O'Reilly, Ossian; Bergen, Karianne J; Beroza, Gregory C

    2015-12-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection, the identification of seismic events in continuous data, is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact "fingerprints" of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes.
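A toy version of the fingerprint-and-bucket idea can be sketched as follows. This illustrates the general technique (binary fingerprints hashed band-by-band so that similar windows collide, avoiding all-pairs comparison), not the published FAST code: the fingerprint definition, band sizes, and synthetic trace are all invented for the example.

```python
# Toy similarity search over continuous data; NOT the published FAST implementation.
import numpy as np
from collections import defaultdict
from itertools import combinations

rng = np.random.default_rng(0)

def fingerprint(window, n_bits=16):
    """Compact binary fingerprint: which low-frequency spectral magnitudes
    exceed their mean within the window."""
    mags = np.abs(np.fft.rfft(window)[:n_bits])
    return tuple((mags > mags.mean()).astype(int))

# Synthetic "continuous data": noise plus the same event waveform at two times.
event = np.sin(np.linspace(0, 20 * np.pi, 128)) * np.hanning(128)
trace = rng.normal(0, 0.1, 4096)
trace[512:640] += event
trace[2944:3072] += event

# Hash 4-bit bands of each window's fingerprint into buckets (simplified LSH):
# similar windows collide in many bands, so no all-pairs correlation is needed.
width, step = 128, 64
buckets = defaultdict(list)
for start in range(0, len(trace) - width, step):
    fp = fingerprint(trace[start:start + width])
    for band in range(0, 16, 4):
        buckets[(band, fp[band:band + 4])].append(start)

# Window pairs that collide in at least 3 of 4 bands (and are far apart in
# time) are declared similar -- candidate repeated events.
votes = defaultdict(int)
for starts in buckets.values():
    for a, b in combinations(sorted(set(starts)), 2):
        votes[(a, b)] += 1
detections = [p for p, v in votes.items() if v >= 3 and p[1] - p[0] > width]
print(detections)
```

As in the abstract's description of FAST, the output can contain a few spurious noise-window pairs alongside the true repeated event; a real system adds a verification stage on the candidate pairs.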

  6. Do weak global stresses synchronize earthquakes?

    Science.gov (United States)

    Bendick, R.; Bilham, R.

    2017-08-01

    Insofar as slip in an earthquake is related to the strain accumulated near a fault since a previous earthquake, and this process repeats many times, the earthquake cycle approximates an autonomous oscillator. Its asymmetric slow accumulation of strain and rapid release is quite unlike the harmonic motion of a pendulum and need not be time predictable, but still resembles a class of repeating systems known as integrate-and-fire oscillators, whose behavior has been shown to demonstrate a remarkable ability to synchronize to either external or self-organized forcing. Given sufficient time and even very weak physical coupling, the phases of sets of such oscillators, with similar though not necessarily identical period, approach each other. Topological and time series analyses presented here demonstrate that earthquakes worldwide show evidence of such synchronization. Though numerous studies demonstrate that the composite temporal distribution of major earthquakes in the instrumental record is indistinguishable from random, the additional consideration of event renewal interval serves to identify earthquake groupings suggestive of synchronization that are absent in synthetic catalogs. We envisage the weak forces responsible for clustering originate from lithospheric strain induced by seismicity itself, by finite strains over teleseismic distances, or by other sources of lithospheric loading such as Earth's variable rotation. For example, quasi-periodic maxima in rotational deceleration are accompanied by increased global seismicity at multidecadal intervals.
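The integrate-and-fire behavior invoked above can be illustrated with a standard pulse-coupled oscillator simulation in the spirit of Mirollo and Strogatz. This is a generic toy model, not the paper's analysis: the concave (leaky) loading curve, the coupling strength, and the population size are arbitrary choices made for the demonstration.

```python
# Generic pulse-coupled integrate-and-fire toy model; NOT the paper's method.
import numpy as np

rng = np.random.default_rng(1)
n, dt, eps = 10, 0.05, 0.02        # oscillators, time step, weak coupling
state = rng.uniform(0, 1, n)       # random initial "strain" levels
avalanches = []                    # how many oscillators fire together

for t in range(6000):
    state += dt * (1.2 - state)    # concave (leaky) loading toward 1.2
    fired = np.zeros(n, dtype=bool)
    new = state >= 1.0             # crossing threshold 1.0 = an "earthquake"
    while new.any():               # cascade: each firing nudges the others
        fired |= new
        state[~fired] += eps * new.sum()
        state[new] = 0.0
        new = ~fired & (state >= 1.0)
    if fired.any():
        avalanches.append(int(fired.sum()))

# Weak coupling gradually merges the population into one synchronized group:
# early events involve one or two oscillators, late events involve all of them.
print("early avalanche sizes:", avalanches[:5])
print("late avalanche sizes:", avalanches[-5:])
```

The point of the sketch is the one made in the abstract: even very weak pulse coupling, given many cycles, drives oscillators with similar periods toward common phase.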

  7. SUPPLEMENTARY INFORMATION RAPID COMMUNICATION ...

    Indian Academy of Sciences (India)

    Sulfated polyborate: A mild, efficient catalyst for synthesis of N-tert-butyl/N-trityl protected amides via Ritter reaction. KRISHNA S INDALKAR, CHETAN K KHATRI and GANESH U CHATURBHUJ*. Department of Pharmaceutical Sciences and Technology, Institute of Chemical Technology, Mumbai, Maharashtra 400 019 ...

  8. [Comparative analysis of the clinical characteristics of orthopedic inpatients in Lushan and Wenchuan earthquakes].

    Science.gov (United States)

    Shi, Xiao-Jun; Wang, Guang-Lin; Pei, Fu-Xing; Song, Yue-Ming; Yang, Tian-Fu; Tu, Chong-Qi; Huang, Fu-Guo; Liu, Hao; Lin, Wei

    2013-10-18

    To systematically analyze and compare the clinical characteristics of orthopedic inpatients in the Lushan and Wenchuan earthquakes, so as to provide useful references for future earthquake injury rescue. Based on the orthopedic inpatients in the Lushan and Wenchuan earthquakes, data on age, gender, injury causes, body parts injured and speed of transport were classified and compared. The period over which patients were admitted to hospital lasted long and peaked late in the Wenchuan earthquake, the opposite of the pattern in the Lushan earthquake. There was no significant difference in patient age or gender between the two earthquakes. However, the occurrence rates of crush syndrome, amputation, gas gangrene, vascular injury and multiple organ dysfunction syndrome (MODS) in the Wenchuan earthquake were much higher than in the Lushan earthquake. Blunt traumas or crush-related injuries (79.6%) were the major injury cause in the Wenchuan earthquake, whereas in the Lushan earthquake falls from height and other falls (56.8%) were much more common than blunt trauma or crush-related injuries (39.2%). The incidence rates of foot fractures, spine fractures and multiple fractures in the Lushan earthquake were higher than in the Wenchuan earthquake, but those of open fractures and lower limb fractures were lower. Rapid rescue at the scene is the cornerstone of successful treatment; early rescue and transport obviously reduce the incidence of wound infection, crush syndrome, MODS and amputation. Popularizing correct knowledge of emergency shelters will help to reduce the damage caused by blindly jumping or escaping while an earthquake happens.

  9. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    Science.gov (United States)

    Madariaga, R.

    2013-05-01

    The 27 February 2010 Maule earthquake occurred in a well-identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we review the information gathered so far. The event broke a region that was much longer along strike than the gap left over from the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when it occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously broken in a similar earthquake in 1751, but several events in the magnitude 8 range also occurred in the area: principally the 1835 event already mentioned and, more recently, events on 1 December 1928 to the north and on 21 May 1960 (1 1/2 days before the great Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north are undergoing a very large increase in seismicity, with numerous clusters of seismicity that move along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730. This is the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions with M 8 range earthquakes, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but the region has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough of the stress from the 1730 rupture zone. 
Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  10. Environmental risk prioritization and management of petroleum retail facilities using Geographic Information Systems (GIS) and rapid field investigation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Drake, J.T.; Thomas, C.A.; Molloy, K.P. [Camp Dresser & McKee Inc., Cambridge, MA (United States)] [and others]

    1997-12-31

    Environmental risk prioritization is a method of estimating the probability and severity of an environmental impact related to a petroleum facility. The risk analysis is directed at prioritizing the environmental risks associated with each facility so that capital expenditures can be directed toward mitigation at high-risk sites and monitoring or stewardship at low-risk locations. The expenditures or investments may include facility upgrades, waste disposal or containment, soil and water remediation, or facility monitoring activities. The risk analysis and petroleum retail facility prioritization system presented in this paper combines a geographic information system (GIS) with rapid field investigation techniques. This system was recently used in the prioritization of 150 petroleum facilities in Caracas, Venezuela and is currently being expanded to include additional facilities. The prioritization system used GIS and a data management system to quantify the Potential Impact Factors (PIF) and the Leak Likelihood Factors (LLF) associated with each facility. The PIF analysis used GIS to perform a proximity analysis: a radial search was performed for each site, identifying proximal receptors including sensitive environmental receptors, water supply well locations, protected open spaces, water supply watersheds, areas of critical environmental concern, and residential populations. The LLF analysis assessed risk based on the age of the underground storage tanks, the presence of leak protection devices, soil and groundwater conditions, tank and piping characteristics, sales volume, and proximity to sources of high voltage. A prioritization weight was placed on each of the LLF factors based on its associated risk. After the initial GIS site rankings, rapid site audits were completed for the selected high-risk sites. These audits included compilation of spill histories and soil vapor analyses using gas chromatography.
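The PIF/LLF combination described above can be sketched as a simple weighted scoring model. All field names, weights, coordinates, and the search radius below are invented for illustration; the abstract does not publish the study's actual factors or weighting scheme.

```python
# Illustrative PIF/LLF risk ranking; data, weights and radius are hypothetical.
from math import hypot

facilities = [
    {"id": "A", "x": 0.0, "y": 0.0, "tank_age": 25, "leak_detection": False},
    {"id": "B", "x": 5.0, "y": 1.0, "tank_age": 8,  "leak_detection": True},
]
# Receptors found by the radial search: (x, y, weight); e.g. a water-supply
# well might weigh more than protected open space.
receptors = [(0.4, 0.2, 3.0), (0.8, -0.3, 1.0), (5.2, 1.1, 1.0)]

def pif(fac, radius=1.0):
    """Potential Impact Factor: weighted receptors within the search radius."""
    return sum(w for x, y, w in receptors
               if hypot(x - fac["x"], y - fac["y"]) <= radius)

def llf(fac):
    """Leak Likelihood Factor: weighted facility attributes."""
    score = 0.5 * min(fac["tank_age"] / 30, 1.0)   # older tanks leak more
    score += 0.5 * (0.0 if fac["leak_detection"] else 1.0)
    return score

# Rank sites by combined risk; rapid audits go to the top of the list.
ranked = sorted(facilities, key=lambda f: pif(f) * llf(f), reverse=True)
for f in ranked:
    print(f["id"], round(pif(f) * llf(f), 2))
```

In the real system the proximity analysis runs inside the GIS over many receptor layers, but the ranking logic reduces to exactly this kind of weighted product of impact and likelihood.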

  11. Prediction of earthquakes: a data evaluation and exchange problem

    Energy Technology Data Exchange (ETDEWEB)

    Melchior, Paul

    1978-11-15

    Recent experiences in earthquake prediction are recalled. Precursor information seems to be available from geodetic measurements, hydrological and geochemical measurements, electric and magnetic measurements, purely seismic phenomena, and zoological phenomena; some new methods are proposed. A list of possible earthquake triggers is given. The dilatancy model is contrasted with a dry model; they seem to be equally successful. In conclusion, the space and time range of the precursors is discussed in relation to the magnitude of earthquakes. (RWR)

  12. Report on the 2010 Chilean earthquake and tsunami response

    Science.gov (United States)

    ,

    2011-01-01

    disaster response strategies and operations of Chilean agencies, including perceived or actual failures in disaster preparation that impacted the medical disaster response; post-disaster health and medical interventions to save lives and limit suffering; and the lessons learned by public health and medical personnel as a result of their experiences. Despite devastating damage to the health care and civic infrastructure, the health care response to the Chilean earthquake appeared highly successful due to several factors. Like other first responders, the medical community had the ability and resourcefulness to respond without centralized control in the early response phase. The health care community maintained patient care under austere conditions, despite many obstacles that could have prevented such care. National and international resources were rapidly mobilized to support the medical response. The Emergency Services Team sought to collect information on all phases of emergency management (preparedness, mitigation, response, and recovery) and determine what worked well and what could be improved upon. The Chileans reported being surprised that they were not as ready for this event as they thought they were. The use of mass care sheltering was limited, given the scope of the disaster, because of the resiliency of the population. The impacts of the earthquake and the tsunami were quite different, as were the needs of urban and rural dwellers, necessitating different response activities. The Volunteer Services Team examined the challenges faced in mobilizing a large number of volunteers to assist in the aftermath of a disaster of this scale. One of the greatest challenges expressed was difficulty in communication; the need for redundancy in communication mechanisms was cited. The flexibility and ability to work autonomously by the frontline volunteers was a significant factor in effective response. 
It was also important for volunteer leadership to know the emergency plans

  13. 2016 update on induced earthquakes in the United States

    Science.gov (United States)

    Petersen, Mark D.

    2016-01-01

    During the past decade, people in numerous locations across the central U.S. experienced many more small to moderate-sized earthquakes than ever before. This earthquake activity began increasing about 2009 and peaked during 2015 and into early 2016. For example, prior to 2009 Oklahoma typically experienced one or two small earthquakes per year with magnitude greater than 3.0, but by 2015 this number had risen to over 900 earthquakes per year of that size, including over 30 earthquakes greater than 4.0. These earthquakes can cause damage. In 2011 a magnitude 5.6 earthquake struck near the town of Prague, Oklahoma on a preexisting fault and caused severe damage to several houses and school buildings. During the past 6 years more than 1500 reports of damaging shaking levels were received in areas of induced seismicity. This rapid increase and the potential for damaging ground shaking from induced earthquakes alarmed the roughly 8 million people living nearby and the officials responsible for public safety, who wanted to understand why earthquakes were increasing and what threats they posed to society and to buildings located nearby.

  14. Using the Charleston, SC earthquake of 1886 to develop new models for estimating future earthquake damage

    Science.gov (United States)

    Miner, Krystle Sunbow

    In 1886, a magnitude 7.3 earthquake occurred near Charleston, South Carolina. This earthquake produced extensive damage in the city and surrounding region. Because earthquake damage on the Charleston Peninsula is so well documented, there is an opportunity to use this information to develop better models of damage in future earthquakes. This project uses GIS and HAZUS to test the usefulness of new geological, geotechnical, and seismological data in describing actual building damage in 1886. A detailed catalog of building damage was mapped spatially in order to make comparisons to surface geology, elevation, fill depth, liquefaction susceptibility, soil type, and site amplitude within a small area at the southern tip of the Charleston Peninsula. Despite a large sample size of structures included in the area of interest, we do not have records for buildings in the most vulnerable regions of the study area. More information is needed to fill in the holes so that correlations can be resolved with confidence. Since the timing of an earthquake is unpredictable, preemptive preparations are needed so a city can best survive the earthquake. Therefore, the results of this study can be used to make city-planning recommendations and enhance community awareness programs so that Charleston can better prepare itself for the next large earthquake.

  15. Preliminary results on earthquake triggered landslides for the Haiti earthquake (January 2010)

    Science.gov (United States)

    van Westen, Cees; Gorum, Tolga

    2010-05-01

    This study presents the first results of an analysis of the landslides triggered by the Ms 7.0 Haiti earthquake that occurred on January 12, 2010 in the boundary region of the Caribbean plate and the North American plate. The fault is a left-lateral strike-slip fault with a clear surface expression. According to the USGS earthquake information, the Enriquillo-Plantain Garden fault system has not produced any major earthquake in the last 100 years; historical earthquakes are known from 1860, 1770, 1761, 1751, 1684, 1673, and 1618, though none of these has been confirmed in the field as associated with this fault. We used high-resolution satellite imagery available for the pre- and post-earthquake situations, which was made freely available for the response and rescue operations, and interpreted all co-seismic landslides in the epicentral area. We conclude that the earthquake mainly triggered landslides on the northern slope of the fault-related valley and in a number of isolated areas. The earthquake apparently did not trigger many visible landslides within the slum areas on the slopes in the southern part of Port-au-Prince and Carrefour. We also used ASTER DEM information to relate the landslide occurrences to DEM derivatives.

  16. Earthquake damage orientation to infer seismic parameters in archaeological sites and historical earthquakes

    Science.gov (United States)

    Martín-González, Fidel

    2018-01-01

    Studies that provide information on the seismic parameters and seismic sources of historical and archaeological seismic events are used to better evaluate the seismic hazard of a region. This is of special interest when no surface rupture is recorded or the seismogenic fault cannot be identified. The orientation pattern of the earthquake damage (ED) (e.g., fallen columns, dropped keystones) affecting architectonic elements of cities after earthquakes has traditionally been used in historical and archaeoseismological studies to infer seismic parameters. However, the parameters said to be obtainable are contradictory across the literature (the epicenter location, the orientation of the P-waves, the orientation of the compressional strain, and the fault kinematics have all been proposed), and some authors even question any relation with the earthquake damage. The earthquakes of Lorca in 2011, Christchurch in 2011 and Emilia Romagna in 2012 presented an opportunity to measure systematically a large number and wide variety of instances of earthquake damage in historical buildings (the same types of structures that are used in historical and archaeological studies). The damage pattern orientation was compared with modern instrumental data, which is not possible in historical and archaeoseismological studies. From measurements and quantification of the orientation patterns in the studied earthquakes, it is observed that there is a systematic earthquake damage orientation (EDO) in the proximity of the seismic source (fault trace): the EDO in these earthquakes is normal to the fault trend (±15°). This orientation can be generated by a pulse of motion that, in the near-fault region, has a distinguishable acceleration normal to the fault due to the polarization of the S-waves. Therefore, the earthquake damage orientation could be used to estimate the seismogenic fault trend in studies of historical earthquakes where no instrumental data are available.

  17. Temporal stress changes caused by earthquakes: A review

    Science.gov (United States)

    Hardebeck, Jeanne L.; Okada, Tomomi

    2018-01-01

    Earthquakes can change the stress field in the Earth’s lithosphere as they relieve and redistribute stress. Earthquake-induced stress changes have been observed as temporal rotations of the principal stress axes following major earthquakes in a variety of tectonic settings. The stress changes due to the 2011 Mw9.0 Tohoku-Oki, Japan, earthquake were particularly well documented. Earthquake stress rotations can inform our understanding of earthquake physics, most notably addressing the long-standing problem of whether the Earth’s crust at plate boundaries is “strong” or “weak.” Many of the observed stress rotations, including that due to the Tohoku-Oki earthquake, indicate near-complete stress drop in the mainshock. This implies low background differential stress, on the order of earthquake stress drop, supporting the weak crust model. Earthquake stress rotations can also be used to address other important geophysical questions, such as the level of crustal stress heterogeneity and the mechanisms of postseismic stress reloading. The quantitative interpretation of stress rotations is evolving from those based on simple analytical methods to those based on more sophisticated numerical modeling that can capture the spatial-temporal complexity of the earthquake stress changes.

  18. Rapid scenarios and observed intensities

    Directory of Open Access Journals (Sweden)

    Franco Pettenati

    2012-10-01

    Full Text Available After a destructive earthquake, national governments need to know the approximate amount of damage, the number of casualties, and the financial losses as soon as possible. Rapid scenarios are also used to inform the general public; see the widely used ShakeMap package [Wald et al. 1999, 2006] of the US Geological Survey (USGS) and the version modified by the Istituto Nazionale di Geofisica e Vulcanologia (INGV; National Institute of Geophysics and Volcanology), which is reproduced in Figure 1. The general matter of the use of intensities in damage scenarios was discussed in a special session at the 2008 Annual Meeting of the Seismological Society of America (http://www.seismosoc.org/meetings/2008/specialsessions.html), and was also discussed in the NIS-1 session of the European Congress in Moscow, in August 2012 (http://www.esc2012-moscow.org/esc_thematicareas.html). The purposes of the present report are to: (i) compare different types of intensities; (ii) check two rapid scenarios of intensity; and (iii) understand whether the KF formula [Sirovich 1996, Sirovich et al. 2009] can be used as a new 'attenuation' relationship to improve rapid scenarios. […]

  19. Matching time and spatial scales of rapid solidification: dynamic TEM experiments coupled to CALPHAD-informed phase-field simulations

    Science.gov (United States)

    Perron, Aurelien; Roehling, John D.; Turchi, Patrice E. A.; Fattebert, Jean-Luc; McKeown, Joseph T.

    2018-01-01

    A combination of dynamic transmission electron microscopy (DTEM) experiments and CALPHAD-informed phase-field simulations was used to study rapid solidification in Cu–Ni thin-film alloys. Experiments—conducted in the DTEM—consisted of in situ laser melting and determination of the solidification kinetics by monitoring the solid–liquid interface and the overall microstructure evolution (time-resolved measurements) during the solidification process. Modelling of the Cu–Ni alloy microstructure evolution was based on a phase-field model that included realistic Gibbs energies and diffusion coefficients from the CALPHAD framework (thermodynamic and mobility databases). DTEM and post-mortem experiments highlighted the formation of microsegregation-free columnar grains with interface velocities varying from ∼0.1 to ∼0.6 m/s. After an 'incubation' time, the velocity of the planar solid–liquid interface accelerated until solidification was complete. In addition, a decrease of the temperature gradient induced a decrease in the interface velocity. The modelling strategy permitted the simulation (in 1D and 2D) of the solidification process from the initially diffusion-controlled to the nearly partitionless regimes. Finally, results of DTEM experiments and phase-field simulations (grain morphology, solute distribution, and solid–liquid interface velocity) were consistent at similar time (μs) and spatial (μm) scales.
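For readers unfamiliar with the method, the essence of a phase-field solidification model can be conveyed with a drastically simplified 1-D sketch. This is not the paper's CALPHAD-coupled Cu–Ni model: the Allen-Cahn form, the constant driving force standing in for undercooling, and every parameter value are invented for illustration.

```python
# Minimal 1-D Allen-Cahn phase-field sketch of a moving solidification front.
# phi = 1 is solid, phi = 0 is liquid; all parameters are illustrative.
import numpy as np

nx, dx, dt = 200, 0.5, 0.01
M, W, kappa, drive = 1.0, 1.0, 1.0, 0.2   # mobility, barrier, gradient coeff, undercooling

phi = np.zeros(nx)
phi[:20] = 1.0                            # solid seed at the left

def front_position(phi):
    """x where phi first drops below 0.5 (the solid-liquid interface)."""
    return np.argmax(phi < 0.5) * dx

times, positions = [], []
for step in range(4000):
    lap = (np.roll(phi, 1) - 2 * phi + np.roll(phi, -1)) / dx**2
    dwell = 2 * W * phi * (1 - phi) * (1 - 2 * phi)  # d/dphi of W*phi^2*(1-phi)^2
    g = 6 * drive * phi * (1 - phi)                  # driving force, peaked at interface
    phi += dt * M * (kappa * lap - dwell + g)
    phi[0], phi[-1] = 1.0, 0.0                       # pin ends: solid left, liquid right
    if step % 400 == 0:
        times.append(step * dt)
        positions.append(front_position(phi))

# A roughly steady interface velocity emerges, the quantity the DTEM movies measure.
velocity = (positions[-1] - positions[1]) / (times[-1] - times[1])
print("interface velocity ~", round(velocity, 3))
```

The paper's model replaces the constant driving force with composition- and temperature-dependent Gibbs energies from CALPHAD databases and couples a solute diffusion equation, which is what allows it to reproduce the diffusion-controlled to partitionless transition.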

  20. Interpretation of earthquake-induced landslides triggered by the 12 May 2008, M7.9 Wenchuan earthquake in the Beichuan area, Sichuan Province, China using satellite imagery and Google Earth

    Science.gov (United States)

    Sato, H.P.; Harp, E.L.

    2009-01-01

    The 12 May 2008 M7.9 Wenchuan earthquake in the People's Republic of China represented a unique opportunity for the international community to use commonly available GIS (Geographic Information System) tools, like Google Earth (GE), to rapidly evaluate and assess landslide hazards triggered by the destructive earthquake and its aftershocks. In order to map earthquake-triggered landslides, we provide details on the applicability and limitations of the publicly available pre-earthquake and 3-day-post-earthquake imagery provided in GE from FORMOSAT-2 (formerly ROCSAT-2; Republic of China Satellite 2). We interpreted landslides on the 8-m-resolution FORMOSAT-2 image in GE; as a result, 257 large landslides were mapped, with the highest concentration along the Beichuan fault. An estimated density of 0.3 landslides/km2 represents a minimum bound given the resolution of the available imagery; higher-resolution data would have identified more landslides. This is a preliminary study, and further work is needed to understand the landslide characteristics in detail. Although it is best to obtain landslide locations and measurements from high-resolution satellite imagery, GE was found to be an effective and rapid reconnaissance tool. © 2009 Springer-Verlag.

  1. Did you feel it? : citizens contribute to earthquake science

    Science.gov (United States)

    Wald, David J.; Dewey, James W.

    2005-01-01

    Since the early 1990s, the magnitude and location of an earthquake have been available within minutes on the Internet. Now, as a result of work by the U.S. Geological Survey and with the cooperation of various regional seismic networks, people who experience an earthquake can go online and share information about its effects to help create a map of shaking intensities and damage. Such “Community Internet Intensity Maps” (CIIMs) contribute greatly toward the quick assessment of the scope of an earthquake emergency and provide valuable data for earthquake research.

  2. Real-time earthquake monitoring using a search engine method.

    Science.gov (United States)

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-04

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.
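The search-engine analogy can be illustrated at toy scale: precompute a database of forward-modeled waveforms tagged with their source parameters, then retrieve the parameters of the best-matching stored waveform instead of inverting from scratch. The forward model, parameter grid, and brute-force scan below are all invented for the sketch; the actual system indexes a very large seismogram database with a fast approximate search.

```python
# Conceptual "waveform search engine" at toy scale; NOT the authors' system.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 500)

def synthetic(depth_km, strike_deg):
    """Stand-in for a real forward model: waveform shape varies with parameters."""
    return np.sin(2 * np.pi * (0.5 + depth_km / 40) * t) * np.exp(-t / (1 + strike_deg / 90))

# Offline: build the database over a parameter grid, tagging each waveform.
database = [((d, s), synthetic(d, s)) for d in (5, 10, 20, 40) for s in (0, 90, 180, 270)]

def normalize(w):
    w = w - w.mean()
    return w / np.linalg.norm(w)

# Online: match the observed record against the stored waveforms by
# normalized correlation and return the winner's source parameters.
observed = synthetic(20, 90) + rng.normal(0, 0.05, t.size)   # true answer: (20, 90)
obs = normalize(observed)
best = max(database, key=lambda rec: float(np.dot(obs, normalize(rec[1]))))
print("retrieved parameters:", best[0])
```

The speedup reported in the abstract comes from replacing this linear scan with an indexed approximate search, which is what makes sub-second focal-mechanism retrieval feasible on a database far too large to scan exhaustively.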

  3. Earthquakes and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  4. Bam Earthquake in Iran

    CERN Multimedia

    2004-01-01

    Following their request for help from members of international organisations, the permanent Mission of the Islamic Republic of Iran has given the following bank account number, where you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  5. Demand surge following earthquakes

    Science.gov (United States)

    Olsen, Anna H.

    2012-01-01

    Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.

  6. The Future of USGS Earthquake Geodesy

    Science.gov (United States)

    Hudnut, K. W.; King, N. E.; Murray-Moraleda, J.; Roeloffs, E.; Zeng, Y.

    2008-05-01

    Earthquake geodesy, an important third prong of the USGS Earthquake Hazards Program (EHP) along with seismology and geology, is at a crossroads. Initiatives by NASA and NSF have built global and national geodetic arrays that promise to contribute greatly to the EHP mission of helping to reduce the nation's loss of life and property from earthquakes. These geodetic arrays pose great opportunities and challenges for USGS scientists, who now operate under the tight constraints of either a flat or, at best, a moderately increasing budget. While the availability of vast new data streams represents a great opportunity for the USGS, the challenge is how best to exploit them for risk mitigation and loss reduction. Geodetic data need to be fully embedded in the suite of USGS products, from the National Seismic Hazard Maps for long-term planning to ShakeMaps for rapid response. The USGS needs to be in a position to authoritatively review all geodetic data being collected nationwide (notably including those of the Plate Boundary Observatory) so that it can fulfill its Stafford Act responsibility of advising public officials on earthquake hazard issues in large urban areas and diverse geographic regions. Furthermore, the USGS has the mandate and liability protection required to take the lead on Earthquake Early Warning (EEW) system development and implementation, in which geodesy may provide vital independent measurement methods in real time, improving overall EEW system robustness.

  7. HAZGRIDX: earthquake forecasting model for ML≥ 5.0 earthquakes in Italy based on spatially smoothed seismicity

    Directory of Open Access Journals (Sweden)

    Aybige Akinci

    2010-11-01

    We present a five-year, time-independent earthquake-forecast model for earthquake magnitudes of 5.0 and greater in Italy using spatially smoothed seismicity data. The model, called HAZGRIDX, was developed on the assumption that future earthquakes will occur near the locations of historical earthquakes; it does not take into account any information from tectonic, geological or geodetic data. HAZGRIDX is thus based on observed earthquake occurrence alone, without any physical model. In the present study, we calculate earthquake rates on a spatial grid using two declustered catalogues: (1) the parametric catalogue of Italian earthquakes (Catalogo Parametrico dei Terremoti Italiani, CPTI04), which contains the larger earthquakes from MW 7.0 since 1100; and (2) the Italian seismicity catalogue (Catalogo della Sismicità Italiana, CSI 1.1), which contains the small earthquakes down to ML 1.0, with a maximum of ML 5.9, over the past 22 years (1981-2003). The model assumes that earthquake magnitudes follow the Gutenberg-Richter law with a uniform b-value. The forecast rates are presented as the expected numbers of ML > 5.0 events per year in each grid cell of about 10 km × 10 km. The final map is derived by averaging the earthquake potentials that come from the two catalogues, CPTI04 and CSI 1.1. We also describe earthquake occurrence in terms of the probability of one event within a specified magnitude bin, ΔM = 0.1, in a five-year period. HAZGRIDX is one of several forecasting models, scaled to five and ten years, that have been submitted to the Collaboratory for the Study of Earthquake Predictability (CSEP) forecasting centre at ETH Zurich to be tested for Italy.
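    The smoothed-seismicity idea behind HAZGRIDX — smear catalogued epicentres onto grid cells with a kernel, then scale down to the target magnitude with the Gutenberg-Richter law — can be sketched minimally as below. The grid, the Gaussian kernel width, the b-value and the catalogue threshold are all illustrative assumptions, not values from the paper.

```python
import math

# Illustrative sketch (not the published HAZGRIDX code): event counts are
# smeared onto grid cells with a Gaussian kernel, and the rate of
# M >= 5.0 events follows from the Gutenberg-Richter relation with a
# uniform (assumed) b-value.

B_VALUE = 1.0            # assumed uniform Gutenberg-Richter b-value
SIGMA_KM = 20.0          # assumed smoothing length
M_MIN_CAT, M_TARGET = 3.0, 5.0

def smoothed_counts(events, cells):
    """events, cells: lists of (x_km, y_km) -> smoothed event weight per cell."""
    out = []
    for cx, cy in cells:
        w = sum(math.exp(-((ex - cx) ** 2 + (ey - cy) ** 2) / (2 * SIGMA_KM ** 2))
                for ex, ey in events)
        out.append(w)
    return out

def rate_above_target(rate_above_cat):
    """Scale an M >= M_MIN_CAT rate down to M >= M_TARGET via Gutenberg-Richter."""
    return rate_above_cat * 10 ** (-B_VALUE * (M_TARGET - M_MIN_CAT))

# Toy catalogue: two nearby events and one distant event, two grid cells.
events = [(0, 0), (5, 5), (100, 100)]
cells = [(0, 0), (100, 100)]
rates = [rate_above_target(c) for c in smoothed_counts(events, cells)]
```

    The cell near the event cluster ends up with the higher forecast rate, which is the core of the "future earthquakes occur near past earthquakes" assumption.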

  8. The USGS Earthquake Scenario Project

    Science.gov (United States)

    Wald, D. J.; Petersen, M. D.; Wald, L. A.; Frankel, A. D.; Quitoriano, V. R.; Lin, K.; Luco, N.; Mathias, S.; Bausch, D.

    2009-12-01

    delivered via the EHP web pages in a user-searchable archive. In addition, we aim to duplicate most of the real-time earthquake event web page functionality for scenario drills and exercises, including all standard post-earthquake information tools. Hence, for each event, USGS PAGER runs will be produced, providing population exposure at current population levels, and Federal Emergency Management Agency (FEMA) will produce HAZUS impact assessments. Anticipated users include FEMA, the loss modeling and insurance communities, emergency responders and mitigation planners (city, county, state, industry, utilities, corporate), the general public and the media. The Earthquake Scenario Project will also take on several pending scientific challenges related to scenario generation, including ways to include fault directivity, numerical ground motions, and ways to produce ground motion uncertainties (in addition to median peak ground motions). A parallel though less comprehensive effort is underway to produce scenarios for targeted regions and events around the globe.

  9. The earthquake/seismic risk, vulnerability and capacity profile for ...

    African Journals Online (AJOL)

    The study was carried out to understand the risks posed by earthquakes in Karonga based on roles and perception of stakeholders. Information was collected from several stakeholders who were found responding to earthquakes impacts in Karonga Town. The study found that several stakeholders, governmental and ...

  10. Napa earthquake: An earthquake in a highly connected world

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

    The Napa earthquake recently occurred close to Silicon Valley, making it a good candidate for studying what social networks, wearable objects and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the ratio of people publishing tweets, and the ratio of people visiting the EMSC (European-Mediterranean Seismological Centre) real-time information website, in the first minutes following the earthquake with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all three methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, as was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves, and that 2 minutes of website traffic is sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare these with the publication times of messages on Twitter. Finally, we check whether the numbers of tweets and of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together, these results will tell us whether the reaction of eyewitnesses to ground shaking as observed through Twitter and EMSC website analysis is tool-specific (i.e. specific to Twitter or the EMSC website) or whether it reflects people's actual reactions.

  11. Future of Earthquake Early Warning: Quantifying Uncertainty and Making Fast Automated Decisions for Applications

    Science.gov (United States)

    Wu, Stephen

    Earthquake early warning (EEW) systems have developed rapidly over the past decade. The Japan Meteorological Agency (JMA) operated an EEW system during the 2011 M9 Tohoku earthquake in Japan, which increased awareness of EEW systems around the world. While longer-term earthquake prediction still faces many challenges to becoming practical, the availability of short-term EEW opens a new door for earthquake loss mitigation. After an earthquake fault begins rupturing, an EEW system uses the first few seconds of recorded seismic waveform data to quickly predict the hypocentre location, magnitude, origin time and the expected shaking intensity around the region. This early warning information is broadcast to different sites before the strong shaking arrives. The warning lead time of such a system is short, typically a few seconds to a minute or so, and the information is uncertain. These factors limit human intervention in activating mitigation actions, and this must be addressed for engineering applications of EEW. This study applies a Bayesian probabilistic approach, along with machine learning techniques and decision theory from economics, to improve different aspects of EEW operation, including extending it to engineering applications. Existing EEW systems are often based on a deterministic approach and often assume that only a single event occurs within a short period of time, an assumption that led to many false alarms after the Tohoku earthquake in Japan. This study develops a probability-based EEW algorithm, built on an existing deterministic model, that extends the EEW system to the case of concurrent events, which are often observed during the aftershock sequence after a large earthquake. To overcome the challenges of uncertain information and short lead times, this study also develops an earthquake probability-based automated decision-making (ePAD) framework to make robust decisions for EEW mitigation applications. A cost-benefit model that
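    A cost-benefit decision rule of the kind an ePAD-style framework formalizes can be sketched as below. The binary shaking model, the cost and loss figures, and the function names are invented for illustration and are not taken from the thesis.

```python
# Hedged sketch of a probabilistic early-warning decision rule: given an
# estimated probability of strong shaking at a site, trigger a mitigation
# action (e.g. slowing a train) when the expected loss avoided exceeds
# the fixed cost of acting. All numbers are illustrative assumptions.

ACTION_COST = 1.0            # assumed cost of triggering the action
LOSS_IF_UNMITIGATED = 20.0   # assumed loss if strong shaking hits with no action

def expected_benefit(p_strong_shaking):
    """Expected loss avoided by acting, under a binary shaking model."""
    return p_strong_shaking * LOSS_IF_UNMITIGATED

def should_act(p_strong_shaking):
    """Act when the expected benefit of mitigation outweighs its cost."""
    return expected_benefit(p_strong_shaking) > ACTION_COST

# With these numbers the break-even probability is 1/20 = 0.05, so a 10%
# shaking probability triggers the action while a 1% probability does not.
decisions = (should_act(0.10), should_act(0.01))
```

    The point of making the rule probabilistic is exactly the situation described above: the warning is uncertain and arrives seconds before shaking, so the threshold must be computed automatically rather than left to human judgment.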

  12. Ultra Low Frequency (ULF) European multi-station magnetic field analysis before and during the 2009 earthquake at L'Aquila regarding regional geotechnical information

    Directory of Open Access Journals (Sweden)

    G. Prattes

    2011-07-01

    This work presents ground-based Ultra Low Frequency (ULF) magnetic field measurements in the frequency range of 10–15 mHz from 1 January 2008 to 14 April 2009. In this period a strong earthquake series hit the Italian Abruzzo region around L'Aquila, with the main shock of magnitude M = 6.3 on 6 April 2009. In the frame of the South European Geomagnetic Array (SEGMA), a European collaboration runs ULF fluxgate instruments providing continuous magnetic field data recorded in mid- and southern Europe. The main scientific objective is the investigation of signal variations due to seismic activity and their discrimination from other natural and human influences. The SEGMA station closest to the L'Aquila earthquake epicentre is the L'Aquila observatory, located in the epicentral region. For the scientific analysis we extract the nighttime period from 22:00–02:00 UT and determine the power spectral density (PSD) of the horizontal (H) and vertical (Z) magnetic field components and the standardized polarization ratio Z/H. To discriminate local emissions from global geomagnetic effects, data from three SEGMA stations at distances of up to 630 km from the epicentral region are analysed and further compared to the independent global geomagnetic Σ Kp index. Apart from indirect ionospheric effects, electromagnetic noise could originate in the lithosphere due to tectonic mechanisms in the earthquake focus. To estimate the amplitude of assumed lithospheric electromagnetic noise emissions causing anomalies in the PSD of the Z component, we consider magnetotelluric calculations of the electric crust conductivity in the L'Aquila region. Results found at the L'Aquila observatory are interpreted with respect to the lithospheric electrical conductivity in the local observatory region, the Σ Kp index, and further in a multi-station analysis. Possible seismic related ULF
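    The polarization-ratio diagnostic described above (PSD of the Z component over PSD of H in the 10–15 mHz band) can be sketched as follows. The naive DFT periodogram, the sampling interval and the exact band edges are illustrative assumptions, not the SEGMA processing chain.

```python
import math

# Minimal sketch of a band-limited polarization ratio: compute the
# periodogram power of the vertical (Z) and horizontal (H) components in
# a narrow ULF band via a naive DFT and form the ratio Z/H.

def band_power(signal, dt, f_lo, f_hi):
    """Summed periodogram power of `signal` between f_lo and f_hi (Hz)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k / (n * dt)
        if f_lo <= f <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            power += (re * re + im * im) / n
    return power

def polarization_ratio(z, h, dt, f_lo=0.010, f_hi=0.015):
    """Standardized Z/H power ratio in the (assumed) 10-15 mHz band."""
    return band_power(z, dt, f_lo, f_hi) / band_power(h, dt, f_lo, f_hi)
```

    An anomalous rise in this ratio, uncorrelated with the global Σ Kp index, is the kind of signature the study looks for; a doubled Z amplitude at a band frequency quadruples the ratio.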

  13. Focal Mechanism of a Catastrophic Earthquake of the Last Rococo Period (1783) in Southern Italy Retrieved by Inverting Historical Information on Damage

    Science.gov (United States)

    Sirovich, L.; Pettenati, F.

    2007-05-01

    Using geophysical inversion to discover the fault source of a blind earthquake that took place before the invention of the seismograph seemed impossible. We demonstrated that it is sometimes possible using our simplified KF model (Sirovich, 1996) through automatic genetic inversion (Gentile et al., 2004 in BSSA; Sirovich and Pettenati, 2004 in JGR), and confirmed this by treating the Coalinga 1983, Loma Prieta 1989 and Northridge 1994 earthquakes (Pettenati and Sirovich, 2007 in BSSA). KF is able to simulate the body-wave radiation from a linear source, and eleven source parameters are retrieved: the three nucleation coordinates, the fault-plane solution, the seismic moment, the rupture velocities and lengths along-strike and anti-strike, and the shear wave velocity in the half-space. To find the minima on the hypersurface of the residuals in the multi-parameter model space, we use a genetic process with niching, since we have already shown that the problem is bimodal for pure dip-slip mechanisms. The objective function of the nonlinear inversion is the sum of the squared residuals (calculated-minus-observed intensity at all sites). Here, we use the very good intensity data provided in the MCS scale by the INGV of Italy for the M 6.9 earthquake of Feb. 5, 1783 (see the Italian intensity data bank at http://emidius.mi.ingv.it/DOM/consultazione.html). The data for 1783 were compiled by seismologists and historians who interpreted the reports of the time and many other historical sources. Given the limitations of the KF approach, we limited our inversion to a square area of 200 by 200 km around the most heavily damaged zone. 341 surveyed towns and hamlets were assigned intensity degrees by INGV (we discarded 6 of them as statistical outliers according to the classical Chauvenet method). Thus, 335 data were inverted. The match between experimental and synthetic isoseismals is noteworthy. The mechanism found is almost pure dip-slip and, thus, the problem is

  14. Bridging the gap between science and practice in seismology -Lessons from the 2011 Tohoku-oki earthquake-

    Science.gov (United States)

    Kanamori, H.

    2011-12-01

    , very close to the trench. The slip in this zone is relatively slow without strong excitation of short-period seismic waves, a characteristic of tsunami earthquakes. In contrast, strong short-period radiation occurred from a down-dip portion of the mega-thrust. Tomographic studies using the data both before and after the 2011 earthquake indicate a high-velocity structure which coincides approximately with the zone of large slip near the trench. In retrospect, we believe that our understanding of the overall geophysical framework of the region is correct, and the seismological research in this region has been in the right direction, but the inevitable lack of information on the details of plate boundary structure near the trench prevented us from recognizing the severity of the threat. The efforts to utilize the research products in practice need to be significantly enhanced to protect our society from this type of rare and devastating event. Seismology has made a good progress in determining what has happened very rapidly, and can provide a good geophysical framework. Thus, an effective approach to hazard mitigation is to take advantage of these strengths of seismology. Unfortunately, seismology cannot make precise predictions of what will happen on short time scales, at least at present.

  15. Using remote sensing to predict earthquake impacts

    Science.gov (United States)

    Fylaktos, Asimakis; Yfantidou, Anastasia

    2017-09-01

    Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), leading to substantial loss of life and extensive damage to property, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel-1A and optical imagery from Landsat 8 to examine differences in topography with the aid of a multi-temporal monitoring technique, which is well suited to observing surface deformation. The database clusters information on the consequences of the earthquakes into groups, such as property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformations, among others, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of such earthquakes. In the future, we can enrich this database with more regions and broaden its applications; for instance, we could predict the future impacts of any type of earthquake in several areas and design a preliminary emergency model for immediate evacuation and quick recovery response. It is important to know how the surface moves, particularly in geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.

  16. The Great East Japan Earthquake Disaster and cardiovascular diseases.

    Science.gov (United States)

    Aoki, Tatsuo; Fukumoto, Yoshihiro; Yasuda, Satoshi; Sakata, Yasuhiko; Ito, Kenta; Takahashi, Jun; Miyata, Satoshi; Tsuji, Ichiro; Shimokawa, Hiroaki

    2012-11-01

    While previous studies reported a short-term increase in individual cardiovascular diseases (CVD) after great earthquakes, the mid-term occurrence of all types of CVD after great earthquakes is unknown. We addressed this important issue in our experience with the Great East Japan Earthquake (11 March 2011). We retrospectively examined the impact of the Earthquake on the occurrences of CVD and pneumonia by comparing the ambulance records made by doctors in our Miyagi Prefecture, the centre of the disaster area, during the periods of 2008-11 (n = 124,152). The weekly occurrences of CVDs, including heart failure (HF), acute coronary syndrome (ACS), stroke, cardiopulmonary arrest (CPA), and pneumonia were all significantly increased after the Earthquake compared with the previous 3 years. The occurrences of ACS and CPA showed a rapid increase followed by a sharp decline, whereas those of HF and pneumonia showed a prolonged increase for more than 6 weeks, and those of stroke and CPA showed a second peak after the largest aftershock (7 April 2011). Furthermore, the occurrence of CPA was increased in the first 24 h after the Earthquake, followed by other diseases later on. These increases were independent of age, sex, or residence area (seacoast vs. inland). These results indicate that the occurrences of all types of CVD and pneumonia increased, with somewhat different time courses, after the Earthquake, including the first observation of a marked and prolonged increase in HF, emphasizing the importance of intensive medical management of all types of CVD after great earthquakes.

  17. Toward tsunami early warning system in Indonesia by using rapid rupture durations estimation

    Energy Technology Data Exchange (ETDEWEB)

    Madlazim [Physics Department, Faculty Mathematics and Sciences of Surabaya State University (UNESA) Jl. Ketintang, Surabaya 60231 (Indonesia)

    2012-06-20

    Indonesia has had the Indonesian Tsunami Early Warning System (Ina-TEWS) since 2008. Ina-TEWS uses automatic processing of the hypocentre and of Mwp, Mw (mB) and Mj. If an earthquake occurs in the ocean at a depth < 70 km with magnitude > 7, Ina-TEWS announces an early warning that the earthquake may generate a tsunami. However, these announcements are still not accurate. The purpose of this research is to estimate the rupture durations of large Indonesian earthquakes that occurred in the Indian Ocean, Java Sea, Timor Sea, Banda Sea, Arafura Sea and Pacific Ocean. We analysed at least 330 vertical seismograms recorded by the IRIS-DMC network using a direct procedure for rapid assessment of earthquake tsunami potential based on simple measures of the P wave on vertical velocity records, in particular the high-frequency apparent rupture duration, Tdur. Tdur can be related to the critical parameters rupture length (L), depth (z) and shear modulus (μ), and may also be related to width (W), slip (D), z or μ. Our analysis shows that rupture duration has a stronger influence on tsunami generation than Mw and depth. Rupture duration gives more information on tsunami impact, Mo/μ, depth and size than Mw and other currently used discriminants. The longer the rupture duration, the shallower the earthquake source. For rupture durations greater than 50 s, the depth is less than 50 km, Mw is greater than 7, and the rupture length is longer, because Tdur is proportional to L and to a greater Mo/μ (Mo/μ being proportional to L). So rupture duration carries information on all four parameters. We also suggest that tsunami potential is not directly related to the faulting type of the source, and that events with rupture durations greater than 50 s generated tsunamis. With real-time seismogram data available, rapid calculation, rupture duration discriminant
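    A simplified rupture-duration discriminant in the spirit of the procedure above might look like this. The envelope threshold, the 50 s flag and the helper names are assumptions for illustration, not the exact measures used in the study.

```python
# Illustrative sketch: estimate the apparent rupture duration T_dur as
# the time over which the high-frequency P-wave envelope stays above a
# fraction of its peak, and flag tsunami potential when T_dur > 50 s.
# Threshold and flag level are assumed values, not from the paper.

THRESHOLD_FRACTION = 0.2   # assumed envelope threshold
TSUNAMI_TDUR_S = 50.0      # duration above which events are flagged

def apparent_duration(envelope, dt):
    """Duration (s) between first and last envelope sample above threshold."""
    peak = max(envelope)
    above = [i for i, a in enumerate(envelope) if a >= THRESHOLD_FRACTION * peak]
    return (above[-1] - above[0]) * dt if above else 0.0

def tsunami_flag(envelope, dt):
    """True when the apparent rupture duration exceeds the 50 s criterion."""
    return apparent_duration(envelope, dt) > TSUNAMI_TDUR_S

# Toy envelope sampled at 1 s: 80 samples above threshold, spanning 79 s.
env = [0.0] * 10 + [1.0] * 80 + [0.0] * 10
```

    A long duration at modest amplitude is exactly the signature of the slow, shallow "tsunami earthquakes" the discriminant is meant to catch, which Mw alone can miss.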

  18. Earthquakes and emergence

    Science.gov (United States)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  19. Prospective testing of Coulomb short-term earthquake forecasts

    Science.gov (United States)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary responses. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily and allows daily updates of the models. However, a lot can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough, and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns and earthquake probabilities would have to be computed without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of
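    The kind of daily evaluation CSEP performs can be illustrated with a Poisson likelihood comparison of two gridded rate forecasts against observed counts; the rates, counts and model names below are invented for the sketch and do not represent any submitted forecast.

```python
import math

# Sketch of a CSEP-style likelihood comparison: each model forecasts an
# expected number of earthquakes per grid cell for the test period; the
# observed counts are scored with a Poisson log-likelihood, and the
# higher-scoring model is preferred. All numbers are illustrative.

def poisson_loglike(rates, observed):
    """Sum of log Poisson probabilities of observed counts given forecast rates."""
    return sum(n * math.log(lam) - lam - math.lgamma(n + 1)
               for lam, n in zip(rates, observed))

# Toy 3-cell forecasts: a stress-based model concentrating rate where
# triggering is expected, versus a uniform statistical baseline.
coulomb_model = [0.5, 2.0, 0.1]
statistical_model = [1.0, 1.0, 1.0]
observed = [0, 2, 0]

scores = {
    "coulomb": poisson_loglike(coulomb_model, observed),
    "statistical": poisson_loglike(statistical_model, observed),
}
```

    Here the concentrated forecast scores higher because the events fell where it put the rate; prospective testing repeats this scoring on data collected only after the forecast was frozen.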

  20. The impacts of climate change on poverty in 2030, and the potential from rapid, inclusive and climate-informed development

    Science.gov (United States)

    Rozenberg, J.; Hallegatte, S.

    2016-12-01

    There is a consensus on the fact that poor people are more vulnerable to climate change than the rest of the population, but, until recently, few quantified estimates had been proposed and few frameworks existed for designing policies to address the issue. In this paper, we analyze the impacts of climate change on poverty using micro-simulation approaches. We start from household surveys that describe the current distribution of income and occupations; we then project these households into the future and look at the impacts of climate change on people's income. To project households into the future, we explore a large range of assumptions on future demographic changes (including education), technological changes and socio-economic trends (including redistribution policies). This approach allows us to identify the main combinations of factors that lead to fast poverty reduction, and those that lead to high climate change impacts on the poor. Identifying these factors is critical for designing efficient policies to protect the poorest from climate change impacts and for making economic growth more inclusive. Conclusions are twofold. First, by 2030 climate change can have a large impact on poverty, with between 3 and 122 million more people in poverty, but climate change remains a secondary driver of poverty trends within this time horizon. Climate change impacts do not only affect the poorest: in 2030, the bottom 40 percent lose more than 4 percent of income in many countries. The regional hotspots are Sub-Saharan Africa and, to a lesser extent, India and the rest of South Asia. The most important channel through which climate change increases poverty is agricultural income and food prices. Second, by 2030 and in the absence of surprises on climate impacts, inclusive climate-informed development can prevent most (but not all) of the impacts on poverty. In a scenario with rapid, inclusive and climate-proof development, climate change impact on poverty is

  1. Study of Earthquake Disaster Prediction System of Langfang city Based on GIS

    Science.gov (United States)

    Huang, Meng; Zhang, Dian; Li, Pan; Zhang, YunHui; Zhang, RuoFei

    2017-07-01

    In this paper, in view of China's need to improve its earthquake disaster prevention capability, we put forward an implementation plan for a GIS-based earthquake disaster prediction system for Langfang city. Based on a GIS spatial database, coordinate transformation technology, GIS spatial analysis technology and PHP development technology, a seismic damage factor algorithm is used to predict damage to the city under earthquake disasters of different intensities. The system adopts a B/S (browser/server) architecture and provides two-dimensional visualization of damage degree and spatial distribution, comprehensive query and analysis, and efficient decision-support functions to identify seismically weak areas of the city and enable rapid warning. The system has transformed the city's earthquake disaster reduction work from static planning to dynamic management, and improved the city's earthquake and disaster prevention capability.

  2. The earthquake disaster risk characteristic and the problem in the earthquake emergency rescue of mountainous southwestern Sichuan

    Science.gov (United States)

    Yuan, S.; Xin, C.; Ying, Z.

    2016-12-01

    In recent years, earthquake disasters have occurred frequently in the Chinese mainland, and the secondary disasters they cause are especially serious in mountainous regions. Because of the influence of terrain and geological conditions, the difficulty of earthquake emergency rescue work increases greatly, and rescue forces are stretched. Yet earthquake emergency rescue in mountainous regions has been little studied, and whether existing equipment can meet the actual needs of local earthquake emergency rescue is poorly understood. This paper intends to discuss and address these problems. Through field research in the mountainous Ganzi and Liangshan prefectures of Sichuan, we investigated the earthquake emergency response process and rescue force deployment plans after an earthquake, and collected and collated basic data on local rescue forces. By consulting experts and statistically analysing the basic data, we identified two main problems. The first concerns local rescue forces: they are poorly equipped and lack knowledge of medical aid and of identifying building structures; the state has not established a sound financial investment and protection mechanism; and rescue equipment updates and maintenance are lacking. The second concerns the earthquake emergency rescue process: in the complicated geological conditions of mountainous regions, traffic and communications may be interrupted by landslides and debris flows after an earthquake; outside rescue forces may not arrive in time, so rescue equipment must be transported by manpower; and because earthquake disaster information is unknown, local rescue forces may be deployed unreasonably. In view of the above, local governments should analyse the characteristics of earthquake disasters in mountainous regions and study how to improve their earthquake emergency rescue capability. We suggest this can be done by strengthening and regulating the rescue force structure, enhancing skills and knowledge, training rescue workers

  3. Real-Time Earthquake Intensity Estimation Using Streaming Data Analysis of Social and Physical Sensors

    Science.gov (United States)

    Kropivnitskaya, Yelena; Tiampo, Kristy F.; Qin, Jinhui; Bauer, Michael A.

    2017-06-01

    Earthquake intensity is one of the key components of the decision-making process for disaster response and emergency services. Accurate and rapid intensity calculations can help to reduce total loss and the number of casualties after an earthquake. Modern intensity assessment procedures handle a variety of information sources, which can be divided into two main categories. The first type of data is that derived from physical sensors, such as seismographs and accelerometers, while the second type consists of data obtained from social sensors, such as witness observations of the consequences of the earthquake itself. Estimation approaches using additional data sources or that combine sources from both data types tend to increase intensity uncertainty due to human factors and inadequate procedures for temporal and spatial estimation, resulting in precision errors in both time and space. Here we present a processing approach for the real-time analysis of streams of data from both source types. The physical sensor data is acquired from the U.S. Geological Survey (USGS) seismic network in California and the social sensor data is based on Twitter user observations. First, empirical relationships between tweet rate and observed Modified Mercalli Intensity (MMI) are developed using data from the M6.0 South Napa, CA, earthquake that occurred on August 24, 2014. Second, the streams of both data types are analyzed together in simulated real-time to produce one intensity map. The implementation is based on IBM InfoSphere Streams, a cloud platform for real-time analytics of big data. To handle large processing workloads for data from various sources, it is deployed and run on a cloud-based cluster of virtual machines. We compare the quality and evolution of intensity maps from different data sources over 10-min time intervals immediately following the earthquake.
Results from the joint analysis show that it provides more complete coverage, with better accuracy and higher
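
The tweet-rate-to-intensity step described in this abstract can be sketched as follows; the coefficients A and B, the logarithmic form, and the fusion weight are illustrative assumptions, not the values fitted to the South Napa data:

```python
import math

# Hypothetical empirical relation (A and B are illustrative, not the coefficients
# fitted to the South Napa data): MMI ~ A + B * log10(tweet rate).
A, B = 2.0, 3.5

def mmi_from_tweet_rate(tweets_per_min: float) -> float:
    """Estimate Modified Mercalli Intensity from a local tweet rate."""
    return A + B * math.log10(max(tweets_per_min, 1.0))

def fused_intensity(sensor_mmi: float, tweet_mmi: float, w_sensor: float = 0.7) -> float:
    """Weighted fusion of physical-sensor and social-sensor intensity estimates."""
    return w_sensor * sensor_mmi + (1.0 - w_sensor) * tweet_mmi

print(round(fused_intensity(6.0, mmi_from_tweet_rate(100.0)), 2))  # 6.9
```

Fusing both source types per map cell, rather than mapping each stream separately, is what lets the joint analysis fill coverage gaps left by sparse physical stations.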

  4. Injection-induced earthquakes

    National Research Council Canada - National Science Library

    Ellsworth, William L

    2013-01-01

    ...s. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids...

  5. 1988 Spitak Earthquake Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 1988 Spitak Earthquake database is an extensive collection of geophysical and geological data, maps, charts, images and descriptive text pertaining to the...

  6. Earthquakes and plate tectonics

    Science.gov (United States)

    Spall, H.

    1977-01-01

    The world's earthquakes are not randomly distributed over the Earth's surface. They tend to be concentrated in narrow zones. Why is this? And why are volcanoes and mountain ranges also found in these zones?

  7. Tweet Earthquake Dispatch (TED)

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The USGS is offering earthquake alerts via two twitter accounts: @USGSted and @USGSBigQuakes. On average, @USGSted and @USGSBigQuakes will produce about one tweet...

  8. Earthquake Damage to Schools

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This set of slides graphically illustrates the potential danger that major earthquakes pose to school structures and to the children and adults who happen to be...

  9. Earthquake Ground Motion Selection

    Science.gov (United States)

    2012-05-01

    Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

  10. Development of the U.S. Geological Survey's PAGER system (Prompt Assessment of Global Earthquakes for Response)

    Science.gov (United States)

    Wald, D.J.; Earle, P.S.; Allen, T.I.; Jaiswal, K.; Porter, K.; Hearne, M.

    2008-01-01

    The Prompt Assessment of Global Earthquakes for Response (PAGER) System plays a primary alerting role for global earthquake disasters as part of the U.S. Geological Survey’s (USGS) response protocol. We provide an overview of the PAGER system, both of its current capabilities and our ongoing research and development. PAGER monitors the USGS’s near real-time U.S. and global earthquake origins and automatically identifies events that are of societal importance, well in advance of ground-truth or news accounts. Current PAGER notifications and Web pages estimate the population exposed to each seismic intensity level. In addition to being a useful indicator of potential impact, PAGER’s intensity/exposure display provides a new standard in the dissemination of rapid earthquake information. We are currently developing and testing a more comprehensive alert system that will include casualty estimates. This is motivated by the idea that an estimated range of the possible number of deaths will aid in decisions regarding humanitarian response. Underlying the PAGER exposure and loss models are global earthquake ShakeMap shaking estimates, constrained as quickly as possible by finite-fault modeling and observed ground motions and intensities, when available. Loss modeling is being developed comprehensively with a suite of candidate models that range from fully empirical to largely analytical approaches. Which of these models is most appropriate for use in a particular earthquake depends on how much is known about local building stocks and their vulnerabilities. A first-order country-specific global building inventory has been developed, as have corresponding vulnerability functions. For calibrating PAGER loss models, we have systematically generated an Atlas of 5,000 ShakeMaps for significant global earthquakes during the last 36 years. For many of these, auxiliary earthquake source and shaking intensity data are also available. Refinements to the loss models are ongoing.

  11. Lessons learned from the Japan earthquake and tsunami, 2011.

    Science.gov (United States)

    Fuse, Akira; Yokota, Hiroyuki

    2012-01-01

    On March 11, 2011, an earthquake occurred off the coast of Honshu, Japan. The quake was followed by a powerful tsunami that caused extensive damage to the east coast of the Tohoku and Kanto regions. This disaster destroyed the medical system in place and thus drastically reduced the ability of the healthcare system to handle the large number of casualties. During the initial response to this disaster, we participated in several types of outreach medical relief teams dispatched to the affected area from the day of the earthquake onwards. The ratio of persons injured to persons missing or dead for the 2011 Japan disaster (0.31; 5,994 to 19,371) was much lower than for the Indian Ocean Tsunami of 2004 in Thailand (1.01; 8,457 to 8,393) and for the Great Hanshin-Awaji Earthquake of 1995 in Japan (6.80; 43,792 to 6,437). The different ratios for the different types of disasters indicate that medical relief efforts in response to natural disasters should be tailored to the type of disaster to optimize the effectiveness of the response and prevent further deaths. From a medical viewpoint, unnecessary deaths must be prevented following natural disasters. Doing so requires appropriate information transmission and an understanding of the mission's overall and specific objectives: 1) rapid search and rescue; 2) early care in the field, evacuation centers, and primary clinics; 3) definitive evaluation at disaster base hospitals; and 4) proper evacuation to unaffected areas. We propose a descriptive device that can guide headquarters in dealing with the commonalities of a disaster.

  12. Seismology: energy radiation from the Sumatra earthquake.

    Science.gov (United States)

    Ni, Sidao; Kanamori, Hiroo; Helmberger, Don

    2005-03-31

    We determined the duration of high-frequency energy radiation from Indonesia's great Sumatra-Andaman earthquake (26 December 2004) to be about 500 seconds. This duration can be translated into a rupture length of about 1,200 km, which is more than twice as long as that inferred from body-wave analyses performed soon after the event. Our analysis was able rapidly to define the extent of rupture, thereby aiding the assessment of seismic hazard in the immediate future.
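
The duration-to-length conversion reported here is, to first order, a multiplication of the high-frequency radiation duration by an assumed average rupture velocity; the quoted ~500 s duration and ~1,200 km length together imply a velocity near 2.4 km/s:

```python
def rupture_length_km(duration_s: float, rupture_velocity_km_s: float) -> float:
    """First-order estimate for unilateral rupture: length = duration x velocity."""
    return duration_s * rupture_velocity_km_s

# The reported ~500 s duration and ~1,200 km length imply v_r ~ 2.4 km/s.
print(rupture_length_km(500.0, 2.4))  # 1200.0
```

This is only the back-of-the-envelope form of the estimate; the paper's analysis constrains the duration itself from high-frequency seismograms.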

  13. Fault roughness and strength heterogeneity control earthquake size and stress drop

    KAUST Repository

    Zielke, Olaf

    2017-01-13

    An earthquake's stress drop is related to the frictional breakdown during sliding and constitutes a fundamental quantity of the rupture process. High-speed laboratory friction experiments that emulate the rupture process imply stress drop values that greatly exceed those commonly reported for natural earthquakes. We hypothesize that this stress drop discrepancy is due to fault-surface roughness and strength heterogeneity: an earthquake's moment release and its recurrence probability depend not only on stress drop and rupture dimension but also on the geometric roughness of the ruptured fault and the location of failing strength asperities along it. Using large-scale numerical simulations for earthquake ruptures under varying roughness and strength conditions, we verify our hypothesis, showing that smoother faults may generate larger earthquakes than rougher faults under identical tectonic loading conditions. We further discuss the potential impact of fault roughness on earthquake recurrence probability. This finding also provides important information for seismic hazard analysis.

  14. Rupture Characteristics and Aftershocks of the July 15, 2003 Carlsberg `H' (Indian Ocean) Mw 7.6 Earthquake

    Science.gov (United States)

    Antolik, M.; Abercrombie, R. E.; Pan, J.; Ekstrom, G.

    2003-12-01

    The occurrence of a Mw 7.6 earthquake near the Carlsberg ridge (15 July, 2003) provides valuable information about earthquake rupture processes in oceanic lithosphere, which are not well understood, and the distributed deformation of the India-Australia plate. The earthquake had a strike-slip mechanism opposite to that of the transform faults on the ridge, and appears to have ruptured a fracture zone (designated as `H' by Royer et al., 1997) within the India-Australia composite plate. We examine the rupture characteristics of this earthquake using the full spectrum of seismic radiation. Inversion of the body waves indicates rapid rupture propagation toward the NE, away from the Carlsberg Ridge, for a distance of ˜200 km. The average rupture velocity is well constrained and is ˜3.6 km s-1. The total source duration is ˜60 s; however, nearly all of the moment release occurs in the last 30 s. The age of the lithosphere in the area of largest moment release is 10-15 Ma. The body waves can be well fit with a simple rupture model and no jump of fracture zones is required, as has been suggested for some oceanic earthquakes (e.g., McGuire et al., 1996). The source process is very similar to the well-studied 1994 Mw 7.0 earthquake along the Romanche transform in the equatorial Atlantic (Abercrombie and Ekström, 2001). We also analyze the aftershock distribution using multiple-hypocenter relocation techniques and moment-tensor analysis using intermediate-period surface waves. Only 15 aftershocks (M > 4.5) are listed in the USGS catalog, which is typical of large oceanic earthquakes (e.g., Boettcher and Jordan, 2001). Moment tensors obtained from five of the aftershocks show a diversity of focal mechanisms. We interpret a cluster of aftershocks located at ˜1° S as representing extension which results from the stress field of the mainshock at the end of the rupture. This interpretation is consistent with the 200-km rupture length inferred from body waves. The focal

  15. The 2010 Haiti earthquake response.

    Science.gov (United States)

    Raviola, Giuseppe; Severe, Jennifer; Therosme, Tatiana; Oswald, Cate; Belkin, Gary; Eustache, Eddy

    2013-09-01

    This article presents an overview of the mental health response to the 2010 Haiti earthquake. Discussion includes consideration of complexities that relate to emergency response, mental health and psychosocial response in disasters, long-term planning of systems of care, and the development of safe, effective, and culturally sound mental health services in the Haitian context. This information will be of value to mental health professionals and policy specialists interested in mental health in Haiti, and in the delivery of mental health services in particularly resource-limited contexts in the setting of disasters. Copyright © 2013 Elsevier Inc. All rights reserved.

  16. Estimating the macroseismic parameters of earthquakes in eastern Iran

    Science.gov (United States)

    Amini, H.; Gasperini, P.; Zare, M.; Vannucci, G.

    2017-10-01

    Macroseismic intensity values allow assessing the macroseismic parameters of earthquakes such as location, magnitude, and fault orientation. This information is particularly useful for historical earthquakes whose parameters were estimated with low accuracy. Eastern Iran (56°-62°E, 29.5°-35.5°N), which is characterized by several active faults, was selected for this study. Among all earthquakes that occurred in this region, only 29 have some macroseismic information. Their intensity values were reported in various intensity scales. After collecting the descriptions, their intensity values were re-estimated in a uniform intensity scale. Thereafter, the Boxer method was applied to estimate their corresponding macroseismic parameters. Boxer estimates of macroseismic parameters for instrumental earthquakes (after 1964) were found to be consistent with those published by the Global Centroid Moment Tensor Catalog (GCMT). Therefore, this method was applied to estimate the location, magnitude, source dimension, and orientation of the earthquakes with macroseismic descriptions in the period 1066-2012. Macroseismic parameters seem to be more reliable than instrumental ones not only for historical earthquakes but also for instrumental earthquakes, especially those that occurred before 1960. Therefore, as the final result of this study, we propose using the macroseismically determined parameters in preparing a catalog of earthquakes before 1960.
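
The location step of an intensity-based method can be illustrated, very crudely and not as the actual Boxer algorithm, by an intensity-weighted centroid of the observation points; all coordinates and intensity values below are invented:

```python
# Macroseismic epicentre as an intensity-weighted centroid of observation points,
# a crude stand-in for the location step of an intensity-based method (the real
# Boxer algorithm is more elaborate). All coordinates and intensities are invented.
observations = [  # (latitude, longitude, assigned intensity)
    (32.0, 59.0, 7), (32.2, 59.1, 8), (31.9, 59.3, 6), (32.4, 58.8, 5),
]
total = sum(i for _, _, i in observations)
lat = sum(la * i for la, _, i in observations) / total
lon = sum(lo * i for _, lo, i in observations) / total
print(round(lat, 3), round(lon, 3))
```

Weighting by intensity pulls the estimated epicentre toward the most strongly shaken localities, which is the basic intuition behind macroseismic location.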

  17. Developing and Testing the Automated Post-Event Earthquake Loss Estimation and Visualisation (APE-ELEV) Technique

    OpenAIRE

    Astoul, Anthony; Filliter, Christopher; Mason, Eric; Rau-Chaplin, Andrew; Shridhar, Kunal; Varghese, Blesson; Varshney, Naman

    2013-01-01

    An automated, real-time, globally applicable earthquake loss model and visualiser relying on multiple sensor data sources is desirable for post-event earthquake analysis. To achieve this there is a need to support rapid data ingestion, loss estimation, integration of data from multiple data sources, and rapid visualisation at multiple geographic levels. In this paper, the design and development of the Automated Post-Event Earthquake Loss Estimation and Visualisation (APE-ELEV) system for re...

  18. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    As it is the 200th anniversary of Darwin's birth, 2009 has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked, volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the earth's evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of the latest publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  19. Great-earthquake paleogeodesy and tsunamis of the past 2000 years at Alsea Bay, central Oregon coast, USA

    Science.gov (United States)

    Nelson, A.R.; Sawai, Y.; Jennings, A.E.; Bradley, L.-A.; Gerson, L.; Sherrod, B.L.; Sabean, J.; Horton, B.P.

    2008-01-01

    The width of plate-boundary fault rupture at the Cascadia subduction zone, a dimension related to earthquake magnitude, remains uncertain because of the lack of quantitative information about land-level movements during past great-earthquake deformation cycles. Beneath a marsh at Alsea Bay, on the central Oregon coast, four sheets of tsunami-deposited sand blanket contacts between tidal mud and peat. Radiocarbon ages for the sheets match ages for similar evidence of regional coseismic subsidence and tsunamis during four of Cascadia's great earthquakes. Barring rapid, unrecorded postseismic uplift, reconstruction of changes in land level from core samples using diatom and foraminiferal transfer functions includes modest coseismic subsidence (0.4 ± 0.2 m) during the four earthquakes. Interpretation is complicated, however, by the 30-38% of potentially unreliable transfer function values from samples with poor analogs in modern diatom and foraminiferal assemblages. Reconstructions of coseismic subsidence using good-analog samples range from 0.46 ± 0.12 to 0.09 ± 0.20 m showing greater variability than implied by sample-specific errors. From apparent high rates of land uplift following subsidence and tsunamis, we infer that postseismic rebound caused by slip on deep parts of the plate boundary and (or) viscoelastic stress relaxation in the upper plate may be almost as large as coseismic subsidence. Modest coseismic subsidence 100 km landward of the deformation front implies that plate-boundary ruptures in central Oregon were largely offshore. Ruptures may have been long and narrow during earthquakes near magnitude 9, as suggested for the AD 1700 earthquake, or of smaller and more variable dimensions and magnitudes. © 2008 Elsevier Ltd. All rights reserved.

  20. An Ising model for earthquake dynamics

    Directory of Open Access Journals (Sweden)

    A. Jiménez

    2007-01-01

    This paper focuses on extracting the information contained in seismic space-time patterns and their dynamics. The Greek catalog recorded from 1901 to 1999 is analyzed. An Ising Cellular Automata representation technique is developed to reconstruct the history of these patterns. We find that there is strong correlation in the region, and that small earthquakes are very important to the stress transfers. Finally, it is demonstrated that this approach is useful for seismic hazard assessment and intermediate-range earthquake forecasting.
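
A minimal Ising-style lattice model of the kind this abstract builds on can be sketched as follows; the lattice size, coupling J, temperature T, and the Metropolis update rule are generic textbook choices, not the representation technique developed in the paper:

```python
import math
import random

random.seed(0)
N, J, T = 16, 1.0, 2.0  # lattice size, coupling, temperature (illustrative values)
grid = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(N)]

def local_energy(i, j):
    """Interaction energy of cell (i, j) with its four neighbours (periodic boundaries)."""
    s = grid[i][j]
    nb = (grid[(i - 1) % N][j] + grid[(i + 1) % N][j] +
          grid[i][(j - 1) % N] + grid[i][(j + 1) % N])
    return -J * s * nb

def metropolis_sweep():
    """One Monte Carlo sweep: attempt a spin flip at every cell."""
    for i in range(N):
        for j in range(N):
            delta_e = -2.0 * local_energy(i, j)  # energy change if the spin flips
            if delta_e <= 0 or random.random() < math.exp(-delta_e / T):
                grid[i][j] *= -1

for _ in range(10):
    metropolis_sweep()
magnetisation = sum(sum(row) for row in grid) / (N * N)
print(magnetisation)
```

In a seismicity mapping of such a model, cells would encode the activity state of grid regions of the catalog rather than magnetic spins, with neighbour coupling standing in for stress transfer.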

  1. GEM - The Global Earthquake Model

    Science.gov (United States)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments on the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public-at-large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  2. A P-wave based, on-site method for Earthquake Early Warning

    Science.gov (United States)

    Zollo, Aldo

    2016-04-01

    Can we rapidly predict the potential damage of earthquakes by-passing the estimation of their location and magnitude? One possible approach is to predict the expected peak ground shaking at the site and the earthquake magnitude from the initial P-peak amplitude and characteristic period, respectively. The idea, first developed by Wu and Kanamori (2005), is to combine the two parameters for declaring the alert once the real-time measured quantities have passed pre-defined thresholds. Our proposed on-site early warning method generalized this approach, based on the analysis of strong motion data from modern accelerograph networks in Japan, Taiwan and Italy (Zollo et al., 2010). It is based on the real-time measurement of the period (τc) and peak displacement (Pd) parameters at one or more co-located stations at a given target site to be protected against the earthquake effects. By converting these real-time proxies into predicted values of Peak Ground Velocity (PGV) or instrumental intensity (IMM) and magnitude, an alert level is issued at the recording site based on a decisional table with four entries defined upon threshold values of the parameters Pd and τc. The latter are set according to the error bounds estimated on the derived prediction equations. A near-source network of stations running the on-site method can provide the event location and transmit the information about the alert levels recorded at near-source stations to more distant sites, before the arrival of the most destructive phase. The network-based approach allows for the rapid and robust estimation of the Potential Damage Zone (PDZ), that is the area where most of the earthquake damage is expected (Colombelli et al., 2012). A new strategy for a P-wave based, on-site earthquake early warning system has been developed and tested on Japanese strong motion data, and is being tested on Italian data. The key elements are the real-time, continuous measurement of three peak amplitude parameters and their
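
The four-entry decisional table combining Pd and τc can be sketched as follows; the threshold values and the meaning attached to each alert level are placeholders, not the calibrated bounds derived from the cited prediction equations:

```python
# Illustrative thresholds (placeholders, not the calibrated values from the cited work):
PD_THRESHOLD_CM = 0.2   # peak initial P-wave displacement Pd, cm
TC_THRESHOLD_S = 1.0    # characteristic period tau_c, s

def alert_level(pd_cm: float, tau_c_s: float) -> int:
    """Four-entry decisional table: both proxies above threshold -> highest alert."""
    pd_high = pd_cm >= PD_THRESHOLD_CM    # proxy for strong local shaking (PGV)
    tc_high = tau_c_s >= TC_THRESHOLD_S   # proxy for large magnitude
    if pd_high and tc_high:
        return 3  # strong shaking expected from a large event
    if pd_high:
        return 2  # strong local shaking, likely a small nearby event
    if tc_high:
        return 1  # large but probably distant event
    return 0      # no alert

print(alert_level(0.5, 1.8))  # 3
```

The appeal of the table form is that it can be evaluated at a single station within seconds of the P arrival, before any network location or magnitude exists.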

  3. Historical earthquakes in Mexico. Past efforts and new multidisciplinary achievements

    Directory of Open Access Journals (Sweden)

    V. García Acosta

    2004-06-01

    The 1985 Mexican earthquakes demonstrated that knowledge concerning their history was still scarce and precarious. In fact, those earthquakes acted as triggers, because it was then that a new field of research began to develop: disaster historical research. An initial task was to retrieve the history of earthquakes in Mexico in order to produce an exhaustive inventory. The main result was a paradigmatic catalogue, published some years ago as the book Los sismos en la historia de México (Earthquakes in Mexican History). It contains information about every event along 450 years of Mexican seismological history. This paper will focus on the background of this seismological compilation and its characteristics, addressing mainly methodological items concerning sources, qualitative and/or quantitative data, the importance of joint and multidisciplinary efforts, and the research they have inspired on historical earthquake investigation in Mexico.

  4. Quantification of social contributions to earthquake mortality

    Science.gov (United States)

    Main, I. G.; NicBhloscaidh, M.; McCloskey, J.; Pelling, M.; Naylor, M.

    2013-12-01

    Death tolls in earthquakes, which continue to grow rapidly, are the result of complex interactions between physical effects, such as strong shaking, and the resilience of exposed populations and supporting critical infrastructures and institutions. While it is clear that the social context in which the earthquake occurs has a strong effect on the outcome, the influence of this context can only be exposed if we first decouple, as much as we can, the physical causes of mortality from our consideration. (Our modelling assumes that building resilience to shaking is a social factor governed by national wealth, legislation and enforcement and governance leading to reduced levels of corruption.) Here we attempt to remove these causes by statistically modelling published mortality, shaking intensity and population exposure data; unexplained variance from this physical model illuminates the contribution of socio-economic factors to increasing earthquake mortality. We find that this variance partitions countries in terms of basic socio-economic measures and allows the definition of a national vulnerability index identifying both anomalously resilient and anomalously vulnerable countries. In many cases resilience is well correlated with GDP; people in the richest countries are unsurprisingly safe from even the worst shaking. However some low-GDP countries rival even the richest in resilience, showing that relatively low cost interventions can have a positive impact on earthquake resilience and that social learning between these countries might facilitate resilience building in the absence of expensive engineering interventions.
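
The decoupling idea in this abstract (fit a physical mortality model, then read socio-economic vulnerability from the residuals) can be sketched with an ordinary least-squares toy; all data values below are invented and the single-predictor form is a simplification of the published model:

```python
# Toy data: country -> (log10 fatalities, log10 population exposed to strong shaking).
# All values are invented for illustration.
data = {"A": (3.2, 6.0), "B": (1.1, 6.1), "C": (2.0, 5.0), "D": (3.9, 6.4)}

# Fit log(deaths) = a + b * log(exposure) by ordinary least squares.
xs = [x for _, x in data.values()]
ys = [y for y, _ in data.values()]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar

# Residual = observed minus physically predicted mortality:
# positive -> anomalously vulnerable, negative -> anomalously resilient.
vulnerability = {c: y - (a + b * x) for c, (y, x) in data.items()}
print({c: round(r, 2) for c, r in vulnerability.items()})
```

The sign and size of each residual is the vulnerability index: whatever mortality the physical predictors cannot explain is attributed to the social context.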

  5. a Collaborative Cyberinfrastructure for Earthquake Seismology

    Science.gov (United States)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Lefebvre, S.; Steed, R.

    2013-12-01

    One of the challenges in real time seismology is the prediction of an earthquake's impact. It is particularly true for moderate earthquakes (around magnitude 6) located close to urbanised areas, where the slightest uncertainty in event location, depth, magnitude estimates, and/or misevaluation of propagation characteristics, site effects and building vulnerability can dramatically change the impact scenario. The Euro-Med Seismological Centre (EMSC) has developed a cyberinfrastructure to collect observations from eyewitnesses in order to provide in-situ constraints on actual damage. This cyberinfrastructure takes advantage of the natural convergence of earthquake eyewitnesses on the EMSC website (www.emsc-csem.org), the second global earthquake information website, within tens of seconds of the occurrence of a felt event. It includes classical crowdsourcing tools such as online questionnaires available in 39 languages, and tools to collect geolocated pictures. It also comprises information derived from the real-time analysis of the traffic on the EMSC website, a method named flashsourcing. In case of a felt earthquake, eyewitnesses reach the EMSC website within tens of seconds to find out the cause of the shaking they have just been through. By analysing their geographical origin through their IP addresses, we automatically detect felt earthquakes and in some cases map the damaged areas through the loss of Internet visitors. We recently implemented a Quake Catcher Network (QCN) server in collaboration with Stanford University and the USGS, to collect ground motion records performed by volunteers, and are also involved in a project to detect earthquakes from ground motion sensors in smartphones. Strategies have been developed for several social media (Facebook, Twitter...) not only to distribute earthquake information, but also to engage with citizens and optimise data collection. A smartphone application is currently under development. We will present an overview of this

  6. Study on Earthquake Emergency Evacuation Drill Trainer Development

    Science.gov (United States)

    ChangJiang, L.

    2016-12-01

    With the advance of urbanization in China, ensuring that people survive earthquakes calls for scientific, routine emergency evacuation drills. Drawing on cellular automata, shortest-path algorithms and collision avoidance, we designed a model of earthquake emergency evacuation drills for school scenes. Based on this model, we built simulation software for earthquake emergency evacuation drills. The software performs the simulation by building a spatial structural model and selecting people's location information based on the actual conditions of the buildings. Based on the simulation data, we can then run drills in the same building. RFID technology can be used here for drill data collection, reading personal information and sending it to the evacuation simulation software via WiFi. The simulation software then contrasts the simulated data with information from the actual evacuation process, such as evacuation time, evacuation paths, congestion nodes and so on. In the end, it provides a comparative analysis report with the assessment results and an optimal proposal. We hope the earthquake emergency evacuation drill software and trainer can provide an overall process-disposal concept for earthquake emergency evacuation drills in assembly occupancies. The trainer can make earthquake emergency evacuation more orderly, efficient, reasonable and scientific, improving cities' capacity to cope with earthquake hazards.
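
The shortest-path ingredient of the drill model can be sketched with a breadth-first search on a grid floor plan; the layout, exit location and start cell below are invented:

```python
from collections import deque

# 0 = walkable cell, 1 = wall; exit at the top-left corner (illustrative floor plan).
floor = [
    [0, 0, 1, 0],
    [1, 0, 1, 0],
    [0, 0, 0, 0],
]
exit_cell = (0, 0)

def evacuation_steps(start):
    """Breadth-first search: minimum number of grid moves from start to the exit."""
    rows, cols = len(floor), len(floor[0])
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        (r, c), d = queue.popleft()
        if (r, c) == exit_cell:
            return d
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and floor[nr][nc] == 0 and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(((nr, nc), d + 1))
    return None  # no route to the exit

print(evacuation_steps((2, 3)))  # 5
```

A full drill simulator layers cellular-automaton movement and collision avoidance on top of such route computations, but the route cost itself is exactly this kind of graph search.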

  7. Rupture, waves and earthquakes

    Science.gov (United States)

    UENISHI, Koji

    2017-01-01

    Normally, an earthquake is considered as a phenomenon of wave energy radiation by rupture (fracture) of solid Earth. However, the physics of dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and generation of strong waves, has not been fully understood yet. Instead, much of former investigation in seismology evaluated earthquake characteristics in terms of kinematics that does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but “extraordinary” phenomena that are difficult to be explained by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable. PMID:28077808

  8. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered as a phenomenon of wave energy radiation by rupture (fracture) of solid Earth. However, the physics of dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and generation of strong waves, has not been fully understood yet. Instead, much of former investigation in seismology evaluated earthquake characteristics in terms of kinematics that does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to be explained by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable.

  9. Earthquake Damage to Transportation Systems

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Earthquakes represent one of the most destructive natural hazards known to man. A serious result of large-magnitude earthquakes is the disruption of transportation...

  10. Earthquake mechanism and seafloor deformation for tsunami generation

    Science.gov (United States)

    Geist, Eric L.; Oglesby, David D.; Beer, Michael; Kougioumtzoglou, Ioannis A.; Patelli, Edoardo; Siu-Kui Au, Ivan

    2014-01-01

    Tsunamis are generated in the ocean by rapidly displacing the entire water column over a significant area. The potential energy resulting from this disturbance is balanced with the kinetic energy of the waves during propagation. Only a handful of submarine geologic phenomena can generate tsunamis: large-magnitude earthquakes, large landslides, and volcanic processes. Asteroid and subaerial landslide impacts can generate tsunami waves from above the water. Earthquakes are by far the most common generator of tsunamis. Generally, earthquakes greater than magnitude (M) 6.5–7 can generate tsunamis if they occur beneath an ocean and if they result in predominantly vertical displacement. One of the greatest uncertainties in both deterministic and probabilistic hazard assessments of tsunamis is computing seafloor deformation for earthquakes of a given magnitude.
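    The last step mentioned above, estimating seafloor deformation for a given magnitude, can be sketched with standard scaling relations. The sketch below is illustrative only: the rigidity value and the rupture-area regression (a Wells and Coppersmith (1994)-style all-slip-types fit) are assumptions, not figures from this abstract.

```python
def seismic_moment(mw):
    """Seismic moment M0 in N*m from moment magnitude (Hanks-Kanamori)."""
    return 10 ** (1.5 * mw + 9.1)

def average_slip(mw, mu=3.0e10):
    """Rough average fault slip (m) for a given Mw.

    Rupture area comes from an assumed log10(A[km^2]) = -3.49 + 0.91*Mw
    regression; the rigidity mu = 30 GPa is a typical crustal value.
    Average slip D then follows from M0 = mu * A * D.
    """
    area_m2 = 10 ** (-3.49 + 0.91 * mw) * 1e6  # km^2 -> m^2
    return seismic_moment(mw) / (mu * area_m2)

# A magnitude-7 event yields average slip on the order of a couple of
# metres; the fraction of that slip expressed as vertical seafloor
# motion is what displaces the water column.
print(round(average_slip(7.0), 2))
```

    The same moment budget explains why predominantly vertical displacement matters: only the vertical component of the slip lifts the water column.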

  11. The SCEC-USGS Dynamic Earthquake Rupture Code Comparison Exercise - Simulations of Large Earthquakes and Strong Ground Motions

    Science.gov (United States)

    Harris, R.

    2015-12-01

    I summarize progress by the Southern California Earthquake Center (SCEC) and U.S. Geological Survey (USGS) Dynamic Rupture Code Comparison Group, which examines whether the results produced by multiple researchers' earthquake simulation codes agree with each other when computing benchmark scenarios of dynamically propagating earthquake ruptures. These types of computer simulations have no analytical solutions against which to compare, so we use qualitative and quantitative inter-code comparisons to check that they are operating satisfactorily. To date we have tested the codes against benchmark exercises that incorporate a range of features, including single and multiple planar faults, single rough faults, slip-weakening, rate-state and thermal-pressurization friction, elastic and visco-plastic off-fault behavior, complete stress drops that lead to extreme ground motion, heterogeneous initial stresses, and heterogeneous material (rock) structure. Our goal is reproducibility, and we focus on the types of earthquake-simulation assumptions that have been or will be used in basic studies of earthquake physics, or in direct applications to specific earthquake hazard problems. Our group's goal is to make sure that when our codes simulate these types of earthquake scenarios, along with the resulting strong ground shaking, they operate as expected. For more introductory information about our group and our work, please see our group's overview papers, Harris et al., Seismological Research Letters, 2009, and Harris et al., Seismological Research Letters, 2011, along with our website, scecdata.usc.edu/cvws.

  12. Organizational changes at Earthquakes & Volcanoes

    Science.gov (United States)

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  13. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of scientific content by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what principles underlie that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" earthquakes. We try to implement these results in a choreographic model with the aim of converting earthquake sound into a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience in a multisensory exploration of the earthquake phenomenon, through the stimulation of hearing, eyesight and perception of movement (the neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  14. Earthquake history of Virginia

    Science.gov (United States)

    von Hake, C. A.

    1977-01-01

    Virginia is a State of considerable seismic activity, although the earthquakes are rarely strong. Thirty-five shocks of intensity MM V or greater (Modified Mercalli scale) are listed with epicenters within its borders. The locations of several of the older events are not precise; thus, the above count is subject to alteration. A detailed study of Virginia earthquakes by G. A. Bollinger and M. G. Hopper of the Virginia Polytechnic Institute and State University listed 137 shocks (71 from 1774 to 1899, 66 from 1900 to 1970). Many of these were felt with intensities below MM V.

  15. Deterministic and Nondeterministic Behavior of Earthquakes and Hazard Mitigation Strategy

    Science.gov (United States)

    Kanamori, H.

    2014-12-01

    Earthquakes exhibit both deterministic and nondeterministic behavior. Deterministic behavior is controlled by length and time scales, such as the dimension of seismogenic zones and plate-motion speed. Nondeterministic behavior is controlled by the interaction of many elements, such as asperities, in the system. Some subduction zones have strong deterministic elements that allow forecasts of future seismicity. For example, the forecasts of the 2010 Mw=8.8 Maule, Chile, earthquake and the 2012 Mw=7.6 Costa Rica earthquake are good examples in which useful forecasts were made within a solid scientific framework using GPS. However, even in these cases, the uncertainties are difficult to quantify because of the nondeterministic elements. In some subduction zones, nondeterministic behavior dominates because of complex plate-boundary structures and defies useful forecasts. The 2011 Mw=9.0 Tohoku-Oki earthquake may be an example in which the physical framework was reasonably well understood, but complex interactions of asperities and insufficient knowledge of the subduction-zone structures led to the unexpected tragic consequence. Despite these difficulties, broadband seismology, GPS, and rapid data-processing and telemetry technology can contribute to effective hazard mitigation through a scenario-earthquake approach and real-time warning. A scale-independent relation between M0 (seismic moment) and the source duration, t, can be used for the design of average scenario earthquakes. However, outliers caused by variations in stress drop, radiation efficiency, and aspect ratio of the rupture plane are often the most hazardous and need to be included in scenario earthquakes. Recent developments in real-time technology will help seismologists cope with, and prepare for, devastating tsunamis and earthquakes. Combining a better understanding of earthquake diversity with modern technology is the key to effective and comprehensive hazard mitigation practices.
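    The scale-independent moment-duration relation mentioned above can be made concrete. In the sketch below the constant c ~ 1e16 N*m/s^3 is an assumed representative value (the abstract quotes no number), chosen so that a magnitude-9 event lasts a few minutes:

```python
def source_duration(mw, c=1.0e16):
    """Approximate source duration t (s) from the scale-independent
    relation M0 / t**3 ~ c, i.e. t = (M0 / c)**(1/3);
    c is an assumed constant, not a fitted value."""
    m0 = 10 ** (1.5 * mw + 9.1)  # seismic moment, N*m
    return (m0 / c) ** (1.0 / 3.0)

# Duration grows as the cube root of moment: roughly 16 s at Mw 7
# versus roughly 160 s at Mw 9.
print(round(source_duration(7.0)), round(source_duration(9.0)))
```

    Outliers such as slow tsunami earthquakes fall off this trend, which is why the abstract stresses including them in scenario design.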

  16. Implications of fault constitutive properties for earthquake prediction

    Science.gov (United States)

    Dieterich, J.H.; Kilgore, B.

    1996-01-01

    The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena, including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip-rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, the scaling of Dc is presently an open question, and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from the sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to the aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. In the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes, including the eventual mainshock. In the second model, accelerating fault slip on the mainshock nucleation zone triggers the foreshocks.

  17. The 2012 Mw5.6 earthquake in Sofia seismogenic zone - is it a slow earthquake

    Science.gov (United States)

    Raykova, Plamena; Solakov, Dimcho; Slavcheva, Krasimira; Simeonova, Stela; Aleksandrova, Irena

    2017-04-01

    very low rupture velocity. The low rupture velocity may indicate slow faulting, which leads to a slow release of the accumulated seismic energy; such slowly released energy generally causes only little to moderate damage. Additionally, the waveform of the earthquake shows a low-frequency content of the P-waves (the P-wave maximum is at 1.19 Hz), and the P-wave displacement spectrum is characterised by a poorly expressed spectral plateau and corner frequency. These and other signs lead us to the conclusion that the 2012 Mw5.6 earthquake can be considered a type of slow earthquake, akin to a low-frequency quake. The study is based on data from the Bulgarian seismological network (NOTSSI), the local network (LSN) deployed around the Kozloduy NPP, and the System of Accelerographs for Seismic Monitoring of Equipment and Structures (SASMES) installed in the Kozloduy NPP. NOTSSI, jointly with LSN and SASMES, provides reliable information for multiple studies of seismicity at the regional scale.

  18. The Quake-Catcher Network: A Community-Led, Strong-Motion Network with Implications for Earthquake Advanced Alert

    Science.gov (United States)

    Cochran, E. S.; Lawrence, J. F.; Christensen, C. M.; Jakka, R. S.; Chung, A. I.

    2009-12-01

    The goal of the Quake-Catcher Network (QCN) is to dramatically increase the number of strong-motion observations by exploiting recent advances in sensing technologies and cyberinfrastructure. Micro-Electro-Mechanical Systems (MEMS) triaxial accelerometers are very low cost (~$50-100), interface to any desktop computer via USB cable, and provide high-quality acceleration data. Preliminary shake-table tests show the MEMS accelerometers can record high-fidelity seismic data and provide linear phase and amplitude response over a wide frequency range. Volunteer computing provides a mechanism to expand strong-motion seismology with minimal infrastructure costs, while promoting community participation in science. Volunteer computing also allows for rapid transfer of metadata, such as that used to rapidly determine the magnitude and location of an earthquake, from participating stations. QCN began distributing sensors and software to K-12 schools and the general public in April 2008 and has grown to roughly 1000 stations. Initial analysis shows metadata are received within 1-14 seconds of the observation of a trigger; the larger data latencies are correlated with greater server-station distances. Currently, we are testing a series of triggering algorithms to maximize the number of earthquakes captured while minimizing false triggers. We are also testing algorithms to automatically detect P- and S-wave arrivals in real time. Trigger times, wave amplitude, and station information are currently uploaded to the server for each trigger. Future work will identify additional metadata useful for quickly determining earthquake location and magnitude. The increased strong-motion observations made possible by QCN will greatly augment the capability of seismic networks to quickly estimate the location and magnitude of an earthquake for advanced alert to the public. In addition, the dense waveform observations will provide improved source imaging of a rupture in near-real time.
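    Triggering algorithms of the kind described above are typically variations on a short-term-average/long-term-average (STA/LTA) detector. A minimal sketch follows; the window lengths and threshold are illustrative assumptions, not QCN's actual parameters.

```python
def sta_lta_trigger(samples, sta_n=5, lta_n=50, threshold=4.0):
    """Return sample indices where the ratio of the short-term to the
    long-term average of absolute amplitude exceeds the threshold."""
    triggers = []
    for i in range(lta_n, len(samples)):
        sta = sum(abs(x) for x in samples[i - sta_n:i]) / sta_n
        lta = sum(abs(x) for x in samples[i - lta_n:i]) / lta_n
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers

quiet = [0.1, -0.1] * 50    # 100 samples of background noise
shaking = [5.0, -4.0] * 10  # sudden strong motion
onsets = sta_lta_trigger(quiet + shaking)
print(onsets[0])  # first trigger lands just after sample 100, the onset of shaking
```

    Tuning sta_n, lta_n and the threshold trades missed events against false triggers, which is exactly the optimization the abstract describes.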

  19. Landslides and Earthquake Lakes from the Wenchuan, China Earthquake - Can it Happen in the U.S.?

    Science.gov (United States)

    Stenner, H.; Cydzik, K.; Hamilton, D.; Cattarossi, A.; Mathieson, E.

    2008-12-01

    The May 12, 2008 M7.9 Wenchuan, China earthquake destroyed five million homes and schools, causing over 87,650 deaths. Landslides, a secondary effect of the shaking, caused much of the devastation. Debris flows buried homes, rock falls crushed cars, and landslides dammed rivers. Blocked roads greatly impeded emergency access, delaying response. Our August 2008 field experience in the affected area reminded us that the western United States faces serious risks posed by earthquake-induced landslides. The topography of the western U.S. is less extreme than that near Wenchuan, but earthquakes may still cause devastating landslides, damming rivers and blocking access to affected areas. After the Wenchuan earthquake, lakes rapidly rose behind landslide dams, threatening millions of lives. One landslide above Beichuan City created Tangjiashan Lake, a massive body of water upstream of Mianyang, an area with 5.2 million people, 30,000 of whom were killed in the quake. Potential failure of the landslide dam put thousands more people at risk from catastrophic flooding. In 1959, the M7.4 Hebgen Lake earthquake in Montana caused a large landslide, which killed 19 people and dammed the Madison River. The Army Corps excavated sluices to keep the dam from failing catastrophically. The Hebgen Lake earthquake ultimately caused 28 deaths, mostly from landslides, but the affected region was sparsely populated. Slopes prone to strong earthquake shaking and landslides in California, Washington, and Oregon have much larger populations at risk. Landslide hazards continue after the earthquake due to the effect strong shaking has on hillslopes, particularly when subjected to subsequent rain. These hazards must be taken into account. Once a landslide blocks a river, rapid and thoughtful action is needed. The Chinese government quickly and safely mitigated landslide dams that posed the greatest risk to people downstream. 
It took expert geotechnical advice, the speed and resources of the army

  20. Application of Astronomic Time-latitude Residuals in Earthquake Prediction

    Science.gov (United States)

    Yanben, Han; Lihua, Ma; Hui, Hu; Rui, Wang; Youjin, Su

    2007-04-01

    After the earthquake (Ms = 6.1) that occurred in Luquan county of Yunnan province on April 18, 1985, the relationship between major earthquakes and astronomical time-latitude residuals (ATLR) of a photoelectric astrolabe at Yunnan Observatory was analyzed. ATLR are what remains after deducting the effects of the Earth's whole motion from observations of time and latitude. It was found that anomalies of the ATLR appeared before earthquakes in and around Yunnan, a seismically active region. The anomalies possibly arise from changes of the plumb line due to the motion of ground mass before earthquakes. Afterwards, using studies of the anomalous characteristics and regularities of ATLR, we tried to provide warning information prior to the occurrence of a few major earthquakes in the region. Significant synchronous anomalies of the observatory's ATLR appeared before the magnitude-6.2 earthquake in Dayao county of Yunnan province on July 21, 2003. It has again been verified that the anomalies may provide prediction information for strong earthquakes around the observatory.

  1. The Global Earthquake Model - Past, Present, Future

    Science.gov (United States)

    Smolka, Anselm; Schneider, John; Stein, Ross

    2014-05-01

    The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange. Sharing data, risk information, best practices and approaches across the globe is key to assessing risk more effectively. Through consortium-driven global projects, open-source IT development and collaborations with more than 10 regions, leading experts are developing unique global datasets, best practices, open tools and models for seismic hazard and risk assessment. The year 2013 saw the completion of ten global datasets or components addressing various aspects of earthquake hazard and risk, as well as two GEM-related but independently managed regional projects, SHARE and EMME. Notably, the International Seismological Centre (ISC) led the development of a new ISC-GEM global instrumental earthquake catalogue, which was made publicly available in early 2013. It has set a new standard for global earthquake catalogues and has found widespread acceptance and application in the global earthquake community. By the end of 2014, GEM's OpenQuake computational platform will provide the OpenQuake hazard/risk assessment software and integrate all GEM data and information products. The public release of OpenQuake is planned for the end of 2014, and will comprise the following datasets and models: • ISC-GEM Instrumental Earthquake Catalogue (released January 2013) • Global Earthquake History Catalogue [1000-1903] • Global Geodetic Strain Rate Database and Model • Global Active Fault Database • Tectonic Regionalisation Model • Global Exposure Database • Buildings and Population Database • Earthquake Consequences Database • Physical Vulnerabilities Database • Socio-Economic Vulnerability and Resilience Indicators • Seismic

  2. Earthquake simulations with time-dependent nucleation and long-range interactions

    Directory of Open Access Journals (Sweden)

    J. H. Dieterich

    1995-01-01

    Full Text Available A model for rapid simulation of earthquake sequences is introduced which incorporates long-range elastic interactions among fault elements and time-dependent earthquake nucleation inferred from experimentally derived rate- and state-dependent fault constitutive properties. The model consists of a planar two-dimensional fault surface which is periodic in both the x- and y-directions. Elastic interactions among fault elements are represented by an array of elastic dislocations. Approximate solutions for earthquake nucleation and dynamics of earthquake slip are introduced which permit computations to proceed in steps that are determined by the transitions from one sliding state to the next. The transition-driven time stepping and avoidance of systems of simultaneous equations permit rapid simulation of large sequences of earthquake events on computers of modest capacity, while preserving characteristics of the nucleation and rupture propagation processes evident in more detailed models. Earthquakes simulated with this model reproduce many of the observed spatial and temporal characteristics of clustering phenomena including foreshock and aftershock sequences. Clustering arises because the time dependence of the nucleation process is highly sensitive to stress perturbations caused by nearby earthquakes. Rate of earthquake activity following a prior earthquake decays according to Omori's aftershock decay law and falls off with distance.
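    The Omori decay that the simulation reproduces has a simple closed form. The sketch below evaluates the modified Omori law; K, c and p are illustrative placeholders, since real values are fitted per aftershock sequence.

```python
def omori_rate(t, k=100.0, c=0.1, p=1.0):
    """Aftershock rate (events/day) t days after a mainshock,
    modified Omori law: n(t) = K / (c + t)**p."""
    return k / (c + t) ** p

# With p = 1 the rate falls roughly tenfold between day 1 and day 10.
print(round(omori_rate(1.0), 1), round(omori_rate(10.0), 1))
```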

  3. Did you feel it? Community-made earthquake shaking maps

    Science.gov (United States)

    Wald, D.J.; Wald, L.A.; Dewey, J.W.; Quitoriano, Vince; Adams, Elisabeth

    2001-01-01

    Since the early 1990's, the magnitude and location of an earthquake have been available within minutes on the Internet. Now, as a result of work by the U.S. Geological Survey (USGS) and with the cooperation of various regional seismic networks, people who experience an earthquake can go online and share information about its effects to help create a map of shaking intensities and damage. Such 'Community Internet Intensity Maps' (CIIM's) contribute greatly in quickly assessing the scope of an earthquake emergency, even in areas lacking seismic instruments.
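    At its core, a Community Internet Intensity Map aggregates felt reports by area. The sketch below averages reported intensities per ZIP code; it is a simplification for illustration, not the USGS aggregation algorithm.

```python
from collections import defaultdict

def community_intensity(reports):
    """reports: iterable of (zip_code, intensity) pairs.
    Returns the mean reported intensity per ZIP code."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for zip_code, intensity in reports:
        sums[zip_code] += intensity
        counts[zip_code] += 1
    return {z: sums[z] / counts[z] for z in sums}

# Three hypothetical felt reports from two ZIP codes.
felt = [("94107", 5.0), ("94107", 6.0), ("90210", 3.0)]
print(community_intensity(felt))  # {'94107': 5.5, '90210': 3.0}
```

    Mapping these per-region averages is what turns scattered online responses into a shaking-intensity map within minutes of an event.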

  4. Keeping focus on earthquakes at school for seismic risk mitigation of the next generations

    Science.gov (United States)

    Saraò, Angela; Barnaba, Carla; Peruzza, Laura

    2013-04-01

    Knowledge of the seismic history of one's own territory, understanding of the physical phenomena in response to an earthquake, the changes to cultural heritage following a strong earthquake, and the learning of actions to be taken during and after an earthquake are pieces of information that help keep focus on seismic hazard and support strategies for seismic risk mitigation. The training of new generations, today more than ever prone to rapid forgetting of past events, therefore becomes a key element in increasing the perception that earthquakes have happened and can happen at any time, and that mitigation actions are the only means to ensure safety and to reduce damage and human losses. For several years our institute (OGS) has been involved in activities to raise awareness of earthquake education. We aim to implement education programs with the goal of fostering a critical approach to seismic hazard reduction, differentiating the types of activities according to the age of the students. However, since this kind of activity is unfunded, we can at present act on only a very limited number of schools per year. To be effective, the inclusion of seismic-risk issues in school curricula requires specific time and appropriate approaches when planning activities. For this reason, we also involve the teachers as proponents of activities, and we encourage them to keep memories and discussion of earthquakes alive in their classes. During the past years we have acted mainly in schools of the Friuli Venezia Giulia area (NE Italy), an earthquake-prone area struck in 1976 by a destructive seismic event (Ms=6.5). We organized short training courses for teachers, lectured classes, and led laboratory activities with students.
    Indeed, since it is well known that students enjoy classes more when visual and active learning are combined, we propose a program composed of seminars, demonstrations and hands-on activities in the classrooms; for high school students

  5. Earthquake Safety Tips in the Classroom

    Science.gov (United States)

    Melo, M. O.; Maciel, B. A. P. C.; Neto, R. P.; Hartmann, R. P.; Marques, G.; Gonçalves, M.; Rocha, F. L.; Silveira, G. M.

    2014-12-01

    The catastrophes induced by earthquakes are among the most devastating ones, causing an elevated number of human losses and economic damages. But we have to keep in mind that earthquakes don't kill people; buildings do. Earthquakes can't be predicted, and the only way of dealing with their effects is to teach society how to be prepared for them and how to deal with their consequences. In spite of being exposed to moderate and large earthquakes, most of the Portuguese are little aware of seismic risk, mainly due to the long recurrence intervals between strong events. The acquisition of safe and correct attitudes before, during and after an earthquake is relevant for human security. Children play a decisive role in establishing a real and long-lasting "culture of prevention", both through action and new attitudes. On the other hand, when children assume correct behaviors, their relatives often change their incorrect behaviors to mimic the correct behaviors of their kids. In the framework of a Parents-in-Science initiative, we started with bi-monthly sessions for children aged 5-6 years old and 9-10 years old. These sessions, in which parents, teachers and high-school students participate, became part of the school's permanent activities. We start with a short introduction to the Earth and to earthquakes, using storytelling and simple science activities to trigger children's curiosity. For safety purposes, we focus on how crucial it is for children to know basic information about themselves and to define, with their families, an emergency communications plan in case family members are separated. Using a shaking table, we teach them how to protect themselves during an earthquake. We then finish with the preparation of an individual emergency kit. This presentation will highlight the importance of encouraging preventive actions in order to reduce the impact of earthquakes on society. This project is developed by science high-school students and teachers, in

  6. Expanding Horizons in Mitigating Earthquake Related Disasters in Urban Areas: Global Development of Real-Time Seismology

    OpenAIRE

    Utkucu, Murat; Küyük, Hüseyin Serdar; Demir, İsmail Hakkı

    2016-01-01

    Abstract Real-time seismology is a newly developing alternative approach in seismology to mitigate earthquake hazard. It exploits up-to-date advances in seismic instrument technology, data acquisition, digital communications and computer systems to quickly transform data into earthquake information in real time, reducing earthquake losses and their impact on social and economic life in earthquake-prone, densely populated urban and industrial areas. Real-time seismology systems are not o...

  7. Rapid survey protocol that provides dynamic information on reef condition to managers of the Great Barrier Reef.

    Science.gov (United States)

    Beeden, R J; Turner, M A; Dryden, J; Merida, F; Goudkamp, K; Malone, C; Marshall, P A; Birtles, A; Maynard, J A

    2014-12-01

    Managing to support coral reef resilience as the climate changes requires strategic and responsive actions that reduce anthropogenic stress. Managers can only target and tailor these actions if they regularly receive information on system condition and impact severity. In large coral reef areas like the Great Barrier Reef Marine Park (GBRMP), acquiring condition and impact data with good spatial and temporal coverage requires using a large network of observers. Here, we describe the result of ~10 years of evolving and refining participatory monitoring programs used in the GBR that have rangers, tourism operators and members of the public as observers. Participants complete Reef Health and Impact Surveys (RHIS) using a protocol that meets coral reef managers' needs for up-to-date information on the following: benthic community composition, reef condition and impacts including coral diseases, damage, predation and the presence of rubbish. Training programs ensure that the information gathered is sufficiently precise to inform management decisions. Participants regularly report because the demands of the survey methodology have been matched to their time availability. Undertaking the RHIS protocol we describe involves three ~20 min surveys at each site. Participants enter data into an online data management system that can create reports for managers and participants within minutes of data being submitted. Since 2009, 211 participants have completed a total of more than 10,415 surveys at more than 625 different reefs. The two-way exchange of information between managers and participants increases the capacity to manage reefs adaptively, meets education and outreach objectives and can increase stewardship. The general approach used and the survey methodology are both sufficiently adaptable to be used in all reef regions.

  8. Rapid Microsatellite Marker Development Using Next Generation Pyrosequencing to Inform Invasive Burmese Python—Python molurus bivittatus—Management

    Directory of Open Access Journals (Sweden)

    Kristen M. Hart

    2013-02-01

    Full Text Available Invasive species represent an increasing threat to native ecosystems, harming indigenous taxa through predation, habitat modification, cross-species hybridization and alteration of ecosystem processes. Additionally, high economic costs are associated with environmental damage, restoration and control measures. The Burmese python, Python molurus bivittatus, is one of the most notable invasive species in the US, due to the threat it poses to imperiled species and the Greater Everglades ecosystem. To address population structure and relatedness, next generation sequencing was used to rapidly produce species-specific microsatellite loci. The Roche 454 GS-FLX Titanium platform provided 6616 di-, tri- and tetra-nucleotide repeats in 117,516 sequences. Using stringent criteria, 24 of 26 selected tri- and tetra-nucleotide loci were polymerase chain reaction (PCR) amplified and 18 were polymorphic. An additional six cross-species loci were amplified, and the resulting 24 loci were incorporated into eight PCR multiplexes. Multi-locus genotypes yielded an average of 61% (39%–77%) heterozygosity and 3.7 (2–6) alleles per locus. Population-level studies using the developed microsatellites will track the invasion front and monitor population-suppression dynamics. Additionally, cross-species amplification was detected in the invasive Ball python, P. regius, and the Northern African python, P. sebae. These markers can be used to address the hybridization potential of Burmese pythons and the larger, more aggressive P. sebae.

  9. Rapid microsatellite marker development using next generation pyrosequencing to inform invasive Burmese python -- Python molurus bivittatus -- management

    Science.gov (United States)

    Hunter, Margaret E.; Hart, Kristen M.

    2013-01-01

    Invasive species represent an increasing threat to native ecosystems, harming indigenous taxa through predation, habitat modification, cross-species hybridization and alteration of ecosystem processes. Additionally, high economic costs are associated with environmental damage, restoration and control measures. The Burmese python, Python molurus bivittatus, is one of the most notable invasive species in the US, due to the threat it poses to imperiled species and the Greater Everglades ecosystem. To address population structure and relatedness, next generation sequencing was used to rapidly produce species-specific microsatellite loci. The Roche 454 GS-FLX Titanium platform provided 6616 di-, tri- and tetra-nucleotide repeats in 117,516 sequences. Using stringent criteria, 24 of 26 selected tri- and tetra-nucleotide loci were polymerase chain reaction (PCR) amplified and 18 were polymorphic. An additional six cross-species loci were amplified, and the resulting 24 loci were incorporated into eight PCR multiplexes. Multi-locus genotypes yielded an average of 61% (39%–77%) heterozygosity and 3.7 (2–6) alleles per locus. Population-level studies using the developed microsatellites will track the invasion front and monitor population-suppression dynamics. Additionally, cross-species amplification was detected in the invasive Ball python, P. regius, and the Northern African python, P. sebae. These markers can be used to address the hybridization potential of Burmese pythons and the larger, more aggressive P. sebae.
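    Heterozygosity statistics like those quoted in these two records can be estimated from allele frequencies. A minimal sketch of Nei's expected heterozygosity for a single locus follows; the allele counts below are hypothetical, not from the python study.

```python
def expected_heterozygosity(allele_counts):
    """Nei's expected heterozygosity for one locus:
    He = 1 - sum(p_i**2), where p_i are allele frequencies."""
    total = sum(allele_counts)
    return 1.0 - sum((n / total) ** 2 for n in allele_counts)

# A hypothetical locus with four alleles observed 10, 5, 3 and 2 times.
print(round(expected_heterozygosity([10, 5, 3, 2]), 3))  # 0.655
```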

  10. Analysis of rupture area of aftershocks caused by twin earthquakes (Case study: 11 April 2012 earthquakes of Aceh-North Sumatra)

    Energy Technology Data Exchange (ETDEWEB)

    Diansari, Angga Vertika, E-mail: anggav.bmkg@gmail.com; Purwana, Ibnu; Subakti, Hendri [Academy of Meteorology and Geophysics, Jalan Perhubungan I no.5 Tangerang 15221 (Indonesia)

    2015-04-24

    The 11 April 2012 earthquakes off-shore Aceh-North Sumatra are unique events in the history of Indonesian earthquakes: they have similar magnitudes (8.5 Mw and 8.1 Mw), closely spaced epicenters, similar strike-slip focal mechanisms, and both occurred in the outer-rise area. The purposes of this research are: (1) comparing the rupture areas of the earthquakes based on models with those obtained by calculation, (2) fitting the shape and the area of the earthquake rupture zones, (3) analyzing the relationship between rupture area and magnitude of the earthquakes. The rupture areas of the earthquake faults are determined using 4 different formulas, i.e. Utsu and Seki (1954), Wells and Coppersmith (1994), Ellsworth (2003), and Christophersen and Smith (2000). The aftershock parameters are taken from PGN (Pusat Gempabumi Nasional, or National Earthquake Information Center) of BMKG (the Indonesian Agency for Meteorology, Climatology and Geophysics). The aftershock epicenters are plotted with the GMT software, and ellipse and rectangular models of the aftershock distribution are then constructed. The results show that: (1) the rupture areas calculated from the magnitude relationships are larger than those of the aftershock distribution models, (2) the best-fitting model for the aftershock distribution is the rectangular model associated with the Utsu and Seki (1954) formula, (3) the larger the magnitude of the earthquake, the larger the area of the fault.
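The magnitude-to-rupture-area step can be illustrated with one of the four relations the study compares: the Wells and Coppersmith (1994) all-slip-type scaling relation, log10(RA) = -3.49 + 0.91 Mw, with RA in km². A minimal sketch applied to the two mainshock magnitudes:

```python
# Sketch: rupture area from moment magnitude via the Wells & Coppersmith
# (1994) all-slip-type relation, one of the four formulas the study uses.
def rupture_area_km2(mw: float) -> float:
    return 10 ** (-3.49 + 0.91 * mw)

# The two 11 April 2012 mainshocks
area_85 = rupture_area_km2(8.5)  # roughly 1.8e4 km^2
area_81 = rupture_area_km2(8.1)  # roughly 7.6e3 km^2
```

Comparing such magnitude-derived areas against the area of the fitted aftershock ellipse or rectangle is the core of the study's first objective.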

  11. Complementary processing of haptic information by slowly and rapidly adapting neurons in the trigeminothalamic pathway. Electrophysiology, mathematical modeling and simulations of vibrissae-related neurons.

    Directory of Open Access Journals (Sweden)

    Abel eSanchez-Jimenez

    2013-06-01

    Full Text Available Tonic (slowly adapting) and phasic (rapidly adapting) primary afferents convey complementary aspects of haptic information to the central nervous system: object location and texture the former, shape the latter. Tonic and phasic neural responses are also recorded in all relay stations of the somatosensory pathway, yet their role in both information processing and information transmission to the cortex is unknown: we do not know whether tonic and phasic neurons process complementary aspects of haptic information and/or whether these two types constitute two separate channels that convey complementary aspects of tactile information to the cortex. Here we propose to elucidate these two questions in the fast trigeminal pathway of the rat (PrV-VPM: principal trigeminal nucleus-ventroposteromedial thalamic nucleus). We analyze early and global behavior, latencies and stability of the responses of individual cells in PrV and medial lemniscus under 1-40 Hz stimulation of the whiskers in control and decorticated animals, and we use stochastic spiking models and extensive simulations. Our results strongly suggest that in the first relay station of the somatosensory system (PrV): (1) tonic and phasic neurons process complementary aspects of whisker-related tactile information; (2) tonic and phasic responses do not originate from two different types of neurons; (3) the two responses are generated by the differential action of the somatosensory cortex on a unique type of PrV cell; (4) tonic and phasic neurons do not belong to two different channels for the transmission of tactile information to the thalamus; (5) trigeminothalamic transmission is performed exclusively by tonically firing neurons; and (6) all aspects of haptic information are coded into low-pass, band-pass and high-pass filtering profiles of tonically firing neurons. Our results are important both for basic research on neural circuits and information processing, and for the development of sensory neuroprostheses.

  12. On some methods for assessing earthquake predictions

    Science.gov (United States)

    Molchan, G.; Romashkova, L.; Peresan, A.

    2017-09-01

    A regional approach to the problem of assessing earthquake predictions inevitably faces a deficit of data. We point out some basic limits of assessment methods reported in the literature, considering the practical case of the performance of the CN pattern recognition method in the prediction of large Italian earthquakes. Along with classical hypothesis testing, a new game approach, the so-called parimutuel gambling (PG) method, is examined. The PG, originally proposed for the evaluation of probabilistic earthquake forecasts, has recently been adapted for the case of 'alarm-based' CN prediction. The PG approach is a non-standard method; therefore it deserves careful examination and theoretical analysis. We show that the PG alarm-based version leads to an almost complete loss of information about predicted earthquakes (even for a large sample). As a result, any conclusions based on the alarm-based PG approach are not to be trusted. We also show that the original probabilistic PG approach does not necessarily identify the genuine forecast correctly among competing seismicity rate models, even when applied to extensive data.

  13. Mass wasting triggered by the 5 March 1987 Ecuador earthquakes

    Science.gov (United States)

    Schuster, R.L.; Nieto, A.S.; O'Rourke, T. D.; Crespo, E.; Plaza-Nieto, G.

    1996-01-01

    On 5 March 1987, two earthquakes (Ms=6.1 and Ms=6.9) occurred about 25 km north of Reventador Volcano, along the eastern slopes of the Andes Mountains in northeastern Ecuador. Although the shaking damaged structures in towns and villages near the epicentral area, the economic and social losses directly due to earthquake shaking were small compared to the effects of catastrophic earthquake-triggered mass wasting and flooding. About 600 mm of rain fell in the region in the month preceding the earthquakes; thus, the surficial soils had high moisture contents. Slope failures commonly started as thin slides, which rapidly turned into fluid debris avalanches and debris flows. The surficial soils and thick vegetation covering them flowed down the slopes into minor tributaries and then were carried into major rivers. Rock and earth slides, debris avalanches, debris and mud flows, and resulting floods destroyed about 40 km of the Trans-Ecuadorian oil pipeline and the only highway from Quito to Ecuador's northeastern rain forests and oil fields. Estimates of the total volume of earthquake-induced mass wastage ranged from 75 to 110 million m3. Economic losses were about US$ 1 billion. Nearly all of the approximately 1000 deaths from the earthquakes were a consequence of mass wasting and/or flooding.

  14. Posttraumatic stress disorder: a serious post-earthquake complication.

    Science.gov (United States)

    Farooqui, Mudassir; Quadri, Syed A; Suriya, Sajid S; Khan, Muhammad Adnan; Ovais, Muhammad; Sohail, Zohaib; Shoaib, Samra; Tohid, Hassaan; Hassan, Muhammad

    2017-01-01

    Earthquakes are unpredictable and devastating natural disasters. They can cause massive destruction and loss of life, and survivors may suffer psychological symptoms of severe intensity. Our goal in this article is to review studies published in the last 20 years to compile what is known about posttraumatic stress disorder (PTSD) occurring after earthquakes. The review also describes other psychiatric complications that can be associated with earthquakes, to provide readers with better overall understanding, and discusses several sociodemographic factors that can be associated with post-earthquake PTSD. A search for literature was conducted on major databases such as MEDLINE, PubMed, EMBASE, and PsycINFO and in neurology and psychiatry journals, and many other medical journals. Terms used for electronic searches included, but were not limited to, posttraumatic stress disorder (PTSD), posttraumatic symptoms, anxiety, depression, major depressive disorder, earthquake, and natural disaster. The relevant information was then utilized to determine the relationships between earthquakes and posttraumatic stress symptoms. It was found that PTSD is the most commonly occurring mental health condition among earthquake survivors. Major depressive disorder, generalized anxiety disorder, obsessive compulsive disorder, social phobia, and specific phobias were also listed. The PTSD prevalence rate varied widely. It was dependent on multiple risk factors in target populations and also on the interval of time that had elapsed between the exposure to the deadly incident and measurement. Females seemed to be the most widely affected group, while elderly people and young children exhibited considerable psychosocial impact.

  15. Posttraumatic stress disorder: a serious post-earthquake complication

    Directory of Open Access Journals (Sweden)

    Mudassir Farooqui

    Full Text Available Abstract Objectives: Earthquakes are unpredictable and devastating natural disasters. They can cause massive destruction and loss of life, and survivors may suffer psychological symptoms of severe intensity. Our goal in this article is to review studies published in the last 20 years to compile what is known about posttraumatic stress disorder (PTSD) occurring after earthquakes. The review also describes other psychiatric complications that can be associated with earthquakes, to provide readers with better overall understanding, and discusses several sociodemographic factors that can be associated with post-earthquake PTSD. Method: A search for literature was conducted on major databases such as MEDLINE, PubMed, EMBASE, and PsycINFO and in neurology and psychiatry journals, and many other medical journals. Terms used for electronic searches included, but were not limited to, posttraumatic stress disorder (PTSD), posttraumatic symptoms, anxiety, depression, major depressive disorder, earthquake, and natural disaster. The relevant information was then utilized to determine the relationships between earthquakes and posttraumatic stress symptoms. Results: It was found that PTSD is the most commonly occurring mental health condition among earthquake survivors. Major depressive disorder, generalized anxiety disorder, obsessive compulsive disorder, social phobia, and specific phobias were also listed. Conclusion: The PTSD prevalence rate varied widely. It was dependent on multiple risk factors in target populations and also on the interval of time that had elapsed between the exposure to the deadly incident and measurement. Females seemed to be the most widely affected group, while elderly people and young children exhibited considerable psychosocial impact.

  16. Recurrence Times of Earthquakes in Oaxaca, México

    Science.gov (United States)

    Nunez-Cornu, F. J.

    2012-12-01

    Oaxaca is the most seismically active region in Mexico, with 68 large events (mb > 6.5; Ms > 7.0) from 1542 to 1989, which implies roughly one large earthquake every 6.5 years; these include an earthquake of M = 8.5 that generated the most important historical tsunami in Mexico. It is also the most studied region from a seismic point of view. Three types of earthquakes take place in the region: low-angle thrust-fault events (associated with the subduction process) with depths between 15 and 25 km; normal-fault events with depths between 65 and 120 km and epicenters north of Oaxaca City (17°N); and normal-fault events with depths between 25 and 40 km and epicenters between the coast and Oaxaca City. A seismogenic zoning based on seismic, tectonic and historical seismicity studies was proposed in 1989; eight zones were defined, two along the coast, one for the isthmus and the rest inland. For most of them a characteristic earthquake (from the earthquakes occurring in the previous 61 years) was assigned, and several models of recurrence times for the different zones were proposed; in some cases these values (94, 80, 68 and 13 years) have a standard deviation error of 20%. In the 23 years since, 4 large earthquakes have occurred in the region that seem to agree with the proposed recurrence models. Here the models are revised using information from the recent earthquakes and new studies in the region.
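The ~6.5-year figure in the abstract follows directly from dividing the catalogue time span by the number of large events, as this small arithmetic sketch shows:

```python
# Sketch: mean recurrence interval implied by the Oaxaca catalogue figures.
n_events = 68             # mb > 6.5 or Ms > 7.0 earthquakes in the catalogue
span_years = 1989 - 1542  # catalogue window, 447 years
mean_recurrence = span_years / n_events  # about 6.6 years, i.e. "every 6.5 years"
```

The zone-specific recurrence models in the abstract (94, 80, 68 and 13 years) refine this region-wide average by assigning each seismogenic zone its own characteristic event rate.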

  17. Procedures and tools used in the investigation of New Zealand's historical earthquakes

    Directory of Open Access Journals (Sweden)

    G. L. Downes

    2004-06-01

    Full Text Available New Zealand's tectonic setting, astride an obliquely convergent tectonic boundary, means that it has experienced many large earthquakes in its 200-year written historical records. The task of identifying and studying the largest early instrumental and pre-instrumental earthquakes, as well as identifying the smaller events, is being actively pursued in order to reduce gaps in knowledge and to ensure as complete and comprehensive a catalogue as is possible. The task of quantifying historical earthquake locations and magnitudes is made difficult by several factors. These include the range of possible earthquake focal depths, and the sparse, temporally- and spatially-variable historical population distribution which affects the availability of felt intensity information, and hence, the completeness levels of the catalogue. This paper overviews the procedures and tools used in the analysis, parameterisation, and recording of historical New Zealand earthquakes, with examples from recently studied historical events. In particular, the 1855 M 8+ Wairarapa earthquake is discussed, as well as its importance for the eminent 19th century British geologist, Sir Charles Lyell, and for future global understanding of the connection between large earthquakes and sudden uplift, tilting and faulting on a regional scale.

  18. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average period of recurrence is about 75 years. For this reason historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, last but not least, as an example of a recently carried out case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  19. Using a UAV for collecting information about a deep-seated landslide in the island of Lefkada following the 17 November 2015 strike-slip earthquake (M=6.5)

    Science.gov (United States)

    Valkaniotis, Sotirios; Ganas, Athanassios; Papathanassiou, George

    2017-04-01

    Documentation of landslides is a very critical issue because effective protection and mitigation measures can be designed only if they are based on accurately provided information. Such documentation aims at a detailed description of the basic geomorphological features, e.g. edge, traces, scarp, while variables such as the landslide area and the volume of the displaced material are also measured. However, it is well known that the mapping of these features is not always feasible due to several adverse factors, e.g. vertical slopes or high risk. In order to overcome this issue, remote sensing techniques have been applied in recent decades. In particular, Interferometric Synthetic Aperture Radar (InSAR), Light Detection and Ranging (LiDAR) and photogrammetric surveys are used for geomorphic mapping in order to quantify landslide processes. The latter is frequently conducted using Unmanned Aerial Vehicles (UAVs), such as multicopters, which are flexible in operating conditions and can be equipped with webcams, digital cameras and other sensors. In addition, UAVs are considered a low-cost imaging technique that offers very high spatial-temporal resolution and flexibility in data acquisition programming. The goal of this study is to provide quantitative data regarding a deep-seated landslide triggered by the 17 November 2015 Greece earthquake (M=6.5; Ganas et al., 2016) in a coastal area of Lefkada that was not accessible on foot; accordingly, a UAV was used to collect the essential information. Ganas, A., et al., Tectonophysics, http://dx.doi.org/10.1016/j.tecto.2016.08.012

  20. Earthquake Risk Mitigation in the Tokyo Metropolitan area

    Science.gov (United States)

    Hirata, N.; Sakai, S.; Kasahara, K.; Nakagawa, S.; Nanjo, K.; Panayotopoulos, Y.; Tsuruoka, H.

    2010-12-01

    Seismic disaster risk mitigation in urban areas is a challenge requiring collaboration among scientific, engineering, and social-science fields. Examples of collaborative efforts include research on detailed plate structure with identification of all significant faults; development of dense seismic networks; strong ground motion prediction, which uses information on near-surface seismic site effects and fault models; earthquake-resistant and earthquake-proof structures; and cross-discipline infrastructure for effective risk mitigation just after catastrophic events. The risk mitigation strategy for the next great earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern because this subduction zone produced past mega-thrust earthquakes, such as the 1703 Genroku earthquake (magnitude M8.0) and the 1923 Kanto earthquake (M7.9), which caused 105,000 fatalities. A M7 or greater (M7+) earthquake in this area at present has high potential to produce devastating loss of life and property with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that an M7+ earthquake would cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) in economic loss. This earthquake is evaluated to occur with a probability of 70% in 30 years by the Earthquake Research Committee of Japan. In order to mitigate disaster for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area (2007-2011) was launched in collaboration with scientists, engineers, and social scientists at institutions nationwide. The results obtained in the respective fields will be integrated before project termination to improve information on the strategy assessment for seismic risk mitigation in the Tokyo metropolitan area. In this talk, we give an outline of our project as an example of collaborative research on earthquake risk mitigation.
Discussion is extended to our effort in progress and
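A "70% in 30 years" occurrence probability, as quoted above, can be converted to an equivalent annual rate if one assumes a time-independent Poisson process; that assumption is this sketch's, not necessarily the committee's actual hazard model:

```python
import math

# Sketch: converting the quoted 30-year occurrence probability into an
# equivalent annual rate under a Poisson (time-independent) assumption.
p30, window = 0.70, 30.0
rate = -math.log(1.0 - p30) / window   # events per year, ~0.040
p10 = 1.0 - math.exp(-rate * 10.0)     # implied 10-year probability, ~0.33
```

Conversions like this are common when comparing hazard statements made over different time windows, though real assessments often use renewal models rather than a constant rate.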

  1. Acceptability of an intelligent wireless sensor system for the rapid detection of health issues: findings among home-dwelling older adults and their informal caregivers.

    Science.gov (United States)

    Cohen, Christine; Kampel, Thomas; Verloo, Henk

    2016-01-01

    Aging at home rather than in an institution is now considered the gold standard. Public health figures document an important demographic transition to an increasingly elderly society. At the same time, this is accompanied by the emergence of significant numbers of innovative technologies to help and support home-dwelling older adults in declining health who wish to remain at home. To explore the acceptability of intelligent wireless sensor system (IWSS) among home-dwelling older adults in rapidly detecting their health issues. Data were sourced from a pilot 3-month randomized clinical trial that involved 34 older patients in the experimental group (EG) using an IWSS to rapidly detect falls and other health issues at home. The effectiveness of the IWSS was assessed by comparing it to participants' functional and cognitive status, as measured both before and after the trial. The Resident Assessment Instrument for Home Care, Confusion Assessment Method, Cognitive Performance Scale, Geriatric Depression Scale, and Informed Questionnaire on Cognitive Decline in the Elderly were used for the assessments. Acceptability of the IWSS was explored at the end of the study. Both older adults and their informal caregivers considered the performance and usefulness of the IWSS intervention to be low to moderate. A majority of the participants were unsatisfied with its ease of use and found multiple obstacles in using and having an intention to use the IWSS. However, their informal caregivers were more satisfied with the program and gave higher scores for usefulness, ease of use, and intention to use IWSS technology. The IWSS displayed low-to-moderate acceptability among the older participants and their informal caregivers. We recommend improving and clarifying several components in the IWSS for the development of a design that is user-centered.

  2. Identified EM Earthquake Precursors

    Science.gov (United States)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance making it a high quality insulator. Penetrative flow could not be corroborated as well, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. 
The antennae have mobility and observations were noted for

  3. Middle school students' earthquake content and preparedness knowledge - A mixed method study

    Science.gov (United States)

    Henson, Harvey, Jr.

    The purpose of this study was to assess the effect of earthquake instruction on students' earthquake content knowledge and earthquake preparedness. This study used innovative direct instruction on earthquake science content and concepts, with an inquiry-based group activity on earthquake safety, followed by an earthquake simulation and preparedness video, to help middle school students understand and prepare for the regional seismic threat. A convenience sample of 384 sixth- and seventh-grade students at two small middle schools in southern Illinois was used in this study. Qualitative information was gathered using open-ended survey questions, classroom observations, and semi-structured interviews. Quantitative data were collected using a 21-item content questionnaire administered to test students' General Earthquake Knowledge, Local Earthquake Knowledge, and Earthquake Preparedness Knowledge before and after instruction. A pre-test and post-test survey with a 21-item Likert scale was used to collect students' perceptions and attitudes. Qualitative data analysis included quantification of student responses to the open-ended questions and thematic analysis of observation notes and interview transcripts. Quantitative datasets were analyzed using descriptive and inferential statistical methods, including t tests to evaluate the differences in mean scores between paired groups before and after interventions and one-way analysis of variance (ANOVA) to test for differences between mean scores of the comparison groups. Significant mean differences between groups were further examined using Dunnett's C post hoc statistical analysis. Integration and interpretation of the qualitative and quantitative results of the study revealed a significant increase in general, local and preparedness earthquake knowledge among middle school students after the interventions. The findings specifically indicated that these students felt most aware and prepared for an earthquake after an
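The paired pre/post comparison described above rests on the paired t statistic, t = mean(d) / (s_d / sqrt(n)) for the score differences d. A minimal sketch with invented scores (not the study's data):

```python
import math
from statistics import mean, stdev

# Sketch of a paired t-test on pre- and post-instruction scores.
# The score lists are hypothetical illustrative data.
def paired_t(pre, post):
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    # t statistic with n - 1 degrees of freedom
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

pre  = [10, 12, 9, 14, 11, 13]
post = [14, 15, 12, 16, 13, 17]
t = paired_t(pre, post)  # large positive t -> post scores exceed pre scores
```

The resulting t is compared against the t distribution with n - 1 degrees of freedom; the study additionally used one-way ANOVA and Dunnett's C for between-group comparisons.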

  4. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  5. Do Earthquakes Shake Stock Markets?

    Science.gov (United States)

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  6. Earthquake Early Warning for the 2016 Kumamoto Earthquake: Performance Evaluation of the Current System and Simulations of the Next-Generation Methods of the Japan Meteorological Agency

    Science.gov (United States)

    Kodera, Y.; Saitou, J.; Hayashimoto, N.; Adachi, S.; Morimoto, M.; Nishimae, Y.; Hoshiba, M.

    2016-12-01

    The sequence of the 2016 Kumamoto earthquake (the Kumamoto earthquake) was a period of extremely high seismic activity across Kumamoto and Oita prefectures in Japan beginning April 14, 2016 at 21:26 (JST). The Earthquake Early Warning (EEW) system of the Japan Meteorological Agency (JMA) processed a large amount of earthquake data and issued 19 warnings (higher-grade alerts) and 175 forecasts (lower-grade alerts) for the Kumamoto earthquake from April 14 to 30. Especially on April 14 and 16, the system operated under one of the highest workload conditions since JMA started its EEW service. We evaluated the system performance for cases where a warning was issued and/or a maximum seismic intensity of ≥5L (on the JMA scale) was actually observed, calculating prediction scores and lapse times from detection. The result shows that the system rapidly disseminated highly accurate EEW announcements for most of the devastating earthquakes and did not miss or seriously under-predict strong motions. On the other hand, the system issued over-predicted warnings when multiple simultaneous earthquakes occurred within a short distance, comparable to the spacing of the seismic observation network. We also simulated the Integrated Particle Filter (IPF) and Propagation of Local Undamped Motion (PLUM) methods, scheduled to be implemented in the JMA EEW system to minimize over-prediction for multiple simultaneous earthquakes and under-prediction for huge earthquakes (M > 8). The simulation results indicate that the IPF method is highly effective for the cases where the current system issued over-predicted warnings, owing to its classification algorithm using amplitude data and its hypocenter determination that is robust against outliers among trigger data. The results also show that the PLUM method contributes to more rapid warning issuance for the devastating earthquakes, owing to its denser seismic observation network.
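The core idea behind PLUM is to skip hypocenter and magnitude estimation entirely and predict shaking at a site directly from intensities already observed nearby. A simplified reading of that concept, with an invented station layout and a fixed 30 km radius (a common choice in descriptions of the method, though the operational parameters are JMA's):

```python
import math

# Minimal sketch of the PLUM concept: the predicted intensity at a target
# site is the maximum real-time intensity observed at stations within a
# fixed radius, with no source parameter estimation. Data are invented.
def plum_predict(target, stations, radius_km=30.0):
    best = 0.0
    for x, y, intensity in stations:
        if math.hypot(x - target[0], y - target[1]) <= radius_km:
            best = max(best, intensity)
    return best

# (x, y) in km on a local grid; intensity on an instrumental scale
obs = [(5.0, 0.0, 5.5), (20.0, 10.0, 4.8), (80.0, 0.0, 6.1)]
pred = plum_predict((0.0, 0.0), obs)  # the 80-km station is out of range
```

Because the prediction only propagates observed shaking over short distances, it degrades gracefully for complex or simultaneous events where source-based methods mislocate the hypocenter.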

  7. Dynamic strains for earthquake source characterization

    Science.gov (United States)

    Barbour, Andrew J.; Crowell, Brendan W

    2017-01-01

    Strainmeters measure elastodynamic deformation associated with earthquakes over a broad frequency band, with detection characteristics that complement traditional instrumentation, but they are commonly used to study slow transient deformation along active faults and at subduction zones, for example. Here, we analyze dynamic strains at Plate Boundary Observatory (PBO) borehole strainmeters (BSM) associated with 146 local and regional earthquakes from 2004–2014, with magnitudes from M 4.5 to 7.2. We find that peak values in seismic strain can be predicted from a general regression against distance and magnitude, with improvements in accuracy gained by accounting for biases associated with site–station effects and source–path effects, the latter exhibiting the strongest influence on the regression coefficients. To account for the influence of these biases in a general way, we include crustal‐type classifications from the CRUST1.0 global velocity model, which demonstrates that high‐frequency strain data from the PBO BSM network carry information on crustal structure and fault mechanics: earthquakes nucleating offshore on the Blanco fracture zone, for example, generate consistently lower dynamic strains than earthquakes around the Sierra Nevada microplate and in the Salton trough. Finally, we test our dynamic strain prediction equations on the 2011 M 9 Tohoku‐Oki earthquake, specifically continuous strain records derived from triangulation of 137 high‐rate Global Navigation Satellite System Earth Observation Network stations in Japan. Moment magnitudes inferred from these data and the strain model are in agreement when Global Positioning System subnetworks are unaffected by spatial aliasing.
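The general regression described above predicts peak dynamic strain from magnitude and distance, in the form log10(peak strain) = a + b·M + c·log10(R). A sketch of that functional form follows; the coefficients here are placeholders for illustration, not the fitted values from the paper:

```python
import math

# Sketch of a magnitude-distance regression for peak dynamic strain:
#   log10(strain) = A + B * M + C * log10(R)
# A, B, C below are hypothetical placeholder coefficients, not the
# values fitted by Barbour & Crowell.
A, B, C = -10.0, 1.0, -1.5

def peak_strain(mag: float, dist_km: float) -> float:
    return 10 ** (A + B * mag + C * math.log10(dist_km))

s = peak_strain(6.0, 100.0)  # dimensionless strain
```

The paper's refinement is to add bias terms for site-station and source-path effects (e.g. crustal-type classes from CRUST1.0) on top of this baseline form.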

  8. The Virtual Quake Earthquake Simulator: Earthquake Probability Statistics for the El Mayor-Cucapah Region and Evidence of Predictability in Simulated Earthquake Sequences

    Science.gov (United States)

    Schultz, K.; Yoder, M. R.; Heien, E. M.; Rundle, J. B.; Turcotte, D. L.; Parker, J. W.; Donnellan, A.

    2015-12-01

    We introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. To quantify this, we develop an alert-based forecasting metric similar to those presented in Keilis-Borok (2002) and Molchan (1997), and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation- versus quiescence-type earthquake triggering. We show that VQ exhibits both behaviors separately for independent fault sections; some fault sections exhibit activation-type triggering, while others are better characterized by quiescence-type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA, and northern Baja California, Mexico.
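
    An alert-based metric of the kind cited above can be sketched in the spirit of Molchan-style error diagrams: raise an alert whenever recent seismicity exceeds a threshold, then score the fraction of large events caught against the fraction of time spent in alert. The threshold and data below are illustrative, not VQ output.

```python
# Hedged sketch of an alert-based forecast score: "information gain" here is
# the hit rate relative to a random alarm occupying the same time fraction.

def alert_metric(rates, large_event_times, threshold):
    """rates: per-step seismicity counts; alert when count >= threshold."""
    alert = [r >= threshold for r in rates]
    tau = sum(alert) / len(alert)                 # fraction of time in alert
    hits = sum(1 for t in large_event_times if alert[t])
    hit_rate = hits / len(large_event_times)
    gain = hit_rate / tau if tau > 0 else 0.0     # > 1 beats a random alarm
    return tau, hit_rate, gain
```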

  9. Earthquake Source Mechanics

    Science.gov (United States)

    The past 2 decades have seen substantial progress in our understanding of the nature of the earthquake faulting process, but increasingly, the subject has become an interdisciplinary one. Thus, although the observation of radiated seismic waves remains the primary tool for studying earthquakes (and has been increasingly focused on extracting the physical processes occurring in the “source”), geological studies have also begun to play a more important role in understanding the faulting process. Additionally, defining the physical underpinning for these phenomena has come to be an important subject in experimental and theoretical rock mechanics. In recognition of this, a Maurice Ewing Symposium was held at Arden House, Harriman, N.Y. (the former home of the great American statesman Averell Harriman), May 20-23, 1985. The purpose of the meeting was to bring together the international community of experimentalists, theoreticians, and observationalists who are engaged in the study of various aspects of earthquake source mechanics. The conference was attended by more than 60 scientists from nine countries (France, Italy, Japan, Poland, China, the United Kingdom, the United States, the Soviet Union, and the Federal Republic of Germany).

  10. Fault lubrication during earthquakes.

    Science.gov (United States)

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m s⁻¹) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, elucidating constraints are derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s⁻¹. The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  11. The 2016 Central Italy Earthquake: an Overview

    Science.gov (United States)

    Amato, A.

    2016-12-01

    The M6 central Italy earthquake occurred on the seismic backbone of Italy, in the middle of the highest-hazard belt. The shock hit suddenly during the night of August 24, when people were asleep; no foreshocks occurred before the main event. The earthquake ruptured from 10 km depth to the surface and produced more than 17,000 aftershocks (as of Oct. 19) spread over a 40×20 km² area elongated NW-SE. It is geologically very similar to previous recent events in the Apennines: both the 2009 L'Aquila earthquake to the south and the 1997 Colfiorito earthquake to the north were characterized by the activation of adjacent fault segments. Despite its magnitude and the well-known seismic hazard of the region, the earthquake produced extensive damage and 297 fatalities. The town of Amatrice, which paid the highest toll, had been classified in zone 1 (the highest) since 1915, but the buildings in this and other villages proved highly vulnerable. In contrast, in the town of Norcia, which also experienced strong ground shaking, no collapses occurred, most likely due to the retrofitting carried out after an earthquake in 1979. Soon after the quake, the INGV Crisis Unit convened at night in the Rome headquarters to coordinate the activities. The first field teams reached the epicentral area at 7 a.m. with portable seismic stations to monitor the aftershocks; other teams followed to map surface faults and damage, measure GPS sites, install instruments for site-response studies, and so on. The INGV Crisis Unit includes the Press Office and the INGVterremoti team, in order to manage and coordinate communication towards the Civil Protection Dept. (DPC), the media and the web. Several tens of reports and updates were delivered to DPC in the first month of the sequence. Partly because of the controversial situation arising from the L'Aquila earthquake and trials, particular attention was given to communication: continuous and timely information has been released to

  12. Real-time performance of probabilistic, first-motion earthquake mechanisms to improve tsunami early-warning

    Science.gov (United States)

    Lomax, Anthony; Michelini, Alberto; Bernardi, Fabrizio; Scognamiglio, Laura

    2017-04-01

    The first tsunami warning messages are typically based on simple earthquake parameters: epicenter location, hypocenter depth, and magnitude. The addition of early information on the faulting mechanism can enable more reliable estimates of seafloor uplift, tsunami excitation, tsunami potential and impact, and earlier, real-time tsunami scenario forecasting. Full-waveform, centroid moment tensor (CMT) solutions are typically available in 3-15 min for local/near-regional earthquakes and in 10-30 min for regional/teleseismic distances. In contrast, classic, P first-motion (FM) focal mechanisms can be available within 3 min for local/near-regional events and in 5-10 min for regional/teleseismic distances. We present fmamp, a robust, probabilistic, adaptive grid-search, FM mechanism determination procedure which generates a comprehensive set of "acceptable" FM mechanisms and related uncertainties. This FM solution, combined with fast magnitude estimates such as Mwp, forms a CMT proxy for rapid source characterization and analysis before a definitive, waveform CMT is available. Currently, fmamp runs in real-time in Early-est*, the module for rapid earthquake detection, location and analysis at the INGV tsunami alert center (CAT, "Centro di Allerta Tsunami"), part of the Italian, candidate Tsunami Watch Provider. We show the real-time performance of fmamp and compare its speed and accuracy to CMT results. For large earthquakes in areas of sparse seismic station coverage, fmamp mechanisms are typically available in 5-10 min, while CMT results take 10-30 min. The fmamp solutions usually agree with CMT results for larger events, but sometimes differ, due to insufficient or noisy FM readings, or a real difference between the FM mechanism, representing the faulting at the hypocenter, and the CMT mechanism, representing some average, centroid faulting. * http://early-est.alomax.net, http://early-est.rm.ingv.it, http://alomax.free.fr/posters/early-est
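
    The core of a first-motion grid search of the kind the record describes can be sketched by scoring polarity agreement over a strike/dip/rake grid. This is a hedged, brute-force illustration, not the fmamp algorithm itself (fmamp is probabilistic and adaptive); the moment-tensor components follow the standard Aki & Richards strike/dip/rake convention (x north, y east, z down), and the P polarity is the sign of the radiation term γᵀMγ along the ray.

```python
# Hedged sketch of a P first-motion grid search (not fmamp's actual code).
import math

def moment_tensor(strike, dip, rake):
    """Double-couple moment tensor (unit moment), Aki & Richards convention."""
    s, d, r = (math.radians(v) for v in (strike, dip, rake))
    mxx = -(math.sin(d) * math.cos(r) * math.sin(2 * s) + math.sin(2 * d) * math.sin(r) * math.sin(s) ** 2)
    mxy = math.sin(d) * math.cos(r) * math.cos(2 * s) + 0.5 * math.sin(2 * d) * math.sin(r) * math.sin(2 * s)
    mxz = -(math.cos(d) * math.cos(r) * math.cos(s) + math.cos(2 * d) * math.sin(r) * math.sin(s))
    myy = math.sin(d) * math.cos(r) * math.sin(2 * s) - math.sin(2 * d) * math.sin(r) * math.cos(s) ** 2
    myz = -(math.cos(d) * math.cos(r) * math.sin(s) - math.cos(2 * d) * math.sin(r) * math.cos(s))
    mzz = math.sin(2 * d) * math.sin(r)
    return ((mxx, mxy, mxz), (mxy, myy, myz), (mxz, myz, mzz))

def p_polarity(mech, takeoff, azimuth):
    """Sign of P radiation gamma^T M gamma for a ray (degrees, z down)."""
    i, a = math.radians(takeoff), math.radians(azimuth)
    g = (math.sin(i) * math.cos(a), math.sin(i) * math.sin(a), math.cos(i))
    m = moment_tensor(*mech)
    amp = sum(g[p] * m[p][q] * g[q] for p in range(3) for q in range(3))
    return 1 if amp >= 0 else -1

def grid_search(observations, step=10):
    """observations: (takeoff, azimuth, polarity). Return best mechanisms + score."""
    best, best_score = [], -1
    for strike in range(0, 360, step):
        for dip in range(step, 91, step):
            for rake in range(-180, 180, step):
                mech = (strike, dip, rake)
                score = sum(1 for t, a, pol in observations
                            if p_polarity(mech, t, a) == pol)
                if score > best_score:
                    best, best_score = [mech], score
                elif score == best_score:
                    best.append(mech)   # keep the whole "acceptable" set
    return best, best_score
```

    Keeping every tied mechanism, rather than a single winner, mirrors the idea of an "acceptable" solution set with uncertainties.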

  13. Introduction: seismology and earthquake engineering in Mexico and Central and South America.

    Science.gov (United States)

    Espinosa, A.F.

    1982-01-01

    The results from seismological studies that are used by the engineering community are just one of the benefits obtained from research aimed at mitigating the earthquake hazard. In this issue of the Earthquake Information Bulletin, current programs in seismology and earthquake engineering, seismic networks, future plans, and some of the cooperative programs with different international organizations are described by Latin-American seismologists. The article describes the development of seismology in Latin America and the seismological interest of the OAS. -P.N.Chroston

  14. Effect of interior changes on earthquake resistance of buildings - case: reinforced concrete frame system

    OpenAIRE

    Moosavi, Mahsa Sadat Fard

    2013-01-01

    Earthquakes are among the most destructive natural hazards, causing enormous losses of life and property. Earthquake engineering has established itself as an interdisciplinary subject over the past few decades: different professions, such as seismology, structural and geotechnical engineering, architecture, urban planning, information technology and some of the social sciences, have begun to address different characteristic effects on the earthquake resistance of buildings. The purpose ...

  15. Along-trench variation in seafloor displacements after the 2011 Tohoku earthquake.

    Science.gov (United States)

    Tomita, Fumiaki; Kido, Motoyuki; Ohta, Yusaku; Iinuma, Takeshi; Hino, Ryota

    2017-07-01

    The 2011 Tohoku-oki earthquake was the largest earthquake ever observed with seafloor geodetic techniques in and around its source region. Large crustal deformation associated with both the coseismic rupture and the rapid postseismic deformation has been reported. However, these observations are insufficient to describe the postseismic deformation processes occurring around the broad rupture area. We report the first results of seafloor Global Positioning System and acoustic ranging (GPS-A) observations based on repeated campaign surveys conducted over nearly 4 years using the extended GPS-A network deployed along the Japan Trench in September 2012. The observed postseismic displacement rates (DRs) show evident spatial variation along the trench: (i) distinct landward DRs in the large coseismic slip area [primary rupture area (PRA)], evidencing the predominance of viscoelastic relaxation; (ii) remarkable trenchward DRs in the south of the PRA, indicating rapid afterslip; and (iii) slight trenchward DRs in the north of the PRA. These features provide great insights into constructing a more complete model of viscoelastic relaxation, and they also indicate spatial variation of afterslip and fault locking along the plate interface with clear spatial resolution, providing invaluable information for the improvement of seismic hazard assessment.
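
    A displacement rate (DR) of the kind reported above can be estimated from repeated campaign positions by a least-squares line fit. The sketch below uses made-up station values for illustration, not numbers from the study.

```python
# Hedged sketch: least-squares slope of position vs. time for a GPS-A
# campaign series (pure Python, no dependencies).

def displacement_rate(times, positions):
    """times in decimal years, positions in cm; returns slope in cm/yr."""
    n = len(times)
    tbar = sum(times) / n
    xbar = sum(positions) / n
    num = sum((t - tbar) * (x - xbar) for t, x in zip(times, positions))
    den = sum((t - tbar) ** 2 for t in times)
    return num / den

# Hypothetical trenchward motion of one station surveyed over ~4 years:
campaigns = [2012.7, 2013.5, 2014.4, 2015.3, 2016.5]
east_cm = [0.0, 4.1, 8.7, 13.2, 19.4]
rate = displacement_rate(campaigns, east_cm)  # cm/yr, positive = trenchward
```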

  16. Ion torrent personal genome machine sequencing for genomic typing of Neisseria meningitidis for rapid determination of multiple layers of typing information.

    Science.gov (United States)

    Vogel, Ulrich; Szczepanowski, Rafael; Claus, Heike; Jünemann, Sebastian; Prior, Karola; Harmsen, Dag

    2012-06-01

    Neisseria meningitidis causes invasive meningococcal disease in infants, toddlers, and adolescents worldwide. DNA sequence-based typing, including multilocus sequence typing, analysis of genetic determinants of antibiotic resistance, and sequence typing of vaccine antigens, has become the standard for molecular epidemiology of the organism. However, PCR of multiple targets and consecutive Sanger sequencing provide logistic constraints to reference laboratories. Taking advantage of the recent development of benchtop next-generation sequencers (NGSs) and of BIGSdb, a database accommodating and analyzing genome sequence data, we therefore explored the feasibility and accuracy of Ion Torrent Personal Genome Machine (PGM) sequencing for genomic typing of meningococci. Three strains from a previous meningococcus serogroup B community outbreak were selected to compare conventional typing results with data generated by semiconductor chip-based sequencing. In addition, sequencing of the meningococcal type strain MC58 provided information about the general performance of the technology. The PGM technology generated sequence information for all target genes addressed. The results were 100% concordant with conventional typing results, with no further editing being necessary. In addition, the amount of typing information, i.e., nucleotides and target genes analyzed, could be substantially increased by the combined use of genome sequencing and BIGSdb compared to conventional methods. In the near future, affordable and fast benchtop NGS machines like the PGM might enable reference laboratories to switch to genomic typing on a routine basis. This will reduce workloads and rapidly provide information for laboratory surveillance, outbreak investigation, assessment of vaccine preventability, and antibiotic resistance gene monitoring.
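
    The multilocus sequence typing (MLST) bookkeeping the record describes boils down to mapping an allele profile at seven housekeeping loci to a sequence type (ST). The locus names below are the standard N. meningitidis MLST scheme, but the allele profiles and ST numbers are made up for illustration; real assignments come from the PubMLST/BIGSdb databases.

```python
# Hedged illustration of MLST sequence-type lookup (profiles are hypothetical).

MLST_LOCI = ("abcZ", "adk", "aroE", "fumC", "gdh", "pdhC", "pgm")

# Hypothetical ST definitions: allele profile -> sequence type number.
ST_TABLE = {
    (2, 3, 4, 3, 8, 4, 6): 11,
    (1, 1, 2, 1, 3, 2, 3): 32,
}

def sequence_type(profile):
    """profile: dict locus -> allele number; returns ST or None if novel."""
    key = tuple(profile[locus] for locus in MLST_LOCI)
    return ST_TABLE.get(key)
```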

  17. A 'new generation' earthquake catalogue

    Directory of Open Access Journals (Sweden)

    E. Boschi

    2000-06-01

    Full Text Available In 1995, we published the first release of the Catalogo dei Forti Terremoti in Italia, 461 a.C. - 1980, in Italian (Boschi et al., 1995). Two years later this was followed by a second release, again in Italian, that included more earthquakes, more accurate research and a longer time span (461 B.C. to 1990) (Boschi et al., 1997). Aware that the record of Italian historical seismicity is probably the most extensive of the whole world, and hence that our catalogue could be of interest for a wider international readership, Italian was clearly not the appropriate language to share this experience with colleagues from foreign countries. Three years after publication of the second release therefore, and after much additional research and fine tuning of methodologies and algorithms, I am proud to introduce this third release in English. All the tools and accessories have been translated along with the texts describing the development of the underlying research strategies and current contents. The English title is Catalogue of Strong Italian Earthquakes, 461 B.C. to 1997. This Preface briefly describes the scientific context within which the Catalogue of Strong Italian Earthquakes was conceived and progressively developed. The catalogue is perhaps the most important outcome of a well-established joint project between the Istituto Nazionale di Geofisica, the leading Italian institute for basic and applied research in seismology and solid earth geophysics, and SGA (Storia Geofisica Ambiente), a private firm specialising in the historical investigation and systematisation of natural phenomena. In her contribution "Method of investigation, typology and taxonomy of the basic data: navigating between seismic effects and historical contexts", Emanuela Guidoboni outlines the general framework of modern historical seismology, its complex relation with instrumental seismology on the one hand and historical research on the other. This presentation also highlights

  18. Global volcanic earthquake swarm database and preliminary analysis of volcanic earthquake swarm duration

    Directory of Open Access Journals (Sweden)

    S. R. McNutt

    1996-06-01

    Full Text Available Global data from 1979 to 1989 pertaining to volcanic earthquake swarms have been compiled into a custom-designed relational database. The database is composed of three sections: 1) a section containing general information on volcanoes, 2) a section containing earthquake swarm data (such as dates of swarm occurrence and durations), and 3) a section containing eruption information. The most abundant and reliable parameter, duration of volcanic earthquake swarms, was chosen for preliminary analysis. The distribution of all swarm durations was found to have a geometric mean of 5.5 days. Precursory swarms were then separated from those not associated with eruptions. The geometric mean precursory swarm duration was 8 days whereas the geometric mean duration of swarms not associated with eruptive activity was 3.5 days. Two groups of precursory swarms are apparent when duration is compared with the eruption repose time. Swarms with durations shorter than 4 months showed no clear relationship with the eruption repose time. However, the second group, lasting longer than 4 months, showed a significant positive correlation with the log10 of the eruption repose period. The two groups suggest that different suites of physical processes are involved in the generation of volcanic earthquake swarms.
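
    The duration statistic used above is the geometric mean, i.e. the exponential of the mean log duration. The durations below are illustrative values, not records from the database.

```python
# Geometric mean of swarm durations: exp of the mean of the logs
# (equivalently, the n-th root of the product).
import math

def geometric_mean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

precursory = [2.0, 8.0, 32.0]      # days, hypothetical swarm durations
non_precursory = [1.0, 4.0, 16.0]
```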

  19. Earthquake Loss Scenarios: Warnings about the Extent of Disasters

    Science.gov (United States)

    Wyss, M.; Tolis, S.; Rosset, P.

    2016-12-01

    It is imperative that losses expected due to future earthquakes be estimated. Officials and the public need to be aware of what disaster is likely in store for them in order to reduce the fatalities and efficiently help the injured. Scenarios for earthquake parameters can be constructed to a reasonable accuracy in highly active earthquake belts, based on knowledge of seismotectonics and history. Because of the inherent uncertainties of loss estimates, however, it would be desirable for more than one group to calculate an estimate for the same area. By discussing these estimates, one may find a consensus on the range of the potential disasters and persuade officials and residents of the reality of the earthquake threat. Modelling a scenario and estimating earthquake losses requires sufficiently accurate data sets describing the number of people present, the built environment and, if possible, the transmission of seismic waves. As examples we use loss estimates for possible repeats of historic earthquakes in Greece that occurred between 464 B.C. and A.D. 700. We model future large Greek earthquakes as having M6.8 and rupture lengths of 60 km. In four locations where historic earthquakes with serious losses have occurred, we estimate that 1,000 to 1,500 people might perish, with roughly four times that number injured. Defining the area of influence of these earthquakes as that with shaking intensities greater than or equal to V, we estimate that 1.0 to 2.2 million people in about 2,000 settlements may be affected. We calibrate the QLARM tool for calculating intensities and losses in Greece, using the M6 1999 Athens earthquake and matching the isoseismal information for six earthquakes which occurred in Greece during the last 140 years. Comparing fatality numbers that would occur theoretically today with the numbers reported, and correcting for the increase in population, we estimate that the improvement of the building stock has reduced the mortality and injury rate in Greek
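
    The intensity-based exposure idea above can be sketched with a generic intensity prediction equation of the form I = a + b*M - c*log10(R), inverted to find the radius within which shaking reaches intensity V or more. The coefficients below are illustrative round numbers, not QLARM's calibrated values.

```python
# Hedged sketch of an intensity prediction equation and its inversion
# (coefficients are assumptions, not calibrated values).
import math

def intensity(mag, dist_km, a=1.5, b=1.5, c=3.0):
    return a + b * mag - c * math.log10(max(dist_km, 1.0))

def radius_of_intensity(mag, target=5.0, a=1.5, b=1.5, c=3.0):
    """Largest distance (km) at which predicted intensity still reaches target."""
    return 10 ** ((a + b * mag - target) / c)

# An M6.8 scenario event then defines a circular "area of influence" (I >= V)
# whose radius grows with magnitude.
r68 = radius_of_intensity(6.8)
```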

  20. Identification at the crime scene: The sooner, the better? The interpretation of rapid identification information by CSIs at the crime scene.

    Science.gov (United States)

    de Gruijter, Madeleine; Nee, Claire; de Poot, Christianne J

    2017-07-01

    New technologies will allow Crime Scene Investigators (CSIs) in the near future to analyse traces at the crime scene and receive identification information while still conducting the investigation. These developments could have considerable effects on the way an investigation is conducted. CSIs may start reasoning based on possible database matches, which could influence scenario formation (i.e. the construction of narratives that explain the observed traces) during very early phases of the investigation. The goal of this study is to gain more insight into the influence of rapid identification information on the reconstruction of the crime and the evaluation of traces by addressing two questions, namely 1) is scenario formation influenced by the moment at which ID information is provided, and 2) do database matches influence the evaluation of traces and the reconstruction of the crime? We asked 48 CSIs from England to investigate a potential murder crime scene on a computer. Our findings show that the interpretation of the crime scene by CSIs is affected by the moment identification information is provided. This information has a greater influence on scenario formation when provided after an initial scenario has been formed. Also, CSIs seem to attach great value to traces that produce matches with databases and hence yield the name of a known person. Similar traces that did not produce matches were considered less important. We question whether this kind of selective attention is desirable, as it may cause other relevant information at the crime scene to be ignored. Copyright © 2017 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.

  1. OPERATIONAL EARTHQUAKE FORECASTING. State of Knowledge and Guidelines for Utilization

    Directory of Open Access Journals (Sweden)

    Koshun Yamaoka

    2011-08-01

    earthquake forecasting as the principal means for gathering and disseminating authoritative information about time-dependent seismic hazards to help communities prepare for potentially destructive earthquakes. On short time scales of days and weeks, earthquake sequences show clustering in space and time, as indicated by the aftershocks triggered by large events. Statistical descriptions of clustering explain many features observed in seismicity catalogs, and they can be used to construct forecasts that indicate how earthquake probabilities change over the short term. Properly applied, short-term forecasts have operational utility; for example, in anticipating aftershocks that follow large earthquakes. Although the value of long-term forecasts for ensuring seismic safety is clear, the interpretation of short-term forecasts is problematic, because earthquake probabilities may vary over orders of magnitude but typically remain low in an absolute sense (< 1% per day). Translating such low-probability forecasts into effective decision-making is a difficult challenge. Reports on the current utilization of operational forecasting in earthquake risk management were compiled for six countries with high seismic risk: China, Greece, Italy, Japan, Russia and the United States. Long-term models are currently the most important forecasting tools for civil protection against earthquake damage, because they guide earthquake safety provisions of building codes, performance-based seismic design, and other risk-reducing engineering practices, such as retrofitting to correct design flaws in older buildings. Short-term forecasting of aftershocks is practiced by several of the countries surveyed, but operational earthquake forecasting has not been fully implemented (i.e., regularly updated and on a national scale) in any of them. Based on the experience accumulated in seismically active regions, the ICEF has provided to DPC a set of recommendations on the utilization of operational forecasting in Italy

  2. Real-time earthquake monitoring for tsunami warning in the Indian Ocean and beyond

    Science.gov (United States)

    Hanka, W.; Saul, J.; Weber, B.; Becker, J.; Harjadi, P.; Fauzi; Gitews Seismology Group

    2010-12-01

    The Mw = 9.3 Sumatra earthquake of 26 December 2004 generated a tsunami that affected the entire Indian Ocean region and caused approximately 230 000 fatalities. In response to this tragedy the German government funded the German Indonesian Tsunami Early Warning System (GITEWS) Project. The task of the GEOFON group of GFZ Potsdam was to develop and implement the seismological component. In this paper we describe the concept of the GITEWS earthquake monitoring system and report on its present status. The major challenge for earthquake monitoring within a tsunami warning system is to deliver rapid information about location, depth, size and possibly other source parameters. This is particularly true for coastlines adjacent to the potential source areas, such as the Sunda trench, where these parameters are required within a few minutes after the event in order to be able to warn the population before the potential tsunami hits the neighbouring coastal areas. Therefore, the key to a seismic monitoring system with warning times short enough for Indonesia is a dense real-time seismic network across Indonesia, with densifications close to the Sunda trench. A substantial number of supplementary stations in other Indian Ocean rim countries are added to strengthen the teleseismic monitoring capabilities. The installation of the new GITEWS seismic network, consisting of 31 combined broadband and strong-motion stations (21 of them in Indonesia), is almost complete. The real-time data collection uses a private VSAT communication system with hubs in Jakarta and Vienna. In addition, all available seismic real-time data from the other seismic networks in Indonesia and other Indian Ocean rim countries are also acquired directly by VSAT or by Internet at the Indonesian Tsunami Warning Centre in Jakarta, and the resulting "virtual" network of more than 230 stations can be used jointly for seismic data processing. The seismological processing software as part

  3. The SCEC/USGS dynamic earthquake rupture code verification exercise

    Science.gov (United States)

    Harris, R.A.; Barall, M.; Archuleta, R.; Dunham, E.; Aagaard, Brad T.; Ampuero, J.-P.; Bhat, H.; Cruz-Atienza, Victor M.; Dalguer, L.; Dawson, P.; Day, S.; Duan, B.; Ely, G.; Kaneko, Y.; Kase, Y.; Lapusta, N.; Liu, Yajing; Ma, S.; Oglesby, D.; Olsen, K.; Pitarka, A.; Song, S.; Templeton, E.

    2009-01-01

    Numerical simulations of earthquake rupture dynamics are now common, yet it has been difficult to test the validity of these simulations because there have been few field observations and no analytic solutions with which to compare the results. This paper describes the Southern California Earthquake Center/U.S. Geological Survey (SCEC/USGS) Dynamic Earthquake Rupture Code Verification Exercise, where codes that simulate spontaneous rupture dynamics in three dimensions are evaluated and the results produced by these codes are compared using Web-based tools. This is the first time that a broad and rigorous examination of numerous spontaneous rupture codes has been performed—a significant advance in this science. The automated process developed to attain this achievement provides for a future where testing of codes is easily accomplished. Scientists who use computer simulations to understand earthquakes utilize a range of techniques. Most of these assume that earthquakes are caused by slip at depth on faults in the Earth, but thereafter the strategies vary. Among the methods used in earthquake mechanics studies are kinematic approaches and dynamic approaches. The kinematic approach uses a computer code that prescribes the spatial and temporal evolution of slip on the causative fault (or faults). These types of simulations are very helpful, especially since they can be used in seismic data inversions to relate the ground motions recorded in the field to slip on the fault(s) at depth. However, these kinematic solutions generally provide no insight into the physics driving the fault slip or information about why the involved fault(s) slipped that much (or that little). In other words, these kinematic solutions may lack information about the physical dynamics of earthquake rupture that will be most helpful in forecasting future events. To help address this issue, some researchers use computer codes to numerically simulate earthquakes and construct dynamic, spontaneous

  4. SU-E-T-23: A Developing Australian Network for Datamining and Modelling Routine Radiotherapy Clinical Data and Radiomics Information for Rapid Learning and Clinical Decision Support

    Energy Technology Data Exchange (ETDEWEB)

    Thwaites, D [University of Sydney, Camperdown, Sydney (Australia); Holloway, L [Ingham Institute, Sydney, NSW (Australia); Bailey, M; Carolan, M; Miller, A [Illawarra Cancer Care Centre, Wollongong, NSW (Australia); Barakat, S; Field, M [University of Sydney, Sydney, NSW (Australia); Delaney, G; Vinod, S [Liverpool Hospital, Liverpool, NSW (Australia); Dekker, A [Maastro Clinic, Maastricht (Netherlands); Lustberg, T; Soest, J van; Walsh, S [MAASTRO Clinic, Maastricht (Netherlands)

    2015-06-15

    Purpose: Large amounts of routine radiotherapy (RT) data are available, which can potentially add clinical evidence to support better decisions. A developing collaborative Australian network, with a leading European partner, aims to validate, implement and extend European predictive models (PMs) for Australian practice and assess their impact on future patient decisions. Wider objectives include: developing multi-institutional rapid learning, using distributed learning approaches; and assessing and incorporating radiomics information into PMs. Methods: Two initial standalone pilots were conducted; one on NSCLC, the other on larynx, patient datasets in two different centres. Open-source rapid learning systems were installed, for data extraction and mining to collect relevant clinical parameters from the centres’ databases. The European DSSs were learned (“training cohort”) and validated against local data sets (“clinical cohort”). Further NSCLC studies are underway in three more centres to pilot a wider distributed learning network. Initial radiomics work is underway. Results: For the NSCLC pilot, 159/419 patient datasets were identified meeting the PM criteria, and hence eligible for inclusion in the curative clinical cohort (for the larynx pilot, 109/125). Some missing data were imputed using Bayesian methods. For both, the European PMs successfully predicted prognosis groups, but with some differences in practice reflected. For example, the PM-predicted good prognosis NSCLC group was differentiated from a combined medium/poor prognosis group (2YOS 69% vs. 27%, p<0.001). Stage was less discriminatory in identifying prognostic groups. In the good prognosis group two-year overall survival was 65% in curatively and 18% in palliatively treated patients. Conclusion: The technical infrastructure and basic European PMs support prognosis prediction for these Australian patient groups, showing promise for supporting future personalized treatment decisions

  5. The Early Warning System(EWS) as First Stage to Generate and Develop Shake Map for Bucharest to Deep Vrancea Earthquakes

    Science.gov (United States)

    Marmureanu, G.; Ionescu, C.; Marmureanu, A.; Grecu, B.; Cioflan, C.

    2007-12-01

    EWS made by NIEP is the first European system for real-time early detection and warning of the seismic waves in case of strong deep earthquakes. EWS uses the time interval (28-32 seconds) between the moment when earthquake is detected by the borehole and surface local accelerometers network installed in the epicenter area (Vrancea) and the arrival time of the seismic waves in the protected area, to deliver timely integrated information in order to enable actions to be taken before a main destructive shaking takes place. Early warning system is viewed as part of an real-time information system that provide rapid information, about an earthquake impeding hazard, to the public and disaster relief organizations before (early warning) and after a strong earthquake (shake map).This product is fitting in with other new product on way of National Institute for Earth Physics, that is, the shake map which is a representation of ground shaking produced by an event and it will be generated automatically following large Vrancea earthquakes. Bucharest City is located in the central part of the Moesian platform (age: Precambrian and Paleozoic) in the Romanian Plain, at about 140 km far from Vrancea area. Above a Cretaceous and a Miocene deposit (with the bottom at roundly 1,400 m of depth), a Pliocene shallow water deposit (~ 700m thick) was settled. The surface geology consists mainly of Quaternary alluvial deposits. Later loess covered these deposits and the two rivers crossing the city (Dambovita and Colentina) carved the present landscape. During the last century Bucharest suffered heavy damage and casualties due to 1940 (Mw = 7.7) and 1977 (Mw = 7.4) Vrancea earthquakes. For example, 32 high tall buildings collapsed and more then 1500 people died during the 1977 event. 
The innovation over comparable systems worldwide is that NIEP will use the EWS to generate a virtual shake map for Bucharest (140 km from the epicentre) immediately after the magnitude is estimated.
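The 28-32 s window quoted above follows from simple travel-time arithmetic: the S waves need far longer to reach Bucharest than the P waves need to reach the accelerometers sitting above the deep Vrancea source. A minimal sketch, using illustrative average velocities and a hypothetical lumped processing delay rather than NIEP's operational parameters:

```python
# Hedged sketch: estimate the early-warning lead time for a target city,
# assuming straight-ray travel at constant average velocities.
# All numbers are illustrative, not NIEP's operational values.

def warning_time(epicentral_distance_km, focal_depth_km,
                 vp_km_s=6.5, vs_km_s=3.5, processing_s=3.0):
    """Seconds of warning before the S waves arrive at the target city."""
    # The P wave must first reach the accelerometers above the source,
    # then the system needs some time to trigger and disseminate an alert.
    p_detect = focal_depth_km / vp_km_s + processing_s
    hypocentral = (epicentral_distance_km**2 + focal_depth_km**2) ** 0.5
    s_arrival = hypocentral / vs_km_s  # S-wave travel time to the city (s)
    return s_arrival - p_detect

# Bucharest is ~140 km from Vrancea; strong Vrancea events are very deep.
lead = warning_time(140.0, 130.0)  # ~31.6 s, within the 28-32 s window
```

With these assumed velocities, the deep source is what makes the scheme work: the P wave reaches the epicentral stations quickly while the S wave still has a long path to Bucharest.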

  6. Seismic imaging under the 2013 Ms 7.0 Lushan Earthquake, China

    Science.gov (United States)

    Wang, Z.

    2013-12-01

    On 20 April 2013, a large earthquake (Ms 7.0) occurred at the southern end of the Longmen-Shan fault zone. More than 200 people were killed and about 14,000 injured. The earthquake had several distinct features: 1) its hypocenter is located close to that of the devastating 2008 M 8.0 Wenchuan earthquake, which also occurred on the Longmen-Shan fault zone; 2) it occurred no more than five years after the M 8.0 event; 3) its magnitude was as large as 7.0 despite this short interval; 4) its hypocenter lies in the southern part of the Longmen-Shan fault zone, about 70 km from the Wenchuan source. These features have led many researchers to wonder about its nucleation mechanism, its rupture process, and its relationship to the Wenchuan earthquake. Analysis of global seismic waveform data shows where the rupture of the 2013 Ms 7.0 Lushan earthquake initiated and how it expanded. Our seismic imaging and crustal stress analysis indicate that the hypocenter of the Lushan earthquake lies in a zone of high velocity (Vp, Vs), low Poisson's ratio, and high crustal stress. A similar high-velocity, low-Poisson's-ratio, high-stress zone is revealed under the source area of the 2008 Wenchuan earthquake (Ms 8.0). However, a sharply contrasting gap zone with low-velocity, high-Poisson's-ratio anomalies is clearly imaged under the junction between the two earthquake sources. We suggest that the strong structural variation and high crustal stress, together with the high coseismic stress imposed by the Wenchuan earthquake, triggered the 2013 Lushan earthquake (Ms 7.0) and controlled its rupture process. 
We believe that rapid seismic imaging together with crustal stress analysis can help in understanding the Lushan earthquake's generation and in evaluating the possibility of

  7. Acceptability of an intelligent wireless sensor system for the rapid detection of health issues: findings among home-dwelling older adults and their informal caregivers

    Directory of Open Access Journals (Sweden)

    Cohen C

    2016-09-01

    Christine Cohen, Thomas Kampel, Henk Verloo Department Ra&D, La Source School of Nursing Sciences, University of Applied Sciences and Arts Western Switzerland, Lausanne, Switzerland Background: Aging at home rather than in an institution is now considered the gold standard. Public health figures document an important demographic transition to an increasingly elderly society. At the same time, significant numbers of innovative technologies are emerging to help and support home-dwelling older adults in declining health who wish to remain at home. Study aim: To explore the acceptability of an intelligent wireless sensor system (IWSS) among home-dwelling older adults for rapidly detecting their health issues. Methods: Data were sourced from a pilot 3-month randomized clinical trial in which 34 older patients in the experimental group (EG) used an IWSS to rapidly detect falls and other health issues at home. The effectiveness of the IWSS was assessed by comparing it to participants' functional and cognitive status, measured both before and after the trial. The Resident Assessment Instrument for Home Care, Confusion Assessment Method, Cognitive Performance Scale, Geriatric Depression Scale, and Informed Questionnaire on Cognitive Decline in the Elderly were used for the assessments. Acceptability of the IWSS was explored at the end of the study. Results: Both older adults and their informal caregivers considered the performance and usefulness of the IWSS intervention to be low to moderate. A majority of the participants were unsatisfied with its ease of use and found multiple obstacles to using, and intending to use, the IWSS. However, their informal caregivers were more satisfied with the program and gave higher scores for usefulness, ease of use, and intention to use IWSS technology. Conclusion: The IWSS displayed low-to-moderate acceptability among the older participants and their informal caregivers. We

  8. Proceedings of the 11th United States-Japan natural resources panel for earthquake research, Napa Valley, California, November 16–18, 2016

    Science.gov (United States)

    Detweiler, Shane; Pollitz, Fred

    2017-10-18

    The UJNR Panel on Earthquake Research promotes advanced research toward a more fundamental understanding of the earthquake process and hazard estimation. The Eleventh Joint Meeting was extremely beneficial in furthering cooperation and deepening understanding of problems common to both Japan and the United States. The meeting included productive exchanges of information on approaches to systematic observation and modeling of earthquake processes. Regarding the earthquake and tsunami of March 2011 off the Pacific coast of Tohoku and the 2016 Kumamoto earthquake sequence, the Panel recognizes that further efforts are necessary to achieve our common goal of reducing earthquake risk through close collaboration and focused discussions at the 12th UJNR meeting.

  9. Impact of experience when using the Rapid Upper Limb Assessment to assess postural risk in children using information and communication technologies.

    Science.gov (United States)

    Chen, Janice D; Falkmer, Torbjörn; Parsons, Richard; Buzzard, Jennifer; Ciccarelli, Marina

    2014-05-01

    The Rapid Upper Limb Assessment (RULA) is an observation-based screening tool that has been used to assess the postural risks of children in school settings. Studies using eye-tracking technology suggest that visual search strategies are influenced by experience in the task performed. This study investigated whether experience in postural risk assessment contributed to differences in RULA outcome scores and in the visual search strategies used. While wearing an eye-tracker, 16 student occupational therapists and 16 experienced occupational therapists used the RULA to assess 11 video scenarios of a child using different mobile information and communication technologies (ICT) in the home environment. No significant differences in RULA outcome scores and no conclusive differences in visual search strategies were found between the groups. RULA can be used as a screening tool for postural risks after a short training session, regardless of the assessor's experience in postural risk assessment. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  10. The Quake-Catcher Network: Improving Earthquake Strong Motion Observations Through Community Engagement

    Science.gov (United States)

    Cochran, E. S.; Lawrence, J. F.; Christensen, C. M.; Chung, A. I.; Neighbors, C.; Saltzman, J.

    2010-12-01

    The Quake-Catcher Network (QCN) involves the community in strong-motion data collection by utilizing volunteer computing techniques and low-cost MEMS accelerometers. Volunteer computing provides a mechanism to expand strong-motion seismology with minimal infrastructure costs while promoting community participation in science. Micro-Electro-Mechanical Systems (MEMS) triaxial accelerometers can be attached to a desktop computer via USB and are internal to many laptops. Preliminary shake table tests show that the MEMS accelerometers can record high-quality seismic data, with instrument response similar to research-grade strong-motion sensors. QCN began distributing sensors and software to K-12 schools and the general public in April 2008 and has grown to roughly 1500 stations worldwide. We also recently tested whether sensors could be quickly deployed as part of a Rapid Aftershock Mobilization Program (RAMP) following the 2010 M8.8 Maule, Chile earthquake. Volunteers are recruited through media reports, web-based sensor request forms, and social networking sites. Using the data collected to date, we examine whether a distributed sensing network can provide valuable seismic data for earthquake detection and characterization while promoting community participation in earthquake science. We utilize client-side triggering algorithms to determine when significant ground shaking occurs, and this metadata is sent to the main QCN server. On average, trigger metadata are received within 1-10 seconds of the observation of a trigger; the larger data latencies are correlated with greater server-station distances. When triggers are detected, we determine whether they correlate with others in the network using spatial and temporal clustering of the incoming trigger information. If a minimum number of triggers is detected, a QCN event is declared and an initial earthquake location and magnitude are estimated. 
Initial analysis suggests that the estimated locations and magnitudes are
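The trigger-association logic described above can be sketched as a simple space-time clustering rule. The thresholds, minimum trigger count, and distance approximation below are illustrative assumptions, not QCN's actual parameters:

```python
# Hedged sketch of declaring a candidate event from station triggers:
# group triggers that are close in time and space, and declare an event
# when enough stations agree. All thresholds are illustrative.
import math

def declare_event(triggers, max_dt_s=10.0, max_dist_km=100.0, min_triggers=4):
    """triggers: list of (time_s, lat, lon). Returns a cluster or None."""
    def dist_km(a, b):
        # small-angle flat-earth approximation, fine at ~100 km scales
        dlat = (a[1] - b[1]) * 111.0
        dlon = (a[2] - b[2]) * 111.0 * math.cos(math.radians(a[1]))
        return math.hypot(dlat, dlon)

    triggers = sorted(triggers)  # sort by trigger time
    for t0 in triggers:
        cluster = [t for t in triggers
                   if abs(t[0] - t0[0]) <= max_dt_s
                   and dist_km(t, t0) <= max_dist_km]
        if len(cluster) >= min_triggers:
            return cluster
    return None
```

A crude initial location could then be taken as the centroid of the clustered stations, refined as later triggers arrive.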

  11. Testing a time-domain regional moment tensor inversion program for large worldwide earthquakes

    Science.gov (United States)

    Richter, G.; Hoffmann, M.; Hanka, W.; Saul, J.

    2009-04-01

    After obtaining an accurate source location and magnitude estimate for a large earthquake, the direction of plate movement is the next important piece of information for reliable hazard assessment. For this purpose, rapid moment tensor inversions are necessary. In this study, the time-domain moment tensor inversion program of Dreger (2001) is tested. This program for regional moment tensor solutions is applied to seismic data from regional stations of the GEOFON network and international cooperating partner networks (InaTEWS, IRIS, GEOFON Extended Virtual Network) to obtain moment tensor solutions for large earthquakes worldwide. The motivation of the study is to have rapid information on the direction of plate motion for verifying the tsunami-generation hazard of earthquakes. A special interest lies in application to the Indonesian archipelago, to integrate the program into the German-Indonesian Tsunami Early Warning System (GITEWS). Performing the inversion on a single CPU of an ordinary PC, most solutions are achieved within half an hour of origin time. The program starts automatically for large earthquakes detected by the seismic analysis tool SeisComP3 (Hanka et al., 2008). Data from seismic stations at distances up to 2000 km are selected, prepared, and quality controlled. First, the program searches for the best automatic solution by varying the source depth. Testing different station combinations for the inversion makes it possible to assess the stability of the solution. For further optimization of the solution, interactive selection of the available stations is facilitated. The results for over 200 events are compared to centroid moment tensor solutions from the Global CMT Project, MedNet/INGV, and NEID to evaluate their accuracy. The inversion in the time domain is sensitive to uncertainties in the velocity model and in the source location. These resolution limits are visible in the waveform fits. 
Another reason for misfits is strong structural inhomogeneities
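The depth search described above can be illustrated with a toy linear inversion: for each trial depth, solve the least-squares system relating the six moment tensor components to the data through that depth's Green's functions, and keep the depth with the best variance reduction. The random Green's functions below are stand-ins for real ones; this is a sketch, not Dreger's code:

```python
# Hedged sketch of the depth search in a linear moment-tensor inversion.
import numpy as np

def variance_reduction(data, synthetic):
    """1.0 means a perfect waveform fit."""
    return 1.0 - np.sum((data - synthetic) ** 2) / np.sum(data ** 2)

def best_depth_solution(data, greens_by_depth):
    """greens_by_depth: {depth_km: G matrix, shape (nsamples, 6)}."""
    best = None
    for depth, G in greens_by_depth.items():
        m, *_ = np.linalg.lstsq(G, data, rcond=None)  # 6 MT components
        vr = variance_reduction(data, G @ m)
        if best is None or vr > best[2]:
            best = (depth, m, vr)
    return best  # (depth_km, moment_tensor, variance_reduction)

# Synthetic check: build data from a known tensor at 10 km depth.
rng = np.random.default_rng(0)
greens = {d: rng.standard_normal((200, 6)) for d in (5, 10, 15)}
true_m = np.array([1.0, -0.5, -0.5, 0.2, 0.0, 0.3])
data = greens[10] @ true_m
depth, m_est, vr = best_depth_solution(data, greens)
```

Because the synthetic data lie exactly in the range of the 10 km Green's functions, the search recovers that depth with a variance reduction of essentially 1.0; with real data, velocity-model and location errors degrade the fit, as the abstract notes.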

  12. Overview of the critical disaster management challenges faced during Van 2011 earthquakes.

    Science.gov (United States)

    Tolon, Mert; Yazgan, Ufuk; Ural, Derin N; Goss, Kay C

    2014-01-01

    On October 23, 2011, an M7.2 earthquake caused damage across a widespread area of Van province in eastern Turkey. This strong earthquake was followed by an M5.7 earthquake on November 9, 2011. The sequence of damaging earthquakes led to 644 fatalities. Management during and after this earthquake disaster posed many critical challenges. In this article, an overview of these challenges is presented, based on the authors' observations in the aftermath of the disaster. The article presents the characteristics of the 2011 Van earthquakes. Afterward, key information related to the four main phases of the disaster in Van (i.e., preparedness, mitigation, response, and recovery) is presented. Potential strategies to improve disaster management practice are identified, and a set of recommendations is proposed to improve the existing situation.

  13. Make an Earthquake: Ground Shaking!

    Science.gov (United States)

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore factors that affect the extent of earthquake damage and to learn ways to reduce that damage. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  14. Earthquakes Threaten Many American Schools

    Science.gov (United States)

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  15. A mixed-methods study on perceptions towards use of Rapid Ethical Assessment to improve informed consent processes for health research in a low-income setting.

    Science.gov (United States)

    Addissie, Adamu; Davey, Gail; Newport, Melanie J; Addissie, Thomas; MacGregor, Hayley; Feleke, Yeweyenhareg; Farsides, Bobbie

    2014-05-02

    Rapid Ethical Assessment (REA) is a form of rapid ethnographic assessment conducted at the beginning of a research project to guide the consent process, with the objective of reconciling universal ethical guidance with specific research contexts. The current study assessed the perceived relevance of introducing REA as a mainstream tool in Ethiopia. Mixed-methods research using a sequential explanatory approach was conducted from July to September 2012, comprising 241 cross-sectional, self-administered questionnaires and 19 qualitative, in-depth interviews among health researchers and regulators, including ethics committee members, in Ethiopian health research institutions and universities. In their evaluation of the consent process, only 40.2% thought that the consent process and the information given were adequately understood by study participants; 84.6% claimed they were not satisfied with the current consent process, and 85.5% thought the best interests of study participants were not adequately considered. Commonly mentioned consent-related problems included lack of clarity (48.1%), inadequate information (34%), language barriers (28.2%), cultural differences (27.4%), undue expectations (26.6%), and power imbalances (20.7%). About 95.4% believed that consent should be contextualized to the study setting, and 39.4% thought REA would be an appropriate approach to address the perceived problems. Qualitative findings helped to further explore the gaps identified in the quantitative findings and to map out concerns related to the current research consent process in Ethiopia. Suggestions included conducting REA during the pre-test (pilot) phase of studies when applicable. The need for clear guidance for researchers on issues such as when and how to apply REA tools was stressed. The study findings clearly indicated that there are perceived, correctable gaps in the consent process of medical research in Ethiopia. 
REA is considered relevant by researchers and stakeholders

  16. From a physical approach to earthquake prediction, towards long and short term warnings ahead of large earthquakes

    Science.gov (United States)

    Stefansson, R.; Bonafede, M.

    2012-04-01

    For 20 years the South Iceland Seismic Zone (SISZ) was a test site for multinational earthquake prediction research, partly bridging the gap between laboratory test samples and the huge transform zones of the Earth. The approach was to explore the physics of the processes leading up to large earthquakes. The book Advances in Earthquake Prediction, Research and Risk Mitigation by R. Stefansson (2011), published by Springer/PRAXIS, and an article in the August issue of the BSSA by Stefansson, M. Bonafede, and G. Gudmundsson (2011) contain a good overview of the findings and further references, as well as examples of partially successful long- and short-term warnings based on this approach. Significant findings are: Earthquakes that occurred hundreds of years ago left scars in the crust, expressed in volumes of heterogeneity that indicate the size of their faults. Rheology and stress heterogeneity within these volumes vary significantly in time and space. Crustal processes in and near such faults may be observed through microearthquake information decades before the sudden onset of a new large earthquake. High-pressure fluids of mantle origin may, in response to strain, especially near plate boundaries, migrate upward into the brittle/elastic crust and play a significant role in modifying crustal conditions over both long and short terms. The preparatory processes of different earthquakes cannot be expected to be the same. We learn about an impending earthquake by observing long-term preparatory processes at the fault, finding a constitutive relationship that governs those processes, and then extrapolating that relationship into nearby space and the near future. This is a deterministic approach to earthquake prediction research. Such extrapolations contain many uncertainties. However, the long-term pattern of observations of the pre-earthquake fault process will help us place probability constraints on our extrapolations and our warnings. 
The approach described is different from the usual

  17. ElarmS Earthquake Early Warning System Updates and Performance

    Science.gov (United States)

    Chung, A. I.; Allen, R. M.; Hellweg, M.; Henson, I. H.; Neuhauser, D. S.

    2015-12-01

    The ElarmS earthquake early warning algorithm has been detecting earthquakes throughout California since 2007. It is one of the algorithms contributing to CISN's ShakeAlert, a prototype earthquake early warning system being developed for California. Overall, ElarmS performance has been excellent. Over the past year (July 1, 2014 - July 1, 2015), ElarmS successfully detected all but three of the significant earthquakes (M4+) that occurred within California. Of the 24 events that were detected, the most notable was the M6.0 South Napa earthquake of August 24, 2014. The first alert for this event was sent in 5.1 seconds, with an initial magnitude estimate of M5.7, providing approximately 8 seconds of warning of the impending S-wave arrival to the city of San Francisco. The magnitude estimate increased to the final value of M6.0 within 15 seconds of the initial alert. One of the three missed events occurred within 30 seconds of the M6.0 Napa mainshock; the other two occurred offshore, in a region of sparse station coverage near Eureka. Since its inception, ElarmS has evolved and adapted to meet new challenges. On May 30, 2015, an extraordinarily deep (678 km) M7.8 teleseism in Japan generated five false detections of events greater than M4 within a minute, owing to the simultaneous arrival of the P-waves at stations throughout California. To improve the speed and accuracy of ElarmS detections, we are currently exploring new methodologies to quickly evaluate incoming triggers from individual stations. Rapidly determining whether a trigger at a given station is due to a local earthquake or some other source (such as a distant teleseism) could dramatically increase confidence in individual triggers and reduce false alerts.

  18. Impacts of the 2010 Haitian earthquake in the diaspora: findings from Little Haiti, Miami, FL.

    Science.gov (United States)

    Kobetz, Erin; Menard, Janelle; Kish, Jonathan; Bishop, Ian; Hazan, Gabrielle; Nicolas, Guerda

    2013-04-01

    In January 2010, a massive earthquake struck Haiti, causing unprecedented damage. Little attention, however, has focused on the earthquake's mental health impact on the Haitian diaspora community. As part of an established community-based participatory research initiative in Little Haiti, the predominantly Haitian neighborhood of Miami, FL, USA, community health workers surveyed neighborhood residents about earthquake-related losses, coping strategies, and depressive/traumatic symptomatology. The findings reveal that the earthquake strongly impacted the diaspora community and highlight prominent coping strategies. Following the earthquake, only a small percentage of participants self-reported engaging in any negative health behaviors; instead, a majority relied on their social networks for support. This study contributes to the discourse on designing culturally responsive mental health initiatives for the Haitian diaspora and on the ability of existing community-academic partnerships to adapt rapidly to community needs.

  19. Sea-ice information co-management: Planning for sustainable multiple uses of ice-covered seas in a rapidly changing Arctic

    Science.gov (United States)

    Eicken, H.; Lovecraft, A. L.

    2012-12-01

    A thinner, less extensive and more mobile summer sea-ice cover is a major element and driver of Arctic Ocean change. Declining summer sea ice presents Arctic stakeholders with substantial challenges and opportunities from the perspective of sustainable ocean use and derivation of sea-ice or ecosystem services. Sea-ice use by people and wildlife as well as its role as a major environmental hazard focuses the interests and concerns of indigenous hunters and Arctic coastal communities, resource managers and the maritime industry. In particular, rapid sea-ice change and intensifying offshore industrial activities have raised fundamental questions as to how best to plan for and manage multiple and increasingly overlapping ocean and sea ice uses. The western North American Arctic - a region that has seen some of the greatest changes in ice and ocean conditions in the past three decades anywhere in the North - is the focus of our study. Specifically, we examine the important role that relevant and actionable sea-ice information can play in allowing stakeholders to evaluate risks and reconcile overlapping and potentially competing interests. Our work in coastal Alaska suggests that important prerequisites to address such challenges are common values, complementary bodies of expertise (e.g., local or indigenous knowledge, engineering expertise, environmental science) and a forum for the implementation and evaluation of a sea-ice data and information framework. Alongside the International Polar Year 2007-08 and an associated boost in Arctic Ocean observation programs and platforms, there has been a movement towards new governance bodies that have these qualities and can play a central role in guiding the design and optimization of Arctic observing systems. 
To help further the development of such forums, an evaluation of the density and spatial distribution of institutions (i.e., the rule sets that govern ocean use), as well as the use of scenario planning and analysis, can serve as

  20. Methods of a multi-faceted rapid knowledge synthesis project to inform the implementation of a new health service model: Collaborative Emergency Centres.

    Science.gov (United States)

    Hayden, Jill A; Killian, Lara; Zygmunt, Austin; Babineau, Jessica; Martin-Misener, Ruth; Jensen, Jan L; Carter, Alix J

    2015-01-14

    The aim of this rapid knowledge synthesis was to provide relevant research evidence to inform the implementation of a new health service in Nova Scotia, Canada: Collaborative Emergency Centres (CECs). CECs propose to deliver both primary and urgent care to rural populations where traditional delivery is a challenge. This paper reports on the methods used in a rapid knowledge synthesis project to provide timely evidence to policy makers about this novel healthcare delivery model. We used a variety of methods, including a jurisdictional/scoping review, modified systematic review methodologies, and integrated knowledge translation. We scanned publicly available information about similar centres across our country to identify important components of CECs and CEC-type models to operationalize the definition of a CEC. We conducted literature searches in PubMed, CINAHL, and EMBASE, and in the grey literature, to identify evidence on the key structures and processes and effectiveness of CEC-type models of care delivery. Our searches were limited to published systematic reviews. The research team facilitated two integrated knowledge translation workshops during the project to engage stakeholders, to refine the research goals and objectives, and to share interim and final results. Citations and included articles were categorized by whether they addressed the CEC model or component structures and processes. Data and key messages were extracted from these reviews to inform implementation. CEC-type models have limited peer-reviewed evidence available; no peer-reviewed studies on CECs as a standalone healthcare model were found. As a result, our evidence search and synthesis was revised to focus on core CEC-type structures and processes, prioritized through consensus methods with the stakeholder group, and resulted in provision of a meaningful evidence synthesis to help inform the development and implementation of CECs in Nova Scotia. A variety of methods and partnership with

  1. Bringing science from the top of the world to the rest of the world: using video to describe earthquake research in Nepal following the devastating 2015 M7.8 Gorkha earthquake

    Science.gov (United States)

    Karplus, M. S.; Barajas, A.; Garibay, L.

    2016-12-01

    In response to the April 25, 2015 M7.8 earthquake on the Main Himalayan Thrust in Nepal, NSF Geosciences funded a rapid seismological response project entitled NAMASTE (Nepal Array Measuring Aftershock Seismicity Trailing Earthquake). The project included the deployment, maintenance, and demobilization of a network of 45 temporary seismic stations from June 2015 to May 2016. During the demobilization of the seismic network, video footage was recorded to tell the story of the NAMASTE team's seismic research in Nepal through short movies. In this presentation, we describe these movies and discuss our strategies for effectively communicating this research to both academic and general audiences, with the goals of promoting awareness of earthquake hazards and international research and of inspiring enthusiasm for learning about and participating in science. For example, an initial screening of the videos took place in an Introduction to Geology class at the University of Texas at El Paso, to obtain feedback from approximately 100 first-year students with only a basic geology background. The feedback was then used to inform final cuts of the video suitable for a range of audiences, and to help guide future videography of field work. The footage is also being cut into a short, three-minute video to be featured on the website of The University of Texas at El Paso, home to several of the NAMASTE team researchers.

  2. Are Earthquakes a Critical Phenomenon?

    Science.gov (United States)

    Ramos, O.

    2014-12-01

    Earthquakes, granular avalanches, superconducting vortices, solar flares, and even stock markets are known to evolve through power-law-distributed events. For decades, the formalism of equilibrium phase transitions has labeled these phenomena critical, which implies that they are also unpredictable. This work revisits these ideas and uses earthquakes as the paradigm to demonstrate that slowly driven systems evolving through uncorrelated and power-law distributed avalanches (UPLA) are not necessarily critical systems, and therefore not necessarily unpredictable. By linking the correlation length to the probability density function (pdf) of the distribution, and comparing it with the one obtained at a critical point, a condition for criticality is introduced. Simulations in the classical Olami-Feder-Christensen (OFC) earthquake model confirm the findings, showing that earthquakes are not a critical phenomenon. However, a single catastrophic earthquake may show critical properties and, paradoxically, the emergence of this temporal critical behaviour may eventually carry precursory signs of catastrophic events.
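The OFC model mentioned above is simple enough to sketch: a grid of stresses is driven slowly, and any site reaching the threshold topples, passing a fraction alpha of its stress to each of its four neighbors (alpha < 0.25 makes the model non-conservative). A minimal illustrative implementation, not the author's simulation code:

```python
# Hedged sketch of the Olami-Feder-Christensen (OFC) earthquake model
# on a square grid with open boundaries. Parameters are illustrative.
import numpy as np

def ofc_avalanche_sizes(n=32, alpha=0.2, steps=2000, seed=0):
    """Run the OFC model; return the avalanche size at each drive step."""
    rng = np.random.default_rng(seed)
    stress = rng.uniform(0, 1, size=(n, n))
    sizes = []
    for _ in range(steps):
        # slow drive: raise all sites uniformly until the max hits threshold 1
        stress += 1.0 - stress.max()
        size = 0
        unstable = np.argwhere(stress >= 1.0)
        while len(unstable):
            for i, j in unstable:
                s = stress[i, j]
                stress[i, j] = 0.0  # toppling site relaxes completely
                size += 1
                # pass alpha * s to each neighbor; stress sent off-grid
                # is lost (open boundaries), so the model is dissipative
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < n and 0 <= nj < n:
                        stress[ni, nj] += alpha * s
            unstable = np.argwhere(stress >= 1.0)
        sizes.append(size)
    return sizes
```

After a transient, the avalanche sizes of the slowly driven model develop the broad, approximately power-law distribution that the abstract's criticality test is applied to.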

  3. Toward a comprehensive areal model of earthquake-induced landslides

    Science.gov (United States)

    Miles, S.B.; Keefer, D.K.

    2009-01-01

    This paper provides a review of regional-scale modeling of earthquake-induced landslide hazard with respect to the needs for disaster risk reduction and sustainable development. Based on this review, it sets out important research themes and suggests computing with words (CW), a methodology that includes fuzzy logic systems, as a fruitful modeling methodology for addressing many of these research themes. A range of research, reviewed here, has been conducted applying CW to various aspects of earthquake-induced landslide hazard zonation, but none facilitate comprehensive modeling of all types of earthquake-induced landslides. A new comprehensive areal model of earthquake-induced landslides (CAMEL) is introduced here that was developed using fuzzy logic systems. CAMEL provides an integrated framework for modeling all types of earthquake-induced landslides using geographic information systems. CAMEL is designed to facilitate quantitative and qualitative representation of terrain conditions and knowledge about these conditions on the likely areal concentration of each landslide type. CAMEL is highly modifiable and adaptable; new knowledge can be easily added, while existing knowledge can be changed to better match local knowledge and conditions. As such, CAMEL should not be viewed as a complete alternative to other earthquake-induced landslide models. CAMEL provides an open framework for incorporating other models, such as Newmark's displacement method, together with previously incompatible empirical and local knowledge. © 2009 ASCE.

  4. Debris flow susceptibility assessment after the 2008 Wenchuan earthquake

    Science.gov (United States)

    Fan, Xuanmei; van Westen, Cees; Tang, Chenxiao; Tang, Chuan

    2014-05-01

    Due to the tremendous amount of loose material from landslides that occurred during the Wenchuan earthquake, the frequency and magnitude of debris flows have increased immensely, causing many casualties and economic losses. This study attempts to assess post-earthquake debris flow susceptibility based on catchment units in Wenchuan County, one of the counties most severely damaged by the earthquake. The post-earthquake debris flow inventory was created by remote sensing (RS) image interpretation and field survey. Based on our knowledge of the field area, several relevant factors were chosen as indicators of post-earthquake debris flow occurrence, including distance to the fault surface rupture, peak ground acceleration (PGA), coseismic landslide density, rainfall data, internal relief, slope, drainage density, stream steepness index, and existing mitigation works. These indicators were then used as inputs to a heuristic model developed by adapting the Spatial Multi-Criteria Evaluation (SMCE) method. The relative importance of the indicators was evaluated according to their contributions to the debris flow events that have occurred since the earthquake. The ultimate goal of this study is to estimate the relative likelihood of debris flow occurrence in each catchment, and to use this result together with elements-at-risk and vulnerability information to assess the changing risk of the most susceptible catchments.
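The SMCE-style combination of indicators described above amounts to standardizing each indicator and taking a weighted sum per catchment. The indicator names, values, and weights below are illustrative placeholders, not the values calibrated in the study:

```python
# Hedged sketch of a Spatial Multi-Criteria Evaluation (SMCE) style
# susceptibility score: min-max standardize each indicator across
# catchments, then take a weighted sum. Weights/values are illustrative.

def standardize(values):
    """Min-max scale a list of raw indicator values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def susceptibility(catchments, weights):
    """catchments: {name: {indicator: raw value}}; weights sum to 1."""
    names = list(catchments)
    scores = {n: 0.0 for n in names}
    for ind, w in weights.items():
        std = standardize([catchments[n][ind] for n in names])
        for n, s in zip(names, std):
            scores[n] += w * s
    return scores

weights = {"landslide_density": 0.4, "pga": 0.3, "internal_relief": 0.3}
catchments = {
    "A": {"landslide_density": 12.0, "pga": 0.6, "internal_relief": 900},
    "B": {"landslide_density": 3.0,  "pga": 0.3, "internal_relief": 400},
    "C": {"landslide_density": 8.0,  "pga": 0.5, "internal_relief": 700},
}
scores = susceptibility(catchments, weights)  # ranks A > C > B
```

In an actual SMCE application, the weights would come from expert judgment or pairwise comparison rather than being fixed by hand, and the standardized scores would feed a GIS layer rather than a dictionary.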

  5. GIS learning tool for world's largest earthquakes and their causes

    Science.gov (United States)

    Chatterjee, Moumita

    The objective of this thesis is to increase awareness about earthquakes among people, especially young students, by showing the five largest and two most predictable earthquake locations in the world and their plate tectonic settings. This is a geographic, interactive tool that can be used for learning about the causes of great earthquakes in the past and the safest places on Earth for avoiding the direct effects of earthquakes. This approach provides an effective way of learning for students, as it is very user friendly and more aligned to the interests of the younger generation. In this tool the user can click on various points located on the world map; each point opens a picture and a link to the webpage for that location, showing detailed information about its earthquake history, including the magnitudes and years of past quakes and the plate tectonic settings that made the place earthquake prone. Apart from the earthquake-related information, students will also be able to customize the tool to suit their needs or interests. Students can add/remove layers, measure the distance between any two points on the map, select any place on the map and learn more about it, create a layer for detailed analysis, run a query, change display settings, etc. At the end, the user goes through the earthquake safety guidelines in order to be safe during an earthquake. This tool uses Java as its programming language and uses Map Objects Java Edition (MOJO), provided by ESRI. The tool was developed for educational purposes, so its interface has been kept simple and easy to use so that students can gain maximum knowledge from it instead of having a hard time installing it. There are many details to explore, which can demonstrate what a GIS-based tool is capable of. The only thing needed to run this tool is the latest Java edition installed on the machine.
This approach makes study more fun and

  6. Tien Shan Geohazards Database: Earthquakes and landslides

    Science.gov (United States)

    Havenith, H. B.; Strom, A.; Torgoev, I.; Torgoev, A.; Lamair, L.; Ischuk, A.; Abdrakhmatov, K.

    2015-11-01

    In this paper we present new, and review already existing, landslide and earthquake data for a large part of the Tien Shan, Central Asia. For the same area, only partial databases for sub-regions had been presented previously. These were compiled, and new data were added to fill the gaps between the databases. Major new inputs are products of the Central Asia Seismic Risk Initiative (CASRI): a tentative digital map of active faults (with indication of characteristic or possible maximum magnitude) and the earthquake catalogue of Central Asia through 2009, now updated with USGS data (to May 2014). The newly compiled landslide inventory contains existing records of 1600 previously mapped mass movements and more than 1800 new landslide data. Considering presently available seismo-tectonic and landslide data, a target region of 1200 km (E-W) by 600 km (N-S) was defined for the production of more or less continuous geohazards information. This target region includes the entire Kyrgyz Tien Shan, the South-Western Tien Shan in Tajikistan, the Fergana Basin (Kyrgyzstan, Tajikistan and Uzbekistan) as well as the Western part in Uzbekistan, the North-Easternmost part in Kazakhstan and a small part of the Eastern Chinese Tien Shan (for the zones outside Kyrgyzstan and Tajikistan, only limited information was available and compiled). On the basis of the new landslide inventory and the updated earthquake catalogue, the link between landslide and earthquake activity is analysed. First, size-frequency relationships are studied for both types of geohazards, in terms of the Gutenberg-Richter law for the earthquakes and in terms of a probability density function for the landslides. For several regions and major earthquake events, case histories are presented to further outline the close connection between earthquake and landslide hazards in the Tien Shan. 
From this study, we first concluded that a major hazard component is still insufficiently known for both types of geohazards
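
    The Gutenberg-Richter size-frequency analysis mentioned above can be sketched with Aki's (1965) maximum-likelihood b-value estimator. This is a generic illustration, not the authors' code; the synthetic catalog and the choice to ignore magnitude binning are simplifying assumptions.

```python
import math
import random

def aki_b_value(mags, m_min):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965).
    Magnitude binning is ignored here for simplicity."""
    above = [m for m in mags if m >= m_min]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - m_min)

# Synthetic catalog: G-R magnitudes above the completeness level m_min are
# exponentially distributed with rate b * ln(10).
random.seed(0)
b_true, m_min = 1.0, 4.0
mags = [m_min + random.expovariate(b_true * math.log(10)) for _ in range(5000)]
b_hat = aki_b_value(mags, m_min)   # close to b_true for a large catalog
```

    The same power-law fitting logic applies in spirit to landslide size distributions, though those are usually characterized with a probability density function of landslide area rather than magnitudes.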

  7. Learning from physics-based earthquake simulators: a minimal approach

    Science.gov (United States)

    Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele

    2017-04-01

    Physics-based earthquake simulators aim to generate synthetic seismic catalogs of arbitrary length, accounting for fault interaction, elastic rebound, realistic fault networks, and some simple earthquake nucleation process such as rate-and-state friction. Through comparison of synthetic and real catalogs, seismologists can gain insight into the earthquake occurrence process. Moreover, earthquake simulators can be used to infer some aspects of the statistical behavior of earthquakes within the simulated region, by analyzing timescales not accessible through observations. The development of earthquake simulators is commonly led by the approach "the more physics, the better", pushing seismologists toward ever more Earth-like simulators. However, despite its immediate attractiveness, we argue that this kind of approach makes it more and more difficult to understand which physical parameters are really relevant for describing the features of the seismic catalog in which we are interested. For this reason, here we take the opposite, minimal approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that a simple model may be more informative than a complex one for some specific scientific objectives, because it is more understandable. The model has three main components: the first is a realistic tectonic setting, i.e., a fault dataset of California; the other two components are quantitative laws for earthquake generation on each single fault, and the Coulomb Failure Function for modeling fault interaction. The final goal of this work is twofold. On one hand, we aim to identify the minimum set of physical ingredients that can satisfactorily reproduce the features of the real seismic catalog, such as short-term seismic clustering, and to investigate the hypothetical long-term behavior and fault synchronization. On the other hand, we want to investigate the limits of predictability of the model itself.
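
    A minimal simulator of the kind described can be sketched in a few lines: uniform tectonic loading, rupture at a stress threshold, and a fixed Coulomb-like stress transfer to nearest neighbors. All parameters and the nearest-neighbor interaction below are illustrative assumptions, not the authors' model of California faults.

```python
# Toy stress-accumulation simulator in the spirit of a "minimal" approach:
# uniform loading, full stress drop at a threshold, fixed stress transfer.
def simulate(n_faults=10, steps=1000, load=1.0, threshold=100.0, transfer=0.2):
    stress = [i * threshold / n_faults for i in range(n_faults)]  # staggered start
    catalog = []  # (time, fault) pairs
    for t in range(steps):
        for i in range(n_faults):
            stress[i] += load                  # tectonic loading
        ruptured = True
        while ruptured:                        # cascades: ruptures may trigger more
            ruptured = False
            for i in range(n_faults):
                if stress[i] >= threshold:
                    catalog.append((t, i))
                    stress[i] = 0.0            # complete stress drop
                    for j in (i - 1, i + 1):   # nearest-neighbor Coulomb-like transfer
                        if 0 <= j < n_faults:
                            stress[j] += transfer * threshold
                    ruptured = True
    return catalog

events = simulate()
```

    Even this stripped-down model produces clustering through the transfer term, which is the kind of behavior a minimal approach tries to isolate.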

  8. Accounting for orphaned aftershocks in the earthquake background rate

    Science.gov (United States)

    Van Der Elst, Nicholas

    2017-01-01

    Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.
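
    The heavy tail of Omori's law that motivates this analysis can be quantified directly. The sketch below integrates the modified Omori rate K/(t+c)^p in closed form to show what fraction of a sequence falls after a given observation window; the c, p, and duration values are illustrative, not the paper's California estimates.

```python
# Heavy tail of the modified Omori law, rate(t) = K / (t + c)**p:
# the fraction of aftershocks expected *after* an observation window T
# illustrates how "orphaned" aftershocks can masquerade as background.
def omori_tail_fraction(T, duration, c=0.1, p=1.1):
    """Fraction of aftershocks in (T, duration] out of (0, duration],
    from the closed-form integral of K/(t+c)**p (K cancels, p != 1)."""
    def integral(a, b):  # integral of (t + c)**(-p) dt from a to b
        return ((b + c) ** (1 - p) - (a + c) ** (1 - p)) / (1 - p)
    return integral(T, duration) / integral(0.0, duration)

# Fraction of a century-long sequence occurring after the first 10 years
# (times in days): roughly a tenth of the events arrive that late.
late = omori_tail_fraction(T=3650.0, duration=36500.0)
```

    With p only slightly above 1, a non-negligible share of aftershocks arrives decades after the main shock, which is exactly the population an ETAS extension must account for.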

  10. A new way of telling earthquake stories: MOBEE - the MOBile Earthquake Exhibition

    Science.gov (United States)

    Tataru, Dragos; Toma-Danila, Dragos; Nastase, Eduard

    2016-04-01

    In the last decades, the demand for and acknowledged importance of science outreach, in general and geophysics in particular, has grown, as demonstrated by many international and national projects and other activities performed by research institutes. The National Institute for Earth Physics (NIEP) from Romania is the leading national institution for earthquake monitoring and research, having at the same time a declared focus on informing and educating a wide audience about geosciences and especially seismology. This is more than welcome, since Romania is a very active country from a seismological point of view, but not too reactive when it comes to diminishing the possible effects of a major earthquake. Over the last few decades, the country has experienced several major earthquakes which have claimed thousands of lives and millions in property damage (the 1940, 1977, 1986 and 1990 Vrancea earthquakes). In this context, during a partnership started in 2014 together with the National Art University and the Siveco IT company, a group of researchers from NIEP initiated the MOBile Earthquake Exhibition (MOBEE) project. The main goal was to design a portable museum to bring on the road educational activities focused on seismology, seismic hazard and Earth science. The exhibition is mainly aimed at school students of all ages, as it explains the main topics of geophysics through a unique combination of posters, digital animations and apps, large markets and exciting hands-on experiments, and 3D-printed models. This project is singular in Romania and aims to transmit properly reviewed, up-to-date information regarding the definition of earthquakes, the way natural hazards can affect people, buildings and the environment, and the measures to be taken to mitigate the aftermath. Many of the presented concepts can be used by teachers as a complementary way of demonstrating physics facts and concepts and explaining processes that shape the dynamic features of the Earth. It also involves

  11. The 2015 Gorkha Nepal Earthquake: Insights from Earthquake Damage Survey

    Directory of Open Access Journals (Sweden)

    Katsuichiro Goda

    2015-06-01

    The 2015 Gorkha Nepal earthquake caused tremendous damage and loss. To gain valuable lessons from this tragic event, an earthquake damage investigation team was dispatched to Nepal from 1 May 2015 to 7 May 2015. A unique aspect of the earthquake damage investigation is that first-hand earthquake damage data were obtained 6 to 11 days after the mainshock. To gain deeper understanding of the observed earthquake damage in Nepal, the paper reviews the seismotectonic setting and regional seismicity in Nepal and analyzes available aftershock data and ground motion data. The earthquake damage observations indicate that the majority of the damaged buildings were stone/brick masonry structures with no seismic detailing, whereas most of the RC buildings were undamaged. This indicates that adequate structural design is the key to reducing the earthquake risk in Nepal. To share the gathered damage data widely, the collected damage data (geo-tagged photos and observation comments) are organized using Google Earth and the kmz file is made publicly available.

  12. Does knowledge signify protection? The SEISMOPOLIS centre for improvement of behavior in case of an earthquake

    Science.gov (United States)

    Dandoulaki, M.; Kourou, A.; Panoutsopoulou, M.

    2009-04-01

    It is widely accepted that earthquake education is the way to earthquake protection. Nonetheless, experience demonstrates that knowing what to do does not necessarily result in better behaviour in case of a real earthquake. A research project titled "Seismopolis" - "Pilot Integrated System for Public Familiarization with Earthquakes and Information on Earthquake Protection" - aimed at improving the behaviour of people through an appropriate amalgamation of knowledge transfer and virtually experiencing an earthquake situation. Seismopolis combines well-established education means such as books and leaflets with new technologies like earthquake simulation and virtual reality. It comprises a series of 5 main spaces that the visitor passes through one by one. Space 1: Reception and introductory information. Visitors are given fundamental information on earthquakes and earthquake protection, as well as on the appropriate behaviour in case of an earthquake. Space 2: Earthquake simulation room. Visitors experience an earthquake in a room. A typical kitchen is set on a shake table area (3m x 6m planar triaxial shake table) and is shaken in both horizontal and vertical directions by introducing seismograms of real or virtual earthquakes. Space 3: Virtual reality room. Wearing stereoscopic glasses and using navigation tools, visitors may virtually move around in the building or in the city after an earthquake disaster and take action as in a real-life situation. Space 4: Information and resources library. Visitors are offered the opportunity to learn more about earthquake protection. A series of means are available for this, some developed especially for Seismopolis (3 books, 2 CDs, a website and an interactive table game). Space 5: De-briefing area. Visitors may be given a pedagogical and psychological evaluation at the end of their visit and offered support if needed. For the evaluation of the "Seismopolis" Centre, a pilot application of the

  13. Great earthquakes hazard in slow subduction zones

    Science.gov (United States)

    Marcaillou, B.; Gutscher, M.; Westbrook, G. K.

    2008-12-01

    Research on the Sumatra-Andaman earthquake of 2004 has challenged two popular paradigms: that the strongest subduction earthquakes strike in regions of rapid plate convergence, and that rupture occurs primarily along the contact between the basement of the overriding plate and the downgoing plate. Subduction zones presenting similar structural and geodynamic characteristics (slow convergence and thick wedges of accreted sediment) may be capable of generating great megathrust earthquakes (M>8.5) despite an absence of thrust-type earthquakes over the past 40 years. Existing deep seismic sounding data and hypocenters are used to constrain the geometry of several key slow subduction zones (Antilles, Hellenic, Sumatra). This geometry forms the basis for numerical modelling of fore-arc thermal structure, which is applied to calculate the estimated width of the seismogenic portion of the subduction fault plane. The margins with the thickest accretionary wedges are commonly found to have the widest (predicted) seismogenic zone. Furthermore, for these margins there exists a substantial (20-60 km wide) region above the up-dip limit whose contribution to tsunami generation is poorly understood. As the rigidity (mu) of these high-porosity sediments is low, co-seismic slip here can be expected to be slow. Accordingly, the contribution to seismic moment will be low, but the contribution to tsunami generation may be very high. Indeed, recent seismological data from Nankai indicate very low frequency shallow-thrust earthquakes beneath this portion of the accretionary wedge, long considered to be "aseismic". We propose that thick accumulations of sediment on the downgoing plate and the presence of a thick accretionary wedge can increase the maximum size of the potential rupture fault plane in two ways: 1) by thermally insulating the downgoing plate and thereby increasing the total downdip length of the fault which can rupture seismically, and 2) by "smoothing out" the

  14. Inter-Disciplinary Validation of Pre Earthquake Signals. Case Study for Major Earthquakes in Asia (2004-2010) and for 2011 Tohoku Earthquake

    Science.gov (United States)

    Ouzounov, D.; Pulinets, S.; Hattori, K.; Liu, J.-Y.; Yang, T. Y.; Parrot, M.; Kafatos, M.; Taylor, P.

    2012-01-01

    We carried out multi-sensor observations in our investigation of phenomena preceding major earthquakes. Our approach is based on a systematic analysis of several physical and environmental parameters which we found to be associated with earthquake processes: thermal infrared radiation, temperature and concentration of electrons in the ionosphere, radon/ion activities, and air temperature/humidity in the atmosphere. We used satellite and ground observations and interpreted them with the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model, one of the possible paradigms we study and support. We made two independent continuous hind-cast investigations in Taiwan and Japan for a total of 102 earthquakes (M>6) occurring from 2004-2011. We analyzed: (1) ionospheric electromagnetic radiation, plasma and energetic electron measurements from DEMETER; (2) emitted long-wavelength radiation (OLR) from NOAA/AVHRR and NASA/EOS; (3) radon/ion variations (in situ data); and (4) GPS Total Electron Content (TEC) measurements collected from space- and ground-based observations. This joint analysis of ground and satellite data has shown that one to six (or more) days prior to the largest earthquakes there were anomalies in all of the analyzed physical observations. For the March 11, 2011 Tohoku earthquake, our analysis shows again the same relationship between several independent observations characterizing the lithosphere/atmosphere coupling. On March 7th we found a rapid increase of emitted infrared radiation observed from satellite data, and subsequently an anomaly developed near the epicenter. The GPS/TEC data indicated an increase and variation in electron density reaching a maximum value on March 8. Beginning from this day we confirmed an abnormal TEC variation over the epicenter in the lower ionosphere. These findings revealed the existence of atmospheric and ionospheric phenomena occurring prior to the 2011 Tohoku earthquake, which indicated new evidence of a distinct

  15. After an Earthquake: Accessing Near Real-Time Data in the Classroom

    Science.gov (United States)

    Bravo, T. K.; Coleman, B.; Hubenthal, M.; Owens, T. J.; Taber, J.; Welti, R.; Weertman, B. R.

    2010-12-01

    One of the best ways to engage students in scientific content is to give them opportunities to work with real scientific instruments and data and enable them to experience the discovery of scientific information. In addition, newsworthy earthquakes can capture the attention and imagination of students. IRIS and collaborating partners provide a range of options to leverage that attention through access to near-real-time earthquake location and waveform data stored in the IRIS Data Management System and elsewhere, via a number of web-based tools and a new Java-based application. The broadest audience is reached by the Seismic Monitor, a simple web-based tool for observing near-real-time seismicity. The IRIS Earthquake Browser (IEB) allows users to explore recent and cataloged earthquakes and aftershock patterns online with more flexibility, and K-12 classroom activities for understanding plate tectonics and estimating seismic hazards have been designed around its use. Waveforms are easily viewed and explored on the web using the Rapid Earthquake Viewer (REV), developed by the University of South Carolina in collaboration with IRIS E&O. Data from recent well-known earthquakes available via REV are used in exercises to determine Earth's internal structure and to locate earthquakes. Three-component data are presented to the students, allowing a much more realistic analysis than is presented in most textbooks. The Seismographs in Schools program uses real-time data in the classroom to interest and engage students about recent earthquakes. Through the IRIS website, schools can share event data and 24-hour images. Additionally, data are available in real time via the API, which allows anyone to extract the data, re-purpose them, and display them however they need to, as is being done by the British Geological Survey Seismographs in Schools program. Over 350 schools throughout the US and internationally are currently registered with the IRIS Seismographs in Schools
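
    Programmatic access to such event catalogs typically goes through the standard FDSN web services that IRIS exposes. The snippet below only constructs a query URL using parameter names from the FDSNWS-event specification; the chosen defaults are assumptions for illustration, and no request is actually sent.

```python
# Sketch of building a catalog query against an FDSN event web service.
from urllib.parse import urlencode

def fdsn_event_url(base="https://service.iris.edu/fdsnws/event/1/query",
                   **params):
    """Return a query URL for the FDSNWS-event service. The defaults here
    (text output, time-ordered) are illustrative choices."""
    defaults = {"format": "text", "orderby": "time"}
    defaults.update(params)
    # Sort for a deterministic parameter order.
    return base + "?" + urlencode(sorted(defaults.items()))

url = fdsn_event_url(starttime="2010-12-01", minmagnitude="5.0")
```

    A classroom script could fetch this URL with any HTTP client and parse the returned event list; the same parameter vocabulary works against other FDSN-compliant data centers.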

  16. How to assess magnitudes of paleo-earthquakes from multiple observations

    Science.gov (United States)

    Hintersberger, Esther; Decker, Kurt

    2016-04-01

    An important aspect of fault characterisation for seismic hazard assessment is paleo-earthquake magnitudes. Especially in regions of low or moderate seismicity, paleo-magnitudes are normally much larger than those of historical earthquakes and therefore provide essential information about the seismic potential and expected maximum magnitudes of a region. In general, these paleo-earthquake magnitudes are based either on surface rupture length or on surface displacement observed at trenching sites. Several well-established correlations make it possible to link an observed surface displacement to a magnitude. However, the combination of more than one observation is still rare and not well established. We present here a method, based on a probabilistic approach proposed by Biasi and Weldon (2006), to combine several observations to better constrain the possible magnitude range of a paleo-earthquake. Extrapolating the approach of Biasi and Weldon (2006), the single-observation probability density functions (PDFs) are assumed to be independent of each other. Following this line, the common PDF for all observed surface displacements generated by one earthquake is the product of all single-displacement PDFs. In order to test our method, we use surface displacement data for modern earthquakes whose magnitudes have been determined by instrumental records. For randomly selected "observations", we calculated the associated PDFs for each "observation point". We then combined the PDFs into one common PDF for an increasing number of "observations". Plotting the most probable magnitudes against the number of combined "observations", the resultant range of most probable magnitudes is very close to the magnitude derived by instrumental methods. Testing our method with real trenching observations, we used the results of a paleoseismological investigation within the Vienna Pull-Apart Basin (Austria), where three trenches were opened along the normal
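
    The multiplication of independent single-observation PDFs described here can be sketched numerically. In the toy example below, Gaussian magnitude PDFs stand in for the real displacement-to-magnitude PDFs; the means, standard deviation, and magnitude grid are invented for illustration.

```python
import math

def gaussian_pdf(m, mu, sigma):
    return math.exp(-0.5 * ((m - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def combine(observation_pdfs, grid):
    """Pointwise product of independent observation PDFs over a magnitude
    grid, renormalized to sum to one (Biasi & Weldon-style combination)."""
    combined = []
    for m in grid:
        p = 1.0
        for pdf in observation_pdfs:
            p *= pdf(m)
        combined.append(p)
    total = sum(combined)
    return [p / total for p in combined]

grid = [6.0 + 0.01 * i for i in range(201)]            # M 6.0 .. 8.0
# Three hypothetical single-displacement PDFs with different most-likely M:
obs = [lambda m, mu=mu: gaussian_pdf(m, mu, 0.3) for mu in (6.8, 7.0, 7.1)]
posterior = combine(obs, grid)
best = grid[posterior.index(max(posterior))]           # most probable magnitude
```

    Each additional observation narrows the combined PDF, which is why the most probable magnitude converges toward the instrumental value as observations accumulate.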

  17. Improving the extraction of crisis information in the context of flood, fire, and landslide rapid mapping using SAR and optical remote sensing data

    Science.gov (United States)

    Martinis, Sandro; Clandillon, Stephen; Twele, André; Huber, Claire; Plank, Simon; Maxant, Jérôme; Cao, Wenxi; Caspard, Mathilde; May, Stéphane

    2016-04-01

    Optical and radar satellite remote sensing have proven to provide essential crisis information in case of natural disasters, humanitarian relief activities and civil security issues in a growing number of cases, through mechanisms such as the Copernicus Emergency Management Service (EMS) of the European Commission or the International Charter 'Space and Major Disasters'. The aforementioned programs and initiatives make use of satellite-based rapid mapping services aimed at delivering reliable and accurate crisis information after natural hazards. Although these services are increasingly operational, they need to be continuously updated and improved through research and development (R&D) activities. The principal objective of ASAPTERRA (Advancing SAR and Optical Methods for Rapid Mapping), the ESA-funded R&D project described here, is to improve, automate and, hence, speed up geo-information extraction procedures in the context of natural hazards response. This is performed through the development, implementation, testing and validation of novel image processing methods using optical and Synthetic Aperture Radar (SAR) data. The methods are mainly developed based on data from the German radar satellites TerraSAR-X and TanDEM-X, the French satellite missions Pléiades-1A/1B as well as the ESA missions Sentinel-1/2, with the aim of better characterizing the potential and limitations of these sensors and their synergy. The resulting algorithms and techniques are evaluated in real case applications during rapid mapping activities. The project is focussed on three types of natural hazards: floods, landslides and fires. Within this presentation, an overview of the main methodological developments for each topic is given and demonstrated in selected test areas. 
The following developments are presented in the context of flood mapping: a fully automated Sentinel-1 based processing chain for detecting open flood surfaces, a method for the improved detection of flooded vegetation

  18. Determination of Design Basis Earthquake ground motion

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Muneaki [Japan Atomic Power Co., Tokyo (Japan)

    1997-03-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, the earthquake response spectrum and simulated seismic waves. Furthermore, in the appendix of this paper, the seismic safety review for NPPs designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  19. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  20. The HayWired earthquake scenario

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    Foreword: The 1906 Great San Francisco earthquake (magnitude 7.8) and the 1989 Loma Prieta earthquake (magnitude 6.9) each motivated residents of the San Francisco Bay region to build countermeasures to earthquakes into the fabric of the region. Since Loma Prieta, bay-region communities, governments, and utilities have invested tens of billions of dollars in seismic upgrades and retrofits and replacements of older buildings and infrastructure. Innovation and state-of-the-art engineering, informed by science, including novel seismic-hazard assessments, have been applied to the challenge of increasing seismic resilience throughout the bay region. However, as long as people live and work in seismically vulnerable buildings or rely on seismically vulnerable transportation and utilities, more work remains to be done. With that in mind, the U.S. Geological Survey (USGS) and its partners developed the HayWired scenario as a tool to enable further actions that can change the outcome when the next major earthquake strikes. By illuminating the likely impacts to the present-day built environment, well-constructed scenarios can and have spurred officials and citizens to take steps that change the outcomes the scenario describes, whether used to guide more realistic response and recovery exercises or to launch mitigation measures that will reduce future risk. The HayWired scenario is the latest in a series of like-minded efforts to bring a special focus onto the impacts that could occur when the Hayward Fault again ruptures through the east side of the San Francisco Bay region as it last did in 1868. Cities in the east bay along the Richmond, Oakland, and Fremont corridor would be hit hardest by earthquake ground shaking, surface fault rupture, aftershocks, and fault afterslip, but the impacts would reach throughout the bay region and far beyond. The HayWired scenario name reflects our increased reliance on the Internet and telecommunications and also alludes to the

  1. InSAR Analysis of the 2011 Hawthorne (Nevada) Earthquake Swarm: Implications of Earthquake Migration and Stress Transfer

    Science.gov (United States)

    Zha, X.; Dai, Z.; Lu, Z.

    2015-12-01

    The 2011 Hawthorne earthquake swarm occurred in the central Walker Lane zone, near the border between California and Nevada. The swarm included an Mw 4.4 event on April 13, an Mw 4.6 on April 17, and an Mw 3.9 on April 27. Due to the lack of near-field seismic instruments, it is difficult to obtain accurate source information from the seismic data for these moderate-magnitude events. ENVISAT InSAR observations captured the deformation mainly caused by the three events of the 2011 Hawthorne earthquake swarm. The surface traces of three seismogenic sources could be identified according to the local topography and interferogram phase discontinuities. The epicenters could be determined using the interferograms and the relocated earthquake distribution. An apparent earthquake migration is revealed by the InSAR observations and the earthquake distribution. Analysis and modeling of the InSAR data show that the three moderate-magnitude earthquakes were produced by slip on three previously unrecognized faults in the central Walker Lane. Two seismogenic sources are northwest-striking, right-lateral strike-slip faults with some thrust-slip components, and the other source is a northeast-striking, thrust-slip fault with some strike-slip components. The former two faults are roughly parallel to each other, and almost perpendicular to the latter one. This spatial correlation between the three seismogenic faults, together with their nature, suggests that the central Walker Lane has been undergoing southeast-northwest horizontal compressive deformation, consistent with the regional crustal movement revealed by GPS measurements. The Coulomb failure stresses on the fault planes were calculated using the preferred slip model and the Coulomb 3.4 software package. For the Mw 4.6 earthquake, the Coulomb stress change caused by the Mw 4.4 event increased by ~0.1 bar. For the Mw 3.9 event, the Coulomb stress change caused by the Mw 4.6 earthquake increased by ~1.0 bar. This indicates that the preceding
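
    The Coulomb failure stress change used in this kind of analysis has a simple closed form, ΔCFS = Δτ + μ′Δσn. The sketch below applies it with illustrative numbers; the effective friction coefficient and stress values are assumptions, not those from the Hawthorne study.

```python
# Static Coulomb failure stress change on a receiver fault:
#     dCFS = d_tau + mu_eff * d_sigma_n
# where d_tau is the shear stress change in the slip direction and
# d_sigma_n the normal stress change (unclamping positive).
def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    return d_tau + mu_eff * d_sigma_n

# A hypothetical 0.08 bar shear increase plus 0.05 bar of unclamping:
dcfs = coulomb_stress_change(0.08, 0.05)   # in bar
```

    A positive ΔCFS of even ~0.1 bar, as reported for the Mw 4.6 event here, is commonly interpreted as bringing the receiver fault closer to failure.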

  2. Distributed Fiber Optic Sensing of Earthquake Wavefields

    Science.gov (United States)

    Lindsey, N.; Dreger, D. S.; Wagner, A. M.; James, S. R.; Ajo Franklin, J. B.

    2016-12-01

    Seismic hazard strongly depends on local site response, which is rarely captured by even the densest seismometer arrays. Using laser-based Rayleigh scattering in fiber optic telecommunication cables, seismic wavefield information can be recorded as strain rate at meter-scale resolution over tens of kilometers, a technique known as distributed acoustic sensing (DAS). Recent active and passive DAS experiments confirm trade-offs in directionality and sensitivity compared with standard seismic sensors; however, the possibility of using inexpensive fiber optics and a single instrument to characterize and monitor entire earthquake-prone regions with field-scale accuracy could represent a complementary new direction for array seismology, seismic hazard analysis, and earthquake early warning. We present earthquake observations recorded using two different trenched fiber optic cables: (1) a 200-m L-shaped test array at the Richmond Field Station in Richmond, CA; and (2) a 4000-m sparse grid array at the Permafrost Experiment Station in Fairbanks, Alaska. We compare wavefield observations made using DAS with broadband recordings collocated with the two arrays, as well as spatial variations in amplitude and signal duration across the DAS arrays. We then perform 2-D beamforming across the array to locate the events. We discuss the quality of the wavefield reconstruction, the uncertainty in the beamformed hypocenter location, and implications for site response characterization.
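    The 2-D beamforming step mentioned in this record can be sketched as a plain delay-and-sum scan over horizontal slowness vectors. The array geometry, sampling interval, and slowness grid below are illustrative assumptions, not the experiment's actual parameters:

```python
import numpy as np

def beam_powers(traces, positions, dt, slowness_grid):
    """Delay-and-sum beam power for each candidate horizontal slowness.
    traces: (n_sensors, n_samples) array; positions: (n_sensors, 2) in
    metres; slowness vectors in s/m. Returns one stacked-beam power per
    candidate; the maximum points back toward the source azimuth."""
    n_sens, _ = traces.shape
    powers = []
    for s in slowness_grid:
        delays = positions @ np.asarray(s)         # plane-wave delays (s)
        shifts = np.rint(delays / dt).astype(int)  # delays in samples
        # Undo each sensor's delay, then stack coherently.
        beam = sum(np.roll(tr, -k) for tr, k in zip(traces, shifts)) / n_sens
        powers.append(float(np.sum(beam ** 2)))
    return np.array(powers)
```

Scanning a dense grid of slowness vectors and taking the power maximum gives back-azimuth and apparent velocity; combining several subarrays then constrains the hypocenter.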

  3. Spatial earthquake hazard assessment of Evansville, Indiana

    Science.gov (United States)

    Rockaway, T.D.; Frost, J.D.; Eggert, D.L.; Luna, R.

    1997-01-01

    The earthquake hazard has been evaluated for a 150-square-kilometer area around Evansville, Indiana. GIS-QUAKE, a system that combines liquefaction and ground motion analysis routines with site-specific geological, geotechnical, and seismological information, was used for the analysis. The hazard potential was determined by using 586 SPT borings, 27 CPT soundings, 39 shear-wave velocity profiles, and synthesized acceleration records for body-wave magnitude 6.5 and 7.3 mid-continental earthquakes occurring at distances of 50 km and 250 km, respectively. The results of the GIS-QUAKE hazard analyses for Evansville identify areas with a high hazard potential that had not previously been identified in earthquake zonation studies. The Pigeon Creek area specifically is identified as having significant potential for liquefaction-induced damage. Damage as a result of ground motion amplification is determined to be a moderate concern throughout the area. Differences between the findings of this zonation study and previous work are attributed to the size and range of the database, the hazard evaluation methodologies, and the geostatistical interpolation techniques used to estimate the hazard potential. Further, assumptions regarding groundwater elevations made in previous studies are also considered to have had a significant effect on the results.

  4. Global Earthquake Casualties due to Secondary Effects: A Quantitative Analysis for Improving PAGER Losses

    Science.gov (United States)

    Wald, David J.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire, for events during the past 40 years. These processes are of great importance to the US Geological Survey’s (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant losses due to secondary effects are, under what conditions, and in which regions; that is, which of these effects should receive higher-priority research efforts in order to enhance PAGER’s overall assessment of earthquake losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra–Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address the potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability.

  5. Ecological assessment of the marine ecosystems of Barbuda, West Indies: Using rapid scientific assessment to inform ocean zoning and fisheries management.

    Science.gov (United States)

    Ruttenberg, Benjamin; Caselle, Jennifer E; Estep, Andrew J; Johnson, Ayana Elizabeth; Marhaver, Kristen L; Richter, Lee J; Sandin, Stuart A; Vermeij, Mark J A; Smith, Jennifer E; Grenda, David; Cannon, Abigail

    2018-01-01

    To inform a community-based ocean zoning initiative, we conducted an intensive ecological assessment of the marine ecosystems of Barbuda, West Indies. We conducted 116 fish and 108 benthic surveys around the island, and measured the abundance and size structure of lobsters and conch at 52 and 35 sites, respectively. We found that both coral cover and fish biomass were similar to or lower than levels observed across the greater Caribbean; live coral cover and the abundance of fishery target species, such as large snappers and groupers, were generally low. However, Barbuda lacks many of the high-relief forereef areas where similar work has been conducted in other Caribbean locations. The distribution of lobsters was patchy, making it difficult to quantify density at the island scale. However, the maximum size of lobsters was generally larger than in other locations in the Caribbean and similar to the maximum size reported 40 years ago. While the lobster population has clearly been heavily exploited, our data suggest that it is not as overexploited as in much of the rest of the Caribbean. Surveys of Barbuda's Codrington Lagoon revealed many juvenile lobsters, but none of legal size (95 mm carapace length), suggesting that the lagoon functions primarily as nursery habitat. Conch abundance and size on Barbuda were similar to those of other Caribbean islands. Our data suggest that many of the regional threats observed on other Caribbean islands are present on Barbuda, but some resources, particularly lobster and conch, may be less overexploited than on other Caribbean islands. Local management has the potential to provide sustainability for at least some of the island's marine resources. We show that a rapid, thorough ecological assessment can reveal clear conservation opportunities and facilitate rapid conservation action by providing the foundation for a community-driven policymaking process at the island scale.

  6. The Alaska earthquake, March 27, 1964: lessons and conclusions

    Science.gov (United States)

    Eckel, Edwin B.

    1970-01-01

    subsidence was superimposed on regional tectonic subsidence to heighten the flooding damage. Ground and surface waters were measurably affected by the earthquake, not only in Alaska but throughout the world. Expectably, local geologic conditions largely controlled the extent of structural damage, whether caused directly by seismic vibrations or by secondary effects such as those just described. Intensity was greatest in areas underlain by thick saturated unconsolidated deposits, least on indurated bedrock or permanently frozen ground, and intermediate on coarse well-drained gravel, on morainal deposits, or on moderately indurated sedimentary rocks. Local and even regional geology also controlled the distribution and extent of the earthquake's effects on hydrologic systems. In the conterminous United States, for example, seiches in wells and bodies of surface water were controlled by geologic structures of regional dimension. Devastating as the earthquake was, it had many long-term beneficial effects. Many of these were socioeconomic or engineering in nature; others were of scientific value. Much new and corroborative basic geologic and hydrologic information was accumulated in the course of the earthquake studies, and many new or improved investigative techniques were developed. Chief among these, perhaps, were the recognition that lakes can be used as giant tiltmeters, the refinement of methods for measuring land-level changes by observing displacements of barnacles and other sessile organisms, and the relating of hydrology to seismology by worldwide study of hydroseisms in surface-water bodies and in wells. The geologic and hydrologic lessons learned from studies of the Alaska earthquake also lead directly to better definition of the research needed to further our understanding of earthquakes and of how to avoid or lessen the effects of future ones. 
Research is needed on the origins and mechanisms of earthquakes, on crustal structure, and on the generation of tsunamis and

  7. Complex networks of earthquakes and aftershocks

    Directory of Open Access Journals (Sweden)

    M. Baiesi

    2005-01-01

    Full Text Available We invoke a metric to quantify the correlation between any two earthquakes. This provides a simple and straightforward alternative to using space-time windows to detect aftershock sequences and obviates the need to distinguish main shocks from aftershocks. Directed networks of earthquakes are constructed by placing a link, directed from the past to the future, between pairs of events that are strongly correlated. Each link has a weight giving the relative strength of correlation, such that the sum over the incoming links to any node equals unity for aftershocks, or zero if the event had no correlated predecessors. A correlation threshold is set to drastically reduce the size of the data set without losing significant information. Events can be aftershocks of many previous events and can also generate many aftershocks. The probability distributions for the numbers of incoming and outgoing links are both scale-free, and the networks are highly clustered. The Omori law holds for aftershock rates up to a decorrelation time that scales with the magnitude, m, of the initiating shock as t_cutoff ~ 10^(βm) with β ≈ 3/4. Another scaling law relates distances between earthquakes and their aftershocks to the magnitude of the initiating shock. Our results are inconsistent with the hypothesis of finite aftershock zones. We also find evidence that seismicity is dominantly triggered by small earthquakes. Our approach, using concepts from the modern theory of complex networks together with a metric to estimate correlations, opens up new avenues of research, as well as new tools to understand seismicity.
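    The kind of space-time-magnitude correlation metric invoked in this record can be sketched in the general Baiesi-Paczuski form, where the "correlation" between an earlier event i and a later event j is the expected number of chance events of magnitude at least m_i within their space-time separation. The constants b, fractal dimension d_f, rate constant c, and link threshold below are illustrative assumptions, not the paper's fitted values:

```python
import math

def expected_count(dt_days, r_km, mag, b=1.0, d_f=1.6, c=1e-3):
    """Expected number of background events of magnitude >= mag within
    time dt and distance r of a given earthquake, assuming Gutenberg-
    Richter statistics and a fractal epicenter distribution. A small
    value means the later event is unlikely to be accidental, i.e. it is
    strongly correlated with (an aftershock of) the earlier one."""
    return c * dt_days * (r_km ** d_f) * 10.0 ** (-b * mag)

def link_events(catalog, threshold=1.0):
    """Directed links (parent -> child) between strongly correlated pairs.
    catalog: time-ordered list of (t_days, x_km, y_km, magnitude)."""
    links = []
    for j, (tj, xj, yj, _) in enumerate(catalog):
        for i, (ti, xi, yi, mi) in enumerate(catalog[:j]):
            r = math.hypot(xj - xi, yj - yi)
            if expected_count(tj - ti, max(r, 0.1), mi) < threshold:
                links.append((i, j))
    return links
```

Because the metric already weighs time, distance, and magnitude together, no space-time window or main shock/aftershock labeling is needed: an event close in space and time to a large predecessor gets a link, while a distant small pair does not.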

  8. Ionospheric phenomena before strong earthquakes

    Directory of Open Access Journals (Sweden)

    A. S. Silina

    2001-01-01

    Full Text Available A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5 located less than 500 km from an ionospheric vertical sounding station is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h ≤ 33 km) earthquakes were analysed separately. Data of nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs, and Es-spread at the middle-latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00 LT–06:00 LT) and of fbEs averaged over the nighttime hours (18:00 LT–06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency, which characterizes the degree of small-scale turbulence, it was shown that this value increases 1–4 days before "crust" earthquakes and does not change before "deep" earthquakes. Studying Es-spread, which manifests itself as a diffuse Es track on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1–3 days before the earthquakes; for "deep" earthquakes the effect is more intensive. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere occur for "deep" and "crust" events.

  9. Fracking, wastewater disposal, and earthquakes

    Science.gov (United States)

    McGarr, Arthur

    2016-03-01

    In the modern oil and gas industry, fracking of low-permeability reservoirs has resulted in a considerable increase in the production of oil and natural gas, but these fluid-injection activities also can induce earthquakes. Earthquakes induced by fracking are an inevitable consequence of the injection of fluid at high pressure, where the intent is to enhance permeability by creating a system of cracks and fissures that allow hydrocarbons to flow to the borehole. The micro-earthquakes induced during these highly controlled procedures are generally much too small to be felt at the surface; indeed, the creation or reactivation of a large fault would be contrary to the goal of enhancing permeability evenly throughout the formation. Accordingly, the few case histories in which fracking has resulted in felt earthquakes have been due to unintended fault reactivation. Of greater consequence for inducing earthquakes, modern techniques for producing hydrocarbons, including fracking, have resulted in considerable quantities of coproduced wastewater, primarily formation brines. This wastewater is commonly disposed of by injection into deep aquifers having high permeability and porosity. As reported in many case histories, pore pressure increases due to wastewater injection were channeled from the target aquifers into fault zones that were, in effect, lubricated, resulting in earthquake slip. These fault zones are often located in the brittle crystalline rocks of the basement. Magnitudes of earthquakes induced by wastewater disposal often exceed 4, the threshold for structural damage. Even though only a small fraction of disposal wells induce earthquakes large enough to be of concern to the public, there are so many of these wells that this source of seismicity contributes significantly to the seismic hazard in the United States, especially east of the Rocky Mountains, where standards of building construction are generally not designed to resist shaking from large earthquakes.

  10. Fault Structural Control on Earthquake Strong Ground Motions: The 2008 Wenchuan Earthquake as an Example

    Science.gov (United States)

    Zhang, Yan; Zhang, Dongli; Li, Xiaojun; Huang, Bei; Zheng, Wenjun; Wang, Yuejun

    2017-12-01

    Continental thrust faulting earthquakes pose severe threats to megacities across the world. Recent events show the possible control of fault structures on strong ground motions. The seismogenic structure of the 2008 Wenchuan earthquake is associated with high-angle listric reverse fault zones. Its peak ground accelerations (PGAs) show a prominent feature of fault zone amplification: the values within the 30- to 40-km-wide fault zone block are significantly larger than those on both the hanging wall and the footwall. The PGA values attenuate asymmetrically: they decay much more rapidly in the footwall than in the hanging wall. The hanging wall effects can be seen on both the vertical and horizontal components of the PGAs, with the former significantly more prominent than the latter. All these characteristics can be adequately interpreted by upward extrusion of the high-angle listric reverse fault zone block. Through comparison with a low-angle planar thrust fault associated with the 1999 Chi-Chi earthquake, we conclude that different fault structures might have controlled different patterns of strong ground motion, which should be taken into account in seismic design and construction.

  11. Effect of tectonic setting on the fit and performance of a long-range earthquake forecasting model

    Directory of Open Access Journals (Sweden)

    David Alan Rhoades

    2012-02-01

    Full Text Available The Every Earthquake a Precursor According to Scale (EEPAS long-range earthquake forecasting model has been shown to be informative in several seismically active regions, including New Zealand, California and Japan. In previous applications of the model, the tectonic setting of earthquakes has been ignored. Here we distinguish crustal, plate interface, and slab earthquakes and apply the model to earthquakes with magnitude M≥4 in the Japan region from 1926 onwards. The target magnitude range is M≥ 6; the fitting period is 1966-1995; and the testing period is 1996-2005. In forecasting major slab earthquakes, it is optimal to use only slab and interface events as precursors. In forecasting major interface events, it is optimal to use only interface events as precursors. In forecasting major crustal events, it is optimal to use only crustal events as precursors. For the smoothed-seismicity component of the EEPAS model, it is optimal to use slab and interface events for earthquakes in the slab, interface events only for earthquakes on the interface, and crustal and interface events for crustal earthquakes. The optimal model parameters indicate that the precursor areas for slab earthquakes are relatively small compared to those for earthquakes in other tectonic categories, and that the precursor times and precursory earthquake magnitudes for crustal earthquakes are relatively large. The optimal models fit the learning data sets better than the raw EEPAS model, with an average information gain per earthquake of about 0.4. The average information gain is similar in the testing period, although it is higher for crustal earthquakes and lower for slab and interface earthquakes than in the learning period. These results show that earthquake interactions are stronger between earthquakes of similar tectonic types and that distinguishing tectonic types improves forecasts by enhancing the depth resolution where tectonic categories of earthquakes are

  12. Geotechnical hazards from large earthquakes and heavy rainfalls

    CERN Document Server

    Kazama, Motoki; Lee, Wei

    2017-01-01

    This book is a collection of papers presented at the International Workshop on Geotechnical Natural Hazards held July 12–15, 2014, in Kitakyushu, Japan. The workshop was the sixth in the series of Japan–Taiwan Joint Workshops on Geotechnical Hazards from Large Earthquakes and Heavy Rainfalls, held under the auspices of the Asian Technical Committee No. 3 on Geotechnology for Natural Hazards of the International Society for Soil Mechanics and Geotechnical Engineering. It was co-organized by the Japanese Geotechnical Society and the Taiwanese Geotechnical Society. The contents of this book focus on geotechnical and natural hazard-related issues in Asia such as earthquakes, tsunami, rainfall-induced debris flows, slope failures, and landslides. The book contains the latest information and mitigation technology on earthquake- and rainfall-induced geotechnical natural hazards. By dissemination of the latest state-of-the-art research in the area, the information contained in this book will help researchers, des...

  13. Short-term earthquake probabilities during the L'Aquila earthquake sequence in central Italy, 2009

    Science.gov (United States)

    Falcone, G.; Murru, M.; Zhuang, J.; Console, R.

    2014-12-01

    We compare the forecasting performance of several statistical models, which are used to describe the occurrence process of earthquakes, in forecasting short-term earthquake probabilities during the occurrence of the L'Aquila earthquake sequence in central Italy, 2009. These models include the Proximity to Past Earthquakes (PPE) model and different versions of the Epidemic Type Aftershock Sequence (ETAS) model. We used the information gains corresponding to the Poisson and binomial scores to evaluate the performance of these models. It is shown that all ETAS models work better than the PPE model. However, when comparing the different types of ETAS models, the one with the same fixed exponent coefficient α = 2.3 for both the productivity function and the scaling factor in the spatial response function performs better in forecasting the active aftershock sequence than the other models with different exponent coefficients when the Poisson score is adopted. The latter models perform better only when a lower magnitude threshold of 2.0 and the binomial score are used. The reason is likely that the catalog does not contain an event of magnitude similar to the L'Aquila main shock (Mw 6.3) in the training period (April 16, 2005 to March 15, 2009). In this case the a-value is underestimated, and thus the forecasted seismicity is also underestimated when the productivity function is extrapolated to high magnitudes. These results suggest that the training catalog used for estimating the model parameters should include earthquakes of magnitudes similar to the main shock when forecasting seismicity during an aftershock sequence.
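    The ETAS conditional intensity compared in this record has the general form λ(t) = μ + Σ_i κ(m_i)·(t − t_i + c)^(−p), with productivity κ(m) = A·exp(α(m − m0)). A minimal sketch follows; apart from α = 2.3, which the abstract quotes, the parameter values are illustrative assumptions, not the fitted values of the study:

```python
import math

def etas_rate(t, history, mu=0.1, A=0.01, alpha=2.3, c=0.01, p=1.1, m0=2.0):
    """ETAS conditional intensity at time t (events per day):
    background rate mu plus an Omori-type contribution from every past
    event (t_i, m_i), scaled by the exponential productivity law."""
    rate = mu
    for t_i, m_i in history:
        if t_i < t:
            rate += A * math.exp(alpha * (m_i - m0)) * (t - t_i + c) ** (-p)
    return rate
```

The exponential productivity term is why an underestimated a-value (or a training catalog lacking main-shock-sized events) propagates into underestimated forecast rates when extrapolated to high magnitudes: the largest events dominate the sum.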

  14. Seismology: dynamic triggering of earthquakes.

    Science.gov (United States)

    Gomberg, Joan; Johnson, Paul

    2005-10-06

    After an earthquake, numerous smaller shocks are triggered over distances comparable to the dimensions of the mainshock fault rupture, although they are rare at larger distances. Here we analyse the scaling of dynamic deformations (the stresses and strains associated with seismic waves) with distance from, and magnitude of, their triggering earthquake, and show that they can cause further earthquakes at any distance if their amplitude exceeds several microstrain, regardless of their frequency content. These triggering requirements are remarkably similar to those measured in the laboratory for inducing dynamic elastic nonlinear behaviour, which suggests that the underlying physics is similar.

  15. Earthquakes, July-August 1992

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0 ≤ M < 8.0) during this reporting period. A major earthquake occurred in Kyrgyzstan on August 19, and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake of June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

  16. Response of Alum Rock springs to the October 30, 2007 Alum Rock earthquake and implications for the origin of increased discharge after earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Rowland, Joel C [Los Alamos National Laboratory; Manga, Michael [UC BERKELEY

    2009-01-01

    The origin of increased stream flow and spring discharge following earthquakes has been the subject of controversy, in large part because there are many models to explain observations and few measurements suitable for distinguishing between hypotheses. On October 30, 2007, a magnitude 5.5 earthquake occurred near the Alum Rock springs, California, USA. Within a day we documented a several-fold increase in discharge. Over the following year, we have monitored a gradual return towards pre-earthquake properties, but at the largest springs there appears to be a permanent increase in the steady discharge. The Alum Rock springs discharge waters that represent a mixture between modern ('shallow') meteoric water and old ('deep') connate waters expelled by regional transpression. After the earthquake, the increased discharge at the largest springs was accompanied by a small decrease in the fraction of connate water in the spring discharge. Combined with the rapid response, this implies that the increased discharge has a shallow origin. Increased discharge at these springs occurs both for earthquakes that cause static volumetric expansion and for those that cause contraction, supporting models in which dynamic strains are responsible for the subsurface changes that cause flow to increase. We show that models in which the permeability of the fracture system feeding the springs increases after the earthquake are in general consistent with the changes in discharge. The response of these springs to another earthquake will provide critical constraints on the changes that occur in the subsurface.

  17. USGS Earthquake Program GPS Use Case : Earthquake Early Warning

    Science.gov (United States)

    2015-03-12

    USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...

  18. EARTHQUAKE-INDUCED DEFORMATION STRUCTURES AND RELATED TO EARTHQUAKE MAGNITUDES

    Directory of Open Access Journals (Sweden)

    Savaş TOPAL

    2003-02-01

    Full Text Available Earthquake-induced deformation structures, which are called seismites, may be helpful in classifying the paleoseismic history of a location and in estimating the magnitudes of potential future earthquakes. In this paper, seismites were investigated according to the types formed in deep and shallow lake sediments. Seismites are observed in the form of sand dikes, introduced and fractured gravels, and pillow structures in shallow lake sediments, and of pseudonodules, mushroom-like silts protruding into laminites, mixed layers, disturbed varved lamination, and loop bedding in deep lake sediments. Drawing on previous studies, these earthquake-induced deformation structures were ordered according to their formation and the associated earthquake magnitudes. In this ordering, the structure recording the lowest earthquake magnitude is loop bedding, and the highest is introduced and fractured gravels in lacustrine deposits.

  19. USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation

    Science.gov (United States)

    Hudnut, K. W.; Murray, J. R.; Minson, S. E.

    2015-12-01

    Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning, along with imagery differencing, are important methods for augmenting seismic sensors. During response to recent earthquakes (1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor - Cucapah, 2012 Brawley Swarm and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved to be essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data. Seismic, geologic, and imagery data alone, without GNSS, would miss important details of the earthquake source. That is, GNSS results provide important additional insight into the earthquake source properties, which in turn help understand the relationship between shaking and damage patterns. GNSS also adds to understanding of the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing work flow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site, operated for many hours, then data retrieved, processed and modeled by a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing, has led to sub-second overall system latency. Within the past few years, the final challenges of

  20. [Clinical characteristics of pediatric victims in the Lushan and Wenchuan earthquakes and experience of medical rescue].

    Science.gov (United States)

    Jiang, Xin; Xiang, Bo; Liu, Li-Jun; Liu, Min; Tang, Xue-Yang; Huang, Lu-Gang; Li, Yuan; Peng, Ming-Xing; Xin, Wen-Qiong

    2013-06-01

    To gain a more comprehensive understanding of the clinical characteristics of pediatric earthquake victims and to summarize the experience of medical rescue, clinical information was collected from the pediatric victims admitted to West China Hospital, Sichuan University, following the Lushan earthquake in 2013 and the Wenchuan earthquake in 2008, and the clinical data from the two earthquakes were compared. Thirty-four children under 14 years of age who were injured in the Lushan earthquake were admitted to the West China Hospital before April 30, 2013. Compared with the data from the Wenchuan earthquake, the mean age of the pediatric victims in the Lushan earthquake was significantly lower, and the time from earthquake to hospitalization was significantly shorter. In the Lushan earthquake, 67.6% of the injured children had limb fractures of various types; traumatic brain injury was found in 29.4% of hospitalized children, versus 9.5% in the Wenchuan earthquake, a significantly higher proportion in the Lushan earthquake than in the Wenchuan earthquake. But these cases recovered well, which was possibly due to timely on-site rescue and transfer and multi-sector, multi-institution, and multidisciplinary cooperation.