WorldWideScience

Sample records for rapid visual earthquake

  1. Seismogeodesy for rapid earthquake and tsunami characterization

    Science.gov (United States)

    Bock, Y.

    2016-12-01

Rapid estimation of earthquake magnitude and fault mechanism is critical for earthquake and tsunami warning systems. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. These methods are well developed for ocean basin-wide warnings but are not timely enough to protect vulnerable populations and infrastructure from the effects of local tsunamis, where waves may arrive within 15-30 minutes of earthquake onset time. Direct measurements of displacements by GPS networks at subduction zones allow rapid magnitude and slip estimates in the near-source region that are not affected by the instrumental limitations and magnitude saturation experienced by local seismic networks. However, GPS displacements by themselves are too noisy for strict earthquake early warning (P-wave detection). Optimally combining high-rate GPS and seismic data (in particular, accelerometers that do not clip), an approach referred to as seismogeodesy, yields a broadband instrument that does not clip in the near field, is impervious to magnitude saturation, and provides accurate static and dynamic displacements and velocities in real time. Here we describe a NASA-funded effort to integrate GPS and seismogeodetic observations into NOAA's Tsunami Warning Centers in Alaska and Hawaii. It consists of a series of plug-in modules that allow for a hierarchy of rapid seismogeodetic products, including automatic P-wave picking, hypocenter estimation, S-wave prediction, magnitude scaling relationships based on P-wave amplitude (Pd) and peak ground displacement (PGD), and finite-source CMT solutions and fault slip models as input for tsunami warnings and models. For the NOAA/NASA project, the modules are being integrated into an existing USGS Earthworm environment, currently limited to traditional seismic data. We are focused on a network of
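The PGD magnitude-scaling step mentioned above can be sketched in a few lines. Published PGD studies use regressions of roughly this form; the coefficient values below are illustrative placeholders, not the operational values in the NOAA/NASA modules:

```python
import math

# Illustrative PGD scaling law: log10(PGD_cm) = A + B*Mw + C*Mw*log10(R_km).
# The coefficients are placeholder values of the size seen in published
# PGD regressions; they are assumptions for demonstration only.
A, B, C = -4.434, 1.047, -0.138

def mw_from_pgd(pgd_cm, r_km):
    """Invert the scaling law for moment magnitude Mw, given peak ground
    displacement (cm) and hypocentral distance (km)."""
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(r_km))

# Example: 10 cm of peak ground displacement observed 100 km from the source
mw = mw_from_pgd(10.0, 100.0)
```

Because PGD does not saturate for large events, a relation of this shape remains usable at magnitudes where P-wave amplitude (Pd) estimates level off.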

  2. The 2010 Chile Earthquake: Rapid Assessments of Tsunami

    OpenAIRE

    Michelini, A.; Lauciani, V.; Selvaggi, G.; Lomax, A.

    2010-01-01

    After an earthquake underwater, rapid real-time assessment of earthquake parameters is important for emergency response related to infrastructure damage and, perhaps more exigently, for issuing warnings of the possibility of an impending tsunami. Since 2005, the Istituto Nazionale di Geofisica e Vulcanologia (INGV) has worked on the rapid quantification of earthquake magnitude and tsunami potential, especially for the Mediterranean area. This work includes quantification of earthquake size fr...

  3. Interactive visualization to advance earthquake simulation

    Science.gov (United States)

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  4. Rapid Estimates of Rupture Extent for Large Earthquakes Using Aftershocks

    Science.gov (United States)

    Polet, J.; Thio, H. K.; Kremer, M.

    2009-12-01

The spatial distribution of aftershocks is closely linked to the rupture extent of the mainshock that preceded them, and a rapid analysis of aftershock patterns therefore has potential for use in near real-time estimates of earthquake impact. The correlation between aftershocks and slip distribution has frequently been used to estimate the fault dimensions of large historic earthquakes for which no, or insufficient, waveform data are available. With the advent of earthquake inversions that use seismic waveforms and geodetic data to constrain the slip distribution, the study of aftershocks has recently focused largely on enhancing our understanding of the underlying mechanisms in a broader earthquake mechanics/dynamics framework. However, in a near real-time earthquake monitoring environment, in which aftershocks of large earthquakes are routinely detected and located, these data may also be effective in determining a fast estimate of the mainshock rupture area, which would aid in the rapid assessment of the earthquake's impact. We have analyzed a considerable number of large recent earthquakes and their aftershock sequences and have developed an effective algorithm that determines the rupture extent of a mainshock from its aftershock distribution in a fully automatic manner. The algorithm automatically removes outliers by spatial binning, and subsequently determines the best-fitting “strike” of the rupture and its length by projecting the aftershock epicenters onto a set of lines that cross the mainshock epicenter with incremental azimuths. For strike-slip or large dip-slip events, for which the surface projection of the rupture is rectilinear, the calculated strike correlates well with the strike of the fault, and the corresponding length, determined from the distribution of aftershocks projected onto the line, agrees well with the rupture length. In the case of a smaller dip-slip rupture with an aspect ratio closer to 1, the procedure gives a measure
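One plausible reading of the projection step can be sketched as follows, assuming epicenters are given in local Cartesian coordinates (km) and taking the strike as the azimuth that maximizes the along-line spread; the spatial-binning outlier removal described in the abstract is omitted for brevity:

```python
import math

def rupture_strike_and_length(epicenters, mainshock, n_azimuths=36):
    """Estimate rupture strike (degrees from north) and length (km) from
    aftershock epicenters, by projecting them onto lines through the
    mainshock at incremental azimuths. Illustrative sketch only; the
    authors' algorithm also removes outliers by spatial binning.
    """
    mx, my = mainshock
    best_strike, best_length = None, 0.0
    for i in range(n_azimuths):
        az = math.radians(i * 180.0 / n_azimuths)
        ux, uy = math.sin(az), math.cos(az)  # unit vector along this azimuth
        proj = [(x - mx) * ux + (y - my) * uy for x, y in epicenters]
        length = max(proj) - min(proj)
        if length > best_length:
            best_strike, best_length = math.degrees(az), length
    return best_strike, best_length
```

For a synthetic aftershock cloud elongated along a 45-degree line, the routine recovers that azimuth and the end-to-end extent of the projected cloud.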

  5. Rapid estimation of the economic consequences of global earthquakes

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid-2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensities. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system it is now possible to produce rapid estimates of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S. 
Geological Survey's PAGER system is

  6. Simulating Earthquakes for Science and Society: Earthquake Visualizations Ideal for use in Science Communication and Education

    Science.gov (United States)

    de Groot, R.

    2008-12-01

The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), visualization should, in theory, make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  7. Rapid earthquake magnitude determination for Vrancea early warning system

    International Nuclear Information System (INIS)

    Marmureanu, Alexandru

    2009-01-01

Due to the huge amount of recorded data, an automatic procedure was developed and used to test different methods for rapidly evaluating earthquake magnitude from the first seconds of the P wave. Several tests were performed on all the algorithms involved in detection and rapid magnitude estimation in order to avoid false alarms. A dedicated detection algorithm was developed, based on the classical STA/LTA algorithm and tuned for early-warning purposes. A method is proposed to rapidly estimate magnitude within 4 seconds of detection of the P wave at the epicenter. The method was tested on all recorded data, and the magnitude determination error is acceptable, taking into account that it is computed from only 3 stations in a very short time interval. (author)
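The classical STA/LTA detector that the authors tune can be sketched as below; the window lengths and trigger threshold here are illustrative assumptions, not the tuned Vrancea values:

```python
def sta_lta(samples, n_sta, n_lta):
    """Compute the classic STA/LTA ratio over a seismic trace.

    samples: list of amplitude values; n_sta/n_lta: short- and long-term
    window lengths in samples. Returns the ratio at each sample index
    where both windows are full. A trigger is declared when the ratio
    exceeds a tuned threshold; early-warning tuning mainly shortens the
    windows and raises the threshold to suppress false alarms.
    """
    env = [abs(s) for s in samples]  # simple amplitude envelope
    ratios = []
    for i in range(n_lta, len(env)):
        sta = sum(env[i - n_sta:i]) / n_sta  # short-term average
        lta = sum(env[i - n_lta:i]) / n_lta  # long-term average
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# A flat noise record followed by a P-wave-like amplitude jump:
ratios = sta_lta([1.0] * 100 + [10.0] * 10, n_sta=5, n_lta=50)
```

On the synthetic trace above, the ratio sits near 1.0 during the noise and jumps well above typical trigger thresholds when the arrival enters the short window.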

  8. The key role of eyewitnesses in rapid earthquake impact assessment

    Science.gov (United States)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Frédéric; Etivant, Caroline

    2014-05-01

Uncertainties in rapid earthquake impact models are intrinsically large even when potential indirect losses (fires, landslides, tsunamis…) are excluded. The reason is that the models are based on several factors that are themselves difficult to constrain, such as the geographical distribution of shaking intensity, the building type inventory and vulnerability functions. The difficulties can be illustrated by two boundary cases. For moderate (around M6) earthquakes, the size of the potential damage zone and the epicentral location uncertainty are of comparable dimensions, about 10-15 km. When such an earthquake strikes close to an urban area, as in Athens in 1999 (M5.9), location uncertainties alone can lead to dramatically different impact scenarios. Furthermore, for moderate magnitudes the overall impact is often controlled by individual accidents, as in Molise, Italy (M5.7) in 2002, in Bingol, Turkey (M6.4) in 2003, and in Christchurch, New Zealand (M6.3), where respectively 23 of 30, 84 of 176, and 115 of 185 of the casualties perished in a single building failure. In contrast, for major earthquakes (M>7) the point-source approximation is no longer valid, and impact assessment requires knowing exactly where the seismic rupture took place and whether it was unilateral or bilateral, information that is not readily available directly after the earthquake's occurrence. In-situ observations of actual impact provided by eyewitnesses can dramatically reduce impact model uncertainties. We will present the overall strategy developed at the EMSC, which comprises crowdsourcing and flashsourcing techniques, the development of citizen-operated seismic networks, and the use of social networks to engage with eyewitnesses within minutes of an earthquake's occurrence. For instance, testimonies are collected through online questionnaires available in 32 languages and automatically processed into maps of effects. Geo-located pictures are collected and then

  9. Earthquakes, fluid pressures and rapid subduction zone metamorphism

    Science.gov (United States)

    Viete, D. R.

    2013-12-01

High-pressure/low-temperature (HP/LT) metamorphism is commonly incomplete, meaning that large tracts of rock can remain metastable at blueschist- and eclogite-facies conditions for timescales up to millions of years [1]. When HP/LT metamorphism does take place, it can occur over extremely short durations, highlighting the role of fluids in providing heat for metamorphism [2] or in catalyzing metamorphic reactions [1]. Earthquakes in subduction zone settings can occur to depths of 100s of km. Metamorphic dehydration and the associated development of elevated pore pressures in HP/LT metamorphic rocks have been identified as a cause of earthquake activity at such great depths [3-4]. The process of fracturing/faulting significantly increases rock permeability, causing channelized fluid flow and dissipation of pore pressures [3-4]. Thus, deep subduction zone earthquakes are thought to reflect an evolution in fluid pressure, involving: (1) an initial increase in pore pressure by heating-related dehydration of subduction zone rocks, and (2) rapid relief of pore pressures by faulting and channelized flow. Models for earthquakes at depth in subduction zones have focussed on the in situ effects of dehydration and then sudden escape of fluids from the rock mass following fracturing [3-4]. On the other hand, existing models for rapid and incomplete metamorphism in subduction zones have focussed only on the effects of heating and/or hydration with the arrival of external fluids [1-2]. Significant changes in pressure over very short timescales should result in rapid mineral growth and/or disequilibrium texture development in response to overstepping of mineral reaction boundaries. The repeated process of dehydration, pore pressure development, earthquake, and pore pressure relief could conceivably produce a record of episodic HP/LT metamorphism driven by rapid pressure pulses. A new hypothesis is presented for the origins of HP/LT metamorphism: that HP/LT metamorphism is driven by effective pressure

  10. Rapid acceleration leads to rapid weakening in earthquake-like laboratory experiments

    Science.gov (United States)

    Chang, Jefferson C.; Lockner, David A.; Reches, Z.

    2012-01-01

    After nucleation, a large earthquake propagates as an expanding rupture front along a fault. This front activates countless fault patches that slip by consuming energy stored in Earth’s crust. We simulated the slip of a fault patch by rapidly loading an experimental fault with energy stored in a spinning flywheel. The spontaneous evolution of strength, acceleration, and velocity indicates that our experiments are proxies of fault-patch behavior during earthquakes of moment magnitude (Mw) = 4 to 8. We show that seismically determined earthquake parameters (e.g., displacement, velocity, magnitude, or fracture energy) can be used to estimate the intensity of the energy release during an earthquake. Our experiments further indicate that high acceleration imposed by the earthquake’s rupture front quickens dynamic weakening by intense wear of the fault zone.

  11. Rapid serial visual presentation design for cognition

    CERN Document Server

    Spence, Robert

    2013-01-01

A powerful new image presentation technique has evolved over the last twenty years, and its value has been demonstrated through its support of many and varied common tasks. Conceptually, Rapid Serial Visual Presentation (RSVP) is simple, exemplified in the physical world by the rapid riffling of the pages of a book in order to locate a known image. Advances in computation and graphics processing allow RSVP to be applied flexibly and effectively to a huge variety of common tasks such as window shopping, video fast-forward and rewind, TV channel selection and product browsing. At its heart is a

  12. RICHTER: A Smartphone Application for Rapid Collection of Geo-Tagged Pictures of Earthquake Damage

    Science.gov (United States)

    Skinnemoen, H.; Bossu, R.; Furuheim, K.; Bjorgo, E.

    2010-12-01

RICHTER (Rapid geo-Images for Collaborative Help Targeting Earthquake Response) is a smartphone version of a professional application developed to provide high-quality geo-tagged image communication over challenging network links, such as satellites and poor mobile links. Developed for Android mobile phones, it allows eyewitnesses to share their pictures of earthquake damage easily and without cost with the Euro-Mediterranean Seismological Centre (EMSC). The goal is to engage citizens in the collection of the most up-to-date visual information on local damage for improved rapid impact assessment. RICHTER integrates the innovative and award-winning ASIGN protocol initially developed for satellite communication between cameras / computers / satcom terminals and servers at HQ. ASIGN is a robust and optimal image and video communication management solution for bandwidth-limited communication networks which was developed for use particularly in emergency and disaster situations. Contrary to a simple Multimedia Messaging Service (MMS), RICHTER allows access to high-definition images with embedded location information. Location is automatically assigned from either the internal GPS, derived from the mobile network (triangulation) or the current Wi-Fi domain, in that order, as this corresponds to the expected positioning accuracy. Pictures are typically compressed to 20-30 KB of data for fast transfer and to avoid network overload. Full-size images can be requested by the EMSC either fully automatically or on a case-by-case basis, depending on the user preferences. ASIGN was initially developed in coordination with INMARSAT and the European Space Agency. It was used by the Rapid Mapping Unit of the United Nations, notably for the damage assessment of the January 12, 2010 Haiti earthquake, where more than 700 photos were collected. RICHTER will be freely distributed on the EMSC website to eyewitnesses in the event of significantly damaging earthquakes. 
The EMSC is the second

  13. Visible Earthquakes: a web-based tool for visualizing and modeling InSAR earthquake data

    Science.gov (United States)

    Funning, G. J.; Cockett, R.

    2012-12-01

    InSAR (Interferometric Synthetic Aperture Radar) is a technique for measuring the deformation of the ground using satellite radar data. One of the principal applications of this method is in the study of earthquakes; in the past 20 years over 70 earthquakes have been studied in this way, and forthcoming satellite missions promise to enable the routine and timely study of events in the future. Despite the utility of the technique and its widespread adoption by the research community, InSAR does not feature in the teaching curricula of most university geoscience departments. This is, we believe, due to a lack of accessibility to software and data. Existing tools for the visualization and modeling of interferograms are often research-oriented, command line-based and/or prohibitively expensive. Here we present a new web-based interactive tool for comparing real InSAR data with simple elastic models. The overall design of this tool was focused on ease of access and use. This tool should allow interested nonspecialists to gain a feel for the use of such data and greatly facilitate integration of InSAR into upper division geoscience courses, giving students practice in comparing actual data to modeled results. The tool, provisionally named 'Visible Earthquakes', uses web-based technologies to instantly render the displacement field that would be observable using InSAR for a given fault location, geometry, orientation, and slip. The user can adjust these 'source parameters' using a simple, clickable interface, and see how these affect the resulting model interferogram. By visually matching the model interferogram to a real earthquake interferogram (processed separately and included in the web tool) a user can produce their own estimates of the earthquake's source parameters. Once satisfied with the fit of their models, users can submit their results and see how they compare with the distribution of all other contributed earthquake models, as well as the mean and median

  14. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    Science.gov (United States)

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.

  15. Similarity relations in visual search predict rapid visual categorization

    Science.gov (United States)

    Mohan, Krithika; Arun, S. P.

    2012-01-01

How do we perform rapid visual categorization? It is widely thought that categorization involves evaluating the similarity of an object to other category items, but the underlying features and similarity relations remain unknown. Here, we hypothesized that categorization performance is based on perceived similarity relations between items within and outside the category. To this end, we measured the categorization performance of human subjects on three diverse visual categories (animals, vehicles, and tools) and across three hierarchical levels (superordinate, basic, and subordinate levels among animals). For the same subjects, we measured their perceived pairwise similarities between objects using a visual search task. Regardless of category and hierarchical level, we found that the time taken to categorize an object could be predicted using its similarity to members within and outside its category. We were able to account for several classic categorization phenomena, such as (a) the longer times required to reject category membership; (b) the longer times to categorize atypical objects; and (c) differences in performance across tasks and across hierarchical levels. These categorization times were also accounted for by a model that extracts coarse structure from an image. The striking agreement observed between categorization and visual search suggests that these two disparate tasks depend on a shared coarse object representation. PMID:23092947
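The abstract's central claim, that categorization time can be predicted from an object's similarity to items within versus outside its category, can be caricatured with a toy model. The functional form and scale constant below are assumptions for illustration, not the authors' fitted model:

```python
def categorization_time(sim_within, sim_outside, k=1.0):
    """Toy model: time to categorize an object grows when it is less
    similar to its own category and more similar to other categories.

    sim_within / sim_outside: mean perceived similarity of the object to
    items inside and outside its category (e.g. derived from visual
    search reaction times). k is a free scale factor. Illustrative only.
    """
    discriminability = sim_within - sim_outside
    # Guard against zero/negative discriminability to keep times finite.
    return k / max(discriminability, 1e-9)

# A typical category member (high within-, low outside-similarity) versus
# an atypical one (the margins shrink, so predicted time grows):
typical = categorization_time(0.8, 0.2)
atypical = categorization_time(0.5, 0.4)
```

This reproduces the qualitative phenomenon in the abstract: atypical objects, whose similarity margins are smaller, take longer to categorize.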

  16. A rapid stability assessment of China's IGS sites after the Ms7.0 Lushan earthquake

    Directory of Open Access Journals (Sweden)

    Meng Jie

    2013-05-01

A rapid and accurate assessment of the stability of surveying and mapping reference points is important for post-disaster rescue, disaster relief, and reconstruction activities. Using Precise Point Positioning (PPP) technology, a rapid assessment of the stability of the IGS sites in China was performed after the Ms7.0 Lushan earthquake, using rapid precise ephemeris and rapid precise satellite clock products. The results show that the earthquake had a very small impact and did not cause significant permanent deformation at the IGS sites. Most of the sites were unaffected and remained stable after the earthquake.

  17. Money matters: Rapid post-earthquake financial decision-making

    Science.gov (United States)

    Wald, David J.; Franco, Guillermo

    2016-01-01

Post-earthquake financial decision-making lies outside the experience of most people. In the immediate aftermath of a damaging earthquake, billions of dollars of relief, recovery, and insurance funds are in the balance through new financial instruments that allow those with resources to hedge against disasters and those at risk to limit their earthquake losses and receive funds for response and recovery.

  18. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake

    Science.gov (United States)

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.

    2014-01-01

    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
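The server-side step of correlating incoming station triggers into a detection can be sketched as follows. The window length and minimum station count are illustrative assumptions, not the QCN server's actual settings:

```python
def correlate_triggers(triggers, window_s=3.0, min_stations=4):
    """Group station trigger reports into candidate earthquake detections.

    triggers: list of (timestamp_s, station_id) tuples, as sent by
    volunteer hosts. A detection is declared when at least min_stations
    distinct stations trigger within a window_s-second window. Sketch
    only; the real system also uses locations and ground-motion values.
    """
    triggers = sorted(triggers)
    detections = []
    i = 0
    while i < len(triggers):
        t0 = triggers[i][0]
        group = [trg for trg in triggers if t0 <= trg[0] < t0 + window_s]
        stations = {sid for _, sid in group}
        if len(stations) >= min_stations:
            detections.append((t0, sorted(stations)))
            i += len(group)  # consume this group and continue scanning
        else:
            i += 1
    return detections

# Four stations triggering within 1.5 s, plus one isolated later trigger:
events = correlate_triggers(
    [(0.0, "a"), (0.5, "b"), (1.0, "c"), (1.5, "d"), (100.0, "a")])
```

Only the clustered triggers produce a detection; the isolated trigger at t = 100 s is treated as a false alarm from a single host.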

  19. Rapid acceleration leads to rapid weakening in earthquake-like laboratory experiments

    Science.gov (United States)

    Chang, J. C.; Lockner, D. A.; Reches, Z.

    2012-12-01

We simulated the slip of a fault-patch during a large earthquake by rapidly loading an experimental, ring-shaped fault with energy stored in a spinning flywheel. The flywheel abruptly delivers a finite amount of energy by spinning the fault-patch, which spontaneously dissipates the energy without operator intervention. We conducted 42 experiments on Sierra White granite (SWG) samples and 24 experiments on Kasota dolomite (KD) samples. Each experiment starts by spinning a 225 kg disk-shaped flywheel to a prescribed angular velocity; we refer to such an experiment as an "earthquake-like slip-event" (ELSE). The strength evolution in ELSE experiments is similar to the strength evolution proposed for earthquake models and observed in stick-slip experiments. Further, we found that ELSE experiments resemble earthquakes in at least three ways: (1) slip driven by the release of a finite amount of stored energy; (2) the pattern of fault strength evolution; and (3) seismically observed values, such as average slip, peak velocity, and rise time. By assuming that the measured slip, D, in ELSE experiments is equivalent to the average slip during an earthquake, we found that ELSE experiments (D = 0.003-4.6 m) correspond to earthquakes in the moment-magnitude range Mw = 4-8. In ELSE experiments, the critical slip distance, dc, has mean values of 2.7 cm for SWG and 1.2 cm for KD, much shorter than the 1-10 m found in classical steady-state experiments in rotary shear systems. We attribute these dc values to ELSE loading, in which the fault-patch is abruptly loaded by impact with a spinning flywheel. Under this loading, the friction-velocity relations are strikingly different from those under steady-state loading on the same rock samples with the same shear system (Reches and Lockner, Nature, 2010). We further note that the slip acceleration in ELSE evolves systematically with fault strength and wear-rate, and that the dynamic weakening is restricted to the period of intense
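The finite energy budget delivered to the fault-patch is just the rotational kinetic energy of the flywheel. The 225 kg mass comes from the abstract; the disk radius and spin rate below are assumed values for illustration only:

```python
import math

def flywheel_energy(mass_kg, radius_m, rpm):
    """Kinetic energy stored in a solid-disk flywheel: E = 1/2 * I * w^2,
    with moment of inertia I = 1/2 * m * r^2 for a uniform disk.

    The 225 kg mass is from the abstract; radius and rpm are assumptions.
    """
    inertia = 0.5 * mass_kg * radius_m ** 2        # kg m^2
    omega = rpm * 2.0 * math.pi / 60.0             # rad/s
    return 0.5 * inertia * omega ** 2              # joules

# Assumed 0.5 m radius disk spun to 600 rpm (hypothetical values):
energy_j = flywheel_energy(225.0, 0.5, 600.0)
```

Varying the prescribed angular velocity sets the finite energy released in each ELSE run, which is how the experiments span the equivalent of Mw 4 to 8 events.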

  20. Rapid learning in visual cortical networks.

    Science.gov (United States)

    Wang, Ye; Dragoi, Valentin

    2015-08-26

Although changes in brain activity during learning have been extensively examined at the single-neuron level, the coding strategies employed by cell populations remain mysterious. We examined cell populations in macaque area V4 during a rapid form of perceptual learning that emerges within tens of minutes. Multiple single units and LFP responses were recorded as monkeys improved their performance in an image discrimination task. We show that the increase in behavioral performance during learning is predicted by a tight coordination of spike timing with local population activity. Greater spike-LFP theta synchronization is correlated with higher learning performance, whereas high-frequency synchronization is unrelated to changes in performance; these changes were absent once learning had stabilized and stimuli became familiar, and in the absence of learning. These findings reveal a novel mechanism of plasticity in visual cortex by which elevated low-frequency synchronization between individual neurons and local population activity accompanies the improvement in performance during learning.

  1. Experimental visualization of rapid maneuvering fish

    Science.gov (United States)

    Daigh, S.; Techet, A. H.

    2003-11-01

    A freshwater tropical fish, Danio aequipinnatus, is studied undergoing rapid turning and fast-start maneuvers. This agile species is ideal for this study as it is capable of quick turning and darting motions with accelerations up to 5 g. The fish studied are 4-5 cm in length. The speed and kinematics of the maneuvers are determined by video analysis. Planar and stereo Particle Image Velocimetry (PIV) are used to map the vortical patterns in the wake of the maneuvering fish. PIV visualizations reveal that during C-shaped maneuvers a ring-shaped jet vortex is formed. Fast-start behavior is also presented. PIV data are used to approximate the thrust vectoring force produced during each maneuver.

  2. Rapid Extraction of Landslide and Spatial Distribution Analysis after Jiuzhaigou Ms7.0 Earthquake Based on Uav Images

    Science.gov (United States)

    Jiao, Q. S.; Luo, Y.; Shen, W. H.; Li, Q.; Wang, X.

    2018-04-01

    The Jiuzhaigou earthquake caused mountain collapses and triggered many landslides in the Jiuzhaigou scenic area and along surrounding roads, causing road blockages and serious ecological damage. Given the urgency of the rescue, the authors carried an unmanned aerial vehicle (UAV) into the disaster area as early as August 9 to obtain aerial images near the epicenter. After summarizing the characteristics of earthquake landslides in aerial images, we applied an object-oriented analysis method: landslide image objects were obtained by multi-scale segmentation, and the feature rule set at each level was built automatically by the SEaTH (Separability and Thresholds) algorithm to realize rapid landslide extraction. Compared with visual interpretation, the object-oriented automatic landslide extraction method achieved an accuracy of 94.3 %. The spatial distribution of the earthquake landslides had a significant positive correlation with slope and relief, a negative correlation with roughness, and no obvious correlation with aspect; the probable reason is that the distance between the study area and the seismogenic fault is too great. This work provided technical support for earthquake field emergency response, earthquake landslide prediction and disaster loss assessment.
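
    The SEaTH step above derives, for each candidate object feature, a separability score and a decision threshold from landslide and non-landslide training statistics. A minimal sketch of those two computations under Gaussian class models (the Jeffries-Matusita form is standard; any sample means and standard deviations plugged in below would be synthetic, not the paper's):

```python
import numpy as np

def seath_separability(m1, s1, m2, s2):
    """Jeffries-Matusita distance (0..2) between two Gaussian class
    models of one feature; near 2 means the classes separate well."""
    b = (0.125 * (m1 - m2) ** 2 * 2.0 / (s1 ** 2 + s2 ** 2)
         + 0.5 * np.log((s1 ** 2 + s2 ** 2) / (2.0 * s1 * s2)))
    return 2.0 * (1.0 - np.exp(-b))

def seath_threshold(m1, s1, m2, s2):
    """Decision threshold where the two class densities intersect."""
    a = 1.0 / (2.0 * s1 ** 2) - 1.0 / (2.0 * s2 ** 2)
    b = m2 / s2 ** 2 - m1 / s1 ** 2
    c = m1 ** 2 / (2.0 * s1 ** 2) - m2 ** 2 / (2.0 * s2 ** 2) + np.log(s1 / s2)
    if abs(a) < 1e-12:                  # equal variances: midpoint of means
        return 0.5 * (m1 + m2)
    for r in np.roots([a, b, c]):       # keep the root between the means
        if min(m1, m2) <= r.real <= max(m1, m2):
            return float(r.real)
    return 0.5 * (m1 + m2)
```

    A feature with separability close to 2 discriminates landslide from non-landslide objects well; SEaTH keeps the highest-ranked features and uses their thresholds as the classification rule set.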

  4. Rapid estimation of the moment magnitude of large earthquake from static strain changes

    Science.gov (United States)

    Itaba, S.

    2014-12-01

    The 2011 off the Pacific coast of Tohoku earthquake, of moment magnitude (Mw) 9.0, occurred on March 11, 2011. Based on seismic waves, the preliminary magnitude announced by the Japan Meteorological Agency just after the earthquake was 7.9, considerably smaller than the actual value. On the other hand, using nine borehole strainmeters of the Geological Survey of Japan, AIST, we estimated a fault model with Mw 8.7 for the earthquake on the boundary between the Pacific and North American plates. This model could be estimated about seven minutes after the origin time, and five minutes after wave arrival. In order to grasp the magnitude of a great earthquake earlier, several methods have been proposed to reduce earthquake disasters, including tsunami (e.g., Ohta et al., 2012). Our simple method using strain steps is one strong candidate for rapid estimation of the magnitude of great earthquakes.
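
    The moment magnitude quoted for the strain-derived fault model follows from the seismic moment in the usual way. A minimal sketch of that conversion (the rupture dimensions, slip, and rigidity below are illustrative round numbers for a megathrust-sized event, not the values of the paper's model):

```python
import math

def moment_magnitude(length_m, width_m, slip_m, rigidity_pa=3.0e10):
    """Mw from seismic moment M0 = mu * A * D (Hanks-Kanamori scale);
    the 9.1 constant assumes M0 in N*m."""
    m0 = rigidity_pa * length_m * width_m * slip_m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# illustrative 400 km x 150 km rupture with 15 m average slip
mw = moment_magnitude(400e3, 150e3, 15.0)  # about Mw 8.9
```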

  5. The Technical Efficiency of Earthquake Medical Rapid Response Teams Following Disasters: The Case of the 2010 Yushu Earthquake in China.

    Science.gov (United States)

    Liu, Xu; Tang, Bihan; Yang, Hongyang; Liu, Yuan; Xue, Chen; Zhang, Lulu

    2015-12-04

    Performance assessments of earthquake medical rapid response teams (EMRRTs), particularly the first responders deployed to the hardest hit areas following major earthquakes, should consider efficient and effective use of resources. This study assesses the daily technical efficiency of EMRRTs in the emergency period immediately following the 2010 Yushu earthquake in China. Data on EMRRTs were obtained from official daily reports of the general headquarters for Yushu earthquake relief, the emergency office of the National Ministry of Health, and the Health Department of Qinghai Province, for a sample of data on 15 EMRRTs over 62 days. Data envelopment analysis was used to examine the technical efficiency in a constant returns to scale model, a variable returns to scale model, and the scale efficiency of EMRRTs. Tobit regression was applied to analyze the effects of corresponding influencing factors. The average technical efficiency scores under constant returns to scale, variable returns to scale, and the scale efficiency scores of the 62 units of analysis were 77.95%, 89.00%, and 87.47%, respectively. The staff-to-bed ratio was significantly related to global technical efficiency. The date of rescue was significantly related to pure technical efficiency. The type of institution to which an EMRRT belonged and the staff-to-bed ratio were significantly related to scale efficiency. This study provides evidence that supports improvements to EMRRT efficiency and serves as a reference for earthquake emergency medical rapid assistance leaders and teams.
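
    The constant-returns-to-scale scores reported above come from the CCR data envelopment model, which solves one small linear program per decision-making unit. A minimal input-oriented sketch with SciPy (the two-team data set is synthetic, not the EMRRT records):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Input-oriented CCR DEA scores under constant returns to scale.
    X: (n_units, n_inputs), Y: (n_units, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1 .. lambda_n]
        c = np.r_[1.0, np.zeros(n)]                  # minimize theta
        A_in = np.hstack([-X[o][:, None], X.T])      # sum lam*x <= theta*x_o
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])  # sum lam*y >= y_o
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (1 + n), method="highs")
        scores.append(res.x[0])
    return np.array(scores)

# two synthetic teams: same output, but the second uses twice the input
eff = ccr_efficiency(np.array([[2.0], [4.0]]), np.array([[2.0], [2.0]]))
```

    In the synthetic data the first unit produces the same output from half the input, so it is efficient (score 1.0) while the second scores 0.5.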

  7. Real-time earthquake monitoring: Early warning and rapid response

    Science.gov (United States)

    1991-01-01

    A panel was established to investigate the subject of real-time earthquake monitoring (RTEM) and suggest recommendations on the feasibility of using a real-time earthquake warning system to mitigate earthquake damage in regions of the United States. The findings of the investigation and the related recommendations are described in this report. A brief review of existing real-time seismic systems is presented with particular emphasis given to the current California seismic networks. Specific applications of a real-time monitoring system are discussed along with issues related to system deployment and technical feasibility. In addition, several non-technical considerations are addressed including cost-benefit analysis, public perceptions, safety, and liability.

  8. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses

    Science.gov (United States)

    Marano, K.D.; Wald, D.J.; Allen, T.I.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire, for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant losses due to secondary effects are (and under what conditions, and in which regions), and thus which of these effects should receive higher-priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts. We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address the potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  9. LastQuake: a comprehensive strategy for rapid engagement of earthquake eyewitnesses, massive crowdsourcing and risk reduction

    Science.gov (United States)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Steed, R.; Frobert, L.

    2015-12-01

    LastQuake is a smartphone app, browser add-on and the most sophisticated Twitter robot (QuakeBot) for earthquakes currently in operation. It fulfills eyewitnesses' needs by offering information on felt earthquakes and their effects within tens of seconds of their occurrence. Associated with an active presence on Facebook, Pinterest and on websites, this proves a very efficient engagement strategy. For example, the app was installed thousands of times after the Gorkha earthquake in Nepal. Language barriers have been erased by using visual communication; for example, felt reports are collected through a set of cartoons representing different shaking levels. Within 3 weeks of the magnitude 7.8 Gorkha earthquake, 7,000 felt reports with thousands of comments were collected for the mainshock and tens of its aftershocks, as well as 100 informative geolocated pictures. The QuakeBot was essential in allowing us to be identified and to interact with those affected. LastQuake is also a risk reduction tool since it provides rapid information: when no information is available after a felt earthquake, the public blocks emergency lines trying to find out the cause of the shaking, crowds form, potentially leading to unpredictable crowd movement, and rumors spread. In its next release LastQuake will also provide people with guidance immediately after a shaking through a number of pop-up cartoons illustrating "do/don't do" items (go to open places, do not phone emergency services except if people are injured…). LastQuake's app design is simple and intuitive and has a global audience. It benefited from a crowdfunding campaign (and the support of the Fondation MAIF) and more improvements have been planned after an online feedback campaign organized in early June with the Gorkha earthquake eyewitnesses. LastQuake is also a seismic risk reduction tool thanks to its very rapid

  10. The Key Role of Eyewitnesses in Rapid Impact Assessment of Global Earthquake

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.; Etivant, C.; Frobert, L.; Godey, S.

    2014-12-01

    Uncertainties in rapid impact assessments of global earthquakes are intrinsically large because they rely on three main elements (ground motion prediction models, building stock inventory and related vulnerability) whose values and/or spatial variations are poorly constrained. Furthermore, variations of hypocentral location and magnitude within their respective uncertainty domains can lead to significantly different shaking levels for centers of population and change the scope of the disaster. We present the strategy and methods implemented at the Euro-Med Seismological Centre (EMSC) to rapidly collect in-situ observations on earthquake effects from eyewitnesses in order to reduce the uncertainties of rapid earthquake impact assessment. It comprises crowdsourced information (online questionnaires, pictures) as well as information derived from real-time analysis of web traffic (flash-sourcing technique), and more recently the deployment of QCN (Quake Catcher Network) low-cost sensors. We underline the importance of merging the results of different methods to improve the performance and reliability of the collected data. We try to better understand and respond to public demands and expectations after earthquakes through improved information services and diversification of information tools (social networks, smartphone apps, browser add-ons…), which, in turn, drive more eyewitnesses to our services and improve data collection. We will notably present our LastQuake Twitter feed (QuakeBot) and smartphone applications (iOS and Android), which only report earthquakes that matter for the public and authorities, i.e. felt and damaging earthquakes identified thanks to citizen-generated information.

  11. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  12. Rapid Large Earthquake and Run-up Characterization in Quasi Real Time

    Science.gov (United States)

    Bravo, F. J.; Riquelme, S.; Koch, P.; Cararo, S.

    2017-12-01

    Several tests in quasi real time have been conducted by the rapid response group at CSN (National Seismological Center) to characterize earthquakes in real time. These methods are known for their robustness and reliability in creating finite fault models (FFMs). The W-phase FFM inversion, the wavelet-domain FFM, and the body-wave FFM have been implemented in real time at CSN; all these algorithms run automatically, triggered by the W-phase point-source inversion. Fault dimensions (length and width) are predefined by adopting scaling laws for earthquakes in subduction zones. We tested this scheme on the last four major earthquakes that occurred in Chile: the 2010 Mw 8.8 Maule earthquake, the 2014 Mw 8.2 Iquique earthquake, the 2015 Mw 8.3 Illapel earthquake and the Mw 7.6 Melinka earthquake. We obtain many solutions as time elapses; for each one we calculate the run-up using an analytical formula. Our results are in agreement with FFMs already accepted by the scientific community, as well as with run-up observations in the field.
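
    The predefined fault dimensions mentioned above come from empirical magnitude-scaling laws. The paper adopts subduction-zone relations; as a generic stand-in, the sketch below uses the log-linear Wells and Coppersmith (1994) all-slip-type form with coefficients quoted from memory (treat them as placeholders and verify against the published tables before use):

```python
import math

def rupture_dimensions_km(mw, la=-2.44, lb=0.59, wa=-1.01, wb=0.32):
    """Subsurface rupture length and width (km) from Mw via scaling
    laws of the form log10(dim) = a + b*Mw; default coefficients
    follow Wells & Coppersmith (1994), quoted from memory."""
    length = 10.0 ** (la + lb * mw)
    width = 10.0 ** (wa + wb * mw)
    return length, width

length, width = rupture_dimensions_km(8.8)  # a Maule-sized event
```

    For Mw 8.8 these defaults give roughly 560 km by 64 km, broadly consistent with the along-strike extent of the Maule rupture.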

  13. Cluster Oriented Spatio Temporal Multidimensional Data Visualization of Earthquakes in Indonesia

    Directory of Open Access Journals (Sweden)

    Mohammad Nur Shodiq

    2016-03-01

    Spatio-temporal data clustering is a challenging task. The results of clustering are used to investigate seismic parameters, which describe the characteristics of earthquake behavior. One effective technique for studying multidimensional spatio-temporal data is visualization, but visualizing multidimensional data is itself a complicated problem, because the analysis involves both the observed data clusters and the seismic parameters. In this paper, we propose a visualization system, called IES (Indonesia Earthquake System), for cluster analysis, spatio-temporal analysis, and visualization of multidimensional seismic-parameter data. We perform cluster analysis using automatic clustering, which determines the optimal number of clusters and applies hierarchical K-means clustering. We explore the visual clusters and multidimensional data in a low-dimensional-space visualization. We experimented with observed seismic data recorded around the Indonesian archipelago during 2004 to 2014. Keywords: Clustering, visualization, multidimensional data, seismic parameters.
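
    The "optimal number of clusters" step can be sketched with plain K-means plus a silhouette criterion — a NumPy-only stand-in for the paper's automatic hierarchical K-means, run here on synthetic epicenter coordinates rather than the Indonesian catalog:

```python
import numpy as np

def kmeans(X, k):
    """Lloyd's algorithm with deterministic farthest-point seeding."""
    centers = [X[0]]
    for _ in range(k - 1):              # spread the initial centers apart
        d = np.min(np.linalg.norm(X[:, None] - np.array(centers)[None],
                                  axis=2), axis=1)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(100):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels

def silhouette(X, labels):
    """Mean silhouette coefficient; higher means better-defined clusters."""
    d = np.linalg.norm(X[:, None] - X[None], axis=2)
    n, scores = len(X), []
    for i in range(n):
        own = (labels == labels[i]) & (np.arange(n) != i)
        a = d[i, own].mean() if own.any() else 0.0
        b = min(d[i, labels == c].mean()
                for c in np.unique(labels) if c != labels[i])
        scores.append((b - a) / max(a, b))
    return float(np.mean(scores))

# synthetic epicenters around three (lon, lat) cluster centers
rng = np.random.default_rng(1)
seeds = np.array([[95.0, -5.0], [110.0, 0.0], [125.0, -8.0]])
X = np.vstack([c + 0.3 * rng.standard_normal((20, 2)) for c in seeds])
scores = {k: silhouette(X, kmeans(X, k)) for k in range(2, 6)}
best_k = max(scores, key=scores.get)
```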

  14. How citizen seismology is transforming rapid public earthquake information and interactions between seismologists and society

    Science.gov (United States)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Fréderic; Caroline, Etivant

    2015-04-01

    Historical earthquakes are only known to us through written recollections, and so seismologists have long experience of interpreting the reports of eyewitnesses, which probably explains why seismology has been a pioneer in crowdsourcing and citizen science. Today, the Internet is transforming this situation; it can be considered a digital nervous system of digital veins and intertwined sensors that capture the pulse of our planet in near real time. How can both seismology and the public benefit from this new monitoring system? This paper will present the strategy implemented at the Euro-Mediterranean Seismological Centre (EMSC) to leverage this new nervous system to detect and diagnose the impact of earthquakes within minutes rather than hours, and how it has transformed information systems and interactions with the public. We will show how social network monitoring and flashcrowds (massive traffic increases on the EMSC website) are used to automatically detect felt earthquakes before seismic detections, how damaged areas can be mapped through the concomitant loss of Internet sessions (visitors being disconnected), and the benefit of collecting felt reports and geolocated pictures to further constrain rapid impact assessment of global earthquakes. We will also describe how public expectations within tens of seconds of ground shaking are at the basis of improved, diversified information tools which integrate this user-generated content. Special attention will be given to LastQuake, the most complex and sophisticated Twitter QuakeBot, smartphone application and browser add-on, which deals with the only earthquakes that matter for the public: felt and damaging earthquakes. In conclusion we will demonstrate that eyewitnesses are today real-time earthquake sensors and active actors in rapid earthquake information.

  15. Flash-sourcing or the rapid detection and characterisation of earthquake effects through clickstream data analysis

    Science.gov (United States)

    Bossu, R.; Mazet-Roux, G.; Roussel, F.; Frobert, L.

    2011-12-01

    Rapid characterisation of earthquake effects is essential for a timely and appropriate response in favour of victims and/or eyewitnesses. In the case of damaging earthquakes, any field observations that can fill the information gap characterising their immediate aftermath can contribute to more efficient rescue operations. This paper presents the latest developments of a method called "flash-sourcing" that addresses these issues. It relies on eyewitnesses, the first informed and the first concerned by an earthquake's occurrence. More precisely, their use of the EMSC earthquake information website (www.emsc-csem.org) is analysed in real time to map the area where the earthquake was felt and to identify, at least under certain circumstances, zones of widespread damage. The approach is based on the natural and immediate convergence of eyewitnesses on the website, who rush to the Internet to investigate the cause of the shaking they just felt, causing our traffic to increase. The area where an earthquake was felt is mapped simply by locating the Internet Protocol (IP) addresses active during traffic surges. In addition, the presence of eyewitnesses browsing our website within minutes of an earthquake excludes the possibility of widespread damage in the localities they originate from: in case of severe damage, the networks would be down. The validity of the information derived from this clickstream analysis is confirmed by comparisons with EMS98 macroseismic maps obtained from online questionnaires. The name of this approach, "flash-sourcing", is a combination of "flash-crowd" and "crowdsourcing", intended to reflect the rapidity of the data collation from the public. For computer scientists, a flash-crowd names a traffic surge on a website. Crowdsourcing means work being done by a "crowd" of people; it also characterises Internet and mobile applications collecting information from the public, such as online macroseismic questionnaires. Like crowdsourcing techniques, flash-sourcing is a
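
    At its core, flash-sourcing needs a trigger that tells a routine minute of web traffic from a post-earthquake flash-crowd. A minimal sketch of such a trigger over per-minute hit counts (the window size, factor and minimum volume are invented for illustration; EMSC's operational logic is richer):

```python
from collections import deque

class SurgeDetector:
    """Flag a flash-crowd: the current per-minute hit count far exceeds
    the recent baseline. Thresholds are illustrative, not EMSC's."""

    def __init__(self, window=30, factor=5.0, min_hits=50):
        self.history = deque(maxlen=window)  # recent per-minute counts
        self.factor = factor                 # surge = factor x baseline
        self.min_hits = min_hits             # ignore tiny absolute spikes

    def update(self, hits_this_minute):
        baseline = (sum(self.history) / len(self.history)
                    if self.history else 0.0)
        surge = (hits_this_minute >= self.min_hits
                 and hits_this_minute > self.factor * max(baseline, 1.0))
        self.history.append(hits_this_minute)
        return surge

detector = SurgeDetector()
quiet = [detector.update(10) for _ in range(30)]  # normal background traffic
felt = detector.update(400)                       # minute after a felt quake
```

    Locating the IP addresses seen during the flagged minute then yields the first felt-area map.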

  16. Conceptualizing the Abstractions of Earthquakes Through an Instructional Sequence Using SeisMac and the Rapid Earthquake Viewer

    Science.gov (United States)

    Taber, J.; Hubenthal, M.; Wysession, M.

    2007-12-01

    Newsworthy earthquakes provide an engaging hook for students in Earth science classes, particularly when discussing their effects on people and the landscape. However, engaging students in an analysis of earthquakes that extends beyond death and damage is frequently hampered by the abstraction of recorded ground motion data in the form of raw seismograms and by the inability of most students to personally relate to ground accelerations. To overcome these challenges, an educational sequence has been developed using two software tools: SeisMac by Daniel Griscom, and the Rapid Earthquake Viewer (REV) developed by the University of South Carolina in collaboration with IRIS and DLESE. This sequence presents a unique opportunity for Earth science teachers to "create" foundational experiences for students as they construct a framework for understanding abstract concepts. The first activity is designed to introduce the concept of a three-component seismogram and to directly address the very abstract nature of seismograms through a kinesthetic experience. Students first learn to take the pulse of their classroom through a guided exploration of SeisMac, which displays the output of the laptop's built-in Sudden Motion Sensor (a 3-component accelerometer). This exploration allows students to view a 3-component seismogram as they move or tap the laptop and encourages them to propose and carry out experiments to explain the meaning of the 3-component seismogram. Once completed, students are asked to apply this new knowledge to a real 3-component seismogram printed from REV. Next the activity guides students through the process of identifying P and S waves, using SeisMac to connect the physical motion of the laptop to the "wiggles" they see on the SeisMac display, and then comparing those to the "wiggles" they see on their seismogram. At this point students are more fully prepared to engage in an S-P location exercise such as those included in many state standards
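
    The S-P location exercise rests on one line of arithmetic: the S-minus-P arrival-time difference grows linearly with epicentral distance. A sketch assuming average crustal velocities (vp = 6.0 km/s, vs = 3.5 km/s; actual values vary regionally):

```python
def sp_distance_km(sp_seconds, vp=6.0, vs=3.5):
    """Epicentral distance from the S-minus-P time:
    t_s - t_p = d/vs - d/vp = d * (vp - vs) / (vp * vs)."""
    return sp_seconds * (vp * vs) / (vp - vs)

distance = sp_distance_km(10.0)  # 10 s of S-P delay -> 84 km
```

    With these velocities each second of S-P delay corresponds to 8.4 km; three such distances from different stations locate the epicenter by intersecting circles.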

  17. Rapid penetration into granular media visualizing the fundamental physics of rapid earth penetration

    CERN Document Server

    Iskander, Magued

    2015-01-01

    Rapid Penetration into Granular Media: Visualizing the Fundamental Physics of Rapid Earth Penetration introduces readers to the variety of methods and techniques used to visualize, observe, and model the rapid penetration of natural and man-made projectiles into earth materials. It provides seasoned practitioners with a standard reference that showcases the topic's most recent developments in research and application. The text compiles the findings of new research developments on the subject, outlines the fundamental physics of rapid penetration into granular media, and assembles a com

  18. Recent applications for rapid estimation of earthquake shaking and losses with ELER Software

    International Nuclear Information System (INIS)

    Demircioglu, M.B.; Erdik, M.; Kamer, Y.; Sesetyan, K.; Tuzun, C.

    2012-01-01

    A methodology and software package entitled Earthquake Loss Estimation Routine (ELER) was developed for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region. The work was carried out under the Joint Research Activity-3 (JRA3) of the EC FP6 project entitled Network of Research Infrastructures for European Seismology (NERIES). The ELER methodology comprises: 1) finding the most likely location of the earthquake source using a regional seismo-tectonic database; 2) estimation of the spatial distribution of selected ground motion parameters at engineering bedrock through region-specific ground motion prediction models, bias-correcting the ground motion estimates with strong ground motion data where available; 3) estimation of the spatial distribution of site-corrected ground motion parameters from a regional geology database using appropriate amplification models; and 4) estimation of the losses and uncertainties at various orders of sophistication (buildings, casualties). The multi-level methodology developed for real-time loss estimation is capable of incorporating regional variability and the sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subject to earthquake hazard, and the associated vulnerability relationships coded into ELER. The present paper provides brief information on the ELER methodology and an example application to the recent major earthquake that hit the Van province in the east of Turkey on 23 October 2011 with moment magnitude (Mw) 7.2. For this earthquake, Kandilli Observatory and Earthquake Research Institute (KOERI) provided almost real-time estimates of building damage and casualty distribution using ELER. (author)
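
    Step 2 above relies on ground motion prediction equations (GMPEs), which share a common log-linear functional form. The sketch below uses placeholder coefficients for illustration only, not ELER's region-specific relations or any published model:

```python
import math

def pga_g(mw, r_km, a=-1.5, b=0.5, c=1.0, depth_km=10.0):
    """Generic GMPE functional form ln(PGA) = a + b*Mw - c*ln(R_hyp);
    coefficients are illustrative placeholders, not a published model."""
    r_hyp = math.hypot(r_km, depth_km)  # simple hypocentral distance
    return math.exp(a + b * mw - c * math.log(r_hyp))

# amplitude grows with magnitude and decays with distance
near = pga_g(7.2, 10.0)
far = pga_g(7.2, 80.0)
```

    ELER then corrects such bedrock estimates for local site amplification before the loss computation.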

  19. Rapid post-earthquake modelling of coseismic landslide intensity and distribution for emergency response decision support

    Directory of Open Access Journals (Sweden)

    T. R. Robinson

    2017-09-01

    Current methods to identify coseismic landslides immediately after an earthquake using optical imagery are too slow to effectively inform emergency response activities. Issues with cloud cover, data collection and processing, and manual landslide identification mean even the most rapid mapping exercises are often incomplete when the emergency response ends. In this study, we demonstrate how traditional empirical methods for modelling the total distribution and relative intensity (in terms of point density) of coseismic landsliding can be successfully undertaken in the hours and days immediately after an earthquake, allowing the results to effectively inform stakeholders during the response. The method uses fuzzy logic in a GIS (Geographic Information System) to quickly assess and identify the location-specific relationships between predisposing factors and landslide occurrence during the earthquake, based on small initial samples of identified landslides. We show that this approach can accurately model both the spatial pattern and the number density of landsliding from the event based on just several hundred mapped landslides, provided they have sufficiently wide spatial coverage, improving upon previous methods. This suggests that systematic high-fidelity mapping of landslides following an earthquake is not necessary for informing rapid modelling attempts. Instead, mapping should focus on rapid sampling from the entire affected area to generate results that can inform the modelling. This method is therefore suited to conditions in which imagery is affected by partial cloud cover or in which the total number of landslides is so large that mapping requires significant time to complete. The method therefore has the potential to provide a quick assessment of landslide hazard after an earthquake and may therefore inform emergency operations more effectively compared to current practice.
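
    The fuzzy-logic overlay at the heart of the method can be sketched with standard fuzzy operators: each predisposing factor is mapped to a membership grade in [0, 1], and the grades are combined with the fuzzy gamma operator, a compromise between the pessimistic fuzzy product and the optimistic algebraic sum. The membership ranges and gamma value below are illustrative, not the paper's calibrated choices:

```python
import numpy as np

def linear_membership(x, lo, hi):
    """Map a factor (e.g. slope in degrees) linearly onto [0, 1]."""
    return np.clip((np.asarray(x, dtype=float) - lo) / (hi - lo), 0.0, 1.0)

def fuzzy_gamma(memberships, gamma=0.9):
    """Fuzzy gamma overlay of several membership layers (axis 0)."""
    m = np.asarray(memberships, dtype=float)
    product = np.prod(m, axis=0)              # pessimistic combination
    alg_sum = 1.0 - np.prod(1.0 - m, axis=0)  # optimistic combination
    return product ** (1.0 - gamma) * alg_sum ** gamma

# two map cells, two factors: slope-derived and shaking-derived grades
slope_mu = linear_membership([34.0, 12.0], 10.0, 40.0)
shake_mu = np.array([0.6, 0.2])
susceptibility = fuzzy_gamma([slope_mu, shake_mu])
```

    Gamma near 1 lets any strong factor raise the combined susceptibility; gamma near 0 requires all factors to be high at once.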

  20. Rapid and reversible recruitment of early visual cortex for touch.

    Directory of Open Access Journals (Sweden)

    Lotfi B Merabet

    2008-08-01

    The loss of vision has been associated with enhanced performance in non-visual tasks such as tactile discrimination and sound localization. Current evidence suggests that these functional gains are linked to the recruitment of the occipital visual cortex for non-visual processing, but the neurophysiological mechanisms underlying these crossmodal changes remain uncertain. One possible explanation is that visual deprivation is associated with an unmasking of non-visual input into visual cortex. We investigated the effect of sudden, complete and prolonged visual deprivation (five days) in normally sighted adult individuals while they were immersed in an intensive tactile training program. Following the five-day period, blindfolded subjects performed better on a Braille character discrimination task. In the blindfold group, serial fMRI scans revealed an increase in BOLD signal within the occipital cortex in response to tactile stimulation after five days of complete visual deprivation. This increase in signal was no longer present 24 hours after blindfold removal. Finally, reversible disruption of occipital cortex function on the fifth day (by repetitive transcranial magnetic stimulation; rTMS) impaired Braille character recognition ability in the blindfold group but not in non-blindfolded controls. This disruptive effect was no longer evident once the blindfold had been removed for 24 hours. Overall, our findings suggest that sudden and complete visual deprivation in normally sighted individuals can lead to profound, but rapidly reversible, neuroplastic changes by which the occipital cortex becomes engaged in processing of non-visual information. The speed and dynamic nature of the observed changes suggests that normally inhibited or masked functions in the sighted are revealed by visual loss. The unmasking of pre-existing connections and shifts in connectivity represent rapid, early plastic changes, which presumably can lead, if sustained and

  1. Urban MEMS based seismic network for post-earthquakes rapid disaster assessment

    Science.gov (United States)

    D'Alessandro, Antonino; Luzio, Dario; D'Anna, Giuseppe

    2014-05-01

    Life losses following a disastrous earthquake depend mainly on building vulnerability, the intensity of shaking, and the timeliness of rescue operations. In recent decades, the increase in population and industrial density has significantly increased the exposure of urban areas to earthquakes. The potential impact of a strong earthquake on a town center can be reduced by timely and correct actions of the emergency management centers. A real-time urban seismic network can drastically reduce casualties immediately following a strong earthquake by promptly providing information about the distribution of the ground shaking level. Emergency management centers, with functions in the immediate post-earthquake period, could use this information to allocate and prioritize resources to minimize loss of human life. However, due to the high cost of seismological instrumentation, urban seismic networks capable of reducing the rate of fatalities have not yet been realized. Recent developments in MEMS (Micro Electro-Mechanical Systems) technology could today allow the realization of a high-density urban seismic network for rapid post-earthquake disaster assessment, suitable for mitigating earthquake effects. In the 1990s, MEMS accelerometers revolutionized the automotive airbag industry and are today widely used in laptops, game controllers and mobile phones. Due to their great commercial success, the research into and development of MEMS accelerometers are actively pursued around the world. Nowadays, the sensitivity and dynamic range of these sensors are sufficient to allow accurate recording of earthquakes of moderate to strong magnitude. Due to their low cost and small size, MEMS accelerometers may be employed for the realization of high-density seismic networks. MEMS accelerometers could be installed inside sensitive places (high vulnerability and exposure), such as schools, hospitals, public buildings and places of
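
    A dense network of low-cost accelerometer nodes only helps if each node can flag strong shaking on its own. As a minimal illustration of how such a node might detect an event, the sketch below implements a classical STA/LTA (short-term average over long-term average) trigger; the window lengths and threshold are illustrative assumptions, not values from the study.

```python
# Illustrative STA/LTA trigger, a common way low-cost accelerometer nodes
# detect strong shaking: declare an event when the short-term average of
# |acceleration| exceeds a multiple of the long-term average.
# Window lengths and the trigger ratio below are example choices.

def sta_lta_trigger(samples, n_sta=5, n_lta=50, ratio=4.0):
    for i in range(n_lta, len(samples)):
        sta = sum(abs(x) for x in samples[i - n_sta:i]) / n_sta
        lta = sum(abs(x) for x in samples[i - n_lta:i]) / n_lta
        if lta > 0 and sta / lta >= ratio:
            return i  # index at which the trigger fires
    return None  # no trigger

# Quiet background followed by a burst of strong shaking:
trigger_index = sta_lta_trigger([0.01] * 100 + [1.0] * 10)
```

In a real deployment each node would stream such triggers, plus peak amplitudes, to the emergency management center to build a shaking map.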

  2. Expanding the Delivery of Rapid Earthquake Information and Warnings for Response and Recovery

    Science.gov (United States)

    Blanpied, M. L.; McBride, S.; Hardebeck, J.; Michael, A. J.; van der Elst, N.

    2017-12-01

    Scientific organizations like the United States Geological Survey (USGS) release information to support effective responses during an earthquake crisis. Information is delivered to the White House, the National Command Center, and the Departments of Defense, Homeland Security (including FEMA), Transportation, Energy, and Interior. Other crucial stakeholders include state officials and decision makers, emergency responders, numerous public and private infrastructure management centers (e.g., highways, railroads and pipelines), the media, and the public. To meet the diverse information requirements of these users, rapid earthquake notifications have been developed for delivery by e-mail and text message, while a suite of earthquake information resources such as ShakeMaps, Did You Feel It?, PAGER impact estimates, and data are delivered via the web. The ShakeAlert earthquake early warning system being developed for the U.S. West Coast will identify and characterize an earthquake a few seconds after it begins, estimate the likely intensity of ground shaking, and deliver brief but critically important warnings to people and infrastructure in harm's way. Currently the USGS is also developing a capability to deliver Operational Earthquake Forecasts (OEF). These provide estimates of potential seismic behavior after large earthquakes and during evolving aftershock sequences. Similar work is underway in New Zealand, Japan, and Italy. Social science research conducted during such sequences indicates that aftershock forecasts are valued for a variety of reasons, from informing critical response and recovery decisions to psychologically preparing for more earthquakes. New tools will allow users to customize map-based, spatiotemporal forecasts to their specific needs. Hazard curves and other advanced information will also be available. For such authoritative information to be understood and used during the pressures of an earthquake
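
    Aftershock forecasts of the kind described above typically rest on empirical rate laws. As a hedged illustration (operational forecasting systems are considerably more elaborate), the sketch below integrates the modified Omori law, lambda(t) = K / (t + c)^p, to get expected aftershock counts in a time window; the parameter values are generic placeholders, not fitted values.

```python
# Expected aftershock count between t1 and t2 days after a mainshock,
# from the modified Omori law lambda(t) = K / (t + c)**p.
# K, c, p below are generic illustrative values, not fitted parameters.

def expected_aftershocks(t1, t2, K=100.0, c=0.05, p=1.1):
    # Closed-form antiderivative of (t + c)**(-p), valid for p != 1
    antideriv = lambda t: (t + c) ** (1.0 - p) / (1.0 - p)
    return K * (antideriv(t2) - antideriv(t1))

# The expected count decays quickly: most events fall in the first day.
day_one = expected_aftershocks(0.0, 1.0)
day_two = expected_aftershocks(1.0, 2.0)
```

A map-based forecast product would evaluate such a rate model over a spatial grid rather than a single window.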

  3. What are the visual features underlying rapid object recognition?

    Directory of Open Access Journals (Sweden)

    Sébastien M Crouzet

    2011-11-01

    Full Text Available Research progress in machine vision has been very significant in recent years. Robust face detection and identification algorithms are already readily available to consumers, and modern computer vision algorithms for generic object recognition are now coping with the richness and complexity of natural visual scenes. Unlike early vision models of object recognition that emphasized the role of figure-ground segmentation and spatial information between parts, recent successful approaches are based on the computation of loose collections of image features without prior segmentation or any explicit encoding of spatial relations. While these models remain simplistic models of visual processing, they suggest that, in principle, bottom-up activation of a loose collection of image features could support the rapid recognition of natural object categories and provide an initial coarse visual representation before more complex visual routines and attentional mechanisms take place. Focusing on biologically-plausible computational models of (bottom-up) pre-attentive visual recognition, we review some of the key visual features that have been described in the literature. We discuss the consistency of these feature-based representations with classical theories from visual psychology and test their ability to account for human performance on a rapid object categorization task.

  4. The Benefits and Limitations of Crowdsourced Information for Rapid Damage Assessment of Global Earthquakes

    Science.gov (United States)

    Bossu, R.; Landès, M.; Roussel, F.

    2017-12-01

    The Internet has accelerated the collection of felt reports and macroseismic data after global earthquakes. At the European-Mediterranean Seismological Centre (EMSC), where the traditional online questionnaires have been replaced by thumbnail-based questionnaires, an average of half of the reports are collected within 10 minutes of an earthquake's occurrence. In regions where EMSC is well identified, this goes down to 5 minutes. The user simply selects the thumbnail corresponding to the observed effects, erasing language barriers and improving collection on small smartphone screens. A previous study has shown that EMSC data are well correlated with "Did You Feel It?" (DYFI) data and with 3 independent, manually collected datasets. The efficiency and rapidity of felt report collection through thumbnail-based questionnaires does not necessarily mean that they offer a complete picture of the situation for all intensity values, especially the higher ones. There are several potential limitations. Demographics probably play a role, but so might eyewitnesses' behavior: it is probably not their priority to report when their own safety and that of their loved ones is at stake. We propose to test this hypothesis on EMSC felt reports and to extend the study to uses of the LastQuake smartphone application. LastQuake is a free smartphone app providing very rapid information on felt earthquakes. There are currently 210,000 active users around the world, covering almost every country except a few in Sub-Saharan Africa. Along with felt reports, we also analyze the characteristics of LastQuake app launches. For both composite datasets, created from 108 earthquakes, we analyze the rapidity of eyewitnesses' reactions, how it changes with intensity values, and surmise how they reflect different types of behavior. We will show the intrinsic limitations of crowdsourced information for rapid situation awareness.
More importantly, we will show in which cases the lack of crowdsourced information could

  5. The Characteristics and Limits of Rapid Visual Categorization

    Science.gov (United States)

    Fabre-Thorpe, Michèle

    2011-01-01

    Visual categorization appears both effortless and virtually instantaneous. The study by Thorpe et al. (1996) was the first to estimate the processing time necessary to perform fast visual categorization of animals in briefly flashed (20 ms) natural photographs. They observed a large differential EEG activity between target and distracter correct trials that developed from 150 ms after stimulus onset, a value that was later shown to be even shorter in monkeys! With such strong processing time constraints, it was difficult to escape the conclusion that rapid visual categorization was relying on massively parallel, essentially feed-forward processing of visual information. Since 1996, we have conducted a large number of studies to determine the characteristics and limits of fast visual categorization. The present chapter will review some of the main results obtained. I will argue that rapid object categorizations in natural scenes can be done without focused attention and are most likely based on coarse and unconscious visual representations activated with the first available (magnocellular) visual information. Fast visual processing proved efficient for the categorization of large superordinate object or scene categories, but shows its limits when more detailed basic representations are required. The representations for basic objects (dogs, cars) or scenes (mountain or sea landscapes) need additional processing time to be activated. This finding is at odds with the widely accepted idea that such basic representations are at the entry level of the system. Interestingly, focused attention is still not required to perform these time-consuming basic categorizations. Finally, we will show that object and context processing can interact very early in an ascending wave of visual information processing. We will discuss how such data could result from our experience with a highly structured and predictable surrounding world that shaped neuronal visual selectivity.

  6. Rapid Modeling of and Response to Large Earthquakes Using Real-Time GPS Networks (Invited)

    Science.gov (United States)

    Crowell, B. W.; Bock, Y.; Squibb, M. B.

    2010-12-01

    Real-time GPS networks have the advantage of capturing motions throughout the entire earthquake cycle (interseismic, seismic, coseismic, postseismic) and are therefore ideal for real-time monitoring of fault slip in a region. Real-time GPS networks provide the perfect supplement to seismic networks, which operate with lower noise and higher sampling rates than GPS networks but only measure accelerations or velocities, putting them at a severe disadvantage for ascertaining the full extent of slip during a large earthquake in real time. Here we report on two examples of rapid modeling of recent large earthquakes near large regional real-time GPS networks. The first utilizes Japan's GEONET, consisting of about 1200 stations, during the 2003 Mw 8.3 Tokachi-Oki earthquake about 100 km offshore of Hokkaido Island; the second investigates the 2010 Mw 7.2 El Mayor-Cucapah earthquake recorded by more than 100 stations in the California Real Time Network. The principal components of strain were computed throughout the networks and utilized as a trigger to initiate earthquake modeling. Total displacement waveforms were then computed in a simulated real-time fashion using a real-time network adjustment algorithm that fixes a station far away from the rupture to obtain a stable reference frame. Initial peak ground displacement measurements can then be used to obtain an initial size through scaling relationships. Finally, a full coseismic model of the event can be run minutes after the event, given predefined fault geometries, allowing emergency first responders and researchers to pinpoint the regions of highest damage. Furthermore, we are also investigating using total displacement waveforms for real-time moment tensor inversions to look at spatiotemporal variations in slip.
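
    The scaling step mentioned above, from peak ground displacement (PGD) to an initial event size, commonly takes the form log10(PGD) = A + B*Mw + C*Mw*log10(R). The sketch below inverts such a relation for Mw; the coefficients are illustrative placeholders, not the published regression values.

```python
import math

# Illustrative PGD scaling law: log10(PGD) = A + B*Mw + C*Mw*log10(R),
# with PGD in cm and hypocentral distance R in km.
# A, B, C are placeholders for illustration, not regression results.
A, B, C = -5.0, 1.2, -0.18

def magnitude_from_pgd(pgd_cm, hypocentral_km):
    """Invert the scaling law for Mw given observed PGD and distance."""
    log_r = math.log10(hypocentral_km)
    return (math.log10(pgd_cm) - A) / (B + C * log_r)

# A station 100 km from the hypocenter observing 10 cm of peak displacement:
mw_estimate = magnitude_from_pgd(10.0, 100.0)
```

In practice estimates from many stations are combined, and the magnitude is refined as the displacement field grows.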

  7. E-DECIDER Rapid Response to the M 6.0 South Napa Earthquake

    Science.gov (United States)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2014-12-01

    E-DECIDER initiated rapid response mode when the California Earthquake Clearinghouse was activated the morning following the M 6.0 South Napa earthquake. Data products, including: 1) rapid damage and loss estimates, 2) deformation magnitude and slope change maps, and 3) aftershock forecasts, were provided to the Clearinghouse partners within 24 hours of the event via XchangeCore Web Service Data Orchestration sharing. NASA data products were provided to end-users via XchangeCore, the EERI and Clearinghouse websites, and ArcGIS Online for Napa response, reaching a wide response audience. The E-DECIDER team helped facilitate rapid delivery of NASA products to stakeholders and participated in Clearinghouse Napa earthquake briefings to update stakeholders on product information. Rapid response products from E-DECIDER can be used to help prioritize response efforts shortly after the event has occurred. InLET (Internet Loss Estimation Tool) post-event damage and casualty estimates were generated quickly after the Napa earthquake. InLET provides immediate post-event estimates of casualties and building damage by performing loss/impact simulations using USGS ground motion data and FEMA HAZUS damage estimation technology. These results were provided to E-DECIDER by their collaborators, ImageCat, Inc. and the Community Stakeholder Network (CSN). Strain magnitude and slope change maps were automatically generated when the Napa earthquake appeared on the USGS feed. These maps provide an early estimate of where deformation has occurred and where damage may be localized. Using E-DECIDER critical infrastructure overlays with damage estimates, decision makers can direct response efforts that can be verified later with field reconnaissance and remote sensing-based observations. Earthquake aftershock forecast maps were produced within hours of the event.
These maps highlight areas where aftershocks are likely to occur and can also be coupled with infrastructure overlays to help direct response

  8. Geodetic Imaging for Rapid Assessment of Earthquakes: Airborne Laser Scanning (ALS)

    Science.gov (United States)

    Carter, W. E.; Shrestha, R. L.; Glennie, C. L.; Sartori, M.; Fernandez-Diaz, J.; National CenterAirborne Laser Mapping Operational Center

    2010-12-01

    To the residents of an area struck by a strong earthquake, quantitative information on damage to the infrastructure, and its attendant impact on relief and recovery efforts, is urgent and of primary concern. To earth scientists a strong earthquake offers an opportunity to learn more about earthquake mechanisms and to compare their models with the real world, in hopes of one day being able to accurately predict the precise locations, magnitudes, and times of large (and potentially disastrous) earthquakes. Airborne laser scanning (also referred to as airborne LiDAR or Airborne Laser Swath Mapping) is particularly well suited for rapid assessment of earthquakes, both for immediately estimating the damage to infrastructure and for providing information for the scientific study of earthquakes. ALS observations collected at low altitude (500–1000 m) from a relatively slow (70–100 m/s) aircraft can provide dense (5–15 points/m2) sets of surface features (buildings, vegetation, ground), extending over hundreds of square kilometers, with turnaround times of several hours to a few days. The actual response time to any given event depends on several factors, including bureaucratic issues such as approval of funds, export license formalities, and clearance to fly over the area to be mapped; operational factors such as the deployment of the aircraft and ground teams may also take a number of days for remote locations. Of course, the need for immediate mapping of earthquake damage generally is not as urgent in remote regions with less infrastructure and few inhabitants. During August 16-19, 2010 the National Center for Airborne Laser Mapping (NCALM) mapped the area affected by the magnitude 7.2 El Mayor-Cucapah Earthquake (Northern Baja California Earthquake), which occurred on April 4, 2010, and was felt throughout southern California, Arizona, Nevada, and Baja California North, Mexico. From initial ground observations the fault rupture appeared to extend 75 km
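
    The survey parameters quoted above (altitude, speed, point density) are tied together by simple geometry: pulses emitted per second divided by the ground area swept per second. The sketch below does this back-of-the-envelope calculation; the pulse repetition frequency and swath width are assumed example values, not NCALM system specifications.

```python
# Ground point density (points/m^2) as pulses per second divided by
# the area swept per second (ground speed x swath width).
# The example numbers are illustrative assumptions.

def point_density(prf_hz, speed_m_s, swath_m):
    return prf_hz / (speed_m_s * swath_m)

# e.g. a 150 kHz laser at 85 m/s over a 400 m swath:
density = point_density(150_000, 85.0, 400.0)  # a few points per m^2
```

Flying lower narrows the swath and raises the density, which is why low-altitude, slow flight yields the 5-15 points/m2 quoted in the abstract.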

  9. Seismogeodetic monitoring techniques for tsunami and earthquake early warning and rapid assessment of structural damage

    Science.gov (United States)

    Haase, J. S.; Bock, Y.; Saunders, J. K.; Goldberg, D.; Restrepo, J. I.

    2016-12-01

    As part of an effort to promote the use of NASA-sponsored Earth science information for disaster risk reduction, real-time high-rate seismogeodetic data are being incorporated into early warning and structural monitoring systems. Seismogeodesy combines seismic acceleration and GPS displacement measurements using a tightly-coupled Kalman filter to provide absolute estimates of seismic acceleration, velocity and displacement. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and tide gauges and deep-ocean buoys for direct measurement of tsunami waves. Real-time seismogeodetic observations at subduction zones allow for more robust and rapid magnitude and slip estimation that increase warning time in the near-source region. A NASA-funded effort to utilize GPS and seismogeodesy in NOAA's Tsunami Warning Centers in Alaska and Hawaii integrates new modules for picking, locating, and estimating magnitudes and moment tensors for earthquakes into the USGS earthworm environment at the TWCs. In a related project, NASA supports the transition of this research to seismogeodetic tools for disaster preparedness, specifically by implementing GPS and low-cost MEMS accelerometers for structural monitoring in partnership with earthquake engineers. Real-time high-rate seismogeodetic structural monitoring has been implemented on two structures. The first is a parking garage at the Autonomous University of Baja California Faculty of Medicine in Mexicali, not far from the rupture of the 2010 Mw 7.2 El Mayor-Cucapah earthquake, enabled through a UCMexus collaboration. The second is the 8-story Geisel Library at the University of California, San Diego (UCSD). The system has also been installed for several proof-of-concept experiments at the UCSD Network for Earthquake Engineering Simulation (NEES) Large High Performance Outdoor Shake Table.
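
    The tightly-coupled Kalman filter mentioned above fuses high-rate accelerometer samples with lower-rate GPS displacements. The sketch below is a deliberately minimal one-dimensional version (state = displacement and velocity; the accelerometer drives the prediction step, GPS corrects it); the noise values are arbitrary illustrative choices, and the operational filter is considerably more sophisticated.

```python
# Minimal 1-D sketch of the seismogeodetic idea: a Kalman filter whose
# state is [displacement, velocity], driven by accelerometer samples
# (prediction) and corrected by lower-rate GPS displacements (update).
# Process noise q and measurement noise r are illustrative choices.

def seismogeodetic_filter(accels, gps, dt=0.01, q=1e-4, r=1e-2):
    x = [0.0, 0.0]                  # state: [displacement, velocity]
    P = [[1.0, 0.0], [0.0, 1.0]]    # state covariance
    out = []
    for k, a in enumerate(accels):
        # Predict: integrate the acceleration sample
        x = [x[0] + x[1] * dt + 0.5 * a * dt * dt, x[1] + a * dt]
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1], P[1][1] + q]]
        # Update with a GPS displacement when one is available
        z = gps.get(k)
        if z is not None:
            S = P[0][0] + r
            K = [P[0][0] / S, P[1][0] / S]
            innov = z - x[0]
            x = [x[0] + K[0] * innov, x[1] + K[1] * innov]
            P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
                 [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]
        out.append(x[0])
    return out

# Constant 1 m/s^2 acceleration with a GPS fix every 10th sample:
dt = 0.01
accels = [1.0] * 100
gps = {k: 0.5 * ((k + 1) * dt) ** 2 for k in range(9, 100, 10)}
disp = seismogeodetic_filter(accels, gps, dt=dt)
```

With a clean constant-acceleration signal the filter reproduces the analytic displacement; in practice, the GPS updates are what keep integrated accelerometer noise and bias from drifting.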
We present MEMS-based seismogeodetic observations from the 10 June

  10. SalanderMaps: A rapid overview about felt earthquakes through data mining of web-accesses

    Science.gov (United States)

    Kradolfer, Urs

    2013-04-01

    While seismological observatories detect and locate earthquakes based on measurements of the ground motion, they neither know a priori whether an earthquake has been felt by the public, nor where it has been felt. Such information is usually gathered by evaluating feedback reported by the public through online forms on the web. However, after a felt earthquake in Switzerland, many people visit the webpages of the Swiss Seismological Service (SED) at ETH Zurich, and each such visit leaves traces in the logfiles on our web servers. Data mining techniques applied to these logfiles, together with mining publicly available databases on the internet, open possibilities to obtain previously unknown information about our virtual visitors. In order to provide precise information to authorities and the media, it would be desirable to rapidly know from which locations these web accesses originate. The method 'Salander' (Seismic Activity Linked to Area codes - Nimble Detection of Earthquake Rumbles) will be introduced, and it will be explained how the IP addresses (each computer or router directly connected to the internet has a unique IP address; an example would be 129.132.53.5) of a sufficient number of our virtual visitors were linked to their geographical areas. This allows us to know exceptionally quickly whether and where an earthquake was felt in Switzerland. It will also be explained why the Salander method is superior to commercial so-called geolocation products. The corresponding products of the Salander method, animated SalanderMaps, which are routinely generated after each earthquake with a magnitude of M>2 in Switzerland (http://www.seismo.ethz.ch/prod/salandermaps/, available after March 2013), demonstrate how the wavefield of earthquakes propagates through Switzerland and where it was felt. Often, such information is available within less than 60 seconds after origin time, and we always get a clear picture within five minutes after origin time
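
    Once web accesses have been mapped from IP address to geographical area, detecting "felt" areas reduces to comparing current hit rates against normal traffic. The toy sketch below shows this final step; the area codes, baseline rates, and threshold factor are invented for illustration and are not part of the Salander implementation.

```python
from collections import Counter

# Toy version of the detection step: flag areas whose web-access count
# in the minute after an event far exceeds their normal baseline rate.
# Area codes, baselines, and the threshold factor are assumptions.

def felt_areas(hits, baseline, factor=5.0):
    """hits: list of area codes seen after the event;
    baseline: dict of typical hits-per-minute per area."""
    counts = Counter(hits)
    return sorted(area for area, n in counts.items()
                  if n >= factor * baseline.get(area, 0.1))
```

Animating such per-area detections over successive time slices is essentially what a SalanderMap shows: the felt wavefield spreading across the country.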

  11. Rapid earthquake hazard and loss assessment for Euro-Mediterranean region

    Science.gov (United States)

    Erdik, Mustafa; Sesetyan, Karin; Demircioglu, Mine; Hancilar, Ufuk; Zulfikar, Can; Cakti, Eser; Kamer, Yaver; Yenidogan, Cem; Tuzun, Cuneyt; Cagnan, Zehra; Harmandar, Ebru

    2010-10-01

    The near-real-time estimation of ground shaking and losses after a major earthquake in the Euro-Mediterranean region was performed in the framework of the Joint Research Activity 3 (JRA-3) component of the EU FP6 project entitled "Network of Research Infrastructures for European Seismology, NERIES". The work consists of finding the most likely location of the earthquake source by estimating the fault rupture parameters on the basis of rapid inversion of data from on-line regional broadband stations. It also includes an estimation of the spatial distribution of selected site-specific ground motion parameters at engineering bedrock through region-specific ground motion prediction equations (GMPEs) or physical simulation of ground motion. By using the Earthquake Loss Estimation Routine (ELER) software, the multi-level methodology developed for real-time estimation of losses is capable of incorporating regional variability and sources of uncertainty stemming from GMPEs, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships.
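
    The GMPE step above maps magnitude and distance to expected shaking. As a hedged sketch of the common functional form ln(PGA) = c1 + c2*Mw - c3*ln(R + c4), the code below evaluates a median ground motion; the coefficients are illustrative placeholders, not those of any model used in NERIES or ELER.

```python
import math

# Generic GMPE sketch: ln(PGA) = c1 + c2*Mw - c3*ln(R + c4),
# with distance R in km and PGA expressed as a fraction of g.
# C1..C4 are illustrative placeholders, not published coefficients.
C1, C2, C3, C4 = -4.0, 0.9, 1.1, 10.0

def median_pga_g(mw, dist_km):
    """Median peak ground acceleration (fraction of g)."""
    return math.exp(C1 + C2 * mw - C3 * math.log(dist_km + C4))
```

A shaking map is produced by evaluating such a relation (plus site amplification terms) over a grid of distances from the estimated rupture.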

  12. Thumbnail‐based questionnaires for the rapid and efficient collection of macroseismic data from global earthquakes

    Science.gov (United States)

    Bossu, Remy; Landes, Matthieu; Roussel, Frederic; Steed, Robert; Mazet-Roux, Gilles; Martin, Stacey S.; Hough, Susan E.

    2017-01-01

    The collection of earthquake testimonies (i.e., qualitative descriptions of felt shaking) is essential for macroseismic studies (i.e., studies gathering information on how strongly an earthquake was felt in different places), and when done rapidly and systematically, improves situational awareness and in turn can contribute to efficient emergency response. In this study, we present advances made in the collection of testimonies following earthquakes around the world using a thumbnail-based questionnaire implemented on the European-Mediterranean Seismological Centre (EMSC) smartphone app and its website compatible with mobile devices. In both instances, the questionnaire consists of a selection of thumbnails, each representing an intensity level of the European Macroseismic Scale 1998. We find that testimonies are collected faster, and in larger numbers, by way of thumbnail-based questionnaires than by more traditional online questionnaires. Responses were received from all seismically active regions of our planet, suggesting that thumbnails overcome language barriers. We also observed that the app is not sufficient on its own, because the websites are the main source of testimonies when an earthquake strikes a region for the first time in a while; it is only for subsequent shocks that the app is widely used. Notably though, the speed of the collection of testimonies increases significantly when the app is used. We find that automated EMSC intensities as assigned by user-specified thumbnails are, on average, well correlated with "Did You Feel It?" (DYFI) responses and with the three independently and manually derived macroseismic datasets, but there is a tendency for EMSC to be biased low with respect to DYFI at moderate and large intensities. We address this by proposing a simple adjustment that will be verified in future earthquakes.
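
    The "simple adjustment" for the low bias at moderate and large intensities is not specified in this abstract, but a piecewise-linear correction of the following kind conveys the idea; the threshold and slope are hypothetical values for illustration, not the proposed adjustment itself.

```python
# Hypothetical piecewise-linear correction of thumbnail-derived (EMSC)
# intensities toward DYFI: identity at low intensity, stretched above
# a threshold. Threshold and slope are illustrative assumptions.

def adjust_intensity(i_emsc, threshold=5.0, slope=1.2):
    if i_emsc <= threshold:
        return i_emsc
    return threshold + slope * (i_emsc - threshold)
```

Any such correction would be calibrated against co-located DYFI observations and re-verified on future earthquakes, as the abstract notes.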

  13. A new quantitative method for the rapid evaluation of buildings against earthquakes

    International Nuclear Information System (INIS)

    Mahmoodzadeh, Amir; Mazaheri, Mohammad Mehdi

    2008-01-01

    At the present time there exist numerous weak buildings which are not able to withstand earthquakes. At the same time, both private and public developers are trying to use scientific methods to prioritize and allocate budget in order to reinforce the above-mentioned structures, because financial resources and time are limited. In recent years the procedure of seismic assessment before rehabilitation of vulnerable buildings has been implemented in many countries. Now, it seems logical to reinforce the existing procedures with the mass of available data about the effects caused by earthquakes on buildings. The main idea is derived from FMEA (Failure Mode and Effect Analysis) in quality management, where the main procedure is to recognize the failure, its causes, and the priority of each cause and failure. Specifying the causes and effects which lead to a certain shortcoming in structural behavior during earthquakes, an inventory is developed and each building is rated through a yes-or-no procedure. In this way, the rating of the structure is based on standard forms which, along with relative weights, are developed in this study. The resulting rapid-assessment criteria indicate whether the structure is to be demolished, has a high, medium or low vulnerability, or is invulnerable
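
    The yes-or-no rating with relative weights can be pictured as a weighted checklist. The sketch below is an invented example of such a form; the items, weights, and class thresholds are illustrative assumptions, not those developed in the study.

```python
# Invented weighted checklist: each structural deficiency carries a
# relative weight (in percent); the total score maps to a vulnerability
# class. Items, weights, and thresholds are illustrative only.

ITEM_WEIGHTS = {"soft_story": 30, "short_columns": 20,
                "poor_materials": 25, "irregular_plan": 15,
                "heavy_overhangs": 10}

def vulnerability_class(answers):
    """answers: dict mapping item -> True if the deficiency is present."""
    score = sum(w for item, w in ITEM_WEIGHTS.items() if answers.get(item))
    if score >= 75:
        return "high"
    if score >= 40:
        return "medium"
    if score > 0:
        return "low"
    return "invulnerable"
```

The point of such a scheme is that a surveyor answers only observable yes-or-no questions, while the weights encode the earthquake-damage statistics.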

  14. Twitter as Information Source for Rapid Damage Estimation after Major Earthquakes

    Science.gov (United States)

    Eggert, Silke; Fohringer, Joachim

    2014-05-01

    Natural disasters like earthquakes require a fast response from local authorities. Well-trained rescue teams have to be available, equipment and technology have to be set up and ready, and information has to be directed to the right positions so the headquarters can manage the operation precisely. The main goal is to reach the most affected areas in a minimum of time. But even with the best preparation for these cases, there will always be uncertainty about what really happened in the affected area. Modern geophysical sensor networks provide high-quality data. These measurements, however, only map disjoint values from their respective locations for a limited set of parameters. Using observations from witnesses represents one approach to enhance measured values from sensors ("humans as sensors"). These observations are increasingly disseminated via social media platforms. These "social sensors" offer several advantages over common sensors, e.g. high mobility, high versatility of captured parameters, as well as rapid distribution of information. Moreover, the amount of data offered by social media platforms is quite extensive. We analyze messages distributed via Twitter after major earthquakes to get rapid information on what eyewitnesses report from the epicentral area. We use this information to (a) quickly learn about damage and losses to support fast disaster response and to (b) densify geophysical networks in areas where there is sparse information, to gain a more detailed insight into felt intensities. We present a case study from the Mw 7.1 Philippines (Bohol) earthquake that happened on Oct. 15, 2013. We extract Twitter messages, so-called tweets, containing one or more specified keywords from the semantic field of "earthquake" and use them for further analysis. For the time frame of Oct. 15 to Oct. 18 we obtain a database of in total 50,000 tweets, of which 2,900 are geo-localized and 470 have a photo attached.
Analyses for both national level and locally for
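
    The keyword-extraction step described above can be sketched as a simple filter over a tweet stream; the keyword list (including a hypothetical Filipino term) and the tweet structure are assumptions for illustration, not the study's actual query.

```python
import re

# Keep tweets containing any keyword from a small "earthquake" semantic
# field, and separate out the geo-localized ones. The keyword list and
# the tweet dict layout are illustrative assumptions.
KEYWORDS = re.compile(r"\b(earthquake|quake|lindol|temblor)\b", re.IGNORECASE)

def filter_tweets(tweets):
    """tweets: list of dicts with 'text' and, optionally, 'coords'."""
    hits = [t for t in tweets if KEYWORDS.search(t["text"])]
    geo = [t for t in hits if t.get("coords")]
    return hits, geo
```

The geo-localized subset is the one that can densify the geophysical network, since each such tweet carries both an observation and a location.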

  15. An efficient rapid warning system for earthquakes in the European - Mediterranean region

    International Nuclear Information System (INIS)

    Mazet-Roux, G.; Bossu, R; Tome, M.; Giovambattista, R. Di

    2002-01-01

    Every year a few damaging earthquakes occur in the European-Mediterranean region. It is therefore indispensable to operate a real-time warning system in order to rapidly provide reliable estimates of the location, depth and magnitude of these seismic events. In order to provide this information in a timely manner both to the scientific community and to the European and national authorities dealing with natural hazards and relief organisation, the European-Mediterranean Seismological Centre (EMSC) has federated a network of seismic networks exchanging their data in quasi real-time. Today, thanks to the Internet, the EMSC receives real-time information about earthquakes from about thirty seismological institutes. As soon as data reach the EMSC, they are displayed on the EMSC Web pages (www.emsc-csem.org). A seismic alert is generated for any potentially damaging earthquake in the European-Mediterranean region and disseminated within one hour of its occurrence. Potentially damaging earthquakes are defined as seismic events of magnitude 5 or above in the European-Mediterranean region. The utility of this EMSC service is clearly demonstrated by its following among the public: about 300 institutions (ECHO, NGOs, civil defence services, seismological institutes) and individuals have subscribed to the EMSC e-mail dissemination list, and the rate of internet connections to the EMSC web site increases dramatically following an alert. The aim of this presentation is to give a complete technical description of the EMSC warning system. We will also take this opportunity to thank each of the contributing institutions for their support and efforts to enhance the system's performance. (authors)
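
    The alert criterion amounts to a magnitude threshold combined with a regional test. A toy version of that check might look as follows; the bounding-box coordinates are illustrative assumptions rather than the EMSC's actual region definition.

```python
# Toy alert criterion: magnitude 5 or above inside a Euro-Mediterranean
# bounding box triggers dissemination. The box limits are assumptions.
REGION = {"lat": (25.0, 60.0), "lon": (-15.0, 45.0)}

def should_alert(magnitude, lat, lon):
    (lat0, lat1), (lon0, lon1) = REGION["lat"], REGION["lon"]
    return magnitude >= 5.0 and lat0 <= lat <= lat1 and lon0 <= lon <= lon1
```

In the real system the gating condition is applied to the automatically merged location and magnitude estimates from the federated networks.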

  16. Rapid Source Characterization of the 2011 Mw 9.0 off the Pacific coast of Tohoku Earthquake

    Science.gov (United States)

    Hayes, Gavin P.

    2011-01-01

    On March 11th, 2011, a moment magnitude 9.0 earthquake struck off the coast of northeast Honshu, Japan, generating what may well turn out to be the most costly natural disaster ever. In the hours following the event, the U.S. Geological Survey National Earthquake Information Center led a rapid response to characterize the earthquake in terms of its location, size, faulting source, shaking and slip distributions, and population exposure, in order to place the disaster in a framework necessary for timely humanitarian response. As part of this effort, fast finite-fault inversions using globally distributed body- and surface-wave data were used to estimate the slip distribution of the earthquake rupture. Models generated within 7 hours of the earthquake origin time indicated that the event ruptured a fault up to 300 km long, roughly centered on the earthquake hypocenter, and involved peak slips of 20 m or more. Updates since this preliminary solution improve the details of this inversion solution and thus our understanding of the rupture process. However, significant observations such as the up-dip nature of rupture propagation and the along-strike length of faulting did not significantly change, demonstrating the usefulness of rapid source characterization for understanding the first order characteristics of major earthquakes.

  17. Rapid characterization of the 2015 Mw 7.8 Gorkha, Nepal, earthquake sequence and its seismotectonic context

    Science.gov (United States)

    Hayes, Gavin; Briggs, Richard; Barnhart, William D.; Yeck, William; McNamara, Daniel E.; Wald, David J.; Nealy, Jennifer; Benz, Harley M.; Gold, Ryan D.; Jaiswal, Kishor S.; Marano, Kristin; Earle, Paul S.; Hearne, Mike; Smoczyk, Gregory M.; Wald, Lisa A.; Samsonov, Sergey

    2015-01-01

    Earthquake response and related information products are important for placing recent seismic events into context and particularly for understanding the impact earthquakes can have on the regional community and its infrastructure. These tools are even more useful if they are available quickly, ahead of detailed information from the areas affected by such earthquakes. Here we provide an overview of the response activities and related information products generated and provided by the U.S. Geological Survey National Earthquake Information Center in association with the 2015 M 7.8 Gorkha, Nepal, earthquake. This group monitors global earthquakes 24 hrs/day, 7 days/week to provide rapid information on the location and size of recent events and to characterize the source properties, tectonic setting, and potential fatalities and economic losses associated with significant earthquakes. We present the timeline over which these products became available, discuss what they tell us about the seismotectonics of the Gorkha earthquake and its aftershocks, and examine how their information is used today, and might be used in the future, to help mitigate the impact of such natural disasters.

  18. Rapid estimation of the moment magnitude of the 2011 off the Pacific coast of Tohoku earthquake from coseismic strain steps

    Science.gov (United States)

    Itaba, S.; Matsumoto, N.; Kitagawa, Y.; Koizumi, N.

    2012-12-01

    The 2011 off the Pacific coast of Tohoku earthquake, of moment magnitude (Mw) 9.0, occurred at 14:46 Japan Standard Time (JST) on March 11, 2011. The coseismic strain steps caused by the fault slip of this earthquake were observed in the Tokai, Kii Peninsula and Shikoku regions by borehole strainmeters carefully installed by the Geological Survey of Japan, AIST. Using these strain steps, we estimated a fault model for the earthquake on the boundary between the Pacific and North American plates. Our model, estimated from only several minutes of strain data, is largely consistent with the final fault models estimated from GPS and seismic-wave data. The moment magnitude can be estimated about 6 minutes after the origin time, and 4 minutes after wave arrival. According to the fault model, the moment magnitude of the earthquake is 8.7. By contrast, the prompt magnitude that the Japan Meteorological Agency announced just after the earthquake, based on seismic waves, was 7.9. Coseismic strain steps are generally considered less reliable than seismic waves and GPS data. However, our results show that coseismic strain steps observed by carefully installed and monitored borehole strainmeters are reliable enough to determine the earthquake magnitude precisely and rapidly. Several methods are now being proposed to estimate the magnitude of a great earthquake earlier in order to reduce earthquake disasters, including tsunamis. Our simple method using strain steps is one strong candidate for rapid estimation of the magnitude of great earthquakes.
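    The last step of an estimate like this, going from a fault model to a moment magnitude, is standard: the seismic moment M0 = μAD (rigidity × rupture area × average slip) is converted to Mw with the Hanks-Kanamori relation. A minimal sketch, with illustrative fault dimensions (not the values of Itaba et al.):

```python
import math


def seismic_moment(rigidity_pa: float, area_m2: float, slip_m: float) -> float:
    """Seismic moment M0 = mu * A * D, in N*m."""
    return rigidity_pa * area_m2 * slip_m


def moment_magnitude(m0_nm: float) -> float:
    """Hanks-Kanamori moment magnitude: Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)


# Illustrative megathrust rupture: 400 km x 150 km, 7 m average slip,
# rigidity 40 GPa (all assumed round numbers, not the paper's fault model).
m0 = seismic_moment(4.0e10, 400e3 * 150e3, 7.0)
mw = moment_magnitude(m0)  # approximately Mw 8.75 for these inputs
```

The logarithmic relation is why an Mw 8.7 and an Mw 9.0 estimate, as in the abstract, differ by roughly a factor of 2.8 in seismic moment.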

  19. The rapid terrain visualization interferometric synthetic aperture radar sensor

    Science.gov (United States)

    Graham, Robert H.; Bickel, Douglas L.; Hensley, William H.

    2003-11-01

    The Rapid Terrain Visualization interferometric synthetic aperture radar was designed and built at Sandia National Laboratories as part of an Advanced Concept Technology Demonstration (ACTD) to "demonstrate the technologies and infrastructure to meet the Army requirement for rapid generation of digital topographic data to support emerging crisis or contingencies." This sensor is currently being operated by Sandia National Laboratories for the Joint Precision Strike Demonstration (JPSD) Project Office to provide highly accurate digital elevation models (DEMs) for military and civilian customers, both inside and outside of the United States. The sensor achieves better than DTED Level IV position accuracy in near real-time. The system is being flown on a deHavilland DHC-7 Army aircraft. This paper outlines some of the technologies used in the design of the system, discusses its performance, and addresses operational issues. In addition, we show results from recent flight tests, including high-accuracy maps of the San Diego area.

  20. Earthquake behavior of the Enriquillo fault zone, Haiti revealed by interactive terrain visualization

    Science.gov (United States)

    Cowgill, E.; Bernardin, T. S.; Oskin, M. E.; Bowles, C. J.; Yikilmaz, M. B.; Kreylos, O.; Elliott, A. J.; Bishop, M. S.; Gold, R. D.; Morelan, A.; Bawden, G. W.; Hamann, B.; Kellogg, L. H.

    2010-12-01

    The Mw 7.0 January 12, 2010 Haiti earthquake ended 240 years of relative quiescence following earthquakes that destroyed Port-au-Prince in 1751 and 1770. We place the 2010 rupture in the context of past earthquakes and future hazards by using remote analysis of airborne LiDAR to observe the topographic expression of active faulting and develop a new conceptual model for the earthquake behavior of the eastern Enriquillo fault zone (EFZ). In this model, the 2010 event occupies a long-lived segment boundary at a stepover within the EFZ separating fault segments that likely ruptured in 1751 and 1770, explaining both past clustering and the lack of 2010 surface rupture. Immediately following the 2010 earthquake, an airborne LiDAR point cloud containing over 2.7 billion point measurements of surface features was collected by the Rochester Inst. of Technology. To analyze these data, we capitalize on the human capacity to visually identify meaningful patterns embedded in noisy data by conducting interactive visual analysis of the entire 66.8 GB Haiti terrain data in a 4-sided, 800 ft3 immersive virtual-reality environment at the UC Davis KeckCAVES using the software tools LiDAR Viewer (to analyze point cloud data) and Crusta (for 3D surficial geologic mapping on DEM data). We discovered and measured landforms displaced by past surface-rupturing earthquakes and remotely characterized the regional fault geometry. Our analysis of the ~50 km long reach of EFZ spanning the 2010 epicenter indicates that geomorphic evidence of active faulting is clearer east of the epicenter than to the west. West of the epicenter, and in the region of the 2010 rupture, the fault is poorly defined along an embayed, low-relief range front, with little evidence of recent surface rupture. In contrast, landform offsets of 6 to 50 m along the reach of the EFZ east of the epicenter and closest to Port-au-Prince attest to repeated recent surface-rupturing earthquakes here. Specifically, we found and

  1. Rapid discrimination of visual scene content in the human brain

    Science.gov (United States)

    Anokhin, Andrey P.; Golosheykin, Simon; Sirevaag, Erik; Kristjansson, Sean; Rohrbaugh, John W.; Heath, Andrew C.

    2007-01-01

    The rapid evaluation of complex visual environments is critical for an organism's adaptation and survival. Previous studies have shown that emotionally significant visual scenes, both pleasant and unpleasant, elicit a larger late positive wave in the event-related brain potential (ERP) than emotionally neutral pictures. The purpose of the present study was to examine whether neuroelectric responses elicited by complex pictures discriminate between specific, biologically relevant contents of the visual scene and to determine how early in the picture processing this discrimination occurs. Subjects (n=264) viewed 55 color slides differing in both scene content and emotional significance. No categorical judgments or responses were required. Consistent with previous studies, we found that emotionally arousing pictures, regardless of their content, produce a larger late positive wave than neutral pictures. However, when pictures were further categorized by content, anterior ERP components in a time window between 200−600 ms following stimulus onset showed a high selectivity for pictures with erotic content compared to other pictures regardless of their emotional valence (pleasant, neutral, and unpleasant) or emotional arousal. The divergence of ERPs elicited by erotic and non-erotic contents started at 185 ms post-stimulus in the fronto-central midline regions, with a later onset in parietal regions. This rapid, selective, and content-specific processing of erotic materials and its dissociation from other pictures (including emotionally positive pictures) suggests the existence of a specialized neural network for prioritized processing of a distinct category of biologically relevant stimuli with high adaptive and evolutionary significance. PMID:16712815

  2. Letter to the Editor : Rapidly-deployed small tent hospitals: lessons from the earthquake in Haiti.

    Energy Technology Data Exchange (ETDEWEB)

    Rosen, Y.; Gurman, P.; Verna, E.; Elman, N.; Labor, E. (Materials Science Division); (Superior NanoBioSystems LLC); (Fast Israeli Rescue & Search Team); (Clinique Adonai); (Mass. Inst. Tech.); (Univ. Haifa)

    2012-06-01

    The damage to medical facilities resulting from the January 2010 earthquake in Haiti necessitated the establishment of field tent hospitals. Much of the local medical infrastructure was destroyed or operationally limited when the Fast Israeli Rescue and Search Team (FIRST) arrived in Haiti shortly after the January 2010 earthquake. The FIRST deployed small tent hospitals in Port-au-Prince and in 11 remote areas outside the city. Each tent was set up in less than half an hour. The tents were staffed with an orthopedic surgeon, gynecologists, primary care and emergency care physicians, a physician with previous experience in tropical medicine, nurses, paramedics, medics, and psychologists. The rapidly deployable and temporary nature of the effort allowed the team to treat, educate, and provide supplies for thousands of refugees throughout Haiti. In addition, a local Haitian physician and his team created a small tent hospital to serve the Petion Refugee Camp and its environs. FIRST personnel also took shifts at this hospital.

  3. Rapid Deterioration of Latent HBV Hepatitis during Cushing Disease and Posttraumatic Stress Disorder after Earthquake.

    Science.gov (United States)

    Tashiro, Ryosuke; Ogawa, Yoshikazu; Tominaga, Teiji

    2017-07-01

    Reactivation of the hepatitis B virus (HBV) is a risk for the 350 million HBV carriers worldwide. HBV reactivation may cause hepatocellular carcinoma, cirrhosis, and fulminant hepatitis, and HBV reactivation accompanied by malignant tumor and/or chemotherapy is a critical problem for patients with chronic HBV infection. Multiple risk factors causing an immunosuppressive state can also induce HBV reactivation. We present a case of HBV reactivation during an immunosuppressive state caused by Cushing disease and by physical and psychological stress after a disaster. A 47-year-old Japanese woman was an inactive HBV carrier until the Great East Japan Earthquake occurred and follow-up was discontinued. One year after the earthquake she had intractable hypertension, and her visual acuity gradually worsened. Head magnetic resonance imaging showed a sellar tumor compressing the optic chiasm, and hepatic dysfunction with HBV reactivation was identified. Endocrinologic examination established the diagnosis of Cushing disease. After normalization of hepatic function with antiviral therapy, transsphenoidal surgery achieved subtotal tumor removal, sparing only the right cavernous portion. Steroid hormone supplementation was discontinued after 3 days of administration, and gamma knife therapy was performed for the residual tumor. Eighteen months after the operation, adrenocorticotropic hormone and cortisol values returned to normal. The patient has been free from tumor regrowth and HBV reactivation throughout the postoperative course. Normalization of intrinsic steroid levels, with steroid supplementation minimized, should be established; precise operative procedures and careful treatment planning are essential to avoid HBV reactivation in patients with this threatening condition.

  4. Rapid visual and spectrophotometric nitrite detection by cyclometalated ruthenium complex.

    Science.gov (United States)

    Lo, Hoi-Shing; Lo, Ka-Wai; Yeung, Chi-Fung; Wong, Chun-Yuen

    2017-10-16

    Quantitative determination of the nitrite ion (NO2−) is of great importance in environmental and clinical investigations. A rapid visual and spectrophotometric assay for NO2− detection was developed based on a newly designed ruthenium complex, [Ru(npy)([9]aneS3)(CO)](ClO4) (denoted RuNPY; npy = 2-(1-naphthyl)pyridine, [9]aneS3 = 1,4,7-trithiacyclononane). This complex traps the NO+ produced in acidified NO2− solution and yields an observable color change within 1 min at room temperature. The assay features an excellent dynamic range (1-840 μmol L−1) and high selectivity, and its limit of detection (0.39 μmol L−1) is well below the guideline values for drinking water recommended by the WHO and U.S. EPA. Practical use of this assay in tap water and human urine was successfully demonstrated. Overall, the rapidity and selectivity of this assay overcome the problems suffered by the commonly used modified Griess assays for nitrite determination.
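    A limit of detection like the one quoted above is conventionally derived from a linear calibration: fit signal against concentration, then take LOD = 3.3·σ/slope, with σ the standard deviation of blank measurements. A generic sketch of that workflow (all numbers below are synthetic, not data from this paper):

```python
import statistics


def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for y = slope*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx


def limit_of_detection(blank_signals, slope):
    """LOD = 3.3 * sigma_blank / slope (a common IUPAC-style convention)."""
    return 3.3 * statistics.stdev(blank_signals) / slope


# Synthetic calibration: absorbance vs nitrite concentration (umol/L).
conc = [0, 10, 20, 40, 80]
absorbance = [0.002, 0.051, 0.100, 0.201, 0.399]
slope, intercept = linear_fit(conc, absorbance)

# Replicate blank measurements (synthetic).
blanks = [0.0018, 0.0022, 0.0025, 0.0019, 0.0021]
lod = limit_of_detection(blanks, slope)  # in umol/L for these inputs
```

The same fit also supplies the dynamic range check: the calibration is usable only over the concentration span where the response stays linear.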

  5. Fast 3D seismic wave simulations of 24 August 2016 Mw 6.0 central Italy earthquake for visual communication

    Directory of Open Access Journals (Sweden)

    Emanuele Casarotti

    2016-12-01

    We present the first application of a fast-reacting framework for 3D simulations of the seismic wave propagation generated by earthquakes of magnitude Mw ≥ 5 in the Italian region. The driving motivation is to offer a visualization of the natural phenomenon to the general public, but also to provide preliminary modeling to experts and civil protection operators. We report a description of this framework during the emergency of the 24 August 2016 Mw 6.0 central Italy earthquake, a discussion of the accuracy of the simulation for this seismic event, and a preliminary critical analysis of the visualization structure and of the public's reaction.

  6. Real-Time GPS Monitoring for Earthquake Rapid Assessment in the San Francisco Bay Area

    Science.gov (United States)

    Guillemot, C.; Langbein, J. O.; Murray, J. R.

    2012-12-01

    The U.S. Geological Survey Earthquake Science Center has deployed a network of eight real-time Global Positioning System (GPS) stations in the San Francisco Bay area and is implementing software applications to continuously evaluate the status of the deformation within the network. Real-time monitoring of the station positions is expected to provide valuable information for rapidly estimating source parameters should a large earthquake occur in the San Francisco Bay area. Because earthquake response applications require robust data access, as a first step we have developed a suite of web-based applications which are now routinely used to monitor the network's operational status and data streaming performance. The web tools provide continuously updated displays of important telemetry parameters such as data latency and receive rates, as well as source voltage and temperature information within each instrument enclosure. Automated software on the backend uses the streaming performance data to mitigate the impact of outages, radio interference and bandwidth congestion on deformation monitoring operations. A separate set of software applications manages the recovery of lost data due to faulty communication links. Displacement estimates are computed in real-time for various combinations of USGS, Plate Boundary Observatory (PBO) and Bay Area Regional Deformation (BARD) network stations. We are currently comparing results from two software packages (one commercial and one open-source) used to process 1-Hz data on the fly and produce estimates of differential positions. The continuous monitoring of telemetry makes it possible to tune the network to minimize the impact of transient interruptions of the data flow, from one or more stations, on the estimated positions. Ongoing work is focused on using data streaming performance history to optimize the quality of the position, reduce drift and outliers by switching to the best set of stations within the network, and

  7. SeismoDome: Sonic and visual representation of earthquakes and seismic waves in the planetarium

    Science.gov (United States)

    Holtzman, B. K.; Candler, J.; Repetto, D.; Pratt, M. J.; Paté, A.; Turk, M.; Gualtieri, L.; Peter, D. B.; Trakinski, V.; Ebel, D. S. S.; Gossmann, J.; Lem, N.

    2017-12-01

    Since 2014, we have produced four "Seismodome" public programs in the Hayden Planetarium at the American Museum of Natural History in New York City. To teach the general public about the dynamics of the Earth, we use a range of seismic data (seismicity catalogs, surface and body wave fields, ambient noise, free oscillations) to generate movies and sounds conveying aspects of the physics of earthquakes and seismic waves. The narrative aims to stretch people's sense of time and scale, starting with 2 billion years of convection, then zooming in seismicity over days to twenty years at different length scales, to hours of global seismic wave propagation, all compressed to minute long movies. To optimize the experience in the planetarium, the 180-degree fisheye screen corresponds directly to the surface of the Earth, such that the audience is inside the planet. The program consists of three main elements (1) Using sonified and animated seismicity catalogs, comparison of several years of earthquakes on different plate boundaries conveys the dramatic differences in their dynamics and the nature of great and "normal" earthquakes. (2) Animations of USArray data (based on "Ground Motion Visualizations" methods from IRIS but in 3D, with added sound) convey the basic observations of seismic wave fields, with which we raise questions about what they tell us about earthquake physics and the Earth's interior structure. (3) Movies of spectral element simulations of global seismic wave fields synchronized with sonified natural data push these questions further, especially when viewed from the interior of the planet. Other elements include (4) sounds of the global ambient noise field coupled to movies of mean ocean wave height (related to the noise source) and (5) three months of free oscillations / normal modes ringing after the Tohoku earthquake. We use and develop a wide range of sonification and animation methods, written mostly in python. Flat-screen versions of these movies

  8. Light Video Game Play is Associated with Enhanced Visual Processing of Rapid Serial Visual Presentation Targets.

    Science.gov (United States)

    Howard, Christina J; Wilding, Robert; Guest, Duncan

    2017-02-01

    There is mixed evidence that video game players (VGPs) demonstrate better performance in perceptual and attentional tasks than non-VGPs (NVGPs). The rapid serial visual presentation task is one such case, where observers respond to two successive targets embedded within a stream of serially presented items. We tested light VGPs (LVGPs) and NVGPs on this task. LVGPs were better at correctly identifying second targets whether or not they were also attempting to respond to the first target. This performance benefit for LVGPs suggests enhanced visual processing of briefly presented stimuli even with only very moderate game play. Observers were less accurate at discriminating the orientation of a second target within the stream if it occurred shortly after presentation of the first target; that is to say, they were subject to the attentional blink (AB). We find no evidence for any reduction in the AB in LVGPs compared with NVGPs.

  9. Flash sourcing, or rapid detection and characterization of earthquake effects through website traffic analysis

    Directory of Open Access Journals (Sweden)

    Laurent Frobert

    2011-06-01

    This study presents the latest developments of an approach called ‘flash sourcing’, which provides information on the effects of an earthquake within minutes of its occurrence. Information is derived from an analysis of the website traffic surges of the European–Mediterranean Seismological Centre website after felt earthquakes. These surges are caused by eyewitnesses to a felt earthquake, who are the first who are informed of, and hence the first concerned by, an earthquake occurrence. Flash sourcing maps the felt area, and at least in some circumstances, the regions affected by severe damage or network disruption. We illustrate how the flash-sourced information improves and speeds up the delivery of public earthquake information, and beyond seismology, we consider what it can teach us about public responses when experiencing an earthquake. Future developments should improve the description of the earthquake effects and potentially contribute to the improvement of the efficiency of earthquake responses by filling the information gap after the occurrence of an earthquake.
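    The core of flash sourcing, detecting an abnormal surge in website hits in the minutes after a felt earthquake, can be sketched as a rolling-baseline threshold test. The window length, threshold factor, and variance floor below are illustrative assumptions, not the EMSC's tuned values.

```python
import statistics
from typing import List


def detect_surge(hits_per_minute: List[int],
                 window: int = 30, k: float = 5.0) -> List[int]:
    """Return indices of minutes whose hit count exceeds the trailing
    baseline: mean + k * stdev over the previous `window` minutes.
    A floor of 1.0 on the stdev avoids false alarms in perfectly
    quiet periods."""
    surges = []
    for i in range(window, len(hits_per_minute)):
        baseline = hits_per_minute[i - window:i]
        mean = statistics.mean(baseline)
        sd = statistics.stdev(baseline)
        if hits_per_minute[i] > mean + k * max(sd, 1.0):
            surges.append(i)
    return surges
```

In the real system the interesting signal is also spatial: geolocating the surging visitors maps the felt area, which a single global counter cannot do.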

  10. Earthquake Magnitude and Shaking Intensity Dependent Fragility Functions for Rapid Risk Assessment of Buildings

    Directory of Open Access Journals (Sweden)

    Marie-José Nollet

    2018-01-01

    An integrated web application, referred to as ER2 for rapid risk evaluator, is under development to provide user-friendly seismic risk assessment for the non-expert public safety community. The assessment of likely negative consequences is based on pre-populated databases of seismic, building inventory and vulnerability parameters. To further accelerate the computation for near-real-time analyses, implicit building fragility curves were developed as functions of the magnitude and the intensity of the seismic shaking, defined with a single intensity measure, spectral acceleration at 1.0 s, implicitly accounting for epicentral distance and local soil conditions. Damage probabilities were compared with those obtained with the standard fragility functions, which explicitly consider epicentral distances and local site classes in addition to the earthquake magnitudes and respective intensity of the seismic shaking. Different seismic scenarios were considered, first for 53 building classes common in Eastern Canada, and then for a proposed reduced set of 24 combined building classes. Comparison of the results indicates that the damage predictions with implicit fragility functions for short (M ≤ 5.5) and medium (5.5 < M ≤ 7.5) strong-motion durations show low variation with distance and soil class, with an average error of less than 3.6%.
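    Fragility curves of this kind are commonly modeled as lognormal CDFs of the intensity measure: P(damage ≥ ds | Sa) = Φ(ln(Sa/θ)/β), with median capacity θ and dispersion β. A generic sketch follows; the parameter values are illustrative, not ER2's calibrated ones.

```python
import math


def lognormal_fragility(sa_g: float, theta_g: float, beta: float) -> float:
    """P(damage state reached or exceeded | spectral acceleration),
    modeled as the lognormal CDF Phi(ln(sa/theta) / beta).
    Phi is evaluated via the error function."""
    z = math.log(sa_g / theta_g) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))


# Illustrative curve: median capacity 0.3 g at Sa(1.0 s), dispersion 0.6.
p_low = lognormal_fragility(0.1, 0.3, 0.6)   # weak shaking
p_med = lognormal_fragility(0.3, 0.3, 0.6)   # at the median capacity
p_high = lognormal_fragility(0.9, 0.3, 0.6)  # strong shaking
```

By construction the probability is exactly 0.5 when Sa equals the median capacity θ, and the curve steepens as the dispersion β shrinks; an "implicit" fragility function in the paper's sense folds magnitude-dependent effects into these parameters.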

  11. Crowdsourcing Rapid Assessment of Collapsed Buildings Early after the Earthquake Based on Aerial Remote Sensing Image: A Case Study of Yushu Earthquake

    Directory of Open Access Journals (Sweden)

    Shuai Xie

    2016-09-01

    Remote sensing (RS) images play a significant role in disaster emergency response. Web 2.0 has changed the way data are created, making it possible for the public to participate in scientific issues. In this paper, an experiment is designed to evaluate the reliability of crowdsourced building-collapse assessment shortly after an earthquake, based on aerial remote sensing imagery. The procedure of RS data pre-processing and crowdsourcing data collection is presented. A probabilistic model combining maximum likelihood estimation (MLE), Bayes’ theorem and the expectation-maximization (EM) algorithm is applied to quantitatively estimate each participant’s error rate and the “ground truth” from multiple participants’ assessment results. An experimental area from the Yushu earthquake is used to present the results contributed by participants. Following the results, accuracy and variation among participants are discussed. The features of buildings labeled as the same damage type were found to be highly consistent, suggesting that building damage assessments contributed by crowdsourcing can be treated as reliable samples. This study shows the potential for rapid building-collapse assessment through crowdsourcing, with the “ground truth” quantitatively inferred from crowdsourcing data, in the early period after an earthquake.
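    The estimation scheme described (MLE via EM over multiple annotators, yielding per-participant error rates and a posterior "ground truth") is in the spirit of the classic Dawid-Skene model. Below is a minimal sketch for binary labels (collapsed / not collapsed) with a symmetric error rate per worker and a uniform prior; it illustrates the approach, not the authors' exact formulation.

```python
def em_ground_truth(votes, n_iter=30):
    """votes: one 0/1 label list per worker, one entry per building.
    Returns (posterior P(label=1) per building, error rate per worker)."""
    n_workers, n_items = len(votes), len(votes[0])
    # Initialize posteriors by majority vote to break symmetry.
    post = [sum(v[i] for v in votes) / n_workers for i in range(n_items)]
    err = [0.2] * n_workers
    for _ in range(n_iter):
        # M-step: error rate = expected fraction of mismatches with truth.
        for w in range(n_workers):
            e = sum(post[i] if votes[w][i] == 0 else 1.0 - post[i]
                    for i in range(n_items)) / n_items
            err[w] = min(max(e, 1e-6), 1.0 - 1e-6)
        # E-step: posterior for each item under a uniform prior.
        for i in range(n_items):
            like1 = like0 = 1.0
            for w in range(n_workers):
                if votes[w][i] == 1:
                    like1 *= 1.0 - err[w]
                    like0 *= err[w]
                else:
                    like1 *= err[w]
                    like0 *= 1.0 - err[w]
            post[i] = like1 / (like1 + like0)
    return post, err
```

Note that a consistently wrong annotator ends up with an error rate near 1 and so still contributes information (their votes are effectively inverted), which plain majority voting cannot exploit.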

  12. How citizen seismology is transforming rapid public earthquake information: the example of LastQuake smartphone application and Twitter QuakeBot

    Science.gov (United States)

    Bossu, R.; Etivant, C.; Roussel, F.; Mazet-Roux, G.; Steed, R.

    2014-12-01

    Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public. Wherever they are, users can be automatically informed when an earthquake has struck, simply by setting a magnitude threshold and an area of interest. There is no need to browse the internet: the information reaches them automatically and instantaneously. One question remains: are the provided earthquake notifications always relevant for the public? A while after damaging earthquakes, many eyewitnesses uninstall the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of notifications relate to unfelt earthquakes, only increasing anxiety among the population with each new update. Felt and damaging earthquakes are the ones of societal importance, even when of small magnitude. The LastQuake app and Twitter feed (QuakeBot) focus on these earthquakes that matter to the public by collating different information threads covering tsunamigenic, damaging and felt earthquakes. Non-seismic detections and macroseismic questionnaires collected online are combined to identify felt earthquakes regardless of their magnitude. Non-seismic detections include Twitter earthquake detection, developed by the USGS, in which the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking they have just felt. We will present the identification process for felt earthquakes, the smartphone application and the 27 automatically generated tweets, and how, by providing better public services, we collect more data from citizens.

  13. Rupture complexity of the Mw 8.3 sea of okhotsk earthquake: Rapid triggering of complementary earthquakes?

    Science.gov (United States)

    Wei, Shengji; Helmberger, Don; Zhan, Zhongwen; Graves, Robert

    2013-01-01

    We derive a finite slip model for the 2013 Mw 8.3 Sea of Okhotsk Earthquake (Z = 610 km) by inverting calibrated teleseismic P waveforms. The inversion shows that the earthquake ruptured on a 10° dipping rectangular fault zone (140 km × 50 km) and evolved into a sequence of four large sub-events (E1–E4) with an average rupture speed of 4.0 km/s. The rupture process can be divided into two main stages. The first propagated south, rupturing sub-events E1, E2, and E4. The second stage (E3) originated near E2 with a delay of 12 s and ruptured northward, filling the slip gap between E1 and E2. This kinematic process produces an overall slip pattern similar to that observed in shallow swarms, except it occurs over a compressed time span of about 30 s and without many aftershocks, suggesting that sub-event triggering for deep events is significantly more efficient than for shallow events.

  14. Some rapid and long traveled landslides triggered by the May 12, 2008 Sichuan earthquake

    Science.gov (United States)

    Wang, G.; Kamai, T.; Chigira, M.; Wu, X. Y.; Zhang, D. X.

    2009-04-01

    On May 12, 2008, an M 7.9 earthquake struck Sichuan province, China, causing a huge number of deaths and injuries and great loss of property, making it the most damaging earthquake in China since the 1976 Tangshan earthquake. The collapse of buildings during the earthquake was the main cause of the casualties. A huge number of landslides were triggered by this earthquake. Almost all the roads to the mountainous areas were blocked, and many dams were formed by the displaced landslide materials, resulting in great difficulties for the aftershock rescue activities. A large portion of the casualties was also directly caused by the landslides. The authors made reconnaissance field trips to the landslides and performed preliminary investigations of some of the catastrophic ones. In this report, four landslides are introduced: the Xiejiadian landslide in Pengzhou city, the Donghekou and Magongxiang landslides in Qingchuan County, and the Niujuangou landslide in the epicentral area of Yingxiu Town. The characteristics of the deposited landslide masses at the Donghekou landslide were investigated by means of a multichannel surface-wave technique. Two earthquake recorders were installed on the upper part and the deposit area of the Donghekou landslide. The seismic responses of different parts of the landslide were monitored and recorded successfully during the aftershocks that occurred in Qingchuan County on July 24, 2008. The drained and undrained dynamic shear behaviors of samples from the landslide areas were also examined. Some preliminary analysis results are presented in this report.

  15. Rapid Response Products of The ARIA Project for the M6.0 August 24, 2014 South Napa Earthquake

    Science.gov (United States)

    Yun, S. H.; Owen, S. E.; Hua, H.; Milillo, P.; Fielding, E. J.; Hudnut, K. W.; Dawson, T. E.; Mccrink, T. P.; Jo, M. J.; Barnhart, W. D.; Manipon, G. J. M.; Agram, P. S.; Moore, A. W.; Jung, H. S.; Webb, F.; Milillo, G.; Rosinski, A.

    2014-12-01

    A magnitude 6.0 earthquake struck southern Napa county northeast of San Francisco, California, on Aug. 24, 2014, causing significant damage in the city of Napa and nearby areas. One day after the earthquake, the Advanced Rapid Imaging and Analysis (ARIA) team produced and released observations of coseismic ground displacement measured with continuous GPS stations of the Plate Boundary Observatory (operated by UNAVCO for the National Science Foundation) and the Bay Area Rapid Deformation network (operated by Berkeley Seismological Laboratory). Three days after the earthquake (Aug. 27), the Italian Space Agency's (ASI) COSMO-SkyMed (CSK) satellite acquired their first post-event data. On the same day, the ARIA team, in collaboration with ASI and University of Basilicata, produced and released a coseismic interferogram that revealed ground deformation and surface rupture. The depiction of the surface rupture - discontinuities of color fringes in the CSK interferogram - helped guide field geologists from the US Geological Survey and the California Geological Survey (CGS) to features that may have otherwise gone undetected. Small-scale cracks were found on a runway of the Napa County Airport, as well as bridge damage and damaged roads. ARIA's response to this event highlighted the importance of timeliness for mapping surface deformation features. ARIA's rapid response products were shared through Southern California Earthquake Center's response website and the California Earthquake Clearinghouse. A damage proxy map derived from InSAR coherence of CSK data was produced and distributed on Aug. 27. Field crews from the CGS identified true and false positives, including mobile home damage, newly planted grape vines, and a cripple wall failure of a house. Finite fault slip models constrained from CSK interferograms and continuous GPS observations reveal a north-propagating rupture with well-resolved slip from 0-10.5 km depth. We also measured along-track coseismic

  16. Rapid influences of cued visual memories on attentional guidance

    NARCIS (Netherlands)

    van Moorselaar, D.; Battistoni, E.; Theeuwes, J.; Olivers, C.N.L.

    2015-01-01

    There is evidence that the deployment of attention can be biased by the content of visual working memory. Recently, it has been shown that focusing internal attention to a specific item in memory not only increases the accessibility of that specific item for retrieval, but also results in increased

  17. Rapid estimation of earthquake magnitude from the arrival time of the peak high‐frequency amplitude

    Science.gov (United States)

    Noda, Shunta; Yamamoto, Shunroku; Ellsworth, William L.

    2016-01-01

We propose a simple approach to measure earthquake magnitude M using the time difference (Top) between the body-wave onset and the arrival time of the peak high-frequency amplitude in an accelerogram. Measured in this manner, we find that Mw is proportional to 2logTop for earthquakes 5≤Mw≤7, which is the theoretical proportionality if Top is proportional to source dimension and stress drop is scale invariant. Using high-frequency (>2 Hz) data, the root mean square (rms) residual between Mw and MTop (M estimated from Top) is approximately 0.5 magnitude units. The rms residuals of the high-frequency data in passbands between 2 and 16 Hz are uniformly smaller than those obtained from the lower-frequency data. Top depends weakly on epicentral distance, and this dependence can be ignored for distances less than about 200 km. Retrospective application to the 2011 Tohoku earthquake produces a final magnitude estimate of M 9.0 at 120 s after the origin time. We conclude that Top of high-frequency (>2 Hz) accelerograms has value in the context of earthquake early warning for extremely large events.
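The scaling Mw ∝ 2logTop reduces to a one-line estimator once an intercept is calibrated. A sketch, with the constant c as an illustrative placeholder rather than the study's fitted value:

```python
import math

def m_top(top_seconds, c=5.0):
    """Magnitude proxy from Top, the time from body-wave onset to the
    peak high-frequency amplitude (seconds).

    Implements Mw ~ 2*log10(Top) + c; c is a placeholder intercept,
    not the value calibrated in the study.
    """
    return 2.0 * math.log10(top_seconds) + c
```

By construction, a tenfold increase in Top raises the estimate by exactly two magnitude units, the proportionality the abstract reports for 5≤Mw≤7.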

  18. Rapid extraction of lexical tone phonology in Chinese characters: a visual mismatch negativity study.

    Directory of Open Access Journals (Sweden)

    Xiao-Dong Wang

Full Text Available BACKGROUND: In alphabetic languages, emerging evidence from behavioral and neuroimaging studies shows the rapid and automatic activation of phonological information in visual word recognition. In the mapping from orthography to phonology, unlike most alphabetic languages in which there is a natural correspondence between the visual and phonological forms, in logographic Chinese the mapping between visual and phonological forms is rather arbitrary and depends on learning and experience. The issue of whether phonological information is rapidly and automatically extracted from Chinese characters by the brain has not yet been thoroughly addressed. METHODOLOGY/PRINCIPAL FINDINGS: We continuously presented Chinese characters differing in orthography and meaning to adult native Mandarin Chinese speakers to construct a constantly varying visual stream. In the stream, most stimuli were homophones of Chinese characters: the phonological features embedded in these visual characters were the same, including consonants, vowels and the lexical tone. Occasionally, the rule of phonology was randomly violated by characters whose phonological features differed in the lexical tone. CONCLUSIONS/SIGNIFICANCE: We showed that the violation of the lexical tone phonology evoked an early, robust visual response, as revealed by whole-head electrical recordings of the visual mismatch negativity (vMMN), indicating the rapid extraction of phonological information embedded in Chinese characters. Source analysis revealed that the vMMN involved neural activations of the visual cortex, suggesting that visual sensory memory is sensitive to phonological information embedded in visual words at an early processing stage.

  19. Rapid extraction of lexical tone phonology in Chinese characters: a visual mismatch negativity study.

    Science.gov (United States)

    Wang, Xiao-Dong; Liu, A-Ping; Wu, Yin-Yuan; Wang, Peng

    2013-01-01

In alphabetic languages, emerging evidence from behavioral and neuroimaging studies shows the rapid and automatic activation of phonological information in visual word recognition. In the mapping from orthography to phonology, unlike most alphabetic languages in which there is a natural correspondence between the visual and phonological forms, in logographic Chinese the mapping between visual and phonological forms is rather arbitrary and depends on learning and experience. The issue of whether phonological information is rapidly and automatically extracted from Chinese characters by the brain has not yet been thoroughly addressed. We continuously presented Chinese characters differing in orthography and meaning to adult native Mandarin Chinese speakers to construct a constantly varying visual stream. In the stream, most stimuli were homophones of Chinese characters: the phonological features embedded in these visual characters were the same, including consonants, vowels and the lexical tone. Occasionally, the rule of phonology was randomly violated by characters whose phonological features differed in the lexical tone. We showed that the violation of the lexical tone phonology evoked an early, robust visual response, as revealed by whole-head electrical recordings of the visual mismatch negativity (vMMN), indicating the rapid extraction of phonological information embedded in Chinese characters. Source analysis revealed that the vMMN involved neural activations of the visual cortex, suggesting that visual sensory memory is sensitive to phonological information embedded in visual words at an early processing stage.

  20. Rapid resumption of interrupted visual search. New insights on the interaction between vision and memory.

    Science.gov (United States)

    Lleras, Alejandro; Rensink, Ronald A; Enns, James T

    2005-09-01

    A modified visual search task demonstrates that humans are very good at resuming a search after it has been momentarily interrupted. This is shown by exceptionally rapid response time to a display that reappears after a brief interruption, even when an entirely different visual display is seen during the interruption and two different visual searches are performed simultaneously. This rapid resumption depends on the stability of the visual scene and is not due to display or response anticipations. These results are consistent with the existence of an iterative hypothesis-testing mechanism that compares information stored in short-term memory (the perceptual hypothesis) with information about the display (the sensory pattern). In this view, rapid resumption occurs because a hypothesis based on a previous glance of the scene can be tested very rapidly in a subsequent glance, given that the initial hypothesis-generation step has already been performed.

1. Visual Temporal Logic as a Rapid Prototyping Tool

    DEFF Research Database (Denmark)

    Fränzle, Martin; Lüth, Karsten

    2001-01-01

Within this survey article, we explain real-time symbolic timing diagrams and the ICOS tool-box supporting timing-diagram-based requirements capture and rapid prototyping. Real-time symbolic timing diagrams are a full-fledged metric-time temporal logic, but with a graphical syntax reminiscent of the informal timing diagrams widely used in electrical engineering. ICOS integrates a variety of tools, ranging from graphical specification editors over tautology checking and counterexample generation to code generators emitting C or VHDL, thus bridging the gap from formal specification to rapid prototype...

  2. A Flexible Visualization Tool for Rapid Access to EFIT Results

    International Nuclear Information System (INIS)

    Zhang Ruirui; Xiao Bingjia; Luo Zhengping

    2014-01-01

This paper introduces the design and implementation of an interactive tool, the EASTViewer, for the visualization of plasma equilibrium reconstruction results for EAST (the Experimental Advanced Superconducting Tokamak). To keep the tool independent of the operating system, Python, combined with the PyGTK toolkit, is used as the programming language. Using a modular design, the EASTViewer provides a unified interface with great flexibility. It can easily access numerous data sources, either from local data files or an MDSplus tree, and with pre-defined configuration files it can be extended to other tokamaks. The EASTViewer has been used as the major tool to visualize equilibrium data since the second EAST campaign in 2008, and it has been verified that the EASTViewer features a user-friendly interface, easy access to numerous data sources, and cross-platform operation. (fusion engineering)

  3. Rapid GNSS and Data Communication System Deployments In Chile and Argentina Following the M8.8 Maule Earthquake

    Science.gov (United States)

    Blume, F.; Meertens, C. M.; Brooks, B. A.; Bevis, M. G.; Smalley, R.; Parra, H.; Baez, J.

    2010-12-01

Because the signal is so big, great earthquakes allow us to make quantum leaps in our understanding of Earth deformation processes and material properties. The Maule earthquake, with its occurrence near a large subaerial landmass and the large numbers of instruments available to study it, will surely become one of the most important geophysical events in modern memory. Much of the important signal, however, decays and changes rapidly in the short term following the event, so a rapid response is necessary. Actually delivering the data from the CGPS response stations, however, represents an intellectual challenge in terms of properly matching the engineering realities with the scientific desiderata. We expect multiple major science advances to come from these data: (1) Understanding earthquake and tsunami-genesis via use of the coseismic displacement field to create the most well-constrained fault slip and tsunami-genesis models. (2) The role of stress loading on both the principal thrust plane and subsidiary planes. (3) The relationship of fault afterslip to the main event as well as to the distribution of aftershocks. (4) Study of large aftershocks jointly using conventional seismology and high-rate GPS coseismic displacement seismograms. (5) Rheological behavior of the fault interface. (6) The mechanical response of the bulk earth to large stress perturbations. Within 10 days of the earthquake, 20 complete GPS systems were delivered by UNAVCO personnel to IGM and OSU staff in Santiago, and 5 were shipped via diplomatic pouch to Argentina. Consisting of 10 Trimble NetRS and 15 Topcon GB-1000 receivers, the units were deployed throughout the affected area during the following three weeks, using welded-in-place steel tripod monuments driven into soil or drilled into bedrock, or steel masts.
Additional GPS hardware was procured from cooperating institutions and donated by GPS manufacturers, and a total of 43 post-earthquake GPS stations are continuously operating

  4. Effects of Temporal Congruity Between Auditory and Visual Stimuli Using Rapid Audio-Visual Serial Presentation.

    Science.gov (United States)

    An, Xingwei; Tang, Jiabei; Liu, Shuang; He, Feng; Qi, Hongzhi; Wan, Baikun; Ming, Dong

    2016-10-01

Combining visual and auditory stimuli in event-related potential (ERP)-based spellers has gained more attention in recent years. Few of these studies consider the differences in ERP components and system efficiency caused by shifting the visual and auditory onsets. Here, we aim to study the effect of temporal congruity of auditory and visual stimulus onsets on a bimodal brain-computer interface (BCI) speller. We designed five combined visual and auditory paradigms with different visual-to-auditory delays (-33 to +100 ms). Eleven participants took part in this study. ERPs were acquired and aligned according to visual and auditory stimulus onsets, respectively. ERPs at the Fz, Cz, and PO7 channels were studied through statistical analysis of the different conditions, for both visual-aligned and audio-aligned ERPs. Based on the visual-aligned ERPs, classification accuracy was also analyzed to assess the effects of visual-to-auditory delays. The latencies of ERP components depended mainly on visual stimulus onset. Auditory stimulus onset influenced mainly the early component accuracies, whereas visual stimulus onset determined the later component accuracies. The latter, however, played a dominant role in overall classification. This study is important for further studies to achieve better explanations and ultimately determine the way to optimize bimodal BCI applications.

  5. Rapid Retinal Release from a Cone Visual Pigment Following Photoactivation*

    Science.gov (United States)

    Chen, Min-Hsuan; Kuemmel, Colleen; Birge, Robert R.; Knox, Barry E.

    2012-01-01

As part of the visual cycle, the retinal chromophore in both rod and cone visual pigments undergoes reversible Schiff base hydrolysis and dissociation following photobleaching. We characterized light-activated retinal release from a short-wavelength sensitive cone pigment (VCOP) in 0.1% dodecyl maltoside using fluorescence spectroscopy. The half-time (t1/2) of retinal release from VCOP was 7.1 s, 250-fold faster than rhodopsin. VCOP exhibited pH-dependent release kinetics, with the t1/2 decreasing from 23 s at pH 4.1 to 4 s at pH 8. However, the Arrhenius activation energy (Ea) for VCOP derived from kinetic measurements between 4 °C and 20 °C was 17.4 kcal/mol, similar to the 18.5 kcal/mol for rhodopsin. There was a small kinetic isotope (D2O) effect in VCOP, but less than that observed in rhodopsin. Mutation of the primary Schiff base counterion (VCOP D108A) produced a pigment with an unprotonated chromophore (λmax = 360 nm) and dramatically slowed (t1/2 ~ 6.8 min) light-dependent retinal release. Using homology modeling, a VCOP mutant with two substitutions (S85D/D108A) was designed to move the counterion one alpha-helical turn into the transmembrane region from the native position. This double mutant had a UV-visible absorption spectrum consistent with a protonated Schiff base (λmax = 420 nm). Moreover, the VCOP S85D/D108A mutant had retinal release kinetics (t1/2 = 7 s) and Ea (18 kcal/mol) similar to the native pigment, exhibiting no pH dependence. By contrast, the single mutant VCOP S85D had a ~3-fold decrease in retinal release rate compared to the native pigment. Photoactivated VCOP D108A had kinetics comparable to a rhodopsin counterion mutant, Rho E113Q, both requiring hydroxylamine to fully release retinal. These results demonstrate that the primary counterion of cone visual pigments is necessary for efficient Schiff base hydrolysis. We discuss how the large differences in retinal release rates between rod and cone visual pigments arise, not from
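The activation energies quoted above come from Arrhenius fits of release rates against temperature. A two-point version of that estimate can be sketched as follows (the rate constants used in the check are synthetic, not the paper's measurements):

```python
import math

R_KCAL = 1.987e-3  # gas constant in kcal/(mol*K)

def activation_energy(k1, t1, k2, t2):
    """Two-point Arrhenius estimate of Ea (kcal/mol).

    k1, k2 are rate constants measured at absolute temperatures
    t1, t2 (K), assuming k = A * exp(-Ea / (R * T)).
    """
    return R_KCAL * math.log(k2 / k1) / (1.0 / t1 - 1.0 / t2)
```

Rates generated from a known Ea of 17.4 kcal/mol at 4 °C (277 K) and 20 °C (293 K) are recovered exactly, mirroring the fit described for VCOP.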

  6. Rapid modulation of spoken word recognition by visual primes.

    Science.gov (United States)

    Okano, Kana; Grainger, Jonathan; Holcomb, Phillip J

    2016-02-01

In a masked cross-modal priming experiment with ERP recordings, spoken Japanese words were primed with words written in one of the two syllabary scripts of Japanese. An early priming effect, peaking at around 200 ms after onset of the spoken word target, was seen in left lateral electrode sites for Katakana primes, and later effects were seen for both Hiragana and Katakana primes on the N400 ERP component. The early effect is thought to reflect the efficiency with which words in Katakana script make contact with sublexical phonological representations involved in spoken language comprehension, due to the particular way this script is used by Japanese readers. This demonstrates fast-acting influences of visual primes on the processing of auditory target words, and suggests that briefly presented visual primes can influence sublexical processing of auditory target words. The later N400 priming effects, on the other hand, most likely reflect cross-modal influences on activity at the level of whole-word phonology and semantics.

  7. A review of visual perception mechanisms that regulate rapid adaptive camouflage in cuttlefish.

    Science.gov (United States)

    Chiao, Chuan-Chin; Chubb, Charles; Hanlon, Roger T

    2015-09-01

    We review recent research on the visual mechanisms of rapid adaptive camouflage in cuttlefish. These neurophysiologically complex marine invertebrates can camouflage themselves against almost any background, yet their ability to quickly (0.5-2 s) alter their body patterns on different visual backgrounds poses a vexing challenge: how to pick the correct body pattern amongst their repertoire. The ability of cuttlefish to change appropriately requires a visual system that can rapidly assess complex visual scenes and produce the motor responses-the neurally controlled body patterns-that achieve camouflage. Using specifically designed visual backgrounds and assessing the corresponding body patterns quantitatively, we and others have uncovered several aspects of scene variation that are important in regulating cuttlefish patterning responses. These include spatial scale of background pattern, background intensity, background contrast, object edge properties, object contrast polarity, object depth, and the presence of 3D objects. Moreover, arm postures and skin papillae are also regulated visually for additional aspects of concealment. By integrating these visual cues, cuttlefish are able to rapidly select appropriate body patterns for concealment throughout diverse natural environments. This sensorimotor approach of studying cuttlefish camouflage thus provides unique insights into the mechanisms of visual perception in an invertebrate image-forming eye.

  8. Chess players' eye movements reveal rapid recognition of complex visual patterns: Evidence from a chess-related visual search task.

    Science.gov (United States)

    Sheridan, Heather; Reingold, Eyal M

    2017-03-01

    To explore the perceptual component of chess expertise, we monitored the eye movements of expert and novice chess players during a chess-related visual search task that tested anecdotal reports that a key differentiator of chess skill is the ability to visualize the complex moves of the knight piece. Specifically, chess players viewed an array of four minimized chessboards, and they rapidly searched for the target board that allowed a knight piece to reach a target square in three moves. On each trial, there was only one target board (i.e., the "Yes" board), and for the remaining "lure" boards, the knight's path was blocked on either the first move (the "Easy No" board) or the second move (i.e., "the Difficult No" board). As evidence that chess experts can rapidly differentiate complex chess-related visual patterns, the experts (but not the novices) showed longer first-fixation durations on the "Yes" board relative to the "Difficult No" board. Moreover, as hypothesized, the task strongly differentiated chess skill: Reaction times were more than four times faster for the experts relative to novices, and reaction times were correlated with within-group measures of expertise (i.e., official chess ratings, number of hours of practice). These results indicate that a key component of chess expertise is the ability to rapidly recognize complex visual patterns.

  9. A novel brain-computer interface based on the rapid serial visual presentation paradigm.

    Science.gov (United States)

    Acqualagna, Laura; Treder, Matthias Sebastian; Schreuder, Martijn; Blankertz, Benjamin

    2010-01-01

Most present-day visual brain-computer interfaces (BCIs) suffer from the fact that they rely on eye movements, are slow-paced, or feature a small vocabulary. As a potential remedy, we explored a novel BCI paradigm consisting of a central rapid serial visual presentation (RSVP) of the stimuli. It has a large vocabulary and realizes a BCI system based on covert non-spatial selective visual attention. In an offline study, eight participants were presented sequences of rapid bursts of symbols. Two different speeds and two different color conditions were investigated. Robust early visual and P300 components were elicited time-locked to the presentation of the target. Offline classification revealed a mean accuracy of up to 90% for selecting the correct symbol out of 30 possibilities. The results suggest that RSVP-BCI is a promising new paradigm, also for patients with oculomotor impairments.

  10. Interactive visual steering--rapid visual prototyping of a common rail injection system.

    Science.gov (United States)

    Matković, Kresimir; Gracanin, Denis; Jelović, Mario; Hauser, Helwig

    2008-01-01

Interactive steering with visualization has been a common goal of the visualization research community for twenty years, but it is rarely ever realized in practice. In this paper we describe a successful realization of a tightly coupled steering loop, integrating new simulation technology and interactive visual analysis in a prototyping environment for automotive industry system design. Due to increasing pressure on car manufacturers to meet new emission regulations, to improve efficiency, and to reduce noise, both simulation and visualization are pushed to their limits. Automotive system components, such as the powertrain system or the injection system, have an increasing number of parameters, and new design approaches are required. It is no longer possible to optimize such a system solely based on experience or forward optimization. By coupling interactive visualization with the simulation back-end (computational steering), it is now possible to quickly prototype a new system, starting from a non-optimized initial prototype and the corresponding simulation model. The prototyping continues through refinement of the simulation model and the simulation parameters, and through trial-and-error attempts, to an optimized solution. The ability to see first results early from a multidimensional simulation space - thousands of simulations are run for a multidimensional variety of input parameters - and to quickly go back into the simulation and request more runs in particular parameter regions of interest significantly improves the prototyping process and provides a deeper understanding of the system behavior. The excellent results which we achieved for the common rail injection system strongly suggest that our approach has great potential to be generalized to other, similar scenarios.
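The steering cycle described above, a coarse sweep followed by requesting more runs in a region of interest, can be sketched in a few lines. The simulate function below is a hypothetical stand-in for the expensive simulation back-end, and all names and parameters are illustrative:

```python
import random

def simulate(params):
    """Hypothetical stand-in for the simulation back-end: returns a
    quality score for a two-parameter design point (higher is better)."""
    x, y = params
    return -((x - 0.3) ** 2 + (y - 0.7) ** 2)

def steering_loop(n_coarse=200, n_refine=200, radius=0.1, seed=1):
    """Coarse sweep of the parameter space, then request more runs
    near the best region found, mimicking the visualize-then-refine
    cycle of computational steering."""
    rng = random.Random(seed)
    coarse = [(rng.random(), rng.random()) for _ in range(n_coarse)]
    best = max(coarse, key=simulate)
    refined = [(best[0] + rng.uniform(-radius, radius),
                best[1] + rng.uniform(-radius, radius))
               for _ in range(n_refine)]
    return max(coarse + refined, key=simulate)
```

In the paper's setting a human analyst, not `max()`, picks the region of interest from the visualization; the loop structure is the same.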

  11. Rapid visualization of latent fingermarks using gold seed-mediated enhancement

    Directory of Open Access Journals (Sweden)

    Chia-Hao Su

    2016-11-01

Full Text Available Abstract Background Fingermarks are one of the most important and useful forms of physical evidence in forensic investigations. However, latent fingermarks are not directly visible, but can be visualized due to the presence of other residues (such as inorganic salts, proteins, polypeptides, enzymes and human metabolites) which can be detected or recognized through various strategies. Convenient and rapid techniques are still needed to provide obvious contrast between the background and the fingermark ridges and thus visualize latent fingermarks with a high degree of selectivity and sensitivity. Results In this work, lysozyme-binding aptamer-conjugated Au nanoparticles (NPs) are used to recognize and target lysozyme in the fingermark ridges, and an Au+-complex solution is used as a growth agent to reduce Au+ to Au0 on the surface of the Au NPs. Distinct fingermark patterns were visualized on a range of forensically relevant substrates within 3 min; the resulting images could be observed by the naked eye without background interference. The entire process from fingermark collection to visualization entails only two steps and can be completed in less than 10 min. The proposed method provides cost and time savings over current fingermark visualization methods. Conclusions We report a simple, inexpensive, and fast method for the rapid visualization of latent fingermarks on non-porous substrates using Au seed-mediated enhancement, visible to the naked eye without the use of expensive or sophisticated instruments. The proposed approach offers faster detection and visualization of latent fingermarks than existing methods, and is expected to increase detection efficiency while reducing time requirements and costs for forensic investigations.

  12. Rapid Resumption of Interrupted Search Is Independent of Age-Related Improvements in Visual Search

    Science.gov (United States)

    Lleras, Alejandro; Porporino, Mafalda; Burack, Jacob A.; Enns, James T.

    2011-01-01

    In this study, 7-19-year-olds performed an interrupted visual search task in two experiments. Our question was whether the tendency to respond within 500 ms after a second glimpse of a display (the "rapid resumption" effect ["Psychological Science", 16 (2005) 684-688]) would increase with age in the same way as overall search efficiency. The…

  13. Detecting and Remembering Simultaneous Pictures in a Rapid Serial Visual Presentation

    Science.gov (United States)

    Potter, Mary C.; Fox, Laura F.

    2009-01-01

    Viewers can easily spot a target picture in a rapid serial visual presentation (RSVP), but can they do so if more than 1 picture is presented simultaneously? Up to 4 pictures were presented on each RSVP frame, for 240 to 720 ms/frame. In a detection task, the target was verbally specified before each trial (e.g., "man with violin"); in a…

  14. Reading Time Allocation Strategies and Working Memory Using Rapid Serial Visual Presentation

    Science.gov (United States)

    Busler, Jessica N.; Lazarte, Alejandro A.

    2017-01-01

    Rapid serial visual presentation (RSVP) is a useful method for controlling the timing of text presentations and studying how readers' characteristics, such as working memory (WM) and reading strategies for time allocation, influence text recall. In the current study, a modified version of RSVP (Moving Window RSVP [MW-RSVP]) was used to induce…

  15. Rapid modeling of complex multi-fault ruptures with simplistic models from real-time GPS: Perspectives from the 2016 Mw 7.8 Kaikoura earthquake

    Science.gov (United States)

    Crowell, B.; Melgar, D.

    2017-12-01

The 2016 Mw 7.8 Kaikoura earthquake is one of the most complex earthquakes in recent history, rupturing across at least 10 disparate faults with varying faulting styles and exhibiting intricate surface deformation patterns. The complexity of this event has motivated multidisciplinary geophysical studies of the underlying source physics to better inform future earthquake hazards models. However, events like Kaikoura raise the question of how well (or how poorly) such earthquakes can be modeled automatically in real time and still satisfy the general public and emergency managers. To investigate this question, we perform a retrospective real-time GPS analysis of the Kaikoura earthquake with the G-FAST early warning module. We first compute simple point-source models of the earthquake using peak ground displacement scaling and a coseismic-offset-based centroid moment tensor (CMT) inversion. We predict ground motions based on these point sources as well as on simple finite faults determined from source scaling studies, and validate them against observed peak ground acceleration and velocity. Second, we perform a slip inversion based upon the CMT fault orientations and forward-model near-field maximum expected tsunami wave heights for comparison against available tide gauge records. We find remarkably good agreement between recorded and predicted ground motions when using a simple fault plane, with the majority of the disagreement attributable to local site effects, not earthquake source complexity. Similarly, the near-field tsunami maximum amplitude predictions match tide gauge records well. We conclude that even though our models for the Kaikoura earthquake are devoid of rich source complexities, the CMT-driven finite fault is a good enough "average" source and provides useful constraints for rapid forecasting of ground motion and near-field tsunami amplitudes.
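PGD magnitude scaling of the kind G-FAST uses takes the form log10(PGD) = A + B·Mw + C·Mw·log10(R), which inverts in closed form for Mw. A sketch with illustrative coefficients (stand-ins in the style of published PGD scaling studies, not G-FAST's operational calibration):

```python
import math

def mw_from_pgd(pgd_cm, dist_km, A=-4.434, B=1.047, C=-0.138):
    """Invert log10(PGD) = A + B*Mw + C*Mw*log10(R) for Mw.

    pgd_cm: peak ground displacement (cm); dist_km: source distance R.
    The coefficient values are illustrative placeholders, not an
    operational calibration.
    """
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(dist_km))
```

At a fixed distance, larger observed PGD maps monotonically to larger Mw, which is what makes the relation usable as a fast, saturation-free magnitude estimator.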

  16. The population in China’s earthquake-prone areas has increased by over 32 million along with rapid urbanization

    Science.gov (United States)

    He, Chunyang; Huang, Qingxu; Dou, Yinyin; Tu, Wei; Liu, Jifu

    2016-07-01

    Accurate assessments of the population exposed to seismic hazard are crucial in seismic risk mapping. Recent rapid urbanization in China has resulted in substantial changes in the size and structure of the population exposed to seismic hazard. Using the latest population census data and seismic maps, this work investigated spatiotemporal changes in the exposure of the population in the most seismically hazardous areas (MSHAs) in China from 1990 to 2010. In the context of rapid urbanization and massive rural-to-urban migration, nearly one-tenth of the Chinese population in 2010 lived in MSHAs. From 1990 to 2010, the MSHA population increased by 32.53 million at a significantly higher rate of change (33.6%) than the national average rate (17.7%). The elderly population in MSHAs increased by 81.4%, which is much higher than the group’s national growth rate of 58.9%. Greater attention should be paid to the demographic changes in earthquake-prone areas in China.
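The growth figures above are simple percent changes over 1990-2010, and the reported rate together with the absolute increase implies the 1990 baseline. A quick check of that arithmetic:

```python
def percent_change(start, end):
    """Percent change from start to end."""
    return 100.0 * (end - start) / start

# Reported MSHA figures: an increase of 32.53 million at a 33.6% rate
# of change, implying a 1990 baseline of about 32.53 / 0.336 million.
increase_millions = 32.53
rate_percent = 33.6
implied_1990_millions = increase_millions / (rate_percent / 100.0)
```

The implied 1990 MSHA population is roughly 97 million, consistent with the statement that nearly one-tenth of the Chinese population lived in MSHAs by 2010.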

  17. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    Science.gov (United States)

    Zollo, Aldo

    2016-04-01

RISS S.r.l. is a spin-off company recently founded by the research group constituting the Seismology Laboratory of the Department of Physics of the University of Naples Federico II. RISS is an innovative start-up, built on its members' decade-long experience in earthquake monitoring systems and seismic data analysis, whose major goal is to transform the most recent innovations of scientific research into technological products and prototypes. With this aim, RISS has recently started the development of new software that manages and analyses seismic data and creates automatic earthquake bulletins. The software was initially developed to manage data recorded at the ISNet network (Irpinia Seismic Network), a network of seismic stations deployed in the Southern Apennines along the active fault system responsible for the November 23, 1980, MS 6.9 Irpinia earthquake. The software, however, is fully exportable and can be used to manage data from different networks, with any kind of station geometry or network configuration, and is able to provide reliable estimates of earthquake source parameters whatever the background seismicity level of the area of interest. Here we present the real-time automated procedures and the analyses performed by the software package, which is essentially a chain of modules, each aimed at the automatic computation of a specific source parameter. The P-wave arrival times are first detected on the real-time data stream, and the software then performs phase association and event binding. As soon as an event is automatically detected by the binder, the earthquake location coordinates and the origin time are rapidly estimated using a probabilistic, non-linear exploration algorithm. The software is then able to automatically provide three different magnitude estimates.
First, the local magnitude (Ml) is computed, using the peak-to-peak amplitude
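A local magnitude of the kind computed in this first step can be sketched with a Richter-style attenuation relation; the coefficients below follow the widely used Hutton and Boore (1987) form for Southern California and are illustrative stand-ins, not the ISNet calibration:

```python
import math

def local_magnitude(amp_mm, hypo_dist_km):
    """Ml from a Wood-Anderson-equivalent amplitude (mm) and
    hypocentral distance (km).

    Hutton & Boore (1987)-style attenuation terms; illustrative only,
    not the coefficients used by the software described above.
    """
    return (math.log10(amp_mm)
            + 1.110 * math.log10(hypo_dist_km / 100.0)
            + 0.00189 * (hypo_dist_km - 100.0)
            + 3.0)
```

By definition of this form, a 1 mm amplitude at 100 km yields Ml 3.0, and each tenfold increase in amplitude adds one magnitude unit.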

  18. Rapid steroid influences on visually guided sexual behavior in male goldfish

    Science.gov (United States)

    Lord, Louis-David; Bond, Julia; Thompson, Richmond R.

    2013-01-01

    The ability of steroid hormones to rapidly influence cell physiology through nongenomic mechanisms raises the possibility that these molecules may play a role in the dynamic regulation of social behavior, particularly in species in which social stimuli can rapidly influence circulating steroid levels. We therefore tested if testosterone (T), which increases in male goldfish in response to sexual stimuli, can rapidly influence approach responses towards females. Injections of T stimulated approach responses towards the visual cues of females 30–45 min after the injection but did not stimulate approach responses towards stimulus males or affect general activity, indicating that the effect is stimulus-specific and not a secondary consequence of increased arousal. Estradiol produced the same effect 30–45 min and even 10–25 min after administration, and treatment with the aromatase inhibitor fadrozole blocked exogenous T’s behavioral effect, indicating that T’s rapid stimulation of visual approach responses depends on aromatization. We suggest that T surges induced by sexual stimuli, including preovulatory pheromones, rapidly prime males to mate by increasing sensitivity within visual pathways that guide approach responses towards females and/or by increasing the motivation to approach potential mates through actions within traditional limbic circuits. PMID:19751737

  19. A portfolio of products from the rapid terrain visualization interferometric SAR

    Science.gov (United States)

    Bickel, Douglas L.; Doerry, Armin W.

    2007-04-01

    The Rapid Terrain Visualization interferometric synthetic aperture radar was designed and built at Sandia National Laboratories as part of an Advanced Concept Technology Demonstration (ACTD) to "demonstrate the technologies and infrastructure to meet the Army requirement for rapid generation of digital topographic data to support emerging crisis or contingencies." This sensor was built by Sandia National Laboratories for the Joint Programs Sustainment and Development (JPSD) Project Office to provide highly accurate digital elevation models (DEMs) for military and civilian customers, both inside and outside of the United States. The sensor achieved better than HRTe Level IV position accuracy in near real-time. The system was flown on a deHavilland DHC-7 Army aircraft. This paper presents a collection of images and data products from the Rapid Terrain Visualization interferometric synthetic aperture radar. The imagery includes orthorectified images and DEMs from the RTV interferometric SAR radar.

  20. Visualization of the fault slip connected with the West Bohemia earthquake swarms

    Czech Academy of Sciences Publication Activity Database

    Kolář, Petr; Růžek, Bohuslav; Boušková, Alena; Horálek, Josef

    2011-01-01

    Roč. 8, č. 2 (2011), s. 169-187 ISSN 1214-9705 R&D Projects: GA AV ČR(CZ) IAA300120805; GA ČR GAP210/10/1728 Institutional research plan: CEZ:AV0Z30120515 Keywords: West Bohemia earthquake swarm * fault slip * fault dynamics * asperity Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 0.530, year: 2011

  1. Easy and Rapid Detection of Mumps Virus by Live Fluorescent Visualization of Virus-Infected Cells.

    Directory of Open Access Journals (Sweden)

    Tadanobu Takahashi

    Full Text Available Mumps viruses show diverse cytopathic effects (CPEs) in infected cells and variable viral plaque formation (no CPE or no plaque formation in some cases), depending on the viral strain, highlighting the difficulty of mumps laboratory studies. In our previous study, a new sialidase substrate, 2-(benzothiazol-2-yl)-4-bromophenyl 5-acetamido-3,5-dideoxy-α-D-glycero-D-galacto-2-nonulopyranosidonic acid (BTP3-Neu5Ac), was developed for visualization of sialidase activity. BTP3-Neu5Ac enables easy and rapid histochemical fluorescent visualization of influenza viruses and virus-infected cells without an antiviral antibody or cell fixation. In the present study, the potential utility of BTP3-Neu5Ac for rapid detection of mumps virus was demonstrated. BTP3-Neu5Ac could visualize dot-blotted mumps virus, virus-infected cells, and plaques (called focuses here, because they were detected by staining of infected cells), even if a CPE was not observed. Furthermore, virus cultivation was possible by direct pick-up from a fluorescent focus. In conventional methods, visible appearance of the CPE and focuses often requires more than 6 days after infection, but the new method with BTP3-Neu5Ac clearly visualized infected cells after 2 days and focuses after 4 days. The BTP3-Neu5Ac assay is a precise, easy, and rapid assay for confirmation and titration of mumps virus.

  2. Visual Perceptual Echo Reflects Learning of Regularities in Rapid Luminance Sequences.

    Science.gov (United States)

    Chang, Acer Y-C; Schwartzman, David J; VanRullen, Rufin; Kanai, Ryota; Seth, Anil K

    2017-08-30

    A novel neural signature of active visual processing has recently been described in the form of the "perceptual echo", in which the cross-correlation between a sequence of randomly fluctuating luminance values and occipital electrophysiological signals exhibits a long-lasting periodic (∼100 ms cycle) reverberation of the input stimulus (VanRullen and Macdonald, 2012). As yet, however, the mechanisms underlying the perceptual echo and its function remain unknown. Reasoning that natural visual signals often contain temporally predictable, though nonperiodic features, we hypothesized that the perceptual echo may reflect a periodic process associated with regularity learning. To test this hypothesis, we presented subjects with successive repetitions of a rapid nonperiodic luminance sequence, and examined the effects on the perceptual echo, finding that echo amplitude linearly increased with the number of presentations of a given luminance sequence. These data suggest that the perceptual echo reflects a neural signature of regularity learning. Furthermore, when a set of repeated sequences was followed by a sequence with inverted luminance polarities, the echo amplitude decreased to the same level evoked by a novel stimulus sequence. Crucially, when the original stimulus sequence was re-presented, the echo amplitude returned to a level consistent with the number of presentations of this sequence, indicating that the visual system retained sequence-specific information, for many seconds, even in the presence of intervening visual input. Altogether, our results reveal a previously undiscovered regularity learning mechanism within the human visual system, reflected by the perceptual echo. SIGNIFICANCE STATEMENT How the brain encodes and learns fast-changing but nonperiodic visual input remains unknown, even though such visual input characterizes natural scenes. We investigated whether the phenomenon of "perceptual echo" might index such learning. The perceptual echo is a
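The echo measurement described above is, at its core, a cross-correlation between the random luminance sequence and the recorded occipital signal. A minimal sketch with a synthetic "EEG" in which the stimulus reappears at a 100 ms lag; the sampling rate, echo gain, and noise level are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 160                                  # sampling rate in Hz (illustrative)
stim = rng.standard_normal(6 * fs)        # 6 s random luminance sequence

# Toy occipital "EEG": the stimulus echoed at a 100 ms lag, plus noise
echo_lag = int(0.1 * fs)                  # 16 samples = 100 ms
eeg = 0.5 * np.roll(stim, echo_lag) + rng.standard_normal(stim.size)

# Cross-correlate stimulus with EEG at positive lags (stimulus leading)
lags = np.arange(1, int(0.3 * fs))
xcorr = np.array([np.corrcoef(stim[:-L], eeg[L:])[0, 1] for L in lags])
best = lags[xcorr.argmax()]
print(f"echo peak at {1000 * best / fs:.0f} ms")   # recovers the 100 ms lag
```

Averaging such cross-correlograms over many sequence presentations is what reveals the slow growth of echo amplitude with repetition.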

  3. Alpha-1 adrenergic receptors gate rapid orientation-specific reduction in visual discrimination.

    Science.gov (United States)

    Treviño, Mario; Frey, Sebastian; Köhr, Georg

    2012-11-01

    Prolonged imbalance in sensory experience leads to dramatic readjustments in cortical representation. Neuromodulatory systems play a critical role in habilitating experience-induced plasticity and regulate memory processes in vivo. Here, we show that a brief period of intense patterned visual stimulation combined with systemic activation of alpha-1 adrenergic neuromodulator receptors (α(1)-ARs) leads to a rapid, reversible, and NMDAR-dependent depression of AMPAR-mediated transmission from ascending inputs to layer II/III pyramidal cells in the visual cortex of young and adult mice. The magnitude of this form of α(1)-AR long-term depression (LTD), measured ex vivo with miniature EPSC recordings, is graded by the number of orientations used during visual experience. Moreover, behavioral tests of visual function following the induction of α(1)-AR LTD reveal that discrimination accuracy of sinusoidal drifting gratings is selectively reduced at high spatial frequencies in a reversible, orientation-specific, and NMDAR-dependent manner. Thus, α(1)-ARs enable rapid cortical synaptic depression which correlates with an orientation-specific decrease in visual discrimination. These findings contribute to our understanding of how adrenergic receptors interact with neuronal networks in response to changes in active sensory experience to produce adaptive behavior.

  4. The Effect of Orthographic Depth on Letter String Processing: The Case of Visual Attention Span and Rapid Automatized Naming

    Science.gov (United States)

    Antzaka, Alexia; Martin, Clara; Caffarra, Sendy; Schlöffel, Sophie; Carreiras, Manuel; Lallier, Marie

    2018-01-01

    The present study investigated whether orthographic depth can increase the bias towards multi-letter processing in two reading-related skills: visual attention span (VAS) and rapid automatized naming (RAN). VAS (i.e., the number of visual elements that can be processed at once in a multi-element array) was tested with a visual 1-back task and RAN…

  5. Rapid assessment of visual impairment in urban population of Delhi, India.

    Science.gov (United States)

    Gupta, Noopur; Vashist, Praveen; Malhotra, Sumit; Senjam, Suraj Singh; Misra, Vasundhara; Bhardwaj, Amit

    2015-01-01

    To determine the prevalence, causes and associated demographic factors related to visual impairment amongst the urban population of New Delhi, India, a population-based, cross-sectional study was conducted in East Delhi district using cluster random sampling methodology. This Rapid Assessment of Visual Impairment (RAVI) survey involved examination of all individuals aged 40 years and above in 24 randomly selected clusters of the district. Visual acuity (VA) assessment and comprehensive ocular examination were done during the door-to-door survey. A questionnaire was used to collect personal and demographic information of the study population. Blindness and visual impairment were defined on the basis of presenting VA. Of 2421 subjects enumerated, 2331 (96.3%) were available for ophthalmic examination. Among those examined, 49.3% were males. The prevalence of visual impairment (VI) in the study population was 11.4% (95% C.I. 10.1, 12.7) and that of blindness was 1.2% (95% C.I. 0.8, 1.6). Uncorrected refractive error was the leading cause of VI, accounting for 53.4% of all VI, followed by cataract (33.8%). With multivariable logistic regression, the odds of having VI increased with age (OR = 24.6 [95% C.I.: 14.9, 40.7]; p < 0.001). The burden of visual impairment is considerable in this region despite availability of adequate eye care facilities. Awareness generation and simple interventions like cataract surgery and provision of spectacles will help to eliminate the major causes of blindness and visual impairment in this region.
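As a quick sanity check, the reported interval around the 11.4% prevalence is reproduced by a simple 95% Wald interval for a proportion, treating the sample as simple random (the survey actually used cluster sampling, which would normally widen the interval by a design effect):

```python
import math

n = 2331          # subjects examined
p = 0.114         # reported prevalence of visual impairment

# 95% Wald interval for a proportion under simple random sampling
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(f"{100 * lo:.1f}%-{100 * hi:.1f}%")   # → 10.1%-12.7%, matching the report
```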

  6. New Perspectives on Active Tectonics: Observing Fault Motion, Mapping Earthquake Strain Fields, and Visualizing Seismic Events in Multiple Dimensions Using Satellite Imagery and Geophysical Data Base

    Science.gov (United States)

    Crippen, R.; Blom, R.

    1994-01-01

    By rapidly alternating displays of SPOT satellite images acquired on 27 July 1991 and 25 July 1992 we are able to see spatial details of terrain movements along fault breaks associated with the 28 June 1992 Landers, California earthquake that are virtually undetectable by any other means.

  7. The Temblor mobile seismic risk app, v2: Rapid and seamless earthquake information to inspire individuals to recognize and reduce their risk

    Science.gov (United States)

    Stein, R. S.; Sevilgen, V.; Sevilgen, S.; Kim, A.; Jacobson, D. S.; Lotto, G. C.; Ely, G.; Bhattacharjee, G.; O'Sullivan, J.

    2017-12-01

    Temblor quantifies and personalizes earthquake risk and offers solutions by connecting users with qualified retrofit and insurance providers. Temblor's daily blog on current earthquakes, seismic swarms, eruptions, floods, and landslides makes the science accessible to the public. Temblor is available on iPhone, Android, and mobile web app platforms (http://temblor.net). The app presents both scenario (worst case) and probabilistic (most likely) financial losses for homes and commercial buildings, and estimates the impact of seismic retrofit and insurance on the losses and safety. Temblor's map interface has clickable earthquakes (with source parameters and links) and active faults (name, type, and slip rate) around the world, and layers for liquefaction, landslides, tsunami inundation, and flood zones in the U.S. The app draws from the 2014 USGS National Seismic Hazard Model and the 2014 USGS Building Seismic Safety Council ShakeMap scenario database. The Global Earthquake Activity Rate (GEAR) model is used worldwide, with active faults displayed in 75 countries. The Temblor real-time global catalog is merged from global and national catalogs, with aftershocks discriminated from mainshocks. Earthquake notifications are issued to Temblor users within 30 seconds of their occurrence, with approximate locations and magnitudes that are rapidly refined in the ensuing minutes. Launched in 2015, Temblor has 650,000 unique users, including 250,000 in the U.S. and 110,000 in Chile, as well as 52,000 Facebook followers. All data shown in Temblor are gathered from authoritative or published sources and synthesized to be intuitive and actionable to the public.
Principal data sources include USGS, FEMA, EMSC, GEM Foundation, NOAA, GNS Science (New Zealand), INGV (Italy), PHIVOLCS (Philippines), GSJ (Japan), Taiwan Earthquake Model, EOS Singapore (Southeast Asia), MTA (Turkey), PB2003 (plate boundaries), CICESE (Baja California), California Geological Survey, and 20 other state

  8. Deep Residual Network Predicts Cortical Representation and Organization of Visual Features for Rapid Categorization.

    Science.gov (United States)

    Wen, Haiguang; Shi, Junxing; Chen, Wei; Liu, Zhongming

    2018-02-28

    The brain represents visual objects with topographic cortical patterns. To address how distributed visual representations enable object categorization, we established predictive encoding models based on a deep residual network and trained them to predict cortical responses to natural movies. Using this predictive model, we mapped human cortical representations of 64,000 visual objects from 80 categories with high throughput and accuracy. Such representations covered both the ventral and dorsal pathways, reflected multiple levels of object features, and preserved semantic relationships between categories. In the entire visual cortex, object representations were organized into three clusters of categories: biological objects, non-biological objects, and background scenes. At a finer scale specific to each cluster, object representations revealed sub-clusters for further categorization. Such hierarchical clustering of category representations was mostly contributed by cortical representations of object features from middle to high levels. In summary, this study demonstrates a useful computational strategy to characterize the cortical organization and representations of visual features for rapid categorization.

  9. Earthquakes in Action: Incorporating Multimedia, Internet Resources, Large-scale Seismic Data, and 3-D Visualizations into Innovative Activities and Research Projects for Today's High School Students

    Science.gov (United States)

    Smith-Konter, B.; Jacobs, A.; Lawrence, K.; Kilb, D.

    2006-12-01

    The most effective means of communicating science to today's "high-tech" students is through the use of visually attractive and animated lessons, hands-on activities, and interactive Internet-based exercises. To address these needs, we have developed Earthquakes in Action, a summer high school enrichment course offered through the California State Summer School for Mathematics and Science (COSMOS) Program at the University of California, San Diego. The summer course consists of classroom lectures, lab experiments, and a final research project designed to foster geophysical innovations, technological inquiries, and effective scientific communication (http://topex.ucsd.edu/cosmos/earthquakes). Course content includes lessons on plate tectonics, seismic wave behavior, seismometer construction, fault characteristics, California seismicity, global seismic hazards, earthquake stress triggering, tsunami generation, and geodetic measurements of the Earth's crust. Students are introduced to these topics through lectures-made-fun using a range of multimedia, including computer animations, videos, and interactive 3-D visualizations. These lessons are further reinforced through both hands-on lab experiments and computer-based exercises. Lab experiments included building hand-held seismometers, simulating the frictional behavior of faults using bricks and sandpaper, simulating tsunami generation in a mini-wave pool, and using the Internet to collect global earthquake data on a daily basis and map earthquake locations using a large classroom map. Students also use Internet resources like Google Earth and UNAVCO/EarthScope's Jules Verne Voyager Jr. interactive mapping tool to study Earth Science on a global scale. All computer-based exercises and experiments developed for Earthquakes in Action have been distributed to teachers participating in the 2006 Earthquake Education Workshop, hosted by the Visualization Center at Scripps Institution of Oceanography (http

  10. Visual and colorimetric methods for rapid determination of total tannins in vegetable raw materials

    Directory of Open Access Journals (Sweden)

    S. P. Kalinkina

    2016-01-01

    Full Text Available The article is dedicated to the development of a rapid colorimetric method for determining the total tannin content in aqueous extracts of vegetable raw materials. The sorption-colorimetric test is based on the sorption of tannins by polyurethane foam impregnated with FeCl3, the formation on its surface of reaction products colored black-green, and their determination in the sorbent matrix. Selectivity of the tannin determination is achieved through the specific interaction of polyphenols with iron(III) ions. The conditions of the sorption-colorimetric method were established: the concentration of iron(III) chloride impregnated in the polyurethane foam; the sorbent mass in the analytical cartridge; its degree of reagent loading; and the phase contact time. Color scales were developed for the visual determination of total tannins in terms of gallic acid. The digitized images of the scales were processed using the computer program "Sorbfil TLC", excluding a subjective assessment of the color intensity of the test scale. The total tannin content in aqueous extracts of vegetable raw materials was determined by the rapid method using tablets and analytical cartridges. The results of the test determination of tannins with visual and densitometric registration of the analytical signal were compared to known methods, and a metrological evaluation of the results of the sorption-colorimetric rapid determinations was carried out. The time for visual and densitometric rapid determination of tannins, including sample preparation, is 25–30 minutes, and the relative error does not exceed 28%. The developed test methods for quantifying the tannin content make it possible to dispense with sophisticated analytical equipment, carry out the analysis under non-laboratory conditions, and do not require highly skilled personnel.

  11. Rapid Recovery of Visual Function Associated with Blue Cone Ablation in Zebrafish

    Science.gov (United States)

    Hagerman, Gordon F.; Noel, Nicole C. L.; Cao, Sylvia Y.; DuVal, Michèle G.; Oel, A. Phillip; Allison, W. Ted

    2016-01-01

    Hurdles in the treatment of retinal degeneration include managing the functional rewiring of surviving photoreceptors and integration of any newly added cells into the remaining second-order retinal neurons. Zebrafish are the premier genetic model for such questions, and we present two new transgenic lines allowing us to contrast vision loss and recovery following conditional ablation of specific cone types: UV or blue cones. The ablation of each cone type proved to be thorough (killing 80% of cells in each intended cone class), specific, and cell-autonomous. We assessed the loss and recovery of vision in larvae via the optomotor behavioural response (OMR). This visually mediated behaviour decreased to about 5% or 20% of control levels following ablation of UV or blue cones, respectively (P < 0.001). Vision recovery following UV cone ablation was robust, as measured by both assays, returning to control levels within four days. In contrast, robust functional recovery following blue cone ablation was unexpectedly rapid, returning to normal levels within 24 hours after ablation. Ablation of cones led to increased proliferation in the retina, though the rapid recovery of vision following blue cone ablation was demonstrated to not be mediated by blue cone regeneration. Thus rapid visual recovery occurs following ablation of some, but not all, cone subtypes, suggesting an opportunity to contrast and dissect the sources and mechanisms of outer retinal recovery during cone photoreceptor death and regeneration. PMID:27893779

  12. Age-Related Declines in Early Sensory Memory: Identification of Rapid Auditory and Visual Stimulus Sequences.

    Science.gov (United States)

    Fogerty, Daniel; Humes, Larry E; Busey, Thomas A

    2016-01-01

    Age-related temporal-processing declines of rapidly presented sequences may involve contributions of sensory memory. This study investigated recall for rapidly presented auditory (vowel) and visual (letter) sequences presented at six different stimulus onset asynchronies (SOA) that spanned threshold SOAs for sequence identification. Younger, middle-aged, and older adults participated in all tasks. Results were investigated at both equivalent performance levels (i.e., SOA threshold) and at identical physical stimulus values (i.e., SOAs). For four-item sequences, results demonstrated best performance for the first and last items in the auditory sequences, but only the first item for visual sequences. For two-item sequences, adults identified the second vowel or letter significantly better than the first. Overall, when temporal-order performance was equated for each individual by testing at SOA thresholds, recall accuracy for each position across the age groups was highly similar. These results suggest that modality-specific processing declines of older adults primarily determine temporal-order performance for rapid sequences. However, there is some evidence for a second amodal processing decline in older adults related to early sensory memory for final items in a sequence. This selective deficit was observed particularly for longer sequence lengths and was not accounted for by temporal masking.

  13. RAPID EXTRACTION OF LANDSLIDE AND SPATIAL DISTRIBUTION ANALYSIS AFTER JIUZHAIGOU Ms7.0 EARTHQUAKE BASED ON UAV IMAGES

    OpenAIRE

    Q. S. Jiao; Y. Luo; W. H. Shen; Q. Li; X. Wang

    2018-01-01

    The Jiuzhaigou earthquake caused mountain collapses and triggered numerous landslides in the Jiuzhaigou scenic area and along surrounding roads, causing road blockages and serious ecological damage. Due to the urgency of the rescue, the authors carried an unmanned aerial vehicle (UAV) into the disaster area as early as August 9 to obtain aerial images near the epicenter. On the basis of summarizing the characteristics of earthquake landslides in aerial images, by using the object-oriented an...

  14. Rapid Eye Movements (REMs) and visual dream recall in both congenitally blind and sighted subjects

    Science.gov (United States)

    Bértolo, Helder; Mestre, Tiago; Barrio, Ana; Antona, Beatriz

    2017-08-01

    Our objective was to evaluate rapid eye movements (REMs) associated with visual dream recall in sighted subjects and congenitally blind subjects. During two consecutive nights, polysomnographic recordings were performed at the subjects' homes. REMs were detected by visual inspection of both EOG channels (EOG-H, EOG-V) and further classified as occurring in isolation or in bursts. Dream recall was defined by the existence of a dream report. The two groups were compared using t-tests and also a two-way ANOVA with a post-hoc Fisher test (for the factors diagnosis (blind vs. sighted) and dream recall (yes or no) as a function of time). The average number of REM awakenings per subject and the recall ability were identical in both groups. The congenitally blind (CB) had a lower REM density than the sighted controls (CS); the same applied to REM bursts and isolated eye movements. In the two-way ANOVA, REM bursts and REM density were significantly different for positive dream recall, mainly for the CB group and for diagnosis; furthermore, for both features significant results were obtained for the interaction of time, recall, and diagnosis; the interaction of recall and time was, however, stronger. In line with previous findings, the data show that the blind have lower REM density. However, the ability of dream recall in the congenitally blind and sighted controls is identical. In both groups, visual dream recall is associated with an increase in REM bursts and density. REM bursts also show differences in the temporal profile. Visual dream recall in REM sleep is associated with increased REM activity.

  15. Visual processing in rapid-chase systems: Image processing, attention, and awareness

    Directory of Open Access Journals (Sweden)

    Thomas eSchmidt

    2011-07-01

    Full Text Available Visual stimuli can be classified so rapidly that their analysis may be based on a single sweep of feedforward processing through the visuomotor system. Behavioral criteria for feedforward processing can be evaluated in response priming tasks where speeded pointing or keypress responses are performed towards target stimuli which are preceded by prime stimuli. We apply this method to several classes of complex stimuli. (1) When participants classify natural images into animals or non-animals, the time course of their pointing responses indicates that prime and target signals remain strictly sequential throughout all processing stages, meeting stringent behavioral criteria for feedforward processing (rapid-chase criteria). (2) Such priming effects are boosted by selective visual attention for positions, shapes, and colors, in a way consistent with bottom-up enhancement of visuomotor processing, even when primes cannot be consciously identified. (3) Speeded processing of phobic images is observed in participants specifically fearful of spiders or snakes, suggesting enhancement of feedforward processing by long-term perceptual learning. (4) When the perceived brightness of primes in complex displays is altered by means of illumination or transparency illusions, priming effects in speeded keypress responses can systematically contradict subjective brightness judgments, such that one prime appears brighter than the other but activates motor responses as if it was darker. We propose that response priming captures the output of the first feedforward pass of visual signals through the visuomotor system, and that this output lacks some characteristic features of more elaborate, recurrent processing. This way, visuomotor measures may become dissociated from several aspects of conscious vision. We argue that "fast" visuomotor measures predominantly driven by feedforward processing should supplement "slow" psychophysical measures predominantly based on visual

  16. Rapid processing of data based on high-performance algorithms for solving inverse problems and 3D-simulation of the tsunami and earthquakes

    Science.gov (United States)

    Marinin, I. V.; Kabanikhin, S. I.; Krivorotko, O. I.; Karas, A.; Khidasheli, D. G.

    2012-04-01

    We consider new techniques and methods for earthquake- and tsunami-related problems, particularly inverse problems for the determination of tsunami source parameters, numerical simulation of long-wave propagation in soil and water, and tsunami risk estimation. In addition, we touch upon the issues of database management and destruction scenario visualization. New approaches and strategies, as well as mathematical tools and software, are shown. The long joint investigations by researchers of the Institute of Mathematical Geophysics and Computational Mathematics SB RAS and specialists from WAPMERR and Informap have produced special theoretical approaches, numerical methods, and software for tsunami and earthquake modeling (modeling of propagation and run-up of tsunami waves on coastal areas), visualization, and risk estimation for tsunamis and earthquakes. Algorithms are developed for the operational determination of the origin and form of the tsunami source. The system TSS numerically simulates the source of a tsunami and/or earthquake and includes the possibility to solve both the direct and the inverse problem. It becomes possible to involve advanced mathematical results to improve models and to increase the resolution of inverse problems. Via TSS one can construct risk maps, online disaster scenarios, and estimates of potential damage to buildings and roads. One of the main tools for the numerical modeling is the finite volume method (FVM), which allows us to achieve stability with respect to possible input errors, as well as optimum computing speed. Our approach to the inverse problem of tsunami and earthquake determination is based on recent theoretical results concerning the Dirichlet problem for the wave equation. This problem is intrinsically ill-posed. We use the optimization approach to solve it and SVD-analysis to estimate the degree of ill-posedness and to find the quasi-solution. The software system we developed is intended to
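The SVD-based quasi-solution mentioned above, which stabilizes an ill-posed problem by truncating singular components that fall below the noise level, can be sketched on a toy linear system. The matrix, noise level, and truncation threshold here are invented for illustration and are not taken from the TSS system:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ill-posed linear problem A x = b: rapidly decaying singular values
n = 20
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 10.0 ** -np.arange(n, dtype=float)      # condition number ~1e19
A = U @ np.diag(s) @ V.T
x_true = rng.standard_normal(n)
b = A @ x_true + 1e-8 * rng.standard_normal(n)  # small data noise

# SVD analysis: the quasi-solution keeps only components whose singular
# values stand above the noise level, discarding the unstable directions
u, sv, vt = np.linalg.svd(A)
k = int(np.sum(sv > 1e-6))                  # truncation level (assumed threshold)
x_tsvd = vt[:k].T @ ((u[:, :k].T @ b) / sv[:k])

# A naive solve amplifies the data noise catastrophically; TSVD does not
naive = np.linalg.solve(A, b)
print(np.linalg.norm(x_tsvd - x_true) < np.linalg.norm(naive - x_true))
```

The spread of the singular values (here spanning 19 orders of magnitude) is exactly the "degree of ill-posedness" the SVD analysis quantifies.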

  17. Rapid and Parallel Adaptive Evolution of the Visual System of Neotropical Midas Cichlid Fishes.

    Science.gov (United States)

    Torres-Dowdall, Julián; Pierotti, Michele E R; Härer, Andreas; Karagic, Nidal; Woltering, Joost M; Henning, Frederico; Elmer, Kathryn R; Meyer, Axel

    2017-10-01

    Midas cichlid fish are a Central American species flock containing 13 described species that has been dated to only a few thousand years old, a historical timescale infrequently associated with speciation. Their radiation involved the colonization of several clear water crater lakes from two turbid great lakes. Therefore, Midas cichlids have been subjected to widely varying photic conditions during their radiation. Being a primary signal relay for information from the environment to the organism, the visual system is under continuing selective pressure and a prime organ system for accumulating adaptive changes during speciation, particularly in the case of dramatic shifts in photic conditions. Here, we characterize the full visual system of Midas cichlids at organismal and genetic levels, to determine what types of adaptive changes evolved within the short time span of their radiation. We show that Midas cichlids have a diverse visual system with unexpectedly high intra- and interspecific variation in color vision sensitivity and lens transmittance. Midas cichlid populations in the clear crater lakes have convergently evolved visual sensitivities shifted toward shorter wavelengths compared with the ancestral populations from the turbid great lakes. This divergence in sensitivity is driven by changes in chromophore usage, differential opsin expression, opsin coexpression, and to a lesser degree by opsin coding sequence variation. The visual system of Midas cichlids has the evolutionary capacity to rapidly integrate multiple adaptations to changing light environments. Our data may indicate that, in early stages of divergence, changes in opsin regulation could precede changes in opsin coding sequence evolution. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  18. Seismogeodesy of the 2014 Mw6.1 Napa earthquake, California: Rapid response and modeling of fast rupture on a dipping strike-slip fault

    Science.gov (United States)

    Melgar, Diego; Geng, Jianghui; Crowell, Brendan W.; Haase, Jennifer S.; Bock, Yehuda; Hammond, William C.; Allen, Richard M.

    2015-07-01

    Real-time high-rate geodetic data have been shown to be useful for rapid earthquake response systems during medium to large events. The 2014 Mw6.1 Napa, California earthquake is important because it provides an opportunity to study an event at the lower threshold of what can be detected with GPS. We show the results of GPS-only earthquake source products such as peak ground displacement magnitude scaling, centroid moment tensor (CMT) solution, and static slip inversion. We also highlight the retrospective real-time combination of GPS and strong motion data to produce seismogeodetic waveforms that have higher precision and longer period information than GPS-only or seismic-only measurements of ground motion. We show their utility for rapid kinematic slip inversion and conclude that it would have been possible, with current real-time infrastructure, to determine the basic features of the earthquake source. We supplement the analysis with strong motion data collected close to the source to obtain an improved postevent image of the source process. The model reveals unilateral fast propagation of slip to the north of the hypocenter with a delayed onset of shallow slip. The source model suggests that the multiple strands of observed surface rupture are controlled by the shallow soft sediments of Napa Valley and do not necessarily represent the intersection of the main faulting surface and the free surface. We conclude that the main dislocation plane is westward dipping and should intersect the surface to the east, either where the easternmost strand of surface rupture is observed or at the location where the West Napa fault has been mapped in the past.

  19. Study of Earthquake Disaster Prediction System of Langfang city Based on GIS

    Science.gov (United States)

    Huang, Meng; Zhang, Dian; Li, Pan; Zhang, YunHui; Zhang, RuoFei

    2017-07-01

    To improve China's earthquake disaster prevention capability, this paper presents an implementation plan for a GIS-based earthquake disaster prediction system for Langfang city. Built on a GIS spatial database and using coordinate transformation, GIS spatial analysis, and PHP development technologies, the system applies a seismic damage factor algorithm to predict damage to the city under earthquake disasters of different intensities. The system adopts a Browser/Server (B/S) architecture and provides two-dimensional visualization of damage degree and spatial distribution, comprehensive query and analysis, and efficient decision-support functions to identify seismically weak areas of the city and issue rapid warnings. The system has transformed the city's earthquake disaster reduction work from static planning to dynamic management and improved the city's earthquake and disaster prevention capability.

  20. CISN Display - Reliable Delivery of Real-time Earthquake Information, Including Rapid Notification and ShakeMap to Critical End Users

    Science.gov (United States)

    Rico, H.; Hauksson, E.; Thomas, E.; Friberg, P.; Given, D.

    2002-12-01

    The California Integrated Seismic Network (CISN) Display is part of a Web-enabled earthquake notification system alerting users in near real-time of seismicity, and also valuable geophysical information following a large earthquake. It will replace the Caltech/USGS Broadcast of Earthquakes (CUBE) and Rapid Earthquake Data Integration (REDI) Display as the principal means of delivering graphical earthquake information to users at emergency operations centers, and other organizations. Features distinguishing the CISN Display from other GUI tools are a stateful client/server relationship, a scalable message format supporting automated hyperlink creation, and a configurable platform-independent client with a GIS mapping tool; supporting the decision-making activities of critical users. The CISN Display is the front-end of a client/server architecture known as the QuakeWatch system. It comprises the CISN Display (and other potential clients), message queues, server, server "feeder" modules, and messaging middleware, schema and generators. It is written in Java, making it platform-independent, and offering the latest in Internet technologies. QuakeWatch's object-oriented design allows components to be easily upgraded through a well-defined set of application programming interfaces (APIs). Central to the CISN Display's role as a gateway to other earthquake products is its comprehensive XML schema. The message model starts with the CUBE message format, but extends it by provisioning additional attributes for currently available products, and those yet to be considered. The supporting metadata in the XML message provides the data necessary for the client to create a hyperlink and associate it with a unique event ID. Earthquake products deliverable to the CISN Display are ShakeMap, Ground Displacement, Focal Mechanisms, Rapid Notifications, OES Reports, and Earthquake Commentaries. Leveraging the power of the XML format, the CISN Display provides prompt access to
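    As an illustration of how such a message model supports automated hyperlink creation, the sketch below parses a hypothetical event message and builds product links keyed by event ID. The element and attribute names are invented for illustration only; the actual QuakeWatch XML schema is not given in this record.

```python
import xml.etree.ElementTree as ET

# Hypothetical message fragment: the real CISN/QuakeWatch schema differs.
msg = """
<event id="ci1234567">
  <magnitude>4.6</magnitude>
  <product type="ShakeMap" href="https://example.org/shakemap/ci1234567"/>
  <product type="FocalMechanism" href="https://example.org/fm/ci1234567"/>
</event>
"""

root = ET.fromstring(msg)
# Metadata in the message lets the client build hyperlinks
# and associate each with a unique event ID.
links = {p.get("type"): p.get("href") for p in root.iter("product")}
print(root.get("id"), sorted(links))  # ci1234567 ['FocalMechanism', 'ShakeMap']
```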

  1. Saccade-synchronized rapid attention shifts in macaque visual cortical area MT.

    Science.gov (United States)

    Yao, Tao; Treue, Stefan; Krishna, B Suresh

    2018-03-06

    While making saccadic eye-movements to scan a visual scene, humans and monkeys are able to keep track of relevant visual stimuli by maintaining spatial attention on them. This ability requires a shift of attentional modulation from the neuronal population representing the relevant stimulus pre-saccadically to the one representing it post-saccadically. For optimal performance, this trans-saccadic attention shift should be rapid and saccade-synchronized. Whether this is so is not known. We trained two rhesus monkeys to make saccades while maintaining covert attention at a fixed spatial location. We show that the trans-saccadic attention shift in cortical visual medial temporal (MT) area is well synchronized to saccades. Attentional modulation crosses over from the pre-saccadic to the post-saccadic neuronal representation by about 50 ms after a saccade. Taking response latency into account, the trans-saccadic attention shift is well timed to maintain spatial attention on relevant stimuli, so that they can be optimally tracked and processed across saccades.

  2. The left visual-field advantage in rapid visual presentation is amplified rather than reduced by posterior-parietal rTMS

    DEFF Research Database (Denmark)

    Verleger, Rolf; Möller, Friderike; Kuniecki, Michal

    2010-01-01

    In the present task, series of visual stimuli are rapidly presented left and right, containing two target stimuli, T1 and T2. In previous studies, T2 was better identified in the left than in the right visual field. This advantage of the left visual field might reflect dominance exerted by the right over the left hemisphere. If so, then repetitive transcranial magnetic stimulation (rTMS) to the right parietal cortex might release the left hemisphere from right-hemispheric control, thereby improving T2 identification in the right visual field. Alternatively or additionally, the asymmetry in T2 … either as effective or as sham stimulation. In two experiments, either one of these two factors, hemisphere and effectiveness of rTMS, was varied within or between participants. Again, T2 was much better identified in the left than in the right visual field. This advantage of the left visual field …

  3. PyContact: Rapid, Customizable, and Visual Analysis of Noncovalent Interactions in MD Simulations.

    Science.gov (United States)

    Scheurer, Maximilian; Rodenkirch, Peter; Siggel, Marc; Bernardi, Rafael C; Schulten, Klaus; Tajkhorshid, Emad; Rudack, Till

    2018-02-06

    Molecular dynamics (MD) simulations have become ubiquitous in all areas of life sciences. The size and model complexity of MD simulations are rapidly growing along with increasing computing power and improved algorithms. This growth has led to the production of a large amount of simulation data that need to be filtered for relevant information to address specific biomedical and biochemical questions. One of the most relevant molecular properties that can be investigated by all-atom MD simulations is the time-dependent evolution of the complex noncovalent interaction networks governing such fundamental aspects as molecular recognition, binding strength, and mechanical and structural stability. Extracting, evaluating, and visualizing noncovalent interactions is a key task in the daily work of structural biologists. We have developed PyContact, an easy-to-use, highly flexible, and intuitive graphical user interface-based application, designed to provide a toolkit to investigate biomolecular interactions in MD trajectories. PyContact is designed to facilitate this task by enabling identification of relevant noncovalent interactions in a comprehensible manner. The implementation of PyContact as a standalone application enables rapid analysis and data visualization without any additional programming requirements, and also preserves full in-program customization and extension capabilities for advanced users. The statistical analysis representation is interactively combined with full mapping of the results on the molecular system through the synergistic connection between PyContact and VMD. We showcase the capabilities and scientific significance of PyContact by analyzing and visualizing in great detail the noncovalent interactions underlying the ion permeation pathway of the human P2X3 receptor. As a second application, we examine the protein-protein interaction network of the mechanically ultrastable cohesin-dockerin complex. Copyright © 2017 Biophysical Society

  4. A Rapid Assessment of Instructional Strategies to Teach Auditory-Visual Conditional Discriminations to Children with Autism

    Science.gov (United States)

    Kodak, Tiffany; Clements, Andrea; LeBlanc, Brittany

    2013-01-01

    The purpose of the present investigation was to evaluate a rapid assessment procedure to identify effective instructional strategies to teach auditory-visual conditional discriminations to children diagnosed with autism. We replicated and extended previous rapid skills assessments (Lerman, Vorndran, Addison, & Kuhn, 2004) by evaluating the effects…

  5. New learning following reactivation in the human brain: targeting emotional memories through rapid serial visual presentation.

    Science.gov (United States)

    Wirkner, Janine; Löw, Andreas; Hamm, Alfons O; Weymar, Mathias

    2015-03-01

    Once reactivated, previously consolidated memories destabilize and have to be reconsolidated to persist, a process that might be altered non-invasively by interfering learning immediately after reactivation. Here, we investigated the influence of interference on brain correlates of reactivated episodic memories for emotional and neutral scenes using event-related potentials (ERPs). To selectively target emotional memories, we applied a new reactivation method: rapid serial visual presentation (RSVP). RSVP leads to enhanced implicit processing (pop-out) of the most salient memories, making them vulnerable to disruption. In line with this, interference after reactivation of previously encoded pictures disrupted recollection, particularly for emotional events. Furthermore, memory impairments were reflected in a reduced centro-parietal ERP old/new difference during retrieval of emotional pictures. These results provide neural evidence that emotional episodic memories in humans can be selectively altered through behavioral interference after reactivation, a finding with further clinical implications for the treatment of anxiety disorders. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Bilinear common spatial pattern for single-trial ERP-based rapid serial visual presentation triage

    Science.gov (United States)

    Yu, K.; Shen, K.; Shao, S.; Ng, W. C.; Li, X.

    2012-08-01

    Common spatial pattern (CSP) analysis is a useful tool for the feature extraction of event-related potentials (ERP). However, CSP is essentially time invariant, and thus unable to exploit the temporal information of ERP. This paper proposes a variant of CSP, namely bilinear common spatial pattern (BCSP), which is capable of accommodating both spatial and temporal information. BCSP generalizes CSP through iteratively optimizing bilinear filters. These bilinear filters constitute a spatio-temporal subspace in which the separation between two conditions is maximized. The method is unique in the sense that it is mathematically intuitive and simple, as all the bilinear filters are obtained by maximizing the power ratio as CSP does. The proposed method was evaluated on 20 subjects’ ERP data collected in rapid serial visual presentation triage experiments. The results show that BCSP achieved significantly higher average test accuracy (12.3% higher, p < 0.001).
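    As a sketch of the power-ratio criterion that BCSP generalizes, classical CSP filters can be computed from the two class covariance matrices via a generalized eigenvalue problem. This is a minimal NumPy illustration of plain (spatial-only) CSP, not the authors' iterative bilinear implementation:

```python
import numpy as np

def csp_filters(X1, X2, n_filters=2):
    """Common spatial pattern filters: directions w maximizing the power
    ratio w'C1w / w'(C1+C2)w between two conditions.
    X1, X2: lists of trials, each an array of shape (channels, samples)."""
    def avg_cov(trials):
        # Per-trial covariance, normalized by total power, then averaged
        covs = [x @ x.T / np.trace(x @ x.T) for x in trials]
        return np.mean(covs, axis=0)

    C1, C2 = avg_cov(X1), avg_cov(X2)
    # Generalized eigenvalue problem: C1 w = lambda (C1 + C2) w
    evals, evecs = np.linalg.eig(np.linalg.solve(C1 + C2, C1))
    order = np.argsort(evals.real)[::-1]  # descending power ratio
    W = evecs.real[:, order].T
    half = n_filters // 2
    # Take filters from both ends: max variance for class 1 and for class 2
    return np.vstack([W[:half], W[len(W) - (n_filters - half):]])

# Toy data: class 1 strongest on channel 0, class 2 strongest on channel 2
rng = np.random.default_rng(0)
X1 = [np.diag([3.0, 1.0, 0.5]) @ rng.standard_normal((3, 200)) for _ in range(20)]
X2 = [np.diag([0.5, 1.0, 3.0]) @ rng.standard_normal((3, 200)) for _ in range(20)]
W = csp_filters(X1, X2)
print(W.shape)  # (2, 3)
```

    BCSP replaces each purely spatial filter with a pair of spatial and temporal filters optimized alternately under the same power-ratio objective.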

  7. [Allocation of attentional resource and monitoring processes under rapid serial visual presentation].

    Science.gov (United States)

    Nishiura, K

    1998-08-01

    With the use of rapid serial visual presentation (RSVP), the present study investigated the cause of target intrusion errors and functioning of monitoring processes. Eighteen students participated in Experiment 1, and 24 in Experiment 2. In Experiment 1, different target intrusion errors were found depending on different kinds of letters --romaji, hiragana, and kanji. In Experiment 2, stimulus set size and context information were manipulated in an attempt to explore the cause of post-target intrusion errors. Results showed that as stimulus set size increased, the post-target intrusion errors also increased, but contextual information did not affect the errors. Results concerning mean report probability indicated that increased allocation of attentional resource to response-defining dimension was the cause of the errors. In addition, results concerning confidence rating showed that monitoring of temporal and contextual information was extremely accurate, but it was not so for stimulus information. These results suggest that attentional resource is different from monitoring resource.

  8. Rapid and sensitive detection of Didymella bryoniae by visual loop-mediated isothermal amplification assay

    Directory of Open Access Journals (Sweden)

    Xiefeng Yao

    2016-08-01

    Didymella bryoniae is a pathogenic fungus that causes gummy stem blight (GSB) in Cucurbitaceae crops (e.g., cantaloupe, muskmelon, cucumber, and watermelon). GSB produces lesions on the stems and leaves, and can also be spread by seeds. Here, we developed a rapid, visual, and sensitive loop-mediated isothermal amplification (LAMP) assay for D. bryoniae detection based on sequence-characterized amplified regions (GenBank accession nos GQ872461 and GQ872462) common to the two random amplification of polymorphic DNA group genotypes (RGI and RGII) of D. bryoniae; ideal conditions for detection were optimized for completion in 45 min at 63°C. The sensitivity and specificity of the LAMP assay were further analyzed in comparison with those of conventional polymerase chain reaction (PCR). The sensitivity of the LAMP assay was 1000-fold higher than that of conventional PCR, with a detection limit of 0.1 fg μL−1 of targeted DNA. The LAMP assay could be accomplished in about 45 min, with the results visible to the naked eye. The assay showed high specificity in discriminating all D. bryoniae isolates from seven other fungal pathogens that occur in Cucurbitaceae crops. The LAMP assay also detected D. bryoniae infection in young muskmelon leaves with suspected early symptoms of GSB disease. Hence, the technique has great potential for developing rapid and sensitive visual detection methods for the D. bryoniae pathogen in crops and seeds. This method has potential application in the early prediction of disease and reducing the risk of epidemics.

  9. Gaze-independent BCI-spelling using rapid serial visual presentation (RSVP).

    Science.gov (United States)

    Acqualagna, Laura; Blankertz, Benjamin

    2013-05-01

    A Brain Computer Interface (BCI) speller is a communication device, which can be used by patients suffering from neurodegenerative diseases to select symbols in a computer application. For patients unable to overtly fixate the target symbol, it is crucial to develop a speller independent of gaze shifts. In the present online study, we investigated rapid serial visual presentation (RSVP) as a paradigm for mental typewriting. We investigated the RSVP speller in three conditions, regarding the Stimulus Onset Asynchrony (SOA) and the use of color features. A vocabulary of 30 symbols was presented one-by-one in a pseudo random sequence at the same location of display. All twelve participants were able to successfully operate the RSVP speller. The results show a mean online spelling rate of 1.43 symb/min and a mean symbol selection accuracy of 94.8% in the best condition. We conclude that the RSVP is a promising paradigm for BCI spelling and its performance is competitive with the fastest gaze-independent spellers in literature. The RSVP speller does not require gaze shifts towards different target locations and can be operated by non-spatial visual attention; therefore it can be considered a valid paradigm in applications for patients with impaired oculomotor control. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  10. Images from the Mind: BCI image reconstruction based on Rapid Serial Visual Presentations of polygon primitives

    Directory of Open Access Journals (Sweden)

    Luís F Seoane

    2015-04-01

    We provide a proof of concept for an EEG-based reconstruction of a visual image which is on a user's mind. Our approach is based on the Rapid Serial Visual Presentation (RSVP) of polygon primitives and Brain-Computer Interface (BCI) technology. In an experimental setup, subjects were presented bursts of polygons: some of them contributed to building a target image (because they matched the shape and/or color of the target) while some of them did not. The presentation of the contributing polygons triggered attention-related EEG patterns. These Event Related Potentials (ERPs) could be determined using BCI classification and could be matched to the stimuli that elicited them. These stimuli (i.e., the ERP-correlated polygons) were accumulated in the display until a satisfactory reconstruction of the target image was reached. As more polygons were accumulated, finer visual details were attained, resulting in more challenging classification tasks. In our experiments, we observe an average classification accuracy of around 75%. An in-depth investigation suggests that many of the misclassifications were not misinterpretations of the BCI concerning the users' intent, but rather caused by ambiguous polygons that could contribute to reconstructing several different images. When we put our BCI-image reconstruction in perspective with other RSVP BCI paradigms, there is large room for improvement both in speed and accuracy. These results invite us to be optimistic. They open a plethora of possibilities to explore non-invasive BCIs for image reconstruction both in healthy and impaired subjects and, accordingly, suggest interesting recreational and clinical applications.

  11. Visual loop-mediated isothermal amplification (LAMP) for the rapid diagnosis of Enterocytozoon hepatopenaei (EHP) infection.

    Science.gov (United States)

    Sathish Kumar, T.; Navaneeth Krishnan, A.; Joseph Sahaya Rajan, J.; Makesh, M.; Jithendran, K. P.; Alavandi, S. V.; Vijayan, K. K.

    2018-05-01

    The emerging microsporidian parasite Enterocytozoon hepatopenaei (EHP), the causative agent of hepatopancreatic microsporidiosis, has been widely reported in shrimp-farming countries. EHP infection can be detected by light microscopy observation of spores (1.7 × 1 μm) in stained hepatopancreas (HP) tissue smears, HP tissue sections, and fecal samples. EHP can also be detected by polymerase chain reaction (PCR) targeting the small subunit (SSU) ribosomal RNA (rRNA) gene or the spore wall protein (SWP) gene. In this study, a rapid, sensitive, specific, closed-tube visual loop-mediated isothermal amplification (LAMP) protocol combined with FTA cards was developed for the diagnosis of EHP. LAMP primers were designed based on the SSU rRNA gene of EHP. The target sequence of EHP was amplified at a constant temperature of 65 °C for 45 min, and the amplified LAMP products were visually detected in a closed-tube system using SYBR™ Green I dye. The detection limit of this LAMP protocol was ten copies. Field and clinical applicability of this assay was evaluated using 162 field samples, comprising 106 HP tissue samples and 56 fecal samples collected from shrimp farms. Out of the 162 samples, EHP was detected in 62 (47 HP samples and 15 fecal samples). Compared with SWP-PCR as the gold standard, the EHP LAMP assay had 95.31% sensitivity, 98.98% specificity, and a kappa value of 0.948. This simple, closed-tube, clinically evaluated visual LAMP assay has great potential for diagnosing EHP at the farm level, particularly under low-resource circumstances.
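    For reference, the reported agreement statistics can be reproduced from a 2×2 confusion matrix against the SWP-PCR gold standard. The counts used below (61 true positives, 1 false positive, 3 false negatives, 97 true negatives) are not stated in the abstract; they are inferred as the only integers consistent with 162 samples, 62 LAMP positives, 95.31% sensitivity, and 98.98% specificity. A minimal Python sketch:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and Cohen's kappa for a test
    scored against a gold standard."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    p_obs = (tp + tn) / n  # observed agreement
    # Chance agreement from the marginals of the 2x2 table
    p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_obs - p_exp) / (1 - p_exp)
    return sensitivity, specificity, kappa

# Counts inferred from the abstract's summary figures
sens, spec, kappa = diagnostic_metrics(tp=61, fp=1, fn=3, tn=97)
print(f"{sens:.4f} {spec:.4f} {kappa:.3f}")  # 0.9531 0.9898 0.948
```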

  12. Detection of Bar Transgenic Sugarcane with a Rapid and Visual Loop-Mediated Isothermal Amplification Assay.

    Science.gov (United States)

    Zhou, Dinggang; Wang, Chunfeng; Li, Zhu; Chen, Yun; Gao, Shiwu; Guo, Jinlong; Lu, Wenying; Su, Yachun; Xu, Liping; Que, Youxiong

    2016-01-01

    Genetic engineering offers an attractive alternative in sugarcane breeding for increasing cane and sugar yields as well as disease and insect resistance. The herbicide tolerance conferred by the bar gene is a useful agronomic trait for weed control in transgenic sugarcane. In this study, a loop-mediated isothermal amplification (LAMP) assay for rapid detection of the bar gene in transgenic sugarcane has been developed and evaluated. A set of six primers was designed for LAMP-based amplification of the bar gene. The LAMP reaction conditions were optimized as follows: 5.25 mM of Mg(2+), a 6:1 ratio of inner vs. outer primer, and 6.0 U of Bst DNA polymerase in a reaction volume of 25.0 μL. The detection limit for the recombinant plasmid 1Ac0229 was as low as 10 copies in the developed LAMP assay, which was 10-fold more sensitive than conventional PCR. In 100 putative transgenic lines, the bar gene was detected in 100/100 cases (100%) by LAMP and in 97/100 cases (97%) by conventional PCR. In conclusion, the developed LAMP assay is visual, rapid, sensitive, reliable, and cost-effective for detection of bar-specific transgenic sugarcane.

  13. Emotional noun processing: an ERP study with rapid serial visual presentation.

    Directory of Open Access Journals (Sweden)

    Shengnan Yi

    Reading is an important part of our daily life, and rapid responses to emotional words have received a great deal of research interest. Our study employed rapid serial visual presentation to detect the time course of emotional noun processing using event-related potentials. We performed a dual-task experiment, where subjects were required to judge whether a given number was odd or even, and the category into which each emotional noun fit. In terms of P1, we found that there was no negativity bias for emotional nouns. However, emotional nouns elicited larger amplitudes in the N170 component in the left hemisphere than did neutral nouns. This finding indicated that in later processing stages, emotional words can be discriminated from neutral words. Furthermore, positive, negative, and neutral words were different from each other in the late positive complex, indicating that in the third stage, even different emotions can be discerned. Thus, our results indicate that in a three-stage model the latter two stages are more stable and universal.

  14. Emotional noun processing: an ERP study with rapid serial visual presentation.

    Science.gov (United States)

    Yi, Shengnan; He, Weiqi; Zhan, Lei; Qi, Zhengyang; Zhu, Chuanlin; Luo, Wenbo; Li, Hong

    2015-01-01

    Reading is an important part of our daily life, and rapid responses to emotional words have received a great deal of research interest. Our study employed rapid serial visual presentation to detect the time course of emotional noun processing using event-related potentials. We performed a dual-task experiment, where subjects were required to judge whether a given number was odd or even, and the category into which each emotional noun fit. In terms of P1, we found that there was no negativity bias for emotional nouns. However, emotional nouns elicited larger amplitudes in the N170 component in the left hemisphere than did neutral nouns. This finding indicated that in later processing stages, emotional words can be discriminated from neutral words. Furthermore, positive, negative, and neutral words were different from each other in the late positive complex, indicating that in the third stage, even different emotions can be discerned. Thus, our results indicate that in a three-stage model the latter two stages are more stable and universal.

  15. Detection of bar transgenic sugarcane with a rapid and visual loop-mediated isothermal amplification assay

    Directory of Open Access Journals (Sweden)

    Dinggang Zhou

    2016-03-01

    Genetic engineering offers an attractive alternative in sugarcane breeding for increasing cane and sugar yields as well as disease and insect resistance. The herbicide tolerance conferred by the bar gene is a useful agronomic trait for weed control in transgenic sugarcane. In this study, a loop-mediated isothermal amplification (LAMP) assay for rapid detection of the bar gene in transgenic sugarcane has been developed and evaluated. A set of six primers was designed for LAMP-based amplification of the bar gene. The LAMP reaction conditions were optimized as follows: 5.25 mM of Mg2+, a 6:1 ratio of inner vs. outer primer, and 6.0 U of Bst DNA polymerase in a reaction volume of 25.0 μL. The detection limit for the recombinant plasmid 1Ac0229 was as low as 10 copies in the developed LAMP assay, which was 10-fold more sensitive than conventional PCR. In 100 putative transgenic lines, the bar gene was detected in 100/100 cases (100%) by LAMP and in 97/100 cases (97%) by conventional PCR. In conclusion, the developed LAMP assay is visual, rapid, sensitive, reliable, and cost-effective for detection of bar-specific transgenic sugarcane.

  16. Adjustment to subtle time constraints and power law learning in rapid serial visual presentation

    Directory of Open Access Journals (Sweden)

    Jacqueline Chakyung Shin

    2015-11-01

    We investigated whether attention could be modulated through the implicit learning of temporal information in a rapid serial visual presentation (RSVP) task. Participants identified two target letters among numeral distractors. The stimulus-onset asynchrony immediately following the first target (SOA1) varied at three levels (70, 98, and 126 ms), either randomly between trials or fixed within blocks of trials. Practice over three consecutive days resulted in a continuous improvement in the identification rate for both targets and attenuation of the attentional blink (AB), a decrement in target (T2) identification when presented 200-400 ms after another target (T1). Blocked SOA1s led to a faster rate of improvement in RSVP performance and more target order reversals relative to random SOA1s, suggesting that the implicit learning of SOA1 positively affected performance. The results also reveal power law learning curves for individual target identification as well as the reduction in the AB decrement. These learning curves reflect the spontaneous emergence of skill through subtle attentional modulations rather than general attentional distribution. Together, the results indicate that implicit temporal learning can improve high-level, rapid cognitive processing, and they highlight the sensitivity and adaptability of the attentional system to subtle constraints in stimulus timing.

  17. Toward real-time regional earthquake simulation of Taiwan earthquakes

    Science.gov (United States)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  18. Fat Content Modulates Rapid Detection of Food: A Visual Search Study Using Fast Food and Japanese Diet

    OpenAIRE

    Sawada, Reiko; Sato, Wataru; Toichi, Motomi; Fushiki, Tohru

    2017-01-01

    Rapid detection of food is crucial for the survival of organisms. However, previous visual search studies have reported discrepant results regarding the detection speeds for food vs. non-food items; some experiments showed faster detection of food than non-food, whereas others reported null findings concerning any speed advantage for the detection of food vs. non-food. Moreover, although some previous studies showed that fat content can affect visual attention for food, the effect of fat cont...

  19. Improved Neural Signal Classification in a Rapid Serial Visual Presentation Task Using Active Learning.

    Science.gov (United States)

    Marathe, Amar R; Lawhern, Vernon J; Wu, Dongrui; Slayback, David; Lance, Brent J

    2016-03-01

    The application space for brain-computer interface (BCI) technologies is rapidly expanding with improvements in technology. However, most real-time BCIs require extensive individualized calibration prior to use, and systems often have to be recalibrated to account for changes in the neural signals due to a variety of factors including changes in human state, the surrounding environment, and task conditions. Novel approaches to reduce calibration time or effort will dramatically improve the usability of BCI systems. Active Learning (AL) is an iterative semi-supervised learning technique for learning in situations in which data may be abundant, but labels for the data are difficult or expensive to obtain. In this paper, we apply AL to a simulated BCI system for target identification using data from a rapid serial visual presentation (RSVP) paradigm to minimize the amount of training samples needed to initially calibrate a neural classifier. Our results show AL can produce similar overall classification accuracy with significantly less labeled data (in some cases less than 20%) when compared to alternative calibration approaches. In fact, AL classification performance matches performance of 10-fold cross-validation (CV) in over 70% of subjects when training with less than 50% of the data. To our knowledge, this is the first work to demonstrate the use of AL for offline electroencephalography (EEG) calibration in a simulated BCI paradigm. While AL itself is not often amenable for use in real-time systems, this work opens the door to alternative AL-like systems that are more amenable for BCI applications and thus enables future efforts for developing highly adaptive BCI systems.
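    The core AL loop described here, pool-based uncertainty sampling, can be sketched with a stand-in linear classifier. This is a minimal NumPy illustration on toy data; the paper's actual EEG classifier and query strategy may differ:

```python
import numpy as np

def train_logreg(X, y, lr=0.1, epochs=200):
    """Tiny logistic-regression trainer standing in for the neural classifier."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)  # gradient ascent on log-likelihood
    return w

def uncertainty_sampling(X_pool, y_pool, n_init=10, n_queries=30):
    """Repeatedly query the label of the sample the current model is
    least certain about (predicted probability closest to 0.5)."""
    rng = np.random.default_rng(1)
    labeled = list(rng.choice(len(X_pool), n_init, replace=False))
    for _ in range(n_queries):
        w = train_logreg(X_pool[labeled], y_pool[labeled])
        p = 1.0 / (1.0 + np.exp(-X_pool @ w))
        p[labeled] = 1.0  # never re-query an already-labeled sample
        labeled.append(int(np.argmin(np.abs(p - 0.5))))
    return train_logreg(X_pool[labeled], y_pool[labeled]), labeled

# Toy two-class "EEG feature" pool with a bias column
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
X = np.hstack([X, np.ones((200, 1))])
y = np.repeat([0, 1], 100).astype(float)
w, labeled = uncertainty_sampling(X, y)
acc = np.mean(((1.0 / (1.0 + np.exp(-X @ w))) > 0.5) == y)
print(len(labeled), acc)
```

    The point of the strategy is that only the queried samples (10 seed + 30 queries here) ever need labels, while the classifier is evaluated over the whole pool.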

  20. Prioritized Identification of Attractive and Romantic Partner Faces in Rapid Serial Visual Presentation.

    Science.gov (United States)

    Nakamura, Koyo; Arai, Shihoko; Kawabata, Hideaki

    2017-11-01

    People are sensitive to facial attractiveness because it is an important biological and social signal. As such, our perceptual and attentional system seems biased toward attractive faces. We tested whether attractive faces capture attention and enhance memory access in an involuntary manner using a dual-task rapid serial visual presentation (dtRSVP) paradigm, wherein multiple faces were successively presented for 120 ms. In Experiment 1, participants (N = 26) were required to identify two female faces embedded in a stream of animal faces as distractors. The results revealed that identification of the second female target (T2) was better when it was attractive compared to neutral or unattractive. In Experiment 2, we investigated whether perceived attractiveness affects T2 identification (N = 27). To this end, we performed another dtRSVP task involving participants in a romantic partnership with the opposite sex, wherein T2 was their romantic partner's face. The results demonstrated that a romantic partner's face was correctly identified more often than was the face of a friend or unknown person. Furthermore, the greater the intensity of passionate love participants felt for their partner (as measured by the Passionate Love Scale), the more often they correctly identified their partner's face. Our experiments indicate that attractive and romantic partners' faces facilitate the identification of the faces in an involuntary manner.

  1. The Onset and Time Course of Semantic Priming during Rapid Recognition of Visual Words

    Science.gov (United States)

    Hoedemaker, Renske S.; Gordon, Peter C.

    2016-01-01

    In two experiments, we assessed the effects of response latency and task-induced goals on the onset and time course of semantic priming during rapid processing of visual words as revealed by ocular response tasks. In Experiment 1 (Ocular Lexical Decision Task), participants performed a lexical decision task using eye-movement responses on a sequence of four words. In Experiment 2, the same words were encoded for an episodic recognition memory task that did not require a meta-linguistic judgment. For both tasks, survival analyses showed that the earliest-observable effect (Divergence Point or DP) of semantic priming on target-word reading times occurred at approximately 260 ms, and ex-Gaussian distribution fits revealed that the magnitude of the priming effect increased as a function of response time. Together, these distributional effects of semantic priming suggest that the influence of the prime increases when target processing is more effortful. This effect does not require that the task include a metalinguistic judgment; manipulation of the task goals across experiments affected the overall response speed but not the location of the DP or the overall distributional pattern of the priming effect. These results are more readily explained as the result of a retrospective rather than a prospective priming mechanism and are consistent with compound-cue models of semantic priming. PMID:28230394
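
    The ex-Gaussian fits mentioned above model each reaction-time distribution as a Gaussian convolved with an exponential, so a priming effect that grows with response time shows up mainly in the exponential tail parameter. A minimal sketch of that density (the parameter values in the usage note are arbitrary illustrations, not the paper's estimates):

```python
import math

def exgauss_pdf(x, mu, sigma, tau):
    """Ex-Gaussian (exponentially modified Gaussian) density: the
    convolution of Normal(mu, sigma) with Exponential(mean tau),
    widely used to characterize reaction-time distributions."""
    z = (mu - x) / tau + sigma ** 2 / (2 * tau ** 2)
    arg = (mu - x + sigma ** 2 / tau) / (sigma * math.sqrt(2))
    return math.exp(z) * math.erfc(arg) / (2 * tau)
```

    The distribution's mean is mu + tau; e.g., with mu = 500 ms, sigma = 50 ms, and tau = 100 ms, the mean RT is 600 ms, and a priming effect concentrated in slow responses would appear chiefly as a reduction in tau.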

  2. Convolutional Neural Network for Multi-Category Rapid Serial Visual Presentation BCI

    Directory of Open Access Journals (Sweden)

    Ran Manor

    2015-12-01

    Brain computer interfaces rely on machine learning algorithms to decode the brain's electrical activity into decisions. For example, in rapid serial visual presentation (RSVP) tasks, the subject is presented with a continuous stream of images containing rare target images among standard images, while the algorithm has to detect brain activity associated with target images. Here, we continue our previous work, presenting a deep neural network model for single-trial EEG classification in RSVP tasks. Deep neural networks have shown state-of-the-art performance in computer vision and speech recognition and thus hold great promise for other learning tasks, like classification of EEG samples. In our model, we introduce a novel spatio-temporal regularization for EEG data to reduce overfitting. We show improved classification performance compared to our earlier work on a five-category RSVP experiment. In addition, we compare performance on data from different sessions and validate the model on a public benchmark data set of a P300 speller task. Finally, we discuss the advantages of using neural network models compared to manually designing feature extraction algorithms.
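
    The abstract does not spell out the form of the spatio-temporal regularizer. One generic form such a penalty can take is to discourage first-layer EEG filter weights from varying abruptly across neighboring electrodes and neighboring time samples; the sketch below is purely illustrative and is not the authors' formulation.

```python
import numpy as np

def smoothness_penalty(W, lam_space=1e-3, lam_time=1e-3):
    """A generic spatio-temporal smoothness penalty for a first-layer
    EEG weight tensor W of shape (filters, channels, time): penalize
    squared differences between weights at neighboring electrodes and
    at neighboring time samples. Hypothetical sketch, not the paper's
    exact regularizer."""
    d_space = np.diff(W, axis=1)  # differences across spatial channels
    d_time = np.diff(W, axis=2)   # differences across time samples
    return lam_space * np.sum(d_space ** 2) + lam_time * np.sum(d_time ** 2)
```

    Added to the classification loss during training, a term of this kind biases the network toward spatially and temporally smooth filters, which is one common way to reduce overfitting on small EEG data sets.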

  3. Rapid assessment of visual impairment (RAVI) in marine fishing communities in South India - study protocol and main findings

    Directory of Open Access Journals (Sweden)

    Madala Sreenivas R

    2011-09-01

    Abstract. Background: Reliable data are a pre-requisite for planning eye care services. Though conventional cross-sectional studies provide reliable information, they are resource intensive. A novel rapid assessment method was used to investigate the prevalence and causes of visual impairment and presbyopia in subjects aged 40 years and older. This paper describes the detailed methodology and study procedures of the Rapid Assessment of Visual Impairment (RAVI) project. Methods: A population-based cross-sectional study was conducted using cluster random sampling in the coastal region of Prakasam district of Andhra Pradesh in India, predominantly inhabited by fishing communities. Unaided, aided and pinhole visual acuity (VA) was assessed using a Snellen chart at a distance of 6 meters. The VA was re-assessed using a pinhole if VA was … Results: The data collection was completed in … Conclusion: There is a high prevalence of visual impairment in marine fishing communities in Prakasam district in India. The data from this rapid assessment survey can now be used as a baseline to start eye care services in this region. The rapid assessment methodology (RAVI) reported in this paper is robust, quick and has the potential to be replicated in other areas.

  4. Neural Correlates of Word Recognition: A Systematic Comparison of Natural Reading and Rapid Serial Visual Presentation.

    Science.gov (United States)

    Kornrumpf, Benthe; Niefind, Florian; Sommer, Werner; Dimigen, Olaf

    2016-09-01

    Neural correlates of word recognition are commonly studied with (rapid) serial visual presentation (RSVP), a condition that eliminates three fundamental properties of natural reading: parafoveal preprocessing, saccade execution, and the fast changes in attentional processing load occurring from fixation to fixation. We combined eye-tracking and EEG to systematically investigate the impact of all three factors on brain-electric activity during reading. Participants read lists of words either actively with eye movements (eliciting fixation-related potentials) or maintained fixation while the text moved passively through foveal vision at a matched pace (RSVP-with-flankers paradigm, eliciting ERPs). The preview of the upcoming word was manipulated by changing the number of parafoveally visible letters. Processing load was varied by presenting words of varying lexical frequency. We found that all three factors have strong interactive effects on the brain's responses to words: Once a word was fixated, occipitotemporal N1 amplitude decreased monotonically with the amount of parafoveal information available during the preceding fixation; hence, the N1 component was markedly attenuated under reading conditions with preview. Importantly, this preview effect was substantially larger during active reading (with saccades) than during passive RSVP with flankers, suggesting that the execution of eye movements facilitates word recognition by increasing parafoveal preprocessing. Lastly, we found that the N1 component elicited by a word also reflects the lexical processing load imposed by the previously inspected word. Together, these results demonstrate that, under more natural conditions, words are recognized in a spatiotemporally distributed and interdependent manner across multiple eye fixations, a process that is mediated by active motor behavior.

  5. Development and utilization of USGS ShakeCast for rapid post-earthquake assessment of critical facilities and infrastructure

    Science.gov (United States)

    Wald, David J.; Lin, Kuo-wan; Kircher, C.A.; Jaiswal, Kishor; Luco, Nicolas; Turner, L.; Slosky, Daniel

    2017-01-01

    The ShakeCast system is an openly available, near real-time post-earthquake information management system. ShakeCast is widely used by public and private emergency planners and responders, lifeline utility operators, and transportation engineers to automatically receive and process ShakeMap products for situational awareness, inspection prioritization, or damage assessment of their own infrastructure or building portfolios. The success of ShakeCast to date and its broad, critical user base mandate improved software usability and functionality, including improved engineering-based damage and loss functions. In order to make the software more accessible to novice users, while still drawing on advanced users' technical and engineering background, we have developed a "ShakeCast Workbook", a well-documented, Excel spreadsheet-based user interface that allows users to input notification and inventory data and export the XML files requisite for operating the ShakeCast system. Users will be able to select structures based on a minimum set of user-specified facility characteristics (building location, size, height, use, construction age, etc.). "Expert" users will be able to import user-modified structural response properties into the facility inventory associated with the HAZUS Advanced Engineering Building Modules (AEBM). The goal of the ShakeCast system is to provide simplified real-time potential-impact and inspection metrics (i.e., green, yellow, orange and red priority ratings) that allow users to institute customized earthquake response protocols. Previously, fragilities were approximated using individual ShakeMap intensity measures (IMs, specifically PGA and 0.3 and 1.0 s spectral accelerations) for each facility, but we are now performing capacity-spectrum damage-state calculations using a more robust characterization of spectral demand. We are also developing methods for the direct import of ShakeMap's multi-period spectra in lieu of the assumed three-domain design spectrum (at 0.3 s for …
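
    As an illustration of the green/yellow/orange/red priority logic, a toy mapping from a single intensity measure to a rating might look like the sketch below. The thresholds are invented for the example; real ShakeCast fragilities are facility-specific and, as described above, are now derived from capacity-spectrum damage-state calculations rather than fixed PGA cutoffs.

```python
# Illustrative only: thresholds are hypothetical, not ShakeCast's
# published, facility-specific fragility values.
def inspection_priority(pga_g, yellow=0.1, orange=0.3, red=0.5):
    """Map a peak ground acceleration (in g) to a ShakeCast-style
    green/yellow/orange/red inspection priority rating."""
    if pga_g >= red:
        return "red"
    if pga_g >= orange:
        return "orange"
    if pga_g >= yellow:
        return "yellow"
    return "green"
```

    In the real system, each facility in the inventory gets its own thresholds (or full fragility curves), so the same ShakeMap can yield different ratings for adjacent buildings of different construction type and age.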

  6. Temporal limits of selection and memory encoding: A comparison of whole versus partial report in rapid serial visual presentation.

    Science.gov (United States)

    Nieuwenstein, Mark R; Potter, Mary C

    2006-06-01

    People often fail to recall the second of two visual targets presented within 500 ms in rapid serial visual presentation (RSVP). This effect is called the attentional blink. One explanation of the attentional blink is that processes involved in encoding the first target into memory are slow and capacity limited. Here, however, we show that the attentional blink should be ascribed to attentional selection, not consolidation of the first target. Rapid sequences of six letters were presented, and observers had to report either all the letters (whole-report condition) or a subset of the letters (partial-report condition). Selection in partial report was based on color (e.g., report the two red letters) or identity (i.e., report all letters from a particular letter onward). In both cases, recall of letters presented shortly after the first selected letter was impaired, whereas recall of the corresponding letters was relatively accurate with whole report.

  7. Functional Activation during the Rapid Visual Information Processing Task in a Middle Aged Cohort: An fMRI Study

    OpenAIRE

    Neale, Chris; Johnston, Patrick; Hughes, Matthew; Scholey, Andrew

    2015-01-01

    The Rapid Visual Information Processing (RVIP) task, a serial discrimination task in which performance is believed to reflect sustained attention capabilities, is widely used in behavioural research and increasingly in neuroimaging studies. To date, functional neuroimaging research into the RVIP has been undertaken using block analyses, reflecting the sustained processing involved in the task, but not necessarily the transient processes associated with individual trial performance. Furthermore...

  8. A simple and rapid visual method for the determination of ammonia nitrogen in environmental waters using thymol

    Energy Technology Data Exchange (ETDEWEB)

    Okumura, M.; Fujinaga, K.; Seike, Y.; Honda, S. [Dept. of Material Science, Interdisciplinary Faculty of Science and Engineering, Shimane University, Matsue (Japan)

    1999-11-01

    Simple visual and spectrophotometric methods for the determination of ammonia nitrogen in water are proposed, based on the color development of indothymol blue formed between ammonia and thymol. The color development was accelerated by nitroprusside so as to be complete within 3 min, which is remarkably rapid compared with the other conventional indothymol blue and indophenol blue methods. The concentration range of ammonia nitrogen determined spectrophotometrically was 0.04-1.2 mg/L NH₄-N. The absorbance per 1 µg NH₄-N was 0.0215 (molar absorptivity = 1.51 × 10⁴) at 690 nm. A visual method requiring no instrument, suitable for in situ use in field work, was developed based on the optimum conditions established for the spectrophotometric method. This visual method was successfully applied to the determination of ammonia nitrogen in environmental waters. (orig.)
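
    Given the reported sensitivity (absorbance 0.0215 per µg NH₄-N at 690 nm), converting a measured absorbance back to an ammonia-nitrogen mass is a one-line application of Beer's law. The function name is ours, and the conversion assumes the same optical path length as the original calibration.

```python
def ammonia_nitrogen_ug(absorbance_690nm, sensitivity=0.0215):
    """Convert indothymol-blue absorbance at 690 nm to micrograms of
    NH4-N, using the per-microgram sensitivity quoted in the abstract
    (0.0215 absorbance units per ug; hypothetical helper, same cuvette
    path length as the calibration assumed)."""
    return absorbance_690nm / sensitivity
```

    For example, a measured absorbance of 0.215 corresponds to 10 µg NH₄-N, which sits comfortably inside the method's stated working range.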

  9. Fat Content Modulates Rapid Detection of Food: A Visual Search Study Using Fast Food and Japanese Diet.

    Science.gov (United States)

    Sawada, Reiko; Sato, Wataru; Toichi, Motomi; Fushiki, Tohru

    2017-01-01

    Rapid detection of food is crucial for the survival of organisms. However, previous visual search studies have reported discrepant results regarding the detection speeds for food vs. non-food items; some experiments showed faster detection of food than non-food, whereas others reported null findings concerning any speed advantage for the detection of food vs. non-food. Moreover, although some previous studies showed that fat content can affect visual attention for food, the effect of fat content on the detection of food remains unclear. To investigate these issues, we measured reaction times (RTs) during a visual search task in which participants with normal weight detected high-fat food (i.e., fast food), low-fat food (i.e., Japanese diet), and non-food (i.e., kitchen utensils) targets within crowds of non-food distractors (i.e., cars). Results showed that RTs for food targets were shorter than those for non-food targets. Moreover, the RTs for high-fat food were shorter than those for low-fat food. These results suggest that food is more rapidly detected than non-food within the environment and that a higher fat content in food facilitates rapid detection.

  10. Fat Content Modulates Rapid Detection of Food: A Visual Search Study Using Fast Food and Japanese Diet

    Directory of Open Access Journals (Sweden)

    Reiko Sawada

    2017-06-01

    Rapid detection of food is crucial for the survival of organisms. However, previous visual search studies have reported discrepant results regarding the detection speeds for food vs. non-food items; some experiments showed faster detection of food than non-food, whereas others reported null findings concerning any speed advantage for the detection of food vs. non-food. Moreover, although some previous studies showed that fat content can affect visual attention for food, the effect of fat content on the detection of food remains unclear. To investigate these issues, we measured reaction times (RTs) during a visual search task in which participants with normal weight detected high-fat food (i.e., fast food), low-fat food (i.e., Japanese diet), and non-food (i.e., kitchen utensils) targets within crowds of non-food distractors (i.e., cars). Results showed that RTs for food targets were shorter than those for non-food targets. Moreover, the RTs for high-fat food were shorter than those for low-fat food. These results suggest that food is more rapidly detected than non-food within the environment and that a higher fat content in food facilitates rapid detection.

  12. The Advanced Rapid Imaging and Analysis (ARIA) Project: Status of SAR products for Earthquakes, Floods, Volcanoes and Groundwater-related Subsidence

    Science.gov (United States)

    Owen, S. E.; Yun, S. H.; Hua, H.; Agram, P. S.; Liu, Z.; Sacco, G. F.; Manipon, G.; Linick, J. P.; Fielding, E. J.; Lundgren, P.; Farr, T. G.; Webb, F.; Rosen, P. A.; Simons, M.

    2017-12-01

    The Advanced Rapid Imaging and Analysis (ARIA) project for Natural Hazards is focused on rapidly generating high-level geodetic imaging products and placing them in the hands of the solid earth science and local, national, and international natural hazard communities by providing science product generation, exploration, and delivery capabilities at an operational level. Space-based geodetic measurement techniques including Interferometric Synthetic Aperture Radar (InSAR), differential Global Positioning System, and SAR-based change detection have become critical additions to our toolset for understanding and mapping the damage and deformation caused by earthquakes, volcanic eruptions, floods, landslides, and groundwater extraction. Up until recently, processing of these data sets has been handcrafted for each study or event and has not generated products rapidly and reliably enough for response to natural disasters or for timely analysis of large data sets. The ARIA project, a joint venture co-sponsored by the California Institute of Technology and by NASA through the Jet Propulsion Laboratory, has been capturing the knowledge applied to these responses and building it into an automated infrastructure to generate imaging products in near real-time that can improve situational awareness for disaster response. In addition to supporting the growing science and hazard response communities, the ARIA project has developed the capabilities to provide automated imaging and analysis capabilities necessary to keep up with the influx of raw SAR data from geodetic imaging missions such as ESA's Sentinel-1A/B, now operating with repeat intervals as short as 6 days, and the upcoming NASA NISAR mission. We will present the progress and results we have made on automating the analysis of Sentinel-1A/B SAR data for hazard monitoring and response, with emphasis on recent developments and end user engagement in flood extent mapping and deformation time series for both volcano

  13. EzMol: A Web Server Wizard for the Rapid Visualization and Image Production of Protein and Nucleic Acid Structures.

    Science.gov (United States)

    Reynolds, Christopher R; Islam, Suhail A; Sternberg, Michael J E

    2018-01-31

    EzMol is a molecular visualization Web server in the form of a software wizard, located at http://www.sbg.bio.ic.ac.uk/ezmol/. It is designed for easy and rapid image manipulation and display of protein molecules, and is intended for users who need to quickly produce high-resolution images of protein molecules but do not have the time or inclination to use a software molecular visualization system. EzMol allows the upload of molecular structure files in PDB format to generate a Web page including a representation of the structure that the user can manipulate. EzMol provides intuitive options for chain display, adjusting the color/transparency of residues, side chains and protein surfaces, and for adding labels to residues. The final adjusted protein image can then be downloaded as a high-resolution image. There are a range of applications for rapid protein display, including the illustration of specific areas of a protein structure and the rapid prototyping of images.

  14. A Unique Role of Endogenous Visual-Spatial Attention in Rapid Processing of Multiple Targets

    Science.gov (United States)

    Guzman-Martinez, Emmanuel; Grabowecky, Marcia; Palafox, German; Suzuki, Satoru

    2011-01-01

    Visual spatial attention can be exogenously captured by a salient stimulus or can be endogenously allocated by voluntary effort. Whether these two attention modes serve distinctive functions is debated, but for processing of single targets the literature suggests superiority of exogenous attention (it is faster acting and serves more functions).…

  15. TacTool: a tactile rapid prototyping tool for visual interfaces

    NARCIS (Netherlands)

    Keyson, D.V.; Tang, H.K.; Anzai, Y.; Ogawa, K.; Mori, H.

    1995-01-01

    This paper describes the TacTool development tool and input device for designing and evaluating visual user interfaces with tactile feedback. TacTool is currently supported by the IPO trackball with force feedback in the x and y directions. The tool is designed to enable both the designer and the

  16. Rapid Seismic Deployment for Capturing Aftershocks of the September 2017 Tehuantepec, Mexico (M=8.1) and Morelos-Puebla (M=7.1), Mexico Earthquakes

    Science.gov (United States)

    Velasco, A. A.; Karplus, M. S.; Dena, O.; Gonzalez-Huizar, H.; Husker, A. L.; Perez-Campos, X.; Calo, M.; Valdes, C. M.

    2017-12-01

    The September 7 Tehuantepec, Mexico (M=8.1) and the September 19 Morelos-Puebla, Mexico (M=7.1) earthquakes ruptured with extensional faulting within the Cocos Plate at 70-km and 50-km depth, as it subducts beneath the continental North American Plate. Both earthquakes caused significant damage and loss of life. These events were followed by a M=6.1 extensional earthquake at only 10-km depth in Oaxaca on September 23, 2017. While the Morelos-Puebla earthquake was likely too far away to be statically triggered by the Tehuantepec earthquake, initial Coulomb stress analyses show that the M=6.1 event may have been an aftershock of the Tehuantepec earthquake. Many questions remain about these earthquakes, including: Did the Cocos Plate earthquakes load the upper plate, and could they possibly trigger an equal or larger earthquake on the plate interface? Are these the result of plate bending? Do the aftershocks migrate to the locked zone in the subduction zone? Why did the intermediate depth earthquakes create so much damage? Are these earthquakes linked by dynamic stresses? Is it possible that a potential slow-slip event triggered both events? To address some of these questions, we deployed 10 broadband seismometers near the epicenter of the Tehuantepec, Mexico earthquake and 51 UTEP-owned nodes (5-Hz, 3-component geophones) to record aftershocks and augment networks deployed by the Universidad Nacional Autónoma de México (UNAM). The 10 broadband instruments will be deployed for 6 months, while the nodes were deployed 25 days. The relative ease-of-deployment and larger numbers of the nodes allowed us to deploy them quickly in the area near the M=6.1 Oaxaca earthquake, just a few days after that earthquake struck. We deployed them near the heavily-damaged cities of Juchitan, Ixtaltepec, and Ixtepec as well as in Tehuantepec and Salina Cruz, Oaxaca in order to test their capabilities for site characterization and aftershock studies. This is the first test of these

  17. The Harvest suite for rapid core-genome alignment and visualization of thousands of intraspecific microbial genomes.

    Science.gov (United States)

    Treangen, Todd J; Ondov, Brian D; Koren, Sergey; Phillippy, Adam M

    2014-01-01

    Whole-genome sequences are now available for many microbial species and clades; however, existing whole-genome alignment methods are limited in their ability to perform sequence comparisons of multiple sequences simultaneously. Here we present the Harvest suite of core-genome alignment and visualization tools for the rapid and simultaneous analysis of thousands of intraspecific microbial strains. Harvest includes Parsnp, a fast core-genome multi-aligner, and Gingr, a dynamic visual platform. Together they provide interactive core-genome alignments, variant calls, recombination detection, and phylogenetic trees. Using simulated and real data, we demonstrate that our approach exhibits unrivaled speed while maintaining the accuracy of existing methods. The Harvest suite is open-source and freely available from: http://github.com/marbl/harvest.

  18. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    Science.gov (United States)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism, within 2 min after the occurrence of an earthquake. All of the source parameters are then automatically forwarded to the ROS to perform an earthquake simulation based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model covering the whole of Taiwan is developed in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible: users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real time.

  19. Rapid visual grouping and figure-ground processing using temporally structured displays.

    Science.gov (United States)

    Cheadle, Samuel; Usher, Marius; Müller, Hermann J

    2010-08-23

    We examine the time course of visual grouping and figure-ground processing. Figure (contour) and ground (random-texture) elements were flickered with different phases (i.e., contour and background are alternated), requiring the observer to group information within a pre-specified time window. It was found that this grouping has a high temporal resolution: less than 20 ms for smooth contours, and less than 50 ms for line conjunctions with sharp angles. Furthermore, the grouping process takes place without explicit knowledge of the phase of the elements, and it requires a cumulative build-up of information. The results are discussed in relation to the neural mechanisms for visual grouping and figure-ground segregation.

  20. SimITK: rapid ITK prototyping using the Simulink visual programming environment

    Science.gov (United States)

    Dickinson, A. W. L.; Mousavi, P.; Gobbi, D. G.; Abolmaesumi, P.

    2011-03-01

    The Insight Segmentation and Registration Toolkit (ITK) is a long-established software package used for image analysis, visualization, and image-guided surgery applications. This package is a collection of C++ libraries that can pose usability problems for users without C++ programming experience. To bridge the gap between the programming complexities of ITK and the learning curve it requires, we present a higher-level visual programming environment that represents ITK methods and classes by wrapping them into "blocks" within MATLAB's visual programming environment, Simulink. These blocks can be connected to form workflows: visual schematics that closely represent the structure of a C++ program. Due to the heavily C++-templated nature of ITK, direct interaction between Simulink and ITK requires an intermediary to convert their respective datatypes and allow intercommunication. We have developed a "Virtual Block" that serves as an intermediate wrapper around the ITK class and is responsible for resolving the templated datatypes used by ITK to the native types used by Simulink. Presently, the wrapping procedure for SimITK is semi-automatic in that it requires XML descriptions of the ITK classes as a starting point; these descriptions are used to create all other necessary integration files. The generation of all source code and object code from the XML is done automatically by a CMake build script that yields Simulink blocks as the final result. An example 3D segmentation workflow using cranial CT data as well as a 3D MR-to-CT registration workflow are presented as a proof of concept.

  1. Fluorescence Imaging and Streamline Visualization of Hypersonic Flow over Rapid Prototype Wind-Tunnel Models

    Science.gov (United States)

    Danehy, Paul M.; Alderfer, David W.; Inman, Jennifer A.; Berger, Karen T.; Buck, Gregory M.; Schwartz, Richard J.

    2008-01-01

    Reentry models for use in hypersonic wind tunnel tests were fabricated using a stereolithography apparatus. These models were produced in one day or less, which is a significant time savings compared to the manufacture of ceramic or metal models. The models were tested in the NASA Langley Research Center 31-Inch Mach 10 Air Tunnel. Only a few of the models survived repeated tests in the tunnel, and several failure modes of the models were identified. Planar laser-induced fluorescence (PLIF) of nitric oxide (NO) was used to visualize the flowfields in the wakes of these models. Pure NO was seeded either through tubes plumbed into the model or via a tube attached to the strut holding the model, which provided localized addition of NO into the model's wake through a porous metal cylinder attached to the end of the tube. Models included several 2-inch diameter Inflatable Reentry Vehicle Experiment (IRVE) models and 5-inch diameter Crew Exploration Vehicle (CEV) models. Various model configurations and NO seeding methods were used, including a new streamwise visualization method based on PLIF. Virtual Diagnostics Interface (ViDI) technology, developed at NASA Langley Research Center, was used to visualize the data sets in post-processing. The use of calibration "dotcards" was investigated to correct for camera perspective and lens distortions in the PLIF images.

  2. Rapid Forgetting Results from Competition over Time between Items in Visual Working Memory

    Science.gov (United States)

    Pertzov, Yoni; Manohar, Sanjay; Husain, Masud

    2017-01-01

    Working memory is now established as a fundamental cognitive process across a range of species. Loss of information held in working memory has the potential to disrupt many aspects of cognitive function. However, despite its significance, the mechanisms underlying rapid forgetting remain unclear, with intense recent debate as to whether it is…

  3. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of scientific content by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what principles underlie that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results in a choreographic model with the aim of converting earthquake sound into a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience in a multisensory exploration of the earthquake phenomenon, through the stimulation of hearing, eyesight and perception of movement (the neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  4. Phoneme Awareness, Visual-Verbal Paired-Associate Learning, and Rapid Automatized Naming as Predictors of Individual Differences in Reading Ability

    Science.gov (United States)

    Warmington, Meesha; Hulme, Charles

    2012-01-01

    This study examines the concurrent relationships between phoneme awareness, visual-verbal paired-associate learning, rapid automatized naming (RAN), and reading skills in 7- to 11-year-old children. Path analyses showed that visual-verbal paired-associate learning and RAN, but not phoneme awareness, were unique predictors of word recognition,…

  5. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  6. Lateralization of spatial rather than temporal attention underlies the left hemifield advantage in rapid serial visual presentation.

    Science.gov (United States)

    Asanowicz, Dariusz; Kruse, Lena; Śmigasiewicz, Kamila; Verleger, Rolf

    2017-11-01

    In bilateral rapid serial visual presentation (RSVP), the second of two targets, T1 and T2, is better identified in the left visual field (LVF) than in the right visual field (RVF). This LVF advantage may reflect hemispheric asymmetry in temporal attention and/or in spatial orienting of attention. Participants performed two tasks: the "standard" bilateral RSVP task (Exp. 1) and its unilateral variant (Exps. 1 & 2). In the bilateral task, spatial location was uncertain, thus target identification involved stimulus-driven spatial orienting. In the unilateral task, the targets were presented block-wise in the LVF or RVF only, such that no spatial orienting was needed for target identification. Temporal attention was manipulated in both tasks by varying the T1-T2 lag. The results showed that the LVF advantage disappeared when involvement of stimulus-driven spatial orienting was eliminated, whereas the manipulation of temporal attention had no effect on the asymmetry. In conclusion, the results do not support the hypothesis of hemispheric asymmetry in temporal attention, and provide further evidence that the LVF advantage reflects right hemisphere predominance in stimulus-driven orienting of spatial attention. These conclusions fit evidence that temporal attention is implemented by bilateral parietal areas and spatial attention by the right-lateralized ventral frontoparietal network. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Review of the investigation of mixture formation and combustion process using rapid compression machine and direct visualization system

    Science.gov (United States)

    Jaat, M.; Khalid, Amir; Manshoor, B.; Ramsy, Him

    2013-12-01

    This paper reviews some applications of optical visualization systems for characterizing the fuel-air mixing process during the early stage of mixture formation in diesel combustion engines. A number of studies have contributed to the understanding of fuel-air mixing in DI diesel engines. This review has shown that the mixture formation process affects initial flame development, and that injection pressure has a great effect on mixture formation and, in turn, on flame development and combustion characteristics. The method of simulating the real phenomenon of diesel combustion with an optical-access rapid compression machine is also reviewed and experimental results are presented. The application of these methods to the investigation of diesel sprays highlights the mechanisms that govern the propagation and distribution of a combustible fuel-air mixture. A summary of the implementations of constant-volume chambers and optical visualization systems is given in the accompanying tables and figures. Visualization of the formation of the diesel spray and its combustion in the combustion chamber of a diesel engine has been recognized as one of the best ways to understand the characteristics of mixture formation.

  8. A simple and rapid method for high-resolution visualization of single-ion tracks

    Directory of Open Access Journals (Sweden)

    Masaaki Omichi

    2014-11-01

    Prompt determination of spatial points of single-ion tracks plays a key role in high-energy particle induced-cancer therapy and gene/plant mutations. In this study, a simple method for the high-resolution visualization of single-ion tracks without etching was developed through the use of polyacrylic acid (PAA)-N,N’-methylene bisacrylamide (MBAAm) blend films. One of the steps of the proposed method includes exposure of the irradiated films to water vapor for several minutes. Water vapor was found to promote the cross-linking reaction of PAA and MBAAm to form a bulky cross-linked structure; the ion-track scars were detectable at a nanometer scale by atomic force microscopy. This study demonstrated that each scar is easily distinguishable, and the amount of generated radicals of the ion tracks can be estimated by measuring the height of the scars, even in highly dense ion tracks. This method is suitable for the visualization of the penumbra region in a single-ion track with a high spatial resolution of 50 nm, which is sufficiently small to confirm that a single ion hits a cell nucleus with a size ranging between 5 and 20 μm.

  9. A simple and rapid method for high-resolution visualization of single-ion tracks

    Energy Technology Data Exchange (ETDEWEB)

    Omichi, Masaaki [Department of Applied Chemistry, Graduate School of Engineering, Osaka University, Osaka 565-0871 (Japan); Center for Collaborative Research, Anan National College of Technology, Anan, Tokushima 774-0017 (Japan); Choi, Wookjin; Sakamaki, Daisuke; Seki, Shu, E-mail: seki@chem.eng.osaka-u.ac.jp [Department of Applied Chemistry, Graduate School of Engineering, Osaka University, Osaka 565-0871 (Japan); Tsukuda, Satoshi [Institute of Multidisciplinary Research for Advanced Materials, Tohoku University, Sendai, Miyagi 980-8577 (Japan); Sugimoto, Masaki [Japan Atomic Energy Agency, Takasaki Advanced Radiation Research Institute, Gunma, Gunma 370-1292 (Japan)

    2014-11-15

    Prompt determination of spatial points of single-ion tracks plays a key role in high-energy particle induced-cancer therapy and gene/plant mutations. In this study, a simple method for the high-resolution visualization of single-ion tracks without etching was developed through the use of polyacrylic acid (PAA)-N, N’-methylene bisacrylamide (MBAAm) blend films. One of the steps of the proposed method includes exposure of the irradiated films to water vapor for several minutes. Water vapor was found to promote the cross-linking reaction of PAA and MBAAm to form a bulky cross-linked structure; the ion-track scars were detectable at a nanometer scale by atomic force microscopy. This study demonstrated that each scar is easily distinguishable, and the amount of generated radicals of the ion tracks can be estimated by measuring the height of the scars, even in highly dense ion tracks. This method is suitable for the visualization of the penumbra region in a single-ion track with a high spatial resolution of 50 nm, which is sufficiently small to confirm that a single ion hits a cell nucleus with a size ranging between 5 and 20 μm.

  10. Rapid and Objective Assessment of Neural Function in Autism Spectrum Disorder Using Transient Visual Evoked Potentials.

    Directory of Open Access Journals (Sweden)

    Paige M Siper

    There is a critical need to identify biomarkers and objective outcome measures that can be used to understand underlying neural mechanisms in autism spectrum disorder (ASD). Visual evoked potentials (VEPs) offer a noninvasive technique to evaluate the functional integrity of neural mechanisms, specifically visual pathways, while probing for disease pathophysiology. Transient VEPs (tVEPs) were obtained from 96 unmedicated children, including 37 children with ASD, 36 typically developing (TD) children, and 23 unaffected siblings (SIBS). A conventional contrast-reversing checkerboard condition was compared to a novel short-duration condition, which was developed to enable objective data collection from severely affected populations who are often excluded from electroencephalographic (EEG) studies. Children with ASD showed significantly smaller amplitudes compared to TD children at two of the earliest critical VEP components, P60-N75 and N75-P100. SIBS showed intermediate responses relative to the ASD and TD groups. There were no group differences in response latency. Frequency band analyses indicated significantly weaker responses for the ASD group in bands encompassing gamma-wave activity. Ninety-two percent of children with ASD were able to complete the short-duration condition compared to 68% for the standard condition. The current study establishes the utility of a short-duration tVEP test for use in children at varying levels of functioning and describes neural abnormalities in children with idiopathic ASD. Implications for excitatory/inhibitory balance as well as the potential application of VEP for use in clinical trials are discussed.
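    The P60-N75 and N75-P100 measures in this record are peak-to-peak component amplitudes: the positive peak in one latency window minus the negative trough in an adjacent one. A minimal sketch of that computation; the window bounds and the synthetic waveform below are invented for illustration, not taken from the study:

```python
import numpy as np

def component_amplitude(vep_uv, times_ms, peak_win, trough_win):
    """Peak-to-peak amplitude of a VEP component such as P60-N75:
    maximum voltage inside peak_win minus minimum voltage inside
    trough_win (windows in ms, voltages in microvolts)."""
    in_peak = (times_ms >= peak_win[0]) & (times_ms <= peak_win[1])
    in_trough = (times_ms >= trough_win[0]) & (times_ms <= trough_win[1])
    return float(vep_uv[in_peak].max() - vep_uv[in_trough].min())

# Synthetic averaged waveform, 1 ms sampling: a +5 uV deflection near
# 60 ms and a -3 uV deflection near 75 ms (hypothetical values).
times = np.arange(0, 201)
vep = np.zeros_like(times, dtype=float)
vep[60], vep[75] = 5.0, -3.0
p60_n75 = component_amplitude(vep, times, (50, 70), (65, 90))
# p60_n75 == 8.0
```

    On real EEG the waveform would be the average over artifact-free sweeps, and the windows would be set from the group grand average rather than fixed a priori.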

  11. IcyTree: rapid browser-based visualization for phylogenetic trees and networks.

    Science.gov (United States)

    Vaughan, Timothy G

    2017-08-01

    IcyTree is an easy-to-use application which can be used to visualize a wide variety of phylogenetic trees and networks. While numerous phylogenetic tree viewers exist already, IcyTree distinguishes itself by being a purely online tool, having a responsive user interface, supporting phylogenetic networks (ancestral recombination graphs in particular), and efficiently drawing trees that include information such as ancestral locations or trait values. IcyTree also provides intuitive panning and zooming utilities that make exploring large phylogenetic trees of many thousands of taxa feasible. IcyTree is a web application and can be accessed directly at http://tgvaughan.github.com/icytree . Currently supported web browsers include Mozilla Firefox and Google Chrome. IcyTree is written entirely in client-side JavaScript (no plugin required) and, once loaded, does not require network access to run. IcyTree is free software, and the source code is made available at http://github.com/tgvaughan/icytree under version 3 of the GNU General Public License. tgvaughan@gmail.com. © The Author(s) 2017. Published by Oxford University Press.

  12. Rapid Improvement in Visual Selective Attention Related to Action Video Gaming Experience

    Directory of Open Access Journals (Sweden)

    Nan Qiu

    2018-02-01

    A central issue in cognitive science is understanding how learning induces cognitive and neural plasticity, which helps illuminate the biological basis of learning. Research in the past few decades showed that action video gaming (AVG) offered new, important perspectives on learning-related cognitive and neural plasticity. However, it is still unclear whether cognitive and neural plasticity is observable after a brief AVG session. Using behavioral and electrophysiological measures, this study examined the plasticity of visual selective attention (VSA) associated with a 1 h AVG session. Both AVG experts and non-experts participated in this study. Their VSA was assessed prior to and after the AVG session. Within-group comparisons on the participants' performance before and after the AVG session showed improvements in response time in both groups and modulations of electrophysiological measures in the non-experts. Furthermore, between-group comparisons showed that the experts had superior VSA, relative to the non-experts, prior to the AVG session. These findings suggested an association between the plasticity of VSA and AVG. Most importantly, this study showed that the plasticity of VSA was observable after even a 1 h AVG session.

  13. Rapid Improvement in Visual Selective Attention Related to Action Video Gaming Experience.

    Science.gov (United States)

    Qiu, Nan; Ma, Weiyi; Fan, Xin; Zhang, Youjin; Li, Yi; Yan, Yuening; Zhou, Zhongliang; Li, Fali; Gong, Diankun; Yao, Dezhong

    2018-01-01

    A central issue in cognitive science is understanding how learning induces cognitive and neural plasticity, which helps illuminate the biological basis of learning. Research in the past few decades showed that action video gaming (AVG) offered new, important perspectives on learning-related cognitive and neural plasticity. However, it is still unclear whether cognitive and neural plasticity is observable after a brief AVG session. Using behavioral and electrophysiological measures, this study examined the plasticity of visual selective attention (VSA) associated with a 1 h AVG session. Both AVG experts and non-experts participated in this study. Their VSA was assessed prior to and after the AVG session. Within-group comparisons on the participants' performance before and after the AVG session showed improvements in response time in both groups and modulations of electrophysiological measures in the non-experts. Furthermore, between-group comparisons showed that the experts had superior VSA, relative to the non-experts, prior to the AVG session. These findings suggested an association between the plasticity of VSA and AVG. Most importantly, this study showed that the plasticity of VSA was observable after even a 1 h AVG session.

  14. SCARDEC: a new technique for the rapid determination of seismic moment magnitude, focal mechanism and source time functions for large earthquakes using body-wave deconvolution

    Science.gov (United States)

    Vallée, M.; Charléty, J.; Ferreira, A. M. G.; Delouis, B.; Vergoz, J.

    2011-01-01

    Accurate and fast magnitude determination for large, shallow earthquakes is of key importance for post-seismic response and tsunami alert purposes. When no local real-time data are available, which is today the case for most subduction earthquakes, the first information comes from teleseismic body waves. Standard body-wave methods give accurate magnitudes for earthquakes up to Mw = 7-7.5. For larger earthquakes, the analysis is more complex, because of the non-validity of the point-source approximation and of the interaction between direct and surface-reflected phases. The latter effect acts as a strong high-pass filter, which complicates the magnitude determination. We here propose an automated deconvolutive approach, which does not impose any simplifying assumptions about the rupture process, thus being well adapted to large earthquakes. We first determine the source duration based on the length of the high-frequency (1-3 Hz) signal content. The deconvolution of synthetic double-couple point-source signals (depending on the four earthquake parameters strike, dip, rake and depth) from the windowed real body-wave signals (including P, PcP, PP, SH and ScS waves) gives the apparent source time function (STF). We search for the optimal combination of these four parameters that respects the physical features of any STF: causality, positivity and stability of the seismic moment at all stations. Once this combination is retrieved, integration of the STFs directly gives the moment magnitude. We apply this new approach, referred to as the SCARDEC method, to most of the major subduction earthquakes in the period 1990-2010. Magnitude differences between the Global Centroid Moment Tensor (CMT) and the SCARDEC method may reach 0.2, but the values are consistent if we take into account that the Global CMT solutions for large, shallow earthquakes suffer from a known trade-off between dip and seismic moment. We show by modelling long-period surface waves of these events that
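    The final step described in this abstract, integrating the apparent source time function to obtain the seismic moment and hence the magnitude, can be sketched as follows. This is an illustration of the idea, not the SCARDEC code: the triangular STF is hypothetical, and the standard moment-magnitude relation Mw = (2/3)(log10 M0 - 9.1), with M0 in N·m, is assumed.

```python
import numpy as np

def moment_magnitude_from_stf(stf, dt):
    """Integrate a moment-rate source time function (N*m/s, sampled at
    interval dt seconds) to get the seismic moment M0, then convert to
    moment magnitude via Mw = (2/3)(log10(M0) - 9.1)."""
    m0 = float(np.sum(stf) * dt)  # rectangle-rule integration, N*m
    mw = (2.0 / 3.0) * (np.log10(m0) - 9.1)
    return m0, mw

# Hypothetical triangular STF: 100 s duration, 2e19 N*m/s peak moment rate.
dt = 0.5
t = np.arange(0.0, 100.0, dt)
stf = 2e19 * (1.0 - np.abs(t - 50.0) / 50.0)
m0, mw = moment_magnitude_from_stf(stf, dt)
# M0 integrates to 1e21 N*m, i.e. Mw of about 7.9
```

    In the method itself the STF comes out of the body-wave deconvolution, and the stability of the integrated moment across stations is one of the criteria used to select the strike/dip/rake/depth combination.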

  15. Training and transfer of training in rapid visual search for camouflaged targets.

    Directory of Open Access Journals (Sweden)

    Mark B Neider

    Previous examinations of search under camouflage conditions have reported that performance improves with training and that training can engender near-perfect transfer to similar, but novel, camouflage-type displays [1]. What remains unclear, however, are the cognitive mechanisms underlying these training improvements and transfer benefits. On the one hand, improvements and transfer benefits might be associated with higher-level overt strategy shifts, such as the restriction of eye movements to target-likely (background) display regions. On the other hand, improvements and benefits might be related to the tuning of lower-level perceptual processes, such as figure-ground segregation. To decouple these competing possibilities we had one group of participants train on camouflage search displays and a control group train on non-camouflage displays. Critically, search displays were rapidly presented, precluding eye movements. Before and following training, all participants completed transfer sessions in which they searched novel displays. We found that search performance on camouflage displays improved with training. Furthermore, participants who trained on camouflage displays suffered no performance costs when searching novel displays following training. Our findings suggest that training to break camouflage is related to the tuning of perceptual mechanisms and not strategic shifts in overt attention.

  16. Visual working memory modulates low-level saccade target selection: Evidence from rapidly generated saccades in the global effect paradigm

    Science.gov (United States)

    Hollingworth, Andrew; Matsukura, Michi; Luck, Steven J.

    2013-01-01

    In three experiments, we examined the influence of visual working memory (VWM) on the metrics of saccade landing position in a global effect paradigm. Participants executed a saccade to the more eccentric object in an object pair appearing on the horizontal midline, to the left or right of central fixation. While completing the saccade task, participants maintained a color in VWM for an unrelated memory task. Either the color of the saccade target matched the memory color (target match), the color of the distractor matched the memory color (distractor match), or the colors of neither object matched the memory color (no match). In the no-match condition, saccades tended to land at the midpoint between the two objects: the global, or averaging, effect. However, when one of the two objects matched VWM, the distribution of landing position shifted toward the matching object, both for target match and for distractor match. VWM modulation of landing position was observed even for the fastest quartile of saccades, with a mean latency as low as 112 ms. Effects of VWM on such rapidly generated saccades, with latencies in the express-saccade range, indicate that VWM interacts with the initial sweep of visual sensory processing, modulating perceptual input to oculomotor systems and thereby biasing oculomotor selection. As a result, differences in memory match produce effects on landing position similar to the effects generated by differences in physical salience. PMID:24190909

  17. Visual working memory modulates low-level saccade target selection: evidence from rapidly generated saccades in the global effect paradigm.

    Science.gov (United States)

    Hollingworth, Andrew; Matsukura, Michi; Luck, Steven J

    2013-11-04

    In three experiments, we examined the influence of visual working memory (VWM) on the metrics of saccade landing position in a global effect paradigm. Participants executed a saccade to the more eccentric object in an object pair appearing on the horizontal midline, to the left or right of central fixation. While completing the saccade task, participants maintained a color in VWM for an unrelated memory task. Either the color of the saccade target matched the memory color (target match), the color of the distractor matched the memory color (distractor match), or the colors of neither object matched the memory color (no match). In the no-match condition, saccades tended to land at the midpoint between the two objects: the global, or averaging, effect. However, when one of the two objects matched VWM, the distribution of landing position shifted toward the matching object, both for target match and for distractor match. VWM modulation of landing position was observed even for the fastest quartile of saccades, with a mean latency as low as 112 ms. Effects of VWM on such rapidly generated saccades, with latencies in the express-saccade range, indicate that VWM interacts with the initial sweep of visual sensory processing, modulating perceptual input to oculomotor systems and thereby biasing oculomotor selection. As a result, differences in memory match produce effects on landing position similar to the effects generated by differences in physical salience.

  18. OMG Earthquake! Can Twitter improve earthquake response?

    Science.gov (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
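    The detection idea described in this record, a keyword-tweet rate rising far above the background level, can be sketched as a sliding-window counter. The window length and threshold below are illustrative placeholders, not USGS operational parameters:

```python
from collections import deque

class TweetRateDetector:
    """Flag a candidate earthquake when the number of keyword-matching
    tweets inside a sliding time window reaches a threshold far above
    the background rate (which the record puts below 1 per hour)."""

    def __init__(self, window_s=60.0, threshold=20):
        self.window_s = window_s    # sliding window length, seconds
        self.threshold = threshold  # tweets per window to trigger
        self.times = deque()

    def observe(self, t_s):
        """Record one keyword tweet at time t_s (seconds); return True
        once the windowed count reaches the detection threshold."""
        self.times.append(t_s)
        # Drop timestamps that have fallen out of the window.
        while self.times and t_s - self.times[0] > self.window_s:
            self.times.popleft()
        return len(self.times) >= self.threshold

detector = TweetRateDetector()
# A burst like the ~150 tweets/min seen after the Morgan Hill event
# trips the detector within seconds; one tweet per hour never does.
burst = [i * 0.4 for i in range(150)]  # 150 tweets in 60 s
tripped_at = next(t for t in burst if detector.observe(t))
```

    A real system would also need the geocoding and felt-area mapping steps the abstract describes, plus filtering of unrelated uses of the keyword.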

  19. The Restoration Rapid Assessment Tool: An Access/Visual Basic application

    Science.gov (United States)

    Hiebert, Ron; Larson, D.L.; Thomas, K.; Tancreto, N.; Haines, D.; Richey, A.; Dow, T.; Drees, L.

    2009-01-01

    Managers of parks and natural areas are increasingly faced with difficult decisions concerning restoration of disturbed lands. Financial and workforce resources often limit these restoration efforts, and rarely can a manager afford to address all concerns within the region of interest. With limited resources, managers and scientists have to decide which areas will be targeted for restoration and the restoration treatments to use in these areas. A broad range of approaches are used to make such decisions, from well-researched expert opinions (Cipollini et al. 2005) to gut feeling, with variable degrees of input from site visits, data collection, and data analysis used to support the decision. A standardized approach including an analytical assessment of site characteristics based on the best information available, with a written or electronic record of all the steps taken along the way, would make comparisons among a group of sites easier and lend credibility through use of common, documented criteria at all sites. In response to these concerns, we have developed the Restoration Rapid Assessment Tool (RRAT). RRAT is based on field observations of key indicators of site degradation, stressors influencing the site, value of the site with respect to larger management objectives, likelihood of achieving the management goals, and logistical constraints to restoration. The purpose of RRAT is not to make restoration decisions or prescribe methods, but rather to ensure that a basic set of pertinent issues are considered for each site and to facilitate comparisons among sites. Several concepts have been central to the development of RRAT. First, the management goal (also known as desired future condition) of any site under evaluation should be defined before the field evaluation begins. Second, the evaluation should be based upon readily observable indicators so as to avoid cumbersome field methods. Third, the ease with which site stressors can be ameliorated must be

  20. Chance findings about early holocene tidal marshes of Grays Harbor, Washington, in relation to rapidly rising seas and great subduction earthquakes

    Science.gov (United States)

    Phipps, James B.; Hemphill-Haley, Eileen; Atwater, Brian F.

    2015-06-18

    Tidal marshes commonly build upward apace with gradual rise in the level of the sea. It is expected, however, that few tidal marshes will keep up with accelerated sea-level rise later in this century. Tidal marshes have been drowned, moreover, after subsiding during earthquakes.

  1. Rapid Visual Site Analysis for Post-disaster Landscape Planning: Expanding the Range of Choice in a Tsunami-affected Town in Japan

    Directory of Open Access Journals (Sweden)

    James Wescoat

    2012-12-01

    Problem statement: In post-disaster situations, it is often necessary to undertake rapid visual site reconnaissance to characterise patterns of damage and identify reconstruction opportunities and constraints. Rapid visual site analysis can occur over a period of hours to days rather than weeks to months. The time constraint is often necessary to assess the viability of initial reconstruction scenarios and help broaden the range of choice among site planning options. Rapid assessment can also minimise the use of scarce local post-disaster resources during the initial reconnaissance phases of planning. Because it involves visual methods rather than equipment-intensive survey techniques, it serves as an initial scoping of alternatives. It may follow emergency shelter response planning methods (for example, Sphere Project, 2011, ch 4) and be followed by more comprehensive site mapping and screening. This action-research project reviews the literature on post-disaster site analysis with an emphasis on the tsunami-affected area of north-eastern Japan. Because research on rapid visual site analysis in post-disaster contexts is limited, we combined field-based site analysis methods, adapted for post-disaster planning, with visual methods for assessing seismic and tsunami hazards.

  2. I RAN Fast and I Remembered What I Read: The Relationship between Reading, Rapid Automatic Naming, and Auditory and Visual Short-Term Memory

    Directory of Open Access Journals (Sweden)

    Sheila G. Crewther

    2011-05-01

    Although rapid automatic naming (RAN) speed and short-term auditory memory are widely recognised as good predictors of reading ability in most age groups, the predictive value of short-term memory for visually presented digits for reading and RAN in young, typically developing learner readers (mean age 91.5 months) has seldom been investigated. We found that visual digit span is a better predictor of reading ability than auditory digit span in learner readers. A significant correlation was also found between RAN speed and visual, but not auditory, digit span. These results suggest that RAN speed may be a good predictor of a child's future reading ability and eventual fluency because, like visual digit span, it is a measure of the rate of access to memory for visual icons and their semantic name and meaning. The results also suggest that auditory memory is not an important factor in young children learning to read.

  3. Exposure to rapid succession disasters: a study of residents at the epicenter of the Chilean Bío Bío earthquake.

    Science.gov (United States)

    Garfin, Dana Rose; Silver, Roxane Cohen; Ugalde, Francisco Javier; Linn, Heiko; Inostroza, Manuel

    2014-08-01

    We examined cumulative and specific types of trauma exposure as predictors of distress and impairment following a multifaceted community disaster. Approximately 3 months after the 8.8 magnitude earthquake, tsunami, and subsequent looting in Bío Bío, Chile, face-to-face interviews were conducted in 5 provinces closest to the epicenter. Participants (N = 1,000) were randomly selected using military topographic records and census data. Demographics, exposure to discrete components of the disaster (earthquake, tsunami, looting), and exposure to secondary stressors (property loss, injury, death) were evaluated as predictors of posttraumatic stress (PTS) symptoms, global distress, and functional impairment. Prevalence of probable posttraumatic stress disorder was 18.95%. In adjusted models examining specificity of exposure to discrete disaster components and secondary stressors, PTS symptoms and global distress were associated with earthquake intensity, tsunami exposure, and injury to self/close other. Increased functional impairment correlated with earthquake intensity and injury to self/close other. In adjusted models, cumulative exposure to secondary stressors correlated with PTS symptoms, global distress, and functional impairment; cumulative count of exposure to discrete disaster components did not. Exploratory analyses indicated that, beyond direct exposure, appraising the tsunami and looting as the worst components of the disaster correlated with greater media exposure and higher socioeconomic status, respectively. Overall, threat to life indicators correlated with worse outcomes. As failure of government tsunami warnings resulted in many deaths, findings suggest disasters compounded by human errors may be particularly distressing. We advance theory regarding cumulative and specific trauma exposure as predictors of postdisaster distress and provide information for enhancing targeted postdisaster interventions. (c) 2014 APA, all rights reserved.

  4. Rapid Categorization of Human and Ape Faces in 9-Month-Old Infants Revealed by Fast Periodic Visual Stimulation.

    Science.gov (United States)

    Peykarjou, Stefanie; Hoehl, Stefanie; Pauen, Sabina; Rossion, Bruno

    2017-10-02

    This study investigates categorization of human and ape faces in 9-month-olds using a Fast Periodic Visual Stimulation (FPVS) paradigm while measuring EEG. Categorization responses are elicited only if infants discriminate between different categories and generalize across exemplars within each category. In study 1, human or ape faces were presented as standard and deviant stimuli in upright and inverted trials. Upright ape faces presented among humans elicited strong categorization responses, whereas responses for upright human faces and for inverted ape faces were smaller. Deviant inverted human faces did not elicit categorization. Data were best explained by a model with main effects of species and orientation. However, variance of low-level image characteristics was higher for the ape than the human category. Variance was matched to replicate this finding in an independent sample (study 2). Both human and ape faces elicited categorization in upright and inverted conditions, but upright ape faces elicited the strongest responses. Again, data were best explained by a model of two main effects. These experiments demonstrate that 9-month-olds rapidly categorize faces, and unfamiliar faces presented among human faces elicit increased categorization responses. This likely reflects habituation for the familiar standard category, and stronger release for the unfamiliar category deviants.

  5. Effects of mora deletion, nonword repetition, rapid naming, and visual search performance on beginning reading in Japanese.

    Science.gov (United States)

    Kobayashi, Maya Shiho; Haynes, Charles W; Macaruso, Paul; Hook, Pamela E; Kato, Junko

    2005-06-01

    This study examined the extent to which mora deletion (phonological analysis), nonword repetition (phonological memory), rapid automatized naming (RAN), and visual search abilities predict reading in Japanese kindergartners and first graders. Analogous abilities have been identified as important predictors of reading skills in alphabetic languages like English. In contrast to English, which is based on grapheme-phoneme relationships, the primary components of Japanese orthography are two syllabaries, hiragana and katakana (collectively termed "kana"), and a system of morphosyllabic symbols (kanji). Three RAN tasks (numbers, objects, syllabary symbols [hiragana]) were used with kindergartners, with an additional kanji RAN task included for first graders. Reading measures included accuracy and speed of passage reading for kindergartners and first graders, and reading comprehension for first graders. In kindergartners, hiragana RAN and number RAN were the only significant predictors of reading accuracy and speed. In first graders, kanji RAN and hiragana RAN predicted reading speed, whereas accuracy was predicted by mora deletion. Reading comprehension was predicted by kanji RAN, mora deletion, and nonword repetition. Although number RAN did not contribute unique variance to any reading measure, it correlated highly with kanji RAN. Implications of these findings for research and practice are discussed.

  6. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  7. Indoor radon and earthquake

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time, on the basis of experience from the Spitak earthquake (Armenia, December 1988), it is found that an earthquake causes intensive and prolonged radon splashes which, rapidly dispersing in the open near-ground atmosphere, are contrastingly displayed in covered premises (dwellings, schools, kindergartens) even at considerable distance from the earthquake epicenter, and this multiplies the radiation influence on the population. The interval of splashes spans the period from the first foreshock to the last aftershock, i.e. several months. The area affected by radiation is larger than the territory of Armenia. The scale of this impact on the population is 12 times higher than the number of people injured in Spitak, Leninakan and other settlements (toll of injured: 25,000 people; radiation-induced diseases: over 300,000 people). The influence of radiation directly correlates with the earthquake force. Such a conclusion is underpinned by indoor radon monitoring data for Yerevan (120 km from the epicenter) since 1987 (5,450 measurements) and by multivariate analysis identifying cause-and-effect linkages between the geodynamics of indoor radon under stable and unstable conditions of the Earth's crust, the behavior of radon in different geological media during earthquakes, levels of indoor radon concentration and effective equivalent dose, the impact of radiation dose on health, and statistical data on public health provided by the Ministry of Health. The following hitherto unexplained facts can be considered consequences of prolonged radiation influence on the human organism: the long-lasting state of apathy and indifference typical of the population of Armenia for more than a year after the earthquake, the prevalence of malignant cancer forms in disaster zones, dominated by lung cancer, and so on. All urban territories of seismically active regions are exposed to the threat of natural, earthquake-provoked radiation influence

  8. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  9. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    International Nuclear Information System (INIS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-01-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three criteria: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, the method achieves more scientific and accurate results than the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing

  10. Rapid exposure and loss estimates for the May 12, 2008 Mw 7.9 Wenchuan earthquake provided by the U.S. Geological Survey's PAGER system

    Science.gov (United States)

    Earle, P.S.; Wald, D.J.; Allen, T.I.; Jaiswal, K.S.; Porter, K.A.; Hearne, M.G.

    2008-01-01

    One half-hour after the May 12th Mw 7.9 Wenchuan, China earthquake, the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) system distributed an automatically generated alert stating that 1.2 million people were exposed to severe-to-extreme shaking (Modified Mercalli Intensity VIII or greater). It was immediately clear that a large-scale disaster had occurred. These alerts were widely distributed and referenced by the major media outlets and used by government, scientific, and relief agencies to guide their responses. The PAGER alerts and Web pages included predictive ShakeMaps showing estimates of ground shaking, maps of population density, and a list of estimated intensities at impacted cities. Manual, revised alerts were issued in the following hours that included the dimensions of the fault rupture. Within a half-day, PAGER’s estimates of the population exposed to strong shaking levels stabilized at 5.2 million people. A coordinated research effort is underway to extend PAGER’s capability to include estimates of the number of casualties. We are pursuing loss models that will allow PAGER the flexibility to use detailed inventory and engineering results in regions where these data are available while also calculating loss estimates in regions where little is known about the type and strength of the built infrastructure. Prototype PAGER fatality estimates are currently implemented and can be manually triggered. In the hours following the Wenchuan earthquake, these models predicted fatalities in the tens of thousands.

  11. Functional Activation during the Rapid Visual Information Processing Task in a Middle Aged Cohort: An fMRI Study.

    Science.gov (United States)

    Neale, Chris; Johnston, Patrick; Hughes, Matthew; Scholey, Andrew

    2015-01-01

    The Rapid Visual Information Processing (RVIP) task, a serial discrimination task in which performance is believed to reflect sustained attention capabilities, is widely used in behavioural research and increasingly in neuroimaging studies. To date, functional neuroimaging research into the RVIP has been undertaken using block analyses, reflecting the sustained processing involved in the task, but not necessarily the transient processes associated with individual trial performance. Furthermore, this research has been limited to young cohorts. This study assessed the behavioural and functional magnetic resonance imaging (fMRI) outcomes of the RVIP task using both block and event-related analyses in a healthy middle-aged cohort (mean age = 53.56 years, n = 16). The results show that the version of the RVIP used here is sensitive to changes in attentional demand, with participants achieving a 43% hit rate in the experimental task compared with 96% accuracy in the control task. As shown by previous research, the block analysis revealed an increase in activation in a network of frontal, parietal, occipital and cerebellar regions. The event-related analysis showed a similar network of activation, seemingly omitting regions involved in the processing of the task (as shown in the block analysis), such as occipital areas and the thalamus, providing an indication of a network of regions involved in correct trial performance. Frontal (superior and inferior frontal gyri), parietal (precuneus, inferior parietal lobe) and cerebellar regions were shown to be active in both the block and event-related analyses, suggesting their importance in sustained attention/vigilance. These networks and the differences between them are discussed in detail, as well as implications for future research in middle-aged cohorts.

  12. Functional Activation during the Rapid Visual Information Processing Task in a Middle Aged Cohort: An fMRI Study.

    Directory of Open Access Journals (Sweden)

    Chris Neale

    Full Text Available The Rapid Visual Information Processing (RVIP) task, a serial discrimination task in which performance is believed to reflect sustained attention capabilities, is widely used in behavioural research and increasingly in neuroimaging studies. To date, functional neuroimaging research into the RVIP has been undertaken using block analyses, reflecting the sustained processing involved in the task, but not necessarily the transient processes associated with individual trial performance. Furthermore, this research has been limited to young cohorts. This study assessed the behavioural and functional magnetic resonance imaging (fMRI) outcomes of the RVIP task using both block and event-related analyses in a healthy middle-aged cohort (mean age = 53.56 years, n = 16). The results show that the version of the RVIP used here is sensitive to changes in attentional demand, with participants achieving a 43% hit rate in the experimental task compared with 96% accuracy in the control task. As shown by previous research, the block analysis revealed an increase in activation in a network of frontal, parietal, occipital and cerebellar regions. The event-related analysis showed a similar network of activation, seemingly omitting regions involved in the processing of the task (as shown in the block analysis), such as occipital areas and the thalamus, providing an indication of a network of regions involved in correct trial performance. Frontal (superior and inferior frontal gyri), parietal (precuneus, inferior parietal lobe) and cerebellar regions were shown to be active in both the block and event-related analyses, suggesting their importance in sustained attention/vigilance. These networks and the differences between them are discussed in detail, as well as implications for future research in middle-aged cohorts.

  13. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    and kurtosis both tend to zero for large earthquake rates: for the Gaussian law, these values are identically zero. A calculation of the NBD skewness and kurtosis levels, based on the values of the first two statistical moments of the distribution, shows a rapid increase in these higher moments. However, the observed catalogue values of skewness and kurtosis rise even faster. This means that for small time intervals, the earthquake number distribution is even more heavy-tailed than the NBD predicts. Therefore, for small time intervals, we propose using empirical number distributions, appropriately smoothed, for testing forecasted earthquake numbers.
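    The moment-based calculation mentioned in the abstract can be illustrated with a minimal sketch. It assumes the standard negative binomial parameterization (size r, success probability p), with r and p recovered from the first two moments; the parameterization and the closed-form skewness/kurtosis expressions are textbook results, not taken from the paper itself:

```python
import math

def nbd_shape_from_moments(mean, var):
    """Skewness and excess kurtosis of a negative binomial distribution
    implied by its first two moments (requires var > mean, i.e. the
    overdispersion that motivates the NBD over the Poisson law)."""
    p = mean / var                  # success probability
    r = mean ** 2 / (var - mean)    # size (dispersion) parameter
    skew = (2 - p) / math.sqrt(r * (1 - p))
    excess_kurtosis = 6 / r + p ** 2 / (r * (1 - p))
    return skew, excess_kurtosis

# As the rate grows with var/mean held fixed, both shape measures
# shrink toward the Gaussian values of zero:
print(nbd_shape_from_moments(10.0, 20.0))
print(nbd_shape_from_moments(1000.0, 2000.0))
```

    Running the two calls shows the decay toward zero that the abstract describes for large earthquake rates.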

  14. Visualization of strong ground motion calculated from the numerical simulation of the Hyogo-ken Nanbu earthquake; Suchi simulation de miru Hyogoken nanbu jishin no kyoshindo

    Energy Technology Data Exchange (ETDEWEB)

    Furumura, T [Hokkaido Univ. of Education, Sapporo (Japan); Koketsu, K [The University of Tokyo, Tokyo (Japan). Earthquake Research Institute

    1996-10-01

    The Hyogo-ken Nanbu earthquake, with its focus in the Akashi Strait, caused enormous damage in and around Awaji Island and Kobe City in 1995. It is clear that the basement structure, which deepens steeply from the Rokko Mountains towards the coast at Kobe City, and the focus beneath it are closely related to the local generation of strong ground motion. The generation process of the strong ground motion was discussed using 2D and 3D numerical simulation methods. The 3D pseudospectral method was used for the calculation. A space of 51.2 km × 25.6 km × 25.6 km was selected for the calculation. This space was discretized with a lattice interval of 200 m. Consequently, it was found that the steeply deepening basement, the thick deposits of soft, weak sediments above it, and earthquake faults running along the boundary between basement rock and sediments contributed greatly to the generation of strong ground motion. Numerical simulation can be expected to predict the strong ground motion of shallow earthquakes. 9 refs., 7 figs.
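    As a quick back-of-the-envelope check (not from the paper's own code), the stated domain and lattice interval imply the following grid size:

```python
# Grid size implied by a 51.2 km x 25.6 km x 25.6 km domain
# discretized with a 200 m lattice interval.
domain_km = (51.2, 25.6, 25.6)
dx_km = 0.2
nx, ny, nz = (round(d / dx_km) for d in domain_km)
print(nx, ny, nz)       # lattice intervals along each axis
print(nx * ny * nz)     # total cells in the discretized volume
```

    This yields a 256 × 128 × 128 lattice, roughly 4.2 million cells, a substantial 3D computation for 1996.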

  15. Blindness and Visual Impairment Profile and Rapid Assessment of Avoidable Blindness in South East Asia: Analysis of New Data. 2017 APAO Holmes Lecture.

    Science.gov (United States)

    Das, Taraprasad

    2018-03-13

    The International Agency for Prevention of Blindness (IAPB) South East Asia region (SEAR) that consists of 11 countries contains 26% of the world's population (1,761,000,000). In this region 12 million are blind and 78.5 million are visually impaired. This amounts to 30% of global blindness and 32% of global visual impairment. Rapid assessment of avoidable blindness (RAAB) survey analysis. RAAB, either a repeat or a first time survey, was completed in 8 countries in this decade (2010 onwards). These include Bangladesh, Bhutan, India, Indonesia, Maldives, Sri Lanka, Thailand, and Timor Leste. Cataract is the principal cause of blindness and severe visual impairment in all countries. Refractive error is the principal cause of moderate visual impairment in 4 countries: Bangladesh, India, Maldives, and Sri Lanka; cataract continues to be the principal cause of moderate visual impairment in 4 other countries: Bhutan, Indonesia, Thailand, and Timor Leste. Outcome of cataract surgery is suboptimal in the Maldives and Timor Leste. Rigorous focus is necessary to improve cataract surgery outcomes and correction of refractive error without neglecting the quality of care. At the same time allowances must be made for care of the emerging causes of visual impairment and blindness such as glaucoma and posterior segment disorders, particularly diabetic retinopathy. Copyright 2018 Asia-Pacific Academy of Ophthalmology.

  16. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  17. Earthquake Facts

    Science.gov (United States)

    ... North Dakota, and Wisconsin. The core of the earth was the first internal structural element to be identified. In 1906 R.D. Oldham discovered it from his studies of earthquake records. The inner core is solid, and the outer core is liquid and so does not transmit ...

  18. Understanding Earthquakes

    Science.gov (United States)

    Davis, Amanda; Gray, Ron

    2018-01-01

    December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

  19. A smartphone application for earthquakes that matter!

    Science.gov (United States)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever they are, users can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications are related to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter the most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre). Potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment) developed and operated at EMSC. This rapidly assesses earthquake impact by comparing the population exposed to each expected

  20. Rapid adaptation to a novel light environment: The importance of ontogeny and phenotypic plasticity in shaping the visual system of Nicaraguan Midas cichlid fish (Amphilophus citrinellus spp.).

    Science.gov (United States)

    Härer, Andreas; Torres-Dowdall, Julián; Meyer, Axel

    2017-10-01

    Colonization of novel habitats is typically challenging to organisms. In the initial stage after colonization, approximation to fitness optima in the new environment can occur by selection acting on standing genetic variation, modification of developmental patterns or phenotypic plasticity. Midas cichlids have recently colonized crater Lake Apoyo from great Lake Nicaragua. The photic environment of crater Lake Apoyo is shifted towards shorter wavelengths compared to great Lake Nicaragua and Midas cichlids from both lakes differ in visual sensitivity. We investigated the contribution of ontogeny and phenotypic plasticity in shaping the visual system of Midas cichlids after colonizing this novel photic environment. To this end, we measured cone opsin expression both during development and after experimental exposure to different light treatments. Midas cichlids from both lakes undergo ontogenetic changes in cone opsin expression, but visual sensitivity is consistently shifted towards shorter wavelengths in crater lake fish, which leads to a paedomorphic retention of their visual phenotype. This shift might be mediated by lower levels of thyroid hormone in crater lake Midas cichlids (measured indirectly as dio2 and dio3 gene expression). Exposing fish to different light treatments revealed that cone opsin expression is phenotypically plastic in both species during early development, with short and long wavelength light slowing or accelerating ontogenetic changes, respectively. Notably, this plastic response was maintained into adulthood only in the derived crater lake Midas cichlids. We conclude that the rapid evolution of Midas cichlids' visual system after colonizing crater Lake Apoyo was mediated by a shift in visual sensitivity during ontogeny and was further aided by phenotypic plasticity during development. © 2017 John Wiley & Sons Ltd.

  1. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
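    The detection procedure described in the abstract, a short-term-average/long-term-average (STA/LTA) trigger scanned over a tweet-frequency time series, can be sketched as follows. The window lengths and trigger threshold below are illustrative assumptions, not the USGS operating values:

```python
# Sketch of an STA/LTA trigger on per-minute counts of tweets
# containing the word "earthquake". Parameters are assumptions.
def sta_lta_detect(counts, sta_len=2, lta_len=60, threshold=5.0):
    """Return indices where the short-term average of the count series
    exceeds `threshold` times the long-term average."""
    triggers = []
    for i in range(lta_len, len(counts)):
        sta = sum(counts[i - sta_len:i]) / sta_len
        lta = sum(counts[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# A quiet background followed by a sudden burst of "earthquake" tweets:
series = [1, 0, 2, 1, 1] * 12 + [40, 80, 60]
print(sta_lta_detect(series))  # → [61, 62]
```

    The ratio form makes the trigger adaptive to the background tweet rate, which is why, as the abstract notes, it can run at moderate sensitivity with very few false triggers.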

  2. Earthquake data base for Romania

    International Nuclear Information System (INIS)

    Rizescu, M.; Ghica, D.; Grecu, B.; Popa, M.; Borcia, I. S.

    2002-01-01

    A new earthquake database for Romania is being constructed, comprising complete earthquake information that is up-to-date, user-friendly and rapidly accessible. One main component of the database consists of the catalog of earthquakes that occurred in Romania from 984 up to the present. The catalog contains information related to locations and other source parameters, when available, and links to waveforms of important earthquakes. The other very important component is the 'strong motion database', developed for strong intermediate-depth Vrancea earthquakes for which instrumental data were recorded. Different parameters characterizing strong motion properties, such as effective peak acceleration, effective peak velocity, corner periods Tc and Td, and global response-spectrum-based intensities, were computed and recorded in this database. Also included is information on the recording seismic stations, such as maps giving their positions, photographs of the instruments, and site conditions ('free-field' or on buildings). Through the huge volume and quality of the gathered data, and through its friendly user interface, the Romanian earthquake database provides a very useful tool for geosciences and civil engineering in their effort towards reducing seismic risk in Romania. (authors)

  3. The Observation of Fault Finiteness and Rapid Velocity Variation in Pnl Waveforms for the Mw 6.5, San Simeon, California Earthquake

    Science.gov (United States)

    Konca, A. O.; Ji, C.; Helmberger, D. V.

    2004-12-01

    We observed the effect of fault finiteness in the Pnl waveforms at regional distances (4° to 12°) for the Mw 6.5 San Simeon earthquake of 22 December 2003. We aimed to include more of the high frequencies (periods of 2 seconds and longer) than the studies that use regional data for focal solutions (periods of 5 to 8 seconds and longer). We calculated 1-D synthetic seismograms for the Pnl portion for both a point source and a finite fault solution. The comparison of the point source and finite fault waveforms with data shows that the first several seconds of the point source synthetics have considerably higher amplitude than the data, while the finite fault does not have a similar problem. This can be explained by reversely polarized depth phases overlapping with the P waves from the later portion of the fault, causing smaller amplitudes in the beginning portion of the seismogram. This is clearly a finite fault phenomenon and therefore cannot be explained by point source calculations. Moreover, the point source synthetics, which are calculated with a focal solution from a long-period regional inversion, overestimate the amplitude by three to four times relative to the data, while the finite fault waveforms have amplitudes similar to the data. Hence, a moment estimate based only on the point source solution of the regional data could have been wrong by half a magnitude unit. We have also calculated the shifts of synthetics relative to data required to fit the seismograms. Our results reveal that paths from Central California to the south are faster than paths to the east and north. The P wave arrival at the TUC station in Arizona is 4 seconds earlier than predicted by the Southern California model, while most stations to the east are delayed by around 1 second. The observed higher uppermost-mantle velocities to the south are consistent with some recent tomographic models. Synthetics generated with these models significantly improve the fits and the

  4. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  5. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the Century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the 2010 M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. 
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  6. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan's unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from 1901 to 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  7. Consciousness wanted, attention found: Reasons for the advantage of the left visual field in identifying T2 among rapidly presented series.

    Science.gov (United States)

    Verleger, Rolf; Śmigasiewicz, Kamila

    2015-09-01

    Everyday experience suggests that people are equally aware of events in both hemifields. However, when two streams of stimuli containing two targets are rapidly presented left and right, the second target is better identified in the left than in the right visual field. This might be considered evidence for a right-hemisphere advantage in generating conscious percepts. However, this putative asymmetry of conscious perception cannot be measured independently of participants' access to their conscious percepts, and there is actually evidence from split-brain patients for the reverse, a left-hemisphere advantage in having access to conscious percepts. Several other topics were studied in search of the responsible mechanism, among them mutual inhibition of the hemispheres, cooperation of the hemispheres in perceiving midline stimuli, and asymmetries in processing various perceptual inputs. Directing attention by salient cues turned out to be one of the few mechanisms capable of modifying the left visual-field advantage in this paradigm. Thus, this left visual-field advantage is best explained by the notion of a right-hemisphere advantage in directing attention to salient events. Dovetailing with the pathological asymmetries of attention after right-hemisphere lesions and with asymmetries of brain activation when healthy participants shift their attention, the present results extend that body of evidence by demonstrating unusually large and reliable behavioral asymmetries for attention-directing processes in healthy participants. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. Rapid visualization of fingerprints on various surfaces using ZnO superstructures prepared via simple combustion route

    Directory of Open Access Journals (Sweden)

    N.H. Deepthi

    2018-03-01

    A simple solution combustion route has been used to prepare ZnO nanopowders (NPs) using different barbiturates (barbituric acid, 1,3-dimethyl barbiturates and 2-thiobarbiturates) as fuels. The obtained product was characterized by powder X-ray diffraction (PXRD), scanning electron microscopy (SEM), ultraviolet-visible spectroscopy (UV-Vis) and photoluminescence (PL). The PXRD results confirm the hexagonal phase of the material. The detailed structural analysis was performed by the Rietveld refinement method. The energy band gap of the NPs is found to be in the range of 3.31-3.49 eV. The growth mechanism for the formation of 3D micro-architectures is discussed in detail. The PL emission spectrum shows a broad emission peak at 502 nm upon 406 nm excitation. The ZnO NPs can be used for the visualization of latent fingerprints (LFPs) under normal light on various porous and non-porous surfaces. The visualized LFPs are found to be excellent compared to those obtained with commercially available powders. Keywords: Zinc oxide, Barbiturates, Photoluminescence, Latent fingerprint

  9. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average/long-term-average (STA/LTA) algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provide very short first-impression narratives from people who experienced the shaking.
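    An STA/LTA trigger of the kind this abstract describes can be sketched in a few lines over a tweet-frequency series. This is a minimal illustration: the bin size, window lengths, and threshold below are assumptions chosen for demonstration, not the USGS operational settings.

    ```python
    import numpy as np

    def sta_lta_detect(counts, sta_win=2, lta_win=20, threshold=5.0):
        """Return indices where the short-term average of a tweet-frequency
        series exceeds `threshold` times its long-term average.

        counts    : tweets per time bin (e.g. per 5 s)
        sta_win   : short-term window length, in bins
        lta_win   : long-term window length, in bins
        threshold : trigger ratio; higher means fewer false triggers
        """
        counts = np.asarray(counts, dtype=float)
        triggers = []
        for i in range(lta_win, len(counts)):
            sta = counts[i - sta_win:i].mean()   # recent activity
            lta = counts[i - lta_win:i].mean()   # background activity
            if lta > 0 and sta / lta >= threshold:
                triggers.append(i)
        return triggers

    # Quiet background, then a burst of "earthquake" tweets after a felt event.
    series = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1,
              0, 1, 1, 0, 1, 1, 0, 1, 1, 0,
              30, 60, 45]
    print(sta_lta_detect(series))  # the burst bins trigger: [21, 22]
    ```

    Raising `threshold` trades missed events for fewer false triggers, which mirrors the sensitivity tuning the abstract reports.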

  10. Rapid and low-invasive functional brain mapping by realtime visualization of high gamma activity for awake craniotomy.

    Science.gov (United States)

    Kamada, K; Ogawa, H; Kapeller, C; Prueckl, R; Guger, C

    2014-01-01

    For neurosurgery with an awake craniotomy, the critical issue is to set aside enough time to identify eloquent cortices by electrocortical stimulation (ECS). High gamma activity (HGA) ranging between 80 and 120 Hz on electrocorticogram (ECoG) is assumed to reflect localized cortical processing. In this report, we used realtime HGA mapping and functional magnetic resonance imaging (fMRI) for rapid and reliable identification of motor and language functions. Three patients with intra-axial tumors in their dominant hemisphere underwent preoperative fMRI and lesion resection with an awake craniotomy. All patients showed significant fMRI activation evoked by motor and language tasks. After the craniotomy, we recorded ECoG activity by placing subdural grids directly on the exposed brain surface. Each patient performed motor and language tasks and demonstrated realtime HGA dynamics in hand motor areas and parts of the inferior frontal gyrus. Sensitivity and specificity of HGA mapping were 100% compared to ECS mapping in the frontal lobe, which suggested HGA mapping precisely indicated eloquent cortices. The investigation time for HGA mapping was significantly shorter than that for ECS mapping. Specificities of the motor and language fMRI, however, did not reach 85%. The results of HGA mapping were mostly consistent with those of ECS mapping, although fMRI tended to overestimate functional areas. This novel technique enables rapid and accurate functional mapping.

  11. Application of τc*Pd in earthquake early warning

    Science.gov (United States)

    Huang, Po-Lun; Lin, Ting-Li; Wu, Yih-Min

    2015-03-01

    Rapid assessment of the damage potential and size of an earthquake at a recording station is in high demand for onsite earthquake early warning. We study the application of τc*Pd to estimating earthquake size, using 123 events recorded by the borehole stations of KiK-net in Japan. The measure of earthquake size determined by τc*Pd is more closely related to damage potential. We find that τc*Pd provides another parameter for measuring the size of an earthquake and a threshold for warning of strong ground motion.
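    As a sketch of how these two onsite parameters are commonly computed from the initial seconds of the P wave: τc = 2π/√r, where r is the ratio of the integrated squared ground velocity to the integrated squared displacement over the initial window, and Pd is the peak absolute displacement in the same window. The 3 s window and variable names below are illustrative assumptions, not the paper's exact processing.

    ```python
    import numpy as np

    def tau_c_pd(displacement, dt, window=3.0):
        """Onsite EEW parameters from the first `window` seconds of P-wave
        displacement: tau_c = 2*pi / sqrt(r), with
        r = sum(v**2) / sum(d**2) (the common dt factors cancel), and
        Pd = peak absolute displacement in the window."""
        n = int(window / dt)
        d = np.asarray(displacement[:n], dtype=float)
        v = np.gradient(d, dt)  # ground velocity by numerical differentiation
        r = (v ** 2).sum() / (d ** 2).sum()
        tau_c = 2.0 * np.pi / np.sqrt(r)
        pd = np.abs(d).max()
        return tau_c, pd

    # Synthetic check: a 1 Hz, 1 cm sinusoid should recover tau_c ~ 1 s, Pd ~ 0.01 m.
    dt = 0.01
    t = np.arange(0.0, 3.0, dt)
    tau_c, pd = tau_c_pd(0.01 * np.sin(2 * np.pi * t), dt)
    print(round(tau_c, 2), round(pd, 3))
    ```

    The product τc*Pd then combines a period measure (sensitive to event size) with an amplitude measure (sensitive to local shaking), which is why it tracks damage potential.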

  12. Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT): Towards the Next Generation of Internship

    Science.gov (United States)

    Perry, S.; Benthien, M.; Jordan, T. H.

    2005-12-01

    The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), 3D visualization software that was prototyped by interns the previous year, using Java3D and an extensible plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

  13. Seismicity map tools for earthquake studies

    Science.gov (United States)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools for use within Google Maps for earthquake research. We demonstrate this server-based online platform (developed with PHP, JavaScript and MySQL) and its new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data on Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw line segments on the map, multiple straight lines horizontally and vertically, as well as multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake cluster shift between segments in space. The platform offers many filters, such as plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the b-value, etc. What is novel about the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools, we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as the platform allows calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
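    The b-value calculation mentioned above is commonly done with the Aki maximum-likelihood estimator, with Utsu's correction for magnitude binning. A minimal sketch follows; the synthetic catalog, completeness magnitude, and bin width are illustrative assumptions, not values from the platform.

    ```python
    import math

    def b_value(magnitudes, mc, bin_width=0.1):
        """Gutenberg-Richter b-value by the Aki (1965) maximum-likelihood
        estimator with Utsu's bin-width correction:
            b = log10(e) / (mean(M) - (Mc - bin_width / 2))
        Only events at or above the completeness magnitude Mc are used."""
        mags = [m for m in magnitudes if m >= mc]
        if not mags:
            raise ValueError("no events at or above Mc")
        mean_m = sum(mags) / len(mags)
        return math.log10(math.e) / (mean_m - (mc - bin_width / 2.0))

    # Small synthetic catalog, assumed complete above Mc = 3.0.
    catalog = [3.0, 3.1, 3.2, 3.5, 4.0, 3.3, 3.4, 3.6, 3.05, 3.2]
    print(round(b_value(catalog, mc=3.0), 2))  # a b-value near the typical ~1
    ```

    Real catalog work additionally requires estimating Mc itself (e.g. from the magnitude-frequency histogram the platform plots) before the estimator is meaningful.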

  14. Rapid visual detection of quaternary ammonium surfactants using citrate-capped silver nanoparticles (Ag NPs) based on hydrophobic effect.

    Science.gov (United States)

    Zheng, Li-Qing; Yu, Xiao-Dong; Xu, Jing-Juan; Chen, Hong-Yuan

    2014-01-01

    In this work, a rapid, sensitive and low-cost colorimetric method for the detection of quaternary ammonium surfactants using citrate-capped silver nanoparticles (Ag NPs) was developed. The quaternary ammonium surfactants induce the aggregation of Ag NPs through the hydrophobic effect, which is a novel aggregation mechanism for Ag NPs. The addition of cationic surfactant changes the color of the Ag NP solution from yellow to red and finally to colorless, owing to the broadening of the surface plasmon band. The color change was monitored using a UV-vis spectrophotometer. The limit of detection (LOD) for different cationic surfactants was in the range of 0.5-5 µM. More importantly, this detection method was successfully applied to a disinfectant residue sample. Crown Copyright © 2013 Published by Elsevier B.V. All rights reserved.

  15. Rapid and minimum invasive functional brain mapping by real-time visualization of high gamma activity during awake craniotomy.

    Science.gov (United States)

    Ogawa, Hiroshi; Kamada, Kyousuke; Kapeller, Christoph; Hiroshima, Satoru; Prueckl, Robert; Guger, Christoph

    2014-11-01

    Electrocortical stimulation (ECS) is the gold standard for functional brain mapping during an awake craniotomy. The critical issue is to set aside enough time to identify eloquent cortices by ECS. High gamma activity (HGA) ranging between 80 and 120 Hz on electrocorticogram is assumed to reflect localized cortical processing. In this report, we used real-time HGA mapping and functional neuronavigation integrated with functional magnetic resonance imaging (fMRI) for rapid and reliable identification of motor and language functions. Four patients with intra-axial tumors in their dominant hemisphere underwent preoperative fMRI and lesion resection with an awake craniotomy. All patients showed significant fMRI activation evoked by motor and language tasks. During the craniotomy, we recorded electrocorticogram activity by placing subdural grids directly on the exposed brain surface. Each patient performed motor and language tasks and demonstrated real-time HGA dynamics in hand motor areas and parts of the inferior frontal gyrus. Sensitivity and specificity of HGA mapping were 100% compared with ECS mapping in the frontal lobe, which suggested HGA mapping precisely indicated eloquent cortices. We found different HGA dynamics of language tasks in frontal and temporal regions. Specificities of the motor and language fMRI did not reach 85%. The results of HGA mapping were mostly consistent with those of ECS mapping, although fMRI tended to overestimate functional areas. This novel technique enables rapid and accurate identification of motor and frontal language areas. Furthermore, real-time HGA mapping sheds light on underlying physiological mechanisms related to human brain functions. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Development and Application of Loop-Mediated Isothermal Amplification Assays for Rapid Visual Detection of cry2Ab and cry3A Genes in Genetically-Modified Crops

    Directory of Open Access Journals (Sweden)

    Feiwu Li

    2014-08-01

    The cry2Ab and cry3A genes are two of the most important insect-resistant exogenous genes and have been widely used in genetically-modified crops. To develop more effective alternatives for the quick identification of genetically-modified organisms (GMOs) containing these genes, a rapid and visual loop-mediated isothermal amplification (LAMP) method to detect the cry2Ab and cry3A genes is described in this study. The LAMP assay can be finished within 60 min at an isothermal condition of 63 °C. The derived LAMP products can be obtained by a real-time turbidimeter via monitoring the white turbidity or directly observed by the naked eye through adding SYBR Green I dye. The specificity of the LAMP assay was determined by analyzing thirteen insect-resistant genetically-modified (GM) crop events with different Bt genes. Furthermore, the sensitivity of the LAMP assay was evaluated by diluting the template genomic DNA. Results showed that the limit of detection of the established LAMP assays was approximately five copies of haploid genomic DNA, about five-fold greater than that of conventional PCR assays. All of the results indicated that this established rapid and visual LAMP assay was quick, accurate and cost effective, with high specificity and sensitivity. In addition, this method does not need specific expensive instruments or facilities, which can provide a simpler and quicker approach to detecting the cry2Ab and cry3A genes in GM crops, especially for on-site, large-scale test purposes in the field.

  17. Development and application of loop-mediated isothermal amplification assays for rapid visual detection of cry2Ab and cry3A genes in genetically-modified crops.

    Science.gov (United States)

    Li, Feiwu; Yan, Wei; Long, Likun; Qi, Xing; Li, Congcong; Zhang, Shihong

    2014-08-27

    The cry2Ab and cry3A genes are two of the most important insect-resistant exogenous genes and have been widely used in genetically-modified crops. To develop more effective alternatives for the quick identification of genetically-modified organisms (GMOs) containing these genes, a rapid and visual loop-mediated isothermal amplification (LAMP) method to detect the cry2Ab and cry3A genes is described in this study. The LAMP assay can be finished within 60 min at an isothermal condition of 63 °C. The derived LAMP products can be obtained by a real-time turbidimeter via monitoring the white turbidity or directly observed by the naked eye through adding SYBR Green I dye. The specificity of the LAMP assay was determined by analyzing thirteen insect-resistant genetically-modified (GM) crop events with different Bt genes. Furthermore, the sensitivity of the LAMP assay was evaluated by diluting the template genomic DNA. Results showed that the limit of detection of the established LAMP assays was approximately five copies of haploid genomic DNA, about five-fold greater than that of conventional PCR assays. All of the results indicated that this established rapid and visual LAMP assay was quick, accurate and cost effective, with high specificity and sensitivity. In addition, this method does not need specific expensive instruments or facilities, which can provide a simpler and quicker approach to detecting the cry2Ab and cry3A genes in GM crops, especially for on-site, large-scale test purposes in the field.

  18. Rapid and Sensitive Detection of sFAT-1 Transgenic Pigs by Visual Loop-Mediated Isothermal Amplification.

    Science.gov (United States)

    Tao, Chenyu; Yang, Yalan; Li, Xunbi; Zheng, Xinmin; Ren, Hongyan; Li, Kui; Zhou, Rong

    2016-07-01

    Genetically modified (GM) livestock have the potential to contribute to improving the environment and human health, with consumption of fewer resources and reduced waste production. However, the transgenic process also poses risks. The safety assessment and control of transgenic animal products have drawn wide attention, and the relevant regulations and technology are being developed. Quick testing technology plays a significant role in on-site and customs sampling. Nowadays, loop-mediated isothermal amplification (LAMP) is widely applied in nucleic acid analysis because of its simplicity, rapidity, high efficiency and specificity. In this study, a specific, sensitive detection system for sFAT-1 transgenic pigs was designed. A set of six primers, including two loop primers, was designed for the target sequence. The DNA samples were amplified in less than 1 h at the optimized temperature and detected both with a Nephelometer LA-320c and by the unaided eye after directly adding calcein. The detection limit of the sFAT-1 LAMP was as low as 1.26 ng/μL. Furthermore, in blind tests, transgenic and non-transgenic DNA samples were all correctly identified. Hence, the results of this study demonstrate that LAMP is a very useful tool for transgenic detection.

  19. Instruction system upon occurrence of earthquakes

    International Nuclear Information System (INIS)

    Inagaki, Masakatsu; Morikawa, Matsuo; Suzuki, Satoshi; Fukushi, Naomi.

    1987-01-01

    Purpose: To enable rapid restarting of a nuclear reactor after an earthquake by informing operators of the properties of the encountered earthquake and displaying the state of damage in comparison with the design standard values of the facilities. Constitution: Even in a case where the maximum acceleration of an encountered earthquake exceeds the design standard values, the equipment may still remain intact, depending on the wave components of the seismic motion and the vibration properties inherent to the equipment. Taking note of this fact, the instruction device comprises a system that indicates the relationship between the seismic waveforms of the encountered earthquake and the scram setting values, a system that compares the floor response spectrum of the encountered seismic waveforms with the design floor response spectrum used in the design of the equipment, and a system that indicates the equipment requiring inspection after the earthquake. Accordingly, it is possible to improve operability upon scram of a nuclear power plant undergoing an earthquake and to improve power saving and safety by clearly defining the portions to be inspected after the earthquake. (Kawakami, Y.)

  20. A simple, rapid method to isolate salt glands for three-dimensional visualization, fluorescence imaging and cytological studies

    Directory of Open Access Journals (Sweden)

    Lim Tit-Meng

    2010-10-01

    Background: Some plants inhabiting saline environments remove salts via salt glands embedded in the epidermal tissues. Cytological studies of salt glands will provide valuable information for our understanding of the secretory process. Previous studies of salt gland histology relied mainly on two-dimensional microscopic observation of microtome sections. The optical sectioning properties of the confocal laser scanning microscope offer an alternative approach for obtaining three-dimensional structural information on salt glands. Difficulty in light penetration through intact leaves and interference from neighbouring leaf cells, however, impede the acquisition of good optical salt gland sections and limit applications in salt gland imaging. Freeing the glands from adjacent leaf tissues allows better manipulation for three-dimensional imaging through confocal laser scanning microscopy. Results: Here, we present a simple and fast method for the isolation of individual salt glands released from the interference of neighbouring cells. About 100-200 salt glands could be isolated from just one cm² of Avicennia officinalis leaf within hours, and microscopic visualization of the isolated salt glands was made possible within a day. Using these isolated glands, confocal laser scanning microscopic techniques could be applied and better-resolution salt gland images achieved. By making use of their intrinsic fluorescence, optical sections of the gland cells could be acquired without the use of fluorescent probes, and the corresponding three-dimensional images constructed. Useful cytological information on the salt gland cells could also be obtained through the application of fluorescent dyes (e.g., LysoTracker® Red, FM®4-64, Texas Red®). Conclusions: The study of salt glands directly at the glandular level is made possible by the successful isolation of these specialized structures. Preparation of materials for subsequent microscopic

  1. Earthquake Early Warning: A Prospective User's Perspective (Invited)

    Science.gov (United States)

    Nishenko, S. P.; Savage, W. U.; Johnson, T.

    2009-12-01

    With more than 25 million people at risk from high hazard faults in California alone, Earthquake Early Warning (EEW) presents a promising public safety and emergency response tool. EEW represents the real-time end of an earthquake information spectrum which also includes near real-time notifications of earthquake location, magnitude, and shaking levels; as well as geographic information system (GIS)-based products for compiling and visually displaying processed earthquake data such as ShakeMap and ShakeCast. Improvements to and increased multi-national implementation of EEW have stimulated interest in how such information products could be used in the future. Lifeline organizations, consisting of utilities and transportation systems, can use both onsite and regional EEW information as part of their risk management and public safety programs. Regional EEW information can provide improved situational awareness to system operators before automatic system protection devices activate, and allow trained personnel to take precautionary measures. On-site EEW is used for earthquake-actuated automatic gas shutoff valves, triggered garage door openers at fire stations, system controls, etc. While there is no public policy framework for preemptive, precautionary electricity or gas service shutdowns by utilities in the United States, gas shut-off devices are being required at the building owner level by some local governments. In the transportation sector, high-speed rail systems have already demonstrated the ‘proof of concept’ for EEW in several countries, and more EEW systems are being installed. Recently the Bay Area Rapid Transit District (BART) began collaborating with the California Integrated Seismic Network (CISN) and others to assess the potential benefits of EEW technology to mass transit operations and emergency response in the San Francisco Bay region. A key issue in this assessment is that significant earthquakes are likely to occur close to or within the BART

  2. Rapid, Robust Characterization of Subduction Zone Earthquakes

    Science.gov (United States)

    Irwin, Tisha Christine


  3. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for increasingly large scenarios. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical-modeling-based research tool we call the TeraShake computational platform (TSCP). A central component of the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP, and then integrated this code into the TeraShake computational platform, which provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP's parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  4. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first aired in spring 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and the Department of Homeland Security's Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU), on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. 
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  5. Disturbances in equilibrium function after major earthquake.

    Science.gov (United States)

    Honma, Motoyasu; Endo, Nobutaka; Osada, Yoshihisa; Kim, Yoshiharu; Kuriyama, Kenichi

    2012-01-01

    The major earthquake was followed by a large number of aftershocks, and significant outbreaks of dizziness occurred over a large area. However, it is unclear why a major earthquake causes dizziness. We conducted an intergroup trial on equilibrium dysfunction, and on the psychological states associated with it, in individuals exposed to repetitive aftershocks versus those who were rarely exposed. Greater equilibrium dysfunction was observed in the aftershock-exposed group under conditions without visual compensation. Equilibrium dysfunction in the aftershock-exposed group appears to have arisen from disturbance of the inner ear, as well as from individual vulnerability to state anxiety enhanced by repetitive exposure to aftershocks. We indicate potential effects of autonomic stress on equilibrium function after a major earthquake. Our findings may contribute to the risk management of psychological and physical health after major earthquakes with aftershocks, and allow development of a new empirical approach to disaster care after such events.

  6. Solar eruptions - soil radon - earthquakes

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time a new natural phenomenon was established: a contrasting increase in the soil radon level under the influence of solar flares. Such an increase is one of the geochemical indicators of earthquakes, which most researchers consider a phenomenon of exclusively terrestrial processes. Investigations of the link between earthquakes and solar activity carried out during the last decade in different countries are based on the analysis of statistical data ΣΕ (t) and W (t). It has been established that the overall seismicity of the Earth and of its separate regions depends on the 11-year cycle of solar activity. The experimental data provided in the paper serve as a first step toward revealing cause-and-effect solar-terrestrial links in the chain "solar eruption - soil radon - earthquake"; further collection of experimental data is needed. For the first time, the elementary lattice of the Hartmann network, previously contoured by the biolocation method, has been objectified through the radon constituent of terrestrial radiation. Radon concentration variations at the nodes of the Hartmann network were found to determine the dynamics of solar-terrestrial relationships. Of the three types of rapidly running processes conditioned by solar-terrestrial links, earthquakes belong to the rapidly running destructive processes, which occur most intensely at the junctions of tectonic massifs and along transform and deep faults. The basic factors provoking earthquakes are magnetic-structural effects and long-term (over 5 months) bombardment of the lithosphere surface by high-energy particles of corpuscular solar flows, as confirmed by photometry. Following the solar flares of 29 October to 4 November 2003, a sharply contrasting increase in soil radon, an earthquake indicator, was recorded on the territory of Yerevan City. A month and a half later, earthquakes occurred in San Francisco, Iran, and Turkey.

  7. Smartphone MEMS accelerometers and earthquake early warning

    Science.gov (United States)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    The low-cost MEMS accelerometers in smartphones are attracting more and more attention from the science community due to their vast numbers and potential applications in various areas. We are using the accelerometers inside smartphones to detect earthquakes. Shake-table tests show that these accelerometers are also suitable for recording the large shaking caused by earthquakes. We developed an Android app, MyShake, which can distinguish earthquake motion from daily human activities in the recordings made by the accelerometers in personal smartphones, and upload trigger information and waveforms to our server for further analysis. The data from these smartphones form a unique dataset for seismological applications such as earthquake early warning. In this talk I will lay out the method we use to recognize earthquake-like motion on a single smartphone, and give an overview of the whole system, which harnesses the information from a network of smartphones for rapid earthquake detection. This type of system can be easily deployed and scaled up around the globe and provides additional insight into earthquake hazards.
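
    MyShake's own classifier is a neural network trained to separate earthquake shaking from everyday motion. As a simpler, generic illustration of how earthquake-like onsets can be flagged in an acceleration stream, a classic STA/LTA (short-term average / long-term average) trigger is sketched below; the window lengths, threshold, and synthetic data are illustrative assumptions, not values from the talk.

```python
# Illustrative STA/LTA trigger for flagging earthquake-like onsets in an
# accelerometer stream. This is NOT the MyShake algorithm (which uses a
# neural-network classifier); window lengths and the threshold are
# arbitrary illustrative choices.

def sta_lta_trigger(samples, sta_len=50, lta_len=500, threshold=4.0):
    """Return sample indices where the STA/LTA energy ratio exceeds `threshold`."""
    triggers = []
    energy = [x * x for x in samples]            # work on signal energy
    for i in range(lta_len, len(samples)):
        sta = sum(energy[i - sta_len:i]) / sta_len
        lta = sum(energy[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers

# Synthetic stream: quiet background noise, then a sudden strong onset.
quiet = [0.01 * ((-1) ** n) for n in range(600)]
shaking = [0.5 * ((-1) ** n) for n in range(100)]
picks = sta_lta_trigger(quiet + shaking)
print(picks[0] if picks else "no trigger")  # first pick lands just after sample 600
```

    The trigger fires almost immediately after the strong shaking begins, because the short-term window responds to the jump in energy long before the long-term window does.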

  8. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  9. Ground water and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ts' ai, T H

    1977-11-01

    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water as well as changes in color, bubbling, gas emission, and noises and geysers are also often observed before earthquakes. Analysis of these features can help predict earthquakes. Other factors unrelated to earthquakes can cause some of these changes, too. As a first step it is necessary to find sites which are sensitive to changes in ground stress to be used as sensor points for predicting earthquakes. The necessary features are described. Recording of seismic waves of earthquake aftershocks is also an important part of earthquake predictions.

  10. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    Science.gov (United States)

    Wurman, G.; Price, M.

    2014-12-01

    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the Mw 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing firefighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the firefighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.
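
    The 1.5 to 2.5 seconds of warning at under 16 km is consistent with the travel-time gap between the P- and S-waves. A back-of-the-envelope check (the crustal velocities below are generic assumptions, not values from the abstract):

```python
# Back-of-the-envelope S-minus-P warning time at short epicentral distance.
# Vp ~ 6.0 km/s and Vs ~ 3.5 km/s are generic crustal assumptions, not
# values reported by the authors.
def sp_warning_time(distance_km, vp=6.0, vs=3.5):
    """Seconds between P-wave detection and S-wave arrival."""
    return distance_km / vs - distance_km / vp

print(f"{sp_warning_time(16.0):.1f} s")  # → 1.9 s
```

    At 16 km this gives roughly 1.9 s, in the middle of the 1.5-2.5 s range observed in Vallejo; the warning window shrinks linearly as the epicentral distance decreases.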

  11. Traffic Visualization

    DEFF Research Database (Denmark)

    Picozzi, Matteo; Verdezoto, Nervo; Pouke, Matti

    2013-01-01

    In this paper, we present a space-time visualization to provide city decision-makers with the ability to analyse and uncover important "city events" in an understandable manner for city planning activities. An interactive Web mashup visualization is presented that integrates several visualization techniques to give a rapid overview of traffic data. We illustrate our approach as a case study for traffic visualization systems, using datasets from the city of Oulu that can be extended to other city planning activities. We also report the feedback of real users (traffic management employees, traffic police...

  12. Detailed Visualization of Phase Evolution during Rapid Formation of Cu(InGa)Se2 Photovoltaic Absorber from Mo/CuGa/In/Se Precursors.

    Science.gov (United States)

    Koo, Jaseok; Kim, Sammi; Cheon, Taehoon; Kim, Soo-Hyun; Kim, Woo Kyoung

    2018-03-02

    Amongst the several processes that have been developed for the production of reliable chalcopyrite Cu(InGa)Se2 photovoltaic absorbers, the two-step metallization-selenization process is widely accepted as suitable for industrial-scale application. Here we visualize the detailed thermal behavior and reaction pathways of the constituent elements during commercially attractive rapid thermal processing of glass/Mo/CuGa/In/Se precursors, based on systematic characterization of samples obtained from a series of quenching experiments with set temperatures between 25 and 550 °C. It was confirmed that the Se layer crystallized and then melted between 250 and 350 °C, completely disappearing at 500 °C. The formation of CuInSe2 and Cu(InGa)Se2 was initiated at around 450 °C and 550 °C, respectively. It is suggested that a pre-heat treatment to control crystallization of the Se layer should be designed at 250-350 °C, and that Cu(InGa)Se2 formation from CuGa/In/Se precursors can be completed within a timeframe of 6 min.

  13. Rapid Visualization of Human Tumor Xenografts through Optical Imaging with a Near-Infrared Fluorescent Anti–Epidermal Growth Factor Receptor Nanobody

    Directory of Open Access Journals (Sweden)

    Sabrina Oliveira

    2012-01-01

    Given that overexpression of the epidermal growth factor receptor (EGFR) is found in many types of human epithelial cancers, noninvasive molecular imaging of this receptor is of great interest. A number of studies have employed monoclonal antibodies as probes; however, their characteristic long half-life in the bloodstream has encouraged the development of smaller probes. In this study, an anti-EGFR nanobody-based probe was developed and tested in comparison with cetuximab for application in optical molecular imaging. To this aim, the anti-EGFR nanobody 7D12 and cetuximab were conjugated to the near-infrared fluorophore IRDye800CW. 7D12-IR allowed the visualization of tumors as early as 30 minutes postinjection, whereas with cetuximab-IR, no signal above background was observed at the tumor site. Quantification of the IR-conjugated proteins in the tumors revealed ≈ 17% of injected dose per gram 2 hours after injection of 7D12-IR, which was significantly higher than the tumor uptake obtained 24 hours after injection of cetuximab-IR. This difference is associated with the superior penetration and distribution of 7D12-IR within the tumor. These results demonstrate that this anti-EGFR nanobody conjugated to the NIR fluorophore has excellent properties for rapid preclinical optical imaging, which holds promise for its future use as a complementary diagnostic tool in humans.

  14. Anti-deception: reliable EEG-based biometrics with real-time capability from the neural response of face rapid serial visual presentation.

    Science.gov (United States)

    Wu, Qunjian; Yan, Bin; Zeng, Ying; Zhang, Chi; Tong, Li

    2018-05-03

    The electroencephalogram (EEG) signal represents a subject's specific brain activity patterns and is considered an ideal biometric given its invisibility, non-clonability, and resistance to coercion. In order to enhance its applicability in identity authentication, a novel EEG-based identity authentication method is proposed based on self- or non-self-face rapid serial visual presentation. In contrast to previous studies that extracted EEG features from the resting state or motor imagery, the designed paradigm obtains a distinct and stable biometric trait at a lower time cost. Channel selection was applied to select specific channels for each user, to enhance system portability and improve discriminability between users and imposters. Two different imposter scenarios were designed to test system security and demonstrate the capability of anti-deception. Fifteen users and thirty imposters participated in the experiment. The mean authentication accuracy values for the two scenarios were 91.31% and 91.61%, with a 6 s time cost, which illustrates the precision and real-time capability of the system. Furthermore, in order to estimate the repeatability and stability of our paradigm, another data acquisition session was conducted for each user. Using the classification models generated from the previous sessions, a mean false rejection rate of 7.27% was achieved, which demonstrates the robustness of our paradigm. Experimental results reveal that the proposed paradigm and methods are effective for EEG-based identity authentication.
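
    The reported accuracy and false rejection rate are standard threshold-based verification metrics. A minimal sketch of how FRR and FAR are computed from classifier scores (the scores and the 0.5 threshold are fabricated for illustration; the paper's actual classifier and features are not reproduced):

```python
# False rejection rate (FRR) and false acceptance rate (FAR) from
# verification scores, as commonly used to evaluate biometric systems.
# The scores and the 0.5 threshold are fabricated for illustration.
def frr_far(genuine_scores, imposter_scores, threshold=0.5):
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    far = sum(s >= threshold for s in imposter_scores) / len(imposter_scores)
    return frr, far

genuine  = [0.91, 0.88, 0.42, 0.95, 0.77]   # true-user trial scores
imposter = [0.10, 0.55, 0.23, 0.08, 0.31]   # imposter trial scores
print(frr_far(genuine, imposter))  # → (0.2, 0.2)
```

    Raising the threshold trades a lower FAR (better anti-deception) for a higher FRR, which is why both rates are reported together.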

  15. Ionospheric earthquake precursors

    International Nuclear Information System (INIS)

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.

    1996-01-01

    Results of experimental studies of ionospheric earthquake precursors, the development of a program on processes in the earthquake focus, and the physical mechanisms of formation of various types of precursors are considered. The composition of an experimental space system for monitoring earthquake precursors is determined. 36 refs., 5 figs

  16. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training about earthquakes in primary schools is considered…

  17. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  18. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of the circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  19. Performance of an Optimized Paper-Based Test for Rapid Visual Measurement of Alanine Aminotransferase (ALT) in Fingerstick and Venipuncture Samples.

    Directory of Open Access Journals (Sweden)

    Sidhartha Jain

    A paper-based, multiplexed, microfluidic assay has been developed to visually measure alanine aminotransferase (ALT) in a fingerstick sample, generating rapid, semi-quantitative results. Prior studies indicated a need for improved accuracy; the device was subsequently optimized using an FDA-approved automated platform (Abaxis Piccolo Xpress) as a comparator. Here, we evaluated the performance of the optimized paper test for measurement of ALT in fingerstick blood and serum, as compared to the Abaxis and Roche/Hitachi platforms. To evaluate the feasibility of remote results interpretation, we also compared reading cell phone camera images of completed tests to reading the device in real time. 96 ambulatory patients with varied baseline ALT concentrations underwent fingerstick testing using the paper device; cell phone images of completed devices were taken and texted to a blinded off-site reader. Venipuncture serum was obtained from 93/96 participants for routine clinical testing (Roche/Hitachi); subsequently, 88/93 serum samples were captured and applied to the paper and Abaxis platforms. Paper test and reference standard results were compared by Bland-Altman analysis. For serum, there was excellent agreement between paper test and Abaxis results, with negligible bias (+4.5 U/L). Abaxis results were systematically 8.6% lower than Roche/Hitachi results. ALT values in fingerstick samples tested on paper were systematically lower than values in paired serum tested on paper (bias -23.6 U/L) or Abaxis (bias -18.4 U/L); a correction factor was developed for the paper device to match fingerstick blood to serum. Visual reads of cell phone images closely matched reads made in real time (bias +5.5 U/L). The paper ALT test is highly accurate for serum testing, matching the reference method against which it was optimized better than the reference methods matched each other. A systematic difference exists between ALT values in fingerstick and paired serum samples, and can be
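
    The bias and agreement figures above come from Bland-Altman analysis, which summarizes paired-method differences by their mean (the bias) and 95% limits of agreement. A minimal sketch of the computation (the paired ALT values below are fabricated for illustration, not data from the study):

```python
import statistics

# Minimal Bland-Altman computation: bias (mean difference) and 95% limits
# of agreement between two measurement methods. The paired ALT values
# below are fabricated for illustration, not data from the study.
def bland_altman(method_a, method_b):
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

paper  = [32, 45, 51, 88, 120, 64]   # hypothetical paper-test ALT, U/L
abaxis = [30, 41, 49, 80, 112, 60]   # hypothetical reference ALT, U/L
bias, (lo, hi) = bland_altman(paper, abaxis)
print(f"bias {bias:+.1f} U/L, LoA ({lo:.1f}, {hi:.1f})")  # bias ~ +4.7 U/L
```

    A bias near zero with narrow limits of agreement, as reported for the serum comparison, indicates the two methods can be used interchangeably.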

  20. The contribution of discrete-trial naming and visual recognition to rapid automatized naming deficits of dyslexic children with and without a history of language delay

    Directory of Open Access Journals (Sweden)

    Filippo Gasperini

    2014-09-01

    Children with Developmental Dyslexia (DD) are impaired in Rapid Automatized Naming (RAN) tasks, in which subjects are asked to name arrays of high-frequency items as quickly as possible. However, the reasons why RAN speed discriminates DD from typical readers are not yet fully understood. Our study aimed to identify some of the cognitive mechanisms underlying the RAN-reading relationship by comparing a group of 32 children with DD with an age-matched control group of typical readers on a naming task and a visual recognition task, both using a discrete-trial methodology, in addition to a serial RAN task, all using the same stimuli (digits and colors). Results showed a significant slowness of DD children in both serial and discrete-trial naming tasks regardless of the type of stimulus, but no difference between the two groups on the discrete-trial recognition task. Significant differences between DD and control participants in the RAN task disappeared when performance in the discrete-trial naming task was partialled out by covariance analysis for colors, but not for digits. The same pattern held in a subgroup of DD subjects with a history of early language delay (LD). By contrast, in a subsample of DD children without LD, the RAN deficit was specific for digits and disappeared after slowness in discrete-trial naming was partialled out. Slowness in discrete-trial naming was more evident for LD than for no-LD DD children. Overall, our results confirm previous evidence indicating a name-retrieval deficit as a cognitive impairment underlying RAN slowness in DD children. This deficit seems to be more marked in DD children with previous LD. Moreover, additional cognitive deficits specifically associated with serial RAN tasks have to be taken into account when explaining the deficient RAN speed of these latter children. We suggest that partially different cognitive dysfunctions underpin superficially similar RAN impairments in different subgroups of DD subjects.

  1. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    Science.gov (United States)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second most visited global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly became apparent that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them into improved earthquake information services to attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from the real-time analysis of the traffic on our website in the first minutes following an earthquake occurrence, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses at our website follow the propagation of the generated seismic waves and, therefore, that eyewitnesses can be considered ground motion sensors. Flashsourcing discriminates felt

  2. Design and Development of Low Cost, Simple, Rapid and Safe, Modified Field Kits for the Visual Detection and Determination of Arsenic in Drinking Water Samples

    Directory of Open Access Journals (Sweden)

    Y. Anjaneyulu

    2005-08-01

    (HgBr)2As (10-50 ppb), brown (HgBr)3As (50-100 ppb), or black Hg3As2 (>100 ppb) complexes are formed, which can be precisely estimated by visual comparison with a standard color chart. The results obtained with the field kits agree well with the data obtained through ICP-AES methods. The most important characteristic for field measurement is that analytical results can be obtained with high precision on the site where the sample is taken, and can be conveniently used for rapidly monitoring arsenic across a highly contaminated large geographical area.

  3. [Comparative analysis of the clinical characteristics of orthopedic inpatients in Lushan and Wenchuan earthquakes].

    Science.gov (United States)

    Shi, Xiao-Jun; Wang, Guang-Lin; Pei, Fu-Xing; Song, Yue-Ming; Yang, Tian-Fu; Tu, Chong-Qi; Huang, Fu-Guo; Liu, Hao; Lin, Wei

    2013-10-18

    To systematically analyze and compare the clinical characteristics of orthopedic inpatients in the Lushan and Wenchuan earthquakes, so as to provide useful references for future earthquake injury rescue. Based on the orthopedic inpatients in the Lushan and Wenchuan earthquakes, data on age, gender, injury causes, injured body parts, and speed of transport were classified and compared. Admissions continued over a long period and peaked late in the Wenchuan earthquake, the opposite of the pattern in the Lushan earthquake. There was no significant difference in patient age and gender between the two earthquakes. However, the occurrence rates of crush syndrome, amputation, gas gangrene, vascular injury, and multiple organ dysfunction syndrome (MODS) in the Wenchuan earthquake were much higher than those in the Lushan earthquake. Blunt trauma or crush-related injuries (79.6%) were the major injury cause in the Wenchuan earthquake, whereas in the Lushan earthquake falls from height and simple falls (56.8%) far exceeded blunt trauma or crush-related injuries (39.2%). The incidence rates of foot fractures, spine fractures, and multiple fractures in the Lushan earthquake were higher than those in the Wenchuan earthquake, but those of open fractures and lower limb fractures were lower. Rapid rescue at the scene is the cornerstone of successful treatment; early rescue and transport obviously reduce the incidence of wound infection, crush syndrome, MODS, and amputation. Popularization of correct knowledge of emergency sheltering will help to reduce the damage caused by blindly jumping or fleeing when an earthquake happens.

  4. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on the basic design processes important to both non-specialists and engineers, so that readers become suitably well-informed without needing to deal with the details of specialist understanding. The content of this encyclopedia informs technically inclined readers about the ways in which earthquakes can affect our infrastructure and how engineers go about designing against, mitigating, and remediating these effects. The coverage ranges from buildings, foundations, underground construction, lifelines and bridges to roads, embankments, and slopes. The encycl...

  5. Earthquake at 40 feet

    Science.gov (United States)

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of Richter magnitude 6.25 while at a depth of 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography, when the earthquake struck.

  6. Earthquakes and economic growth

    OpenAIRE

    Fisker, Peter Simonsen

    2012-01-01

    This study explores the economic consequences of earthquakes. In particular, it investigates how exposure to earthquakes affects economic growth both across and within countries. The key result of the empirical analysis is that while there are no observable effects at the country level, earthquake exposure significantly decreases 5-year economic growth at the local level. Areas at lower stages of economic development suffer more in terms of economic growth than richer areas. In addition,...

  7. Toward standardization of slow earthquake catalog -Development of database website-

    Science.gov (United States)

    Kano, M.; Aso, N.; Annoura, S.; Arai, R.; Ito, Y.; Kamaya, N.; Maury, J.; Nakamura, M.; Nishimura, T.; Obana, K.; Sugioka, H.; Takagi, R.; Takahashi, T.; Takeo, A.; Yamashita, Y.; Matsuzawa, T.; Ide, S.; Obara, K.

    2017-12-01

    Slow earthquakes have now been widely discovered around the world thanks to the recent development of geodetic and seismic observations. Many researchers detect a wide frequency range of slow earthquakes, including low-frequency tremors, low-frequency earthquakes, very-low-frequency earthquakes, and slow slip events, using various methods. Catalogs of the detected slow earthquakes are available in different formats from individual papers or websites (e.g., Wech 2010; Idehara et al. 2014). However, we need to download catalogs from different sources, deal with unformatted catalogs, and understand the characteristics of the different catalogs, which may be somewhat complex, especially for those who are not familiar with slow earthquakes. In order to standardize slow earthquake catalogs and make such complicated work easier, the Scientific Research on Innovative Areas project "Science of Slow Earthquakes" has been developing a slow earthquake catalog website. On the website, we can plot the locations of various slow earthquakes on Google Maps by compiling a variety of slow earthquake catalogs, including slow slip events. This enables us to clearly visualize the spatial relations among slow earthquakes at a glance and to compare the regional activities of slow earthquakes or the locations in different catalogs. In addition, we can download catalogs in a unified format and refer to the information on each catalog on a single website. Such standardization will make it more convenient for users to build on previous achievements and will promote research on slow earthquakes, eventually leading to collaborations with researchers in various fields and further understanding of the mechanisms, environmental conditions, and underlying physics of slow earthquakes. Furthermore, we expect the website to play a leading role in the international standardization of slow earthquake catalogs. We report an overview of the website and the progress of its construction. Acknowledgment: This

  8. Implications of fault constitutive properties for earthquake prediction.

    Science.gov (United States)

    Dieterich, J H; Kilgore, B

    1996-04-30

    The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip-rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, the scaling of Dc is presently an open question, and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from the sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs as a function of time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes, including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.
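
    The Omori aftershock decay law mentioned above is usually written in its modified form as n(t) = K/(c + t)^p, the aftershock rate at time t after the mainshock. A quick evaluation (K, c, and p below are generic illustrative values, not estimates from the paper):

```python
# Modified Omori law for aftershock rate decay: n(t) = K / (c + t)**p.
# K, c, and p are generic illustrative values, not estimates from the paper.
def omori_rate(t_days, K=100.0, c=0.1, p=1.0):
    """Aftershock rate (events/day) t_days after the mainshock."""
    return K / (c + t_days) ** p

for t in (1, 10, 100):
    print(t, round(omori_rate(t), 2))
# With p ~ 1 the rate falls roughly tenfold per decade of elapsed time.
```

    The physical interpretation assigned to c and p by the nucleation model is what links this empirical decay curve to the stress changes discussed in the abstract.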

  9. Improving PAGER's real-time earthquake casualty and loss estimation toolkit: a challenge

    Science.gov (United States)

    Jaiswal, K.S.; Wald, D.J.

    2012-01-01

    We describe the on-going developments of PAGER’s loss estimation models, and discuss value-added web content that can be generated related to exposure, damage and loss outputs for a variety of PAGER users. These developments include identifying vulnerable building types in any given area, estimating earthquake-induced damage and loss statistics by building type, and developing visualization aids that help locate areas of concern for improving post-earthquake response efforts. While detailed exposure and damage information is highly useful and desirable, significant improvements are still necessary in order to improve underlying building stock and vulnerability data at a global scale. Existing efforts with the GEM’s GED4GEM and GVC consortia will help achieve some of these objectives. This will benefit PAGER especially in regions where PAGER’s empirical model is less-well constrained; there, the semi-empirical and analytical models will provide robust estimates of damage and losses. Finally, we outline some of the challenges associated with rapid casualty and loss estimation that we experienced while responding to recent large earthquakes worldwide.

  10. Clustered and transient earthquake sequences in mid-continents

    Science.gov (United States)

    Liu, M.; Stein, S. A.; Wang, H.; Luo, G.

    2012-12-01

    Earthquakes result from sudden release of strain energy on faults. On plate boundary faults, strain energy accumulates constantly from steady and relatively rapid relative plate motion, so large earthquakes continue to occur as long as motion continues on the boundary. In contrast, such steady accumulation of strain energy does not occur on faults in mid-continents, because the far-field tectonic loading is not steadily distributed between faults, and because stress perturbations from complex fault interactions and other stress triggers can be significant relative to the slow tectonic stressing. Consequently, mid-continental earthquakes are often temporally clustered and transient, and spatially migrating. This behavior is well illustrated by large earthquakes in North China over the past two millennia, during which no large earthquake repeated on the same fault segment, but moment release between large fault systems was complementary. Slow tectonic loading in mid-continents also causes long aftershock sequences. We show that the recent small earthquakes in the Tangshan region of North China are aftershocks of the 1976 Tangshan earthquake (M 7.5), rather than indicators of a new phase of seismic activity in North China, as many fear. Understanding the transient behavior of mid-continental earthquakes has important implications for assessing earthquake hazards. The sequence of large earthquakes in the New Madrid Seismic Zone (NMSZ) in the central US, which includes a cluster of M~7 events in 1811-1812 and perhaps a few similar ones in the past millennium, is likely a transient process, releasing previously accumulated elastic strain on recently activated faults. If so, this earthquake sequence will eventually end. Using simple analysis and numerical modeling, we show that the large NMSZ earthquakes may be ending now or in the near future.

  11. Earthquakes and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  12. Bam Earthquake in Iran

    CERN Multimedia

    2004-01-01

    Following their request for help from members of international organisations, the Permanent Mission of the Islamic Republic of Iran has given the following bank account number, where you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  13. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  14. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes: not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding the nature of strong ground motion near the source of an earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times, field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities, and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that for future construction of NPPs, near-field strong motion must be explicitly considered in design

  15. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    Science.gov (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake, with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the building losses incurred in Istanbul after a large earthquake. The annualized earthquake losses in Istanbul are between 140-300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that would need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model losses would not be indemnified, but would be calculated directly on the basis of indexed ground motion levels and damage. 
The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  16. The plan to coordinate NEHRP post-earthquake investigations

    Science.gov (United States)

    Holzer, Thomas L.; Borcherdt, Roger D.; Comartin, Craig D.; Hanson, Robert D.; Scawthorn, Charles R.; Tierney, Kathleen; Youd, T. Leslie

    2003-01-01

    This is the plan to coordinate domestic and foreign post-earthquake investigations supported by the National Earthquake Hazards Reduction Program (NEHRP). The plan addresses coordination of both the NEHRP agencies—Federal Emergency Management Agency (FEMA), National Institute of Standards and Technology (NIST), National Science Foundation (NSF), and U. S. Geological Survey (USGS)—and their partners. The plan is a framework for both coordinating what is going to be done and identifying responsibilities for post-earthquake investigations. It does not specify what will be done. Coordination is addressed in various time frames ranging from hours to years after an earthquake. The plan includes measures for (1) gaining rapid and general agreement on high-priority research opportunities, and (2) conducting the data gathering and field studies in a coordinated manner. It deals with identification, collection, processing, documentation, archiving, and dissemination of the results of post-earthquake work in a timely manner and easily accessible format.

  17. Earthquake magnitude estimation using the τ c and P d method for earthquake early warning systems

    Science.gov (United States)

    Jin, Xing; Zhang, Hongcai; Li, Jun; Wei, Yongxiang; Ma, Qiang

    2013-10-01

    Earthquake early warning (EEW) systems are one of the most effective ways to reduce earthquake disasters. Earthquake magnitude estimation is one of the most important and also the most difficult parts of an EEW system. In this paper, based on 142 earthquake events and 253 seismic records recorded by KiK-net in Japan, and on aftershocks of the large Wenchuan earthquake in Sichuan, we obtained earthquake magnitude estimation relationships using the τ c and P d methods. The standard variances of the magnitude calculations from these two formulas are ±0.65 and ±0.56, respectively. The P d value can also be used to estimate the peak ground velocity, so that warning information can be released to the public rapidly according to the estimation results. In order to ensure the stability and reliability of the magnitude estimates, we propose a compatibility test based on the nature of these two parameters. The reliability of the early warning information is significantly improved through this test.
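    The τ c parameter referenced here is conventionally computed from the first few seconds of P-wave displacement u and velocity u̇ as τc = 2π / sqrt(∫u̇²dt / ∫u²dt), with P d the peak displacement in the same window; a minimal sketch under that assumption, using a synthetic record rather than KiK-net data:

    ```python
    import numpy as np

    def tau_c_and_pd(disp, dt):
        """tau_c (seconds) and Pd (peak displacement) from a P-wave window.
        disp: displacement samples; velocity is obtained by differentiation."""
        vel = np.gradient(disp, dt)
        r = np.sum(vel ** 2) / np.sum(disp ** 2)  # dt cancels in the ratio
        tau_c = 2.0 * np.pi / np.sqrt(r)
        pd = np.max(np.abs(disp))
        return tau_c, pd

    # Synthetic 1 Hz sinusoid over a 3 s window: tau_c should be close to 1 s.
    dt = 0.01
    t = np.arange(0.0, 3.0, dt)
    disp = 0.02 * np.sin(2.0 * np.pi * t)
    tau_c, pd = tau_c_and_pd(disp, dt)
    ```

    In a real system tau_c and Pd would each be mapped to magnitude through empirically fitted regression relationships like those derived in the paper.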

  18. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500.

  19. Earthquakes, September-October 1986

    Science.gov (United States)

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  20. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake

    Science.gov (United States)

    Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten

    2014-05-01

    In recent decades, great efforts have been expended in real-time seismology aiming at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on rapid estimates of P-wave magnitude, which generally carry large uncertainties and suffer from the known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. The following updates of the magnitude even decreased to M6.3-6.6. The magnitude estimate finally stabilized at M8.1 after about two minutes, which consequently led to underestimated tsunami heights. Using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test of the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that it would have been theoretically possible to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after, or even during, the rupture. In general, what happened on the fault could be robustly imaged with a time delay of about 30 s using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique can help reduce false and missing warnings, and therefore should play an important role in future tsunami early warning and earthquake rapid response systems.

  1. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed
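    The sensitivity to the lower-bound magnitude can be illustrated with a Gutenberg-Richter recurrence model, log10 N(≥m) = a − b·m; the a and b values below are generic assumptions for illustration, not parameters from the LLNL or EPRI studies:

    ```python
    def gr_annual_rate(m0, a=4.0, b=1.0):
        """Annual rate of earthquakes with magnitude >= m0 under
        Gutenberg-Richter recurrence, N(>=m0) = 10**(a - b*m0)."""
        return 10.0 ** (a - b * m0)

    # Lowering the bound from M5.0 (EPRI) to M3.75 (LLNL) multiplies the
    # number of events entering the hazard integral by 10**(b * 1.25).
    ratio = gr_annual_rate(3.75) / gr_annual_rate(5.0)
    ```

    With b = 1 the ratio is 10^1.25 ≈ 17.8, which is one way to see why the choice of lower-bound magnitude can have a larger-than-expected effect on the calculated hazard.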

  2. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some buildings still standing today that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study the only three centennial 3-story buildings that survived both earthquakes almost undamaged were identified. Since for the 1985 earthquake accelerograms were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand

  3. Sun, Moon and Earthquakes

    Science.gov (United States)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km × 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, EQ count 376); the right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All 376 events, including the main event, faithfully follow the straight-line curve.

  4. Listening to data from the 2011 magnitude 9.0 Tohoku-Oki, Japan, earthquake

    Science.gov (United States)

    Peng, Z.; Aiken, C.; Kilb, D. L.; Shelly, D. R.; Enescu, B.

    2011-12-01

    It is important for seismologists to effectively convey information about catastrophic earthquakes, such as the magnitude 9.0 earthquake in Tohoku-Oki, Japan, to a general audience who may not be well-versed in the language of earthquake seismology. Given recent technological advances, the previous approach of using "snapshot" static images to represent earthquake data is becoming obsolete, and the favored way to explain complex wave propagation inside the solid Earth and interactions among earthquakes is now visualizations that include auditory information. Here, we convert seismic data into visualizations that include sounds, the latter known as 'audification', or continuous 'sonification'. By combining seismic auditory and visual information, static "snapshots" of earthquake data come to life, allowing pitch and amplitude changes to be heard in sync with viewed frequency changes in the seismograms and associated spectrograms. In addition, these visual and auditory media allow the viewer to relate earthquake-generated seismic signals to familiar sounds such as thunder, popcorn popping, rattlesnakes, firecrackers, etc. We present a free software package that uses simple MATLAB tools and Apple Inc.'s QuickTime Pro to automatically convert seismic data into auditory movies. We focus on examples of seismic data from the 2011 Tohoku-Oki earthquake. These examples range from near-field strong motion recordings that demonstrate the complex source process of the mainshock and early aftershocks, to far-field broadband recordings that capture remotely triggered deep tremor and shallow earthquakes. We envision that audification of seismic data, which is geared toward a broad range of audiences, will be increasingly used to convey information about notable earthquakes and research frontiers in earthquake seismology (tremor, dynamic triggering, etc). 
Our overarching goal is that sharing our new visualization tool will foster an interest in seismology, not

  5. PERFORMANCE EVALUATION OF THE IRD RSUP DR. SARDJITO BUILDING TO THE INFLUENCE OF EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    T. S.M. Aritonang

    2012-02-01

    Full Text Available In performance-based design, the required performance level of hospital buildings is generally operational: the buildings are expected to continue to function after an earthquake and not to undergo significant damage. This research evaluates the performance level of the Emergency Care Installation building of Dr. Sardjito Hospital (hereinafter the IRD RSUP Dr. Sardjito building) under earthquake effects. A first evaluation is done by Rapid Visual Screening per FEMA 154 (2002), followed by a more detailed evaluation based on FEMA 310 (1998). The building structure is modeled in SAP2000 in two variants, i.e. a model with walls and a model without walls. The earthquake loads follow SNI 1726-2002. The strength of structural elements is calculated with SNI 2847-2002 and Response-2000. To evaluate structural performance levels, a pushover analysis is used for the nonlinear procedure, applied to seismicity regions 3 and 4. The performance point is determined by the Capacity Spectrum Method of ATC-40 (1996), which is built into the SAP2000 program. The performance level of the building is determined by the drift-ratio criteria of FEMA 356 (2000) and ATC-40 (1996). The results show a natural period of 0.592 s (frequency 1.687 Hz) for the model with walls and 1.291 s (frequency 0.774 Hz) for the model without walls. For an earthquake return period of 500 years, the structural performance level is immediate occupancy.
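    The final classification step can be sketched as a mapping from the drift ratio at the performance point to a performance level; the limit values below are the indicative FEMA 356 transient drift limits for concrete frames (about 1%, 2% and 4%) and are used here as illustrative assumptions, not as the exact criteria applied in the paper:

    ```python
    def performance_level(drift_ratio):
        """Map a roof drift ratio at the performance point to a FEMA 356-style
        performance level (illustrative limit values for concrete frames)."""
        if drift_ratio <= 0.01:
            return "Immediate Occupancy"
        if drift_ratio <= 0.02:
            return "Life Safety"
        if drift_ratio <= 0.04:
            return "Collapse Prevention"
        return "Collapse"

    # A drift ratio below 1% at the performance point would classify the
    # building as Immediate Occupancy, consistent with the result above.
    level = performance_level(0.006)
    ```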

  6. Prospective testing of Coulomb short-term earthquake forecasts

    Science.gov (United States)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily and allows daily updates of the models. However, a lot can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough, and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be produced by computer without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of

  7. Earthquake Ground Motion Selection

    Science.gov (United States)

    2012-05-01

    Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

  8. 1988 Spitak Earthquake Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 1988 Spitak Earthquake database is an extensive collection of geophysical and geological data, maps, charts, images and descriptive text pertaining to the...

  9. Keeping focus on earthquakes at school for seismic risk mitigation of the next generations

    Science.gov (United States)

    Saraò, Angela; Barnaba, Carla; Peruzza, Laura

    2013-04-01

    The knowledge of the seismic history of one's own territory, the understanding of the physical phenomena behind an earthquake, the changes to cultural heritage following a strong earthquake, and the learning of actions to be taken during and after an earthquake are all pieces of information that help keep the focus on seismic hazard and support strategies for seismic risk mitigation. The training of new generations, today more than ever prone to rapidly forgetting past events, therefore becomes a key element in increasing the awareness that earthquakes have happened and can happen at any time, and that mitigation actions are the only means to ensure safety and to reduce damage and human losses. For several years our institute (OGS) has been involved in activities to raise awareness of earthquake education. We aim to implement education programs with the goal of fostering a critical approach to seismic hazard reduction, differentiating the types of activities according to the age of the students. However, since this kind of activity is unfunded, we can currently reach only a very limited number of schools per year. To be effective, the inclusion of seismic risk issues in school curricula requires dedicated time and appropriate approaches when planning activities. For this reason, we also involve the teachers as proponents of activities and encourage them to keep memories and discussion of earthquakes alive in their classes. During the past years we acted mainly in the schools of the Friuli Venezia Giulia area (NE Italy), an earthquake-prone area struck in 1976 by a destructive seismic event (Ms=6.5). We organized short training courses for teachers, lectured to classes, and led laboratory activities with students. 
Indeed, being well known that students enjoy classes more when visual and active learning are joined, we propose a program that is composed by seminars, demonstrations and hands-on activities in the classrooms; for high school students

  10. Electromagnetic Manifestation of Earthquakes

    OpenAIRE

    Uvarov Vladimir

    2017-01-01

    In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.

  11. Electromagnetic Manifestation of Earthquakes

    Directory of Open Access Journals (Sweden)

    Uvarov Vladimir

    2017-01-01

    Full Text Available In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.

  12. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    As it is the 200th anniversary of Darwin's birth, 2009 has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked; volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after it. These effects have sometimes been repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and dynamics. These ideas set the tone for the plate tectonic theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  13. A Comparison of ectoparasite infestation by chigger mite larvae (Acarina: Trombiculidae) on resident and migratory birds in Chiapas, Mexico illustrating a rapid visual assessment protocol

    Science.gov (United States)

    Thomas V. Dietsch

    2005-01-01

    This study presents a protocol developed to rapidly assess ectoparasite prevalence and intensity. Using this protocol during a mist-netting project in two different coffee agroecosystems in Chiapas, Mexico, data were collected on ectoparasitic chigger mite larvae (Acarina: Trombiculidae) found on resident and migratory birds. Surprisingly high infestation rates were...

  14. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or of market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) we use only earthquake catalog data, which generally has known errors and characteristics; and 3) we use area-based rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around it. The statistics of the large region are then applied to the small region. As an application, we can define a small region around a major global city, for example a "small" circle of radius 150 km and a depth of 100 km, together with a "large" earthquake magnitude, for example M6.0; the region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is why these values were selected. We can then compute and rank the world's major cities in terms of their relative seismic risk
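    The nowcast described above reduces to asking where the current count of small earthquakes since the last large event falls within the large-region distribution of such counts; a minimal sketch of that percentile idea, with synthetic counts standing in for catalog data:

    ```python
    import numpy as np

    def earthquake_potential_score(past_counts, current_count):
        """Fraction of historical small-earthquake counts between successive
        large events that fall below the current count: values near 0 suggest
        early in the cycle, values near 1 suggest late in the cycle."""
        return float(np.mean(np.asarray(past_counts) < current_count))

    # Hypothetical counts of small events between successive large events in
    # the surrounding large region (illustrative numbers only).
    history = [120, 340, 95, 410, 260, 180, 300, 220, 150, 275]
    eps = earthquake_potential_score(history, 310)  # exceeds 8 of 10 counts
    ```

    Ranking cities by this score is what the abstract means by applying the statistics of the large region to each small region around a city.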

  15. Earthquake Early Warning: User Education and Designing Effective Messages

    Science.gov (United States)

    Burkett, E. R.; Sellnow, D. D.; Jones, L.; Sellnow, T. L.

    2014-12-01

    The U.S. Geological Survey (USGS) and partners are transitioning from test-user trials of a demonstration earthquake early warning system (ShakeAlert) to deciding how to prepare for and implement the release of earthquake early warning information, alert messages, and products to the public and other stakeholders. An earthquake early warning system uses seismic station networks to rapidly gather information about an occurring earthquake and send notifications to user devices ahead of the arrival of potentially damaging ground shaking at their locations. Earthquake early warning alerts can thereby allow time for actions to protect lives and property before arrival of damaging shaking, if users are properly educated on how to use and react to such notifications. A collaboration team of risk communications researchers and earth scientists is researching the effectiveness of a chosen subset of potential earthquake early warning interface designs and messages, which could be displayed on a device such as a smartphone. Preliminary results indicate, for instance, that users prefer alerts that include 1) a map to relate their location to the earthquake and 2) instructions for what to do in response to the expected level of shaking. A number of important factors must be considered to design a message that will promote appropriate self-protective behavior. While users prefer to see a map, how much information can be processed in limited time? Are graphical representations of wavefronts helpful or confusing? The most important factor in promoting a helpful response is the predicted earthquake intensity, or how strong the expected shaking will be at the user's location.
Unlike Japanese users of early warning, few Californians are familiar with the earthquake intensity scale, so we are exploring how differentiating instructions between intensity levels (e.g., "Be aware" for lower shaking levels and "Drop, cover, hold on" at high levels) can be paired with self-directed supplemental

  16. GEM - The Global Earthquake Model

    Science.gov (United States)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments at the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public-at-large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  17. Detection of carbapenemase activity in Enterobacteriaceae using LC-MS/MS in comparison with the neo-rapid CARB kit using direct visual assessment and colorimetry.

    Science.gov (United States)

    Huber, Charlotte A; Sidjabat, Hanna E; Zowawi, Hosam M; Kvaskoff, David; Reed, Sarah; McNamara, John F; McCarthy, Kate L; Harris, Patrick; Toh, Benjamin; Wailan, Alexander M; Paterson, David L

    2016-12-01

    The sensitivity of the Carba NP test has been reported to be low in the case of OXA-48-like carbapenemases, and mass spectrometry-based methods as well as a colorimetry-based method have been described as alternatives. We evaluated 84 Enterobacteriaceae isolates including 31 OXA-48-like producing isolates and 13 isolates that produced either an imipenemase (IMP; n=8), New Delhi metallo-β-lactamase (NDM; n=3), or Klebsiella pneumoniae carbapenemase (KPC; n=2), as well as 40 carbapenemase-negative Enterobacteriaceae isolates. We used the Neo-Rapid CARB kit, assessing the results with the unaided eye, and compared this with a colorimetric approach. Furthermore, we incubated the isolates in growth media with meropenem and measured the remaining meropenem after 1 and 2 h of incubation, respectively, using liquid chromatography tandem mass spectrometry (LC-MS/MS). Whilst all carbapenemase-producing isolates except the OXA-244 producer tested positive in the Neo-Rapid CARB test by both the unaided eye and colorimetry, and the 13 isolates producing either IMP, NDM or KPC hydrolysed the meropenem in the media almost completely after 2 h of incubation, the 31 OXA-48-like producing isolates exhibited very variable hydrolytic activity when incubated in growth media with meropenem. In our study, the Neo-Rapid CARB test yielded a sensitivity of 98% for both the traditional and the colorimetric approach, with specificities of 95% and 100%, respectively. Our results indicate that the Neo-Rapid CARB test may be useful for the detection of OXA-48-type carbapenemases and that it may be particularly important to ensure bacterial lysis for the detection of these weaker hydrolysers. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Quantification of social contributions to earthquake mortality

    Science.gov (United States)

    Main, I. G.; NicBhloscaidh, M.; McCloskey, J.; Pelling, M.; Naylor, M.

    2013-12-01

    Death tolls in earthquakes, which continue to grow rapidly, are the result of complex interactions between physical effects, such as strong shaking, and the resilience of exposed populations and supporting critical infrastructures and institutions. While it is clear that the social context in which the earthquake occurs has a strong effect on the outcome, the influence of this context can only be exposed if we first decouple, as much as we can, the physical causes of mortality from our consideration. (Our modelling assumes that building resilience to shaking is a social factor governed by national wealth, legislation and enforcement and governance leading to reduced levels of corruption.) Here we attempt to remove these causes by statistically modelling published mortality, shaking intensity and population exposure data; unexplained variance from this physical model illuminates the contribution of socio-economic factors to increasing earthquake mortality. We find that this variance partitions countries in terms of basic socio-economic measures and allows the definition of a national vulnerability index identifying both anomalously resilient and anomalously vulnerable countries. In many cases resilience is well correlated with GDP; people in the richest countries are unsurprisingly safe from even the worst shaking. However some low-GDP countries rival even the richest in resilience, showing that relatively low cost interventions can have a positive impact on earthquake resilience and that social learning between these countries might facilitate resilience building in the absence of expensive engineering interventions.
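
    The decoupling step described above can be illustrated with a toy regression: fit a simple physical model of log mortality against exposure, then read each country's mean residual as a socio-economic vulnerability index. The single predictor, the data, and the function name below are illustrative assumptions, not the authors' model.

```python
# Toy version of the residual-based vulnerability index: regress
# log10(deaths) on log10(exposed population) across events, then take
# each country's mean residual. Positive residual = anomalously
# vulnerable; negative = anomalously resilient. Synthetic data only.
import math

def vulnerability_index(events):
    """events: list of (country, exposed_population, deaths) tuples."""
    xs = [math.log10(e[1]) for e in events]
    ys = [math.log10(max(e[2], 1)) for e in events]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    # ordinary least-squares slope and intercept of the "physical" model
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)
    a = ybar - b * xbar
    resid = {}
    for (country, _, _), x, y in zip(events, xs, ys):
        resid.setdefault(country, []).append(y - (a + b * x))
    return {c: sum(r) / len(r) for c, r in resid.items()}
```

In the real study the physical model also includes shaking intensity; the point here is only that the unexplained variance partitions countries.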

  19. New characteristics of intensity assessment of Sichuan Lushan "4.20" Ms7.0 earthquake

    Science.gov (United States)

    Sun, Baitao; Yan, Peilei; Chen, Xiangzhao

    2014-08-01

    Rapid and accurate post-earthquake assessment of the macroscopic influence of seismic ground motion is significant for earthquake emergency relief, post-earthquake reconstruction and scientific research. The seismic intensity distribution map released by the Lushan earthquake field team of the China Earthquake Administration (CEA) five days after the strong earthquake (M7.0) that occurred in Lushan County, Ya'an City, Sichuan Province at 8:02 on April 20, 2013 provides a scientific basis for emergency relief, economic loss assessment and post-earthquake reconstruction. In this paper, the means of blind estimation of macroscopic intensity, field estimation of macroscopic intensity, and review of intensity, as well as the corresponding problems, are discussed in detail, and the intensity distribution characteristics of the Lushan "4.20" M7.0 earthquake and its influential factors are analyzed, providing a reference for future seismic intensity assessments.

  20. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much previous investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely, of the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would therefore be indispensable.

  1. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is (1) the study of earthquakes; (2) the origin, propagation, and energy of seismic phenomena; (3) the prediction of these phenomena; and (4) the investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  2. Assigning probability gain for precursors of four large Chinese earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, T.; Aki, K.

    1983-03-10

    We extend the concept of probability gain associated with a precursor (Aki, 1981) to a set of precursors which may be mutually dependent. Making use of a new formula, we derive a criterion for selecting precursors from a given data set in order to calculate the probability gain. The probabilities per unit time immediately before four large Chinese earthquakes are calculated. They are approximately 0.09, 0.09, 0.07 and 0.08 per day for the 1975 Haicheng (M = 7.3), 1976 Tangshan (M = 7.8), 1976 Longling (M = 7.6), and 1976 Songpan (M = 7.2) earthquakes, respectively. These results are encouraging because they suggest that the investigated precursory phenomena may have included the complete information for earthquake prediction, at least for the above earthquakes. With this method, the step-by-step approach to prediction used in China may be quantified in terms of the probability of earthquake occurrence. The ln P versus t curve (where P is the probability of earthquake occurrence at time t) shows that ln P does not increase linearly with t but more rapidly as the time of the earthquake approaches.
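
    For intuition, the probability-gain idea in its simplest, independent-precursor form multiplies the background occurrence rate by the gain of each observed precursor; Cao and Aki's new formula for mutually dependent precursors is not reproduced here. The numbers below are illustrative, not taken from the paper.

```python
# Independent-precursor approximation of the probability-gain concept
# (Aki, 1981): conditional rate = background rate x product of the
# gains of the currently observed precursors. Illustrative values only.

def conditional_rate(background_rate, gains):
    """Probability of occurrence per unit time given observed precursors."""
    rate = background_rate
    for g in gains:
        rate *= g
    return rate

# e.g. background 1e-4/day; foreshock activity (gain 100) plus a
# hypothetical radon anomaly (gain 8) gives 0.08/day, the order of
# magnitude quoted in the abstract.
print(conditional_rate(1e-4, [100, 8]))
```

The dependent-precursor case replaces the simple product with a joint likelihood ratio, which is the extension the paper addresses.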

  3. Vision first? The development of primary visual cortical networks is more rapid than the development of primary motor networks in humans.

    Directory of Open Access Journals (Sweden)

    Patricia Gervan

    The development of cortical functions and the capacity of the mature brain to learn are largely determined by the establishment and maintenance of neocortical networks. Here we address the human development of long-range connectivity in primary visual and motor cortices, using well-established behavioral measures--a Contour Integration test and a Finger-tapping task--that have been shown to be related to these specific primary areas and the long-range neural connectivity within them. Possible confounding factors, such as different task requirements (complexity, cognitive load), are eliminated by using these tasks in a learning paradigm. We find that there is a temporal lag between the developmental timing of primary sensory vs. motor areas, with an advantage for visual development; we also confirm that human development is very slow in both cases, and that there is a retained capacity for practice-induced plastic changes in adults. This pattern of results seems to point to human-specific development of the "canonical circuits" of primary sensory and motor cortices, probably reflecting the ecological requirements of human life.

  4. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  5. Earthquakes; May-June 1982

    Science.gov (United States)

    Person, W.J.

    1982-01-01

    There were four major earthquakes (7.0-7.9) during this reporting period: two struck in Mexico, one in El Salvador, and one in the Kuril Islands. Mexico, El Salvador, and China experienced fatalities from earthquakes.

  6. Turkish Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  7. Earthquakes, May-June 1991

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    One major earthquake occurred during this reporting period: a magnitude 7.1 in Indonesia (Minahassa Peninsula) on June 20. Earthquake-related deaths were reported in the Western Caucasus (Georgia, USSR) on May 3 and June 15. One earthquake-related death was also reported in El Salvador on June 21.

  8. Organizational changes at Earthquakes & Volcanoes

    Science.gov (United States)

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  9. The 1976 Tangshan earthquake

    Science.gov (United States)

    Fang, Wang

    1979-01-01

    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m., Beijing (Peking) local time, and had magnitude 7.8, a focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt.

  10. [Earthquakes in El Salvador].

    Science.gov (United States)

    de Ville de Goyet, C

    2001-02-01

    The Pan American Health Organization (PAHO) has 25 years of experience dealing with major natural disasters. This piece provides a preliminary review of the events taking place in the weeks following the major earthquakes in El Salvador on 13 January and 13 February 2001. It also describes the lessons that have been learned over the last 25 years and the impact that the El Salvador earthquakes and other disasters have had on the health of the affected populations. Topics covered include mass-casualties management, communicable diseases, water supply, managing donations and international assistance, damages to the health-facilities infrastructure, mental health, and PAHO's role in disasters.

  11. Population-based assessment of prevalence and causes of visual impairment in the state of Telangana, India: a cross-sectional study using the Rapid Assessment of Visual Impairment (RAVI) methodology.

    Science.gov (United States)

    Marmamula, Srinivas; Khanna, Rohit C; Kunkunu, Eswararao; Rao, Gullapalli N

    2016-12-15

    To assess the prevalence and causes of visual impairment (VI) among a rural population aged 40 years and older in the state of Telangana in India. Population-based cross-sectional study. Districts of Adilabad and Mahbubnagar in the south Indian state of Telangana, India. A sample of 6150 people was selected using cluster random sampling methodology. A team comprising a trained vision technician and a field worker visited the households and conducted the eye examination. Presenting, pinhole and aided visual acuity were assessed. The anterior segment was examined using a torchlight. The lens was examined using distant direct ophthalmoscopy in a semidark room. In all, 5881 (95.6%) participants were examined from 123 study clusters. Among those examined, 2723 (46.3%) were men, 4824 (82%) had no education, 2974 (50.6%) were from Adilabad district and 1694 (28.8%) were using spectacles at the time of the eye examination. VI was defined as presenting visual acuity <6/18 in the better eye and included moderate VI (<6/18 to 6/60) and blindness (<6/60). The age-adjusted and gender-adjusted prevalence of VI was 15.0% (95% CI 14.1% to 15.9%). On applying binary logistic regression analysis, VI was associated with older age groups. The odds of having VI were higher among women (OR 1.2; 95% CI 1.0 to 1.4). Having any education (OR 0.4; 95% CI 0.3 to 0.6) and current use of glasses (OR 0.19; 95% CI 0.1 to 0.2) were protective. VI was also higher in Mahbubnagar district (OR 1.0 to 1.5). Cataract (54.7%) was the leading cause of VI, followed by uncorrected refractive errors (38.6%). VI continues to remain a challenge in rural Telangana. As over 90% of VI is avoidable, massive eye care programmes are required to address the burden of VI in Telangana. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  12. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing the earthquake culture. Iran was considered as a research case study and fifteen large earthquake disasters in Iran were investigated and analyzed over more than a century-time period. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  13. An interdisciplinary approach for earthquake modelling and forecasting

    Science.gov (United States)

    Han, P.; Zhuang, J.; Hattori, K.; Ogata, Y.

    2016-12-01

    Earthquakes are among the most serious disasters and may cause heavy casualties and economic losses. In the past two decades especially, huge/mega earthquakes have hit many countries. Effective earthquake forecasting (including time, location, and magnitude) has become extremely important and urgent. To date, various heuristically derived algorithms have been developed for forecasting earthquakes. Generally, they can be classified into two types: catalog-based approaches and non-catalog-based approaches. Thanks to the rapid development of statistical seismology in the past 30 years, we are now able to evaluate the performance of these earthquake forecast approaches quantitatively. Although a certain amount of precursory information is available in both earthquake catalogs and non-catalog observations, earthquake forecasting is still far from satisfactory. In most cases, the precursory phenomena have been studied individually. An earthquake model that combines self-exciting and mutually exciting elements was developed by Ogata and Utsu from the Hawkes process. The core idea of this combined model is that the status of the event at present is controlled by the event itself (self-exciting) and all the external factors (mutually exciting) in the past. In essence, the conditional intensity function is a time-varying Poisson process with rate λ(t), which is composed of the background rate, the self-exciting term (the information from past seismic events), and the external excitation term (the information from past non-seismic observations). This model shows us a way to integrate catalog-based and non-catalog-based forecasts. Against this background, we are trying to develop a new earthquake forecast model which combines catalog-based and non-catalog-based approaches.
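
    A minimal sketch of the conditional intensity described above, assuming an Omori-type kernel for the self-exciting term and an exponential kernel for the external excitation; the kernel forms and parameter values are illustrative assumptions, not those of Ogata and Utsu.

```python
# Sketch of a combined self-/mutually exciting conditional intensity:
# lambda(t) = background rate
#           + sum over past earthquakes of an Omori-type kernel
#           + sum over past non-seismic anomalies of an exponential kernel
import math

def intensity(t, mu, quakes, anomalies,
              K=0.5, c=0.01, p=1.1,   # Omori parameters (self-exciting term)
              a=0.2, tau=5.0):        # external-excitation parameters
    lam = mu
    for ti in quakes:                 # times of past seismic events
        if ti < t:
            lam += K / (t - ti + c) ** p
    for tj in anomalies:              # times of past non-seismic observations
        if tj < t:
            lam += a * math.exp(-(t - tj) / tau)
    return lam
```

With no past events the rate reduces to the background mu; each nearby earthquake or anomaly transiently raises it, which is the integration of catalog-based and non-catalog-based information the abstract describes.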

  14. What is the earthquake fracture energy?

    Science.gov (United States)

    Di Toro, G.; Nielsen, S. B.; Passelegue, F. X.; Spagnuolo, E.; Bistacchi, A.; Fondriest, M.; Murphy, S.; Aretusini, S.; Demurtas, M.

    2016-12-01

    The energy budget of an earthquake is one of the main open questions in earthquake physics. During seismic rupture propagation, the elastic strain energy stored in the rock volume that bounds the fault is converted into (1) gravitational work (relative movement of the wall rocks bounding the fault), (2) in- and off-fault damage of the fault zone rocks (due to rupture propagation and frictional sliding), (3) frictional heating and, of course, (4) seismic radiated energy. The difficulty in determining the budget arises from the measurement of some parameters (e.g., the temperature increase in the slipping zone, which constrains the frictional heat), from the poorly constrained size of the energy sinks (e.g., how large is the rock volume involved in off-fault damage?) and from the continuous exchange of energy between different sinks (for instance, fragmentation and grain size reduction may result from both the passage of the rupture front and frictional heating). Field geology studies, microstructural investigations, experiments and modelling may yield some hints. Here we discuss (1) the discrepancies arising from the comparison of the fracture energy measured in experiments reproducing seismic slip with the one estimated from seismic inversion for natural earthquakes and (2) the off-fault damage induced by the diffusion of frictional heat during simulated seismic slip in the laboratory. Our analysis suggests, for instance, that the so-called earthquake fracture energy (1) is mainly frictional heat for small slips and (2), with increasing slip, is controlled by the geometrical complexity and other plastic processes occurring in the damage zone. As a consequence, because faults are rapidly and efficiently lubricated upon fast slip initiation, the dominant dissipation mechanism in large earthquakes may not be friction but the off-fault damage due to fault segmentation and stress concentrations in a growing region around the fracture tip.

  15. The mechanism of earthquake

    Science.gov (United States)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    The physical mechanism of earthquakes remains a challenging issue to be clarified. Seismologists used to attribute shallow earthquakes to the elastic rebound of crustal rocks. The seismic energy calculated following the elastic rebound theory and with the data of experimental results on rocks, however, shows a large discrepancy with measurement — a fact that has been dubbed "the heat flow paradox". For intermediate-focus and deep-focus earthquakes, both occurring in the region of the mantle, there is no reasonable explanation either. This paper will discuss the physical mechanism of earthquakes from a new perspective, starting from the fact that both the crust and the mantle are discrete collective systems of matter with slow dynamics, as well as from the basic principles of physics, especially some new concepts of condensed matter physics that have emerged in recent years. (1) Stress distribution in the earth's crust: Without taking the tectonic force into account, according to the rheological principle that "everything flows", the normal stress and transverse stress must be balanced due to the effect of gravitational pressure over a long period of time; thus no differential stress in the original crustal rocks is to be expected. The tectonic force is successively transferred and accumulated via stick-slip motions of rock blocks to squeeze the fault gouge and is then exerted upon other rock blocks. The superposition of such additional lateral tectonic force and the original stress gives rise to the real-time stress in crustal rocks. The mechanical characteristics of fault gouge differ from those of rocks, as it consists of granular matter. The elastic moduli of fault gouges are much smaller than those of rocks, and they become larger with increasing pressure. This peculiarity of the fault gouge leads to a tectonic force that increases with depth in a nonlinear fashion. The distribution and variation of the tectonic stress in the crust are specified. (2) The

  16. Human casualties in earthquakes: Modelling and mitigation

    Science.gov (United States)

    Spence, R.J.S.; So, E.K.M.

    2011-01-01

    Earthquake risk modelling is needed for the planning of post-event emergency operations, for the development of insurance schemes, for the planning of mitigation measures in the existing building stock, and for the development of appropriate building regulations; in all of these applications estimates of casualty numbers are essential. But there are many questions about casualty estimation which are still poorly understood. These questions relate to the causes and nature of the injuries and deaths, and the extent to which they can be quantified. This paper looks at the evidence on these questions from recent studies. It then reviews casualty estimation models available, and finally compares the performance of some casualty models in making rapid post-event casualty estimates in recent earthquakes.

  17. Novel Polymerase Spiral Reaction (PSR) for rapid visual detection of Bovine Herpesvirus 1 genomic DNA from aborted bovine fetus and semen.

    Science.gov (United States)

    Malla, Javed Ahmed; Chakravarti, Soumendu; Gupta, Vikas; Chander, Vishal; Sharma, Gaurav Kumar; Qureshi, Salauddin; Mishra, Adhiraj; Gupta, Vivek Kumar; Nandi, Sukdeb

    2018-02-20

    Bovine herpesvirus-1 (BHV-1) is a major viral pathogen affecting bovines, leading to various clinical manifestations, and causes a significant economic impediment in modern livestock production systems. Rapid, accurate and sensitive detection of BHV-1 infection at frozen semen stations or at dairy herds remains a priority for controlling the spread of BHV-1 to susceptible populations. Polymerase Spiral Reaction (PSR), a novel addition to the gamut of isothermal techniques, has been successfully implemented in initial optimization for detection of BHV-1 genomic DNA and further validated in clinical samples. The developed PSR assay has been validated for detection of BHV-1 from bovine semen (n=99), a major source of transmission of BHV-1 from breeding bulls to susceptible dams in artificial insemination programs. The technique has also been used for screening of BHV-1 DNA from suspected aborted fetal tissues (n=25). The developed PSR technique is 100-fold more sensitive than conventional PCR and comparable to real-time PCR. The PSR technique detected 13 samples positive for BHV-1 DNA in bovine semen, 4 more than conventional PCR. The aborted fetal tissues were negative for the presence of BHV-1 DNA. The presence of BHV-1 in bovine semen samples raises a pertinent concern for extensive screening of semen from breeding bulls before it is used in the artificial insemination process. PSR has all the attributes to become a method of choice for rapid, accurate and sensitive detection of BHV-1 DNA at frozen semen stations or at dairy herds in resource-constrained settings. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. One simple DNA extraction device and its combination with modified visual loop-mediated isothermal amplification for rapid on-field detection of genetically modified organisms.

    Science.gov (United States)

    Zhang, Miao; Liu, Yinan; Chen, Lili; Quan, Sheng; Jiang, Shimeng; Zhang, Dabing; Yang, Litao

    2013-01-02

    Quickness, simplicity, and effectiveness are the three major criteria for establishing a good molecular diagnosis method in many fields. Herein we report a novel detection system for genetically modified organisms (GMOs), which can be utilized to perform both on-field quick screening and routine laboratory diagnosis. In this system, a newly designed inexpensive DNA extraction device was used in combination with a modified visual loop-mediated isothermal amplification (vLAMP) assay. The main parts of the DNA extraction device included a silica gel membrane filtration column and a modified syringe. The DNA extraction device could be easily operated without using other laboratory instruments, making it applicable to an on-field GMO test. High-quality genomic DNA (gDNA) suitable for polymerase chain reaction (PCR) and isothermal amplification could be quickly isolated from plant tissues using this device within 15 min. In the modified vLAMP assay, a microcrystalline wax encapsulated detection bead containing SYBR green fluorescent dye was introduced to avoid dye inhibition and cross-contaminations from post-LAMP operation. The system was successfully applied and validated in screening and identification of GM rice, soybean, and maize samples collected from both field testing and the Grain Inspection, Packers, and Stockyards Administration (GIPSA) proficiency test program, which demonstrated that it was well-adapted to both on-field testing and/or routine laboratory analysis of GMOs.

  19. Earthquake simulations with time-dependent nucleation and long-range interactions

    Directory of Open Access Journals (Sweden)

    J. H. Dieterich

    1995-01-01

    A model for rapid simulation of earthquake sequences is introduced which incorporates long-range elastic interactions among fault elements and time-dependent earthquake nucleation inferred from experimentally derived rate- and state-dependent fault constitutive properties. The model consists of a planar two-dimensional fault surface which is periodic in both the x- and y-directions. Elastic interactions among fault elements are represented by an array of elastic dislocations. Approximate solutions for earthquake nucleation and dynamics of earthquake slip are introduced which permit computations to proceed in steps that are determined by the transitions from one sliding state to the next. The transition-driven time stepping and avoidance of systems of simultaneous equations permit rapid simulation of large sequences of earthquake events on computers of modest capacity, while preserving characteristics of the nucleation and rupture propagation processes evident in more detailed models. Earthquakes simulated with this model reproduce many of the observed spatial and temporal characteristics of clustering phenomena including foreshock and aftershock sequences. Clustering arises because the time dependence of the nucleation process is highly sensitive to stress perturbations caused by nearby earthquakes. The rate of earthquake activity following a prior earthquake decays according to Omori's aftershock decay law and falls off with distance.
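
    The transition-driven time stepping can be sketched with a toy state model: advance the clock directly to the earliest predicted transition among all fault elements instead of using fixed time steps. The constant-loading failure model below is a placeholder, not Dieterich's rate- and state-dependent nucleation law.

```python
# Toy illustration of transition-driven (event-driven) time stepping:
# each element loads toward a failure threshold at a constant rate;
# the simulation jumps to the earliest predicted failure, records the
# event, and resets that element's stress (a complete stress drop).

def run(elements, t_end):
    """elements: list of dicts with 'stress', 'rate', 'threshold' keys."""
    t, events = 0.0, []
    while t < t_end:
        # time each element needs to reach its failure threshold
        waits = [(e['threshold'] - e['stress']) / e['rate'] for e in elements]
        dt = min(waits)
        if t + dt > t_end:
            break
        t += dt
        for e in elements:              # load everyone up to the transition
            e['stress'] += e['rate'] * dt
        i = waits.index(dt)             # the element that fails first
        events.append((t, i))
        elements[i]['stress'] = 0.0     # stress drop at failure
    return events
```

The step size here is set by the system's own state transitions, which is the property that lets Dieterich's model avoid fixed time steps and simultaneous-equation solves.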

  20. The EM Earthquake Precursor

    Science.gov (United States)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta Earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, whether grounded or buried, are completely subject to static and electric fields and have yet to correlate with an identifiable precursor. Secondly, there is neither a networked array for finding epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances can achieve. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February 2013 detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  1. Guidelines for nuclear plant response to an earthquake

    International Nuclear Information System (INIS)

    1989-12-01

    Guidelines have been developed to assist nuclear plant personnel in the preparation of earthquake response procedures for nuclear power plants. The objectives of the earthquake response procedures are to determine (1) the immediate effects of an earthquake on the physical condition of the nuclear power plant, (2) if shutdown of the plant is appropriate based on the observed damage to the plant or because the operating basis earthquake (OBE) has been exceeded, and (3) the readiness of the plant to resume operation following shutdown due to an earthquake. Readiness of a nuclear power plant to restart is determined on the basis of visual inspections of nuclear plant equipment and structures, and the successful completion of surveillance tests which demonstrate that the limiting conditions for operation as defined in the plant Technical Specifications are met. The guidelines are based on information obtained from a review of earthquake response procedures from numerous US and foreign nuclear power plants, interviews with nuclear plant operations personnel, and a review of reports of damage to industrial equipment and structures in actual earthquakes. 7 refs., 4 figs., 4 tabs

  2. Do I Really Sound Like That? Communicating Earthquake Science Following Significant Earthquakes at the NEIC

    Science.gov (United States)

    Hayes, G. P.; Earle, P. S.; Benz, H.; Wald, D. J.; Yeck, W. L.

    2017-12-01

    The U.S. Geological Survey's National Earthquake Information Center (NEIC) responds to about 160 magnitude 6.0 and larger earthquakes every year and is regularly inundated with information requests following earthquakes that cause significant impact. These requests often start within minutes after the shaking occurs and come from a wide user base including the general public, media, emergency managers, and government officials. Over the past several years, the NEIC's earthquake response has evolved its communications strategy to meet the changing needs of users and the evolving media landscape. The NEIC produces a cascade of products starting with basic hypocentral parameters and culminating with estimates of fatalities and economic loss. We speed the delivery of content by prepositioning and automatically generating products such as aftershock plots, regional tectonic summaries, maps of historical seismicity, and event summary posters. Our goal is to have information immediately available so we can quickly address the response needs of a particular event or sequence. This information is distributed to hundreds of thousands of users through social media, email alerts, programmatic data feeds, and webpages. Many of our products are included in event summary posters that can be downloaded and printed for local display. After significant earthquakes, keeping up with direct inquiries and interview requests from TV, radio, and print reporters is always challenging. The NEIC works with the USGS Office of Communications and the USGS Science Information Services to organize and respond to these requests. Written executive summary reports are produced and distributed to USGS personnel and collaborators throughout the country. These reports are updated during the response to keep our message consistent and information up to date. This presentation will focus on communications during NEIC's rapid earthquake response but will also touch on the broader USGS traditional and

  3. Fighting and preventing post-earthquake fires in nuclear power plant

    International Nuclear Information System (INIS)

    Lu Xuefeng; Zhang Xin

    2011-01-01

    Nuclear power plant post-earthquake fires cause not only personnel injury and severe economic loss, but also serious environmental pollution. At present, nuclear power is in a period of rapid development in China. Considering the earthquake-prone character of the country, it is of great engineering importance to investigate nuclear power plant post-earthquake fires. This article analyzes the causes, influential factors, and development characteristics of nuclear power plant post-earthquake fires in detail, and summarizes three principles that should be followed in fighting and preventing them: solving problems in order of importance and urgency, isolation prior to prevention, and immediate repair with regular patrol. Three aspects that require particular attention in fighting and preventing post-earthquake fires were also pointed out. (authors)

  4. Simulated earthquake ground motions

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.; Gasparini, D.A.

    1977-01-01

    The paper reviews current methods for generating synthetic earthquake ground motions. Emphasis is on the special requirements demanded of procedures to generate motions for use in nuclear power plant seismic response analysis. Specifically, very close agreement is usually sought between the response spectra of the simulated motions and prescribed, smooth design response spectra. The features and capabilities of the computer program SIMQKE, which has been widely used in power plant seismic work, are described. Problems and pitfalls associated with the use of synthetic ground motions in seismic safety assessment are also pointed out. The limitations and paucity of recorded accelerograms, together with the widespread use of time-history dynamic analysis for obtaining structural and secondary systems' response, have motivated the development of earthquake simulation capabilities. A common model for synthesizing earthquakes is that of superposing sinusoidal components with random phase angles. The input parameters for such a model are, then, the amplitudes and phase angles of the contributing sinusoids, as well as the characteristics of the variation of motion intensity with time, especially the duration of the motion. The amplitudes are determined from estimates of the Fourier spectrum or the spectral density function of the ground motion. These amplitudes may be assumed to be varying in time or constant for the duration of the earthquake. In the nuclear industry, the common procedure is to specify a set of smooth response spectra for use in aseismic design. This development and the need for time histories have generated much practical interest in synthesizing earthquakes whose response spectra 'match', or are compatible with, a set of specified smooth response spectra.
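
    The superposition model described above can be sketched in a few lines. This is a generic spectral-representation synthesis, not SIMQKE itself (SIMQKE additionally iterates on the amplitudes to match a target response spectrum); the spectral density, envelope shape, and all numeric parameters are illustrative assumptions:

```python
# Synthetic ground motion by superposing sinusoids with random phase angles,
# amplitudes drawn from an assumed spectral density, and a deterministic
# intensity envelope controlling the duration of strong shaking.

import math
import random

random.seed(0)

DT = 0.01        # time step (s)
DURATION = 20.0  # total motion duration (s)
FREQS = [0.5 * k for k in range(1, 51)]  # component frequencies, 0.5-25 Hz

def spectral_density(f):
    """Assumed one-sided spectral density (flat, purely illustrative)."""
    return 1.0

df = FREQS[1] - FREQS[0]
amps = [math.sqrt(2.0 * spectral_density(f) * df) for f in FREQS]
phases = [random.uniform(0.0, 2.0 * math.pi) for _ in FREQS]

def envelope(t):
    """Trapezoidal intensity envelope: ramp up, hold, then decay."""
    if t < 2.0:
        return t / 2.0
    if t < 12.0:
        return 1.0
    return max(0.0, 1.0 - (t - 12.0) / 8.0)

n = int(DURATION / DT)
motion = [
    envelope(i * DT)
    * sum(a * math.cos(2.0 * math.pi * f * i * DT + p)
          for a, f, p in zip(amps, FREQS, phases))
    for i in range(n)
]
print(f"{n} samples, peak |a| = {max(abs(x) for x in motion):.3f}")
```

    A response-spectrum-compatible generator would wrap this in a loop: compute the response spectrum of `motion`, scale each amplitude by the ratio of target to computed spectrum at its frequency, and repeat until the spectra agree.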

  5. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  6. Global Omori law decay of triggered earthquakes: Large aftershocks outside the classical aftershock zone

    Science.gov (United States)

    Parsons, Tom

    2002-09-01

    Triggered earthquakes can be large, damaging, and lethal, as evidenced by the 1999 shocks in Turkey and the 2001 earthquakes in El Salvador. In this study, earthquakes with Ms ≥ 7.0 from the Harvard centroid moment tensor (CMT) catalog are modeled as dislocations to calculate shear stress changes on subsequent earthquake rupture planes near enough to be affected. About 61% of earthquakes that occurred near (defined as having shear stress change ∣Δτ∣ ≥ 0.01 MPa) the Ms ≥ 7.0 shocks are associated with calculated shear stress increases, while ˜39% are associated with shear stress decreases. If earthquakes associated with calculated shear stress increases are interpreted as triggered, then such events make up at least 8% of the CMT catalog. Globally, these triggered earthquakes obey an Omori law rate decay that lasts ˜7-11 years after the main shock. Earthquakes associated with calculated shear stress increases occur at higher rates than background up to 240 km away from the main shock centroid. Omori's law is one of the few time-predictable patterns evident in the global occurrence of earthquakes. If large triggered earthquakes habitually obey Omori's law, then their hazard can be more readily assessed. The characteristic rate change with time and spatial distribution can be used to rapidly assess the likelihood of triggered earthquakes following events of Ms ≥ 7.0. I show an example application to the M = 7.7 13 January 2001 El Salvador earthquake, where use of global statistics appears to provide a better rapid hazard estimate than Coulomb stress change calculations.

  7. Earthquake Drill using the Earthquake Early Warning System at an Elementary School

    Science.gov (United States)

    Oki, Satoko; Yazaki, Yoshiaki; Koketsu, Kazuki

    2010-05-01

    economic repercussion. We provide the school kids with a "World Seismicity Map" to let them realize that earthquake disasters are unequally distributed. We then let the kids jump in front of a seismometer while projecting the real-time data onto the wall. Groups of kids compete for the largest amplitude, carefully considering how to jump high yet nail the landing together with their teammates. Their jumps are printed out on a portable printer and compared, at the same scale, with a real earthquake that occurred as far as 600 km away but is still huge when printed: a magnitude 7 earthquake recorded 600 km away needs an A0 sheet when scaled to a jump of 10 kids printed on A4. The kids come to understand what they must do to survive such enormous energy. We also offer earthquake drills using the Earthquake Early Warning (EEW) System, officially introduced in 2007 by JMA (Japan Meteorological Agency) to issue prompt alerts several seconds to several tens of seconds before the S-wave arrives. When hearing the alarm, school kids must think fast to find a place to protect themselves. They are not always in their classrooms: they may be in the chemistry lab or the music room, which have no desks to shelter under, or in a PE class. Then, in the science class, we demonstrate how the EEW System works. An 8 m long wave propagation device made of spindles connected with springs is used to visualize the P- and S-waves. In the presentation, we will show the paper materials and accompanying movies.
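
    The warning time such a system can deliver follows from the P-wave (used for detection) outrunning the slower, damaging S-wave. A rough sketch; the wave speeds and processing delay are typical assumed values, not figures from the JMA system:

```python
# Rough earthquake-early-warning time: alert goes out after the P-wave is
# detected plus a processing delay; warning lasts until the S-wave arrives.
# Vp ~ 6.5 km/s and Vs ~ 3.5 km/s are assumed typical crustal speeds.

VP_KM_S = 6.5  # P-wave speed
VS_KM_S = 3.5  # S-wave speed

def warning_time(epicentral_distance_km, processing_delay_s=5.0):
    """Seconds between the alert and S-wave arrival (negative = no warning)."""
    p_arrival = epicentral_distance_km / VP_KM_S
    s_arrival = epicentral_distance_km / VS_KM_S
    alert_time = p_arrival + processing_delay_s
    return s_arrival - alert_time

for d in (20, 60, 150):
    print(f"{d:4d} km: {warning_time(d):5.1f} s of warning")
```

    Close to the epicenter the warning time is negative (the "blind zone"); at 150 km it reaches roughly the tens of seconds the drill scenario assumes.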

  8. Fire and earthquake countermeasures in radiation handling facilities

    International Nuclear Information System (INIS)

    1985-01-01

    'Fire countermeasures in radiation handling facilities', published in 1961 and revised in 1972, has long been utilized as a valuable guideline for those handling radiation. However, science and technology have advanced rapidly and the relevant laws have been revised since its publication, so many points no longer conform to the present state. It was therefore decided to rewrite the book, and the new edition has been completed. The title was changed to 'Fire and earthquake countermeasures in radiation handling facilities', and countermeasures against earthquakes were added. Moreover, consideration was given to making the book sufficiently useful for those concerned with fire fighting as well, not only for those handling radiation. The book describes the approach to countermeasures against fires and earthquakes; the countermeasures in the normal state and when a fire or an earthquake occurs; the countermeasures when a warning declaration has been announced; and data on fires, earthquakes, the risks of radioisotopes, fire fighting equipment, earthquake countermeasures for equipment, protectors and radiation measuring instruments, first aid, an example of an emergency system in radiation handling facilities, the activities of fire fighters, examples of accidents, and so on. (Kako, I.)

  9. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has moderate seismicity; on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average period of recurrence is about 75 years. For this reason historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the past in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, as an example of a recently carried out case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  10. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs
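
    The completeness periods the study reports (complete since 1878 for epicentral intensities > IV, since 1750 for > VI, since 1600 for > VIII, and since 1300 for > IX) translate directly into a catalog filter. A minimal sketch of such a filter; the function name and catalog representation are illustrative, not from the study:

```python
# Completeness windows from the Swiss catalog study, strongest events first:
# (complete since year, intensity threshold exceeded).
COMPLETENESS = [(1300, 9), (1600, 8), (1750, 6), (1878, 4)]

def is_complete(year, intensity):
    """True if an event of this intensity falls in a complete reporting period."""
    for since, i_min in COMPLETENESS:
        if intensity > i_min and year >= since:
            return True
    return False

# Examples: an intensity-X event in 1400 is in the complete record, an
# intensity-VII event in 1700 is not (completeness for > VI starts in 1750).
print(is_complete(1400, 10), is_complete(1700, 7), is_complete(1900, 5))
```

    Restricting a hazard analysis to events passing such a filter avoids biasing rate estimates with the under-reported early part of the catalog.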

  11. A Virtual Tour of the 1868 Hayward Earthquake in Google Earth™

    Science.gov (United States)

    Lackey, H. G.; Blair, J. L.; Boatwright, J.; Brocher, T.

    2007-12-01

    The 1868 Hayward earthquake has been overshadowed by the subsequent 1906 San Francisco earthquake that destroyed much of San Francisco. Nonetheless, a modern recurrence of the 1868 earthquake would cause widespread damage to the densely populated Bay Area, particularly in the east Bay communities that have grown up virtually on top of the Hayward fault. Our concern is heightened by paleoseismic studies suggesting that the recurrence interval for the past five earthquakes on the southern Hayward fault is 140 to 170 years. Our objective is to build an educational web site that illustrates the cause and effect of the 1868 earthquake, drawing on scientific and historic information. We will use Google Earth™ software to visually illustrate complex scientific concepts in a way that is understandable to a non-scientific audience. This web site will lead the viewer from a regional summary of the plate tectonics and faulting system of western North America to more specific information about the 1868 Hayward earthquake itself. Text and Google Earth™ layers will include modeled shaking of the earthquake, relocations of historic photographs, reconstruction of damaged buildings as 3-D models, and additional scientific data that may come from the many scientific studies conducted for the 140th anniversary of the event. Earthquake engineering concerns will be stressed, including population density, vulnerable infrastructure, and lifelines. We will also present detailed maps of the Hayward fault, measurements of fault creep, and geologic evidence of its recurrence. Understanding the science behind earthquake hazards is an important step in preparing for the next significant earthquake. We hope to communicate to the public and students of all ages, through visualizations, not only the cause and effect of the 1868 earthquake, but also modern seismic hazards of the San Francisco Bay region.

  12. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful, but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
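
    The binned-rate format lends itself to a simple likelihood score. A minimal sketch in the spirit of the tests described above, assuming Poisson-distributed counts per bin; the rates and observed counts are made-up illustrative numbers, not RELM data:

```python
# Likelihood scoring of a gridded earthquake forecast: each space-magnitude
# bin has a forecast rate, the catalog gives an observed count per bin, and
# under a Poisson assumption the joint log-likelihood sums over bins.

import math

def poisson_log_likelihood(rates, counts):
    """Joint Poisson log-likelihood of observed bin counts given forecast rates."""
    ll = 0.0
    for lam, n in zip(rates, counts):
        ll += -lam + n * math.log(lam) - math.lgamma(n + 1)
    return ll

forecast_a = [0.5, 1.2, 0.1, 2.0]  # expected events per bin, model A
forecast_b = [1.0, 1.0, 1.0, 1.0]  # expected events per bin, model B
observed   = [0,   1,   0,   2]    # events actually recorded per bin

ll_a = poisson_log_likelihood(forecast_a, observed)
ll_b = poisson_log_likelihood(forecast_b, observed)
print(f"model A: {ll_a:.3f}, model B: {ll_b:.3f}")  # higher is better
```

    Comparing two models on the same observed catalog, as in the pairwise RELM test, then reduces to comparing these joint log-likelihoods (in practice via the distribution of their difference under simulated catalogs).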

  13. An information infrastructure for earthquake science

    Science.gov (United States)

    Jordan, T. H.; Scec/Itr Collaboration

    2003-04-01

    The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, IRIS, and the USGS, has received a large five-year grant from the NSF's ITR Program and its Geosciences Directorate to build a new information infrastructure for earthquake science. In many respects, the SCEC/ITR Project presents a microcosm of the IT efforts now being organized across the geoscience community, including the EarthScope initiative. The purpose of this presentation is to discuss the experience gained by the project thus far and lay out the challenges that lie ahead; our hope is to encourage cross-discipline collaboration in future IT advancements. Project goals have been formulated in terms of four "computational pathways" related to seismic hazard analysis (SHA). For example, Pathway 1 involves the construction of an open-source, object-oriented, and web-enabled framework for SHA computations that can incorporate a variety of earthquake forecast models, intensity-measure relationships, and site-response models, while Pathway 2 aims to utilize the predictive power of wavefield simulation in modeling time-dependent ground motion for scenario earthquakes and constructing intensity-measure relationships. The overall goal is to create a SCEC "community modeling environment" or collaboratory that will comprise the curated (on-line, documented, maintained) resources needed by researchers to develop and use these four computational pathways. Current activities include (1) the development and verification of the computational modules, (2) the standardization of data structures and interfaces needed for syntactic interoperability, (3) the development of knowledge representation and management tools, (4) the construction of SCEC computational and data grid testbeds, and (5) the creation of user interfaces for knowledge-acquisition, code execution, and visualization. I will emphasize the increasing role of standardized

  14. Identified EM Earthquake Precursors

    Science.gov (United States)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement, either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February 2013. The antennae have mobility, and observations were noted for

  15. Future Earth: Reducing Loss By Automating Response to Earthquake Shaking

    Science.gov (United States)

    Allen, R. M.

    2014-12-01

    Earthquakes pose a significant threat to society in the U.S. and around the world. The risk is easily forgotten given the infrequent recurrence of major damaging events, yet the likelihood of a major earthquake in California in the next 30 years is greater than 99%. As our societal infrastructure becomes ever more interconnected, the potential impacts of these future events are difficult to predict. Yet the same interconnected infrastructure also allows us to rapidly detect earthquakes as they begin and provide seconds, tens of seconds, or a few minutes of warning. A demonstration earthquake early warning system is now operating in California and is being expanded to the west coast (www.ShakeAlert.org). In recent earthquakes in the Los Angeles region, alerts were generated that could have provided warning to the vast majority of Los Angelinos who experienced the shaking. Efforts are underway to build a public system. Smartphone technology will be used not only to issue the alerts, but could also be used to collect data and improve the warnings. The MyShake project at UC Berkeley is currently testing an app that attempts to turn millions of smartphones into earthquake detectors. As our development of the technology continues, we can anticipate ever more automated response to earthquake alerts. Already, the BART system in the San Francisco Bay Area automatically stops trains based on the alerts. In the future, elevators will stop, machinery will pause, hazardous materials will be isolated, and self-driving cars will pull over to the side of the road. In this presentation we will review the current status of the earthquake early warning system in the US. We will illustrate how smartphones can contribute to the system. Finally, we will review applications of the information to reduce future losses.

  16. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  17. Pain after earthquake

    Directory of Open Access Journals (Sweden)

    Angeletti Chiara

    2012-06-01

    Introduction On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people and significantly damaged more than 10,000 buildings in the L'Aquila region. Objectives This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (6 April 2009). Methods 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results A third of patients reported pain (prevalence 34.6%). More than half of the pain patients reported severe pain (58.8%). Analgesic agents were limited to the available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth week, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and current drug treatments in this region may be adequate for emergency situations.

  18. Fault lubrication during earthquakes.

    Science.gov (United States)

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m s⁻¹) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, constraints must be derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s⁻¹. The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and for non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) the mechanical work measures from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate the experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  19. Housing Damage Following Earthquake

    Science.gov (United States)

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  20. Do Earthquakes Shake Stock Markets?

    Science.gov (United States)

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  1. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  2. Leveraging geodetic data to reduce losses from earthquakes

    Science.gov (United States)

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.

    2018-04-23

    Seismic hazard assessments that are based on a variety of data and the best available science, coupled with rapid synthesis of real-time information from continuous monitoring networks to guide post-earthquake response, form a solid foundation for effective earthquake loss reduction. With this in mind, the Earthquake Hazards Program (EHP) of the U.S. Geological Survey (USGS) Natural Hazards Mission Area (NHMA) engages in a variety of undertakings, both established and emergent, in order to provide high quality products that enable stakeholders to take action in advance of and in response to earthquakes. Examples include the National Seismic Hazard Model (NSHM), development of tools for improved situational awareness such as earthquake early warning (EEW) and operational earthquake forecasting (OEF), research about induced seismicity, and new efforts to advance comprehensive subduction zone science and monitoring. Geodetic observations provide unique and complementary information directly relevant to advancing many aspects of these efforts (fig. 1). EHP scientists have long leveraged geodetic data for a range of influential studies, and they continue to develop innovative observation and analysis methods that push the boundaries of the field of geodesy as applied to natural hazards research. Given the ongoing, rapid improvement in availability, variety, and precision of geodetic measurements, considering ways to fully utilize this observational resource for earthquake loss reduction is timely and essential. 
This report presents strategies, and the underlying scientific rationale, by which the EHP could achieve the following outcomes: the EHP is an authoritative source for the interpretation of geodetic data and its use for earthquake loss reduction throughout the United States and its territories; the USGS consistently provides timely, high quality geodetic data to stakeholders; significant earthquakes are better characterized by incorporating geodetic data into USGS

  3. Earthquake resistant design of structures

    International Nuclear Information System (INIS)

    Choi, Chang Geun; Kim, Gyu Seok; Lee, Dong Geun

    1990-02-01

    This book covers the occurrence of earthquakes and the analysis of earthquake damage; the equivalent static analysis method and its application; dynamic analysis methods such as time history analysis by mode superposition and by direct integration; and design spectrum analysis in the context of earthquake-resistant design in Korea. Topics include the analysis model and vibration modes, calculation of base shear, calculation of story seismic loads, and combination of analysis results.

  4. Immersive Visual Data Analysis For Geoscience Using Commodity VR Hardware

    Science.gov (United States)

    Kreylos, O.; Kellogg, L. H.

    2017-12-01

    Immersive visualization using virtual reality (VR) display technology offers tremendous benefits for the visual analysis of complex three-dimensional data like those commonly obtained from geophysical and geological observations and models. Unlike "traditional" visualization, which has to project 3D data onto a 2D screen for display, VR can side-step this projection and display 3D data directly, in a pseudo-holographic (head-tracked stereoscopic) form, and therefore does not suffer the distortions of relative positions, sizes, distances, and angles that are inherent in 2D projection. As a result, researchers can apply their spatial reasoning skills to virtual data in the same way they can to real objects or environments. The UC Davis W.M. Keck Center for Active Visualization in the Earth Sciences (KeckCAVES, http://keckcaves.org) has been developing VR methods for data analysis since 2005, but the high cost of VR displays had been preventing large-scale deployment and adoption of KeckCAVES technology. The recent emergence of high-quality commodity VR, spearheaded by the Oculus Rift and HTC Vive, has fundamentally changed the field. With KeckCAVES' foundational VR operating system, Vrui, now running natively on the HTC Vive, all KeckCAVES visualization software, including 3D Visualizer, LiDAR Viewer, Crusta, Nanotech Construction Kit, and ProtoShop, is now available to small labs, single researchers, and even home users. LiDAR Viewer and Crusta have been used for rapid response to geologic events including earthquakes and landslides, to visualize the impacts of sea-level rise, to investigate reconstructed paleo-oceanographic masses, and for exploration of the surface of Mars. The Nanotech Construction Kit is being used to explore the phases of carbon in Earth's deep interior, while ProtoShop can be used to construct and investigate protein structures.

  5. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    Science.gov (United States)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts, including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally-distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of
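    The Short-Term-Average / Long-Term-Average trigger described above can be sketched on a binned tweet-count series. This is a minimal illustration, not the USGS implementation: the one-sample binning, window lengths, and threshold below are illustrative assumptions.

```python
def sta_lta_triggers(counts, sta_len=60, lta_len=600, threshold=5.0):
    """Return indices where the short-term average of the count series
    exceeds `threshold` times the long-term average (STA/LTA trigger)."""
    triggers = []
    for i in range(lta_len, len(counts)):
        sta = sum(counts[i - sta_len:i]) / sta_len
        lta = sum(counts[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers
```

    Raising `threshold` catches fewer events with fewer false triggers, which is the tuning trade-off the abstract describes.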

  6. The severity of an earthquake

    Science.gov (United States)

    ,

    1997-01-01

    The severity of an earthquake can be expressed in terms of both intensity and magnitude. However, the two terms are quite different, and they are often confused. Intensity is based on the observed effects of ground shaking on people, buildings, and natural features. It varies from place to place within the disturbed region depending on the location of the observer with respect to the earthquake epicenter. Magnitude is related to the amount of seismic energy released at the hypocenter of the earthquake. It is based on the amplitude of the earthquake waves recorded on instruments

  7. Quasi real-time estimation of the moment magnitude of large earthquake from static strain changes

    Science.gov (United States)

    Itaba, S.

    2016-12-01

    The 2011 Tohoku-Oki (off the Pacific coast of Tohoku) earthquake, of moment magnitude 9.0, was accompanied by large static strain changes (10⁻⁷), as measured by borehole strainmeters operated by the Geological Survey of Japan in the Tokai, Kii Peninsula, and Shikoku regions. A fault model for the earthquake on the boundary between the Pacific and North American plates, based on these borehole strainmeter data, yielded a moment magnitude of 8.7. By contrast, the magnitude in the prompt report that the Japan Meteorological Agency (JMA) issued just after the earthquake, based on seismic waves, was 7.9. Geodetic moment magnitudes derived from static strain changes can be estimated almost as rapidly as determinations using seismic waves. The validity of this method still has to be verified in further cases. For this earthquake's largest aftershock, which occurred 29 minutes after the mainshock, the prompt report issued by JMA assigned a magnitude of 7.3, whereas the moment magnitude derived from the borehole strain data is 7.6, much closer to the actual moment magnitude of 7.7. Several methods are now being proposed for grasping the magnitude of a great earthquake earlier, in order to reduce earthquake disasters, including tsunami. Our simple method using static strain changes is one robust method for rapid estimation of the magnitude of large earthquakes, and is useful for improving the accuracy of Earthquake Early Warning.
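    For reference, a geodetic fault model such as the one above yields a seismic moment M0, which converts to moment magnitude via the standard Hanks-Kanamori relation; the sample moment value in the test is only an order-of-magnitude illustration.

```python
import math

def moment_magnitude(m0_newton_meters):
    """Hanks-Kanamori moment magnitude: Mw = (2/3) * (log10(M0) - 9.1),
    with the seismic moment M0 expressed in newton-meters."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)
```

    Because Mw depends on log10(M0), a tenfold increase in seismic moment raises the magnitude by 2/3 of a unit, which is why magnitude saturation in seismic-wave estimates matters so much for great earthquakes.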

  8. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992), and fatality rates are likely to continue to rise with increased population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that a large fraction of the world's population still resides in informal, poorly constructed and non-engineered dwellings with high susceptibility to collapse during earthquakes. Moreover, with increasing urbanization half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use for other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the

  9. Estimating shaking-induced casualties and building damage for global earthquake events: a proposed modelling approach

    Science.gov (United States)

    So, Emily; Spence, Robin

    2013-01-01

    Recent earthquakes such as the Haiti earthquake of 12 January 2010 and the Qinghai earthquake of 14 April 2010 have highlighted the importance of rapid estimation of casualties after the event for humanitarian response. Both of these events resulted in surprisingly high death tolls, casualties and numbers of survivors made homeless. In the Mw = 7.0 Haiti earthquake, over 200,000 people perished, with more than 300,000 reported injuries and 2 million made homeless. The Mw = 6.9 earthquake in Qinghai resulted in over 2,000 deaths, with a further 11,000 people with serious or moderate injuries and 100,000 people left homeless in this mountainous region of China. In such events relief efforts can benefit significantly from the availability of rapid estimation and mapping of expected casualties. This paper contributes to ongoing global efforts to estimate probable earthquake casualties very rapidly after an earthquake has taken place. The analysis uses the assembled empirical damage and casualty data in the Cambridge Earthquake Impacts Database (CEQID) and explores data by event and across events to test the relationships of building and fatality distributions to the main explanatory variables of building type, building damage level and earthquake intensity. The prototype global casualty estimation model described here uses a semi-empirical approach that estimates damage rates for different classes of buildings present in the local building stock, and then relates fatality rates to the damage rates of each class of buildings. This approach accounts for the effect on casualties of the very different types of buildings (by climatic zone, urban or rural location, culture, income level, etc.). The resulting casualty parameters were tested against the overall casualty data from several historical earthquakes in CEQID; a reasonable fit was found.

  10. MyShake: A smartphone seismic network for earthquake early warning and beyond.

    Science.gov (United States)

    Kong, Qingkai; Allen, Richard M; Schreier, Louis; Kwon, Young-Woo

    2016-02-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics.

  11. Visual languages and applications

    CERN Document Server

    Zhang, Kang

    2010-01-01

    Visual languages have long been a pursuit of effective communication between human and machine. With rapid advances of the Internet and Web technology, human-human communication through the Web or electronic mobile devices is becoming more and more prevalent. Visual Languages and Applications is a comprehensive introduction to diagrammatical visual languages. This book discusses what visual programming languages are, and how such languages and their underlying foundations can be usefully applied to other fields in computer science. It also covers a broad range of contents from the underlying t

  12. Real-Time Earthquake Monitoring with Spatio-Temporal Fields

    Science.gov (United States)

    Whittier, J. C.; Nittel, S.; Subasinghe, I.

    2017-10-01

    With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope must write tedious application code to integrate potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open-source Data Stream Engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real time.
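    The per-stream windowed-maximum query described above can be sketched in plain Python; this is a stand-in for the Spark field stream queries, and the window length and trigger threshold are illustrative assumptions, not SCIGN values.

```python
from collections import deque

class DisplacementWindow:
    """Track the maximum absolute displacement seen on one GPS stream
    over the most recent `window` samples (the latest query window)."""

    def __init__(self, window=30):
        self.samples = deque(maxlen=window)  # old samples evicted automatically

    def push(self, displacement_mm):
        """Add a sample and return the current windowed maximum."""
        self.samples.append(displacement_mm)
        return max(abs(s) for s in self.samples)

def event_candidates(latest_maxima, threshold_mm=20.0):
    """Station IDs whose windowed maximum exceeds the trigger threshold;
    in the full system these would then be checked for spatial clustering."""
    return [sid for sid, m in latest_maxima.items() if m >= threshold_mm]
```

    A real deployment would run one such window per stream and correlate triggered stations with their spatial neighbors before declaring an event.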

  13. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because non-foreshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
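    The qualitative effect described in this abstract, with a baseline foreshock probability diluted by Omori-law aftershock activity that decays with time, can be sketched as follows. This is an illustrative toy model, not Jones's actual equation, and all parameter values are arbitrary assumptions.

```python
def foreshock_probability(p_base, t_days, k=100.0, c=0.05, p_omori=1.0,
                          r_background=1.0):
    """Toy model: dilute a baseline foreshock probability `p_base` by
    adding modified-Omori-law aftershock rate k/(t + c)**p to the
    background rate. As aftershocks decay, the probability recovers."""
    r_aftershock = k / (t_days + c) ** p_omori
    return p_base * r_background / (r_background + r_aftershock)
```

    Right after the previous mainshock the aftershock rate dominates and the probability is strongly suppressed; a year later it has nearly recovered to the baseline, matching the behavior the abstract describes.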

  14. Development of fragility functions to estimate homelessness after an earthquake

    Science.gov (United States)

    Brink, Susan A.; Daniell, James; Khazai, Bijan; Wenzel, Friedemann

    2014-05-01

    Immediately after an earthquake, many stakeholders need to make decisions about their response. These decisions often need to be made in a data-poor environment, as accurate information on the impact can take months or even years to be collected and publicized. Social fragility functions have been developed and applied to provide an estimate of the impact in terms of building damage, deaths and injuries in near real time. These rough estimates can help governments and response agencies determine what aid may be required, which can improve their emergency response and facilitate planning for longer-term response. Due to building damage, lifeline outages, fear of aftershocks, or other causes, people may become displaced or homeless after an earthquake. Especially in cold and dangerous locations, the rapid provision of safe emergency shelter can be a lifesaving necessity. However, immediately after an event there is little information available about the number of homeless, their locations, and whether they require public shelter to aid the response agencies in decision making. In this research, we analyze homelessness after historic earthquakes using the CATDAT Damaging Earthquakes Database. CATDAT includes information on the hazard as well as the physical and social impact of over 7,200 damaging earthquakes from 1900-2013 (Daniell et al. 2011). We explore the relationship of both earthquake characteristics and area characteristics with homelessness after the earthquake. We consider modelled variables such as population density, HDI, year, and measures of ground-motion intensity developed in Daniell (2014) over the period 1900-2013, as well as temperature. Using a methodology based on that used for the PAGER fatality fragility curves developed by Jaiswal and Wald (2010), but with regression through time using the socioeconomic parameters developed in Daniell et al. (2012) for "socioeconomic fragility functions", we develop a set of fragility curves that can be

  15. Generation of earthquake signals

    International Nuclear Information System (INIS)

    Kjell, G.

    1994-01-01

    Seismic verification can be performed either as a full scale test on a shaker table or as numerical calculations. In both cases it is necessary to have an earthquake acceleration time history. This report describes generation of such time histories by filtering white noise. Analogue and digital filtering methods are compared. Different methods of predicting the response spectrum of a white noise signal filtered by a band-pass filter are discussed. Prediction of both the average response level and the statistical variation around this level are considered. Examples with both the IEEE 301 standard response spectrum and a ground spectrum suggested for Swedish nuclear power stations are included in the report
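    The digital variant of the generation method described above, band-pass filtering white noise, can be sketched as follows. The duration, time step, pass band, and trapezoidal envelope are illustrative assumptions; they are not the parameters of the report or of the IEEE or Swedish ground spectra it discusses.

```python
import numpy as np

def synth_accel(duration=20.0, dt=0.01, f_lo=0.5, f_hi=10.0, seed=0):
    """Generate a synthetic acceleration time history by band-pass
    filtering Gaussian white noise in the frequency domain, then
    applying a trapezoidal envelope to mimic build-up and decay."""
    rng = np.random.default_rng(seed)
    n = int(duration / dt)
    noise = rng.standard_normal(n)
    freqs = np.fft.rfftfreq(n, d=dt)
    spec = np.fft.rfft(noise)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0  # ideal band-pass filter
    accel = np.fft.irfft(spec, n)
    t = np.arange(n) * dt
    # Trapezoidal envelope: ramp up over 2 s, ramp down over the last 5 s
    env = np.clip(np.minimum(t / 2.0, (duration - t) / 5.0), 0.0, 1.0)
    return t, accel * env
```

    In practice one would iterate, scaling the signal's Fourier amplitudes until its computed response spectrum matches the target spectrum within tolerance, which is the step the report analyzes statistically.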

  16. Earthquakes Threaten Many American Schools

    Science.gov (United States)

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  17. Make an Earthquake: Ground Shaking!

    Science.gov (United States)

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  18. Automated Determination of Magnitude and Source Length of Large Earthquakes

    Science.gov (United States)

    Wang, D.; Kawakatsu, H.; Zhuang, J.; Mori, J. J.; Maeda, T.; Tsuruoka, H.; Zhao, X.

    2017-12-01

    Rapid determination of earthquake magnitude is important for estimating shaking damage and tsunami hazards. However, due to the complexity of the source process, accurately estimating the magnitude of great earthquakes within minutes of origin time is still a challenge. Mw is an accurate estimate for large earthquakes, but calculating Mw requires the whole wave train including P, S, and surface phases, which takes tens of minutes to reach stations at teleseismic distances. To speed up the calculation, methods using the W phase and body waves have been developed for fast estimation of earthquake size. Besides these methods, which involve Green's functions and inversions, there are other approaches that use empirical scaling relations to estimate earthquake magnitudes, usually for large earthquakes. Their simple implementation and straightforward calculation have made these approaches widely used at institutions such as the Pacific Tsunami Warning Center, the Japan Meteorological Agency, and the USGS. Here we developed an approach, originating from Hara [2007], that estimates magnitude from P-wave displacement and source duration. We instead introduced a back-projection technique [Wang et al., 2016] to estimate source duration using array data from a high-sensitivity seismograph network (Hi-net). The introduction of back-projection improves the method in two ways. Firstly, the source duration can be accurately determined by the seismic array. Secondly, the results can be calculated more rapidly, and data from farther stations are not required. We propose to develop an automated system for determining fast and reliable source information for large shallow seismic events, based on real-time data from a dense regional array and global data, for earthquakes that occur at distances of roughly 30°-85° from the array center. This system can offer fast and robust estimates of the magnitudes and rupture extents of large earthquakes in 6 to 13 min (plus

  19. Rapid world modeling

    International Nuclear Information System (INIS)

    Little, Charles; Jensen, Ken

    2002-01-01

    Sandia National Laboratories has designed and developed systems capable of large-scale, three-dimensional mapping of unstructured environments in near real time. This mapping technique is called rapid world modeling and has proven invaluable when used by prototype systems consisting of sensory detection devices mounted on mobile platforms. These systems can be deployed into previously unmapped environments and transmit real-time 3-D visual images to operators located remotely. This paper covers a brief history of the rapid world modeling system, its implementation on mobile platforms, and the current state of the technology. Applications to the nuclear power industry are discussed. (author)

  20. Earthquake and Tsunami Disaster Mitigation in the Marmara Region and Disaster Education in Turkey Part3

    Science.gov (United States)

    Kaneda, Yoshiyuki; Ozener, Haluk; Meral Ozel, Nurcan; Kalafat, Dogan; Ozgur Citak, Seckin; Takahashi, Narumi; Hori, Takane; Hori, Muneo; Sakamoto, Mayumi; Pinar, Ali; Oguz Ozel, Asim; Cevdet Yalciner, Ahmet; Tanircan, Gulum; Demirtas, Ahmet

    2017-04-01

    There have been many destructive earthquakes and tsunamis worldwide. Recent events include the 2011 East Japan Earthquake/Tsunami, the 2015 Nepal Earthquake, the 2016 Kumamoto Earthquake in Japan, and, very recently, a destructive earthquake in Central Italy. In Turkey, the destructive 1999 Izmit Earthquake occurred along the North Anatolian Fault (NAF). The NAF crosses the Sea of Marmara, and the only remaining "seismic gap" lies beneath the Sea of Marmara. Istanbul, whose large population is comparable to Tokyo's, is located around the Sea of Marmara, where severe compound damage, including tsunami and liquefaction effects, is expected when the next destructive Marmara earthquake occurs. The seismic risk of Istanbul appears similar to that of Tokyo facing a Nankai Trough or metropolitan earthquake. It was considered that Japanese and Turkish researchers could share their experiences of past damaging earthquakes and prepare for future large earthquakes in cooperation with each other. Therefore, in 2013 the two countries, Japan and Turkey, agreed to start a multidisciplinary research project, MarDiM SATREPS. The project aims to raise preparedness for possible large-scale earthquake and tsunami disasters in the Marmara Region and has four research groups with the following goals. 1) The first is the Marmara Earthquake Source region observational research group, with four sub-groups: Seismicity, Geodesy, Electromagnetics, and Trench analyses. Preliminary results, such as seismicity and crustal deformation on the sea floor of the Sea of Marmara, have already been achieved. 2) The second group focuses on scenario research of earthquake occurrence along the North Anatolian Fault and precise tsunami simulation in the Marmara region. Research results from this group are to be the model of the earthquake occurrence scenario in the Sea of Marmara and the

  1. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved the locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  2. ELER software - a new tool for urban earthquake loss assessment

    Science.gov (United States)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.

    2010-12-01

    Rapid loss estimation after potentially damaging earthquakes is critical for effective emergency response and public information. A methodology and software package, ELER-Earthquake Loss Estimation Routine, for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region was developed under the Joint Research Activity-3 (JRA3) of the EC FP6 Project entitled "Network of Research Infrastructures for European Seismology-NERIES". Recently, a new version (v2.0) of the ELER software has been released. The multi-level methodology developed is capable of incorporating regional variability and uncertainty originating from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. Although primarily intended for quasi real-time estimation of earthquake shaking and losses, the routine is equally capable of performing scenario-based earthquake loss assessments. This paper introduces the urban earthquake loss assessment module (Level 2) of the ELER software, which makes use of the most detailed inventory databases of physical and social elements at risk in combination with analytical vulnerability relationships and building damage-related casualty vulnerability models for the estimation of building damage and casualty distributions, respectively. The spectral capacity-based loss assessment methodology and its vital components are presented. The analysis methods of the Level 2 module, i.e. the Capacity Spectrum Method (ATC-40, 1996), the Modified Acceleration-Displacement Response Spectrum Method (FEMA 440, 2005), the Reduction Factor Method (Fajfar, 2000) and the Coefficient Method (ASCE 41-06, 2006), are applied to the selected building types for validation and verification purposes. The damage estimates are compared to the results obtained from other studies available in the literature, i.e. SELENA v4.0 (Molina et al., 2008) and
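    The performance-point search at the heart of the Capacity Spectrum Method reduces to intersecting the capacity curve with the demand spectrum in spectral-displacement/spectral-acceleration (ADRS) space. A minimal sketch, assuming both curves are already sampled on a common displacement grid (a full ELER-style run would also iterate on the damping-reduced demand):

    ```python
    import numpy as np

    def performance_point(sd, capacity_sa, demand_sa):
        """Locate the capacity/demand intersection in ADRS space.

        sd           spectral displacement grid (monotonic array)
        capacity_sa  capacity curve sampled on sd
        demand_sa    demand spectrum sampled on sd
        Returns the interpolated spectral displacement at the first
        crossing, or None if the curves do not intersect.
        """
        diff = capacity_sa - demand_sa
        sign = np.sign(diff)
        crossings = np.where(np.diff(sign) != 0)[0]
        if crossings.size == 0:
            return None
        i = crossings[0]
        # Linear interpolation between the bracketing samples.
        t = diff[i] / (diff[i] - diff[i + 1])
        return sd[i] + t * (sd[i + 1] - sd[i])
    ```

    The returned displacement is the performance point from which damage-state probabilities would be read off fragility curves.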

  3. SHORT COMMUNICATION Rapid Visual Assessment of Fish ...

    African Journals Online (AJOL)

    the Western Indian Ocean, the trophic structure on Bazaruto's reefs proved typical for the .... The Bazaruto Archipelago is a chain of four islands ... Reef lies furthest from the waters enclosed ...... to a simple index such as carnivore abundance.

  4. The 2014 Greeley, Colorado Earthquakes: Science, Industry, Regulation, and Media

    Science.gov (United States)

    Yeck, W. L.; Sheehan, A. F.; Weingarten, M.; Nakai, J.; Ge, S.

    2014-12-01

    first time that such action had been taken by the COGCC. This presentation provides an overview of the interactions among academic researchers, industry, media, and regulators during the period of rapid response to this earthquake sequence, and the role of seismology in informing those responses.

  5. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

    Full Text Available The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global

  6. The CATDAT damaging earthquakes database

    Science.gov (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto (214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  7. Areas prone to slow slip events impede earthquake rupture propagation and promote afterslip

    Science.gov (United States)

    Rolandone, Frederique; Nocquet, Jean-Mathieu; Mothes, Patricia A.; Jarrin, Paul; Vallée, Martin; Cubas, Nadaya; Hernandez, Stephen; Plain, Morgan; Vaca, Sandro; Font, Yvonne

    2018-01-01

    At subduction zones, transient aseismic slip occurs either as afterslip following a large earthquake or as episodic slow slip events during the interseismic period. Afterslip and slow slip events are usually considered as distinct processes occurring on separate fault areas governed by different frictional properties. Continuous GPS (Global Positioning System) measurements following the 2016 Mw (moment magnitude) 7.8 Ecuador earthquake reveal that large and rapid afterslip developed at discrete areas of the megathrust that had previously hosted slow slip events. Regardless of whether they were locked or not before the earthquake, these areas appear to persistently release stress by aseismic slip throughout the earthquake cycle and outline the seismic rupture, an observation potentially leading to a better anticipation of future large earthquakes. PMID:29404404

  8. Impacts of the 2010 Haitian earthquake in the diaspora: findings from Little Haiti, Miami, FL.

    Science.gov (United States)

    Kobetz, Erin; Menard, Janelle; Kish, Jonathan; Bishop, Ian; Hazan, Gabrielle; Nicolas, Guerda

    2013-04-01

    In January 2010, a massive earthquake struck Haiti, resulting in unprecedented damage. Little attention, however, has focused on the earthquake's mental health impact in the Haitian diaspora community. As part of an established community-based participatory research initiative in Little Haiti, the predominately Haitian neighborhood in Miami, FL, USA, community health workers conducted surveys with neighborhood residents about earthquake-related losses, coping strategies, and depressive/traumatic symptomology. Findings reveal that the earthquake strongly impacted the diaspora community and highlight prominent coping strategies. Following the earthquake, only a small percentage of participants self-reported engaging in any negative health behaviors. Instead, a majority relied on their social networks for support. This study contributes to the discourse on designing culturally-responsive mental health initiatives for the Haitian diaspora and the ability of existing community-academic partnerships to rapidly adapt to community needs.

  9. Assessment of earthquake effects - contribution from online communication

    Science.gov (United States)

    D'Amico, Sebastiano; Agius, Matthew; Galea, Pauline

    2014-05-01

    The rapid growth of social media and online newspapers in recent years has made it possible to conduct a national investigation of macroseismic effects on the Maltese Islands based on felt earthquake reports. A magnitude 4.1 earthquake struck close to Malta on Sunday 24th April 2011 at 13:02 GMT. The earthquake was preceded and followed by a series of smaller magnitude quakes throughout the day, most of which were felt by locals on the island. The continuous news media coverage during the day and the extensive sharing of the news item on social media resulted in a strong public response to fill in the 'Did you feel it?' online form on the website of the Seismic Monitoring and Research Unit (SMRU) at the University of Malta (http://seismic.research.um.edu.mt/). The results yield interesting information about the demographics of the island and the different felt experiences, possibly relating to geological settings and to buildings of diverse structural types and ages. Based on this case study, the SMRU is in the process of developing a mobile phone application dedicated to sharing earthquake information with the local community. The application will automatically prompt users to fill in a simplified 'Did you feel it?' report for potentially felt earthquakes. Automatic location using Global Positioning Systems can be incorporated to provide a 'real-time' intensity map that can be used by the Civil Protection Department.
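    Aggregating 'Did you feel it?' reports into grid cells is the basic step behind such a real-time intensity map. A minimal sketch with a hypothetical function name and simple mean-per-cell aggregation (operational systems such as the USGS DYFI use a more elaborate community-intensity calculation):

    ```python
    from collections import defaultdict

    def grid_intensity(reports, cell_deg=0.02):
        """Bin felt reports into lat/lon cells and average intensity per cell.

        reports  iterable of (lat, lon, intensity) tuples
        cell_deg cell size in degrees (assumed value for illustration)
        Returns {(lat_cell, lon_cell): mean_intensity}.
        """
        cells = defaultdict(list)
        for lat, lon, intensity in reports:
            key = (round(lat / cell_deg), round(lon / cell_deg))
            cells[key].append(intensity)
        return {k: sum(v) / len(v) for k, v in cells.items()}
    ```

    Each cell's mean value can then be colored on a map to produce the 'real-time' intensity picture described above.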

  10. Electrical streaming potential precursors to catastrophic earthquakes in China

    Directory of Open Access Journals (Sweden)

    F. Qian

    1997-06-01

    Full Text Available The majority of anomalies in self-potential at 7 stations within 160 km of the epicentre showed a similar pattern of rapid onset and slow decay before and during the M 7.8 Tangshan earthquake of 1976. Considering that some of these anomalies were associated with episodic spouting from boreholes or with increases in pore pressure in wells, the observed anomalies are streaming potentials generated by local events of sudden movement and by the diffusion of high-pressure fluid in parallel faults. These transient events, triggered by tidal forces, exhibited a periodic nature and a statistical tendency to migrate towards the epicentre about one month before the earthquake. As a result of these events, the pore pressure reached a final equilibrium state higher than the initial state in a large enough section of the fault region. Consequently, the local effective shear strength of the material in the fault zone decreased, and finally the catastrophic earthquake was induced. Similar phenomena also occurred one month before the M 7.3 Haicheng earthquake of 1975. Therefore, short-term earthquake prediction can be made by electrical measurements, which are the kind of geophysical measurements most closely related to pore-fluid behaviour in the deep crust.

  11. Accounting for orphaned aftershocks in the earthquake background rate

    Science.gov (United States)

    Van Der Elst, Nicholas

    2017-01-01

    Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.
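    The heavy 1/t tail is what produces orphaned aftershocks: with a modified Omori rate K/(t+c)^p, a non-negligible fraction of a sequence falls beyond any finite observation window. A small sketch of that tail fraction under assumed parameter values (c, p, and the sequence-duration cutoff are illustrative, not the paper's California estimates):

    ```python
    def omori_tail_fraction(T, c=0.01, p=1.1, t_max=1e5):
        """Fraction of a modified-Omori sequence K/(t+c)^p occurring after
        time T (days), i.e. aftershocks that would look 'orphaned' if the
        catalog only starts T days after the main shock.

        Uses the closed-form integral for p != 1; K cancels in the ratio.
        """
        def cum(t):
            # integral of (s + c)^(-p) ds from 0 to t
            return (c ** (1 - p) - (t + c) ** (1 - p)) / (p - 1)
        return (cum(t_max) - cum(T)) / cum(t_max)
    ```

    Because p is close to 1, the fraction decays slowly with T, which is why long-past main shocks can still contribute events to the apparent background rate.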

  12. Earthquake Hazard for Aswan High Dam Area

    Science.gov (United States)

    Ismail, Awad

    2016-04-01

    Earthquake activity and seismic hazard analysis are important components of the seismic assessment of critical structures such as major dams. The Aswan High Dam (AHD) created the second man-made reservoir in the world (Lake Nasser) and, being constructed near urban areas, poses a high risk potential for downstream life and property. The dam area is one of the seismically active regions of Egypt and is cut by several crossing faults, dominantly east-west and north-south. Epicenters were found to cluster around active faults in the northern part of the lake and near the AHD. The space-time distribution of the seismicity and its relation to fluctuations in the lake water level were studied. The Aswan seismicity separates into shallow and deep seismic zones, between 0 and 14 km and between 14 and 30 km depth, respectively. These two seismic zones behave differently over time, as indicated by the seismicity rate, lateral extent, b-value, and spatial clustering. The activity is characterized by earthquake swarm sequences showing clustered events over time and space. The effect of the North African drought (1982 to present) is clearly seen in the reservoir water level. As the level decreased and left the most active fault segments uncovered, the shallow activity was found to be more sensitive to rapid discharging than to filling. This study indicates that geology, topography, lineations in seismicity, offsets in the faults, changes in fault trends, and focal mechanisms are closely related. No relation was found between earthquake activity and either groundwater table fluctuations or water temperatures measured in wells located around the Kalabsha area. The peak ground acceleration at the dam site is estimated based on strong ground motion simulation. These seismic hazard analyses indicate that AHD is stable under the present seismicity. Earthquake epicenters have recently occurred approximately 5 km west of the AHD structure. This suggests that AHD dam must be
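    One of the quantities tracked above, the b-value, is commonly computed with Aki's maximum-likelihood estimator plus Utsu's correction for magnitude binning. A minimal sketch, assuming a completeness magnitude mc and bin width dm:

    ```python
    import math

    def b_value(mags, mc, dm=0.1):
        """Aki (1965) maximum-likelihood b-value with Utsu's binning correction.

        mags  catalog magnitudes (only events >= mc are used)
        mc    magnitude of completeness
        dm    magnitude bin width (Utsu correction shifts mc by dm/2)
        """
        m = [x for x in mags if x >= mc]
        mean_m = sum(m) / len(m)
        return math.log10(math.e) / (mean_m - (mc - dm / 2.0))
    ```

    Comparing b-values between the shallow and deep zones, as the study does, simply means running this estimator on each depth-filtered subcatalog.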

  13. Evaluation of Earthquake-Induced Effects on Neighbouring Faults and Volcanoes: Application to the 2016 Pedernales Earthquake

    Science.gov (United States)

    Bejar, M.; Alvarez Gomez, J. A.; Staller, A.; Luna, M. P.; Perez Lopez, R.; Monserrat, O.; Chunga, K.; Herrera, G.; Jordá, L.; Lima, A.; Martínez-Díaz, J. J.

    2017-12-01

    It has long been recognized that earthquakes change the stress in the upper crust around the fault rupture and can influence the short-term behaviour of neighbouring faults and volcanoes. Rapid estimates of these stress changes can provide the authorities managing the post-disaster situation with a useful tool to identify and monitor potential threats and to update the estimates of seismic and volcanic hazard in a region. Space geodesy is now routinely used following an earthquake to image the displacement of the ground and estimate the rupture geometry and the distribution of slip. Using the obtained source model, it is possible to evaluate the remaining moment deficit and to infer the stress changes on nearby faults and volcanoes produced by the earthquake, which can be used to identify which faults and volcanoes are brought closer to failure or activation. Although these procedures are commonly used today, the transfer of these results to the authorities managing the post-disaster situation is not straightforward, and thus their usefulness is reduced in practice. Here we propose a methodology to evaluate the potential influence of an earthquake on nearby faults and volcanoes and to create easy-to-understand maps for decision-making support after an earthquake. We apply this methodology to the Mw 7.8, 2016 Ecuador earthquake. Using Sentinel-1 SAR and continuous GPS data, we measure the coseismic ground deformation and estimate the distribution of slip. We then use this model to evaluate the moment deficit on the subduction interface and the changes of stress on the surrounding faults and volcanoes. The results are compared with the seismic and volcanic events that have occurred after the earthquake. We discuss the potential and limits of the methodology and the lessons learnt from discussions with local authorities.

  14. Gas and Dust Phenomena of Mega-earthquakes and the Cause

    Science.gov (United States)

    Yue, Z.

    2013-12-01

    A mega-earthquake suddenly releases a large to extremely large amount of kinetic energy within a few tens to two hundred seconds, over distances of tens to hundreds of kilometers in the Earth's crust and on the ground surface. It also generates seismic waves that can be recorded globally, as well as co-seismic ground damage such as co-seismic ruptures and landslides. However, such vast, dramatic, and devastating kinetic actions in the Earth's crustal rocks and ground soils cannot be known or predicted a few weeks, days, hours, or minutes before they happen. Although seismologists can develop and use seismometers to report the locations and magnitudes of earthquakes within minutes of their occurrence, they cannot at present predict earthquakes. Therefore, damaging earthquakes have caused, and will continue to cause, huge disasters, fatalities, and injuries. This problem may indicate that it is necessary to re-examine the cause of mega-earthquakes beyond the conventional explanation of elastic rebound on active faults. In the last ten years, many mega-earthquakes occurred in China and around the Pacific Ocean, causing many casualties and devastating damage to environments. The author will give a brief review of the impacts of the mega-earthquakes of recent years. He will then present many gas- and dust-related phenomena associated with the sudden occurrence of these mega-earthquakes. They include the 2001 Kunlunshan Earthquake M8.1, the 2008 Wenchuan Earthquake M8.0 and the 2010 Yushu Earthquake M7.1 in China, and the 2010 Haiti Earthquake M7.0, the 2010 Mexicali Earthquake M7.2, the 2010 Chile Earthquake M8.8, the 2011 Christchurch Earthquake M6.3 and the 2011 Japan Earthquake M9.0 around the Pacific Ocean. He will discuss the cause of these gas- and dust-related phenomena. He will use these phenomena and their common cause to show that the earthquakes were caused by the rapid migration and expansion of highly compressed and

  15. Inter-Disciplinary Validation of Pre Earthquake Signals. Case Study for Major Earthquakes in Asia (2004-2010) and for 2011 Tohoku Earthquake

    Science.gov (United States)

    Ouzounov, D.; Pulinets, S.; Hattori, K.; Liu, J.-Y.; Yang. T. Y.; Parrot, M.; Kafatos, M.; Taylor, P.

    2012-01-01

    We carried out multi-sensor observations in our investigation of phenomena preceding major earthquakes. Our approach is based on a systematic analysis of several physical and environmental parameters that we found to be associated with earthquake processes: thermal infrared radiation, temperature and electron concentration in the ionosphere, radon/ion activity, and air temperature/humidity in the atmosphere. We used satellite and ground observations and interpreted them with the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model, one of the possible paradigms we study and support. We made two independent continuous hindcast investigations in Taiwan and Japan for a total of 102 earthquakes (M>6) occurring from 2004-2011. We analyzed: (1) ionospheric electromagnetic radiation, plasma and energetic electron measurements from DEMETER; (2) emitted long-wavelength radiation (OLR) from NOAA/AVHRR and NASA/EOS; (3) radon/ion variations (in situ data); and (4) GPS Total Electron Content (TEC) measurements collected from space- and ground-based observations. This joint analysis of ground and satellite data has shown that one to six (or more) days prior to the largest earthquakes there were anomalies in all of the analyzed physical observations. For the March 11, 2011 Tohoku earthquake, our analysis again shows the same relationship between several independent observations characterizing lithosphere/atmosphere coupling. On March 7th we found a rapid increase in emitted infrared radiation observed from satellite data, and subsequently an anomaly developed near the epicenter. The GPS/TEC data indicated an increase and variation in electron density reaching a maximum value on March 8. Beginning from this day, we confirmed an abnormal TEC variation over the epicenter in the lower ionosphere. These findings revealed the existence of atmospheric and ionospheric phenomena occurring prior to the 2011 Tohoku earthquake, which indicated new evidence of a distinct

  16. A rare moderate‐sized (Mw 4.9) earthquake in Kansas: Rupture process of the Milan, Kansas, earthquake of 12 November 2014 and its relationship to fluid injection

    Science.gov (United States)

    Choy, George; Rubinstein, Justin L.; Yeck, William; McNamara, Daniel E.; Mueller, Charles; Boyd, Oliver

    2016-01-01

    The largest recorded earthquake in Kansas occurred northeast of Milan on 12 November 2014 (Mw 4.9) in a region previously devoid of significant seismic activity. Applying multistation processing to data from local stations, we are able to detail the rupture process and rupture geometry of the mainshock, identify the causative fault plane, and delineate the expansion and extent of the subsequent seismic activity. The earthquake followed rapid increases of fluid injection by multiple wastewater injection wells in the vicinity of the fault. The source parameters and behavior of the Milan earthquake and foreshock–aftershock sequence are similar to characteristics of other earthquakes induced by wastewater injection into permeable formations overlying crystalline basement. This earthquake also provides an opportunity to test the empirical relation that uses felt area to estimate moment magnitude for historical earthquakes for Kansas.
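    The felt-area relation mentioned at the end has the generic form M ≈ a + b·log10(A_felt). A sketch with placeholder coefficients (the actual relation tested in the paper has to be fit regionally to instrumental data; a and b here are hypothetical):

    ```python
    import math

    def magnitude_from_felt_area(area_km2, a=-1.0, b=1.0):
        """Generic felt-area scaling M = a + b*log10(A_felt).

        area_km2  total area (km^2) over which the earthquake was felt
        a, b      hypothetical placeholder coefficients; regional
                  calibration is required before applying such a relation
                  to historical earthquakes.
        """
        return a + b * math.log10(area_km2)
    ```

    A modern, instrumentally recorded event like Milan lets one check the calibrated relation directly: compute M from the felt area and compare it with the instrumental Mw.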

  17. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  18. Determination of Design Basis Earthquake ground motion

    International Nuclear Information System (INIS)

    Kato, Muneaki

    1997-01-01

    This paper describes the principles for determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, the earthquake response spectrum, and simulated seismic waves. Furthermore, in the appendix of this paper, the seismic safety review for NPPs designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  19. Determination of Design Basis Earthquake ground motion

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Muneaki [Japan Atomic Power Co., Tokyo (Japan)

    1997-03-01

    This paper describes the principles for determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, the earthquake response spectrum, and simulated seismic waves. Furthermore, in the appendix of this paper, the seismic safety review for NPPs designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  20. Real-time Position Based Population Data Analysis and Visualization Using Heatmap for Hazard Emergency Response

    Science.gov (United States)

    Ding, R.; He, T.

    2017-12-01

    With the increased popularity of mobile applications and services, there has been a growing demand for more advanced mobile technologies that utilize real-time Location Based Services (LBS) data to support natural hazard response efforts. Compared to traditional sources like the census bureau, which often can only provide historical and static data, an LBS service can provide more current data to drive a real-time natural hazard response system that more accurately processes and assesses issues such as population density in areas impacted by a hazard. However, manually preparing or preprocessing the data to suit the needs of a particular application would be time-consuming. This research aims to implement a population heatmap visual analytics system based on real-time data for natural disaster emergency management. The system comprises a three-layered architecture: data collection, data processing, and visual analysis layers. Real-time, location-based data meeting certain aggregation conditions are collected from multiple sources across the Internet, then processed and stored in a cloud-based data store. Parallel computing is utilized to provide fast and accurate access to the pre-processed population data based on criteria such as the disaster event, and to generate a location-based population heatmap as well as other types of visual digital outputs using auxiliary analysis tools. At present, a prototype system has been developed that geographically covers the entire region of China and combines the population heatmap with data from the Earthquake Catalogs database. Preliminary results indicate that the generation of dynamic population density heatmaps based on the prototype system has effectively supported rapid earthquake emergency rescue and evacuation efforts, as well as helping responders and decision makers to evaluate and assess earthquake damage. Correlation analyses revealed that the aggregation and movement of people
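    The heatmap layer described above ultimately reduces location points to a density grid. A minimal sketch using NumPy (the function name, grid size, and bounding-box handling are simplifying assumptions, not the prototype's actual pipeline):

    ```python
    import numpy as np

    def density_grid(lats, lons, bins=100, bbox=None):
        """Bin point locations into a 2-D count grid for heatmap rendering.

        lats, lons  sequences of point coordinates
        bins        number of cells per axis
        bbox        ((lat_min, lat_max), (lon_min, lon_max));
                    inferred from the data when omitted.
        """
        if bbox is None:
            bbox = ((min(lats), max(lats)), (min(lons), max(lons)))
        grid, lat_edges, lon_edges = np.histogram2d(
            lats, lons, bins=bins, range=bbox)
        return grid, lat_edges, lon_edges
    ```

    The resulting count grid can be color-mapped directly, or normalized by cell area to give a population-density surface for the emergency-response view.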

  1. Estimating economic losses from earthquakes using an empirical approach

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2013-01-01

    We extended the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) empirical fatality estimation methodology proposed by Jaiswal et al. (2009) to rapidly estimate economic losses after significant earthquakes worldwide. The requisite model inputs are shaking intensity estimates made by the ShakeMap system, the spatial distribution of population available from the LandScan database, modern and historic country or sub-country population and Gross Domestic Product (GDP) data, and economic loss data from Munich Re's historical earthquakes catalog. We developed a strategy to approximately scale GDP-based economic exposure for historical and recent earthquakes in order to estimate economic losses. The process consists of using a country-specific multiplicative factor to accommodate the disparity between economic exposure and the annual per capita GDP, and it has proven successful in hindcasting past losses. Although loss, population, shaking estimates, and economic data used in the calibration process are uncertain, approximate ranges of losses can be estimated for the primary purpose of gauging the overall scope of the disaster and coordinating response. The proposed methodology is both indirect and approximate and is thus best suited as a rapid loss estimation model for applications like the PAGER system.
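    The core of this empirical approach (scale GDP-based exposure by a country-specific factor, then sum exposure times an intensity-dependent loss ratio) can be illustrated with a toy calculation; every number below, including the loss ratios, is invented for illustration and is not one of PAGER's calibrated values:

```python
def economic_exposure(population, gdp_per_capita, alpha):
    """GDP-based exposure scaled by a country-specific factor alpha, which
    accommodates the gap between built exposure and annual per capita GDP."""
    return population * gdp_per_capita * alpha

def expected_loss(exposure_by_intensity, loss_ratio_by_intensity):
    """Sum exposure times the intensity-dependent loss ratio over MMI bins."""
    return sum(exposure_by_intensity[i] * loss_ratio_by_intensity[i]
               for i in exposure_by_intensity)

# Hypothetical ShakeMap intensity bins (MMI) with exposed population:
exposure = {7: economic_exposure(100_000, 5_000, 3.0),
            8: economic_exposure(20_000, 5_000, 3.0)}
ratios = {7: 0.01, 8: 0.05}   # illustrative loss ratios only
print(f"estimated loss: ${expected_loss(exposure, ratios):,.0f}")
```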

  2. Physics of Earthquake Rupture Propagation

    Science.gov (United States)

    Xu, Shiqing; Fukuyama, Eiichi; Sagy, Amir; Doan, Mai-Linh

    2018-05-01

    A comprehensive understanding of earthquake rupture propagation requires the study of not only the sudden release of elastic strain energy during co-seismic slip, but also of other processes that operate at a variety of spatiotemporal scales. For example, the accumulation of the elastic strain energy usually takes decades to hundreds of years, and rupture propagation and termination modify the bulk properties of the surrounding medium that can influence the behavior of future earthquakes. To share recent findings in the multiscale investigation of earthquake rupture propagation, we held a session entitled "Physics of Earthquake Rupture Propagation" during the 2016 American Geophysical Union (AGU) Fall Meeting in San Francisco. The session included 46 poster and 32 oral presentations, reporting observations of natural earthquakes, numerical and experimental simulations of earthquake ruptures, and studies of earthquake fault friction. These presentations and discussions during and after the session suggested a need to document more formally the research findings, particularly new observations and views different from conventional ones, complexities in fault zone properties and loading conditions, the diversity of fault slip modes and their interactions, the evaluation of observational and model uncertainties, and comparison between empirical and physics-based models. Therefore, we organize this Special Issue (SI) of Tectonophysics under the same title as our AGU session, hoping to inspire future investigations. Eighteen articles (marked with "this issue") are included in this SI and grouped into the following six categories.

  3. Radon observation for earthquake prediction

    Energy Technology Data Exchange (ETDEWEB)

    Wakita, Hiroshi [Tokyo Univ. (Japan)

    1998-12-31

    Systematic observation of groundwater radon for the purpose of earthquake prediction began in Japan in late 1973. Continuous observations are conducted at fixed stations using deep wells and springs. During the observation period, significant precursory changes, including those before the 1978 Izu-Oshima-kinkai (M7.0) earthquake, as well as numerous coseismic changes were observed. At the time of the 1995 Kobe (M7.2) earthquake, significant changes in chemical components, including radon dissolved in groundwater, were observed near the epicentral region. Precursory changes are presumably caused by permeability changes due to micro-fracturing in basement rock or by migration of water from different sources during the preparation stage of earthquakes. Coseismic changes may be caused by seismic shaking and by changes in regional stress. Significant drops in radon concentration in groundwater have been observed after earthquakes at the KSM site. The occurrence of such drops appears to be time-dependent, and possibly reflects changes in the regional stress state of the observation area. The absence of radon drops seems to be correlated with periods of reduced regional seismic activity. Experience accumulated over the past two decades allows us to reach some conclusions: 1) changes in groundwater radon do occur prior to large earthquakes; 2) some sites are particularly sensitive to earthquake occurrence; and 3) the sensitivity changes over time. (author)

  4. Earthquake prediction by Kina Method

    International Nuclear Information System (INIS)

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has been one of mankind's oldest desires, and scientists have long worked to predict earthquakes. The results of these efforts can generally be divided into two methods of prediction: 1) statistical methods, and 2) empirical methods. In the first, earthquakes are predicted using statistics and probabilities, while the second utilizes a variety of precursors for earthquake prediction; the latter is time-consuming and more costly. However, neither method has yet produced fully satisfactory results. In this paper a new method entitled the 'Kiana Method' is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method, electrical and magnetic precursors are measured in an area. Then, the time and magnitude of a future earthquake are calculated using electrical formulas, in particular those for electrical capacitors. In this method, daily measurement of electrical resistance in an area makes clear whether or not the area is capable of earthquake occurrence in the future. If the result shows a positive sign, the occurrence time and magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  5. Precisely locating the Klamath Falls, Oregon, earthquakes

    Science.gov (United States)

    Qamar, A.; Meagher, K.L.

    1993-01-01

    The Klamath Falls earthquakes on September 20, 1993, were the largest earthquakes centered in Oregon in more than 50 yrs. Only the magnitude 5.75 Milton-Freewater earthquake in 1936, which was centered near the Oregon-Washington border and felt in an area of about 190,000 sq km, compares in size with the recent Klamath Falls earthquakes. Although the 1993 earthquakes surprised many local residents, geologists have long recognized that strong earthquakes may occur along potentially active faults that pass through the Klamath Falls area. These faults are geologically related to similar faults in Oregon, Idaho, and Nevada that occasionally spawn strong earthquakes

  6. RAPID-N: Assessing and mapping the risk of natural-hazard impact at industrial installations

    Science.gov (United States)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazard-triggered technological accidents (so-called Natech accidents) at hazardous installations can have major consequences due to the potential for release of hazardous materials, fires and explosions. Effective Natech risk reduction requires the identification of areas where this risk is high. However, recent studies have shown that there are hardly any methodologies and tools that would allow authorities to identify these areas. To work towards closing this gap, the European Commission's Joint Research Centre has developed the rapid Natech risk assessment and mapping framework RAPID-N. The tool, which is implemented in an online web-based environment, is unique in that it contains all functionalities required for running a full Natech risk analysis simulation (natural hazard severity estimation, equipment damage probability and severity calculation, modeling of the consequences of loss-of-containment scenarios) and for visualizing its results. The outputs of RAPID-N are risk summary reports and interactive risk maps which can be used for decision making. Currently, the tool focuses on Natech risk due to earthquakes at industrial installations. However, it will be extended to also analyse and map Natech risk due to floods in the near future. RAPID-N is available at http://rapidn.jrc.ec.europa.eu. This presentation will discuss the results of case-study calculations performed for selected flammable and toxic substances to test the capabilities of RAPID-N both for single- and multi-site earthquake Natech risk assessment. For this purpose, an Istanbul earthquake scenario provided by the Turkish government was used. The results of the exercise show that RAPID-N is a valuable decision-support tool that assesses Natech risk and maps the consequence end-point distances.
These end-point distances are currently defined by 7 kPa overpressure for vapour cloud explosions and 2nd-degree burns for pool fires (equivalent to a heat radiation of 5 kW/m2 sustained for 40 s).
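    A rough sense of how a heat-radiation end-point criterion like 5 kW/m2 translates into a distance can be had from an idealized point-source model, q = P / (4*pi*d^2); the 50 MW radiative output below is a made-up figure, and RAPID-N's actual consequence models are more sophisticated than this sketch:

```python
import math

def radiation_endpoint_distance(power_kw, threshold_kw_m2=5.0):
    """Distance at which radiant flux from an idealized point source
    falls to the end-point threshold (5 kW/m2, the 2nd-degree-burn
    criterion mentioned above)."""
    # q = P / (4*pi*d^2)  =>  d = sqrt(P / (4*pi*q))
    return math.sqrt(power_kw / (4 * math.pi * threshold_kw_m2))

# Hypothetical pool fire with 50 MW radiative output:
print(f"{radiation_endpoint_distance(50_000):.1f} m")
```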

  7. Ionospheric phenomena before strong earthquakes

    Directory of Open Access Journals (Sweden)

    A. S. Silina

    2001-01-01

    A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5 located less than 500 km from an ionospheric vertical sounding station is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h < 33 km) earthquakes were analysed separately. Data of nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs and Es-spread at the middle-latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00 LT–06:00 LT) and of fbEs averaged over the nighttime hours (18:00 LT–06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency, which characterizes the degree of small-scale turbulence, it was shown that this value increases 1–4 days before "crust" earthquakes, and does not change before "deep" earthquakes. Studying Es-spread, which manifests itself as a diffuse Es track on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1–3 days before the earthquakes; for "deep" earthquakes the effect is more intensive. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere operate for "deep" and "crust" events.

  8. Fault Structural Control on Earthquake Strong Ground Motions: The 2008 Wenchuan Earthquake as an Example

    Science.gov (United States)

    Zhang, Yan; Zhang, Dongli; Li, Xiaojun; Huang, Bei; Zheng, Wenjun; Wang, Yuejun

    2018-02-01

    Continental thrust faulting earthquakes pose severe threats to megacities across the world. Recent events show the possible control of fault structures on strong ground motions. The seismogenic structure of the 2008 Wenchuan earthquake is associated with high-angle listric reverse fault zones. Its peak ground accelerations (PGAs) show a prominent feature of fault zone amplification: the values within the 30- to 40-km-wide fault zone block are significantly larger than those on both the hanging wall and the footwall. The PGA values attenuate asymmetrically: they decay much more rapidly in the footwall than in the hanging wall. The hanging wall effects can be seen on both the vertical and horizontal components of the PGAs, with the former significantly more prominent than the latter. All these characteristics can be adequately interpreted by upward extrusion of the high-angle listric reverse fault zone block. Through comparison with a low-angle planar thrust fault associated with the 1999 Chi-Chi earthquake, we conclude that different fault structures might have controlled different patterns of strong ground motion, which should be taken into account in seismic design and construction.

  9. Does Modern Ideology of Earthquake Engineering Ensure the Declared Levels of Damage of Structures at Earthquakes?

    International Nuclear Information System (INIS)

    Gabrichidze, G.

    2011-01-01

    The basic position of the modern ideology of earthquake engineering rests on the idea that a structure should be designed so that it suffers almost no damage in an earthquake whose occurrence is most probable in the given area during the lifetime of the structure. This statement is essentially based on so-called Performance Based Design, the ideology of the 21st century. In this article, attention is focused on the fact that the modern ideology of earthquake engineering assigns structures to a dangerous zone in which their behavior is defined by processes of damage and destruction of materials, which are nonequilibrium processes and demand the application of special, refined methods of research. In such conditions, the use of ratios that correspond to static conditions of loading to describe the process of damage of materials appears to be unfounded. The article raises the question of the necessity of working out a new mathematical model of the behavior of materials and structures under rapid, intense impact. (authors)

  10. The Pocatello Valley, Idaho, earthquake

    Science.gov (United States)

    Rogers, A. M.; Langer, C.J.; Bucknam, R.C.

    1975-01-01

    A Richter magnitude 6.3 earthquake occurred at 8:31 p.m. mountain daylight time on March 27, 1975, near the Utah-Idaho border in Pocatello Valley. The main shock's epicenter was located at 42.094° N, 112.478° W, with a focal depth of 5.5 km. This earthquake was the largest in the continental United States since the destructive San Fernando earthquake of February 1971. The main shock was preceded by a magnitude 4.5 foreshock on March 26.

  11. The threat of silent earthquakes

    Science.gov (United States)

    Cervelli, Peter

    2004-01-01

    Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise be ready to snap.

  12. USGS Earthquake Program GPS Use Case : Earthquake Early Warning

    Science.gov (United States)

    2015-03-12

    USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...

  13. EARTHQUAKE-INDUCED DEFORMATION STRUCTURES AND RELATED TO EARTHQUAKE MAGNITUDES

    Directory of Open Access Journals (Sweden)

    Savaş TOPAL

    2003-02-01

    Earthquake-induced deformation structures, which are called seismites, may be helpful in classifying the paleoseismic history of a location and in estimating the magnitudes of potential future earthquakes. In this paper, seismites were investigated according to the types formed in deep- and shallow-lake sediments. Seismites are observed in the form of sand dikes, intruded and fractured gravels and pillow structures in shallow-lake sediments, and of pseudonodules, mushroom-like silts protruding into laminites, mixed layers, disturbed varved lamination and loop bedding in deep-lake sediments. Drawing on previous studies, earthquake-induced deformation structures were ordered according to their formation and the associated earthquake magnitudes. In this ordering, the structure recording the lowest earthquake magnitude is loop bedding and the highest is intruded and fractured gravels in lacustrine deposits.

  14. USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization

    Science.gov (United States)

    Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.

    2012-12-01

    The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a
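    The detector described above scans for a sudden jump in "earthquake" tweet rate over background chatter. A minimal trigger in the same spirit, comparing a short-term average against the preceding long-term average (an STA/LTA-style scheme; the window lengths and thresholds below are invented, not the USGS system's parameters), might look like this:

```python
def tweet_detector(counts, short_win=3, long_win=12, ratio=5.0, min_count=10):
    """Return the index of the first minute where the short-term mean tweet
    rate exceeds `ratio` times the preceding long-term mean, or None."""
    for t in range(long_win + short_win, len(counts) + 1):
        lta = sum(counts[t - long_win - short_win:t - short_win]) / long_win
        sta = sum(counts[t - short_win:t]) / short_win
        if sta >= ratio * max(lta, 1e-9) and sta >= min_count:
            return t - 1          # last minute of the triggering window
    return None

# Quiet background chatter, then a burst of "earthquake" tweets:
per_minute = [1, 2, 1, 0, 2, 1, 1, 2, 1, 1, 2, 1, 40, 90, 120, 60, 30, 10]
print(tweet_detector(per_minute))
```

    The `min_count` floor keeps tiny absolute fluctuations in sparsely tweeting regions from triggering, mirroring the real detector's concern with false detections.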

  15. USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation

    Science.gov (United States)

    Hudnut, K. W.; Murray, J. R.; Minson, S. E.

    2015-12-01

    Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning, along with imagery differencing, are important methods for augmenting seismic sensors. During response to recent earthquakes (1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor - Cucapah, 2012 Brawley Swarm and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved to be essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data. Seismic, geologic, and imagery data alone, without GNSS, would miss important details of the earthquake source. That is, GNSS results provide important additional insight into the earthquake source properties, which in turn help understand the relationship between shaking and damage patterns. GNSS also adds to understanding of the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing work flow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site, operated for many hours, then data retrieved, processed and modeled by a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing, has led to sub-second overall system latency. Within the past few years, the final challenges of

  16. Modeling of earthquake ground motion in the frequency domain

    Science.gov (United States)

    Thrainsson, Hjortur

    In recent years, the utilization of time histories of earthquake ground motion has grown considerably in the design and analysis of civil structures. It is very unlikely, however, that recordings of earthquake ground motion will be available for all sites and conditions of interest. Hence, there is a need for efficient methods for the simulation and spatial interpolation of earthquake ground motion. In addition to providing estimates of the ground motion at a site using data from adjacent recording stations, spatially interpolated ground motions can also be used in design and analysis of long-span structures, such as bridges and pipelines, where differential movement is important. The objective of this research is to develop a methodology for rapid generation of horizontal earthquake ground motion at any site for a given region, based on readily available source, path and site characteristics, or (sparse) recordings. The research includes two main topics: (i) the simulation of earthquake ground motion at a given site, and (ii) the spatial interpolation of earthquake ground motion. In topic (i), models are developed to simulate acceleration time histories using the inverse discrete Fourier transform. The Fourier phase differences, defined as the difference in phase angle between adjacent frequency components, are simulated conditional on the Fourier amplitude. Uniformly processed recordings from recent California earthquakes are used to validate the simulation models, as well as to develop prediction formulas for the model parameters. The models developed in this research provide rapid simulation of earthquake ground motion over a wide range of magnitudes and distances, but they are not intended to replace more robust geophysical models. In topic (ii), a model is developed in which Fourier amplitudes and Fourier phase angles are interpolated separately. A simple dispersion relationship is included in the phase angle interpolation. 
The accuracy of the interpolation
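    The simulation idea in topic (i), generating an acceleration time history by inverse discrete Fourier transform from an amplitude spectrum and simulated phases, can be caricatured as follows. Note the simplification: the thesis conditions Fourier phase differences on the Fourier amplitude, whereas this sketch simply draws uniform random phases and uses a generic spectral shape with an assumed 2 Hz corner frequency:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_history(n=1024, dt=0.01, corner_hz=2.0):
    """Toy inverse-DFT ground-motion simulation: smooth amplitude
    spectrum combined with random phase angles."""
    freqs = np.fft.rfftfreq(n, dt)
    amp = freqs / (1.0 + (freqs / corner_hz) ** 2)   # generic band-pass shape
    phase = rng.uniform(0.0, 2.0 * np.pi, freqs.size)
    spectrum = amp * np.exp(1j * phase)
    return np.fft.irfft(spectrum, n)                  # real acceleration trace

accel = simulate_history()
print(accel.shape, accel.dtype)
```

    Replacing the uniform phases with phase differences simulated conditional on amplitude is what gives the method its realistic time-domain envelope.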

  17. Visual Information Communications International Conference

    CERN Document Server

    Nguyen, Quang Vinh; Zhang, Kang; VINCI'09

    2010-01-01

    Visual Information Communication is based on VINCI'09, The Visual Information Communications International Conference, September 2009 in Sydney, Australia. Topics covered include The Arts of Visual Layout, Presentation & Exploration, The Design of Visual Attributes, Symbols & Languages, Methods for Visual Analytics and Knowledge Discovery, Systems, Interfaces and Applications of Visualization, Methods for Multimedia Data Recognition & Processing. This cutting-edge book addresses the issues of knowledge discovery, end-user programming, modeling, rapid systems prototyping, education, and design activities. Visual Information Communications is an edited volume whose contributors include well-established researchers worldwide, from diverse disciplines including architects, artists, engineers, and scientists. Visual Information Communication is designed for a professional audience composed of practitioners and researchers working in the field of digital design and visual communications. This volume i...

  18. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. A numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
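    As a small illustration of fitting a Pareto tail to sample exceedances, the Hill estimator recovers the tail index from the largest order statistics of a synthetic catalogue; the true index of 1.5 and the sample sizes are arbitrary choices, unrelated to the Chinese catalogue used in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def hill_estimator(sample, k):
    """Hill estimate of the Pareto tail index from the k largest exceedances."""
    x = np.sort(sample)[::-1]                 # descending order statistics
    logs = np.log(x[:k]) - np.log(x[k])
    return 1.0 / logs.mean()

# Synthetic "earthquake energies" with a true Pareto tail index of 1.5
# (numpy's pareto draws Lomax variates; adding 1 gives Pareto I with x_m = 1):
energies = rng.pareto(1.5, 20_000) + 1.0
print(round(hill_estimator(energies, 2_000), 2))
```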

  19. Earthquake vulnerability assessment of buildings of ward no. 8 of Haldwani-Kathgodam Municipal Corporation, Uttarakhand, India

    Science.gov (United States)

    Bora, Kritika; Pande, Ravindra K.

    2017-07-01

    "Earthquake does not kill people; it is the building which kills people". Earthquake is a sudden event below the surface of the earth which results in vertical and horizontal waves that causes destruction. The main aim of this research is to bring into light the unplanned and non-engineered construction practices growing in the Urban areas. Lack of space and continuous migration from hills has resulted in Multistorey construction. The present study is based on primary data collection through Rapid Visual Screening for the assessment of vulnerability of buildings. "Haldwani-Kathgodam being a new Municipal Corporation located in the foot hills of Himalayas is facing same problem. The seismic zonation brings this area into zone 4 of damage risk. Therefore an assessment to estimate the risk of the built up environment is important. This paper presents a systematic and useful way of assessing physical vulnerability of buildings. The present paper will show how the growing pressure on urban area tends to make the built up environment vulnerable towards seismic activities. The challenge today is to make our living environment safe for living. The day by day growing population pressure on urban area as a migration trend in developing countries is leading to high rise building, no planning and reckless construction. For the sake of saving some money people usually do not take the approval from structural engineer. This unplanned and haphazard construction proves non-resistant towards earthquake and brings lives and properties to death and a stand still. The total no. of household in the current study area is 543 whereas the total population is 2497 (2011). The recent formation of Himalayas makes the area more sensitive towards seismic event. The closeness to the Main Boundary thrust brings it to zone 4 in the Seismic Zonation of India i.e., High Damage Risk Zone

  20. Centrality in earthquake multiplex networks

    Science.gov (United States)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

    Seismic time series have been mapped as complex networks, where a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismic activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
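    The single-layer construction that this work generalizes (grid cells as nodes, consecutive catalogue events defining edges, centrality from the leading eigenvector) can be sketched in a few lines; the four-cell toy catalogue is invented:

```python
import numpy as np

def transition_network(cell_seq, n_cells):
    """Adjacency matrix of a single-layer earthquake network: cells are
    nodes and an edge links consecutive events in the catalogue."""
    adj = np.zeros((n_cells, n_cells))
    for a, b in zip(cell_seq, cell_seq[1:]):
        adj[a, b] += 1.0
    return adj + adj.T            # symmetrize for undirected centrality

def eigenvector_centrality(adj, iters=200):
    """Leading eigenvector by power iteration (shifted by the identity so
    the iteration also converges on bipartite graphs)."""
    v = np.ones(adj.shape[0])
    for _ in range(iters):
        v = adj @ v + v
        v /= np.linalg.norm(v)
    return v / v.sum()

# Toy catalogue over 4 cells: activity keeps returning to cell 2.
sequence = [0, 2, 1, 2, 3, 2, 0, 2, 1, 2]
c = eigenvector_centrality(transition_network(sequence, 4))
print(int(np.argmax(c)))  # the most central (most active) cell
```

    The multiplex version stacks one such layer per time window and couples replica nodes across layers before computing centrality.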

  1. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Science.gov (United States)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite-fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  2. Electrical streaming potential precursors to catastrophic earthquakes in China

    OpenAIRE

    Zhao, Y.; Zhao, B.; Qian, F.

    1997-01-01

    The majority of anomalies in self-potential at 7 stations within 160 km of the epicentre showed a similar pattern of rapid onset and slow decay during and before the M 7.8 Tangshan earthquake of 1976. Considering that some of these anomalies were associated with episodic spouting from boreholes or with increases in pore pressure in wells, the observed anomalies are streaming potentials generated by local events of sudden movement and by the diffusion of high-pressure fluid in parallel faults. These...

  3. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    The geographic information system (GIS) of the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of damage and the isoseismals of the earthquake are studied. By comparison with the standard earthquake intensity attenuation relationship, an abnormal damage distribution of the earthquake is found, and its relationship with tectonics, site conditions and basins is analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied. The implications of the abnormal damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction and earthquake emergency response are discussed.

  4. Mapping Tectonic Stress Using Earthquakes

    International Nuclear Information System (INIS)

    Arnold, Richard; Townend, John; Vignaux, Tony

    2005-01-01

    An earthquake occurs when the forces acting on a fault overcome its intrinsic strength and cause it to slip abruptly. Understanding more specifically why earthquakes occur at particular locations and times is complicated because in many cases we do not know what these forces actually are, or indeed what processes ultimately trigger slip. The goal of this study is to develop, test, and implement a Bayesian method of reliably determining tectonic stresses using the most abundant stress gauges available - earthquakes themselves. Existing algorithms produce reasonable estimates of the principal stress directions, but yield unreliable error bounds as a consequence of the generally weak constraint on stress imposed by any single earthquake, observational errors, and an unavoidable ambiguity between the fault normal and the slip vector. A statistical treatment of the problem can take into account observational errors, combine data from multiple earthquakes in a consistent manner, and provide realistic error bounds on the estimated principal stress directions. We have developed a realistic physical framework for modelling multiple earthquakes and show how the strong physical and geometrical constraints present in this problem allow inference to be made about the orientation of the principal axes of stress in the earth's crust

  5. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations at Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave-propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance, and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as of interest. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we estimate the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  6. Building with Earthquakes in Mind

    Science.gov (United States)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in areas not traditionally considered earthquake-prone are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students were challenged in this project to construct a basic structure, using limited resources, that could withstand a simulated tremor through the use of an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Because most buildings in New York City were not designed to withstand earthquake shaking, the students were able to gain an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  7. Large earthquakes and creeping faults

    Science.gov (United States)

    Harris, Ruth A.

    2017-01-01

    Faults are ubiquitous throughout the Earth's crust. The majority are silent for decades to centuries, until they suddenly rupture and produce earthquakes. With a focus on shallow continental active-tectonic regions, this paper reviews a subset of faults that have a different behavior. These unusual faults slowly creep for long periods of time and produce many small earthquakes. The presence of fault creep and the related microseismicity helps illuminate faults that might not otherwise be located in fine detail, but there is also the question of how creeping faults contribute to seismic hazard. It appears that well-recorded creeping fault earthquakes of up to magnitude 6.6 that have occurred in shallow continental regions produce similar fault-surface rupture areas and similar peak ground shaking as their locked fault counterparts of the same earthquake magnitude. The behavior of much larger earthquakes on shallow creeping continental faults is less well known, because there is a dearth of comprehensive observations. Computational simulations provide an opportunity to fill the gaps in our understanding, particularly of the dynamic processes that occur during large earthquake rupture and arrest.

  8. Earthquake damage to underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

    The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document the damage and non-damage due to earthquakes to underground facilities. Damage from earthquakes to tunnels, shafts, and wells, and damage (rock bursts) from mining operations were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface, because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrances to these facilities. Data indicate vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository

  9. Global earthquake fatalities and population

    Science.gov (United States)

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
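
    The population-proportional nonstationary Poisson model described above can be sketched numerically. The population snapshots and the trapezoidal calibration below are coarse illustrative assumptions, not the paper's data, so the resulting numbers only roughly approximate the published estimates:

```python
import math

# Approximate world-population snapshots in billions (illustrative values).
pop_20th = {1900: 1.65, 1950: 2.5, 2000: 6.1}
pop_21st = {2000: 6.1, 2050: 9.7, 2100: 10.1}

def person_years(snapshots):
    """Trapezoidal integral of population over time (billion person-years)."""
    years = sorted(snapshots)
    return sum(0.5 * (snapshots[a] + snapshots[b]) * (b - a)
               for a, b in zip(years, years[1:]))

# Rate proportional to population, calibrated on the 4 observed
# >100,000-fatality earthquakes of the 20th century:
c = 4 / person_years(pop_20th)
expected_21st = c * person_years(pop_21st)
sigma = math.sqrt(expected_21st)  # Poisson standard deviation
print(f"{expected_21st:.1f} +/- {sigma:.1f} catastrophic earthquakes")
```

    With these toy inputs the projected 21st-century count comes out around ten, the same order as the paper's 8.7±3.3; the exact value depends on the population curve used.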

  10. Tweeting Earthquakes using TensorFlow

    Science.gov (United States)

    Casarotti, E.; Comunello, F.; Magnoni, F.

    2016-12-01

    The use of social media is emerging as a powerful tool for disseminating trusted information about earthquakes. Since 2009, the Twitter account @INGVterremoti has provided constant and timely details about M2+ seismic events detected by the Italian National Seismic Network, directly connected with the seismologists on duty at Istituto Nazionale di Geofisica e Vulcanologia (INGV). Currently, it reaches more than 150,000 followers. Nevertheless, since it provides only the manual revision of seismic parameters, the timing (approximately between 10 and 20 minutes after an event) has come under evaluation. Undeniably, mobile internet, social network sites and Twitter in particular require a more rapid, "real-time" reaction. During the last 36 months, INGV tested the tweeting of automatic detections of M3+ earthquakes, studying the reliability of the information both in terms of seismological accuracy and from the point of view of communication and social research. A set of quality parameters (i.e., number of seismic stations, gap, relative error of the location) has been identified to reduce false alarms and the uncertainty of the automatic detection. We present an experiment to further improve the reliability of this process using TensorFlow™ (an open source software library originally developed by researchers and engineers working on the Google Brain Team within Google's Machine Intelligence research organization).

  11. Evidence for Ancient Mesoamerican Earthquakes

    Science.gov (United States)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb; fractures in walls, floors, basal platforms and tableros; toppling of columns; and deformation, settling and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34 % g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Mayan Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period centered in the vicinity of the Chixoy-Polochic and Motagua fault zones could have produced the contemporaneous earthquake damage to the above sites. As a consequence this earthquake may have accelerated the

  12. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  13. Earthquake disaster simulation of civil infrastructures from tall buildings to urban areas

    CERN Document Server

    Lu, Xinzheng

    2017-01-01

    Based on more than 12 years of systematic investigation of earthquake disaster simulation of civil infrastructures, this book covers the major research outcomes, including a number of novel computational models, high-performance computing methods and realistic visualization techniques for tall buildings and urban areas, with particular emphasis on collapse prevention and mitigation in extreme earthquakes, earthquake loss evaluation and seismic resilience. Typical engineering applications to several of the tallest buildings in the world (e.g., the 632 m tall Shanghai Tower and the 528 m tall Z15 Tower) and selected large cities in China (the Beijing Central Business District, Xi'an City, Taiyuan City and Tangshan City) are also introduced to demonstrate the advantages of the proposed computational models and techniques. The high-fidelity computational model developed in this book has proven to be the only feasible option to date for earthquake-induced collapse simulation of supertall buildings that are higher than 50...

  14. The smart cluster method. Adaptive earthquake cluster identification and analysis in strong seismic regions

    Science.gov (United States)

    Schaefer, Andreas M.; Daniell, James E.; Wenzel, Friedemann

    2017-07-01

    Earthquake clustering is an essential part of almost any statistical analysis of spatial and temporal properties of seismic activity. The nature of earthquake clusters and the subsequent declustering of earthquake catalogues play a crucial role in determining the magnitude-dependent earthquake return period and its respective spatial variation for probabilistic seismic hazard assessment. This study introduces the Smart Cluster Method (SCM), a new methodology to identify earthquake clusters, which uses an adaptive point process for spatio-temporal cluster identification. It utilises the magnitude-dependent spatio-temporal earthquake density to adjust the search properties, subsequently analyses the identified clusters to determine directional variation, and adjusts its search space with respect to directional properties. In the case of rapid subsequent ruptures like the 1992 Landers sequence or the 2010-2011 Darfield-Christchurch sequence, a reclassification procedure is applied to disassemble subsequent ruptures using near-field searches, nearest-neighbour classification and temporal splitting. The method is capable of identifying and classifying earthquake clusters in space and time. It has been tested and validated using earthquake data from California and New Zealand. A total of more than 1500 clusters have been found in both regions since 1980 with Mmin = 2.0. Utilising the knowledge of cluster classification, the method has been adjusted to provide an earthquake declustering algorithm, which has been compared to existing methods. Its performance is comparable to established methodologies. The analysis of earthquake clustering statistics led to various new and updated correlation functions, e.g. for the ratio between mainshock and strongest aftershock and for general aftershock activity metrics.
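
    As a rough illustration of the cluster-identification idea, a much simpler scheme than the adaptive SCM links aftershocks to a mainshock with fixed magnitude-dependent space-time windows. The window formulas below are the commonly quoted Gardner-Knopoff values, and the flat-coordinate catalog is invented for illustration:

```python
import math

def decluster(events):
    """Label events as 'mainshock' or 'aftershock' using fixed
    magnitude-dependent space-time windows (Gardner-Knopoff-style values).
    events: time-sorted list of (t_days, x_km, y_km, mag)."""
    def win_km(m):
        return 10 ** (0.1238 * m + 0.983)

    def win_days(m):
        return 10 ** (0.5409 * m - 0.547) if m < 6.5 else 10 ** (0.032 * m + 2.7389)

    labels = ["mainshock"] * len(events)
    for i, (ti, xi, yi, mi) in enumerate(events):
        for j in range(i + 1, len(events)):
            tj, xj, yj, mj = events[j]
            if tj - ti > win_days(mi):
                break  # events are time-sorted: later ones cannot match either
            if mj < mi and math.hypot(xj - xi, yj - yi) <= win_km(mi):
                labels[j] = "aftershock"
    return labels

# Invented catalog: one large shock, two nearby smaller events, one far event.
catalog = [(0.0, 0.0, 0.0, 7.3),
           (0.5, 12.0, 5.0, 4.8),
           (2.0, 30.0, -10.0, 5.1),
           (400.0, 250.0, 300.0, 6.0)]
print(decluster(catalog))  # ['mainshock', 'aftershock', 'aftershock', 'mainshock']
```

    The SCM replaces these fixed windows with an adaptive, density- and direction-dependent search, but the basic linking logic is the same.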

  15. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely phenomena of SOC because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is apparently changed after the time series is rearranged. This suggests that the SOC theory should not be used to oppose the efforts of earthquake prediction
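
    The rearrangement test described in the Letter can be illustrated on synthetic data: compute a return-time statistic for events above a magnitude threshold, shuffle the catalog order, and compare. The clustered toy catalog below is an invention for illustration, not the Southern California catalog:

```python
import random

def return_times(mags, m0):
    """Event counts between successive events with magnitude >= m0 --
    a simple stand-in for the first-return-time statistic P_M(T)."""
    idx = [i for i, m in enumerate(mags) if m >= m0]
    return [b - a for a, b in zip(idx, idx[1:])]

random.seed(1)
# Invented clustered catalog: occasional bursts of large events.
mags = []
for _ in range(200):
    if random.random() < 0.1:
        mags += [5.0 + random.random() for _ in range(5)]  # burst
    else:
        mags += [3.0 + random.random() for _ in range(5)]

orig = return_times(mags, 5.0)
shuffled = mags[:]
random.shuffle(shuffled)
shuf = return_times(shuffled, 5.0)

# In a clustered catalog the return-time statistics change after
# rearrangement: short returns inside bursts become rare.
frac_short_orig = sum(t == 1 for t in orig) / len(orig)
frac_short_shuf = sum(t == 1 for t in shuf) / len(shuf)
print(frac_short_orig, frac_short_shuf)
```

    A statistic that changes under rearrangement, as it does here, is exactly the kind of evidence the Letter uses against the SOC hypothesis.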

  16. Earthquake related displacement fields near underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Zandt, G.; Bouchon, M.

    1979-04-01

    Relative displacements of rock masses are evaluated in terms of geological evidence, seismological evidence, data from simulation experiments, and analytical predictive models. Numerical models have been developed to determine displacement fields as a function of depth, distance, and azimuth from an earthquake source. Computer calculations for several types of faults indicate that displacements decrease rapidly with distance from the fault, but that displacements can either increase or decrease as a function of depth depending on the type and geometry of the fault. For long, shallow, vertical strike-slip faults the displacement decreases markedly with depth. For square strike-slip faults and for dip-slip faults, displacement does not decrease as markedly with depth. Geologic structure, material properties, and depth affect the seismic source spectrum. Amplification of the high frequencies of shear waves is larger by a factor of about 2 for layered geologic models than for an elastic half-space

  17. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    Directory of Open Access Journals (Sweden)

    M. Rebuffat

    1995-06-01

    Full Text Available A series of multimedia monographs concerning the main seismic events that have affected the Italian territory are in the process of being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and promote public education about earthquakes. Producing the monographs, developed in ARC/INFO and working in UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: 1) its historical-cultural context (the first destructive seismic event after the unification of Italy); 2) its features (a volcanic earthquake); 3) the socioeconomic consequences caused at such an important seaside resort.

  18. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    Full Text Available A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
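
    Fitting sample exceedances to a Pareto tail, as done in this record, requires estimating the tail's shape parameter; one standard choice is the Hill estimator. A sketch on synthetic Pareto data (not the Chinese catalogue used in the paper):

```python
import math
import random

def hill_estimator(sample, k):
    """Hill estimator of the Pareto tail index alpha from the k largest values:
    alpha_hat = k / sum_{i<=k} (ln X_(i) - ln X_(k+1))."""
    xs = sorted(sample, reverse=True)
    logs = [math.log(x) for x in xs[:k + 1]]
    return k / sum(l - logs[k] for l in logs[:k])

random.seed(0)
alpha_true = 1.5  # tail index of the synthetic 'large earthquake' sample
sample = [random.paretovariate(alpha_true) for _ in range(20000)]
est = hill_estimator(sample, 2000)
print(round(est, 2))  # close to the true value of 1.5
```

    The choice of k (how deep into the tail to go) is the usual bias-variance trade-off in such fits.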

  19. Laboratory generated M -6 earthquakes

    Science.gov (United States)

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-01-01

    We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than characteristics of the experimental apparatus. The large size of the experimental apparatus, high fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separates this study from traditional acoustic emission analyses and allows these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.
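
    The reported stress drops and magnitudes can be cross-checked with two standard source relations: the circular-crack (Eshelby) formula M0 = (16/7) Δσ r³ and the IASPEI moment-magnitude definition. The 2 mm patch radius and 3 MPa stress drop below are illustrative values within the ranges quoted above:

```python
import math

def moment_from_stress_drop(delta_sigma_pa, radius_m):
    """Circular-crack (Eshelby) relation: M0 = (16/7) * delta_sigma * r**3."""
    return (16.0 / 7.0) * delta_sigma_pa * radius_m ** 3

def moment_magnitude(m0_newton_m):
    """IASPEI standard: Mw = (2/3) * (log10(M0) - 9.1), M0 in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_newton_m) - 9.1)

# A 2 mm-radius patch with a 3 MPa stress drop:
mw = moment_magnitude(moment_from_stress_drop(3e6, 2e-3))
print(round(mw, 1))  # about -6.9, inside the reported M -7 to -5.5 range
```

    That a mm-scale patch with an ordinary 1-10 MPa stress drop lands in the observed magnitude range is the scaling-law consistency the study emphasizes.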

  20. Visual management support system

    Science.gov (United States)

    Lee Anderson; Jerry Mosier; Geoffrey Chandler

    1979-01-01

    The Visual Management Support System (VMSS) is an extension of an existing computer program called VIEWIT, which has been extensively used by the U.S. Forest Service. The capabilities of this program lie in the rapid manipulation of large amounts of data, specifically operating as a tool to overlay or merge one set of data with another. VMSS was conceived to...

  1. Protracted fluvial recovery from medieval earthquakes, Pokhara, Nepal

    Science.gov (United States)

    Stolle, Amelie; Bernhardt, Anne; Schwanghart, Wolfgang; Andermann, Christoff; Schönfeldt, Elisabeth; Seidemann, Jan; Adhikari, Basanta R.; Merchel, Silke; Rugel, Georg; Fort, Monique; Korup, Oliver

    2016-04-01

    River response to strong earthquake shaking in mountainous terrain often entails the flushing of sediments delivered by widespread co-seismic landsliding. Detailed mass-balance studies following major earthquakes in China, Taiwan, and New Zealand suggest fluvial recovery times ranging from several years to decades. We report a detailed chronology of earthquake-induced valley fills in the Pokhara region of western-central Nepal, and demonstrate that rivers continue to adjust to several large medieval earthquakes to the present day, thus challenging the notion of transient fluvial response to seismic disturbance. The Pokhara valley features one of the largest and most extensively dated sedimentary records of earthquake-triggered sedimentation in the Himalayas, and independently augments paleo-seismological archives obtained mainly from fault trenches and historic documents. New radiocarbon dates from the catastrophically deposited Pokhara Formation document multiple phases of extremely high geomorphic activity between ˜700 and ˜1700 AD, preserved in thick sequences of alternating fluvial conglomerates, massive mud and silt beds, and cohesive debris-flow deposits. These dated fan-marginal slackwater sediments indicate pronounced sediment pulses in the wake of at least three large medieval earthquakes in ˜1100, 1255, and 1344 AD. We combine these dates with digital elevation models, geological maps, differential GPS data, and sediment logs to estimate the extent of these three pulses that are characterized by sedimentation rates of ˜200 mm yr-1 and peak rates as high as 1,000 mm yr-1. Some 5.5 to 9 km3 of material infilled the pre-existing topography, and is now prone to ongoing fluvial dissection along major canyons. Contemporary river incision into the Pokhara Formation is rapid (120-170 mm yr-1), triggering widespread bank erosion, channel changes, and very high sediment yields of the order of 103 to 105 t km-2 yr-1, that by far outweigh bedrock denudation rates

  2. Comprehensive treatment for gas gangrene of the limbs in earthquakes.

    Science.gov (United States)

    Wang, Yue; Lu, Bo; Hao, Peng; Yan, Meng-ning; Dai, Ke-rong

    2013-10-01

    Mortality rates for patients with gas gangrene from trauma or surgery are as high as 25%, but they increase to 50%-80% for patients injured in natural hazards. Early diagnosis and treatment are essential for these patients. We retrospectively analyzed the clinical characteristics and therapeutic results of 19 patients with gas gangrene of the limbs, who were injured in the May 2008 earthquake in the Wenchuan district of China's Sichuan province and treated in our hospital, to seek how to best diagnose and treat earthquake-induced gas gangrene. Of 226 patients with limbs open injuries sustained during the earthquake, 53 patients underwent smear analysis of wound exudates and gas gangrene was diagnosed in 19 patients. The average elapsed time from injury to arrival at the hospital was 72 hours, from injury to definitive diagnosis was 4.3 days, and from diagnosis to conversion of negative findings on wound smear analysis to positive findings was 12.7 days. Anaerobic cultures were also obtained before wound closure. The average elapsed time from completion of surgery to recovery of normal vital signs was 6.3 days. Of the 19 patients, 16 were treated with open amputation, two with closed amputation, and 1 with successful limb salvage; 18 patients were successfully treated and one died. In earthquakes, rapid, accurate screening and isolation are essential to successful treatment of gas gangrene and helpful in preventing nosocomial diffusion. Early and thorough debridement, open amputation, and active supportive treatment can produce satisfactory therapeutic results.

  3. Strong intermediate-depth Vrancea earthquakes: Damage capacity in Bulgaria

    International Nuclear Information System (INIS)

    Kouteva-Guentcheva, M.P.; Paskaleva, I.P.; Panza, G.F.

    2008-08-01

    The sustainable development of a society depends not only on a reasonable policy for economic growth but also on the reasonable management of natural risks. The regional earthquake danger due to the Vrancea intermediate-depth earthquakes dominates the hazard of NE Bulgaria. These quakes have particularly long-period and far-reaching effects, causing damage at large epicentral distances. The energy of Vrancea events attenuates considerably less rapidly than that of the wave field radiated by the seismically active zones in Bulgaria. The available strong-motion records at Russe, NE Bulgaria, for the two Vrancea events of August 30, 1986 and May 30, 1990 show higher seismic response spectra amplitudes for periods up to 0.6 s for the horizontal components, compared to the values given in the Bulgarian Code and Eurocode 8. A neo-deterministic analytical procedure, which models the wavefield generated by a realistic earthquake source as it propagates through a laterally varying anelastic medium, is applied to obtain the seismic loading at Russe. After proper validation using the few available data and parametric analyses, the damage capacity of selected scenario Vrancea quakes is estimated from the synthesized seismic signals and compared with available capacity curves for some reinforced concrete and masonry structures representative of the Balkan Region. The performed modelling has shown that the earthquake focal mechanisms control the seismic loading much more than the local geology, and that the site response should be analyzed by considering the whole thickness of sediments down to the bedrock, and not only the topmost 30 m. (author)

  4. The music of earthquakes and Earthquake Quartet #1

    Science.gov (United States)

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  5. Development of High-speed Visualization System of Hypocenter Data Using CUDA-based GPU computing

    Science.gov (United States)

    Kumagai, T.; Okubo, K.; Uchida, N.; Matsuzawa, T.; Kawada, N.; Takeuchi, N.

    2014-12-01

    After the Great East Japan Earthquake on March 11, 2011, intelligent visualization of seismic information has become important for understanding earthquake phenomena. At the same time, the quantity of seismic data has grown enormously with the progress of high-accuracy observation networks, and many parameters (e.g., position, origin time, magnitude) must be handled to display seismic information efficiently. High-speed processing of data and image information is therefore necessary to handle these enormous amounts of seismic data. Recently, the GPU (Graphics Processing Unit) has been used as an accelerator for data processing and calculation in various fields of study, a movement called GPGPU (General-Purpose computing on GPUs). GPU performance has kept improving rapidly over the last few years, and GPU computing now provides a high-performance computing environment at a lower cost than before. Moreover, the GPU has an inherent advantage for visualizing processed data, because its architecture was originally designed for graphics processing: in GPU computing the processed data are always stored in video memory, so drawing information can be written directly to the VRAM on the video card by combining CUDA with a graphics API. In this study, we employ CUDA together with OpenGL and/or DirectX to realize a full-GPU implementation. This method makes it possible to write drawing information to the VRAM on the video card without PCIe bus data transfer, enabling high-speed processing of seismic data. The present study examines GPU computing-based high-speed visualization and its feasibility for a high-speed visualization system of hypocenter data.

  6. Math for visualization, visualizing math

    NARCIS (Netherlands)

    Wijk, van J.J.; Hart, G.; Sarhangi, R.

    2013-01-01

    I present an overview of our work in visualization, and reflect on the role of mathematics therein. First, mathematics can be used as a tool to produce visualizations, which is illustrated with examples from information visualization, flow visualization, and cartography. Second, mathematics itself

  7. Visual art and visual perception

    NARCIS (Netherlands)

    Koenderink, Jan J.

    2015-01-01

    Visual art and visual perception ‘Visual art’ has become a minor cul-de-sac orthogonal to THE ART of the museum directors and billionaire collectors. THE ART is conceptual, instead of visual. Among its cherished items are the tins of artist’s shit (Piero Manzoni, 1961, Merda d’Artista) “worth their

  8. Priming and the guidance by visual and categorical templates in visual search

    NARCIS (Netherlands)

    Wilschut, A.M.; Theeuwes, J.; Olivers, C.N.L.

    2014-01-01

    Visual search is thought to be guided by top-down templates that are held in visual working memory. Previous studies have shown that a search-guiding template can be rapidly and strongly implemented from a visual cue, whereas templates are less effective when based on categorical cues. Direct visual

  9. Book review: Earthquakes and water

    Science.gov (United States)

    Bekins, Barbara A.

    2012-01-01

    It is really nice to see assembled in one place a discussion of the documented and hypothesized hydrologic effects of earthquakes. The book is divided into chapters focusing on particular hydrologic phenomena including liquefaction, mud volcanism, stream discharge increases, groundwater level, temperature and chemical changes, and geyser period changes. These hydrologic effects are inherently fascinating, and the large number of relevant publications in the past decade makes this summary a useful milepost. The book also covers hydrologic precursors and earthquake triggering by pore pressure. A natural need to limit the topics covered resulted in the omission of tsunamis and the vast literature on the role of fluids and pore pressure in frictional strength of faults. Regardless of whether research on earthquake-triggered hydrologic effects ultimately provides insight into the physics of earthquakes, the text provides welcome common ground for interdisciplinary collaborations between hydrologists and seismologists. Such collaborations continue to be crucial for investigating hypotheses about the role of fluids in earthquakes and slow slip. 

  10. Scenario-based earthquake hazard and risk assessment for Baku (Azerbaijan

    Directory of Open Access Journals (Sweden)

    G. Babayev

    2010-12-01

    Full Text Available A rapid growth of population, intensive civil and industrial building, land and water instabilities (e.g. landslides and significant underground water level fluctuations), and the lack of public awareness regarding seismic hazard contribute to the increasing vulnerability of Baku (the capital city of the Republic of Azerbaijan) to earthquakes. In this study, we assess earthquake risk in the city, determined as a convolution of seismic hazard (in terms of the surface peak ground acceleration, PGA), vulnerability (due to building construction fragility, population features, the gross domestic product per capita, and landslide occurrence), and exposure of infrastructure and critical facilities. The earthquake risk assessment provides useful information for identifying the factors influencing the risk. A deterministic seismic hazard for Baku is analysed for four earthquake scenarios: near, far, local, and extreme events. The seismic hazard models demonstrate the level of ground shaking in the city: high PGA values are predicted in the southern coastal and north-eastern parts of the city and in some parts of the downtown, and the PGA attains its maximal values for the local and extreme earthquake scenarios. We show that the quality of buildings and the probability of their damage, the distribution of urban population, exposure, and the pattern of peak ground acceleration all contribute to the seismic risk, while the vulnerability factors play the more prominent role for all earthquake scenarios. Our results can support strategic countermeasure plans for earthquake risk mitigation in Baku.
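The risk convolution described in this abstract (hazard, vulnerability, and exposure combined cell by cell over the city) can be illustrated with a minimal sketch; all grids, values, and weights below are hypothetical illustrations, not the study's data:

```python
import numpy as np

# Hypothetical 3x3 grids over a city (values are purely illustrative).
pga = np.array([[0.1, 0.2, 0.3],
                [0.2, 0.4, 0.3],
                [0.1, 0.3, 0.2]])            # hazard: peak ground acceleration, g
vulnerability = np.array([[0.5, 0.6, 0.9],
                          [0.4, 0.8, 0.7],
                          [0.3, 0.6, 0.5]])  # aggregated fragility index, 0..1
exposure = np.array([[2.0, 5.0, 8.0],
                     [1.0, 9.0, 6.0],
                     [0.5, 4.0, 3.0]])       # population/asset density, arbitrary units

# Risk as a cell-by-cell product of the three factors.
risk = pga * vulnerability * exposure

# Rank cells to identify where countermeasures matter most.
hotspot = np.unravel_index(np.argmax(risk), risk.shape)
print(risk.round(2))
print("highest-risk cell:", hotspot)
```

With these invented grids, the densely populated cell with moderate shaking dominates the ranking, which mirrors the paper's point that vulnerability and exposure can outweigh the raw hazard pattern.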

  11. Acoustic emission diagnosis of concrete-piles damaged by earthquakes

    International Nuclear Information System (INIS)

    Shiotani, Tomoki; Sakaino, Norio; Ohtsu, Masayasu; Shigeishi, Mitsuhiro

    1997-01-01

    Earthquakes often impose unexpected damage on structures. The soundness of a structure's upper portion is easily assessed by visual observation, while the lower portion, located deep underground, is difficult to evaluate, and few effective methods exist for investigating underground structures. In this paper, a new inspection technique for damage evaluation of concrete piles utilizing acoustic emission (AE) is proposed and verified by a series of experiments. First, basic characteristics such as attenuation and effective wave-guides for detecting AE underground are examined through laboratory tests. Second, fracture tests of full-scale prefabricated concrete piles are conducted, and the AE characteristics are examined. Finally, actual concrete piles damaged by the 1995 Great Hanshin Earthquake are investigated. The results confirm that the damage estimated by the proposed method is in good agreement with the actual damaged locations. The method is thus very effective for the diagnosis of concrete piles.

  12. Chapter D. The Loma Prieta, California, Earthquake of October 17, 1989 - Aftershocks and Postseismic Effects

    Science.gov (United States)

    Reasenberg, Paul A.

    1997-01-01

    While the damaging effects of the earthquake represent a significant social setback and economic loss, the geophysical effects have produced a wealth of data that have provided important insights into the structure and mechanics of the San Andreas Fault system. Generally, the period after a large earthquake is vitally important to monitor. During this part of the seismic cycle, the primary fault and the surrounding faults, rock bodies, and crustal fluids rapidly readjust in response to the earthquake's sudden movement. Geophysical measurements made at this time can provide unique information about fundamental properties of the fault zone, including its state of stress and the geometry and frictional/rheological properties of the faults within it. Because postseismic readjustments are rapid compared with corresponding changes occurring in the preseismic period, the amount and rate of information that is available during the postseismic period is relatively high. From a geophysical viewpoint, the occurrence of the Loma Prieta earthquake in a section of the San Andreas fault zone that is surrounded by multiple and extensive geophysical monitoring networks has produced nothing less than a scientific bonanza. The reports assembled in this chapter collectively examine available geophysical observations made before and after the earthquake and model the earthquake's principal postseismic effects. The chapter covers four broad categories of postseismic effect: (1) aftershocks; (2) postseismic fault movements; (3) postseismic surface deformation; and (4) changes in electrical conductivity and crustal fluids.

  13. Impacts of hydrogeological characteristics on groundwater-level changes induced by earthquakes

    Science.gov (United States)

    Liu, Ching-Yi; Chia, Yeeping; Chuang, Po-Yu; Chiu, Yung-Chia; Tseng, Tai-Lin

    2018-03-01

    Changes in groundwater level during earthquakes have been reported worldwide. In this study, field observations of co-seismic groundwater-level changes in wells under different aquifer conditions and sampling intervals due to near-field earthquake events in Taiwan are presented. Sustained changes, usually observed immediately after earthquakes, are found in the confined aquifer. Oscillatory changes due to the dynamic strain triggered by passing earthquake waves can only be recorded by a high-frequency data logger. While co-seismic changes recover rapidly in an unconfined aquifer, they can sustain for months or longer in a confined aquifer. Three monitoring wells with long-term groundwater-level data were examined to understand the association of co-seismic changes with local hydrogeological conditions. The finite element software ABAQUS is used to simulate the pore-pressure changes induced by the displacements due to fault rupture. The calculated co-seismic change in pore pressure is related to the compressibility of the formation. The recovery rate of the change is rapid in the unconfined aquifer due to the hydrostatic condition at the water table, but slow in the confined aquifer due to the less permeable confining layer. Fracturing of the confining layer during earthquakes may enhance the dissipation of pore pressure and induce the discharge of the confined aquifer. The study results indicated that aquifer characteristics play an important role in determining groundwater-level changes during and after earthquakes.

  14. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  15. Unbonded Prestressed Columns for Earthquake Resistance

    Science.gov (United States)

    2012-05-01

    Modern structures are able to survive significant shaking caused by earthquakes. By implementing unbonded post-tensioned tendons in bridge columns, the damage caused by an earthquake can be significantly lower than that of a standard reinforced concr...

  16. Extreme value distribution of earthquake magnitude

    Science.gov (United States)

    Zi, Jun Gan; Tung, C. C.

    1983-07-01

    Probability distribution of maximum earthquake magnitude is first derived for an unspecified probability distribution of earthquake magnitude. A model for energy release of large earthquakes, similar to that of Adler-Lomnitz and Lomnitz, is introduced from which the probability distribution of earthquake magnitude is obtained. An extensive set of world data for shallow earthquakes, covering the period from 1904 to 1980, is used to determine the parameters of the probability distribution of maximum earthquake magnitude. Because of the special form of probability distribution of earthquake magnitude, a simple iterative scheme is devised to facilitate the estimation of these parameters by the method of least-squares. The agreement between the empirical and derived probability distributions of maximum earthquake magnitude is excellent.
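The construction in this abstract (a maximum-magnitude distribution derived from an underlying magnitude distribution) can be sketched for the simplest case, a Gutenberg-Richter (exponential) magnitude model rather than the paper's specific distribution; the b-value and magnitude threshold below are illustrative assumptions:

```python
# Sketch: distribution of the maximum magnitude among n independent events,
# assuming a Gutenberg-Richter magnitude CDF (not the paper's exact model).
b = 1.0        # assumed b-value
m_min = 5.0    # assumed lower magnitude cutoff

def F(m):
    """CDF of a single event's magnitude under the G-R (exponential) model."""
    return 1.0 - 10.0 ** (-b * (m - m_min)) if m >= m_min else 0.0

def F_max(m, n):
    """CDF of the maximum magnitude among n independent events."""
    return F(m) ** n

# Median of the largest magnitude in n = 100 events, found by bisection.
n = 100
lo, hi = m_min, 10.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if F_max(mid, n) < 0.5 else (lo, mid)
print(f"median largest magnitude in {n} events: {lo:.2f}")   # ~7.16 here
```

Fitting the parameters of such a model to observed maxima by least squares, as the paper does iteratively, amounts to adjusting b and m_min until F_max matches the empirical distribution.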

  17. Different damage observed in the villages of Pescara del Tronto and Vezzano after the M6.0 August 24, 2016 central Italy earthquake and site effects analysis

    Directory of Open Access Journals (Sweden)

    Angelo Masi

    2017-01-01

    Full Text Available The authors surveyed many damaged villages located near the epicentre of the ML=6.0 earthquake which occurred on August 24, 2016 in central Italy. Some unexpected anomalies were discovered, such as the very different levels of damage in the villages of Vezzano and Pescara del Tronto (Arquata del Tronto Municipality, Ascoli Piceno province), situated just 1300 meters from each other. Pescara del Tronto suffered very heavy damage, with many masonry building collapses and 48 fatalities, while Vezzano suffered only light damage to a few buildings. This paper provides a preliminary analysis from an engineering and geophysical perspective. In particular, rapid visual surveys were carried out in the two villages to detect possible significant differences in the vulnerability of their building stocks, and geophysical site investigations were performed to detect possible local amplification effects.

  18. Joko Tingkir program for estimating tsunami potential rapidly

    Energy Technology Data Exchange (ETDEWEB)

    Madlazim, E-mail: m-lazim@physics.its.ac.id; Hariyono, E., E-mail: m-lazim@physics.its.ac.id [Department of Physics, Faculty of Mathematics and Natural Sciences, Universitas Negeri Surabaya (UNESA), Jl. Ketintang, Surabaya 60231 (Indonesia)

    2014-09-25

    The purpose of the study was to estimate P-wave rupture durations (T{sub dur}), dominant periods (T{sub d}) and exceedance durations (T{sub 50Ex}) simultaneously for local events: shallow earthquakes which occurred off the coast of Indonesia. Although all the earthquakes had magnitudes greater than 6.3 and depths less than 70 km, some of them generated a tsunami while other events (Mw=7.8) did not. Analysis of the above parameters with Joko Tingkir helped in understanding the tsunami generation of these earthquakes. Measurements from vertical-component broadband P-wave velocity records and determination of the above parameters provide a direct procedure for rapidly assessing the potential for tsunami generation. The results of the present study and the analysis of the seismic parameters helped explain why some events generated a tsunami while others did not.
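A screening rule built on these three P-wave parameters can be sketched as below. The threshold values are illustrative assumptions in the style of published duration-period discriminants (e.g., Lomax-Michelini-type criteria), not the exact values used by the Joko Tingkir program, and the example events are invented:

```python
def tsunami_potential(t_dur, t_d, t50ex):
    """
    Screen an event for tsunami potential from P-wave parameters:
      t_dur  - rupture duration (s)
      t_d    - dominant period (s)
      t50ex  - exceedance-duration ratio (dimensionless)
    Thresholds are illustrative assumptions, not Joko Tingkir's exact values.
    """
    long_rupture = t_dur >= 55.0        # long ruptures favour tsunami generation
    slow_shallow = t_d * t50ex >= 1.0   # long-period, sustained P-wave shaking
    return long_rupture and slow_shallow

# Hypothetical events (parameter values invented for illustration):
print(tsunami_potential(t_dur=95.0, t_d=12.0, t50ex=1.4))  # True: flagged
print(tsunami_potential(t_dur=40.0, t_d=6.0, t50ex=0.1))   # False: not flagged
```

Because all three quantities are measured from the early P wave, such a rule can run within minutes of origin time, which is the point of the rapid-assessment procedure described above.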

  19. An Earthquake Information Service with Free and Open Source Tools

    Science.gov (United States)

    Schroeder, M.; Stender, V.; Jüngling, S.

    2015-12-01

    At the GFZ German Research Centre for Geosciences in Potsdam, the working group Earthquakes and Volcano Physics examines the spatiotemporal behavior of earthquakes; in this context the hazards of volcanic eruptions and tsunamis are also explored. The aim is to collect related information after the occurrence of such extreme events and make it available to science, and partly to the public, as quickly as possible. The overall objective of this research is to reduce the geological risks that emanate from such natural hazards. To meet these objectives, to give a quick overview of the seismicity of a particular region, and to allow comparison with historical events, a comprehensive visualization was desired. Based on the web-accessible data from the well-known GFZ GEOFON network, a user-friendly web mapping application was realized. This web service further integrates historical and current earthquake information from the USGS earthquake database, and more historical events from various other catalogues such as Pacheco and the International Seismological Centre (ISC). This compilation of sources is unique in the Earth sciences. Information about historical and current volcanic eruptions and tsunamis is also retrievable. Another special feature of the application is the ability to constrain the displayed time range: users can interactively vary the visualization by moving a time slider. Furthermore, the application was built with up-to-date JavaScript libraries, which enables it to run on displays and devices of all sizes. Our contribution will present the making of, the architecture behind, and a few examples of the look and feel of this application.

  20. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Directory of Open Access Journals (Sweden)

    Mustafa ULAS

    2013-01-01

    Full Text Available A lot of people die because of earthquakes every year. It is therefore crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are targeted, in order to guide new research toward novel prediction methods.

  1. EARTHQUAKE RESEARCH PROBLEMS OF NUCLEAR POWER GENERATORS

    Energy Technology Data Exchange (ETDEWEB)

    Housner, G. W.; Hudson, D. E.

    1963-10-15

    Earthquake problems associated with the construction of nuclear power generators require a more extensive and a more precise knowledge of earthquake characteristics and the dynamic behavior of structures than was considered necessary for ordinary buildings. Economic considerations indicate the desirability of additional research on the problems of earthquakes and nuclear reactors. The nature of these earthquake-resistant design problems is discussed and programs of research are recommended. (auth)

  2. Geodetic constraints on afterslip characteristics following the March 9, 2011, Sanriku-oki earthquake, Japan

    Science.gov (United States)

    Ohta, Yusaku; Hino, Ryota; Inazu, Daisuke; Ohzono, Mako; Ito, Yoshihiro; Mishina, Masaaki; Iinuma, Takeshi; Nakajima, Junichi; Osada, Yukihito; Suzuki, Kensuke; Fujimoto, Hiromi; Tachibana, Kenji; Demachi, Tomotsugu; Miura, Satoshi

    2012-08-01

    A magnitude 7.3 foreshock occurred at the subducting Pacific plate interface on March 9, 2011, 51 h before the magnitude 9.0 Tohoku earthquake off the Pacific coast of Japan. We propose a coseismic and postseismic afterslip model of the magnitude 7.3 event based on a global positioning system network and ocean bottom pressure gauge sites. The estimated coseismic slip and afterslip areas show complementary spatial distributions; the afterslip distribution is located up-dip of the coseismic slip of the foreshock and northward of the hypocenter of the Tohoku earthquake. The slip amount for the afterslip is roughly consistent with that determined by repeating-earthquake analysis carried out in a previous study. The estimated moment release for the afterslip reached magnitude 6.8, even within the short time period of 51 h. A volumetric strainmeter time series also suggests that this event decayed with a rapid time constant compared with other typical large earthquakes.
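The statement that the afterslip moment release "reached magnitude 6.8" can be related to seismic moment through the standard moment-magnitude relation (Hanks-Kanamori); the sketch below is a generic conversion, not the authors' inversion code:

```python
import math

def moment_to_mw(m0_nm):
    """Moment magnitude from seismic moment M0 in N*m (Hanks-Kanamori)."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

def mw_to_moment(mw):
    """Inverse: seismic moment in N*m for a given Mw."""
    return 10.0 ** (1.5 * mw + 9.1)

# Moment corresponding to the reported Mw 6.8 afterslip episode:
m0 = mw_to_moment(6.8)
print(f"M0 ~ {m0:.2e} N*m")                    # ~2.0e19 N*m
print(f"round trip: Mw {moment_to_mw(m0):.1f}")  # Mw 6.8
```

Summing the geodetically estimated slip times rigidity times fault area gives M0, and this relation turns it into the Mw 6.8 figure quoted in the abstract.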

  3. Induced earthquakes. Sharp increase in central Oklahoma seismicity since 2008 induced by massive wastewater injection.

    Science.gov (United States)

    Keranen, K M; Weingarten, M; Abers, G A; Bekins, B A; Ge, S

    2014-07-25

    Unconventional oil and gas production provides a rapidly growing energy source; however, high-production states in the United States, such as Oklahoma, face sharply rising numbers of earthquakes. Subsurface pressure data required to unequivocally link earthquakes to wastewater injection are rarely accessible. Here we use seismicity and hydrogeological models to show that fluid migration from high-rate disposal wells in Oklahoma is potentially responsible for the largest swarm. Earthquake hypocenters occur within disposal formations and upper basement, between 2- and 5-kilometer depth. The modeled fluid pressure perturbation propagates throughout the same depth range and tracks earthquakes to distances of 35 kilometers, with a triggering threshold of ~0.07 megapascals. Although thousands of disposal wells operate aseismically, four of the highest-rate wells are capable of inducing 20% of 2008 to 2013 central U.S. seismicity. Copyright © 2014, American Association for the Advancement of Science.

  4. The use of waveform shapes to automatically determine earthquake focal depth

    Science.gov (United States)

    Sipkin, S.A.

    2000-01-01

    Earthquake focal depth is an important parameter for rapidly determining probable damage caused by a large earthquake. In addition, it is significant both for discriminating between natural events and explosions and for discriminating between tsunamigenic and nontsunamigenic earthquakes. For the purpose of notifying emergency management and disaster relief organizations as well as issuing tsunami warnings, potential time delays in determining source parameters are particularly detrimental. We present a method for determining earthquake focal depth that is well suited for implementation in an automated system that utilizes the wealth of broadband teleseismic data that is now available in real time from the global seismograph networks. This method uses waveform shapes to determine focal depth and is demonstrated to be valid for events with magnitudes as low as approximately 5.5.

  5. Fault geometry and earthquake mechanics

    Directory of Open Access Journals (Sweden)

    D. J. Andrews

    1994-06-01

    Full Text Available Earthquake mechanics may be determined by the geometry of a fault system. Slip on a fractal branching fault surface can explain: (1) regeneration of stress irregularities in an earthquake; (2) the concentration of stress drop in an earthquake into asperities; (3) starting and stopping of earthquake slip at fault junctions; and (4) self-similar scaling of earthquakes. Slip at fault junctions provides a natural realization of barrier and asperity models without appealing to variations of fault strength. Fault systems are observed to have a branching fractal structure, and slip may occur at many fault junctions in an earthquake. Consider the mechanics of slip at one fault junction. In order to avoid a stress singularity of order 1/r, an intersection of faults must be a triple junction, and the Burgers vectors on the three fault segments at the junction must sum to zero. In other words, to lowest order the deformation consists of rigid block displacement, which ensures that the local stress due to the dislocations is zero. The elastic dislocation solution, however, ignores the fact that the configuration of the blocks changes at the scale of the displacement. A volume change occurs at the junction; either a void opens or intense local deformation is required to avoid material overlap. The volume change is proportional to the product of the slip increment and the total slip since the formation of the junction. Energy absorbed at the junction, equal to confining pressure times the volume change, is not large enough to prevent slip at a new junction. The ratio of energy absorbed at a new junction to elastic energy released in an earthquake is no larger than P/µ where P is confining pressure and µ is the shear modulus. At a depth of 10 km this dimensionless ratio has the value P/µ = 0.01. As slip accumulates at a fault junction in a number of earthquakes, the fault segments are displaced such that they no longer meet at a single point. For this reason the

  6. Historical earthquake investigations in Greece

    Directory of Open Access Journals (Sweden)

    K. Makropoulos

    2004-06-01

    Full Text Available The active tectonics of the area of Greece and its seismic activity have always been present in the country's history. Many researchers, tempted to work on Greek historical earthquakes, have realized that this is a task not easily fulfilled. The existing catalogues of strong historical earthquakes are useful tools to perform general SHA studies. However, a variety of supporting datasets, non-uniformly distributed in space and time, need to be further investigated. In the present paper, a review of historical earthquake studies in Greece is attempted. The seismic history of the country is divided into four main periods. In each one of them, characteristic examples, studies and approaches are presented.

  7. Building damage assessment after the earthquake in Haiti using two postevent satellite stereo imagery and DSMs

    DEFF Research Database (Denmark)

    Tian, Jiaojiao; Nielsen, Allan Aasbjerg; Reinartz, Peter

    2015-01-01

    In this article, a novel after-disaster building damage monitoring method is presented. This method combines the multispectral imagery and digital surface models (DSMs) from stereo matching of two dates to obtain three kinds of changes: collapsed buildings, newly built buildings and temporary she...... changes after the 2010 Haiti earthquake, and the obtained results are further evaluated both visually and numerically....

  8. Flow visualization

    CERN Document Server

    Merzkirch, Wolfgang

    1974-01-01

    Flow Visualization describes the most widely used methods for visualizing flows. Flow visualization evaluates certain properties of a flow field directly accessible to visual perception. Organized into five chapters, this book first presents the methods that create a visible flow pattern that could be investigated by visual inspection, such as simple dye and density-sensitive visualization methods. It then deals with the application of electron beams and streaming birefringence. Optical methods for compressible flows, hydraulic analogy, and high-speed photography are discussed in other cha

  9. Fault failure with moderate earthquakes

    Science.gov (United States)

    Johnston, M. J. S.; Linde, A. T.; Gladwin, M. T.; Borcherdt, R. D.

    1987-12-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake ( ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake ( ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake ( ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake ( ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake ( ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10 -8), with borehole dilatometers (resolution 10 -10) and a 3-component borehole strainmeter (resolution 10 -9). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure.

  10. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Science.gov (United States)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  11. 13 CFR 120.174 - Earthquake hazards.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  12. Re‐estimated effects of deep episodic slip on the occurrence and probability of great earthquakes in Cascadia

    Science.gov (United States)

    Beeler, Nicholas M.; Roeloffs, Evelyn A.; McCausland, Wendy

    2013-01-01

    Mazzotti and Adams (2004) estimated that rapid deep slip during typically two-week-long episodes beneath northern Washington and southern British Columbia increases the probability of a great Cascadia earthquake by 30–100 times relative to the probability during the ∼58 weeks between slip events. Because the corresponding absolute probability remains very low at ∼0.03% per week, their conclusion is that though it is more likely that a great earthquake will occur during a rapid slip event than during other times, a great earthquake is unlikely to occur during any particular rapid slip event. This previous estimate used a failure model in which great earthquakes initiate instantaneously at a stress threshold. We refine the estimate, assuming a delayed failure model that is based on laboratory‐observed earthquake initiation. Laboratory tests show that failure of intact rock in shear and the onset of rapid slip on pre‐existing faults do not occur at a threshold stress. Instead, slip onset is gradual and shows a damped response to stress and loading rate changes. The characteristic time of failure depends on loading rate and effective normal stress. Using this model, the probability enhancement during the period of rapid slip in Cascadia is negligible for effective normal stresses of 10 MPa or more, and increases by only 1.5 times for an effective normal stress of 1 MPa. We present arguments that the hypocentral effective normal stress exceeds 1 MPa. In addition, the probability enhancement due to rapid slip extends into the interevent period. With this delayed failure model, for effective normal stresses greater than or equal to 50 kPa it is more likely that a great earthquake will occur between the periods of rapid deep slip than during them. Our conclusion is that great earthquake occurrence is not significantly enhanced by episodic deep slip events.
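The probabilities quoted from Mazzotti and Adams (2004) can be checked with simple conditional-probability arithmetic. The sketch below uses the abstract's numbers (two-week episodes, ∼58 weeks between them, ∼0.03% per week during slip) together with the lower bound of the quoted 30-100x enhancement; the calculation is illustrative, not taken from either paper:

```python
# Numbers from the abstract: ~2-week slip episodes separated by ~58 weeks,
# 30-100x probability enhancement, ~0.03% per week during slip episodes.
weeks_slip, weeks_between = 2.0, 58.0
p_during = 3e-4        # weekly probability of a great earthquake during slip
enhancement = 30.0     # lower bound of the quoted 30-100x range

p_between = p_during / enhancement   # implied weekly probability at other times

# Expected occurrences per full ~60-week cycle, split by period:
e_slip = p_during * weeks_slip
e_between = p_between * weeks_between
share_during_slip = e_slip / (e_slip + e_between)
print(f"fraction of great-earthquake probability in slip weeks: "
      f"{share_during_slip:.0%}")     # ~51% at the 30x enhancement
```

Even at the threshold-model's 30x enhancement, only about half of the total probability falls inside the short slip windows; the paper's delayed-failure model reduces the enhancement much further, which is why it concludes the episodes do not significantly raise the odds.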

  13. Computational methods in earthquake engineering

    CERN Document Server

    Plevris, Vagelis; Lagaros, Nikos

    2017-01-01

    This is the third book in a series on Computational Methods in Earthquake Engineering. The purpose of this volume is to bring together the scientific communities of Computational Mechanics and Structural Dynamics, offering a wide coverage of timely issues on contemporary Earthquake Engineering. This volume will facilitate the exchange of ideas in topics of mutual interest and can serve as a platform for establishing links between research groups with complementary activities. The computational aspects are emphasized in order to address difficult engineering problems of great social and economic importance.

  14. Radon as an earthquake precursor

    International Nuclear Information System (INIS)

    Planinic, J.; Radolic, V.; Vukovic, B.

    2004-01-01

    Radon concentrations in soil gas were continuously measured by LR-115 nuclear track detectors during a four-year period. Seismic activity, as well as barometric pressure, rainfall, and air temperature, was also observed. The influence of meteorological parameters on temporal radon variations was investigated, and a corresponding multiple-regression equation was derived. Earthquakes with magnitude ≥3 at epicentral distances ≤200 km were recognized by means of radon anomalies. Empirical equations relating earthquake magnitude, epicentral distance, and precursor time were examined, and the respective constants were determined.

  16. Earthquake location in island arcs

    Science.gov (United States)

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high

  17. RRSM: The European Rapid Raw Strong-Motion Database

    Science.gov (United States)

    Cauzzi, C.; Clinton, J. F.; Sleeman, R.; Domingo Ballesta, J.; Kaestli, P.; Galanis, O.

    2014-12-01

    We introduce the European Rapid Raw Strong-Motion database (RRSM), a Europe-wide system that provides parameterised strong motion information, as well as access to waveform data, within minutes of the occurrence of strong earthquakes. The RRSM significantly differs from traditional earthquake strong motion dissemination in Europe, which has focused on providing reviewed, processed strong motion parameters, typically with significant delays. As the RRSM provides rapid open access to raw waveform data and metadata and does not rely on external manual waveform processing, RRSM information is tailored to seismologists and strong-motion data analysts, earthquake and geotechnical engineers, international earthquake response agencies and the educated general public. Access to the RRSM database is via a portal at http://www.orfeus-eu.org/rrsm/ that allows users to query earthquake information, peak ground motion parameters and amplitudes of spectral response; and to select and download earthquake waveforms. All information is available within minutes of any earthquake with magnitude ≥ 3.5 occurring in the Euro-Mediterranean region. Waveform processing and database population are performed using the waveform processing module scwfparam, which is integrated in SeisComP3 (SC3; http://www.seiscomp3.org/). Earthquake information is provided by the EMSC (http://www.emsc-csem.org/) and all seismic waveform data are accessed at the European Integrated waveform Data Archive (EIDA) at ORFEUS (http://www.orfeus-eu.org/index.html), where all on-scale data are used in the fully automated processing. As the EIDA community continues to grow, the already significant number of strong-motion stations is increasing, and the value of this product is expected to grow accordingly. Real-time RRSM processing started in June 2014, while past events have been processed in order to provide a complete database back to 2005.

  18. Crowdsourcing earthquake damage assessment using remote sensing imagery

    Directory of Open Access Journals (Sweden)

    Stuart Gill

    2011-06-01

    This paper describes the evolution of recent work on using crowdsourced analysis of remote sensing imagery, particularly high-resolution aerial imagery, to provide rapid, reliable assessments of damage caused by earthquakes and potentially other disasters. The initial effort examined online imagery taken after the 2008 Wenchuan, China, earthquake. A more recent response to the 2010 Haiti earthquake led to the formation of an international consortium: the Global Earth Observation Catastrophe Assessment Network (GEO-CAN). The success of GEO-CAN in contributing to the official damage assessments made by the Government of Haiti, the United Nations, and the World Bank led to further development of a web-based interface. A current initiative in Christchurch, New Zealand, is underway where remote sensing experts are analyzing satellite imagery, geotechnical engineers are marking liquefaction areas, and structural engineers are identifying building damage. The current site includes online training to improve the accuracy of the assessments and make it possible for even novice users to contribute to the crowdsourced solution. The paper discusses lessons learned from these initiatives and presents a way forward for using crowdsourced remote sensing as a tool for rapid assessment of damage caused by natural disasters around the world.

  19. Geodetic characteristic of the postseismic deformation following the interplate large earthquake along the Japan Trench (Invited)

    Science.gov (United States)

    Ohta, Y.; Hino, R.; Ariyoshi, K.; Matsuzawa, T.; Mishina, M.; Sato, T.; Inazu, D.; Ito, Y.; Tachibana, K.; Demachi, T.; Miura, S.

    2013-12-01

    On March 9, 2011 at 2:45 (UTC), an M7.3 interplate earthquake (hereafter foreshock) occurred ~45 km northeast of the epicenter of the M9.0 2011 Tohoku earthquake. This foreshock preceded the 2011 Tohoku earthquake by 51 hours. Ohta et al. (2012, GRL) estimated the co- and postseismic afterslip distribution based on a dense GPS network and ocean bottom pressure gauge sites. They found the afterslip distribution was mainly concentrated in the up-dip extension of the coseismic slip. The coseismic slip and afterslip distribution of the foreshock were also located in the slip deficit region (between 20 and 40 m of slip) of the coseismic slip of the M9.0 mainshock. The slip amount for the afterslip is roughly consistent with that determined by repeating earthquake analysis carried out in a previous study (Kato et al., 2012, Science). The estimated moment release for the afterslip reached magnitude 6.8, even within the short time period of 51 hours. They also pointed out that a volumetric strainmeter time series suggests that this event proceeded with a rapid decay time constant (4.8 h) compared with other typical large earthquakes. The decay time constant of the afterslip may reflect the frictional properties of the plate interface, especially the effective normal stress controlled by fluid. For verification of the short decay time constant of the foreshock, we investigated the postseismic deformation characteristics following the 1989 and 1992 Sanriku-Oki earthquakes (M7.1 and M6.9), the 2003 and 2005 Miyagi-Oki earthquakes (M6.8 and M7.2), and the 2008 Fukushima-Oki earthquake (M6.9). We used a four-component extensometer at Miyako (39.59N, 141.98E) on the Sanriku coast for the 1989 and 1992 events. For the 2003, 2005, and 2008 events, we used volumetric strainmeters at Kinka-zan (38.27N, 141.58E) and Enoshima (38.27N, 141.60E). To extract the characteristics of the postseismic deformation, we fitted a logarithmic function. 
    The estimated decay time constants for each earthquake fell within a similar range (1

  20. Dancing Earthquake Science Assists Recovery from the Christchurch Earthquakes

    Science.gov (United States)

    Egan, Candice J.; Quigley, Mark C.

    2015-01-01

    The 2010-2012 Christchurch (Canterbury) earthquakes in New Zealand caused loss of life and psychological distress in residents throughout the region. In 2011, student dancers of the Hagley Dance Company and dance professionals choreographed the performance "Move: A Seismic Journey" for the Christchurch Body Festival that explored…

  1. Visual field

    Science.gov (United States)

  2. Reading a 400,000-year record of earthquake frequency for an intraplate fault.

    Science.gov (United States)

    Williams, Randolph T; Goodwin, Laurel B; Sharp, Warren D; Mozley, Peter S

    2017-05-09

    Our understanding of the frequency of large earthquakes at timescales longer than instrumental and historical records is based mostly on paleoseismic studies of fast-moving plate-boundary faults. Similar study of intraplate faults has been limited until now, because intraplate earthquake recurrence intervals are generally long (10s to 100s of thousands of years) relative to conventional paleoseismic records determined by trenching. Long-term variations in the earthquake recurrence intervals of intraplate faults therefore are poorly understood. Longer paleoseismic records for intraplate faults are required both to better quantify their earthquake recurrence intervals and to test competing models of earthquake frequency (e.g., time-dependent, time-independent, and clustered). We present the results of U-Th dating of calcite veins in the Loma Blanca normal fault zone, Rio Grande rift, New Mexico, United States, that constrain earthquake recurrence intervals over much of the past ∼550 ka, the longest direct record of seismic frequency documented for any fault to date. The 13 distinct seismic events delineated by this effort demonstrate that for >400 ka, the Loma Blanca fault produced periodic large earthquakes, consistent with a time-dependent model of earthquake recurrence. However, this time-dependent series was interrupted by a cluster of earthquakes at ∼430 ka. The carbon isotope composition of calcite formed during this seismic cluster records rapid degassing of CO2, suggesting an interval of anomalous fluid source. In concert with U-Th dates recording decreased recurrence intervals, we infer that seismicity during this interval records fault-valve behavior. These data provide insight into the long-term seismic behavior of the Loma Blanca fault and, by inference, other intraplate faults.
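
The competing recurrence models named above (time-dependent, time-independent, clustered) are commonly discriminated by the coefficient of variation (COV) of interevent times. A minimal sketch with hypothetical event ages, not the Loma Blanca U-Th dates:

```python
# Sketch: classifying earthquake recurrence behavior by the coefficient of
# variation (COV) of interevent times. COV << 1 suggests quasi-periodic
# (time-dependent) behavior, COV ~ 1 a Poisson process, COV > 1 clustering.
# The event ages below are hypothetical, not the dated Loma Blanca events.
from statistics import mean, pstdev

def recurrence_cov(event_ages_ka):
    """COV of interevent times from a list of event ages (in ka)."""
    ages = sorted(event_ages_ka)
    intervals = [b - a for a, b in zip(ages, ages[1:])]
    return pstdev(intervals) / mean(intervals)

periodic = [0, 41, 80, 122, 160, 201]        # roughly even ~40 ka spacing
clustered = [0, 2, 4, 6, 150, 152, 300]      # bursts separated by long gaps

print(round(recurrence_cov(periodic), 2))    # small: quasi-periodic
print(round(recurrence_cov(clustered), 2))   # above 1: clustered
```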

  4. Long Period Earthquakes Beneath California's Young and Restless Volcanoes

    Science.gov (United States)

    Pitt, A. M.; Dawson, P. B.; Shelly, D. R.; Hill, D. P.; Mangan, M.

    2013-12-01

    The newly established USGS California Volcano Observatory has the broad responsibility of monitoring and assessing hazards at California's potentially threatening volcanoes, most notably Mount Shasta, Medicine Lake, Clear Lake Volcanic Field, and Lassen Volcanic Center in northern California; and Long Valley Caldera, Mammoth Mountain, and Mono-Inyo Craters in east-central California. Volcanic eruptions occur in California about as frequently as the largest San Andreas Fault Zone earthquakes: more than ten eruptions have occurred in the last 1,000 years, most recently at Lassen Peak (1666 C.E. and 1914-1917 C.E.) and Mono-Inyo Craters (c. 1700 C.E.). The Long Valley region (Long Valley caldera and Mammoth Mountain) underwent several episodes of heightened unrest over the last three decades, including intense swarms of volcano-tectonic (VT) earthquakes, rapid caldera uplift, and hazardous CO2 emissions. Both Medicine Lake and Lassen are subsiding at appreciable rates, and along with Clear Lake, Long Valley Caldera, and Mammoth Mountain, sporadically experience long period (LP) earthquakes related to migration of magmatic or hydrothermal fluids. Worldwide, the last two decades have shown the importance of tracking LP earthquakes beneath young volcanic systems, as they often provide indication of impending unrest or eruption. Herein we document the occurrence of LP earthquakes at several of California's young volcanoes, updating a previous study published in Pitt et al., 2002, SRL. All events were detected and located using data from stations within the Northern California Seismic Network (NCSN). Event detection was spatially and temporally uneven across the NCSN in the 1980s and 1990s, but additional stations, adoption of the Earthworm processing system, and heightened vigilance by seismologists have improved the catalog over the last decade. 
LP earthquakes are now relatively well-recorded under Lassen (~150 events since 2000), Clear Lake (~60 events), Mammoth Mountain

  5. Vrancea earthquakes. Courses for specific actions to mitigate seismic risk

    International Nuclear Information System (INIS)

    Marmureanu, Gheorghe; Marmureanu, Alexandru

    2005-01-01

    cities. There are zones in Bucharest in which the fundamental periods of the soil and geological structures differ markedly, and during the August 30, 1986 Vrancea earthquake the recorded peak accelerations varied threefold; - Shake map. This shake map, now nearing completion, will allow us to portray rapidly, in real time, the extent of shaking during an earthquake in a simplified form suitable for immediate post-earthquake decision-making; - Seismic tomography of dams for avoiding catastrophes. Seismic tomography, and seismic imaging in general, can be applied as methods of investigation for mitigating earthquake effects. (authors)

  6. Predictability of Landslide Timing From Quasi-Periodic Precursory Earthquakes

    Science.gov (United States)

    Bell, Andrew F.

    2018-02-01

    Accelerating rates of geophysical signals are observed before a range of material failure phenomena. They provide insights into the physical processes controlling failure and the basis for failure forecasts. However, examples of accelerating seismicity before landslides are rare, and their behavior and forecasting potential are largely unknown. Here I use a Bayesian methodology to apply a novel gamma point process model to investigate a sequence of quasiperiodic repeating earthquakes preceding a large landslide at Nuugaatsiaq in Greenland in June 2017. The evolution in earthquake rate is best explained by an inverse power law increase with time toward failure, as predicted by material failure theory. However, the commonly accepted power law exponent value of 1.0 is inconsistent with the data. Instead, the mean posterior value of 0.71 indicates a particularly rapid acceleration toward failure and suggests that only relatively short warning times may be possible for similar landslides in future.
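
The inverse power-law rate model described above can be sketched as follows. The constants k and t_f and the evaluation grid are illustrative; only the exponent 0.71 echoes the posterior mean reported:

```python
# Sketch of the inverse power-law failure model: the event rate
#   r(t) = k * (t_f - t) ** (-p)
# accelerates toward the failure time t_f. For p < 1 the expected cumulative
# count has a closed form, checked here against numerical integration.
# k = 1.0 and t_f = 10.0 are arbitrary illustrative values; p = 0.71 echoes
# the posterior mean reported in the abstract.

def rate(t, k=1.0, p=0.71, t_f=10.0):
    return k * (t_f - t) ** (-p)

def expected_count(t, k=1.0, p=0.71, t_f=10.0):
    """Integral of rate(s) ds from 0 to t (valid for p < 1)."""
    return k / (1.0 - p) * (t_f ** (1.0 - p) - (t_f - t) ** (1.0 - p))

# midpoint-rule check of the closed form over [0, 9]
t_end, n = 9.0, 90_000
dt = t_end / n
numeric = sum(rate((i + 0.5) * dt) * dt for i in range(n))
print(round(expected_count(t_end), 4), round(numeric, 4))
```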

  7. Anomalies in oil and water wells and the Tangshan earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Wang, W.

    1980-01-01

    Bin County, Shandong Province, has a complicated fault structure resulting from the interaction of a number of fault blocks. An examination of the behavior of oil wells in various oilfields located in faulting areas showed anomalies in 7 of them related to the Tangshan earthquake. Three wells (Nos. 88, 101, and 102) showed sharp peaks in output within a month before the earthquake. One well (No. 278) showed a sharp peak in the oil-gas ratio in April and July of 1976. There was a sharp increase in the water content of the oil produced by one well (No. 285) in July. Finally, one well (4-Xi4-10) showed a decrease in the rate of change of static pressure, starting in March 1976 and achieving a plateau in June which persisted until October before the static pressure again began to change more rapidly.

  8. Earthquake predictions using seismic velocity ratios

    Science.gov (United States)

    Sherburne, R. W.

    1979-01-01

    Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions in reducing human and economic losses, and the value of long-range earthquake prediction to planning, is obvious. Not as clear are the long-range economic and social impacts of earthquake prediction on a specific area. The general consensus among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency. 

  9. Measuring the size of an earthquake

    Science.gov (United States)

    Spence, W.; Sipkin, S.A.; Choy, G.L.

    1989-01-01

    Earthquakes range broadly in size. A rock-burst in an Idaho silver mine may involve the fracture of 1 meter of rock; the 1965 Rat Island earthquake in the Aleutian arc involved a 650-kilometer length of the Earth's crust. Earthquakes can be even smaller and even larger. If an earthquake is felt or causes perceptible surface damage, then its intensity of shaking can be subjectively estimated. But many large earthquakes occur in oceanic areas or at great focal depths and are either simply not felt or their felt pattern does not really indicate their true size.
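
One standard quantitative complement to subjective felt intensity is the moment magnitude computed from seismic moment. A minimal sketch using the standard Hanks-Kanamori relation; the example moment values are illustrative:

```python
# Sketch: a standard quantitative size measure is the seismic moment M0 and
# the moment magnitude derived from it (Hanks-Kanamori relation, M0 in N*m):
#   Mw = (2/3) * (log10(M0) - 9.1)
# The example moment values below are illustrative.
import math

def moment_magnitude(m0):
    """Moment magnitude Mw for a seismic moment m0 in newton-meters."""
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

def seismic_moment(mw):
    """Inverse relation: seismic moment in N*m for a given Mw."""
    return 10.0 ** (1.5 * mw + 9.1)

print(round(moment_magnitude(1.26e19), 1))   # a moderate event, ~Mw 6.7
print(f"{seismic_moment(8.7):.2e}")          # a great-earthquake-scale moment
```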

  10. Earthquakes-Rattling the Earth's Plumbing System

    Science.gov (United States)

    Sneed, Michelle; Galloway, Devin L.; Cunningham, William L.

    2003-01-01

    Hydrogeologic responses to earthquakes have been known for decades, and have occurred both close to, and thousands of miles from earthquake epicenters. Water wells have become turbid, dry or begun flowing, discharge of springs and ground water to streams has increased and new springs have formed, and well and surface-water quality have become degraded as a result of earthquakes. Earthquakes affect our Earth’s intricate plumbing system—whether you live near the notoriously active San Andreas Fault in California, or far from active faults in Florida, an earthquake near or far can affect you and the water resources you depend on.

  11. QuakeUp: An advanced tool for a network-based Earthquake Early Warning system

    Science.gov (United States)

    Zollo, Aldo; Colombelli, Simona; Caruso, Alessandro; Elia, Luca; Brondi, Piero; Emolo, Antonio; Festa, Gaetano; Martino, Claudio; Picozzi, Matteo

    2017-04-01

    predicted P-wave amplitude at a dense spatial grid, including the nodes of the accelerometer/velocimeter array deployed in the earthquake source area. Within times of the order of ten seconds from the earthquake origin, the information about the area where moderate to strong ground shaking is expected to occur can be sent to inner and outer sites, allowing the activation of emergency measures to protect people, secure industrial facilities and optimize the site resilience after the disaster. Depending on the network density and spatial source coverage, this method naturally accounts for effects related to the earthquake rupture extent (e.g. source directivity) and spatial variability of strong ground motion related to crustal wave propagation and site amplification. In QuakeUp, the P-wave parameters are continuously measured, using progressively expanded P-wave time windows, and providing evolutionary and reliable estimates of the ground shaking distribution, especially in the case of very large events. Furthermore, to minimize the S-wave contamination on the P-wave signal portion, an efficient algorithm, based on the real-time polarization analysis of the three-component seismogram, for the automatic detection of the S-wave arrival time has been included. The final output of QuakeUp will be an automatic alert message that is transmitted to sites to be secured during the earthquake emergency. The message contains all relevant information about the expected potential damage at the site and the time available for security actions (lead-time) after the warning. A global view of the system performance during and after the event (in play-back mode) is obtained through an end-user visual display, where the most relevant pieces of information will be displayed and updated as soon as new data are available. 
    The software platform QuakeUp is essentially aimed at improving the reliability and the accuracy in terms of parameter estimation, minimizing the uncertainties in the

  12. Earthquake scaling laws for rupture geometry and slip heterogeneity

    Science.gov (United States)

    Thingbaijam, Kiran K. S.; Mai, P. Martin; Goda, Katsuichiro

    2016-04-01

    We analyze an extensive compilation of finite-fault rupture models to investigate earthquake scaling of source geometry and slip heterogeneity to derive new relationships for seismic and tsunami hazard assessment. Our dataset comprises 158 earthquakes with a total of 316 rupture models selected from the SRCMOD database (http://equake-rc.info/srcmod). We find that fault-length does not saturate with earthquake magnitude, while fault-width reveals inhibited growth due to the finite seismogenic thickness. For strike-slip earthquakes, fault-length grows more rapidly with increasing magnitude compared to events of other faulting types. Interestingly, our derived relationship falls between the L-model and W-model end-members. In contrast, both reverse and normal dip-slip events are more consistent with self-similar scaling of fault-length. However, fault-width scaling relationships for large strike-slip and normal dip-slip events, occurring on steeply dipping faults (δ~90° for strike-slip faults, and δ~60° for normal faults), deviate from self-similarity. Although reverse dip-slip events in general show self-similar scaling, the restricted growth of down-dip fault extent (with upper limit of ~200 km) can be seen for mega-thrust subduction events (M~9.0). Despite this fact, for a given earthquake magnitude, subduction reverse dip-slip events occupy relatively larger rupture area, compared to shallow crustal events. In addition, we characterize slip heterogeneity in terms of its probability distribution and spatial correlation structure to develop a complete stochastic random-field characterization of earthquake slip. We find that truncated exponential law best describes the probability distribution of slip, with observable scale parameters determined by the average and maximum slip. Applying Box-Cox transformation to slip distributions (to create quasi-normal distributed data) supports cube-root transformation, which also implies distinctive non-Gaussian slip
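
The truncated exponential slip distribution reported above can be sampled directly by inverse-CDF transformation. A sketch with illustrative scale and truncation parameters, not values fitted to the SRCMOD models:

```python
# Sketch: drawing slip values from a truncated exponential distribution, the
# form the abstract reports as the best fit for on-fault slip. The scale
# lambda_ and truncation point d_max are illustrative stand-ins for the
# parameters the authors tie to the average and maximum slip.
import math, random

def sample_truncated_exponential(lambda_, d_max, rng):
    """Inverse-CDF sample from f(x) ~ exp(-x/lambda_) on [0, d_max]."""
    u = rng.random()
    c = 1.0 - math.exp(-d_max / lambda_)
    return -lambda_ * math.log(1.0 - u * c)

rng = random.Random(0)
slips = [sample_truncated_exponential(2.0, 8.0, rng) for _ in range(50_000)]
print(round(max(slips), 2), round(sum(slips) / len(slips), 2))
```

The truncation guarantees no sample exceeds d_max, while the sample mean stays close to the analytic mean of the truncated law (about 1.85 for these parameters).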

  13. Summary of earthquake experience database

    International Nuclear Information System (INIS)

    1999-01-01

    Strong-motion earthquakes frequently occur throughout the Pacific Basin, where power plants or industrial facilities are included in the affected areas. By studying the performance of these earthquake-affected (or database) facilities, a large inventory of various types of equipment installations can be compiled that have experienced substantial seismic motion. The primary purposes of the seismic experience database are summarized as follows: to determine the most common sources of seismic damage, or adverse effects, on equipment installations typical of industrial facilities; to determine the thresholds of seismic motion corresponding to various types of seismic damage; to determine the general performance of equipment during earthquakes, regardless of the levels of seismic motion; to determine minimum standards in equipment construction and installation, based on past experience, to assure the ability to withstand anticipated seismic loads. To summarize, the primary assumption in compiling an experience database is that the actual seismic hazard to industrial installations is best demonstrated by the performance of similar installations in past earthquakes.

  14. Earthquake design for controlled structures

    Directory of Open Access Journals (Sweden)

    Nikos G. Pnevmatikos

    2017-04-01

    An alternative design philosophy for structures equipped with control devices, capable of resisting an expected earthquake while remaining in the elastic range, is described. The idea is that a portion of the earthquake loading is undertaken by the control system and the remainder by the structure, which is designed to resist elastically. The earthquake forces assuming elastic behavior (elastic forces) and elastoplastic behavior (design forces) are first calculated according to the codes. The required control forces are calculated as the difference between the elastic and design forces. The maximum capacity of the control devices is then compared to the required control force. If the capacity of the control devices is larger than the required control force, then the control devices are accepted and installed in the structure, and the structure is designed according to the design forces. If the capacity is smaller than the required control force, then a scale factor, α, reducing the elastic forces to new design forces is calculated. The structure is redesigned and the devices are installed. The proposed procedure ensures that the structure behaves elastically (without damage) for the expected earthquake at no additional cost beyond that of buying and installing the control devices.
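
The design check described above can be sketched in a few lines. The function names and force values are illustrative, not from the paper; the scale factor here is taken as the ratio that leaves the devices exactly at capacity:

```python
# Sketch of the procedure described above: the control system takes the
# difference between the elastic and code design forces; if device capacity
# falls short, a factor alpha rescales the elastic forces to a new design
# force the structure must resist elastically. Names/values are illustrative.

def design_with_control(elastic_force, design_force, device_capacity):
    """Return (alpha, force the structure is designed for)."""
    needed = elastic_force - design_force     # required control force
    if device_capacity >= needed:
        return 1.0, design_force              # devices take the full difference
    # undersized devices: structure must elastically resist the remainder
    alpha = (elastic_force - device_capacity) / elastic_force
    return alpha, alpha * elastic_force

print(design_with_control(1000.0, 400.0, 700.0))  # capacity sufficient
print(design_with_control(1000.0, 400.0, 450.0))  # capacity short: alpha < 1
```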

  15. Using Smartphones to Detect Earthquakes

    Science.gov (United States)

    Kong, Q.; Allen, R. M.

    2012-12-01

    We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may serve as a supplemental network to the current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation of sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones to record acceleration in real time. These records can be saved on the phone or transmitted back to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high-quality accelerometer on controlled shake tables for a variety of tests. The results show that the accelerometer in a smartphone can reproduce the characteristics of the shaking very well, even when the phone is left freely on the shake table. The nature of these datasets is also quite different from traditional networks because smartphones move around with their owners. Therefore, we must distinguish earthquake signals from those of everyday activities. In addition to the shake table tests that accumulated earthquake records, we also recorded different human activities such as running, walking, and driving. An artificial neural network based approach was developed to distinguish these different records. It shows a 99.7% success rate in distinguishing earthquakes from the other typical human activities in our database. We are now ready to develop the basic infrastructure for a smartphone seismic network.
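
The classification step can be illustrated with simple waveform features and a nearest-centroid rule. This is a toy stand-in: the authors used an artificial neural network, and the signals below are synthetic:

```python
# Toy stand-in for the classification step (the paper used an artificial
# neural network): two simple waveform features and a nearest-centroid rule.
# All signals here are synthetic; real phone records would replace them.
import math, random

def features(samples):
    """RMS amplitude and mean absolute sample-to-sample change."""
    rms = (sum(s * s for s in samples) / len(samples)) ** 0.5
    jerk = sum(abs(b - a) for a, b in zip(samples, samples[1:])) / (len(samples) - 1)
    return (rms, jerk)

def nearest_centroid(x, centroids):
    """Label of the centroid closest to feature vector x."""
    return min(centroids, key=lambda label: math.dist(x, centroids[label]))

rng = random.Random(1)
walking = [2.0 * math.sin(0.2 * i) for i in range(200)]   # smooth, periodic
shaking = [rng.gauss(0, 1) for _ in range(200)]           # broadband noise
centroids = {"walking": features(walking), "earthquake": features(shaking)}

unseen = [rng.gauss(0, 1) for _ in range(200)]            # another noise burst
print(nearest_centroid(features(unseen), centroids))
```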

  16. Explanation of earthquake response spectra

    OpenAIRE

    Douglas, John

    2017-01-01

    This is a set of five slides explaining how earthquake response spectra are derived from strong-motion records and simple models of structures and their purpose within seismic design and assessment. It dates from about 2002 and I have used it in various introductory lectures on engineering seismology.

  17. Data visualization

    CERN Document Server

    Azzam, Tarek

    2013-01-01

    Do you communicate data and information to stakeholders? In Part 1, we introduce recent developments in the quantitative and qualitative data visualization field and provide a historical perspective on data visualization, its potential role in evaluation practice, and future directions. Part 2 delivers concrete suggestions for optimally using data visualization in evaluation, as well as suggestions for best practices in data visualization design. It focuses on specific quantitative and qualitative data visualization approaches that include data dashboards, graphic recording, and geographic information systems (GIS). Readers will get a step-by-step process for designing an effective data dashboard system for programs and organizations, and various suggestions to improve their utility.

  18. New Empirical Earthquake Source‐Scaling Laws

    KAUST Repository

    Thingbaijam, Kiran Kumar S.

    2017-12-13

    We develop new empirical scaling laws for rupture width W, rupture length L, rupture area A, and average slip D, based on a large database of rupture models. The database incorporates recent earthquake source models in a wide magnitude range (M 5.4–9.2) and events of various faulting styles. We apply general orthogonal regression, instead of ordinary least-squares regression, to account for measurement errors of all variables and to obtain mutually self-consistent relationships. We observe that L grows more rapidly with M compared to W. The fault-aspect ratio (L/W) tends to increase with fault dip, which generally increases from reverse-faulting, to normal-faulting, to strike-slip events. At the same time, subduction-interface earthquakes have significantly higher W (hence a larger rupture area A) compared to other faulting regimes. For strike-slip events, the growth of W with M is strongly inhibited, whereas the scaling of L agrees with the L-model behavior (D correlated with L). However, at a regional scale for which seismogenic depth is essentially fixed, the scaling behavior corresponds to the W model (D not correlated with L). Self-similar scaling behavior with M − log A is observed to be consistent for all the cases, except for normal-faulting events. Interestingly, the ratio D/W (a proxy for average stress drop) tends to increase with M, except for shallow crustal reverse-faulting events, suggesting the possibility of scale-dependent stress drop. The observed variations in source-scaling properties for different faulting regimes can be interpreted in terms of geological and seismological factors. We find substantial differences between our new scaling relationships and those of previous studies. Therefore, our study provides critical updates on source-scaling relations needed in seismic–tsunami-hazard analysis and engineering applications.
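
    The contrast between ordinary least squares and general orthogonal regression that motivates the study's method choice can be sketched in a few lines. The data below are synthetic and illustrative (a slope-1 self-similar trend with equal measurement errors on both variables), not the paper's fitted coefficients; with equal error variances, total least squares via the first principal axis is equivalent to general orthogonal regression.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic magnitude / log-area pairs along an assumed self-similar trend
# log10(A) = M - 4 (illustrative slope and intercept, not the paper's fit)
M_true = rng.uniform(5.5, 9.0, 60)
logA_true = M_true - 4.0
M = M_true + rng.normal(0.0, 0.3, 60)       # measurement error in magnitude too
logA = logA_true + rng.normal(0.0, 0.3, 60)

# ordinary least squares: assumes error only in logA, so the slope is attenuated
slope_ols = np.polyfit(M, logA, 1)[0]

# general orthogonal regression via total least squares:
# with equal error variances, the slope is the first principal axis
# of the centered data cloud
Vt = np.linalg.svd(
    np.column_stack([M - M.mean(), logA - logA.mean()]), full_matrices=False
)[2]
slope_tls = Vt[0, 1] / Vt[0, 0]

print(f"OLS slope: {slope_ols:.3f}, orthogonal slope: {slope_tls:.3f}")
```

    The orthogonal slope recovers the underlying trend, while OLS is biased low whenever the predictor itself carries measurement error, which is exactly the situation with magnitudes and rupture dimensions.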

  19. Visual Literacy and Visual Thinking.

    Science.gov (United States)

    Hortin, John A.

    It is proposed that visual literacy be defined as the ability to understand (read) and use (write) images and to think and learn in terms of images. This definition includes three basic principles: (1) visuals are a language and thus analogous to verbal language; (2) a visually literate person should be able to understand (read) images and use…

  20. Visual Literacy and Visual Culture.

    Science.gov (United States)

    Messaris, Paul

    Familiarity with specific images or sets of images plays a role in a culture's visual heritage. Two questions can be asked about this type of visual literacy: Is this a type of knowledge that is worth building into the formal educational curriculum of our schools? What are the educational implications of visual literacy? There is a three-part…

  1. Development of an Android App for notification and reporting of natural disaster such as earthquakes and tsunamis

    Science.gov (United States)

    Richter, Steffen; Hammitzsch, Martin

    2013-04-01

    Disasters like the Tohoku tsunami in March 2011 and the earthquake in Haiti in January 2010 have shown clearly that rapid detection of possible negative impacts on population and infrastructure is crucial for the rapid organization of effective countermeasures. It has turned out that effective planning of relief and rescue measures requires both information provided by governmental authorities and feedback from the general public. Every citizen experiencing the events directly on site becomes a potential witness and can provide valuable information about the disaster. Citizens can use various information channels to communicate and share their experiences. During the last years, the crowdsourcing approach has gained the attention of users of modern communication and information systems. The term crowdsourcing describes the interactive collaboration of voluntary users on the Internet working on a common topic. A related approach, mobile crowdsourcing, has evolved in the quickly growing community of smartphone users: crowdsourcing platforms provide additional application scenarios for modern smartphones. Smartphone users are enabled to compose and share reports immediately at the scene of the disaster. A growing number of modern smartphones also include sensors for taking pictures and determining the current geographical position. This additional content can significantly enhance the value of a disaster event report. The project Collaborative, Complex, and Critical Decision-Support in Evolving Crises (TRIDEC), co-funded by the European Commission in its Seventh Framework Programme, is focused on the management of crisis situations. Part of the project is the development of an application for the Android smartphone platform. This application provides access to a continuously updated situation report for current natural disasters like earthquakes and tsunamis based on incoming crowdsourced reports. The App is used to immediately send

  2. Study on evaluating method for earthquake resisting performance of steel piers; Kosei kyokyaku no taishinsei ni kansuru kenkyu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Isoe, A.; Hashimoto, Y.; Morimoto, C.; Sakoda, H.; Ishige, T.; Yoshikawa, T.; Kishida, K. [Kawasaki Heavy Industries, Ltd., Kobe (Japan)

    1998-12-20

    After the shock of the Hyogoken Nanbu Earthquake in 1995, protection against level 2 earthquakes became an important subject for civil structures. Subsequently, plastic design methods for steel piers have been studied and rapidly introduced. The authors developed a method to evaluate the earthquake resisting performance of a steel pier with a single-mass model. This model is useful for design because of its simplicity, but it cannot account for the spatial interaction effects between piers. To include this effect in an analysis, a simple 3-dimensional calculation model of a box-column pier is developed. (author)

  3. Rapid Response Fault Drilling Past, Present, and Future

    Directory of Open Access Journals (Sweden)

    Demian M. Saffer

    2009-09-01

    Full Text Available New information about large earthquakes can be acquired by drilling into the fault zone quickly following a large seismic event. Specifically, we can learn about the levels of friction and strength of the fault which determine the dynamic rupture, monitor the healing process of the fault, record the stress changes that trigger aftershocks, and capture important physical and chemical properties of the fault that control the rupture process. These scientific and associated technical issues were the focus of a three-day workshop on Rapid Response Fault Drilling: Past, Present, and Future, sponsored by the International Continental Scientific Drilling Program (ICDP) and the Southern California Earthquake Center (SCEC). The meeting drew together forty-four scientists representing ten countries in Tokyo, Japan during November 2008. The group discussed the scientific problems and how they could be addressed through rapid response drilling. Focused talks presented previous work on drilling after large earthquakes and in fault zones in general, as well as the state of the art of experimental techniques and measurement strategies. Detailed discussion weighed the tradeoffs between rapid drilling and the ability to satisfy a diverse range of scientific objectives. Plausible drilling sites and scenarios were evaluated. This is a shortened summary of the workshop report that discusses key scientific questions, measurement strategies, and recommendations. This report can provide a starting point for quickly mobilizing a drilling program following future large earthquakes. The full report can be seen at http://www.pmc.ucsc.edu/~rapid/.

  4. A Method for Estimation of Death Tolls in Disastrous Earthquake

    Science.gov (United States)

    Pai, C.; Tien, Y.; Teng, T.

    2004-12-01

    Fatality tolls are among the most important measures of the damage and losses caused by a disastrous earthquake. If we can precisely estimate the potential tolls and the distribution of fatalities in individual districts as soon as an earthquake occurs, this not only makes emergency programs and disaster management more effective but also supplies critical information for planning and managing the disaster and for allotting rescue manpower and medical resources in a timely manner. In this study, we estimate the death tolls caused by the Chi-Chi earthquake in individual districts based on the Attributive Database of Victims, population data, digital maps, and Geographic Information Systems. In general, many factors are involved, including the characteristics of the ground motions, geological conditions, the types and usage habits of buildings, the distribution of population, and social-economic situations; all are related to the damage and losses induced by a disastrous earthquake. The density of seismic stations in Taiwan is currently the greatest in the world. In the meantime, it is easy to obtain complete seismic data from the Central Weather Bureau's earthquake rapid-reporting systems, mostly within about a minute or less after an earthquake happens. Therefore, it becomes possible to estimate death tolls caused by an earthquake in Taiwan based on this preliminary information. Firstly, we form the arithmetic mean of the three components of the Peak Ground Acceleration (PGA) to give the PGA Index for each individual seismic station, according to the mainshock data of the Chi-Chi earthquake. We then use the PGA Index and the geographical coordinates of individual seismic stations, together with the Kriging interpolation method and GIS software, to supply the distribution of iso-seismic intensity contours in all districts and to resolve the problem of districts that contain no seismic station. The population density depends on
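
    The PGA Index construction described in this abstract can be sketched as follows. Station names, coordinates, and peak values are hypothetical, and inverse-distance weighting is used here as a simple stand-in for the Kriging interpolation applied in the study to districts without a station.

```python
import numpy as np

# hypothetical stations: (longitude, latitude) and peak absolute acceleration
# (gal) of the three components (E-W, N-S, U-D); all values are invented
stations = {
    "TCU": {"xy": (120.7, 23.9), "peaks": (983.0, 763.0, 519.0)},
    "CHY": {"xy": (120.5, 23.6), "peaks": (640.0, 752.0, 418.0)},
    "NST": {"xy": (121.0, 24.7), "peaks": (120.0, 131.0, 60.0)},
}

# PGA Index: arithmetic mean of the three component peaks at each station
pga_index = {name: sum(s["peaks"]) / 3.0 for name, s in stations.items()}

def interpolate(xy, power=2.0):
    """Inverse-distance-weighted PGA Index at a point with no station
    (a simple stand-in for the Kriging interpolation used in the study)."""
    num = den = 0.0
    for name, s in stations.items():
        d = np.hypot(xy[0] - s["xy"][0], xy[1] - s["xy"][1])
        weight = 1.0 / max(d, 1e-6) ** power
        num += weight * pga_index[name]
        den += weight
    return num / den

print(f"PGA Index at TCU: {pga_index['TCU']:.1f} gal")
print(f"interpolated at (120.6, 23.8): {interpolate((120.6, 23.8)):.1f} gal")
```

    Any such interpolation yields values bounded by the surrounding station indices, which is what lets the method fill in districts that lack their own station.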

  5. Napa earthquake: An earthquake in a highly connected world

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

    The Napa earthquake recently occurred close to Silicon Valley. This makes it a good candidate for studying what social networks, wearable objects, and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the ratio of people publishing tweets and the ratio of people visiting the EMSC (European Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves and that 2 minutes of website traffic is sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare these with the publication times of messages on Twitter. Finally, we check whether the number of tweets and the number of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together these results will tell us whether the reaction of eyewitnesses to ground shaking as observed through Twitter and the EMSC website analysis is tool-specific (i.e. specific to Twitter or the EMSC website) or whether it reflects people's actual reactions.

  6. Spectral characteristics of natural and artificial earthquakes in the Lop Nor test site, China

    International Nuclear Information System (INIS)

    Korrat, I.M.; Gharib, A.A.; Abou Elenean, K.A.; Hussein, H.M.; ElGabry, M.N.

    2007-12-01

    Seismic discriminants based on the spectral seismogram and spectral magnitude techniques have been tested to discriminate among three events: a nuclear explosion which took place at Lop Nor, China, with m b 6.1, and two earthquakes from the closest area with m b 5.5 and 5.3, respectively. The spectral seismograms of the three events show that the frequency content of the nuclear explosion differs from that of the earthquakes: the P wave of the nuclear explosion is richer in high-frequency content than those of the corresponding earthquakes. It is also observed that the energy decays much more rapidly for the nuclear explosion than for the earthquakes. Furthermore, the spectral magnitudes reveal significant differences between the spectra of the nuclear explosion and of the two earthquakes. These observed differences appear to be sufficient to provide a reliable discriminant. The stress drop estimated from the magnitude spectra indicates a higher stress drop for the nuclear explosion relative to the earthquakes of the same tectonic region. (author)
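
    The high-frequency enrichment that underpins this kind of discriminant can be illustrated with a toy spectral-ratio check on synthetic P waves. The corner frequencies, sampling rate, and omega-squared spectral shaping below are assumptions chosen for illustration, not values from the study.

```python
import numpy as np

fs = 100.0                     # assumed sampling rate (Hz)
t = np.arange(0, 20, 1 / fs)

def toy_p_wave(fc):
    """Band-limited pulse: white noise shaped by a Brune-style
    omega-squared falloff above corner frequency fc, with a decaying coda."""
    rng = np.random.default_rng(42)        # same noise for a fair comparison
    spec = np.fft.rfft(rng.standard_normal(t.size))
    f = np.fft.rfftfreq(t.size, 1 / fs)
    spec *= 1.0 / (1.0 + (f / fc) ** 2)
    return np.exp(-t / 5.0) * np.fft.irfft(spec, t.size)

def hf_lf_ratio(x, split=5.0):
    """Ratio of spectral power above vs below a split frequency (Hz)."""
    f = np.fft.rfftfreq(x.size, 1 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    return power[f >= split].sum() / power[(f > 0) & (f < split)].sum()

explosion_like = toy_p_wave(fc=8.0)   # higher corner: richer high frequencies
quake_like = toy_p_wave(fc=2.0)

print(f"explosion-like HF/LF ratio: {hf_lf_ratio(explosion_like):.3f}")
print(f"earthquake-like HF/LF ratio: {hf_lf_ratio(quake_like):.3f}")
```

    The higher ratio for the explosion-like signal mirrors the observation that the explosion's P wave is richer in high-frequency content than the earthquakes'.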

  7. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    Science.gov (United States)

    Kanamori, H

    1996-04-30

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.

  8. Fluid flows due to earthquakes with reference to Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Davies, J.B.

    1993-01-01

    Yucca Mountain geohydrology is dominated by a deep water table in volcanic tuff beds which are cut by numerous faults. Certain zones in these tuffs and most of the fault apertures are filled with a fine-grained calcitic cement. Earthquakes have occurred in this region, the most recent being of magnitude 5.6 at a distance of about 20 km. Earthquakes in the western U.S.A. have been observed to cause fluid flows through and out of the Earth's crust. These flows are concentrated along the faults, with normal faulting producing the largest flows. An earthquake produces rapid pressure changes at and below the ground surface, thereby forcing flows of gas, water, slurries, and dissolved salts. In order to examine the properties of flows produced by earthquakes, we simulate the phenomena using computer-based modeling. We investigate the effects of faults and high-permeability zones on the pattern of flows induced by the earthquake. We demonstrate that faults act as conduits to the surface and that the higher the permeability of a zone, the more the flows will concentrate there. Numerical estimates of flow rates from these simulations compare favorably with data from observed flows due to earthquakes. Simple volumetric arguments demonstrate the ease with which fluids from the deep water table can reach the surface along fault conduits

  9. Countermeasures to earthquakes in nuclear plants

    International Nuclear Information System (INIS)

    Sato, Kazuhide

    1979-01-01

    The contribution of atomic energy to mankind is immeasurable, but the danger of radioactivity demands special care. Therefore, safety has been regarded as paramount in the design of nuclear power plants, and in Japan, where earthquakes occur frequently, countermeasures to earthquakes have naturally been incorporated into the examination of safety. The radioactive substances handled in nuclear power stations and spent fuel reprocessing plants are briefly explained. The occurrence of earthquakes cannot be predicted effectively, and the disaster due to earthquakes is apt to be remarkably large. In nuclear plants, the prevention of damage to the facilities and the maintenance of their functions are required at the time of earthquakes. Regarding the location of nuclear plants, the history of earthquakes, the possible magnitude of earthquakes, the properties of the ground, and the position of the plants should be examined. After the place of installation has been decided, the earthquake used for design is selected by evaluating active faults and determining the standard earthquakes. As the fundamentals of aseismic design, the classification according to importance, the design earthquakes corresponding to the classes of importance, the combination of loads, and the allowable stresses are explained. (Kako, I.)

  10. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; m b = 5.8) remains, 25 years later, one of the most painful events etched into Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead, 10,000 injured, and 3000 families left homeless). Nowadays, the most frequent and important question that arises is "what if this earthquake were repeated today?" In this study, we simulate the ground-motion shaking of an earthquake of the same size as the 12 October 1992 event and the consequent socio-economic impacts in terms of losses and damages. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the risk assessment clearly indicates that the losses and damages could be two to three times greater in Cairo than in the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, while three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic-risk level. Moreover, the building damage estimates indicate that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb). Moreover, about 75 % of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  11. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    International Nuclear Information System (INIS)

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19 September 2006. Six factors for the evaluation of earthquake vibration are considered on the basis of recent earthquakes: 1) evaluation of earthquake vibration by a method using a fault model, 2) investigation and approval of active faults, 3) direct-hit earthquakes, 4) assumption of a short active fault as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) remaining risk. A guiding principle of the revision required a new method of evaluating earthquake vibration using fault models, and evaluation of the probability of earthquake vibration. The remaining risk means that facilities and people are endangered when an earthquake stronger than the design earthquake occurs; accordingly, scatter has to be considered in the evaluation of earthquake vibration. The earthquake belt and strong vibration pulse of the 1995 Hyogo-Nanbu earthquake, the relation between the length of a surface earthquake fault and the hypocentral fault, and the distribution of seismic intensity off Kushiro in 1993 are shown. (S.Y.)

  12. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    Science.gov (United States)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  13. Visualization Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Evaluates and improves the operational effectiveness of existing and emerging electronic warfare systems. By analyzing and visualizing simulation results...

  14. Distributed Visualization

    Data.gov (United States)

    National Aeronautics and Space Administration — Distributed Visualization allows anyone, anywhere, to see any simulation, at any time. Development focuses on algorithms, software, data formats, data systems and...

  15. Storytelling and Visualization: An Extended Survey

    OpenAIRE

    Chao Tong; Richard Roberts; Rita Borgo; Sean Walton; Robert S. Laramee; Kodzo Wegba; Aidong Lu; Yun Wang; Huamin Qu; Qiong Luo; Xiaojuan Ma

    2018-01-01

    Throughout history, storytelling has been an effective way of conveying information and knowledge. In the field of visualization, storytelling is rapidly gaining momentum, and cutting-edge techniques that enhance understanding are evolving. Many communities have commented on the importance of storytelling in data visualization. Storytellers are integrating complex visualizations into their narratives in growing numbers. In this paper, we present a survey of storytelling literature in visual...

  16. Renormalization group theory of earthquakes

    Directory of Open Access Journals (Sweden)

    H. Saleur

    1996-01-01

    Full Text Available We study theor