WorldWideScience

Sample records for parkfield ca earthquake

  1. The October 1992 Parkfield, California, earthquake prediction

    Science.gov (United States)

    Langbein, J.

    1992-01-01

    A magnitude 4.7 earthquake occurred near Parkfield, California, on October 20, 1992, at 05:28 UTC (October 19 at 10:28 p.m. local or Pacific Daylight Time). This moderate shock, interpreted as a potential foreshock of a damaging earthquake on the San Andreas fault, triggered long-standing federal, state, and local government plans to issue a public warning of an imminent magnitude 6 earthquake near Parkfield. Although the predicted earthquake did not take place, sophisticated suites of instruments deployed as part of the Parkfield Earthquake Prediction Experiment recorded valuable data associated with an unusual series of events. This article describes the geological aspects of these events, which occurred near Parkfield in October 1992. The accompanying article, an edited version of a press conference by Richard Andrews, the Director of the California Office of Emergency Services (OES), describes the governmental response to the prediction.

  2. The USGS plan for short-term prediction of the anticipated Parkfield earthquake

    Science.gov (United States)

    Bakun, W.H.

    1988-01-01

    Aside from the goal of better understanding the Parkfield earthquake cycle, it is the intention of the U.S. Geological Survey to attempt to issue a warning shortly before the anticipated earthquake. Although short-term earthquake warnings are not yet generally feasible, the wealth of information available for the previous significant Parkfield earthquakes suggests that if the next earthquake follows the pattern of "characteristic" Parkfield shocks, such a warning might be possible. Focusing on earthquake precursors reported for the previous "characteristic" shocks, particularly the 1934 and 1966 events, the USGS developed a plan in late 1985 on which to base earthquake warnings for Parkfield and has assisted state, county, and local officials in the Parkfield area to prepare a coordinated, reasonable response to a warning, should one be issued.

  3. Acceleration and volumetric strain generated by the Parkfield 2004 earthquake on the GEOS strong-motion array near Parkfield, California

    Science.gov (United States)

    Borcherdt, Rodger D.; Johnston, Malcolm J.S.; Dietel, Christopher; Glassmoyer, Gary; Myren, Doug; Stephens, Christopher

    2004-01-01

    An integrated array of 11 General Earthquake Observation System (GEOS) stations installed near Parkfield, CA provided on-scale, broadband, wide-dynamic-range measurements of acceleration and volumetric strain for the Parkfield earthquake (M 6.0) of September 28, 2004. Three-component measurements of acceleration were obtained at each of the stations. Measurements of collocated acceleration and volumetric strain were obtained at four of the stations. Measurements of velocity at most sites were on scale only for the initial P-wave arrival. When considered in the context of the extensive set of strong-motion recordings obtained on more than 40 analog stations by the California Strong-Motion Instrumentation Program (Shakal et al., 2004, http://www.quake.ca.gov/cisn-edc) and those on the dense array of Spudich et al. (1988), these recordings provide an unprecedented document of the nature of the near-source strong motion generated by a M 6.0 earthquake. The data set reported herein provides the most extensive set of near-field, broadband, wide-dynamic-range measurements of acceleration and volumetric strain for an earthquake as large as M 6 of which the authors are aware. As a result, considerable interest has been expressed in these data. This report is intended to describe the data and facilitate their use in resolving a number of scientific and engineering questions concerning earthquake rupture processes and the resultant near-field motions and strains. This report provides a description of the array, its scientific objectives, and the strong-motion recordings obtained of the main shock. The report provides copies of the uncorrected and corrected data. Copies of the inferred velocities, displacements, and pseudo-velocity response spectra are provided. Digital versions of these recordings are accessible with information available through the internet at several locations: the National Strong-Motion Program web site (http://agram.wr.usgs.gov/), the COSMOS Virtual Data Center Web site
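
    The pseudo-velocity response spectra mentioned above are obtained by driving a damped single-degree-of-freedom oscillator with the recorded ground acceleration and scaling the peak relative displacement by the oscillator frequency. A minimal sketch of that computation follows; it is illustrative only (not the USGS processing code), and the 5% damping, sampling interval, and synthetic input are assumptions.

      import numpy as np
      from scipy.signal import lsim

      def pseudo_velocity_spectrum(acc, dt, periods, damping=0.05):
          """Pseudo-velocity response spectrum of a ground-acceleration record."""
          t = np.arange(len(acc)) * dt
          psv = []
          for T in periods:
              wn = 2.0 * np.pi / T
              # SDOF oscillator: x'' + 2*z*wn*x' + wn^2*x = -acc(t); x is relative displacement
              _, x, _ = lsim(([-1.0], [1.0, 2.0 * damping * wn, wn ** 2]), acc, t)
              psv.append(wn * np.max(np.abs(x)))   # pseudo-velocity = wn * spectral displacement
          return np.array(psv)

      # synthetic noise standing in for a recording (placeholder input)
      acc = 0.1 * np.random.randn(20000)
      print(pseudo_velocity_spectrum(acc, dt=0.005, periods=np.logspace(-1, 1, 30)))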

  4. Long term electromagnetic monitoring at Parkfield, CA

    Science.gov (United States)

    Kappler, Karl Neil

    Electric and magnetic fields in the 10^-4 to 1.0 Hz band were monitored at two sites adjacent to the San Andreas Fault near Parkfield and Hollister, California. The observed fields typically comprise natural magnetotelluric fields plus cultural and instrument noise. A data window (2002-2005) enclosing the September 28, 2004 M6 Parkfield earthquake was analyzed to determine whether anomalous electric or magnetic fields, or changes in ground conductivity, occurred before the earthquake. The data were edited to remove intervals of instrument malfunction, leaving 875 days in the four-year period. Frequent, local spike-like disturbances were removed; the distribution of these spikes was not biased around the time of the earthquake. Signal-to-noise ratios, estimated via magnetotelluric processing techniques, provided an index of data quality. Plots of signal and noise amplitude spectra showed the behavior of the ULF fields to be remarkably constant over the period of analysis. From these first-order plots, it is clear that most of the recorded energy is coherent over the spatial extent of the array. Three main statistical techniques were employed to separate local anomalous electric or magnetic fields from the dominant coherent natural fields: transfer function estimates between components at each site were used to subtract the dominant field and look deeper at the 'residual' fields; the data were decomposed into principal components to identify linear combinations of array channels that are maximally uncorrelated; and the technique of canonical coherences was employed to distinguish anomalous fields that are spatially broad from anomalies that occur at a single site only, and furthermore to distinguish anomalies present in both the electric and magnetic fields from those present in only one field type. Standard remote-reference apparent resistivity estimates were generated daily at Parkfield. Most of the variation was observed to be seasonal
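
    Of the three statistical techniques listed above, the principal-component step is the simplest to illustrate: the multi-channel electric and magnetic records are decomposed into mutually uncorrelated combinations, so that energy coherent across the array separates from single-channel noise. The sketch below is a generic SVD-based decomposition of that kind, not the author's processing code; the channel layout and toy data are assumptions.

      import numpy as np

      def principal_components(data):
          """Decompose (n_samples, n_channels) array recordings into uncorrelated components.

          Leading components capture energy coherent across the array; trailing
          components tend to isolate single-channel (local/instrumental) noise."""
          X = data - data.mean(axis=0)                 # remove channel means
          U, s, Vt = np.linalg.svd(X, full_matrices=False)
          return U * s, s, Vt                          # component series, singular values, channel weights

      # toy example: five channels sharing one source plus small independent noise
      rng = np.random.default_rng(0)
      shared = rng.standard_normal(10000)
      data = np.outer(shared, np.ones(5)) + 0.1 * rng.standard_normal((10000, 5))
      comps, svals, weights = principal_components(data)
      print(svals)            # one large singular value, four small ones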

  5. Searching for evidence of a preferred rupture direction in small earthquakes at Parkfield

    Science.gov (United States)

    Kane, D. L.; Shearer, P. M.; Allmann, B.; Vernon, F. L.

    2009-12-01

    Theoretical modeling of strike-slip ruptures along a bimaterial interface suggests that the interface will have a preferred rupture direction and will produce asymmetric ground motion (Shi and Ben-Zion, 2006). This could have widespread implications for earthquake source physics and for hazard analysis on mature faults because larger ground motions would be expected in the direction of rupture propagation. Studies have shown that many large global earthquakes exhibit unilateral rupture, but a consistently preferred rupture direction along faults has not been observed. Some researchers have argued that the bimaterial interface model does not apply to natural faults, noting that the rupture of the M 6 2004 Parkfield earthquake propagated in the opposite direction from previous M 6 earthquakes along that section of the San Andreas Fault (Harris and Day, 2005). We analyze earthquake spectra from the Parkfield area to look for evidence of consistent rupture directivity along the San Andreas Fault. We separate the earthquakes into spatially defined clusters and quantify the differences in high-frequency energy among earthquakes recorded at each station. Propagation path effects are minimized in this analysis because we compare earthquakes located within a small volume and recorded by the same stations. By considering a number of potential end-member models, we seek to determine if a preferred rupture direction is present among small earthquakes at Parkfield.
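
    One simple way to quantify the "differences in high-frequency energy among earthquakes recorded at each station" described above is a band-limited spectral energy ratio computed per event-station pair; events in the same cluster recorded at the same station can then be compared directly. The sketch below is an illustrative stand-in, not the authors' method; the frequency bands and Welch parameters are assumptions.

      import numpy as np
      from scipy.signal import welch

      def hf_lf_energy_ratio(trace, fs, lf=(2.0, 8.0), hf=(8.0, 32.0)):
          """Ratio of high- to low-frequency spectral energy for one seismogram."""
          f, pxx = welch(trace, fs=fs, nperseg=1024)
          lo_band = (f >= lf[0]) & (f < lf[1])
          hi_band = (f >= hf[0]) & (f < hf[1])
          return np.trapz(pxx[hi_band], f[hi_band]) / np.trapz(pxx[lo_band], f[lo_band])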

  6. Coherency analysis of accelerograms recorded by the UPSAR array during the 2004 Parkfield earthquake

    DEFF Research Database (Denmark)

    Konakli, Katerina; Kiureghian, Armen Der; Dreger, Douglas

    2014-01-01

    Spatial variability of near-fault strong motions recorded by the US Geological Survey Parkfield Seismograph Array (UPSAR) during the 2004 Parkfield (California) earthquake is investigated. Behavior of the lagged coherency for two horizontal and the vertical components is analyzed by separately...
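
    Lagged coherency measures how similar two array recordings remain, frequency by frequency, after the deterministic wave-passage delay between stations is removed. The fragment below sketches that idea with SciPy's Welch-averaged coherence as the spectral estimator; the alignment by a fixed sample lag and the segment length are simplifying assumptions, not the estimator used in the study.

      import numpy as np
      from scipy.signal import coherence

      def lagged_coherency(a, b, fs, lag_samples=0, nperseg=256):
          """Coherency of two accelerograms after removing a wave-passage lag (in samples)."""
          if lag_samples:
              b = np.roll(b, -lag_samples)             # crude alignment of station b onto station a
          f, gamma2 = coherence(a, b, fs=fs, nperseg=nperseg)
          return f, np.sqrt(gamma2)                    # coherency magnitude (not squared)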

  7. Spatial-temporal variation of low-frequency earthquake bursts near Parkfield, California

    Science.gov (United States)

    Wu, Chunquan; Guyer, Robert; Shelly, David R.; Trugman, D.; Frank, William; Gomberg, Joan S.; Johnson, P.

    2015-01-01

    Tectonic tremor (TT) and low-frequency earthquakes (LFEs) have been found in the deeper crust of various tectonic environments globally in the last decade. The spatial-temporal behaviour of LFEs provides insight into deep fault zone processes. In this study, we examine recurrence times from a 12-yr catalogue of 88 LFE families with ∼730 000 LFEs in the vicinity of the Parkfield section of the San Andreas Fault (SAF) in central California. We apply an automatic burst detection algorithm to the LFE recurrence times to identify the clustering behaviour of LFEs (LFE bursts) in each family. We find that the burst behaviours in the northern and southern LFE groups differ. Generally, the northern group has longer burst duration but fewer LFEs per burst, while the southern group has shorter burst duration but more LFEs per burst. The southern group LFE bursts are generally more correlated than the northern group, suggesting more coherent deep fault slip and relatively simpler deep fault structure beneath the locked section of the SAF. We also find that the 2004 Parkfield earthquake clearly increased the number of LFEs per burst and the average burst duration for both the northern and the southern groups, with a relatively larger effect on the northern group. This could be due to the weakness of the northern part of the fault, or to the northwesterly rupture direction of the Parkfield earthquake.
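
    The burst detection described above amounts to grouping the LFE recurrence times of each family and flagging episodes of closely spaced events. A minimal sketch of such a grouping follows; it is not the authors' algorithm, and the gap threshold and minimum burst size are assumed placeholders.

      import numpy as np

      def detect_bursts(event_times, max_gap=3600.0, min_events=5):
          """Group LFE origin times (seconds) into bursts.

          Consecutive events separated by less than max_gap belong to one burst;
          bursts with fewer than min_events are discarded.  Returns tuples of
          (start, end, n_events, duration)."""
          t = np.sort(np.asarray(event_times, dtype=float))
          bursts, start = [], 0
          for i in range(1, len(t) + 1):
              if i == len(t) or t[i] - t[i - 1] > max_gap:
                  if i - start >= min_events:
                      bursts.append((t[start], t[i - 1], i - start, t[i - 1] - t[start]))
                  start = i
          return bursts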

  8. Finite-Source Inversion for the 2004 Parkfield Earthquake using 3D Velocity Model Green's Functions

    Science.gov (United States)

    Kim, A.; Dreger, D.; Larsen, S.

    2008-12-01

    We determine finite-fault models of the 2004 Parkfield earthquake using 3D Green's functions. Because of the dense station coverage and the detailed 3D velocity structure model in this region, this earthquake provides an excellent opportunity to examine how the 3D velocity structure affects finite-fault inverse solutions. Various studies (e.g., Michael and Eberhart-Phillips, 1991; Thurber et al., 2006) indicate that there is a pronounced velocity contrast across the San Andreas Fault along the Parkfield segment. Also, the fault zone at Parkfield is wide, as evidenced by mapped surface faults and by where surface slip and creep occurred in the 1966 and 2004 Parkfield earthquakes. For high-resolution images of the rupture process it is necessary to include the accurate 3D velocity structure in the finite-source inversion. Liu and Archuleta (2004) performed finite-fault inversions using both 1D and 3D Green's functions for the 1989 Loma Prieta earthquake using the same source parameterization and data but different Green's functions, and found that the models were quite different. This indicates that the choice of velocity model significantly affects the waveform modeling at near-fault stations. In this study, we used the P-wave velocity model developed by Thurber et al. (2006) to construct the 3D Green's functions. P-wave speeds are converted to S-wave speeds and density using the empirical relationships of Brocher (2005). Using a finite-difference method, E3D (Larsen and Schultz, 1995), we computed the 3D Green's functions numerically by inserting body forces at each station. Using reciprocity, these Green's functions are recombined to represent the ground motion at each station due to slip on the fault plane. First we modeled the waveforms of small earthquakes to validate the 3D velocity model and the reciprocity of the Green's functions. In the numerical tests we found that the 3D velocity model predicted the individual phases well at frequencies lower than 0
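
    The Vp-to-Vs and Vp-to-density conversions cited above (Brocher, 2005) are simple polynomial fits, so they are easy to reproduce. The sketch below uses the coefficients as they are commonly quoted from that paper; treat them as an assumption to be checked against the original reference, and note the fits are intended only for crustal velocities of roughly 1.5-8 km/s.

      def brocher_vs_from_vp(vp):
          """Vs (km/s) from Vp (km/s): Brocher (2005) regression fit (coefficients as commonly quoted)."""
          return 0.7858 - 1.2344 * vp + 0.7949 * vp**2 - 0.1238 * vp**3 + 0.0064 * vp**4

      def nafe_drake_density(vp):
          """Density (g/cm^3) from Vp (km/s): Nafe-Drake curve as fit by Brocher (2005)."""
          return 1.6612 * vp - 0.4721 * vp**2 + 0.0671 * vp**3 - 0.0043 * vp**4 + 0.000106 * vp**5

      print(brocher_vs_from_vp(6.0), nafe_drake_density(6.0))   # ~3.5 km/s, ~2.7 g/cm^3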

  9. Shallow deformation of the San Andreas fault 5 years following the 2004 Parkfield earthquake (Mw6) combining ERS2 and Envisat InSAR.

    Science.gov (United States)

    Bacques, Guillaume; de Michele, Marcello; Raucoules, Daniel; Aochi, Hideo; Rolandone, Frédérique

    2018-04-16

    This study focuses on the shallow deformation that occurred during the 5 years following the Parkfield earthquake (28/09/2004, Mw 6, San Andreas Fault, California). We use Synthetic Aperture Radar interferometry (InSAR) to provide precise measurements of transient deformations after the Parkfield earthquake between 2005 and 2010. We propose a method to combine both ERS2 and ENVISAT interferograms to increase the temporal data sampling. Firstly, we combine 5 years of available Synthetic Aperture Radar (SAR) acquisitions including both ERS-2 and Envisat. Secondly, we stack selected interferograms (both from ERS2 and Envisat) for measuring the temporal evolution of the ground velocities at given time intervals. Thanks to its high spatial resolution, InSAR could provide new insights on the surface fault motion behavior over the 5 years following the Parkfield earthquake. As a complement to previous studies in this area, our results suggest that shallow transient deformations affected the Creeping-Parkfield-Cholame sections of the San Andreas Fault after the 2004 Mw6 Parkfield earthquake.

  10. Possible deep fault slip preceding the 2004 Parkfield earthquake, inferred from detailed observations of tectonic tremor

    Science.gov (United States)

    Shelly, David R.

    2009-01-01

    Earthquake predictability depends, in part, on the degree to which sudden slip is preceded by slow aseismic slip. Recently, observations of deep tremor have enabled inferences of deep slow slip even when detection by other means is not possible, but these data are limited to certain areas and mostly the last decade. The region near Parkfield, California, provides a unique convergence of several years of high-quality tremor data bracketing a moderate earthquake, the 2004 magnitude 6.0 event. Here, I present detailed observations of tectonic tremor from mid-2001 through 2008 that indicate deep fault slip both before and after the Parkfield earthquake that cannot be detected with surface geodetic instruments. While there is no obvious short-term precursor, I find unidirectional tremor migration accompanied by elevated tremor rates in the 3 months prior to the earthquake, which suggests accelerated creep on the fault ∼16 km beneath the eventual earthquake hypocenter.

  11. Seismomagnetic effects from the long-awaited 28 September 2004 M 6.0 Parkfield earthquake

    Science.gov (United States)

    Johnston, M.J.S.; Sasai, Y.; Egbert, G.D.; Mueller, R.J.

    2006-01-01

    Precise measurements of local magnetic fields have been obtained with a differentially connected array of seven synchronized proton magnetometers located along 60 km of the locked-to-creeping transition region of the San Andreas fault at Parkfield, California, since 1976. The M 6.0 Parkfield earthquake of 28 September 2004 occurred within this array and generated coseismic magnetic field changes of between 0.2 and 0.5 nT at five sites in the network. No preseismic magnetic field changes exceeding background noise levels are apparent in the magnetic data during the month, week, and days before the earthquake (or expected in light of the absence of measurable precursory deformation, seismicity, or pore pressure changes). Observations of electric and magnetic fields from 0.01 to 20 Hz are also made at one site near the end of the earthquake rupture and corrected for common-mode signals from the ionosphere/magnetosphere using a second site some 115 km to the northwest along the fault. These magnetic data show no indications of unusual noise before the earthquake in the ULF band (0.01-20 Hz), as suggested may have preceded the 1989 ML 7.1 Loma Prieta earthquake. Nor do we see electric field changes similar to those suggested to occur before earthquakes of this magnitude from data in Greece. Uniform and variable slip piezomagnetic models of the earthquake, derived from strain, displacement, and seismic data, generate magnetic field perturbations that are consistent with those observed by the magnetometer array. A higher rate of longer-term magnetic field change, consistent with increased loading in the region, is apparent since 1993. This accompanied an increased rate of secular shear strain observed on a two-color EDM network and a small network of borehole tensor strainmeters, and increased seismicity dominated by three M 4.5-5 earthquakes roughly a year apart in 1992, 1993, and 1994. Models incorporating all of these data indicate increased slip at depth in the region

  12. Frequency-Dependent Tidal Triggering of Low Frequency Earthquakes Near Parkfield, California

    Science.gov (United States)

    Xue, L.; Burgmann, R.; Shelly, D. R.

    2017-12-01

    The effect of small periodic stress perturbations on earthquake generation is not clear; however, the rate of low-frequency earthquakes (LFEs) near Parkfield, California, has been found to be strongly correlated with the solid Earth tides. Laboratory experiments and theoretical analyses show that the period of imposed forcing and source properties affect the sensitivity to triggering and the phase relation of the peak seismicity rate and the periodic stress, but frequency-dependent triggering has not been quantitatively explored in the field. Tidal forcing acts over a wide range of frequencies, therefore the sensitivity to tidal triggering of LFEs provides a good probe of the physical mechanisms affecting earthquake generation. In this study, we consider the tidal triggering of LFEs near Parkfield, California since 2001. We find that the LFE rate is correlated with tidal shear stress, normal stress rate and shear stress rate. The occurrence of LFEs can also be independently modulated by groups of tidal constituents at semi-diurnal, diurnal and fortnightly frequencies. The strength of the response of LFEs to the different tidal constituents varies between LFE families. Each LFE family has an optimal triggering frequency, which does not appear to be depth dependent or systematically related to other known properties. This suggests the period of the applied forcing plays an important role in the triggering process, and the interaction of periods of loading history and source region properties, such as friction, effective normal stress and pore fluid pressure, produces the observed frequency-dependent tidal triggering of LFEs.
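
    A standard way to test whether event times are modulated by a single tidal constituent, in analyses of the kind described above, is the Schuster test: each LFE is assigned a phase within the constituent's cycle, and the null hypothesis of uniform phase is rejected when the vector sum of the phase terms is improbably large. The sketch below is illustrative and not the authors' code; how the phases are computed from the tidal stress time series is left as an assumption.

      import numpy as np

      def schuster_test(phases):
          """Schuster test p-value for periodic modulation of event times.

          phases : tidal phase (radians) of each LFE relative to one constituent.
          Small p-values indicate the events are not uniformly distributed in phase."""
          phases = np.asarray(phases)
          r2 = np.sum(np.cos(phases)) ** 2 + np.sum(np.sin(phases)) ** 2
          return np.exp(-r2 / len(phases))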

  13. Stability and uncertainty of finite-fault slip inversions: Application to the 2004 Parkfield, California, earthquake

    Science.gov (United States)

    Hartzell, S.; Liu, P.; Mendoza, C.; Ji, C.; Larson, K.M.

    2007-01-01

    The 2004 Parkfield, California, earthquake is used to investigate stability and uncertainty aspects of the finite-fault slip inversion problem with different a priori model assumptions. We utilize records from 54 strong ground motion stations and 13 continuous, 1-Hz sampled, geodetic instruments. Two inversion procedures are compared: a linear least-squares subfault-based methodology and a nonlinear global search algorithm. These two methods encompass a wide range of the different approaches that have been used to solve the finite-fault slip inversion problem. For the Parkfield earthquake and the inversion of velocity or displacement waveforms, near-surface related site response (top 100 m, frequencies above 1 Hz) is shown to not significantly affect the solution. Results are also insensitive to selection of slip rate functions with similar duration and to subfault size if proper stabilizing constraints are used. The linear and nonlinear formulations yield consistent results when the same limitations in model parameters are in place and the same inversion norm is used. However, the solution is sensitive to the choice of inversion norm, the bounds on model parameters, such as rake and rupture velocity, and the size of the model fault plane. The geodetic data set for Parkfield gives a slip distribution different from that of the strong-motion data, which may be due to the spatial limitation of the geodetic stations and the bandlimited nature of the strong-motion data. Cross validation and the bootstrap method are used to set limits on the upper bound for rupture velocity and to derive mean slip models and standard deviations in model parameters. This analysis shows that slip on the northwestern half of the Parkfield rupture plane from the inversion of strong-motion data is model dependent and has a greater uncertainty than slip near the hypocenter.
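
    The bootstrap step mentioned above, used to derive mean slip models and standard deviations, can be sketched for the linear (subfault-based) formulation: resample the data with replacement, re-invert, and collect the scatter of the recovered models. The fragment below is schematic, under a non-negative least-squares parameterization, with G and d as placeholder names for the Green's function matrix and data vector; it is not the authors' implementation.

      import numpy as np
      from scipy.optimize import nnls

      def bootstrap_slip(G, d, n_boot=200, seed=0):
          """Bootstrap mean and standard deviation of a linear, non-negative subfault slip inversion.

          G : (n_data, n_subfaults) Green's function matrix (placeholder)
          d : (n_data,) observed data samples (placeholder)"""
          rng = np.random.default_rng(seed)
          models = []
          for _ in range(n_boot):
              idx = rng.integers(0, len(d), len(d))    # resample data rows with replacement
              m, _ = nnls(G[idx], d[idx])
              models.append(m)
          models = np.array(models)
          return models.mean(axis=0), models.std(axis=0)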

  14. Strong-Motion Data From the Parkfield Earthquake of September 28, 2004

    Science.gov (United States)

    Shakal, A. F.; Borcherdt, R. D.; Graizer, V.; Haddadi, H.; Huang, M.; Lin, K.; Stephens, C.

    2004-12-01

    Very complex ground motion with high spatial variability was recorded in the near field of the M6 Parkfield earthquake of 9/28/04 by a strong-motion array. The array provided the highest density of recording stations in the near field of any earthquake recorded to date. A total of 56 stations were located within 20 km of the fault; 48 were within 10 km of the fault, more than for many other earthquakes combined. Most (45) of the stations were part of a specialized array of classic analog instruments installed by CGS in the early 1980s, and 11 were digital high-resolution instruments installed by the USGS. The set of recordings obtained provides a wealth of information on near-field ground motion. Processing and analysis of the strong-motion data, available at www.cisn-edc.org, is underway. The spatial variation of the ground motion, even over relatively short distances, is great. For example, a peak acceleration of 0.30 g was recorded in the town of Parkfield, but several stations within about 2 km that surround this station recorded acceleration levels well over 1 g. The strong shaking at these stations, near the termination end of the rupture, is consistent with directivity focusing, as the rupture propagated from the epicenter near Gold Hill to the northwest. However, some of the strongest shaking occurs well south of the rupture, at stations near Hwy 46 at the south end of the Cholame Valley, incompatible with directivity focusing from a simple rupture. An additional aspect is that several near-fault stations have very low shaking, despite being directly over the rupturing fault. This may provide a quantitative basis to understand observed cases of low-strength buildings immediately near a fault being only slightly damaged.

  15. Teleseismically-induced tremor near Parkfield, CA - a cacophony or a symphony?

    Science.gov (United States)

    Vidale, J. E.; Peng, Z.; Creager, K. C.; Bodin, P.

    2007-12-01

    The tremor triggered near Parkfield, CA by the 2002 Denali and 2004 Sumatra earthquakes was strong and well recorded by the dense regional CISN and the borehole HRSN networks. Peng et al. (this meeting) survey tremors triggered by a larger set of 12 regional and teleseismic events, providing a broader context. In the case of both the 2002 M7.9 Denali and 2004 M9.1 Sumatra earthquakes, the tremor emanates from at least two source regions deep within the SAF. The first source region is 40 km NW of the SAFOD in the creeping section of the SAF, and the second region is 40 km SE of the SAFOD near Cholame, close to the location where most of the non-triggered tremor has been found previously (Nadeau and Dolenc, Science, 2005). The Denali earthquake triggered tremor is in phase with the surface waves for about 400s. The northern region started tremoring first by about 100s, and both regions quieted before the end of the surface waves. The wavetrain for the 2004 M9.1 Sumatra earthquake was long enough that tremors were also excited by the weak diffracted P waves, and tremor turned up the volume for an hour upon the arrival of the surface waves, underwent a sudden and curious hiatus for 500s before the end of the surface waves, then re-started and continued for at least an hour after the passage of the surface waves. It is easy to suggest that the tremor was accompanied by deep slip on the SAF, but creep and strain data indicate any slip was too small to generate a detectable surface deformation. These observations suggest a component of driven, instantaneous, perhaps Coulomb-friction response with an added dose of self-sustaining, dribbling activity more suggestive of the oozing of fluids.

  16. Seismicity Precursors of the M6.0 2004 Parkfield and M7.0 1989 Loma Prieta Earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Korneev, Valeri A.

    2006-03-09

    The M6.0 2004 Parkfield and M7.0 1989 Loma Prieta strike-slip earthquakes on the San Andreas Fault (SAF) were preceded by seismicity peaks occurring several months prior to the main events. Earthquakes directly within the SAF zone were intentionally excluded from the analysis because they manifest stress-release processes rather than stress accumulation. The observed increase in seismicity is interpreted as a signature of the increasing stress level in the surrounding crust, whereas the peaks and the subsequent decrease in seismicity are attributed to damage-induced softening processes. Furthermore, in both cases there is a distinctive zone of low seismic activity that surrounds the epicentral region in the pre-event period. The increase of seismicity in the crust surrounding a potential future event and the development of a low-seismicity epicentral zone can be regarded as promising precursory information that could help signal the arrival of large earthquakes. The Gutenberg-Richter relationship (GRR) should allow extrapolation of seismicity changes down to seismic-noise-level magnitudes. This hypothesis is verified by comparison of seismic noise at 80 Hz with the Parkfield M4 1993-1994 series, where noise peaks 5 months before the series to about twice the background level.

  17. 3-D P- and S-wave velocity structure and low-frequency earthquake locations in the Parkfield, California region

    Science.gov (United States)

    Zeng, Xiangfang; Thurber, Clifford H.; Shelly, David R.; Harrington, Rebecca M.; Cochran, Elizabeth S.; Bennington, Ninfa L.; Peterson, Dana; Guo, Bin; McClement, Kara

    2016-01-01

    To refine the 3-D seismic velocity model in the greater Parkfield, California region, a new data set including regular earthquakes, shots, quarry blasts and low-frequency earthquakes (LFEs) was assembled. Hundreds of traces of each LFE family at two temporary arrays were stacked with a time-frequency domain phase-weighted stacking method to improve the signal-to-noise ratio. We extend our model resolution to lower crustal depth with the LFE data. Our results image not only previously identified features but also low-velocity zones (LVZs) in the area around the LFEs and in the lower crust beneath the southern Rinconada Fault. The former LVZ is consistent with high fluid pressure that can account for several aspects of LFE behaviour. The latter LVZ is consistent with a high-conductivity zone seen in magnetotelluric studies. A new Vs model was developed with S picks that were obtained with a new autopicker. At shallow depth, the low-Vs areas underlie the strongest shaking areas in the 2004 Parkfield earthquake. We relocate LFE families and analyse the location uncertainties with the NonLinLoc and tomoDD codes. The two methods yield similar results.
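
    Phase-weighted stacking, used above to raise the signal-to-noise ratio of the LFE family waveforms, down-weights the linear stack wherever the instantaneous phases of the individual traces disagree. The sketch below is the simpler time-domain variant (the study uses a time-frequency domain version), so it is an approximation; the sharpness exponent is an assumed default.

      import numpy as np
      from scipy.signal import hilbert

      def phase_weighted_stack(traces, nu=2.0):
          """Phase-weighted stack of aligned waveforms, shape (n_traces, n_samples)."""
          analytic = hilbert(traces, axis=1)
          unit_phasors = analytic / np.abs(analytic)
          coherence = np.abs(unit_phasors.mean(axis=0))    # ~1 where phases agree, ~0 where they do not
          return traces.mean(axis=0) * coherence ** nu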

  18. Incorporating fault zone head wave and direct wave secondary arrival times into seismic tomography: Application at Parkfield, California

    Science.gov (United States)

    Bennington, Ninfa L.; Thurber, Clifford; Peng, Zhigang; Zhang, Haijiang; Zhao, Peng

    2013-03-01

    We present a three-dimensional (3D) P wave velocity (Vp) model of the Parkfield region that utilizes existing P wave arrival time data, including fault zone head waves (FZHWs), and data from direct wave secondary arrivals (DWSAs). The first-arrival and DWSA travel times are obtained as the global- and local-minimum travel time paths, respectively. The inclusion of FZHWs and DWSAs results in as much as a 5% and a 10% increase in the across-fault velocity contrast, respectively, for the Vp model at Parkfield relative to that of Thurber et al. [2006]. Viewed along strike, three pronounced velocity contrast regions are observed: a pair of strong positive velocity contrasts (SW fast), one NW of the 1966 Parkfield earthquake hypocenter and the other SE of the 2004 Parkfield earthquake hypocenter, and a strong negative velocity contrast (NE fast) between the two hypocenters. The negative velocity contrast partially to entirely encompasses peak coseismic slip estimated in several slip models for the 2004 earthquake, suggesting that the negative velocity contrast played a part in defining the rupture patch of the 2004 Parkfield earthquake. Following Ampuero and Ben-Zion (2008), the pattern of velocity contrasts is consistent with the observed bilateral rupture propagation for the 2004 Parkfield earthquake. Although the velocity contrasts also suggest bilateral rupture propagation for the 1966 Parkfield earthquake, the fault is creeping to the NW here, i.e., exhibiting velocity-strengthening behavior. Thus, it is not surprising that rupture propagated only SE during this event.

  19. Postseismic relaxation along the San Andreas fault at Parkfield from continuous seismological observations.

    Science.gov (United States)

    Brenguier, F; Campillo, M; Hadziioannou, C; Shapiro, N M; Nadeau, R M; Larose, E

    2008-09-12

    Seismic velocity changes and nonvolcanic tremor activity in the Parkfield area in California reveal that large earthquakes induce long-term perturbations of crustal properties in the San Andreas fault zone. The 2003 San Simeon and 2004 Parkfield earthquakes both reduced seismic velocities that were measured from correlations of the ambient seismic noise and induced an increased nonvolcanic tremor activity along the San Andreas fault. After the Parkfield earthquake, velocity reduction and nonvolcanic tremor activity remained elevated for more than 3 years and decayed over time, similarly to afterslip derived from GPS (Global Positioning System) measurements. These observations suggest that the seismic velocity changes are related to co-seismic damage in the shallow layers and to deep co-seismic stress change and postseismic stress relaxation within the San Andreas fault zone.
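
    The seismic velocity changes described above are measured by comparing noise correlation functions computed on different days against a long-term reference. One common estimator, sketched below, is the stretching method; the authors' procedure may differ (for example, moving-window cross-spectral analysis), so this is an illustrative assumption, as are the stretch range and sign convention noted in the comments.

      import numpy as np

      def stretching_dvv(reference, current, t, epsilons=np.linspace(-0.01, 0.01, 201)):
          """Relative velocity change dv/v between a reference and a current correlation function.

          The reference is evaluated on stretched time axes t*(1+eps); the eps that best
          matches the current trace approximates dv/v (a velocity drop, i.e. delayed coda
          in the current trace, gives a negative value under this convention)."""
          best_cc, best_eps = -np.inf, 0.0
          for eps in epsilons:
              stretched = np.interp(t * (1.0 + eps), t, reference)
              cc = np.corrcoef(stretched, current)[0, 1]
              if cc > best_cc:
                  best_cc, best_eps = cc, eps
          return best_eps, best_cc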

  20. Delayed dynamic triggering of deep tremor along the Parkfield-Cholame section of the San Andreas Fault following the 2014 M6.0 South Napa earthquake

    Science.gov (United States)

    Peng, Zhigang; Shelly, David R.; Ellsworth, William L.

    2015-01-01

    Large, distant earthquakes are known to trigger deep tectonic tremor along the San Andreas Fault and in subduction zones. However, there are relatively few observations of triggering from regional distance earthquakes. Here we show that a small tremor episode about 12–18 km NW of Parkfield was triggered during and immediately following the passage of surface waves from the 2014 Mw 6.0 South Napa main shock. More notably, a major tremor episode followed, beginning about 12 h later, and centered SE of Parkfield near Cholame. This major episode is one of the largest seen over the past several years, containing intense activity for ~3 days and taking more than 3 weeks to return to background levels. This episode showed systematic along-strike migration at ~5 km/d, suggesting that it was driven by a slow-slip event. Our results suggest that moderate-size earthquakes are capable of triggering major tremor and deep slow slip at regional distances.

  1. Earthquake recurrence models fail when earthquakes fail to reset the stress field

    Science.gov (United States)

    Tormann, Thessa; Wiemer, Stefan; Hardebeck, Jeanne L.

    2012-01-01

    Parkfield's regularly occurring M6 mainshocks, about every 25 years, have over two decades stoked seismologists' hopes of successfully predicting an earthquake of significant size. However, with the longest known inter-event time of 38 years, the latest M6 in the series (28 Sep 2004) did not conform to any of the applied forecast models, questioning once more the predictability of earthquakes in general. Our study investigates the spatial pattern of b-values along the Parkfield segment through the seismic cycle and documents a stably stressed structure. The forecasted rate of M6 earthquakes based on Parkfield's microseismicity b-values corresponds well to observed rates. We interpret the observed b-value stability in terms of the evolution of the stress field in that area: the M6 Parkfield earthquakes do not fully unload the stress on the fault, explaining why time-recurrence models fail. We present the 1989 M6.9 Loma Prieta earthquake as a counterexample, which did release a significant portion of the stress along its fault segment and yields a substantial change in b-values.
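
    The forecasted M6 rate mentioned above rests on the Gutenberg-Richter relation: estimate b from small events above the completeness magnitude, then extrapolate their rate to M 6. A minimal sketch of that calculation follows (Aki-Utsu maximum-likelihood b-value); the completeness magnitude, binning, and catalogue duration are placeholders, and this is not the authors' code.

      import numpy as np

      def b_value_and_m6_rate(mags, mc, years, dm=0.1):
          """Aki-Utsu maximum-likelihood b-value and the Gutenberg-Richter rate of M >= 6."""
          m = np.asarray(mags)
          m = m[m >= mc]
          b = np.log10(np.e) / (m.mean() - (mc - dm / 2.0))   # half-bin correction
          rate_mc = len(m) / years                            # annual rate of M >= mc
          return b, rate_mc * 10.0 ** (-b * (6.0 - mc))       # extrapolated annual rate of M >= 6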

  2. Aftershock distribution as a constraint on the geodetic model of coseismic slip for the 2004 Parkfield earthquake

    Science.gov (United States)

    Bennington, Ninfa; Thurber, Clifford; Feigl, Kurt; ,

    2011-01-01

    Several studies of the 2004 Parkfield earthquake have linked the spatial distribution of the event’s aftershocks to the mainshock slip distribution on the fault. Using geodetic data, we find a model of coseismic slip for the 2004 Parkfield earthquake with the constraint that the edges of coseismic slip patches align with aftershocks. The constraint is applied by encouraging the curvature of coseismic slip in each model cell to be equal to the negative of the curvature of seismicity density. The large patch of peak slip about 15 km northwest of the 2004 hypocenter found in the curvature-constrained model is in good agreement in location and amplitude with previous geodetic studies and the majority of strong motion studies. The curvature-constrained solution shows slip primarily between aftershock “streaks” with the continuation of moderate levels of slip to the southeast. These observations are in good agreement with strong motion studies, but inconsistent with the majority of published geodetic slip models. Southeast of the 2004 hypocenter, a patch of peak slip observed in strong motion studies is absent from our curvature-constrained model, but the available GPS data do not resolve slip in this region. We conclude that the geodetic slip model constrained by the aftershock distribution fits the geodetic data quite well and that inconsistencies between models derived from seismic and geodetic data can be attributed largely to resolution issues.

  3. Slip deficit on the San Andreas fault at Parkfield, California, as revealed by inversion of geodetic data.

    Science.gov (United States)

    Segall, P; Harris, R

    1986-09-26

    A network of geodetic lines spanning the San Andreas fault near the rupture zone of the 1966 Parkfield, California, earthquake (magnitude M = 6) has been repeatedly surveyed since 1959. In the study reported here the average rates of line-length change since 1966 were inverted to determine the distribution of interseismic slip rate on the fault. These results indicate that the Parkfield rupture surface has not slipped significantly since 1966. Comparison of the geodetically determined seismic moment of the 1966 earthquake with the interseismic slip-deficit rate suggests that the strain released by the latest shock will most likely be restored between 1984 and 1989, although this may not occur until 1995. These results lend independent support to the earlier forecast of an M = 6 earthquake near Parkfield within 5 years of 1988.

  4. The incorporation of fault zone head wave and direct wave secondary arrival times and arrival polarizations into seismic tomography: Application to the Parkfield, California area

    Science.gov (United States)

    Bennington, N. L.; Thurber, C. H.; Peng, Z.; Zhao, P.

    2012-12-01

    We present a 3D P-wave velocity (Vp) model of the Parkfield region that utilizes existing P-wave arrival time data, including fault zone head waves (FZHW), plus new data from direct wave secondary arrivals (DWSA). The first-arrival and DWSA travel times are obtained as the global and local minimum travel time paths, respectively. The inclusion of DWSA results in as much as a 10% increase in the across-fault velocity contrast for the Vp model at Parkfield relative to Thurber et al. (2006). Viewed along strike, three pronounced velocity contrast regions are observed: a pair of strong positive velocity contrasts (SW fast), one NW of the 1966 Parkfield hypocenter and the other SE of the 2004 Parkfield hypocenter, and a strong negative velocity contrast (NE fast) between the two hypocenters. The negative velocity contrast partially to entirely encompasses peak coseismic slip estimated in several slip models for the 2004 earthquake, suggesting that the negative velocity contrast played a part in defining the rupture patch of the 2004 Parkfield earthquake. We expand on this work by modifying our seismic tomography algorithm to incorporate arrival polarizations (azimuths). Synthetic tests will be presented to demonstrate the improvements in velocity structure when arrival polarizations are incorporated. These tests will compare the synthetic model recovered when FZHW/DWSA arrivals as well as existing P-wave arrival time data are inverted to that recovered with the same dataset with the inclusion of arrival polarizations. We plan to extend this work to carry out a full scale seismic tomography/relocation inversion at Parkfield, CA utilizing arrival polarizations from all first-P arrivals, and FZHW/DWSA arrivals as well as existing P-wave arrival time data. This effort requires the determination of polarization data for all P-waves and FZHW's at Parkfield. To this end, we use changes in the arrival azimuth from fault normal to source-receiver direction to identify FZHW and

  5. Remote triggering of fault-strength changes on the San Andreas fault at Parkfield.

    Science.gov (United States)

    Taira, Taka'aki; Silver, Paul G; Niu, Fenglin; Nadeau, Robert M

    2009-10-01

    Fault strength is a fundamental property of seismogenic zones, and its temporal changes can increase or decrease the likelihood of failure and the ultimate triggering of seismic events. Although changes in fault strength have been suggested to explain various phenomena, such as the remote triggering of seismicity, there has been no means of actually monitoring this important property in situ. Here we argue that approximately 20 years of observation (1987-2008) of the Parkfield area on the San Andreas fault have revealed a means of monitoring fault strength. We have identified two occasions where long-term changes in fault strength have most probably been induced remotely by large seismic events, namely the 2004 magnitude (M) 9.1 Sumatra-Andaman earthquake and the earlier 1992 M = 7.3 Landers earthquake. In both cases, the change possessed two manifestations: temporal variations in the properties of seismic scatterers, probably reflecting the stress-induced migration of fluids, and systematic temporal variations in the characteristics of repeating-earthquake sequences that are most consistent with changes in fault strength. In the case of the 1992 Landers earthquake, a period of reduced strength probably triggered the 1993 Parkfield aseismic transient as well as the accompanying cluster of four M > 4 earthquakes at Parkfield. The fault-strength changes produced by the distant 2004 Sumatra-Andaman earthquake are especially important, as they suggest that the very largest earthquakes may have a global influence on the strength of the Earth's fault systems. As such a perturbation would bring many fault zones closer to failure, it should lead to temporal clustering of global seismicity. This hypothesis seems to be supported by the unusually high number of M ≥ 8 earthquakes occurring in the few years following the 2004 Sumatra-Andaman earthquake.

  6. Preliminary analysis of strong-motion recordings from the 28 September 2004 Parkfield, California earthquake

    Science.gov (United States)

    Shakal, A.; Graizer, V.; Huang, M.; Borcherdt, R.; Haddadi, H.; Lin, K.-W.; Stephens, C.; Roffers, P.

    2005-01-01

    The Parkfield 2004 earthquake yielded the most extensive set of strong-motion data in the near-source region of a magnitude 6 earthquake yet obtained. The recordings of acceleration and volumetric strain provide an unprecedented document of the near-source seismic radiation for a moderate earthquake. The spatial density of the measurements along the fault zone and in the linear arrays perpendicular to the fault is expected to provide an exceptional opportunity to develop improved models of the rupture process. The closely spaced measurements should help infer the temporal and spatial distribution of the rupture process at much higher resolution than previously possible. Preliminary analyses of the peak acceleration data presented herein show that the motions vary significantly along the rupture zone, from 0.13 g to more than 2.5 g, with a map of the values showing that the larger values are concentrated in three areas. Particle motions at the near-fault stations are consistent with bilateral rupture. Fault-normal pulses similar to those observed in recent strike-slip earthquakes are apparent at several of the stations. The attenuation of peak ground acceleration with distance is more rapid than that indicated by some standard relationships but adequately fits others. Evidence for directivity in the peak acceleration data is not strong. Several stations very near, or over, the rupturing fault recorded relatively low accelerations. These recordings may provide a quantitative basis to understand observations of low near-fault shaking damage that have been reported in other large strike-slip earthquakes.

  7. Stress diffusion along the San Andreas fault at Parkfield, California.

    Science.gov (United States)

    Malin, P E; Alvarez, M G

    1992-05-15

    Beginning in January 1990, the epicenters of microearthquakes associated with a 12-month increase in seismicity near Parkfield, California, moved northwest to southeast along the San Andreas fault. During this sequence of events, the locally variable rate of cumulative seismic moment increased. This increase implies a local increase in fault slip. These data suggest that a southeastwardly diffusing stress front propagated along the San Andreas fault at a speed of 30 to 50 kilometers per year. Evidently, this front did not load the Parkfield asperities fast enough to produce a moderate earthquake; however, a future front might do so.

  8. Acoustic Emission Precursors of the M6.0 2004 Parkfield and M7.0 1989 Loma Prieta Earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Korneev, Valeri

    2005-02-01

    Two recent strike-slip earthquakes on the San Andreas Fault (SAF) in California, the M6.0 2004 Parkfield and M7.0 1989 Loma Prieta events, revealed peaks in the acoustic emission (AE) activity in the surrounding crust several months prior to the main events. Earthquakes directly within the SAF zone were intentionally excluded from the analysis. The observed increase in AE is assumed to be a signature of the increasing stress level in the surrounding crust, while the peak and subsequent decrease in AE starting several months prior to the main events is attributed to damage-induced softening processes as discussed herein. Further, distinctive zones of low seismic activity surrounding the epicentral regions in the pre-event time period are present for the two studied events. Both AE increases in the crust surrounding a potential future event and the development of a low-seismicity epicentral zone can be regarded as promising precursory information that could help signal the arrival of large earthquakes.

  9. Tremor reveals stress shadowing, deep postseismic creep, and depth-dependent slip recurrence on the lower-crustal San Andreas fault near Parkfield

    Science.gov (United States)

    Shelly, David R.; Johnson, Kaj M.

    2011-01-01

    The 2003 magnitude 6.5 San Simeon and the 2004 magnitude 6.0 Parkfield earthquakes induced small, but significant, static stress changes in the lower crust on the central San Andreas fault, where recently detected tectonic tremor sources provide new constraints on deep fault creep processes. We find that these earthquakes affect tremor rates very differently, consistent with their differing transferred static shear stresses. The San Simeon event appears to have cast a "stress shadow" north of Parkfield, where tremor activity was stifled for 3-6 weeks. In contrast, the 2004 Parkfield earthquake dramatically increased tremor activity rates both north and south of Parkfield, allowing us to track deep postseismic slip. Following this event, rates initially increased by up to two orders of magnitude for the relatively shallow tremor sources closest to the rupture, with activity in some sources persisting above background rates for more than a year. We also observe strong depth dependence in tremor recurrence patterns, with shallower sources generally exhibiting larger, less-frequent bursts, possibly signaling a transition toward steady creep with increasing temperature and depth. Copyright 2011 by the American Geophysical Union.

  10. Searching for geodetic transient slip signals along the Parkfield segment of the San Andreas Fault

    Science.gov (United States)

    Rousset, B.; Burgmann, R.

    2017-12-01

    The Parkfield section of the San Andreas fault is at the transition between a segment locked since the 1857 Mw 7.9 Fort Tejon earthquake to its south and a creeping segment to the north, and it is particularly well instrumented. While many previous studies have focused on the coseismic and postseismic phases of the two most recent earthquake cycles, the interseismic phase exhibits interesting dynamics at the down-dip edge of the seismogenic zone, characterized by a very large number of low frequency earthquakes (LFE) with different behaviors depending on location. Interseismic fault creep rates appear to vary over a wide range of spatial and temporal scales, from the Earth's surface to the base of the crust. In this study, we take advantage of the dense Global Positioning System (GPS) network, with 77 continuous stations located within a circle of radius 80 km centered on Parkfield. We correct these time series for the co- and postseismic signals of the 2003 Mw 6.5 San Simeon and 2004 Mw 6.0 Parkfield earthquakes. We then cross-correlate the residual time series with synthetic slow-slip templates following the approach of Rousset et al. (2017). Synthetic tests with transient events embedded in GPS time series with realistic noise show the detection limit of the method. In the application to real GPS time series, the highest correlation amplitudes are compared with micro-seismicity rates, as well as with tremor and LFE observations.
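
    The cross-correlation of residual GPS time series with synthetic slow-slip templates, described above, can be pictured as a matched filter: slide the template through each station's corrected series, compute a normalized correlation, and sum over the network so that weak but spatially coherent transients stand out. The sketch below is schematic and is not the published implementation of Rousset et al. (2017); the array shapes and normalization are assumptions.

      import numpy as np

      def template_correlation(residuals, template):
          """Network-averaged normalized correlation of GPS residuals with a transient template.

          residuals : (n_stations, n_days) detrended, postseismic-corrected series (placeholder)
          template  : (n_template_days,) synthetic slow-slip displacement shape (placeholder)"""
          n_sta, n_days = residuals.shape
          nt = len(template)
          z_tmpl = (template - template.mean()) / (template.std() + 1e-12)
          corr = np.zeros(n_days - nt + 1)
          for i in range(len(corr)):
              win = residuals[:, i:i + nt]
              win = win - win.mean(axis=1, keepdims=True)
              z_win = win / (win.std(axis=1, keepdims=True) + 1e-12)
              corr[i] = (z_win * z_tmpl).mean()       # average Pearson correlation over stations
          return corr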

  11. Quantifying slip balance in the earthquake cycle: Coseismic slip model constrained by interseismic coupling

    KAUST Repository

    Wang, Lifeng; Hainzl, Sebastian; Mai, Paul Martin

    2015-01-01

    The long-term slip on faults has to follow, on average, the plate motion, while slip deficit is accumulated over shorter time scales (e.g., between the large earthquakes). Accumulated slip deficits eventually have to be released by earthquakes and aseismic processes. In this study, we propose a new inversion approach for coseismic slip, taking interseismic slip deficit as prior information. We assume a linear correlation between coseismic slip and interseismic slip deficit, and invert for the coefficients that link the coseismic displacements to the required strain accumulation time and seismic release level of the earthquake. We apply our approach to the 2011 M9 Tohoku-Oki earthquake and the 2004 M6 Parkfield earthquake. Under the assumption that the largest slip almost fully releases the local strain (as indicated by borehole measurements, Lin et al., 2013), our results suggest that the strain accumulated along the Tohoku-Oki earthquake segment has been almost fully released during the 2011 M9 rupture. The remaining slip deficit can be attributed to the postseismic processes. Similar conclusions can be drawn for the 2004 M6 Parkfield earthquake. We also estimate the required time of strain accumulation for the 2004 M6 Parkfield earthquake to be ~25 years (confidence interval of [17, 43] years), consistent with the observed average recurrence time of ~22 years for M6 earthquakes in Parkfield. For the Tohoku-Oki earthquake, we estimate the recurrence time of ~500-700 years. This new inversion approach for evaluating slip balance can be generally applied to any earthquake for which dense geodetic measurements are available.

  12. Quantifying slip balance in the earthquake cycle: Coseismic slip model constrained by interseismic coupling

    KAUST Repository

    Wang, Lifeng

    2015-11-11

    The long-term slip on faults has to follow, on average, the plate motion, while slip deficit is accumulated over shorter time scales (e.g., between the large earthquakes). Accumulated slip deficits eventually have to be released by earthquakes and aseismic processes. In this study, we propose a new inversion approach for coseismic slip, taking interseismic slip deficit as prior information. We assume a linear correlation between coseismic slip and interseismic slip deficit, and invert for the coefficients that link the coseismic displacements to the required strain accumulation time and seismic release level of the earthquake. We apply our approach to the 2011 M9 Tohoku-Oki earthquake and the 2004 M6 Parkfield earthquake. Under the assumption that the largest slip almost fully releases the local strain (as indicated by borehole measurements, Lin et al., 2013), our results suggest that the strain accumulated along the Tohoku-Oki earthquake segment has been almost fully released during the 2011 M9 rupture. The remaining slip deficit can be attributed to the postseismic processes. Similar conclusions can be drawn for the 2004 M6 Parkfield earthquake. We also estimate the required time of strain accumulation for the 2004 M6 Parkfield earthquake to be ~25 years (confidence interval of [17, 43] years), consistent with the observed average recurrence time of ~22 years for M6 earthquakes in Parkfield. For the Tohoku-Oki earthquake, we estimate the recurrence time of ~500-700 years. This new inversion approach for evaluating slip balance can be generally applied to any earthquake for which dense geodetic measurements are available.
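
    The ~25-year strain-accumulation time quoted above follows from a simple slip balance: the time needed to accumulate the coseismic slip at the interseismic slip-deficit rate. The sketch below illustrates the arithmetic only, with round illustrative numbers (about half a metre of slip against a deficit rate of about 2 cm/yr) that are assumptions, not values taken from the study.

      def accumulation_time(coseismic_slip_m, slip_deficit_rate_m_per_yr, release_fraction=1.0):
          """Years needed to accumulate the released slip at the given slip-deficit rate."""
          return coseismic_slip_m / (slip_deficit_rate_m_per_yr * release_fraction)

      print(accumulation_time(0.5, 0.02))   # -> 25.0 years, of the order of the Parkfield M6 cycle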

  13. Clustering and periodic recurrence of microearthquakes on the San Andreas fault at Parkfield, California.

    Science.gov (United States)

    Nadeau, R M; Foxall, W; McEvilly, T V

    1995-01-27

    The San Andreas fault at Parkfield, California, apparently late in an interval between repeating magnitude 6 earthquakes, is yielding to tectonic loading partly by seismic slip concentrated in a relatively sparse distribution of small clusters (<20-meter radius) of microearthquakes. Within these clusters, which account for 63% of the earthquakes in a 1987-92 study interval, virtually identical small earthquakes occurred with a regularity that can be described by the statistical model used previously in forecasting large characteristic earthquakes. Sympathetic occurrence of microearthquakes in nearby clusters was observed within a range of about 200 meters at communication speeds of 10 to 100 centimeters per second. The rate of earthquake occurrence, particularly at depth, increased significantly during the study period, but the fraction of earthquakes that were cluster members decreased.

  14. Determination of Focal Depths of Earthquakes in the Mid-Oceanic Ridges from Amplitude Spectra of Surface Waves

    Science.gov (United States)

    1969-06-01

    Foreshock, mainshock and aftershock of the Parkfield, California earthquake of June 28, 1966. b. The Denver earthquake of August 9, 1967. Let us look...into the results of these tests in more detail. (1) Test on the main shock, foreshock and aftershock of the Parkfield earthquake of June 28, 1966...According to McEvilly et al. (1967), the origin times and locations of these events were the following: Foreshock June 28, 1966, 04:08:56.2 GMT; 35° 57.6

  15. Tidal triggering of low frequency earthquakes near Parkfield, California: Implications for fault mechanics within the brittle-ductile transition

    Science.gov (United States)

    Thomas, A.M.; Burgmann, R.; Shelly, David R.; Beeler, Nicholas M.; Rudolph, M.L.

    2012-01-01

    Studies of nonvolcanic tremor (NVT) have established the significant impact of small stress perturbations on NVT generation. Here we analyze the influence of the solid earth and ocean tides on a catalog of ∼550,000 low frequency earthquakes (LFEs) distributed along a 150 km section of the San Andreas Fault centered at Parkfield. LFE families are identified in the NVT data on the basis of waveform similarity and are thought to represent small, effectively co-located earthquakes occurring on brittle asperities on an otherwise aseismic fault at depths of 16 to 30 km. We calculate the sensitivity of each of these 88 LFE families to the tidally induced right-lateral shear stress (RLSS), fault-normal stress (FNS), and their time derivatives and use the hypocentral locations of each family to map the spatial variability of this sensitivity. LFE occurrence is most strongly modulated by fluctuations in shear stress, with the majority of families demonstrating a correlation with RLSS at the 99% confidence level or above. Producing the observed LFE rate modulation in response to shear stress perturbations requires low effective stress in the LFE source region. There are substantial lateral and vertical variations in tidal shear stress sensitivity, which we interpret to reflect spatial variation in source region properties, such as friction and pore fluid pressure. Additionally, we find that highly episodic, shallow LFE families are generally less correlated with tidal stresses than their deeper, continuously active counterparts. The majority of families have weaker or insignificant correlation with positive (tensile) FNS. Two groups of families demonstrate a stronger correlation with fault-normal tension to the north and with compression to the south of Parkfield. The families that correlate with fault-normal clamping coincide with a releasing right bend in the surface fault trace and the LFE locations, suggesting that the San Andreas remains localized and contiguous down

  16. San Andreas fault zone head waves near Parkfield, California.

    Science.gov (United States)

    Ben-Zion, Y; Malin, P

    1991-03-29

    Microearthquake seismograms from the borehole seismic network on the San Andreas fault near Parkfield, California, provide three lines of evidence that first P arrivals are "head" waves refracted along the cross-fault material contrast. First, the travel time difference between these arrivals and secondary phases identified as direct P waves scales linearly with the source-receiver distance. Second, these arrivals have the emergent wave character associated in theory and practice with refracted head waves instead of the sharp first breaks associated with direct P arrivals. Third, the first motion polarities of the emergent arrivals are reversed from those of the direct P waves as predicted by the theory of fault zone head waves for slip on the San Andreas fault. The presence of fault zone head waves in local seismic network data may help account for scatter in earthquake locations and source mechanisms. The fault zone head waves indicate that the velocity contrast across the San Andreas fault near Parkfield is approximately 4 percent. Further studies of these waves may provide a way of assessing changes in the physical state of the fault system.

  17. Continuous borehole strain and pore pressure in the near field of the 28 September 2004 M 6.0 Parkfield, California, earthquake: Implications for nucleation, fault response, earthquake prediction and tremor

    Science.gov (United States)

    Johnston, M.J.S.; Borcherdt, R.D.; Linde, A.T.; Gladwin, M.T.

    2006-01-01

    Near-field observations of high-precision borehole strain and pore pressure show no indication of coherent accelerating strain or pore pressure during the weeks to seconds before the 28 September 2004 M 6.0 Parkfield earthquake. Minor changes in strain rate did occur at a few sites during the last 24 hr before the earthquake, but these changes are neither significant nor have the form expected for strain during slip coalescence initiating fault failure. Seconds before the event, strain is stable at the 10^-11 level. Final prerupture nucleation slip in the hypocentral region is constrained to have a moment less than 2 × 10^12 N m (M 2.2) and a source size less than 30 m. Ground displacement data indicate similar constraints. Localized rupture nucleation and runaway preclude useful prediction of damaging earthquakes. Coseismic dynamic strains of about 10 microstrain peak-to-peak were superimposed on volumetric strain offsets of about 0.5 microstrain to the northwest of the epicenter and about 0.2 microstrain to the southeast of the epicenter, consistent with right-lateral slip. Observed strain and Global Positioning System (GPS) offsets can be simply fit with 20 cm of slip between 4 and 10 km on a 20-km segment of the fault north of Gold Hill (M0 = 7 × 10^17 N m). Variable slip inversion models using GPS data and seismic data indicate similar moments. Observed postseismic strain is 60% to 300% of the coseismic strain, indicating incomplete release of accumulated strain. No measurable change in fault zone compliance preceding or following the earthquake is indicated by stable earth tidal response. No indications of strain change accompany nonvolcanic tremor events reported prior to and following the earthquake.
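
    The moment-to-magnitude conversions quoted above can be checked with the standard relation Mw = (2/3)(log10 M0 − 9.1), with M0 in N·m; the snippet below is only that arithmetic (small differences from the quoted magnitudes reflect rounding and the choice of constant).

        from math import log10

        def moment_magnitude(m0_nm):
            # Mw from seismic moment in newton-meters
            return (2.0 / 3.0) * (log10(m0_nm) - 9.1)

        print(f"M0 = 2e12 N·m -> Mw ~ {moment_magnitude(2e12):.1f}")   # nucleation bound; abstract quotes M 2.2
        print(f"M0 = 7e17 N·m -> Mw ~ {moment_magnitude(7e17):.1f}")   # moment of the simple slip model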

  18. Analysis of nonvolcanic tremor on the San Andreas Fault near Parkfield, CA using U.S. Geological Survey Parkfield Seismic Array

    Science.gov (United States)

    Fletcher, Jon B.; Baker, Lawrence M.

    2010-01-01

    background tremor signal and lasts about 5 s. These impulsive wavelets are similar to low-frequency earthquake signals seen in Japan but appear to be broader band rather than just higher in low-frequency energy. They may be more appropriately called high-energy tremor (HET). HET signals at UPSAR correlate well with the record of this event from station GHIB of the HRSN borehole array at Parkfield, and HETs typically have a higher cross-correlation coefficient than the rest of the tremor event. The amplitudes of a large HET are consistent with a magnitude of 0.1 when compared with a M2.3 event that had about the same epicenter. Polarizations of the tremor episode at UPSAR are mostly just north of east. Both linearity and azimuth evolve over time, suggesting a change in tremor source location over time, and linearity is typically higher at the HETs.

  20. Heterogeneous slip and rupture models of the San Andreas fault zone based upon three-dimensional earthquake tomography

    Energy Technology Data Exchange (ETDEWEB)

    Foxall, William [Univ. of California, Berkeley, CA (United States)

    1992-11-01

    Crustal fault zones exhibit spatially heterogeneous slip behavior at all scales, slip being partitioned between stable frictional sliding, or fault creep, and unstable earthquake rupture. An understanding of the mechanisms underlying slip segmentation is fundamental to research into fault dynamics and the physics of earthquake generation. This thesis investigates the influence that large-scale along-strike heterogeneity in fault zone lithology has on slip segmentation. Large-scale transitions from the stable block sliding of the Central Creeping Section of the San Andreas fault to the locked 1906 and 1857 earthquake segments take place along the Loma Prieta and Parkfield sections of the fault, respectively, the transitions being accomplished in part by the generation of earthquakes in the magnitude range 6 (Parkfield) to 7 (Loma Prieta). Information on sub-surface lithology interpreted from the Loma Prieta and Parkfield three-dimensional crustal velocity models computed by Michelini (1991) is integrated with information on slip behavior provided by the distributions of earthquakes located using the three-dimensional models and by surface creep data to study the relationships between large-scale lithological heterogeneity and slip segmentation along these two sections of the fault zone.

  1. S-wave triggering of tremor beneath the Parkfield, California, section of the San Andreas fault by the 2011 Tohoku, Japan earthquake: observations and theory

    Science.gov (United States)

    Hill, David P.; Peng, Zhigang; Shelly, David R.; Aiken, Chastity

    2013-01-01

    The dynamic stresses that are associated with the energetic seismic waves generated by the Mw 9.0 Tohoku earthquake off the northeast coast of Japan triggered bursts of tectonic tremor beneath the Parkfield section of the San Andreas fault (SAF) at an epicentral distance of ∼8200  km. The onset of tremor begins midway through the ∼100‐s‐period S‐wave arrival, with a minor burst coinciding with the SHSH arrival, as recorded on the nearby broadband seismic station PKD. A more pronounced burst coincides with the Love arrival, followed by a series of impulsive tremor bursts apparently modulated by the 20‐ to 30‐s‐period Rayleigh wave. The triggered tremor was located at depths between 20 and 30 km beneath the surface trace of the fault, with the burst coincident with the S wave centered beneath the fault 30 km northwest of Parkfield. Most of the subsequent activity, including the tremor coincident with the SHSH arrival, was concentrated beneath a stretch of the fault extending from 10 to 40 km southeast of Parkfield. The seismic waves from the Tohoku epicenter form a horizontal incidence angle of ∼14°, with respect to the local strike of the SAF. Computed peak dynamic Coulomb stresses on the fault at tremor depths are in the 0.7–10 kPa range. The apparent modulation of tremor bursts by the small, strike‐parallel Rayleigh‐wave stresses (∼0.7  kPa) is likely enabled by pore pressure variations driven by the Rayleigh‐wave dilatational stress. These results are consistent with the strike‐parallel dynamic stresses (δτs) associated with the S, SHSH, and surface‐wave phases triggering small increments of dextral slip on the fault with a low friction (μ∼0.2). The vertical dynamic stresses δτd do not trigger tremor with vertical or oblique slip under this simple Coulomb failure model.

  2. Satellite Infrared Radiation Measurements Prior to the Major Earthquakes

    Science.gov (United States)

    Ouzounov, Dimitar; Pulintes, S.; Bryant, N.; Taylor, Patrick; Freund, F.

    2005-01-01

    This work describes our search for a relationship between tectonic stresses and increases in mid-infrared (IR) flux as part of a possible ensemble of electromagnetic (EM) phenomena that may be related to earthquake activity. We present and discuss observed variations in thermal transients and radiation fields prior to the earthquakes of Jan. 22, 2003, Colima (M6.7), Mexico; Sept. 28, 2004, near Parkfield (M6.0) in California; and Dec. 26, 2004, Northern Sumatra (M8.5). Previous analysis of earthquake events has indicated the presence of an IR anomaly, where temperatures increased or did not return to their usual nighttime values. Our procedures analyze nighttime satellite data that record the general condition of the ground after sunset. We have found from the MODIS instrument data that five days before the Colima earthquake the IR land surface nighttime temperature rose up to +4 degrees C in a 100 km radius around the epicenter. The IR transient field recorded by MODIS in the vicinity of Parkfield, also with a cloud-free environment, was around +1 degree C and is significantly smaller than the IR anomaly around the Colima epicenter. Ground surface temperatures near the Parkfield epicenter four days prior to the earthquake show a steady increase. However, on the night preceding the quake, a significant drop in relative humidity was indicated, a process similar to that registered prior to the Colima event. Recent analyses of continuous outgoing long-wavelength Earth radiation (OLR) indicate significant and anomalous variability prior to some earthquakes. The cause of these anomalies is not well understood but could be the result of triggering by an interaction between the lithosphere, hydrosphere, and atmosphere related to changes in the near-surface electrical field and/or gas composition prior to the earthquake. The OLR anomaly usually covers large areas surrounding the main epicenter. We have found strong anomalous signals (two sigma) in the epicentral area on Dec 21

  3. Postearthquake relaxation after the 2004 M6 Parkfield, California, earthquake and rate-and-state friction

    Science.gov (United States)

    Savage, J.C.; Langbein, J.

    2008-01-01

    An unusually complete set of measurements (including rapid-rate GPS over the first 10 days) of postseismic deformation is available at 12 continuous GPS stations located close to the epicenter of the 2004 M6.0 Parkfield earthquake. The principal component modes for the relaxation of the ensemble of those 12 GPS stations were determined. The first mode alone furnishes an adequate approximation to the data. Thus, the relaxation at all stations can be represented by the product of a common temporal function and distinct amplitudes for each component (north or east) of relaxation at each station. The distribution in space of the amplitudes indicates that the relaxation is dominantly strike slip. The temporal function, which spans times from about 5 min to 900 days postearthquake, can be fit by a superposition of three creep terms, each of the form a_l·loge(1 + t/τ_l), with characteristic times τ_l = 4.06, 0.11, and 0.0001 days. It seems likely that what is actually involved is a broad spectrum of characteristic times, the individual components of which arise from afterslip on different fault patches. Perfettini and Avouac (2004) have shown that an individual creep term can be explained by the spring-slider model with rate-dependent (no state variable) friction. The observed temporal function can also be explained using a single spring-slider model (i.e., single fault patch) that includes rate-and-state-dependent friction, a single state variable, and either of the two commonly used (aging and slip) state evolution laws. In the latter fits, the rate-and-state friction parameter b is negative.
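
    A minimal sketch of the temporal function described above, i.e. a superposition of creep terms a_l·loge(1 + t/τ_l): the characteristic times follow the abstract, while the amplitudes are arbitrary placeholders rather than fitted values.

        import numpy as np

        def relaxation(t_days, amps, taus):
            # sum of logarithmic creep terms a_l * ln(1 + t / tau_l)
            t = np.asarray(t_days, dtype=float)
            return sum(a * np.log(1.0 + t / tau) for a, tau in zip(amps, taus))

        taus = (4.06, 0.11, 1.0e-4)                           # days, as quoted above
        amps = (1.0, 0.5, 0.2)                                # illustrative amplitudes only
        t = np.array([0.003, 0.03, 0.3, 3.0, 30.0, 900.0])    # ~5 min to 900 days postearthquake
        print(np.round(relaxation(t, amps, taus), 2))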

  4. Transient stresses at Parkfield, California, produced by the M 7.4 Landers earthquake of June 28, 1992: implications for the time-dependence of fault friction

    Directory of Open Access Journals (Sweden)

    J. B. Fletcher

    1994-06-01

    The M 7.4 Landers earthquake triggered widespread seismicity in the Western U.S. Because the transient dynamic stresses induced at regional distances by the Landers surface waves are much larger than the expected static stresses, the magnitude and the characteristics of the dynamic stresses may bear upon the earthquake triggering mechanism. The Landers earthquake was recorded on the UPSAR array, a group of 14 triaxial accelerometers located within a 1-square-km region 10 km southwest of the town of Parkfield, California, 412 km northwest of the Landers epicenter. We used a standard geodetic inversion procedure to determine the surface strain and stress tensors as functions of time from the observed dynamic displacements. Peak dynamic strains and stresses at the Earth's surface are about 7 microstrain and 0.035 MPa, respectively, and they have a flat amplitude spectrum between 2 s and 15 s period. These stresses agree well with stresses predicted from a simple rule of thumb based upon the ground velocity spectrum observed at a single station. Peak stresses ranged from about 0.035 MPa at the surface to about 0.12 MPa between 2 and 14 km depth, with the sharp increase of stress away from the surface resulting from the rapid increase of rigidity with depth and from the influence of surface wave mode shapes. Comparison of Landers-induced static and dynamic stresses at the hypocenter of the Big Bear aftershock provides a clear example that faults are stronger on time scales of tens of seconds than on time scales of hours or longer.
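
    One common single-station rule of thumb for dynamic stress is σ ≈ G·v/c (shear modulus times particle velocity divided by phase velocity); whether this is exactly the rule used in the study is an assumption here, and the values of G and c below are illustrative.

        G = 3.0e10                 # Pa, crustal shear modulus (assumed)
        c = 3.5e3                  # m/s, surface-wave phase velocity (assumed)
        for v in (0.001, 0.004, 0.01):          # m/s, peak ground velocity
            sigma = G * v / c
            print(f"v = {100*v:4.1f} cm/s  ->  dynamic stress ~ {sigma/1.0e6:.3f} MPa")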

  5. Characterization of the San Andreas Fault near Parkfield, California by fault-zone trapped waves

    Science.gov (United States)

    Li, Y.; Vidale, J.; Cochran, E.

    2003-04-01

    by M6 earthquake episode at Parkfield, although it probably represents the accumulated wear from many previous great earthquakes and other kinematical processes. The width of the low-velocity waveguide likely represents the damage extent in dynamic rupture, consistent with the scale of process zone size to rupture length that existing models predict. The variation in velocity reduction along the fault zone suggests changes in on-fault stress, fine-scale fault geometry, and fluid content at depth. On the other hand, a less developed and narrower low-velocity waveguide is on the north strand that experienced minor breaks at the surface in the 1966 M6 event, probably due to energy partitioning, strong shaking, and dynamic strain by the earthquake on the main fault.

  6. Periodic, chaotic, and doubled earthquake recurrence intervals on the deep San Andreas fault.

    Science.gov (United States)

    Shelly, David R

    2010-06-11

    Earthquake recurrence histories may provide clues to the timing of future events, but long intervals between large events obscure full recurrence variability. In contrast, small earthquakes occur frequently, and recurrence intervals are quantifiable on a much shorter time scale. In this work, I examine an 8.5-year sequence of more than 900 recurring low-frequency earthquake bursts composing tremor beneath the San Andreas fault near Parkfield, California. These events exhibit tightly clustered recurrence intervals that, at times, oscillate between approximately 3 and approximately 6 days, but the patterns sometimes change abruptly. Although the environments of large and low-frequency earthquakes are different, these observations suggest that similar complexity might underlie sequences of large earthquakes.

  8. 3D P and S Wave Velocity Structure and Tremor Locations in the Parkfield Region

    Science.gov (United States)

    Zeng, X.; Thurber, C. H.; Shelly, D. R.; Bennington, N. L.; Cochran, E. S.; Harrington, R. M.

    2014-12-01

    We have assembled a new dataset to refine the 3D seismic velocity model in the Parkfield region. The S arrivals from 184 earthquakes recorded by the Parkfield Experiment to Record MIcroseismicity and Tremor array (PERMIT) during 2010-2011 were picked by a new S wave picker, which is based on machine learning. 74 blasts have been assigned to four quarries, whose locations were identified with Google Earth. About 1000 P and S wave arrivals from these blasts at the permanent seismic network were also incorporated. Low frequency earthquakes (LFEs) occurring within non-volcanic tremor (NVT) are valuable for improving the precision of NVT location and the seismic velocity model at greater depths. Based on previous work (Shelly and Hardebeck, 2010), waveforms of hundreds of LFEs in the same family were stacked to improve signal quality. In a previous study (McClement et al., 2013), stacked traces of more than 30 LFE families at the Parkfield Array Seismic Observatory (PASO) have been picked. We expanded our work to include LFEs recorded by the PERMIT array. The time-frequency Phase Weighted Stacking (tf-PWS) method was introduced to improve the stack quality, as direct stacking does not produce clear S-wave arrivals on the PERMIT stations. This technique uses the coherence of the instantaneous phase among the stacked signals to enhance the signal-to-noise ratio (SNR) of the stack. We found that it is extremely effective for picking LFE arrivals (Thurber et al., 2014). More than 500 P and about 1000 S arrivals from 58 LFE families were picked at the PERMIT and PASO arrays. Since the depths of LFEs are much greater than those of earthquakes, we are able to extend model resolution to lower crustal depths. Both P and S wave velocity structure have been obtained with the tomoDD method. The result suggests that there is a low velocity zone (LVZ) in the lower crust and the location of the LVZ is consistent with the high conductivity zone beneath the southern segment of the Rinconada fault that
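
    A simplified, time-domain version of the phase-weighted stacking idea mentioned above (the study uses the time-frequency variant, tf-PWS): the linear stack is down-weighted wherever the instantaneous phase is incoherent across traces. The traces here are synthetic, and the sharpness exponent is an assumed value.

        import numpy as np
        from scipy.signal import hilbert

        def phase_weighted_stack(traces, nu=2.0):
            # traces: (n_traces, n_samples) array of mutually aligned waveforms
            analytic = hilbert(traces, axis=1)
            coherence = np.abs(np.mean(analytic / np.abs(analytic), axis=0))
            return np.mean(traces, axis=0) * coherence ** nu

        rng = np.random.default_rng(1)
        t = np.linspace(0.0, 1.0, 500)
        wavelet = np.sin(2.0 * np.pi * 8.0 * t) * np.exp(-((t - 0.5) ** 2) / 0.005)
        traces = 0.2 * wavelet + rng.normal(size=(50, t.size))    # weak arrival buried in noise
        stack = phase_weighted_stack(traces)                      # arrival stands out near t = 0.5 s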

  9. Conductivity Structure of the San Andreas Fault, Parkfield, Revisited

    Science.gov (United States)

    Park, S. K.; Roberts, J. J.

    2003-12-01

    Laboratory measurements of samples of sedimentary rocks from the Parkfield syncline reveal resistivities as low as 1 ohm m when saturated with fluids comparable to those found in nearby wells. The syncline lies on the North American side of the San Andreas fault at Parkfield and plunges northwestward into the fault zone. A previous interpretation of a high resolution magnetotelluric profile across the San Andreas fault at Parkfield identified an anomalously conductive (1-3 ohm m) region just west of the fault and extending to depths of 3 km. These low resistivity rocks were inferred to be crushed rock in the fault zone that was saturated with brines. As an alternative to this interpretation, we suggest that this anomalous region is actually the Parkfield syncline and that the current trace of the San Andreas fault at Middle Mountain does not form the boundary between the Salinian block and the North American plate. Instead, that boundary is approximately 1 km west and collocated with current seismicity. This work was performed under the auspices of the U.S. Department of Energy by the University of California Lawrence Livermore National Laboratory under contract W-7405-ENG-48 and supported specifically by the Office of Basic Energy Science. Additional support was provided by the U.S. Geological Survey (USGS), Department of the Interior, under USGS Award number 03HQGR0041. The views and conclusions contained in this document are those of the authors and should not be interpreted as necessarily representing the official policies, either expressed or implied, of the U.S. Government.

  10. Correlation of pre-earthquake electromagnetic signals with laboratory and field rock experiments

    Directory of Open Access Journals (Sweden)

    T. Bleier

    2010-09-01

    rock stressing results and the 30 October 2007 M5.4 Alum Rock earthquake field data.

    The second part of this paper examined other California earthquakes, prior to the Alum Rock earthquake, to see if magnetic pulsations were also present prior to those events. A search for field examples of medium earthquakes was performed to identify earthquakes where functioning magnetometers were present within 20 km, the expected detection range of the magnetometers. Two earthquakes identified in the search were the 12 August 1998 M5.1 San Juan Bautista (Hollister), CA, earthquake and the 28 September 2004 M6.0 Parkfield, CA, earthquake. Both of these data sets were recorded using EMI Corp. Model BF4 induction magnetometers, installed in equipment owned and operated by UC Berkeley. Unfortunately, no air conductivity or IR data were available for these earthquake examples. This new analysis of old data used the raw time series data (40 samples per s) and examined the data for short-duration pulsations that exceeded the normal background noise levels at each site, similar to the technique used at Alum Rock. Analysis of the Hollister magnetometer, positioned 2 km from the epicenter, showed a significant increase in magnetic pulsations above quiescent threshold levels several weeks prior, and especially 2 days prior, to the quake. The pattern of positive and negative pulsations observed at Hollister was similar, but not identical, to Alum Rock in that the pulsations were interspersed with Pc 1 pulsation trains and did not start 2 weeks prior to the quake, but rather 2 days prior. The Parkfield data (magnetometer positioned 19 km from the epicenter) showed much smaller pre-earthquake pulsations, but the area had significantly higher conductivity (which attenuates the signals). More interesting was the fact that significant pulsations occurred between the aftershock sequences of quakes as the crustal stress patterns were migrating.

    Comparing

  11. Detection of aseismic creep along the San Andreas fault near Parkfield, California with ERS-1 radar interferometry

    Science.gov (United States)

    Werner, Charles L.; Rosen, Paul; Hensley, Scott; Fielding, Eric; Buckley, Sean

    1997-01-01

    The differential interferometric analysis of ERS data from Parkfield (CA) observations revealed the wide-area distribution of creep along the moving fault segment of the San Andreas fault over a 15-month interval. The removal of the interferometric phase related to the surface topography was carried out. The fault was clearly visible in the differential interferogram. The magnitude of the tropospheric water vapor phase distortions is greater than the signal and hinders quantitative analysis beyond order-of-magnitude calculations.
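
    The basic relation behind a creep map like this converts unwrapped differential phase to line-of-sight displacement, d = φ·λ/(4π). The sketch below assumes the ERS C-band wavelength and ignores the sign convention, which differs between processors.

        import numpy as np

        wavelength = 0.0566                                        # m, ERS-1/2 C-band
        phase = np.deg2rad(np.array([30.0, 90.0, 180.0, 360.0]))   # unwrapped phase, rad
        los = phase * wavelength / (4.0 * np.pi)                   # m of line-of-sight motion
        print(np.round(los * 1000.0, 2), "mm")                     # one full fringe = ~28.3 mm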

  12. The frequency dependence of friction in experiment, theory, and observations of low frequency earthquakes

    Science.gov (United States)

    Thomas, A.; Beeler, N. M.; Burgmann, R.; Shelly, D. R.

    2011-12-01

    Low frequency earthquakes (LFEs) are small amplitude, short duration events composing tectonic tremor, probably generated by shear slip on asperities downdip of the seismogenic zone. In Parkfield, Shelly and Hardebeck [2010] have identified 88 LFE families, or hypocentral locations, that contain over half a million LFEs since 2001 on a 160-km-long section of the San Andreas fault between 16 and 30 km depth. A number of studies have demonstrated the extreme sensitivity of low frequency earthquakes (LFEs) near Parkfield to stress changes, with the response contingent upon the amplitude and frequency content of the applied stress. We attempt to test this framework by comparing observations of LFEs triggered in response to stresses spanning several orders of magnitude in both frequency and amplitude (e.g. tides, teleseismic surface waves, static stress changes, etc.) to the predicted response of a single degree of freedom slider block model with rate and state dependent strength. The sensitivity of failure time in the friction model as developed in previous studies does not distinguish between shear and normal stresses; laboratory experiments show a more complicated sensitivity of failure time to normal stress change than in the published model. Because the shear and normal tidal stresses at Parkfield have different amplitudes and are not in phase, we have modified the model to include the expected sensitivity to normal stress. Our prior investigations of the response of both regular and low frequency earthquakes to tidal stresses [Thomas et al., 2009; Shelly and Johnson, 2011] are qualitatively consistent with the predictions of the friction model, as both the timing and degree (probability) of correlation are in agreement.
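
    A minimal quasi-static spring-slider with rate- and state-dependent (aging-law) friction, the class of single-degree-of-freedom model discussed above. The parameter values are illustrative only (velocity-strengthening, so the transient decays stably toward the load-point rate), not values inferred for the LFE sources.

        import numpy as np
        from scipy.integrate import solve_ivp

        sigma, a, b, dc = 10.0e6, 0.012, 0.010, 1.0e-4   # Pa, -, -, m (illustrative)
        mu0, v0 = 0.6, 1.0e-6                            # reference friction and slip rate
        k, v_load = 1.0e7, 1.0e-9                        # spring stiffness (Pa/m), load rate (m/s)

        def rhs(t, y):
            delta, theta = y
            tau = sigma * mu0 + k * (v_load * t - delta)                    # spring stress
            v = v0 * np.exp((tau - sigma * (mu0 + b * np.log(v0 * theta / dc))) / (sigma * a))
            return [v, 1.0 - v * theta / dc]                                # slip rate; aging law

        sol = solve_ivp(rhs, (0.0, 1.0e8), [0.0, dc / v0], max_step=1.0e4, rtol=1.0e-8)
        print(f"final slip rate ~ {rhs(sol.t[-1], sol.y[:, -1])[0]:.2e} m/s (decays toward v_load)")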

  13. Time-lapse imaging of fault properties at seismogenic depth using repeating earthquakes, active sources and seismic ambient noise

    Science.gov (United States)

    Cheng, Xin

    2009-12-01

    The time-varying stress field of fault systems at seismogenic depths plays the most important role in controlling the sequencing and nucleation of seismic events. Using seismic observations from repeating earthquakes, controlled active sources and seismic ambient noise, five studies at four different fault systems across North America, Central Japan, and North and mid-West China are presented to describe our efforts to measure such time-dependent structural properties. Repeating and similar earthquakes are identified and analyzed to study the post-seismic fault relaxation at the aftershock zones of the 1984 M 6.8 western Nagano and the 1976 M 7.8 Tangshan earthquakes. The lack of observed repeating earthquakes at western Nagano is attributed to the absence of a well developed weak fault zone, suggesting that the fault damage zone has been almost completely healed. In contrast, the high percentage of similar and repeating events found at Tangshan suggests the existence of mature fault zones characterized by stable creep under steady tectonic loading. At the Parkfield region of the San Andreas Fault, repeating earthquake clusters and chemical explosions are used to construct a scatterer migration image based on the observation of systematic temporal variations in the seismic waveforms across the occurrence time of the 2004 M 6 Parkfield earthquake. Coseismic fluid charge or discharge in fractures caused by the Parkfield earthquake is used to explain the observed change in seismic scattering properties at depth. In the same region, a controlled source cross-well experiment conducted at the SAFOD pilot and main holes documents two large excursions in the travel time required for a shear wave to travel through the rock along a fixed pathway shortly before two rupture events, suggesting that they may be related to pre-rupture stress induced changes in crack properties. In central China, a tomographic inversion based on the theory of seismic ambient noise and coda wave interferometry
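
    A minimal sketch of the waveform-similarity measure that underlies identifying repeating and similar events: the peak normalized cross-correlation between a master trace and a candidate (both synthetic here). The 0.9 threshold in the comment is a commonly used illustrative criterion, not one quoted by the study.

        import numpy as np

        def peak_normalized_cc(x, y):
            # peak of the normalized cross-correlation over all lags
            x = (x - x.mean()) / (x.std() * x.size)
            y = (y - y.mean()) / y.std()
            return float(np.correlate(x, y, mode="full").max())

        rng = np.random.default_rng(2)
        t = np.linspace(0.0, 2.0, 400)
        master = np.exp(-3.0 * t) * np.sin(2.0 * np.pi * 10.0 * t)
        candidate = np.roll(master, 7) + 0.05 * rng.normal(size=t.size)   # shifted, noisy copy
        print(f"peak cc = {peak_normalized_cc(master, candidate):.2f}")   # e.g., >= 0.9 -> 'similar'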

  14. Simulations of tremor-related creep reveal a weak crustal root of the San Andreas Fault

    Science.gov (United States)

    Shelly, David R.; Bradley, Andrew M.; Johnson, Kaj M.

    2013-01-01

    Deep aseismic roots of faults play a critical role in transferring tectonic loads to shallower, brittle crustal faults that rupture in large earthquakes. Yet, until the recent discovery of deep tremor and creep, direct inference of the physical properties of lower-crustal fault roots has remained elusive. Observations of tremor near Parkfield, CA provide the first evidence for present-day localized slip on the deep extension of the San Andreas Fault and triggered transient creep events. We develop numerical simulations of fault slip to show that the spatiotemporal evolution of triggered tremor near Parkfield is consistent with triggered fault creep governed by laboratory-derived friction laws between depths of 20–35 km on the fault. Simulated creep and observed tremor northwest of Parkfield nearly ceased for 20–30 days in response to small coseismic stress changes of order 10^4 Pa from the 2003 M6.5 San Simeon Earthquake. Simulated afterslip and observed tremor following the 2004 M6.0 Parkfield earthquake show a coseismically induced pulse of rapid creep and tremor lasting for 1 day followed by a longer 30 day period of sustained accelerated rates due to propagation of shallow afterslip into the lower crust. These creep responses require very low effective normal stress of ~1 MPa on the deep San Andreas Fault and near-neutral-stability frictional properties expected for gabbroic lower-crustal rock.

  15. Joint inversion for Vp, Vs, and Vp/Vs at SAFOD, Parkfield, California

    Science.gov (United States)

    Zhang, H.; Thurber, C.; Bedrosian, P.

    2009-01-01

    We refined the three-dimensional (3-D) Vp, Vs and Vp/Vs models around the San Andreas Fault Observatory at Depth (SAFOD) site using a new double-difference (DD) seismic tomography code (tomoDDPS) that simultaneously solves for earthquake locations and all three velocity models using both absolute and differential P, S, and S-P times. This new method is able to provide a more robust Vp/Vs model than that from the original DD tomography code (tomoDD), obtained simply by dividing Vp by Vs. For the new inversion, waveform cross-correlation times for earthquakes from 2001 to 2002 were also used, in addition to arrival times from earthquakes and explosions in the region. The Vp values extracted from the model along the SAFOD trajectory match well with the borehole log data, providing in situ confirmation of our results. Similar to previous tomographic studies, the 3-D structure around Parkfield is dominated by the velocity contrast across the San Andreas Fault (SAF). In both the Vp and Vs models, there is a clear low-velocity zone as deep as 7 km along the SAF trace, compatible with the findings from fault zone guided waves. There is a high Vp/Vs anomaly zone on the southwest side of the SAF trace that is about 1-2 km wide and extends as deep as 4 km, which is interpreted to be due to fluids and fractures in the package of sedimentary rocks abutting the Salinian basement rock to the southwest. The relocated earthquakes align beneath the northeast edge of this high Vp/Vs zone. We carried out a 2-D correlation analysis for an existing resistivity model and the corresponding profiles through our model, yielding a classification that distinguishes several major lithologies. © 2009 by the American Geophysical Union.
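
    The joint inversion above solves for Vp/Vs on a 3-D grid; for context, the classical single-event estimate of the same ratio is a Wadati plot, in which the slope of S-P time against P travel time equals Vp/Vs − 1. The picks below are synthetic.

        import numpy as np

        rng = np.random.default_rng(3)
        tp = np.array([1.2, 2.0, 3.1, 4.4, 5.6])                  # P travel times, s
        ts = 1.73 * tp + rng.normal(0.0, 0.02, tp.size)           # S travel times, s (noisy)
        slope = np.polyfit(tp, ts - tp, 1)[0]                     # d(S-P)/dP
        print(f"Vp/Vs ~ {1.0 + slope:.2f}")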

  16. Foreshocks and aftershocks of the Great 1857 California earthquake

    Science.gov (United States)

    Meltzner, A.J.; Wald, D.J.

    1999-01-01

    The San Andreas fault is the longest fault in California and one of the longest strike-slip faults anywhere in the world, yet we know little about many aspects of its behavior before, during, and after large earthquakes. We conducted a study to locate and to estimate magnitudes for the largest foreshocks and aftershocks of the 1857 M 7.9 Fort Tejon earthquake on the central and southern segments of the fault. We began by searching archived first-hand accounts from 1857 through 1862, by grouping felt reports temporally, and by assigning modified Mercalli intensities to each site. We then used a modified form of the grid-search algorithm of Bakun and Wentworth, derived from empirical analysis of modern earthquakes, to find the location and magnitude most consistent with the assigned intensities for each of the largest events. The result confirms a conclusion of Sieh that at least two foreshocks ('dawn' and 'sunrise') located on or near the Parkfield segment of the San Andreas fault preceded the mainshock. We estimate their magnitudes to be M ~ 6.1 and M ~ 5.6, respectively. The aftershock rate was below average but within one standard deviation of the number of aftershocks expected based on statistics of modern southern California mainshock-aftershock sequences. The aftershocks included two significant events during the first eight days of the sequence, with magnitudes M ~ 6.25 and M ~ 6.7, near the southern half of the rupture; later aftershocks included a M ~ 6 event near San Bernardino in December 1858 and a M ~ 6.3 event near the Parkfield segment in April 1860. From earthquake logs at Fort Tejon, we conclude that the aftershock sequence lasted a minimum of 3.75 years.

  17. Three-component ambient noise beamforming in the Parkfield area

    Science.gov (United States)

    Löer, Katrin; Riahi, Nima; Saenger, Erik H.

    2018-06-01

    We apply a three-component beamforming algorithm to an ambient noise data set recorded at a seismic array to extract information about both isotropic and anisotropic surface wave velocities. In particular, we test the sensitivity of the method with respect to the array geometry as well as to seasonal variations in the distribution of noise sources. In the earth's crust, anisotropy is typically caused by oriented faults or fractures and can be altered when earthquakes or human activities cause these structures to change. Monitoring anisotropy changes thus provides time-dependent information on subsurface processes, provided they can be distinguished from other effects. We analyse ambient noise data at frequencies between 0.08 and 0.52 Hz recorded at a three-component array in the Parkfield area, California (US), between 2001 November and 2002 April. During this time, no major earthquakes were identified in the area and structural changes are thus not expected. We compute dispersion curves of Love and Rayleigh waves and estimate anisotropy parameters for Love waves. For Rayleigh waves, the azimuthal source coverage is too limited to perform anisotropy analysis. For Love waves, ambient noise sources are more widely distributed and we observe significant and stable surface wave anisotropy for frequencies between 0.2 and 0.4 Hz. Synthetic data experiments indicate that the array geometry introduces apparent anisotropy, especially when waves from multiple sources arrive simultaneously at the array. Both the magnitude and the pattern of apparent anisotropy, however, differ significantly from the anisotropy observed in Love wave data. Temporal variations of anisotropy parameters observed at frequencies below 0.2 Hz and above 0.4 Hz correlate with changes in the source distribution. Frequencies between 0.2 and 0.4 Hz, however, are less affected by these variations and provide relatively stable results over the period of study.

  18. Impact of a Large San Andreas Fault Earthquake on Tall Buildings in Southern California

    Science.gov (United States)

    Krishnan, S.; Ji, C.; Komatitsch, D.; Tromp, J.

    2004-12-01

    In 1857, an earthquake of magnitude 7.9 occurred on the San Andreas fault, starting at Parkfield and rupturing in a southeasterly direction for more than 300 km. Such a unilateral rupture produces significant directivity toward the San Fernando and Los Angeles basins. The strong shaking in the basins due to this earthquake would have had a significant long-period content (2–8 s). If such motions were to happen today, they could have a serious impact on tall buildings in Southern California. In order to study the effects of large San Andreas fault earthquakes on tall buildings in Southern California, we use the finite source of the magnitude 7.9 2001 Denali fault earthquake in Alaska and map it onto the San Andreas fault with the rupture originating at Parkfield and proceeding southward over a distance of 290 km. Using the SPECFEM3D spectral element seismic wave propagation code, we simulate a Denali-like earthquake on the San Andreas fault and compute ground motions at sites located on a grid with a 2.5–5.0 km spacing in the greater Southern California region. We subsequently analyze 3D structural models of an existing tall steel building designed in 1984 as well as one designed according to the current building code (Uniform Building Code, 1997) subjected to the computed ground motion. We use a sophisticated nonlinear building analysis program, FRAME3D, that has the ability to simulate damage in buildings due to three-component ground motion. We summarize the performance of these structural models on contour maps of carefully selected structural performance indices. This study could benefit the city in laying out emergency response strategies in the event of an earthquake on the San Andreas fault, in undertaking appropriate retrofit measures for tall buildings, and in formulating zoning regulations for new construction. In addition, the study would provide risk data associated with existing and new construction to insurance companies, real estate developers, and

  19. Along-strike variations in fault frictional properties along the San Andreas Fault near Cholame, California from joint earthquake and low-frequency earthquake relocations

    Science.gov (United States)

    Harrington, Rebecca M.; Cochran, Elizabeth S.; Griffiths, Emily M.; Zeng, Xiangfang; Thurber, Clifford H.

    2016-01-01

    Recent observations of low‐frequency earthquakes (LFEs) and tectonic tremor along the Parkfield–Cholame segment of the San Andreas fault suggest slow‐slip earthquakes occur in a transition zone between the shallow fault, which accommodates slip by a combination of aseismic creep and earthquakes, and the deep fault, which accommodates slip by stable sliding (>35 km depth). However, the spatial relationship between shallow earthquakes and LFEs remains unclear. Here, we present precise relocations of 34 earthquakes and 34 LFEs recorded during a temporary deployment of 13 broadband seismic stations from May 2010 to July 2011. We use the temporary array waveform data, along with data from permanent seismic stations and a new high‐resolution 3D velocity model, to illuminate the fine‐scale details of the seismicity distribution near Cholame and the relation to the distribution of LFEs. The depth of the boundary between earthquake and LFE hypocenters changes along strike and roughly follows the 350°C isotherm, suggesting frictional behavior may be, in part, thermally controlled. We observe no overlap in the depth of earthquakes and LFEs, with an ∼5 km separation between the deepest earthquakes and the shallowest LFEs. In addition, clustering in the relocated seismicity near the 2004 Mw 6.0 Parkfield earthquake hypocenter and near the northern boundary of the 1857 Mw 7.8 Fort Tejon rupture may highlight areas of frictional heterogeneities on the fault where earthquakes tend to nucleate.

  20. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

    Science.gov (United States)

    Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

    2012-01-01

    The behavior of individual events in repeating earthquake sequences in California, Taiwan and Japan is better predicted by a model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript that shows similar results for laboratory experiments, we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, but its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the time and slip of these events are predicted quite well by fixed slip and fixed recurrence models, so in some sense they are time- and slip-predictable. While fixed recurrence and slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.

  1. Improving attenuation tomography by novel inversions for t* and Q: application to Parkfield, California and Okmok volcano, Alaska

    Science.gov (United States)

    Pesicek, J. D.; Bennington, N. L.; Thurber, C. H.; Zhang, H.

    2011-12-01

    Standard methods for mapping variations in seismic attenuation (Q) structure using local earthquake data involve a two-step process. First, values of the whole path attenuation operator t* are determined from earthquake data recorded on a seismic array by inverting observed spectra for source and attenuation parameters. Then, these t* data are used to invert tomographically for frequency-independent Q models. The observed earthquake amplitude spectra depend on both source parameters and site effects. However, quantification of site effects is often neglected. Bennington et al. [2008] determined site response jointly with source parameters for small groups of events and then computed each station's site response as the average over all groups. Building on this work, we have adapted the method to model all events simultaneously to more accurately determine site response from the earthquake spectra. This in turn allows us to more accurately determine t*. However, resolution analysis of the results shows that some parameters are not well resolved in the joint inversion. Thus, an alternating inversion scheme is tested and adapted to alleviate poor resolution and parameter trade-offs. The new scheme produces better fits to the earthquake spectra than previous methods, and the resulting t* data should allow for more accurate determination of the Q structure. Because the equation relating t* to Q is nonlinear, the typical approach to determining Q is to solve for changes to a starting model iteratively, similar to methods commonly used in travel time tomography. However, if we instead solve for the inverse of Q, the equation becomes linear and the solution no longer depends on the starting model. This simple modification may allow us to more accurately determine Q. We test these new t* and Q methods using earthquake data from Parkfield, California and Okmok volcano, Alaska. We present the results for real and synthetic data and compare and contrast these results to more
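
    The linearization mentioned at the end of the abstract follows because the whole-path operator is t* = Σ_i Δt_i / Q_i, so with model parameter q_i = 1/Q_i the forward problem becomes the linear system t* = G·q. The block geometry and Q values below are invented purely to show the algebra, not taken from either data set.

        import numpy as np

        # rows = rays, columns = time each ray spends in each of three blocks (s)
        G = np.array([[0.8, 1.1, 0.4],
                      [0.2, 1.6, 0.9],
                      [1.0, 0.3, 1.2]])
        q_true = 1.0 / np.array([150.0, 400.0, 80.0])     # true 1/Q per block
        t_star = G @ q_true                               # synthetic t* observations
        q_est, *_ = np.linalg.lstsq(G, t_star, rcond=None)
        print("recovered Q per block:", np.round(1.0 / q_est, 1))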

  2. Constraining earthquake source inversions with GPS data: 1. Resolution-based removal of artifacts

    Science.gov (United States)

    Page, M.T.; Custodio, S.; Archuleta, R.J.; Carlson, J.M.

    2009-01-01

    We present a resolution analysis of an inversion of GPS data from the 2004 Mw 6.0 Parkfield earthquake. This earthquake was recorded at thirteen 1-Hz GPS receivers, which provides for a truly coseismic data set that can be used to infer the static slip field. We find that the resolution of our inverted slip model is poor at depth and near the edges of the modeled fault plane that are far from GPS receivers. The spatial heterogeneity of the model resolution in the static field inversion leads to artifacts in poorly resolved areas of the fault plane. These artifacts look qualitatively similar to asperities commonly seen in the final slip models of earthquake source inversions, but in this inversion they are caused by a surplus of free parameters. The location of the artifacts depends on the station geometry and the assumed velocity structure. We demonstrate that a nonuniform gridding of model parameters on the fault can remove these artifacts from the inversion. We generate a nonuniform grid with a grid spacing that matches the local resolution length on the fault and show that it outperforms uniform grids, which either generate spurious structure in poorly resolved regions or lose recoverable information in well-resolved areas of the fault. In a synthetic test, the nonuniform grid correctly averages slip in poorly resolved areas of the fault while recovering small-scale structure near the surface. Finally, we present an inversion of the Parkfield GPS data set on the nonuniform grid and analyze the errors in the final model. Copyright 2009 by the American Geophysical Union.

  3. Constraints on Dynamic Triggering from Very Short-Term Microearthquake Aftershocks at Parkfield

    Science.gov (United States)

    Ampuero, J.; Rubin, A.

    2004-12-01

    The study of microearthquakes helps bridge the gap between laboratory experiments and data from large earthquakes, the two disparate scales that have contributed so far to our understanding of earthquake physics. Although they are frequent, microearthquakes are difficult to analyse. Applying high precision relocation techniques, Rubin and Gillard (2000) observed a pronounced asymmetry in the spatial distribution of the earliest and nearest aftershocks of microearthquakes along the San Andreas fault (they occur more often to the NW of the mainshock). It was suggested that this could be related to the velocity contrast across the fault. Preferred directivity of dynamic rupture pulses running along a bimaterial interface (to the SE in the case of the SAF) is expected on theoretical grounds. Our numerical simulations of crack-like rupture on such interfaces show a pronounced asymmetry of the stress histories beyond the rupture ends, and suggest two possible mechanisms for the observed asymmetry: First, that it results from an asymmetry in the static stress field following arrest of the mainshock (closer to failure to the NW), or second, that it is due to a short-duration tensile pulse that propagates to the SE, which could reduce the number of aftershocks to the SE by dynamic triggering of any nucleation site close enough to failure to have otherwise produced an aftershock. To distinguish between these mechanisms we need observations of dynamic triggering in microseismicity. For small events triggered at a distance of some mainshock radii, triggering time scales are so short that seismograms of both events overlap. To detect the occurrence of compound events and very short term aftershocks in the HRSN Parkfield archived waveforms we have developed an automated search algorithm based on empirical Green's function (EGF) deconvolution. Optimal EGFs are first selected by the coherency of the cross-component convolution with respect to the target event. Then Landweber
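
    A minimal frequency-domain (water-level) deconvolution sketch, the basic operation behind using a small event as an empirical Green's function to expose a compound target event. The real search described above adds EGF selection by cross-component coherency and an iterative (Landweber-type) deconvolution, which are not reproduced here; all signals below are synthetic.

        import numpy as np

        def waterlevel_deconvolve(target, egf, wl=0.01):
            # divide spectra with a water level to stabilize notches in the EGF spectrum
            n = target.size
            T, E = np.fft.rfft(target, n), np.fft.rfft(egf, n)
            denom = np.maximum(np.abs(E) ** 2, wl * np.max(np.abs(E) ** 2))
            return np.fft.irfft(T * np.conj(E) / denom, n)

        rng = np.random.default_rng(4)
        egf = rng.normal(size=256) * np.exp(-np.arange(256) / 40.0)      # synthetic small-event record
        stf = np.zeros(256); stf[10], stf[60] = 1.0, 0.6                 # two sub-events (compound source)
        target = np.convolve(stf, egf)[:256]                             # synthetic compound-event record
        relative_stf = waterlevel_deconvolve(target, egf)                # peaks near samples 10 and 60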

  4. Generalized Free-Surface Effect and Random Vibration Theory: a new tool for computing moment magnitudes of small earthquakes using borehole data

    Science.gov (United States)

    Malagnini, Luca; Dreger, Douglas S.

    2016-07-01

    Although optimal, computing the moment tensor solution is not always a viable option for the calculation of the size of an earthquake, especially for small events (say, below Mw 2.0). Here we show an alternative approach to the calculation of the moment-rate spectra of small earthquakes, and thus of their scalar moments, that uses a network-based calibration of crustal wave propagation. The method works best when applied to a relatively small crustal volume containing both the seismic sources and the recording sites. In this study we present the calibration of the crustal volume monitored by the High-Resolution Seismic Network (HRSN), along the San Andreas Fault (SAF) at Parkfield. After the quantification of the attenuation parameters within the crustal volume under investigation, we proceed to the spectral correction of the observed Fourier amplitude spectra for the 100 largest events in our data set. Multiple estimates of seismic moment for all events (1811 events total) are obtained by calculating the ratio of rms-averaged spectral quantities based on the peak values of the ground velocity in the time domain, as they are observed in narrowband-filtered time-series. The mathematical operations allowing the described spectral ratios are obtained from Random Vibration Theory (RVT). Due to the optimal conditions of the HRSN, in terms of signal-to-noise ratios, our network-based calibration allows the accurate calculation of seismic moments down to Mw < 0. However, because the HRSN is equipped only with borehole instruments, we define a frequency-dependent Generalized Free-Surface Effect (GFSE), to be used instead of the usual free-surface constant F = 2. Our spectral corrections at Parkfield need a different GFSE for each side of the SAF, which can be quantified by means of the analysis of synthetic seismograms. The importance of the GFSE of borehole instruments increases with decreasing earthquake size because for smaller earthquakes the bandwidth available

  5. Seismic trapped modes in the Oroville and San Andreas fault zones.

    Science.gov (United States)

    Li, Y G; Leary, P; Aki, K; Malin, P

    1990-08-17

    Three-component borehole seismic profiling of the recently active Oroville, California, normal fault and microearthquake event recording with a near-fault three-component borehole seismometer on the San Andreas fault at Parkfield, California, have shown numerous instances of pronounced dispersive wave trains following the shear wave arrivals. These wave trains are interpreted as fault zone-trapped seismic modes. Parkfield earthquakes exciting trapped modes have been located as deep as 10 kilometers, as shallow as 4 kilometers, and extend 12 kilometers along the fault on either side of the recording station. Selected Oroville and Parkfield wave forms are modeled as the fundamental and first higher trapped SH modes of a narrow low-velocity layer at the fault. Modeling results suggest that the Oroville fault zone is 18 meters wide at depth and has a shear wave velocity of 1 kilometer per second, whereas at Parkfield, the fault gouge is 100 to 150 meters wide and has a shear wave velocity of 1.1 to 1.8 kilometers per second. These low-velocity layers are probably the rupture planes on which earthquakes occur.

  6. Multistation magnetotellurics. Final report, 1 January 1996--30 June 1997

    Energy Technology Data Exchange (ETDEWEB)

    Egbert, G.D.

    1997-12-31

    The author has developed the foundations of a practical multivariate approach to processing magnetotelluric array data. Compared to current standards for magnetotelluric data processing, the multivariate approach is unique in that all available data channels are used simultaneously. The approach is outlined in this report. Using Multmtrn, a program for multiple station analysis of magnetotelluric data, the author achieved significant improvements in apparent resistivity and phase estimates in initial tests. Examples of the use of this approach are given including: Carrizo Plain and Parkfield electromagnetic profiling data; sea floor magnetotelluric (MT) data from the Gulf of Mexico; MT survey in a culturally noisy area of Bavaria; and Parkfield/Hollister earthquake monitoring array data. Experience with these projects has resulted in an improved program. The new version of the code is available at http://www.cg.NRCan.gc.ca/mtnet/mtnet.html or by contacting egbert{at}oce.orst.edu. Appendices of this report present documentation for Multmtrn.

  7. Effects of methods of attenuation correction on source parameter determination

    Science.gov (United States)

    Sonley, Eleanor; Abercrombie, Rachel E.

    We quantify the effects of using different approaches to model individual earthquake spectra. Applying different approaches can introduce significant variability in the calculated source parameters, even when applied to the same data. To compare large and small earthquake source parameters, the results of multiple studies need to be combined to extend the magnitude range, but the variability introduced by the different approaches hampers the outcome. When studies are combined, there is large uncertainty and large scatter and some systematic differences have been neglected. We model individual earthquake spectra from repeating earthquakes (M~2) at Parkfield, CA, recorded by a borehole network. We focus on the effects of trade-offs between attenuation (Q) and corner frequency in spectral fitting and the effect of the model shape at the corner frequency on radiated energy. The trade-off between attenuation and corner frequency can increase radiated energy by up to 400% and seismic moment by up to 100%.
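
    A sketch of the trade-off discussed above, using an omega-square source model with whole-path attenuation, A(f) = Ω0·exp(−π f t*)/(1 + (f/fc)^2): over a limited observation band, a higher corner frequency paired with stronger attenuation mimics a lower corner with weaker attenuation. All parameter values are illustrative.

        import numpy as np

        def spectrum(f, omega0, fc, t_star):
            return omega0 * np.exp(-np.pi * f * t_star) / (1.0 + (f / fc) ** 2)

        f = np.logspace(0.0, np.log10(30.0), 200)          # 1-30 Hz observation band
        a = spectrum(f, 1.0, fc=20.0, t_star=0.010)        # lower corner, weaker attenuation
        b = spectrum(f, 1.0, fc=35.0, t_star=0.016)        # higher corner, stronger attenuation
        print(f"max |log10 ratio| over the band: {np.max(np.abs(np.log10(a / b))):.3f}")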

  8. Relating seismicity to the velocity structure of the San Andreas Fault near Parkfield, CA

    Science.gov (United States)

    Lippoldt, Rachel; Porritt, Robert W.; Sammis, Charles G.

    2017-06-01

    The central section of the San Andreas Fault (SAF) displays a range of seismic phenomena including normal earthquakes, low-frequency earthquakes (LFE), repeating microearthquakes (REQ) and aseismic creep. Although many lines of evidence suggest that LFEs are tied to the presence of fluids, their geological setting is still poorly understood. Here, we map the seismic velocity structures associated with LFEs beneath the central SAF using surface wave tomography from ambient seismic noise to provide constraints on the physical conditions that control LFE occurrence. Fault perpendicular sections show that the SAF, as revealed by lateral contrasts in relative velocities, is contiguous to depths of 50 km and appears to be relatively localized at depths between about 15 and 30 km. This is consistent with the hypothesis that LFEs are shear-slip events on a deep extension of the SAF. We find that along strike variations in seismic behaviour correspond to changes in the seismic structure, which support proposed connections between fluids and seismicity. LFEs and REQs occur within low-velocity structures, suggesting that the presence of fluids, weaker minerals, or hydrous phase minerals may play an important role in the generation of slow-slip phenomena.

  9. A 30-year history of earthquake crisis communication in California and lessons for the future

    Science.gov (United States)

    Jones, L.

    2015-12-01

    The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985 about the probability (approximately 5%) that a M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the next 30 years, publication of aftershock advisories has become routine and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. The time to create and release a statement became shorter with experience from the first public advisory (to the 1988 Lake Elsman earthquake) that was released 18 hours after the triggering event, but was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (National Earthquake Prediction Evaluation Council) so the statements can be sent to the public automatically. This talk will review the advisories, the variations in wording and the public response and compare this with social science research about successful crisis communication, to create recommendations for future advisories.

  10. Napa Earthquake impact on water systems

    Science.gov (United States)

    Wang, J.

    2014-12-01

    The South Napa earthquake occurred in Napa, California, on August 24 at 3 a.m. local time, with a magnitude of 6.0. It was the largest earthquake in the San Francisco Bay Area since the 1989 Loma Prieta earthquake, and economic losses topped $1 billion. Wine makers cleaned up and estimated the damage to tourism; around 15,000 cases of cabernet spilled into the garden at the Hess Collection. Earthquakes can raise water pollution risks and could trigger a water crisis. California has suffered water shortages in recent years, so understanding how to prevent groundwater and surface-water pollution caused by earthquakes is valuable. This research gives a clear view of the drinking water system in California and pollution of river systems, as well as an estimate of earthquake impacts on water supply. The Sacramento-San Joaquin River Delta (close to Napa) is the center of the state's water distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and salt water intrusion has reduced fresh water outflows. Strong shaking from a nearby earthquake can cause liquefaction of saturated, loose, sandy soils and could potentially damage major Delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: a similar event could potentially damage the freshwater supply system.

  11. Tidal Sensitivity of Declustered Low Frequency Earthquake Families and Inferred Creep Episodes on the San Andreas Fault

    Science.gov (United States)

    Babb, A.; Thomas, A.; Bletery, Q.

    2017-12-01

    Low frequency earthquakes (LFEs) are detected at depths of 16-30 km on a 150 km section of the San Andreas Fault centered at Parkfield, CA. The LFEs are divided into 88 families based on waveform similarity. Each family is thought to represent a brittle asperity on the fault surface that repeatedly slips during aseismic slip of the surrounding fault. LFE occurrence is irregular, which allows families to be divided into continuous and episodic types. In continuous families a burst of a few LFE events recurs every few days, while episodic families experience essentially quiescent periods often lasting months followed by bursts of hundreds of events over a few days. The occurrence of LFEs has also been shown to be sensitive to extremely small (~1 kPa) tidal stress perturbations. However, the clustered nature of LFE occurrence could potentially bias estimates of tidal sensitivity. Here we re-evaluate the tidal sensitivity of LFE families on the deep San Andreas using a declustered catalog, in which LFE bursts are isolated based on the recurrence intervals between individual LFE events for each family. Preliminary analysis suggests that declustered LFE families are still highly sensitive to tidal stress perturbations, primarily right-lateral shear stress (RLSS) and to a lesser extent fault normal stress (FNS). We also find that inferred creep episodes initiate preferentially during times of positive RLSS.
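
    As a minimal sketch of the declustering idea described above (hypothetical event times and an arbitrary inter-event-time threshold, not the catalog or parameters used in this study), bursts can be isolated by splitting each family's event sequence wherever the recurrence interval exceeds a chosen gap:

      import numpy as np

      def decluster_family(event_times, max_gap_days=1.0):
          # Group LFE origin times (in days) into bursts separated by gaps longer than
          # max_gap_days, and keep one representative time per burst (its first event).
          t = np.sort(np.asarray(event_times, dtype=float))
          gaps = np.diff(t)
          burst_starts = np.concatenate(([0], np.where(gaps > max_gap_days)[0] + 1))
          return t[burst_starts]

      # Hypothetical family: two bursts of events minutes apart, separated by ~30 days
      times = np.concatenate([100.0 + np.arange(5) * 0.002, 130.0 + np.arange(8) * 0.003])
      print(decluster_family(times))   # -> [100. 130.]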

  12. Transient Aseismic Slip in the Cascadia Subduction Zone: From Monitoring to Useful Real-time Hazards Information

    Science.gov (United States)

    Roeloffs, E. A.; Beeler, N. M.

    2010-12-01

    The Cascadia subduction zone, extending from northern California to Vancouver Island, has a 10,000-year record of earthquakes > M8.5 at intervals of several hundred years, with the last major event (~M9) in 1700. Agencies in CA, OR, WA, and BC are raising public awareness of the hazards posed by a repeat Cascadia earthquake and its ensuing tsunami. Because most of the subduction interface is now seismically quiet, an interface event of M6 or larger would generate intense public concern that it could be a potential foreshock of a great earthquake. Cascadia residents are also interested in the episodic tremor and slip (ETS) events that recur months to years apart: strong evidence implies these aseismic events represent accelerated interface slip downdip of the seismogenic zone. Simple mechanics implies ETS events temporarily increase the stressing rate on the locked zone. ETS events in northern Cascadia recur at fairly regular intervals and produce roughly similar patterns of deformation. However, an unusually large ETS event or increased interface seismicity would certainly prompt public officials and local residents to expect scientists to quickly determine the implications for a major Cascadia earthquake. Earthquake scientists generally agree that such “situations of concern” warrant close monitoring, but attempts to quantify potential probability changes are in very early stages. With >30 borehole strainmeters and >100 GPS stations of the NSF-funded Plate Boundary Observatory (PBO) in Cascadia, geodesists must develop a well-organized real-time monitoring scheme for interpreting aseismic deformation, with an accompanying public communication strategy. Two previously-exercised monitoring and communication protocols could be adapted for Cascadia. During the Parkfield, California, Earthquake Experiment, geodetic signals were assigned alert levels based on their rareness in the past record, on confirmation by more than one instrument, and on consistency with

  13. Relaxation of the south flank after the 7.2-magnitude Kalapana earthquake, Kilauea Volcano, Hawaii

    Science.gov (United States)

    Dvorak, John J.; Klein, Fred W.; Swanson, Donald A.

    1994-01-01

    An M = 7.2 earthquake on 29 November 1975 caused the south flank of Kilauea Volcano, Hawaii, to move seaward several meters: a catastrophic release of compression of the south flank caused by earlier injections of magma into the adjacent segment of a rift zone. The focal mechanisms of the mainshock, the largest foreshock, and the largest aftershock suggest seaward movement of the upper block. The rate of aftershocks decreased in a familiar hyperbolic decay, reaching the pre-1975 rate of seismicity by the mid-1980s. Repeated rift-zone intrusions and eruptions after 1975, which occurred within 25 km of the summit area, compressed the adjacent portion of the south flank, apparently masking continued seaward displacement of the south flank. This is evident along a trilateration line that continued to extend, suggesting seaward displacement, immediately after the M = 7.2 earthquake, but then was compressed during a series of intrusions and eruptions that began in September 1977. Farther to the east, trilateration measurements show that the portion of the south flank above the aftershock zone, but beyond the area of compression caused by the rift-zone intrusions and eruptions, continued to move seaward at a decreasing rate until the mid-1980s, mimicking the decay in aftershock rate. Along the same portion of the south flank, the pattern of vertical surface displacements can be explained by continued seaward movement of the south flank and development of two eruptive fissures along the east rift zone, each of which extended from a depth of ∼3 km to the surface. The aftershock rate and continued seaward movement of the south flank are reminiscent of crustal response to other large earthquakes, such as the 1966 M = 6 Parkfield earthquake and the 1983 M = 6.5 Coalinga earthquake.

  14. A look inside the San Andreas Fault at Parkfield through vertical seismic profiling.

    Science.gov (United States)

    Chavarria, J Andres; Malin, Peter; Catchings, Rufus D; Shalev, Eylon

    2003-12-05

    The San Andreas Fault Observatory at Depth pilot hole is located on the southwestern side of the Parkfield San Andreas fault. This observatory includes a vertical seismic profiling (VSP) array. VSP seismograms from nearby microearthquakes contain signals between the P and S waves. These signals may be P and S waves scattered by the local geologic structure. The collected scattering points form planar surfaces that we interpret as the San Andreas fault and four other secondary faults. The scattering process includes conversions between P and S waves, the strengths of which suggest large contrasts in material properties, possibly indicating the presence of cracks or fluids.

  15. Locating Very-Low-Frequency Earthquakes in the San Andreas Fault.

    Science.gov (United States)

    Peña-Castro, A. F.; Harrington, R. M.; Cochran, E. S.

    2016-12-01

    The portion of a tectonic fault where rheological properties transition from brittle to ductile hosts a variety of seismic signals suggesting a range of slip velocities. In subduction zones, the two dominantly observed seismic signals are very-low-frequency earthquakes (VLFEs) and low-frequency earthquakes (LFEs) or tectonic tremor. Tremor and LFEs are also commonly observed on transform faults; however, VLFEs have been reported dominantly in subduction zone environments. Here we show some of the first known observations of VLFEs occurring on a plate boundary transform fault, the San Andreas Fault (SAF), along the Cholame-Parkfield segment in California. We detect VLFEs using both permanent and temporary stations in 2010-2011 within approximately 70 km of Cholame, California. We search continuous waveforms filtered from 0.02-0.05 Hz, and remove time windows containing teleseismic events and local earthquakes, as identified in the global Centroid Moment Tensor (CMT) and the Northern California Seismic Network (NCSN) catalogs. We estimate the VLFE locations by converting the signals into envelopes and cross-correlating them for phase-picking, similar to procedures used for locating tectonic tremor. We first perform epicentral location using a grid search method and then estimate a hypocenter location using Hypoinverse and a shear-wave velocity model when the epicenter is located close to the SAF trace. We account for the velocity contrast across the fault using separate 1D velocity models for stations on each side. Estimated hypocentral VLFE depths are similar to tremor catalog depths (~15-30 km). Only a few VLFEs produced robust hypocentral locations, presumably due to the difficulty in picking accurate phase arrivals with such a low-frequency signal. However, for events for which no location could be obtained, the moveout of phase arrivals across the stations was similar in character, suggesting that other observed VLFEs occurred in close proximity.
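
    A minimal sketch of the envelope-based picking step (synthetic waveforms and arbitrary parameters, not the authors' processing chain): band-pass filter two station records to the 0.02-0.05 Hz band, form envelopes with the Hilbert transform, and cross-correlate the envelopes to obtain a differential arrival time that can be used like a phase pick:

      import numpy as np
      from scipy.signal import butter, sosfiltfilt, hilbert, correlate

      fs = 20.0                                    # samples per second
      t = np.arange(0, 600, 1 / fs)                # 10-minute synthetic window
      pulse = np.exp(-((t - 300) / 20) ** 2) * np.sin(2 * np.pi * 0.03 * t)   # stand-in VLFE
      sta1 = pulse + 0.05 * np.random.randn(t.size)
      sta2 = np.roll(pulse, int(4 * fs)) + 0.05 * np.random.randn(t.size)     # arrives ~4 s later

      sos = butter(2, [0.02, 0.05], btype="band", fs=fs, output="sos")        # 0.02-0.05 Hz band
      env1 = np.abs(hilbert(sosfiltfilt(sos, sta1)))
      env2 = np.abs(hilbert(sosfiltfilt(sos, sta2)))

      # Lag of the envelope cross-correlation peak acts as a differential "phase pick"
      lag = (np.argmax(correlate(env2, env1, mode="full")) - (t.size - 1)) / fs
      print(f"estimated delay: {lag:.1f} s")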

  16. Investigation of ULF magnetic pulsations, air conductivity changes, and infra red signatures associated with the 30 October Alum Rock M5.4 earthquake

    Directory of Open Access Journals (Sweden)

    T. Bleier

    2009-04-01

    Several electromagnetic signal types were observed prior to and immediately after the 30 October 2007 (local time) M5.4 earthquake at Alum Rock, CA, with an epicenter ~15 km NE of San Jose, CA. The area where this event occurred had been monitored since November 2005 by a QuakeFinder magnetometer site, unit 609, 2 km from the epicenter. This instrument is one of 53 stations of the QuakeFinder (QF) California Magnetometer Network (CalMagNet). The station included an ultra-low-frequency (ULF) 3-axis induction magnetometer, a simple air conductivity sensor to measure relative airborne ion concentrations, and a geophone to identify the arrival of the P-wave from an earthquake. Similar in frequency content to the increased ULF activity reported two weeks prior to the Loma Prieta M7.0 quake in 1989 (Fraser-Smith, 1990, 1991), the QF station detected activity in the 0.01–12 Hz bands, but it consisted of an increasing number of short-duration (1 to 30 s) pulsations. The pulsations peaked around 13 days prior to the event. The amplitudes of the pulses were strong (3–20 nT) compared to the average ambient noise at the site (10–250 pT), which included a component arising from Bay Area Rapid Transit (BART) operations. The QF station also detected different pulse shapes, e.g. negative-only or positive-only polarity, with some pulses including a combination of positive and negative. Typical pulse counts over the previous year ranged from 0–15 per day, while the count rose to 176 (east-west channel) on 17 October, 13 days prior to the quake. The air conductivity sensor saturated for over 14 h during the night and morning prior to the quake, which occurred at 20:29 LT. Anomalous IR signatures were also observed in the general area, within 50 km of the epicenter, during the 2 weeks prior to the quake. These three simultaneous EM phenomena were compared with data collected over a 1–2-year period at the site. The data were also compared against accounts of air

  17. Characterizing the structural maturity of fault zones using high-resolution earthquake locations.

    Science.gov (United States)

    Perrin, C.; Waldhauser, F.; Scholz, C. H.

    2017-12-01

    We use high-resolution earthquake locations to characterize the three-dimensional structure of active faults in California and how it evolves with fault structural maturity. We investigate the distribution of aftershocks of several recent large earthquakes that occurred on immature faults (i.e., slow moving and small cumulative displacement), such as the 1992 (Mw7.3) Landers and 1999 (Mw7.1) Hector Mine events, and earthquakes that occurred on mature faults, such as the 1984 (Mw6.2) Morgan Hill and 2004 (Mw6.0) Parkfield events. Unlike previous studies, which typically estimated the width of fault zones from the distribution of earthquakes perpendicular to the surface fault trace, we resolve fault zone widths with respect to the 3D fault surface estimated from principal component analysis of local seismicity. We find that the zone of brittle deformation around the fault core is narrower along mature faults compared to immature faults. We observe a rapid fall-off of the number of events at a distance range of 70-100 m from the main fault surface of mature faults (140-200 m fault zone width), and 200-300 m from the fault surface of immature faults (400-600 m fault zone width). These observations are in good agreement with fault zone widths estimated from guided waves trapped in low-velocity damage zones. The total width of the active zone of deformation surrounding the main fault plane reaches 1.2 km and 2-4 km for mature and immature faults, respectively. The wider zone of deformation presumably reflects the increased heterogeneity in the stress field along the complex and discontinuous fault strands that make up immature faults. In contrast, narrower deformation zones tend to align with the well-defined fault planes of mature faults, where most of the deformation is concentrated. Our results are in line with previous studies suggesting that surface fault traces become smoother, and thus fault zones simpler, as cumulative fault slip increases.

  18. A modification scheme for seismic acceleration - time histories

    International Nuclear Information System (INIS)

    Bethell, J.

    1979-05-01

    A technique is described for the modification of recorded earthquake acceleration-time histories which gives reduced peak accelerations whilst leaving other significant characteristics unchanged. Such modifications are of use in constructing design basis acceleration-time histories such that all important parameters conform to a specified return period. The technique is applied to two recordings from the 1966 Parkfield earthquake, their peak accelerations being reduced in each case from about 40% g to 25% g. (author)
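
    The abstract does not reproduce the modification algorithm itself; purely to illustrate the stated goal (a lower peak acceleration with the rest of the record left essentially unchanged), a crude stand-in, which is not the report's technique, might simply limit the few samples that exceed a target peak:

      import numpy as np

      def clip_peaks(acc, target_peak):
          # Crude illustration only: hard-limit samples whose absolute amplitude exceeds
          # the target peak, leaving all other samples of the record unchanged.
          out = np.asarray(acc, dtype=float).copy()
          over = np.abs(out) > target_peak
          out[over] = np.sign(out[over]) * target_peak
          return out

      g = 9.81                                              # m/s^2
      rng = np.random.default_rng(0)
      record = rng.standard_normal(4000) * np.hanning(4000) # stand-in accelerogram
      record *= 0.40 * g / np.abs(record).max()             # scale to a 0.40 g peak
      modified = clip_peaks(record, 0.25 * g)
      print(np.abs(record).max() / g, np.abs(modified).max() / g)   # 0.40 -> 0.25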

  19. Electromagnetic Imaging of Fluids in the San Andreas Fault; FINAL

    International Nuclear Information System (INIS)

    Martyn Unsworth

    2002-01-01

    Magnetotelluric data were collected on six profiles across the San Andreas Fault at Cholame, Parkfield, and Hollister in Central California. On each profile, high electrical resistivities were imaged west of the fault, and are due to granitic rocks of the Salinian block. East of the fault, lower electrical resistivities are associated with rocks of the Franciscan Formation. On the seismically active Parkfield and Hollister segments, a region of low resistivity was found in the fault zone that extends to a depth of several kilometers. This is due to a zone of fracturing (the damage zone) that has been infiltrated by saline ground water. The shallowest microearthquakes occur at a depth that is coincident with the base of the low-resistivity wedge. This strongly suggests that above this depth the fault rocks are too weak to accumulate sufficient stress for earthquake rupture to occur, and fault motion is accommodated through aseismic creep.

  20. Using an Earthquake Simulator to Model Tremor Along a Strike Slip Fault

    Science.gov (United States)

    Cochran, E. S.; Richards-Dinger, K. B.; Kroll, K.; Harrington, R. M.; Dieterich, J. H.

    2013-12-01

    We employ the earthquake simulator RSQSim to investigate the conditions under which tremor occurs in the transition zone of the San Andreas fault. RSQSim is a computationally efficient method that uses rate- and state-dependent friction to simulate a wide range of event sizes over long time histories of slip [Dieterich and Richards-Dinger, 2010; Richards-Dinger and Dieterich, 2012]. RSQSim has previously been used to investigate slow slip events in Cascadia [Colella et al., 2011; 2012]. Earthquake, tremor, slow slip, and creep occurrence are primarily controlled by the rate- and state-friction constants a and b and the slip speed. We will report preliminary results of using RSQSim to vary fault frictional properties in order to better understand rupture dynamics in the transition zone, constrained by observed characteristics of tremor along the San Andreas fault. Recent studies of tremor along the San Andreas fault provide information on tremor characteristics including precise locations, peak amplitudes, duration of tremor episodes, and tremor migration. We use these observations to constrain numerical simulations that examine the slip conditions in the transition zone of the San Andreas Fault. Here, we use RSQSim to conduct multi-event simulations of tremor for a strike-slip fault modeled on the Cholame section of the San Andreas fault. Tremor was first observed on the San Andreas fault near Cholame, California, near the southern edge of the 2004 Parkfield rupture [Nadeau and Dolenc, 2005]. Since then, tremor has been observed across a 150 km section of the San Andreas with depths between 16-28 km and peak amplitudes that vary by a factor of 7 [Shelly and Hardebeck, 2010]. Tremor episodes, comprised of multiple low frequency earthquakes (LFEs), tend to be relatively short, lasting tens of seconds to as long as 1-2 hours [Horstmann et al., in review, 2013]; tremor occurs regularly with some tremor observed almost daily [Shelly and Hardebeck, 2010; Horstmann
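
    For reference, rate- and state-dependent friction simulators of this kind are built around the Dieterich-Ruina law; in its standard form (with aging-law state evolution, written here in generic notation rather than the exact formulation implemented in RSQSim) it reads

      \mu = \mu_0 + a\,\ln\!\left(\frac{V}{V_0}\right) + b\,\ln\!\left(\frac{V_0\,\theta}{D_c}\right),
      \qquad
      \frac{d\theta}{dt} = 1 - \frac{V\,\theta}{D_c},

    where V is the slip speed, theta the state variable, D_c the characteristic slip distance, and the constants a and b set whether steady-state friction weakens (a - b < 0) or strengthens (a - b > 0) with increasing slip speed, which in such simulations controls whether a fault patch produces earthquakes, tremor-like events, or stable creep.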

  1. Simulation of earthquakes with cellular automata

    Directory of Open Access Journals (Sweden)

    P. G. Akishin

    1998-01-01

    The relation between cellular automata (CA) models of earthquakes and the Burridge–Knopoff (BK) model is studied. It is shown that the CA proposed by P. Bak and C. Tang, although they have rather realistic power spectra, do not correspond to the BK model. We present a modification of the CA which establishes the correspondence with the BK model. An analytical method of studying the evolution of the BK-like CA is proposed. By this method a functional quadratic in stress release, which can be regarded as an analog of the event energy, is constructed. The distribution of seismic events with respect to this “energy” shows rather realistic behavior, even in two dimensions. Special attention is paid to two-dimensional automata; the physical restrictions on compression and shear stiffnesses are imposed.
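
    As a minimal sketch of the general class of models discussed here (a toy Olami-Feder-Christensen-style stress-redistribution automaton with arbitrary parameters, not the specific automata analysed in the paper), the event-size statistics of a driven CA can be generated as follows:

      import numpy as np

      def run_ca(n=64, alpha=0.2, drive=1e-3, steps=5000, s_crit=1.0, seed=0):
          # Toy stress-redistribution cellular automaton: cells are loaded uniformly, and a
          # cell reaching s_crit drops to zero, passing a fraction alpha of its stress to
          # each of its four neighbours. Returns the size of every avalanche ("event").
          rng = np.random.default_rng(seed)
          s = rng.uniform(0.0, s_crit, (n, n))
          sizes = []
          for _ in range(steps):
              s += drive                                    # slow uniform "tectonic" loading
              size = 0
              unstable = np.argwhere(s >= s_crit)
              while unstable.size:                          # cascade until every cell is stable
                  for i, j in unstable:
                      ds, s[i, j] = s[i, j], 0.0
                      size += 1
                      for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                          ii, jj = i + di, j + dj
                          if 0 <= ii < n and 0 <= jj < n:
                              s[ii, jj] += alpha * ds
                  unstable = np.argwhere(s >= s_crit)
              if size:
                  sizes.append(size)
          return np.array(sizes)

      sizes = run_ca()
      print("events:", sizes.size, "largest avalanche:", int(sizes.max()) if sizes.size else 0)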

  2. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    Science.gov (United States)

    Wurman, G.; Price, M.

    2014-12-01

    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the Mw 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.

  3. Evaluation of hypotheses for right-lateral displacement of Neogene strata along the San Andreas Fault between Parkfield and Maricopa, California

    Science.gov (United States)

    Stanley, Richard G.; Barron, John A.; Powell, Charles L.

    2017-12-22

    We used geological field studies and diatom biostratigraphy to test a published hypothesis that Neogene marine siliceous strata in the Maricopa and Parkfield areas, located on opposite sides of the San Andreas Fault, were formerly contiguous and then were displaced by about 80–130 kilometers (km) of right-lateral slip along the fault. In the Maricopa area on the northeast side of the San Andreas Fault, the upper Miocene Bitterwater Creek Shale consists of hard, siliceous shale with dolomitic concretions and turbidite sandstone interbeds. Diatom assemblages indicate that the Bitterwater Creek Shale was deposited about 8.0–6.7 million years before present (Ma), at the same time as the uppermost part of the Monterey Formation in parts of coastal California. In the Parkfield area on the southwest side of the San Andreas Fault, the upper Miocene Pancho Rico Formation consists of soft to indurated mudstone and siltstone and fossiliferous, bioturbated sandstone. Diatom assemblages from the Pancho Rico indicate deposition about 6.7–5.7 Ma (latest Miocene), younger than the Bitterwater Creek Shale and at about the same time as parts of the Sisquoc Formation and Purisima Formation in coastal California. Our results show that the Bitterwater Creek Shale and Pancho Rico Formation are lithologically dissimilar and of different ages and therefore do not constitute a cross-fault tie that can be used to estimate right-lateral displacement along the San Andreas Fault. In the Maricopa area northeast of the San Andreas Fault, the Bitterwater Creek Shale overlies conglomeratic fan-delta deposits of the upper Miocene Santa Margarita Formation, which in turn overlie siliceous shale of the Miocene Monterey Formation from which we obtained a diatom assemblage dated at about 10.0–9.3 Ma. Previous investigations noted that the Santa Margarita Formation in the Maricopa area contains granitic and metamorphic clasts derived from sources in the northern Gabilan Range, on the opposite side of

  4. Scientific drilling into the San Andreas fault and site characterization research: Planning and coordination efforts. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    Zoback, M.D.

    1998-08-30

    The fundamental scientific issue addressed in this proposal, obtaining an improved understanding of the physical and chemical processes responsible for earthquakes along major fault zones, is clearly of global scientific interest. By sampling the San Andreas fault zone and making direct measurements of fault zone properties to 4.0 km at Parkfield they will be studying an active plate-boundary fault at a depth where aseismic creep and small earthquakes occur and where a number of the scientific questions associated with deeper fault zone drilling can begin to be addressed. Also, the technological challenges associated with drilling, coring, downhole measurements and borehole instrumentation that may eventually have to be faced in deeper drilling can first be addressed at moderate depth and temperature in the Parkfield hole. Throughout the planning process leading to the development of this proposal they have invited participation by scientists from around the world. As a result, the workshops and meetings they have held for this project have involved about 350 scientists and engineers from about a dozen countries.

  5. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  6. Crowd-Sourced Global Earthquake Early Warning

    Science.gov (United States)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
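
    A minimal sketch of the sensor-fusion idea (one dimension, constant noise levels chosen for illustration; not the authors' actual filter or noise model): a Kalman filter whose prediction step is driven by noisy consumer-grade acceleration and whose update step uses noisy GNSS displacements tracks the coseismic displacement while suppressing the noise of both sensors:

      import numpy as np

      def fuse(acc, gps_disp, dt, q_acc=0.3**2, r_gps=0.05**2):
          # 1-D Kalman filter with state x = [displacement, velocity]. The accelerometer
          # drives the prediction; noisy GNSS displacements correct it. q_acc and r_gps are
          # the assumed accelerometer ((m/s^2)^2) and GNSS displacement (m^2) noise variances.
          F = np.array([[1.0, dt], [0.0, 1.0]])
          B = np.array([0.5 * dt**2, dt])
          H = np.array([[1.0, 0.0]])
          Q = np.outer(B, B) * q_acc
          x, P, out = np.zeros(2), np.eye(2), []
          for a, z in zip(acc, gps_disp):
              x = F @ x + B * a                      # predict using measured acceleration
              P = F @ P @ F.T + Q
              S = H @ P @ H.T + r_gps                # innovation variance (1 x 1)
              K = P @ H.T / S                        # Kalman gain (2 x 1)
              x = x + (K * (z - H @ x)).ravel()      # update with GNSS displacement
              P = (np.eye(2) - K @ H) @ P
              out.append(x[0])
          return np.array(out)

      # Hypothetical 0.3 m coseismic step in displacement, sampled at 100 Hz for 20 s
      dt, n = 0.01, 2000
      true = np.where(np.arange(n) * dt > 10.0, 0.3, 0.0)
      acc = np.gradient(np.gradient(true, dt), dt) + 0.3 * np.random.randn(n)
      gps = true + 0.05 * np.random.randn(n)
      print("fused final displacement:", round(fuse(acc, gps, dt)[-1], 2))   # ~0.3 m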

  7. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  8. Integrated Program of Multidisciplinary Education and Research in Mechanics and Physics of Earthquakes

    Science.gov (United States)

    Lapusta, N.

    2011-12-01

    Studying earthquake source processes is a multidisciplinary endeavor involving a number of subjects, from geophysics to engineering. As a solid mechanician interested in understanding earthquakes through physics-based computational modeling and comparison with observations, I need to educate and attract students from diverse areas. My CAREER award has provided the crucial support for the initiation of this effort. Applying for the award made me go through careful initial planning in consultation with my colleagues and administration from two divisions, an important component of the eventual success of my path to tenure. Then, the long-term support directed at my program as a whole - and not a specific year-long task or subject area - allowed for the flexibility required for a start-up of a multidisciplinary undertaking. My research is directed towards formulating realistic fault models that incorporate state-of-the-art experimental studies, field observations, and analytical models. The goal is to compare the model response - in terms of long-term fault behavior that includes both sequences of simulated earthquakes and aseismic phenomena - with observations, to identify appropriate constitutive laws and parameter ranges. CAREER funding has enabled my group to develop a sophisticated 3D modeling approach that we have used to understand patterns of seismic and aseismic fault slip on the Sunda megathrust in Sumatra, investigate the effect of variable hydraulic properties on fault behavior, with application to the Chi-Chi and Tohoku earthquakes, create a model of the Parkfield segment of the San Andreas fault that reproduces both long-term and short-term features of the M6 earthquake sequence there, and design experiments with laboratory earthquakes, among several other studies. A critical ingredient in this research program has been the fully integrated educational component that allowed me, on the one hand, to expose students from different backgrounds to the

  9. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  10. Wavelet-based blind identification of the UCLA Factor building using ambient and earthquake responses

    International Nuclear Information System (INIS)

    Hazra, B; Narasimhan, S

    2010-01-01

    Blind source separation using second-order blind identification (SOBI) has been successfully applied to the problem of output-only identification, popularly known as ambient system identification. In this paper, the basic principles of SOBI for the static-mixture case are extended using the stationary wavelet transform (SWT) in order to improve the separability of sources, thereby improving the quality of identification. Whereas SOBI operates on the covariance matrices constructed directly from measurements, the method presented in this paper, known as the wavelet-based modified cross-correlation method, operates on multiple covariance matrices constructed from the correlation of the responses. The SWT is selected because of its time-invariance property, which means that the transform of a time-shifted signal can be obtained as a shifted version of the transform of the original signal. This important property is exploited in the construction of several time-lagged covariance matrices. The issue of non-stationary sources is addressed through the formation of several time-shifted, windowed covariance matrices. Modal identification results are presented for the UCLA Factor building using ambient vibration data and recorded responses from the Parkfield earthquake, and compared with published results for this building. Additionally, the effect of sensor density on the identification results is also investigated.
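
    A minimal sketch of the core second-order idea (two synthetic sources, a single time lag, all values arbitrary; this is the simpler single-lag AMUSE special case rather than the wavelet-based method developed in the paper): whiten the measurements, build a time-lagged covariance matrix, and eigen-decompose it to recover the sources up to order, sign and scale:

      import numpy as np

      # Two synthetic sources mixed onto two "sensor" channels (stand-ins for floor responses)
      t = np.linspace(0.0, 60.0, 6000)
      S = np.vstack([np.sin(2 * np.pi * 1.2 * t),
                     np.sign(np.sin(2 * np.pi * 0.4 * t))])
      A = np.array([[1.0, 0.6], [0.4, 1.0]])            # unknown mixing matrix
      X = A @ S + 0.01 * np.random.randn(*S.shape)

      # 1) whiten the zero-mean measurements
      X = X - X.mean(axis=1, keepdims=True)
      d, E = np.linalg.eigh(np.cov(X))
      Z = (E @ np.diag(d ** -0.5) @ E.T) @ X

      # 2) eigen-decompose one time-lagged covariance of the whitened data
      #    (SOBI proper jointly diagonalizes covariances at many lags)
      lag = 10
      C = Z[:, :-lag] @ Z[:, lag:].T / (Z.shape[1] - lag)
      _, V = np.linalg.eigh(0.5 * (C + C.T))            # symmetrize before eigh
      est = V.T @ Z                                     # recovered sources (order/sign/scale free)

      # each row of this 2x2 matrix should contain one value near 1, i.e. each
      # estimate matches one of the true sources
      print(np.round(np.abs(np.corrcoef(np.vstack([est, S]))[:2, 2:]), 2))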

  11. Examining and Comparing Earthquake Readiness in East San Francisco Bay Area Communities (Invited)

    Science.gov (United States)

    Ramirez, N.; Bul, V.; Chavez, A.; Chin, W.; Cuff, K. E.; Girton, C.; Haynes, D.; Kelly, G.; Leon, G.; Ramirez, J.; Ramirez, R.; Rodriquez, F.; Ruiz, D.; Torres, J.

    2009-12-01

    Based on past experiences, the potential for casualties and mass destruction that can result from a high magnitude earthquake is well known. Nevertheless, given the East San Francisco Bay Area’s proximity to the Hayward and San Andreas faults, learning about earthquakes and disaster preparedness is of particular importance. While basic educational programs and materials are available both through emergency relief agencies and schools, little research has been done on their effectiveness. Because of the wide socioeconomic spread between communities in the East Bay, we decided to investigate understandings of issues related to disaster and earthquake preparedness among local populations grouped by average household income. To accomplish this, we created a survey that was later uploaded to and implemented using Palm Treo Smart Phones. Survey locations were selected in such a way that they reflected the understandings of residents in a diverse set of socio-economic settings. These locations included a grocery store and nearby plaza in the Fruitvale district of Oakland, CA (zip = 94601; median household income = $33,152), as well as the nearby town of Alameda, CA (zip = 94502; median household income = $87,855). Preliminary results suggest that, in terms of the objective questions on the survey, people from Alameda who participated in our study performed significantly better (difference in percentage correct greater than 10%) than the people from Fruitvale on two of the advanced earthquake knowledge questions. Interestingly enough, people in Fruitvale significantly outperformed people in Alameda on two of the basic earthquake knowledge questions. The final important finding was that while houses in Alameda tended to be newer and more often retrofitted than houses in Fruitvale, a higher percentage of respondents in Fruitvale claimed confidence in the ability of their house to withstand a major earthquake. Based on preliminary results we

  12. Automatic identification of fault zone head waves and direct P waves and its application in the Parkfield section of the San Andreas Fault, California

    Science.gov (United States)

    Li, Zefeng; Peng, Zhigang

    2016-06-01

    Fault zone head waves (FZHWs) are observed along major strike-slip faults and can provide high-resolution imaging of fault interface properties at seismogenic depth. In this paper, we present a new method to automatically detect FZHWs and pick direct-wave secondary arrivals (DWSAs). The algorithm identifies FZHWs by computing the amplitude ratios between the potential FZHWs and DWSAs. The polarities, polarizations and characteristic periods of FZHWs and DWSAs are then used to refine the picks or evaluate the pick quality. We apply the method to the Parkfield section of the San Andreas Fault, where FZHWs have been identified before by manual picking. We compare results from automatically and manually picked arrivals and find general agreement between them. The obtained velocity contrast at Parkfield is generally 5-10 per cent near Middle Mountain, while it decreases to below 5 per cent near Gold Hill. We also find many FZHWs recorded by stations within 1 km of the background seismicity (i.e. the Southwest Fracture Zone) that have not been reported before. These FZHWs could be generated within a relatively wide low-velocity zone sandwiched between the fast Salinian block on the southwest side and the slow Franciscan Mélange on the northeast side. Station FROB on the southwest (fast) side also recorded a small portion of weak precursory signals before sharp P waves. However, the polarities of the weak signals are consistent with right-lateral strike-slip mechanisms, suggesting that they are unlikely to be genuine FZHW signals.
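
    As a rough illustration of the amplitude-ratio criterion (synthetic trace, arbitrary window lengths and thresholds, not the authors' tuned algorithm), a candidate FZHW can be flagged when the window just before the sharp direct-wave pick carries small but above-noise energy:

      import numpy as np

      def fzhw_candidate(trace, p_pick_s, fs, win_s=0.5, ratio_max=0.3):
          # Flag a possible fault-zone head wave: an emergent, low-amplitude arrival in the
          # window before the impulsive direct-wave pick. True if the pre-pick amplitude is
          # clearly above the noise yet only a small fraction of the direct-wave amplitude.
          n = int(win_s * fs)
          i = int(p_pick_s * fs)
          pre = np.abs(trace[i - n:i]).mean()
          post = np.abs(trace[i:i + n]).mean()
          noise = np.abs(trace[:i - n]).mean()
          return noise * 3 < pre < ratio_max * post

      # Synthetic example: weak emergent arrival ~0.4 s before an impulsive direct wave
      fs = 100.0
      tr = 0.01 * np.random.randn(3000)
      tr[1160:1200] += 0.05 * np.random.randn(40)      # candidate FZHW
      tr[1200:1400] += np.sin(np.arange(200) * 0.3)    # direct wave (secondary arrival)
      print(fzhw_candidate(tr, p_pick_s=12.0, fs=fs))  # -> True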

  13. Using low-frequency earthquake families on the San Andreas fault as deep creepmeters

    Science.gov (United States)

    Thomas, A.; Beeler, N. M.; Bletery, Q.; Burgmann, R.; Shelly, D. R.

    2017-12-01

    The San Andreas fault hosts tectonic tremor and low-frequency earthquakes (LFEs) similar to those in subduction zone environments. These LFEs are grouped into families based on waveform similarity and locate between 16 and 29 km depth along a 150-km-long section of the fault centered on Parkfield, CA. Within individual LFE families, event occurrence is not steady. In some families, bursts of a few events recur on timescales of days, while in other families there are nearly quiescent periods that often last for months, followed by episodes where hundreds of events occur over the course of a few days. These two styles of LFE occurrence are called continuous and episodic, respectively. LFEs are often assumed to reflect persistent regions that periodically fail during the aseismic shear of the surrounding fault, allowing them to be used as creepmeters. We test this idea by formalizing the definition of a creepmeter (the LFE occurrence rate is proportional to the local fault slip rate), determining whether this definition is consistent with the observations, and over what timescale. We use the recurrence intervals of LFEs within individual families to create a catalog of LFE bursts. For the episodic families, we consider both longer-duration (multiday) inferred creep episodes (dubbed long-timescale episodic) as well as the frequent short-term bursts of events that occur many times during inferred creep episodes (dubbed short-timescale episodic). We then use the recurrence intervals of LFE bursts to estimate the timing, duration, recurrence interval, slip, and slip rate associated with inferred slow slip events. We find that continuous families and the short-timescale episodic families appear to be inconsistent with our definition of a creepmeter (defined on the recurrence interval timescale) because their estimated durations are not physically meaningful. A straightforward interpretation of the frequent short-term bursts of the continuous and short
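
    Written out, the creepmeter definition tested here (in generic notation) amounts to

      \frac{dN}{dt} \;\propto\; V(t) \qquad\Longleftrightarrow\qquad V(t) \;\approx\; d\,\frac{dN}{dt},

    where N(t) is the cumulative number of LFEs in a family, V(t) the local aseismic slip rate, and d the (assumed constant) slip accommodated per event; under this definition the recurrence interval between bursts should scale inversely with the average slip rate during the intervening period.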

  14. Earthquake-induced soft-sediment deformations and seismically amplified erosion rates recorded in varved sediments of Köyceğiz Lake (SW Turkey)

    KAUST Repository

    Avsar, Ulas; Jonsson, Sigurjon; Avşar, Ö zgü r; Schmidt, Sabine

    2016-01-01

    sequence of Köyceğiz Lake (SW Turkey) that we compare with estimated peak ground acceleration (PGA) values of several nearby earthquakes. We find that earthquakes exceeding estimated PGA values of ca. 20 cm/s2 can induce soft-sediment deformations (SSD

  15. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  16. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the 1906 San Francisco and Valparaiso earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the 1908 Messina-Reggio Calabria earthquake, had a strong impact on the birth and development of earthquake engineering. Studying the seismic performance of buildings that still exist today and survived these centennial earthquakes represents a challenge for better understanding the limitations of the earthquake design methods now in use. Of the three centennial earthquakes considered, only the 1906 Valparaiso earthquake has been repeated, as the 1985 Central Chile Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. The study identified the only three centennial three-story buildings that survived both earthquakes almost undamaged. Since the 1985 earthquake was recorded by accelerographs both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  17. A Trial for Earthquake Prediction by Precise Monitoring of Deep Ground Water Temperature

    Science.gov (United States)

    Nasuhara, Y.; Otsuki, K.; Yamauchi, T.

    2006-12-01

    A large earthquake is estimated to occur off Miyagi Prefecture, northeast Japan, within 20 years at a probability of about 80%. In order to predict this earthquake, we have observed groundwater temperature in a borehole at Sendai City, 100 km west of the asperity. This borehole penetrates the fault zone of a NE-trending active reverse fault, the Nagamachi-Rifu fault zone, at 820 m depth. Our concept for the groundwater observation is that fault zones are natural amplifiers of crustal strain, and hence at 820 m depth we installed a very precise quartz temperature sensor with a resolution of 0.0002 deg. C. We confirmed that our observation system works normally through both pumping tests and the systematic temperature changes at different depths. Since the observation started on June 20, 2004, we have found puzzling intermittent temperature fluctuations of two types: one with a period of 5-10 days and an amplitude of ca. 0.1 deg. C, and the other with a period of 11-21 days and an amplitude of ca. 0.2 deg. C. Based on an examination using the product of the Grashof and Prandtl numbers, natural convection of water can occur in the borehole. However, since these temperature fluctuations are observed only at depths around 820 m, it is likely that they represent hydrological behavior specific to the Nagamachi-Rifu fault zone. It is noteworthy that small temperature changes correlatable with the earth tide are superposed on the long-term, large-amplitude fluctuations. The amplitude on the days of the full moon and new moon is ca. 0.001 deg. C. The bottoms of these temperature fluctuations always lag about 6 hours behind the peaks of the earth tide. This is interpreted as water in the borehole being drawn into the fault zone, on which tensional normal stress acts on the days of the full and new moon. The amplitude of the crustal strain due to the earth tide was measured at ca. 2×10^-8 strain near our observation site. High frequency temperature noise of
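
    The convection check mentioned above uses the product of the Grashof and Prandtl numbers, i.e. the Rayleigh number of the water column; in its standard form (generic fluid properties, not values specific to this borehole) it is

      Ra = Gr \cdot Pr = \frac{g\,\alpha\,\Delta T\,L^{3}}{\nu\,\kappa},

    where g is gravity, alpha the thermal expansion coefficient of water, Delta T the temperature difference across a column of height L, nu the kinematic viscosity and kappa the thermal diffusivity; natural convection becomes possible once Ra exceeds a critical value of order 10^3.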

  18. Earthquakes in Action: Incorporating Multimedia, Internet Resources, Large-scale Seismic Data, and 3-D Visualizations into Innovative Activities and Research Projects for Today's High School Students

    Science.gov (United States)

    Smith-Konter, B.; Jacobs, A.; Lawrence, K.; Kilb, D.

    2006-12-01

    ://siovizcenter.ucsd.edu/workshop). In addition to daily lecture and lab exercises, COSMOS students also conduct a mini-research project of their choice that uses data ranging from the 2004 Parkfield Earthquake, to Southern California seismicity, to global seismicity. Students collect seismic data from the Internet and evaluate earthquake locations, magnitudes, temporal sequence of seismic activity, active fault planes, and plate tectonic boundaries using research-quality techniques. Students are given the opportunity to build 3-D visualizations of their research data sets and archive these at the SIO Visualization Center's online library, which is globally accessible to students, teachers, researchers, and the general public (http://www.siovizcenter.ucsd.edu/library.php). These student-generated visualizations have become a practical resource for not only students and teachers, but also geophysical researchers who use the visual objects as research tools to better explore and understand their data. Through Earthquakes in Action, we offer both the tools for scientific exploration and the thrills of scientific discovery, providing students with valuable knowledge, novel research experience, and a unique sense of scientific contribution.

  19. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  20. Fast Computation of Ground Motion Shaking Map base on the Modified Stochastic Finite Fault Modeling

    Science.gov (United States)

    Shen, W.; Zhong, Q.; Shi, B.

    2012-12-01

    ground shaking intensity, and the comparisons between simulated and observed MMI for the 2004 Mw 6.0 Parkfield earthquake, the 2008 Mw 7.9 Wenchuan earthquake and the 1976 Mw 7.6 Tangshan earthquake agree fairly well. Taking the Parkfield earthquake as an example, the simulated results reflect the directivity effect and the influence of the shallow velocity structure well. The simulated data are also in good agreement with the network data and NGA (Next Generation Attenuation) relations. The computation time depends on the number of subfaults and the number of grid points. For the 2004 Mw 6.0 Parkfield earthquake, the grid we calculated is 2.5° × 2.5° with a grid spacing of 0.025°, and the total time consumed is about 1.3 hours. For the 2008 Mw 7.9 Wenchuan earthquake, the grid calculated is 10° × 10° with a grid spacing of 0.05°, the total number of grid points is more than 40,000, and the total time consumed is about 7.5 hours. For the 1976 Mw 7.6 Tangshan earthquake, the grid we calculated is 4° × 6° with a grid spacing of 0.05°, and the total time consumed is about 2.1 hours. The CPU we used runs at 3.40 GHz, and such computational times could be further reduced by using GPU computing and other parallel computing techniques. This is also our next focus.
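
    The quoted grid-point counts follow directly from the grid size and spacing (a simple node-by-node count, assuming both grid edges are included); for the Wenchuan case,

      \left(\frac{10^{\circ}}{0.05^{\circ}} + 1\right) \times \left(\frac{10^{\circ}}{0.05^{\circ}} + 1\right) = 201 \times 201 = 40{,}401,

    consistent with the "more than 40,000" grid points stated above, while the Parkfield grid of 2.5° × 2.5° at 0.025° spacing contains 101 × 101 ≈ 10,200 points.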

  1. Active faults and historical earthquakes in the Messina Straits area (Ionian Sea

    Directory of Open Access Journals (Sweden)

    A. Polonia

    2012-07-01

    The Calabrian Arc (CA) subduction complex is located at the toe of the Eurasian Plate in the Ionian Sea, where sediments resting on the lower plate have been scraped off and piled up in the accretionary wedge due to the African/Eurasian plate convergence and back-arc extension. The CA has been struck repeatedly by destructive historical earthquakes, but knowledge of active faults and source parameters is relatively poor, particularly for seismogenic structures extending offshore. We analysed the fine structure of major tectonic features likely to have been sources of past earthquakes: (i) the NNW–SSE trending Malta STEP (Slab Transfer Edge Propagator) fault system, representing a lateral tear of the subduction system; (ii) the out-of-sequence thrusts (splay faults) at the rear of the salt-bearing Messinian accretionary wedge; and (iii) the Messina Straits fault system, part of the wide deformation zone separating the western and eastern lobes of the accretionary wedge.

    Our findings have implications for seismic hazard in southern Italy, as we compile an inventory of first order active faults that may have produced past seismic events such as the 1908, 1693 and 1169 earthquakes. These faults are likely to be source regions for future large magnitude events as they are long, deep and bound sectors of the margin characterized by different deformation and coupling rates on the plate interface.

  2. Frictional melt generated by the 2008 Mw 7.9 Wenchuan earthquake and its faulting mechanisms

    Science.gov (United States)

    Wang, H.; Li, H.; Si, J.; Sun, Z.; Zhang, L.; He, X.

    2017-12-01

    Fault-related pseudotachylytes are considered fossil earthquakes, conveying significant information that provides improved insight into fault behaviors and their mechanical properties. The WFSD project was carried out right after the 2008 Wenchuan earthquake, and detailed research was conducted on the drilling cores. A 2 mm rigid black layer with fresh slickenlines was observed at 732.6 m in WFSD-1 cores drilled at the southern Yingxiu-Beichuan fault (YBF). Evidence from optical microscopy, FESEM and FIB-TEM shows that it is frictional melt (pseudotachylyte). In the northern part of the YBF, 4 mm of fresh melt with similar structures was found at 1084 m in WFSD-4S cores. The melts contain numerous microcracks. Considering (1) the highly unstable nature of frictional melt (easily altered or devitrified) under geological conditions, (2) the unfilled microcracks, (3) the fresh slickenlines and (4) the recent large earthquake in this area, we believe that the 2-4 mm melt was produced by the 2008 Wenchuan earthquake. This is the first report of fresh pseudotachylyte with slickenlines in a natural fault generated by a modern earthquake. Geochemical analyses show that fault rocks at 732.6 m are enriched in CaO, Fe2O3, FeO, H2O+ and LOI, whereas they are depleted in SiO2. XRF results show that Ca and Fe are clearly enriched in the 2.5 cm fine-grained fault rocks and Ba is enriched on the slip surface. The melt has a higher magnetic susceptibility value, which may be due to neoformed magnetite and metallic iron formed in the frictional melt. Frictional melt visible in both the southern and northern parts of the YBF reveals that frictional melt lubrication played a major role in the Wenchuan earthquake. Instead of vesicles and microlites, the melt contains numerous randomly oriented microcracks, exhibiting a quenching texture. The quenching texture suggests the frictional melt was generated under rapid heat-dissipation conditions, implying vigorous fluid circulation during the earthquake. We surmise that during

  3. The costs and benefits of reconstruction options in Nepal using the CEDIM FDA modelled and empirical analysis following the 2015 earthquake

    Science.gov (United States)

    Daniell, James; Schaefer, Andreas; Wenzel, Friedemann; Khazai, Bijan; Girard, Trevor; Kunz-Plapp, Tina; Kunz, Michael; Muehr, Bernhard

    2016-04-01

    Over the days following the 2015 Nepal earthquake, rapid estimates of deaths, economic loss and reconstruction cost were undertaken by our research group in conjunction with the World Bank. This modelling relied on historic losses from other Nepal earthquakes as well as detailed socioeconomic data and earthquake loss information via CATDAT. The modelled results were very close to the final death toll and reconstruction cost for the 2015 earthquake of around 9000 deaths and a direct building loss of ca. 3 billion (a). The process undertaken to produce these loss estimates is described, along with its potential for analysing reconstruction costs from future Nepal earthquakes in rapid time post-event. The reconstruction cost and death toll model is then used as the base model for examining the effect of spending money on earthquake retrofitting of buildings versus complete reconstruction of buildings. This is undertaken for future events using empirical statistics from past events along with further analytical modelling. The effect of investment versus the timing of a future event is also explored. Preliminary low-cost retrofitting options (b), along the lines of studies in other countries (ca. 100), are examined versus the option of different building typologies in Nepal as well as investment in various sectors of construction. The effect of public vs. private capital expenditure post-earthquake is also explored as part of this analysis, as well as spending on other components outside of earthquakes. a) http://www.scientificamerican.com/article/experts-calculate-new-loss-predictions-for-nepal-quake/ b) http://www.aees.org.au/wp-content/uploads/2015/06/23-Daniell.pdf

  4. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings attention to the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing an earthquake culture. Iran was considered as a research case study, and fifteen large earthquake disasters in Iran over a period of more than a century were investigated and analyzed. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  5. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the (1) study of earthquakes; (2) origin, propagation, and energy of seismic phenomena; (3) prediction of these phenomena; and (4) investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  6. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  7. Precise tremor source locations and amplitude variations along the lower-crustal central San Andreas Fault

    Science.gov (United States)

    Shelly, David R.; Hardebeck, Jeanne L.

    2010-01-01

    We precisely locate 88 tremor families along the central San Andreas Fault using a 3D velocity model and numerous P and S wave arrival times estimated from seismogram stacks of up to 400 events per tremor family. Maximum tremor amplitudes vary along the fault by at least a factor of 7, with by far the strongest sources along a 25 km section of the fault southeast of Parkfield. We also identify many weaker tremor families, which have largely escaped prior detection. Together, these sources extend 150 km along the fault, beneath creeping, transitional, and locked sections of the upper crustal fault. Depths are mostly between 18 and 28 km, in the lower crust. Epicenters are concentrated within 3 km of the surface trace, implying a nearly vertical fault. A prominent gap in detectable activity is located directly beneath the region of maximum slip in the 2004 magnitude 6.0 Parkfield earthquake.
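
    The locations above come from stacked P and S arrival times inverted with a 3D velocity model. As a much-simplified illustration of the location step only (a uniform-velocity grid search, not the 3D-model inversion used in the study; all names and values below are hypothetical), one could write:

```python
import numpy as np

def locate_grid_search(stations, obs_times, velocity, grid):
    """Toy grid search: for each trial source position, predict arrival
    times as distance/velocity (uniform half-space), absorb the unknown
    origin time by demeaning the residuals, and keep the node with the
    smallest RMS residual."""
    best_xyz, best_rms = None, np.inf
    for xyz in grid:
        dist = np.linalg.norm(stations - xyz, axis=1)
        resid = obs_times - dist / velocity
        resid -= resid.mean()                 # unknown origin time
        rms = np.sqrt(np.mean(resid ** 2))
        if rms < best_rms:
            best_xyz, best_rms = xyz, rms
    return best_xyz, best_rms

# Synthetic check: 5 stations (coordinates in km), a true source at 20 km
# depth, an S-wave speed of 3.5 km/s, and a coarse search grid.
stations = np.array([[0, 0, 0], [30, 5, 0], [-20, 15, 0],
                     [10, -25, 0], [-5, -10, 0]], float)
true_src = np.array([5.0, 2.0, 20.0])
obs = np.linalg.norm(stations - true_src, axis=1) / 3.5 + 12.0   # + origin time
grid = [np.array([x, y, z], float) for x in range(-10, 21, 5)
        for y in range(-10, 11, 5) for z in range(10, 31, 5)]
print(locate_grid_search(stations, obs, 3.5, grid))
```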

  8. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake"

    Science.gov (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering incurred building losses in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. In such a model the losses would not be indemnified but would instead be calculated directly on the basis of indexed ground motion levels and damages. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  9. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with tremor rate increasing at an exponential rate with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on a fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatra earthquake, the 2010 Maule earthquake in Chile and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. This relationship is also reasonable, considering the well-known relationship between stress and the b-value. This suggests that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
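
    The argument above hinges on how the Gutenberg-Richter b-value varies with tidal shear-stress amplitude. As a minimal illustration of that step (not the authors' actual code), the sketch below uses the standard Aki (1965) maximum-likelihood b-value estimator with Utsu's bin-width correction; the array names, completeness magnitude and synthetic magnitudes are hypothetical placeholders.

```python
import numpy as np

def b_value_mle(magnitudes, mag_complete, bin_width=0.1):
    """Aki (1965) maximum-likelihood b-value with Utsu's bin-width
    correction: b = log10(e) / (mean(M) - (Mc - dM/2))."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= mag_complete]
    mean_excess = m.mean() - (mag_complete - bin_width / 2.0)
    return np.log10(np.e) / mean_excess

# Hypothetical comparison: events split by tidal shear-stress amplitude.
# A lower b-value in the high-stress group would mirror the tendency
# reported above (relatively more large events under high tidal stress).
rng = np.random.default_rng(0)
mags_low  = 2.0 + rng.exponential(1.0 / (1.0 * np.log(10)), 5000)  # generated with b ~ 1.0
mags_high = 2.0 + rng.exponential(1.0 / (0.8 * np.log(10)), 5000)  # generated with b ~ 0.8
print(b_value_mle(mags_low, 2.0, bin_width=0.0),
      b_value_mle(mags_high, 2.0, bin_width=0.0))
```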

  10. Stress coupling in the seismic cycle indicated from geodetic measurements

    Science.gov (United States)

    Wang, L.; Hainzl, S.; Zoeller, G.; Holschneider, M.

    2012-12-01

    The seismic cycle includes several phases: the interseismic, coseismic and postseismic phases. In the interseismic phase, strain gradually builds up around the overall locked fault over tens to thousands of years, while it is released coseismically in seconds. In the postseismic interval, stress relaxation lasts months to years, indicated by evident aseismic deformation, which has been shown to release strain energy comparable to or even higher than that of the main shocks themselves. Benefiting from the development of geodetic observation, e.g., the Global Positioning System (GPS) and Interferometric Synthetic Aperture Radar (InSAR), over the last two decades, measurements of surface deformation have improved significantly and become valuable information for understanding the stress evolution on large fault planes. In this study, we utilize GPS/InSAR data to investigate the slip deficit during the interseismic phase, the coseismic slip and the early postseismic creep on the fault plane. However, it is well known that slip inversions based only on surface measurements are typically non-unique and subject to large uncertainties. To reduce the ambiguity, we utilize the assumption of stress coupling between the interseismic and coseismic phases, and between the coseismic and postseismic phases. We use a stress-constrained joint inversion in a Bayesian approach (Wang et al., 2012) to invert simultaneously for (1) interseismic slip deficit and coseismic slip, and (2) coseismic slip and postseismic creep. As case studies, we analyze earthquakes that occurred in well-instrumented regions, such as the 2004 M6.0 Parkfield earthquake, the 2010 M8.7 earthquake and the 2011 M9.1 Tohoku-Oki earthquake. We show that the inversion with the stress-coupling constraint leads to better constrained slip distributions. Meanwhile, the results also indicate that the assumed stress coupling is reasonable and can be well reflected in the available geodetic measurements. Reference: Lifeng

  11. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    Directory of Open Access Journals (Sweden)

    M. Rebuffat

    1995-06-01

    Full Text Available A series of multimedia monographs concerning the main seismic events that have affected the Italian territory are in the process of being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and promote public earthquake education. Producing the monographs, developed in ARC INFO and working under UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: (1) its historical-cultural context (the first destructive seismic event after the unification of Italy); (2) its features (a volcanic earthquake); (3) the socioeconomic consequences caused at such an important seaside resort.

  12. OMG Earthquake! Can Twitter improve earthquake response?

    Science.gov (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information

  13. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from 1901 to 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  14. Golden Gate Bridge response: a study with low-amplitude data from three earthquakes

    Science.gov (United States)

    Çelebi, Mehmet

    2012-01-01

    The dynamic response of the Golden Gate Bridge, located north of San Francisco, CA, has been studied previously using ambient vibration data and finite element models. Since permanent seismic instrumentation was installed in 1993, only small earthquakes that originated at distances varying between ~11 to 122 km have been recorded. Nonetheless, these records prompted this study of the response of the bridge to low amplitude shaking caused by three earthquakes. Compared to previous ambient vibration studies, the earthquake response data reveal a slightly higher fundamental frequency (shorter-period) for vertical vibration of the bridge deck center span (~7.7–8.3 s versus 8.2–10.6 s), and a much higher fundamental frequency (shorter period) for the transverse direction of the deck (~11.24–16.3 s versus ~18.2 s). In this study, it is also shown that these two periods are dominant apparent periods representing interaction between tower, cable, and deck.

  15. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
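
    The core of the detection procedure described above (a tweet-frequency time series scanned with a short-term-average/long-term-average trigger) can be sketched in a few lines. The following is a rough illustration, not the USGS operational detector; the window lengths, thresholds and synthetic tweet rates are made-up values chosen only to echo the numbers quoted in the abstract.

```python
import numpy as np

def sta_lta_triggers(counts, sta_win=2, lta_win=60, ratio=10.0, min_sta=5.0):
    """Return indices of a per-minute tweet-count series where the
    short-term average exceeds `ratio` times the long-term average and is
    at least `min_sta` tweets per minute (all parameters illustrative)."""
    counts = np.asarray(counts, dtype=float)
    triggers = []
    for i in range(lta_win, len(counts)):
        lta = counts[i - lta_win:i].mean() + 1e-6   # avoid divide-by-zero
        sta = counts[i - sta_win + 1:i + 1].mean()
        if sta >= min_sta and sta / lta > ratio:
            triggers.append(i)
    return triggers

# A quiet background of roughly one "earthquake" tweet per hour followed
# by a burst of ~150 tweets per minute triggers within the first minute
# of the burst, mimicking the Morgan Hill example above.
background = np.random.poisson(1 / 60, 120)   # ~1 tweet/hour for 2 hours
burst = np.random.poisson(150, 5)             # widely felt event
print(sta_lta_triggers(np.concatenate([background, burst])))
```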

  16. Ground water and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ts' ai, T H

    1977-11-01

    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water as well as changes in color, bubbling, gas emission, and noises and geysers are also often observed before earthquakes. Analysis of these features can help predict earthquakes. Other factors unrelated to earthquakes can cause some of these changes, too. As a first step it is necessary to find sites which are sensitive to changes in ground stress to be used as sensor points for predicting earthquakes. The necessary features are described. Recording of seismic waves of earthquake aftershocks is also an important part of earthquake predictions.

  17. Evolution of the Earthquake Catalog in Central America

    Science.gov (United States)

    Rojas, W.; Camacho, E. I.; Marroquín, G.; Molina, E.; Talavera, E.; Benito, M. B.; Lindholm, C.

    2013-05-01

    Central America (CA) is known as a seismically active region in which several historic destructive earthquakes have occurred. This fact has promoted the development of seismic hazard studies that provide estimates needed for decision making and risk assessment, which in turn require a complete and standardized seismic catalog. With this aim, several authors have contributed to the study of the historical seismicity of Central America (e.g. Grases; Feldman; White and Harlow, 1993; White et al. 2004; Ambraseys and Adams, 2001; Peraldo and Montero, 1999), who compiled historical data. A first catalogue, covering the period from 1522 to 1993, was developed by Rojas (1993). In 2007 this information was integrated, together with data from the International Seismological Centre (CASC) and the national catalogs of the CA countries, into a new regional catalogue. Since 2007 a continuous effort has been made to complete and update this CA earthquake catalog. In particular, two workshops were held in 2008 and 2011 at the Universidad Politécnica de Madrid (Spain), joining experts from the different CA countries, each working on its own national catalogue, so that together they cover the entire region plus the border areas of northwestern Colombia and southern Mexico. These national catalogues were later integrated into a common regional catalogue in SEISAN format. To this end it was necessary to solve several problems, such as avoiding duplicate events, especially close to the boundaries, accounting for the different magnitude scales adopted by the different countries, and taking into account the completeness of the different national networks. Solutions were adopted to obtain a catalogue homogenized to Mw, containing historical and instrumental events with Mw > 3.5 from 1522 up to 2011. The catalogue updated to December 2007 was the basis for the first regional hazard study carried out by Benito et al. (2011) as part of the collaborative RESIS II project under the coordination of NORSAR. The ones updated to

  18. Diverse rupture modes for surface-deforming upper plate earthquakes in the southern Puget Lowland of Washington State

    Science.gov (United States)

    Nelson, Alan R.; Personius, Stephen F.; Sherrod, Brian L.; Kelsey, Harvey M.; Johnson, Samuel Y.; Bradley, Lee-Ann; Wells, Ray E.

    2014-01-01

    Earthquake prehistory of the southern Puget Lowland, in the north-south compressive regime of the migrating Cascadia forearc, reflects diverse earthquake rupture modes with variable recurrence. Stratigraphy and Bayesian analyses of previously reported and new 14C ages in trenches and cores along backthrust scarps in the Seattle fault zone restrict a large earthquake to 1040–910 cal yr B.P. (2σ), an interval that includes the time of the M 7–7.5 Restoration Point earthquake. A newly identified surface-rupturing earthquake along the Waterman Point backthrust dates to 940–380 cal yr B.P., bringing the number of earthquakes in the Seattle fault zone in the past 3500 yr to 4 or 5. Whether scarps record earthquakes of moderate (M 5.5–6.0) or large (M 6.5–7.0) magnitude, backthrusts of the Seattle fault zone may slip during moderate to large earthquakes every few hundred years for periods of 1000–2000 yr, and then not slip for periods of at least several thousands of years. Four new fault scarp trenches in the Tacoma fault zone show evidence of late Holocene folding and faulting about the time of a large earthquake or earthquakes inferred from widespread coseismic subsidence ca. 1000 cal yr B.P.; 12 ages from 8 sites in the Tacoma fault zone limit the earthquakes to 1050–980 cal yr B.P. Evidence is too sparse to determine whether a large earthquake was closely predated or postdated by other earthquakes in the Tacoma basin, but the scarp of the Tacoma fault was formed by multiple earthquakes. In the northeast-striking Saddle Mountain deformation zone, along the western limit of the Seattle and Tacoma fault zones, analysis of previous ages limits earthquakes to 1200–310 cal yr B.P. The prehistory clarifies earthquake clustering in the central Puget Lowland, but cannot resolve potential structural links among the three Holocene fault zones.

  19. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
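
    The direction of the effect described above can be illustrated with a toy calculation (this is not the published equation from the paper; the rate values and Omori parameters below are hypothetical). Adding an Omori-decaying aftershock rate to the background seismicity dilutes the chance that a new nearby event is a foreshock, and that chance recovers as the aftershocks decay.

```python
def omori_rate(t_days, k=10.0, c=0.05, p=1.1):
    """Modified Omori law, n(t) = k / (t + c)**p (parameters hypothetical)."""
    return k / (t_days + c) ** p

def foreshock_probability(p_background, background_rate, aftershock_rate):
    """If a candidate event is a foreshock with probability p_background
    when only background seismicity is present, extra aftershock activity
    dilutes that probability in proportion to the total event rate."""
    return p_background * background_rate / (background_rate + aftershock_rate)

# The probability recovers toward its background value as aftershocks decay:
for t in (1, 10, 100):   # days after the earlier mainshock
    p = foreshock_probability(0.05, background_rate=0.2,
                              aftershock_rate=omori_rate(t))
    print(t, round(p, 4))
```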

  20. Application of geochemical methods in earthquake prediction in China

    Energy Technology Data Exchange (ETDEWEB)

    Fong-liang, J.; Gui-ru, L.

    1981-05-01

    Several geochemical anomalies were observed before the Haicheng, Longling, Tangshan, and Songpan earthquakes and their strong aftershocks. They included changes in groundwater radon levels; in the chemical composition of the groundwater (concentrations of Ca^2+, Mg^2+, Cl^-, SO4^2-, and HCO3^- ions); in conductivity; and in dissolved gases such as H2, CO2, etc. In addition, anomalous changes in water color and quality were observed before these large earthquakes. Before some events gases escaped from the surface, and there were reports of "ground odors" being smelled by local residents. The large amount of radon data can be grouped into long-term and short-term anomalies. The long-term anomalies have a radon emission build-up time of a few months to more than a year. The short-term anomalies have durations from a few hours or less to a few months.

  1. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of the circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  2. Migrating tremors illuminate complex deformation beneath the seismogenic San Andreas fault.

    Science.gov (United States)

    Shelly, David R

    2010-02-04

    The San Andreas fault is one of the most extensively studied faults in the world, yet its physical character and deformation mode beneath the relatively shallow earthquake-generating portion remain largely unconstrained. Tectonic 'non-volcanic' tremor, a recently discovered seismic signal probably generated by shear slip on the deep extension of some major faults, can provide new insight into the deep fate of such faults, including that of the San Andreas fault near Parkfield, California. Here I examine continuous seismic data from mid-2001 to 2008, identifying tremor and decomposing the signal into different families of activity based on the shape and timing of the waveforms at multiple stations. This approach allows differentiation between activities from nearby patches of the deep fault and begins to unveil rich and complex patterns of tremor occurrence. I find that tremor exhibits nearly continuous migration, with the most extensive episodes propagating more than 20 kilometres along fault strike at rates of 15-80 kilometres per hour. This suggests that the San Andreas fault remains a localized through-going structure, at least to the base of the crust, in this area. Tremor rates and recurrence behaviour changed markedly in the wake of the 2004 magnitude-6.0 Parkfield earthquake, but these changes were far from uniform within the tremor zone, probably reflecting heterogeneous fault properties and static and dynamic stresses decaying away from the rupture. The systematic recurrence of tremor demonstrated here suggests the potential to monitor detailed time-varying deformation on this portion of the deep San Andreas fault, deformation which unsteadily loads the shallower zone that last ruptured in the 1857 magnitude-7.9 Fort Tejon earthquake.

  3. Ionospheric earthquake precursors

    International Nuclear Information System (INIS)

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.

    1996-01-01

    Results of experimental studies of ionospheric earthquake precursors, the development of a research program on processes in the earthquake focus, and the physical mechanisms of formation of various types of precursors are considered. The composition of an experimental space-based system for monitoring earthquake precursors is determined. 36 refs., 5 figs

  4. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    International Nuclear Information System (INIS)

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19 September 2006. Six factors for the evaluation of earthquake vibration are considered on the basis of recent earthquakes: 1) evaluation of earthquake vibration by methods using fault models, 2) investigation and approval of active faults, 3) direct-hit earthquakes, 4) assumption of a short active fault as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) remaining risk. A guiding principle of the revision required a new method of evaluating earthquake vibration using fault models and an evaluation of the probability of earthquake vibration. Remaining risk means that facilities and people are endangered when an earthquake stronger than the design basis occurs; accordingly, scatter has to be considered in the evaluation of earthquake vibration. The earthquake belt and strong vibration pulse of the 1995 Hyogo-Nanbu earthquake, the relation between the length of the surface earthquake fault and the hypocentral fault, and the distribution of seismic intensity of the 1993 off-Kushiro earthquake are shown. (S.Y.)

  5. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  6. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.

  7. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
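
    For context on how such gridded forecasts are scored, the sketch below evaluates a joint Poisson log-likelihood of observed counts per cell against forecast expectations, in the spirit of the RELM/CSEP likelihood tests. It is a generic illustration with toy numbers, not the specific test statistics or data used in this paper.

```python
import numpy as np
from math import lgamma

def poisson_log_likelihood(expected, observed):
    """Joint Poisson log-likelihood of observed earthquake counts per cell
    given forecast expected counts per cell (equal-length sequences)."""
    expected = np.asarray(expected, dtype=float)
    observed = np.asarray(observed, dtype=float)
    log_factorials = np.array([lgamma(n + 1.0) for n in observed])
    return float(np.sum(-expected + observed * np.log(expected) - log_factorials))

# Toy comparison of two hypothetical forecasts over five cells:
observed   = [0, 1, 0, 2, 0]
forecast_a = [0.1, 0.5, 0.1, 1.0, 0.1]   # concentrates rate where events occur
forecast_b = [0.4, 0.4, 0.4, 0.4, 0.4]   # spatially uniform
print(poisson_log_likelihood(forecast_a, observed))   # higher (better fit)
print(poisson_log_likelihood(forecast_b, observed))   # lower (worse fit)
```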

  8. Earthquakes, September-October 1986

    Science.gov (United States)

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  9. EARTHQUAKE-INDUCED DEFORMATION STRUCTURES AND RELATED TO EARTHQUAKE MAGNITUDES

    Directory of Open Access Journals (Sweden)

    Savaş TOPAL

    2003-02-01

    Full Text Available Earthquake-induced deformation structures, which are called seismites, may be helpful in classifying the paleoseismic history of a location and in estimating the magnitudes of potential earthquakes in the future. In this paper, seismites were investigated according to the types formed in deep and shallow lake sediments. Seismites are observed in the form of sand dikes, intruded and fractured gravels and pillow structures in shallow-lake sediments, and as pseudonodules, mushroom-like silts protruding into laminites, mixed layers, disturbed varved lamination and loop bedding in deep-lake sediments. Earthquake-induced deformation structures, drawing on previous studies, were ordered according to their formation and the corresponding earthquake magnitudes. In this ordering, the lowest earthquake record is loop bedding and the highest one is intruded and fractured gravels in lacustrine deposits.

  10. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    Science.gov (United States)

    Madariaga, R.

    2013-05-01

    The 27 February 2010 Maule earthquake occurred in a well identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we will review the information gathered so far. The event broke a region that was much longer along strike than the gap left over from the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when the earthquake occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously broken in a similar earthquake in 1751, but several events in the magnitude 8 range occurred in the area, principally in 1835 as already mentioned and, more recently, on 1 December 1928 to the north and on 21 May 1960 (1 1/2 days before the big Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north are undergoing a very large increase in seismicity, with numerous clusters of seismicity that move along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730. This is the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions with M 8 range earthquakes, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap but it has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough stress from the 1730 rupture zone. Geodetic data show that most of the region that broke in 1730 is currently almost fully locked from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  11. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
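
    For reference, the generic textbook forms of the extreme-value distributions named above, and the magnitude-energy link through which the Gumbel and Frechet forms are related, are summarized below (notation generic, not the paper's fitted parameters):

```latex
\begin{align*}
  \text{Gumbel (maxima):}  \quad & G(x) = \exp\big[-e^{-(x-\mu)/\beta}\big] \\
  \text{Frechet (maxima):} \quad & F(x) = \exp\big[-(x/\sigma)^{-\alpha}\big], \quad x > 0 \\
  \text{Weibull (minima):} \quad & W(x) = 1 - \exp\big[-(x/\lambda)^{k}\big], \quad x \ge 0
\end{align*}
% With the usual energy--magnitude relation $\log_{10} E = 1.5\,M + \mathrm{const}$,
% an exponential (Gutenberg--Richter) distribution of magnitudes corresponds to a
% Pareto-type (power-law) tail in energy, which is how a Gumbel form in the
% magnitude domain maps to a Frechet form in the energy domain, and vice versa.
```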

  12. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    A geographic information system (GIS) for the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of the damage and the isoseismals of the earthquake are studied. By comparison with a standard earthquake intensity attenuation relationship, an abnormal damage distribution of the earthquake is found, and the relationship of the abnormal distribution to tectonics, site conditions and basins is analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied. The implications of the abnormal damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction and earthquake emergency response are discussed.

  13. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500.

  14. Protecting your family from earthquakes: The seven steps to earthquake safety

    Science.gov (United States)

    Developed by American Red Cross, Asian Pacific Fund

    2007-01-01

    This book is provided here because of the importance of preparing for earthquakes before they happen. Experts say it is very likely there will be a damaging San Francisco Bay Area earthquake in the next 30 years and that it will strike without warning. It may be hard to find the supplies and services we need after this earthquake. For example, hospitals may have more patients than they can treat, and grocery stores may be closed for weeks. You will need to provide for your family until help arrives. To keep our loved ones and our community safe, we must prepare now. Some of us come from places where earthquakes are also common. However, the dangers of earthquakes in our homelands may be very different than in the Bay Area. For example, many people in Asian countries die in major earthquakes when buildings collapse or from big sea waves called tsunami. In the Bay Area, the main danger is from objects inside buildings falling on people. Take action now to make sure your family will be safe in an earthquake. The first step is to read this book carefully and follow its advice. By making your home safer, you help make our community safer. Preparing for earthquakes is important, and together we can make sure our families and community are ready. English version p. 3-13 Chinese version p. 14-24 Vietnamese version p. 25-36 Korean version p. 37-48

  15. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Science.gov (United States)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite-fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than the slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  16. Evidence for strong Holocene earthquake(s) in the Wabash Valley seismic zone

    International Nuclear Information System (INIS)

    Obermeier, S.

    1991-01-01

    Many small and slightly damaging earthquakes have taken place in the region of the lower Wabash River Valley of Indiana and Illinois during the 200 years of historic record. Seismologists have long suspected the Wabash Valley seismic zone to be capable of producing earthquakes much stronger than the largest of record (mb 5.8). The seismic zone contains the poorly defined Wabash Valley fault zone and also appears to contain other vaguely defined faults at depths from which the strongest earthquakes presently originate. Faults near the surface are generally covered with thick alluvium in lowlands and a veneer of loess in uplands, which make direct observations of faults difficult. Partly because of this difficulty, a search for paleoliquefaction features was begun in 1990. Conclusions of the study are as follows: (1) an earthquake much stronger than any historic earthquake struck the lower Wabash Valley between 1,500 and 7,500 years ago; (2) the epicentral region of the prehistoric strong earthquake was the Wabash Valley seismic zone; (3) apparent sites have been located where 1811-12 earthquake accelerations can be bracketed

  17. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    Science.gov (United States)

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.

  18. Imbricated slip rate processes during slow slip transients imaged by low-frequency earthquakes

    Science.gov (United States)

    Lengliné, O.; Frank, W.; Marsan, D.; Ampuero, J. P.

    2017-12-01

    Low Frequency Earthquakes (LFEs) often occur in conjunction with transient strain episodes, or Slow Slip Events (SSEs), in subduction zones. Their focal mechanism and location consistent with shear failure on the plate interface argue for a model where LFEs are discrete dynamic ruptures in an otherwise slowly slipping interface. SSEs are mostly observed by surface geodetic instruments with limited resolution and it is likely that only the largest ones are detected. The time synchronization of LFEs and SSEs suggests that we could use the recorded LFEs to constrain the evolution of SSEs, and notably of the geodetically-undetected small ones. However, inferring slow slip rate from the temporal evolution of LFE activity is complicated by the strong temporal clustering of LFEs. Here we apply dedicated statistical tools to retrieve the temporal evolution of SSE slip rates from the time history of LFE occurrences in two subduction zones, Mexico and Cascadia, and in the deep portion of the San Andreas fault at Parkfield. We find temporal characteristics of LFEs that are similar across these three different regions. The longer term episodic slip transients present in these datasets show a slip rate decay with time after the passage of the SSE front possibly as t^(-1/4). They are composed of multiple short term transients with steeper slip rate decay as t^(-α) with α between 1.4 and 2. We also find that the maximum slip rate of SSEs has a continuous distribution. Our results indicate that creeping faults host intermittent deformation at various scales resulting from the imbricated occurrence of numerous slow slip events of various amplitudes.

  19. It's Our Fault: better defining earthquake risk in Wellington, New Zealand

    Science.gov (United States)

    Van Dissen, R.; Brackley, H. L.; Francois-Holden, C.

    2012-12-01

    The Wellington region, home of New Zealand's capital city, is cut by a number of major right-lateral strike slip faults, and is underlain by the currently locked west-dipping subduction interface between the down going Pacific Plate, and the over-riding Australian Plate. In its short historic period (ca. 160 years), the region has been impacted by large earthquakes on the strike-slip faults, but has yet to bear the brunt of a subduction interface rupture directly beneath the capital city. It's Our Fault is a comprehensive study of Wellington's earthquake risk. Its objective is to position the capital city of New Zealand to become more resilient through an encompassing study of the likelihood of large earthquakes, and the effects and impacts of these earthquakes on humans and the built environment. It's Our Fault is jointly funded by New Zealand's Earthquake Commission, Accident Compensation Corporation, Wellington City Council, Wellington Region Emergency Management Group, Greater Wellington Regional Council, and Natural Hazards Research Platform. The programme has been running for six years, and key results to date include better definition and constraints on: 1) location, size, timing, and likelihood of large earthquakes on the active faults closest to Wellington; 2) earthquake size and ground shaking characterization of a representative suite of subduction interface rupture scenarios under Wellington; 3) stress interactions between these faults; 4) geological, geotechnical, and geophysical parameterisation of the near-surface sediments and basin geometry in Wellington City and the Hutt Valley; and 5) characterisation of earthquake ground shaking behaviour in these two urban areas in terms of subsoil classes specified in the NZ Structural Design Standard. The above investigations are already supporting measures aimed at risk reduction, and collectively they will facilitate identification of additional actions that will have the greatest benefit towards further

  20. JGR special issue on Deep Earthquakes

    Science.gov (United States)

    The editor and associate editors of the Journal of Geophysical Research—Solid Earth and Planets invite the submission of manuscripts for a special issue on the topic “Deep- and Intermediate-Focus Earthquakes, Phase Transitions, and the Mechanics of Deep Subduction.”Manuscripts should be submitted to JGR Editor Gerald Schubert (Department of Earth and Space Sciences, University of California, Los Angeles, Los Angeles, CA 90024) before July 1, 1986, in accordance with the usual rules for manuscript submission. Submitted papers will undergo the normal JGR review procedure. For more information, contact either Schubert or the special guest associate editor, Cliff Frohlich (Institute for Geophysics, University of Texas at Austin, 4920 North IH-35, Austin, TX 78751; telephone: 512-451-6223).

  1. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    Science.gov (United States)

    Wang, Yan; Lu, Yi

    2018-01-01

    The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data along the Longmenshan thrust fault three years after the Lushan earthquake. We used the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects in this study included 3944 student survivors in eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in the earthquake, or had relatives who died in the earthquake. It is concluded that researchers need to pay more attention to children and adolescents. The government should pay more attention to these people and provide more economic support.

  2. Toward Expanding Tremor Observations in the Northern San Andreas Fault System in the 1990s

    Science.gov (United States)

    Damiao, L. G.; Dreger, D. S.; Nadeau, R. M.; Taira, T.; Guilhem, A.; Luna, B.; Zhang, H.

    2015-12-01

    The connection between tremor activity and active fault processes continues to expand our understanding of deep fault zone properties and deformation, the tectonic process, and the relationship of tremor to the occurrence of larger earthquakes. Compared to tremors in subduction zones, known tremor signals in California are ~5 to ~10 times smaller in amplitude and duration. These characteristics, in addition to scarce geographic coverage, lack of continuous data (e.g., before mid-2001 at Parkfield), and the absence of instrumentation sensitive enough to monitor these events, have stifled tremor detection. The continuous monitoring of these events over a relatively short time period in limited locations may lead to a parochial view of the tremor phenomenon and its relationship to fault, tectonic, and earthquake processes. To help overcome this, we have embarked on a project to expand the geographic and temporal scope of tremor observation along the Northern SAF system using available continuous seismic recordings from a broad array of hundreds of surface seismic stations from multiple seismic networks. Available data for most of these stations also extend back into the mid-1990s. Processing and analysis of tremor signals from this large and low signal-to-noise dataset requires a heavily automated, data-science type approach and specialized techniques for identifying and extracting reliable data. We report here on the automated, envelope-based methodology we have developed. Finally, we compare our catalog results with pre-existing tremor catalogs in the Parkfield area.
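
    As a rough sketch of what an envelope-based tremor workflow involves (a generic outline under assumed parameters, not the project's actual pipeline; the frequency band, smoothing length and coherence measure are all assumptions), the processing of each station channel and a cross-station check might look like:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def smoothed_envelope(trace, fs, band=(2.0, 8.0), smooth_s=5.0):
    """Bandpass a trace, take its Hilbert-transform envelope, and smooth it
    with a moving average (band and smoothing length are generic choices)."""
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    env = np.abs(hilbert(sosfiltfilt(sos, trace)))
    win = max(1, int(smooth_s * fs))
    return np.convolve(env, np.ones(win) / win, mode="same")

def network_coherence(envelopes):
    """Median pairwise correlation of station envelopes; sustained high
    values without impulsive arrivals are a common tremor indicator."""
    n = len(envelopes)
    cc = [np.corrcoef(envelopes[i], envelopes[j])[0, 1]
          for i in range(n) for j in range(i + 1, n)]
    return float(np.median(cc))
```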

  3. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  4. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on basic design processes important to both non-specialists and engineers so that readers become suitably well-informed without needing to deal with the details of specialist understanding. The content of this encyclopedia provides technically inclined and informed readers about the ways in which earthquakes can affect our infrastructure and how engineers would go about designing against, mitigating and remediating these effects. The coverage ranges from buildings, foundations, underground construction, lifelines and bridges, roads, embankments and slopes. The encycl...

  5. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs
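
    Hazard results of this kind (return period versus ground-motion level) are commonly re-expressed as exceedance probabilities under the standard Poisson assumption. A minimal illustration with generic numbers (not values from this study):

```python
import math

def exceedance_probability(return_period_years, exposure_years):
    """Probability of at least one exceedance during `exposure_years`,
    assuming Poissonian occurrence with the given mean return period."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

# A ground-motion level with a 475-year return period has roughly a 10%
# chance of being exceeded during a 50-year exposure time:
print(round(exceedance_probability(475, 50), 3))   # ~0.1
```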

  6. Earthquake Clusters and Spatio-temporal Migration of earthquakes in Northeastern Tibetan Plateau: a Finite Element Modeling

    Science.gov (United States)

    Sun, Y.; Luo, G.

    2017-12-01

    Seismicity in a region is usually characterized by earthquake clusters and earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for us to investigate these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and spatio-temporal migration of earthquakes along major fault zones in the northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore the effects of topographic loading and the viscosity of the middle-lower crust and upper mantle on model results. Model results show that earthquakes and fault interactions increase Coulomb stress on the neighboring faults or segments, accelerating future earthquakes in this region. Thus, earthquakes occur sequentially in a short time, leading to regional earthquake clusters. Through long-term evolution, stresses on some seismogenic faults, which are far apart, may almost simultaneously reach the critical state of fault failure, probably also leading to regional earthquake clusters and earthquake migration. Based on our model synthetic seismic catalog and paleoseismic data, we analyze the probability of earthquake migration between major faults in the northeastern Tibetan Plateau. We find that following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M≥7) in the northeastern Tibetan Plateau would be most likely to occur on the Haiyuan fault.

  7. Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience.

    Science.gov (United States)

    Kung, Yi-Wen; Chen, Sue-Huei

    2012-09-01

    This study explored how individuals in Taiwan perceive the risk of earthquakes and the relationship of past earthquake experience and gender to risk perception. Participants (n = 1,405), including earthquake survivors and those in the general population without prior direct earthquake exposure, were selected and interviewed through a computer-assisted telephone interviewing procedure using a random sampling and stratification method covering all 24 regions of Taiwan. A factor analysis of the interview data yielded a two-factor structure of risk perception in regard to earthquakes. The first factor, "personal impact," encompassed perception of threat and fear related to earthquakes. The second factor, "controllability," encompassed a sense of efficacy of self-protection in regard to earthquakes. The findings indicated that prior earthquake survivors and females reported higher scores on the personal impact factor than males and those with no prior direct earthquake experience, although there were no group differences on the controllability factor. The findings support that risk perception has multiple components, and suggest that past experience (survivor status) and gender (female) affect the perception of risk. Exploration of potential contributions of other demographic factors such as age, education, and marital status to personal impact, especially for females and survivors, is discussed. Future research and intervention programs with regard to risk perception are suggested accordingly. © 2012 Society for Risk Analysis.

  8. Sedimentary records of past earthquakes in Boraboy Lake during the last ca 600 years (North Anatolian Fault, Turkey)

    KAUST Repository

    Avsar, Ulas

    2015-05-21

    Multiproxy sedimentological analyses along the 4.9 m-long sequence of Boraboy Lake, which is located on the central eastern part of the North Anatolian Fault (NAF), reveal the sedimentary traces of past large earthquakes in the region. The lake has a relatively large catchment area (10 km2) compared to its size (0.12 km2), which renders sedimentation sensitive to heavy rain/storm events. Accordingly, the background sedimentation, which is composed of faintly laminated reddish/yellowish brown clayey silt, is frequently interrupted by organic-rich intercalations probably due to heavy rain/storm events transporting terrestrial plant remains from the densely vegetated catchment. In addition to frequent organic-rich intercalations, the background sedimentation is interrupted by four mass-wasting deposits (MWDs) whose thicknesses range between 15 and 50 cm. High-resolution ITRAX μXRF data confirms higher homogeneity along the MWDs (E1-E4) compared to the background sedimentation. Based on 137Cs and 210Pbxs dating and radiocarbon chronology, three MWDs detected in the Boraboy sequence (E2, E3 and E4) temporally correlate with large historical earthquakes along the NAF; the 1943 Tosya (Ms=7.6) and/or 1942 Niksar-Erbaa (Ms=7.1), the 1776 Amasya-Merzifon and the 1668 North Anatolian (Ms=7.9) earthquakes. The youngest MWD in the sequence (E1), which is dated to the early 2000s, does not correlate with any strong earthquake in the region. This MWD was probably a single mass-wasting event due to routine overloading and oversteepening on the delta front formed by the main inlet of the lake. In subaqueous paleoseismology, coevality of multi-location mass-wasting events is used as a criterion to assign a seismic triggering mechanism, and to rule out mass-wasting events due to routine overloading/oversteepening of subaqueous slopes. Within this context, the Boraboy sequence provides a valuable example to discuss sedimentological imprints of single- vs. multi-source MWDs.

  9. The earthquake problem in engineering design: generating earthquake design basis information

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1987-01-01

    Designing earthquake-resistant structures requires certain design inputs specific to the seismotectonic status of the region in which a critical facility is to be located. Generating these inputs requires collecting earthquake-related information using present-day techniques in seismology and geology, and processing the collected information to integrate it into a consolidated picture of the seismotectonics of the region. The earthquake problem in engineering design has been outlined in the context of the seismic design of nuclear power plants vis-a-vis current state-of-the-art techniques. The extent to which the accepted procedures for assessing seismic risk in the region and generating the design inputs have been adhered to largely determines the safety of the structures against future earthquakes. The document is a step towards developing an approach for generating these inputs, which form the earthquake design basis. (author)

  10. Outline and Prospects of SAFOD Project

    International Nuclear Information System (INIS)

    Malin, Peter

    2014-01-01

    SAFOD is a project to investigate and understand the rupture process of the San Andreas Fault through an investigation of material composition of the fault zone; observation to reveal underground structure; and measurement of physical properties, such as stresses on and around the fault, permeability, and pore pressure, etc. at Parkfield. In addition to observation by a seismometer and tilt-meter which were installed at a depth of about three kilometers in a borehole in the fault zone, core sampling and geophysical logging were carried out as part of this project. High quality data, such as that on guided waves (special seismic waves that propagate in the fault zone) and repeated earthquakes (earthquakes with very similar waveforms) were acquired. Such data can be very useful in understanding the fault rupture process. (authors)

  11. Limitation of the Predominant-Period Estimator for Earthquake Early Warning and the Initial Rupture of Earthquakes

    Science.gov (United States)

    Yamada, T.; Ide, S.

    2007-12-01

    Earthquake early warning is an important and challenging issue for the reduction of seismic damage, especially for the mitigation of human suffering. One of the most important problems in earthquake early warning systems is how quickly we can estimate the final size of an earthquake after we observe the ground motion. It is relevant to the problem of whether the initial rupture of an earthquake carries some information associated with its final size. Nakamura (1988) developed the Urgent Earthquake Detection and Alarm System (UrEDAS). It calculates the predominant period of the P wave (τp) and estimates the magnitude of an earthquake immediately after the P wave arrival from the value of τpmax, or the maximum value of τp. A similar approach has been adopted by other earthquake alarm systems (e.g., Allen and Kanamori (2003)). To investigate the characteristics of the parameter τp and the effect of the length of the time window (TW) in the τpmax calculation, we analyze the high-frequency recordings of earthquakes at very close distances in the Mponeng mine in South Africa. We find that values of τpmax have upper and lower limits. For larger earthquakes whose source durations are longer than TW, the values of τpmax have an upper limit which depends on TW. On the other hand, the values for smaller earthquakes have a lower limit which is proportional to the sampling interval. For intermediate earthquakes, the values of τpmax are close to their typical source durations. These two limits and the slope for intermediate earthquakes yield an artificial final size dependence of τpmax in a wide size range. The parameter τpmax is useful for detecting large earthquakes and broadcasting earthquake early warnings. However, its dependence on the final size of earthquakes does not suggest that the earthquake rupture is deterministic. This is because τpmax does not always have a direct relation to the physical quantities of an earthquake.
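
    The recursive predominant-period estimator referred to above is commonly written as τp,i = 2π√(Xi/Di), where Xi and Di are exponentially smoothed powers of the ground velocity and its time derivative. The sketch below follows that standard formulation; the smoothing constant and the 4 s window in the usage note are typical illustrative choices, not values taken from this abstract.

      import numpy as np

      def tau_p(velocity, dt, alpha=0.999):
          """Recursive predominant-period estimate tau_p from a velocity trace.

          velocity : ground-velocity samples starting at the P arrival
          dt       : sampling interval (s)
          alpha    : smoothing constant, typically close to 1 for ~100 Hz data
          """
          deriv = np.gradient(velocity, dt)            # time derivative of velocity
          x_pow = d_pow = 0.0
          tp = np.zeros(len(velocity))
          for i, (v, d) in enumerate(zip(velocity, deriv)):
              x_pow = alpha * x_pow + v * v            # smoothed velocity power
              d_pow = alpha * d_pow + d * d            # smoothed derivative power
              tp[i] = 2.0 * np.pi * np.sqrt(x_pow / d_pow) if d_pow > 0.0 else 0.0
          return tp

      # tau_p_max over a fixed time window TW after the P arrival, e.g. TW = 4 s at 100 Hz:
      # tp = tau_p(trace, dt=0.01); tp_max = tp[: int(4.0 / 0.01)].max()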

  12. Sun, Moon and Earthquakes

    Science.gov (United States)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km × 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of Moon's foot print on earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences for magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 (A and B): the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, EQ count 376); the right-hand figure plots (EMD+SEM) vs GMT timings for the same data. All 376 events, including the main event, faithfully follow the straight-line curve.

  13. A Seismo-Tectonic Signal From Offshore Sedimentation: The 2010 Haiti Earthquake and Prior Events

    Science.gov (United States)

    McHugh, C. M.; Seeber, L.; Cormier, M.; Hornbach, M.; Momplaisir, R.; Waldhauser, F.; Sorlien, C. C.; Steckler, M. S.; Gulick, S.

    2011-12-01

    The Mw 7.0 January 2010 earthquake in Haiti was one of the deadliest in history. It involved multiple faults along or near the main Enriquillo-Plantain Garden Fault (EPGF). This left-lateral transform is a branch of the northern Caribbean plate boundary across southern Hispaniola. The main rupture was strike-slip but almost all aftershocks had thrust mechanisms, and surface deformation may have been concentrated on anticline forelimbs driven by blind thrust faults. Earthquake-generated mass-wasting and turbidity currents were sampled from the Canal du Sud slope (~1000 m water depth), a basin at 1500 m, and the deepest part of the strait at 1700 m. The turbidites were strongly correlated using 234Th, which has a half-life of 24 days. In the deepest area, a turbidite-homogenite unit (T-H) extends over 50 km2 and is composed of basal sand beds 5 cm thick and 50 cm of mud above. The sedimentary structures in the sand were linked to oscillatory motions by internal seiches. The T-H units recovered from the slope and deep basin are similar in composition. The Leogane Delta, upslope from the sampling sites, is rich in this lithology that has been linked to oceanic basement rocks exposed on the southern Haitian peninsula. In contrast, the T-H unit recovered from the basin at 1500 m is perched behind a thrust anticline and has a greater concentration of Ca derived from Ca-rich sources such as the Tapion Ridge on the southern peninsula. The Tapion Ridge is a compressional structure associated with a restraining bend along the EPGF. The T-H unit beneath the 2010 deposit has a 14C age of 2400 cal yr BP and is interpreted as an earthquake-triggered deposit. It is nearly identical in thickness, composition and fine structures to the 2010 T-H. Notably absent from the record are younger turbidites that could have been linked to the historic 1770 AD and other similar earthquakes expected from GPS rates across the EPGF. Two hypotheses are being considered for this long gap in T-H sedimentation

  14. Earthquake engineering development before and after the March 4, 1977, Vrancea, Romania earthquake

    International Nuclear Information System (INIS)

    Georgescu, E.-S.

    2002-01-01

    At 25 years since the Vrancea earthquake of March 4th, 1977, we can analyze in an open and critical way its impact on the evolution of earthquake engineering codes and protection policies in Romania. The earthquake (MG-R = 7.2; Mw = 7.5) produced 1,570 casualties and more than 11,300 injured persons (90% of the victims in Bucharest); seismic losses were estimated at more than USD 2 billion. The 1977 earthquake represented a significant episode of the 20th century in the seismic zones of Romania and neighboring countries. The INCERC seismic record of March 4, 1977 put in evidence, for the first time, the spectral content of long-period seismic motions of Vrancea earthquakes, the duration, the number of cycles and the values of actual accelerations, with important overloading effects upon flexible structures. The seismic coefficients ks, the spectral curve (the dynamic coefficient βr), the seismic zonation map, and the requirements in the antiseismic design norms were drastically changed, while the microzonation maps of the time ceased to be used, and the specific Vrancea earthquake recurrence was reconsidered based on hazard studies. Thus, the paper emphasises: - the existing engineering knowledge, earthquake code and zoning map requirements until 1977, as well as seismology and structural lessons since 1977; - recent aspects of implementing the Earthquake Code P.100/1992 and harmonization with Eurocodes, in conjunction with the specifics of urban and rural seismic risk and enforcing policies on strengthening of existing buildings; - a strategic view of disaster prevention, using earthquake scenarios and loss assessments, insurance, earthquake education and training; - the need for a closer transfer of knowledge between seismologists, engineers and officials in charge of disaster prevention public policies. (author)

  15. The music of earthquakes and Earthquake Quartet #1

    Science.gov (United States)

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  16. Toward real-time regional earthquake simulation of Taiwan earthquakes

    Science.gov (United States)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.
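
    The quoted 545 m surface grid spacing and 1.0 Hz accuracy limit are tied together by a standard rule of thumb: the mesh must sample the shortest propagating wavelength with a few grid points. The check below is a minimal sketch; the minimum shear-wave speed and the points-per-wavelength value are assumptions for illustration, not parameters reported for the ROS mesh.

      def max_resolved_frequency(grid_spacing_m, vs_min=2500.0, pts_per_wavelength=5.0):
          """Rule-of-thumb upper frequency limit of a mesh.

          grid_spacing_m     : grid point spacing at the surface (m)
          vs_min             : assumed minimum shear-wave speed (m/s)
          pts_per_wavelength : assumed sampling required per minimum wavelength
          """
          min_wavelength = grid_spacing_m * pts_per_wavelength
          return vs_min / min_wavelength

      print(max_resolved_frequency(545.0))   # ~0.9 Hz with these assumed values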

  17. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  18. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has a moderate seismicity, and on average the population feels 40 earthquakes per year or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average period of recurrence is about 75 years. For this reason historical earthquake research has been of special importance in Austria. The past interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, last but not least, as an example of a recently carried out case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  19. Where was the 1898 Mare Island Earthquake? Insights from the 2014 South Napa Earthquake

    Science.gov (United States)

    Hough, S. E.

    2014-12-01

    The 2014 South Napa earthquake provides an opportunity to reconsider the Mare Island earthquake of 31 March 1898, which caused severe damage to buildings at a Navy yard on the island. Revisiting archival accounts of the 1898 earthquake, I estimate a lower intensity magnitude, 5.8, than the value in the current Uniform California Earthquake Rupture Forecast (UCERF) catalog (6.4). However, I note that intensity magnitude can differ from Mw by upwards of half a unit depending on stress drop, which for a historical earthquake is unknowable. In the aftermath of the 2014 earthquake, there has been speculation that apparently severe effects on Mare Island in 1898 were due to the vulnerability of local structures. No surface rupture has ever been identified from the 1898 event, which is commonly associated with the Hayward-Rodgers Creek fault system, some 10 km west of Mare Island (e.g., Parsons et al., 2003). Reconsideration of detailed archival accounts of the 1898 earthquake, together with a comparison of the intensity distributions for the two earthquakes, points to genuinely severe, likely near-field ground motions on Mare Island. The 2014 earthquake did cause significant damage to older brick buildings on Mare Island, but the level of damage does not match the severity of documented damage in 1898. The high-intensity fields for the two earthquakes are, moreover, spatially shifted, with the centroid of the 2014 distribution near the town of Napa and that of the 1898 distribution near Mare Island, east of the Hayward-Rodgers Creek system. I conclude that the 1898 Mare Island earthquake was centered on or near Mare Island, possibly involving rupture of one or both strands of the Franklin fault, a low-slip-rate fault sub-parallel to the Rodgers Creek fault to the west and the West Napa fault to the east. I estimate Mw5.8 assuming an average stress drop; data are also consistent with Mw6.4 if stress drop was a factor of ≈3 lower than average for California earthquakes. I

  20. Earthquakes, May-June 1991

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    One major earthquake occurred during this reporting period. This was a magnitude 7.1 earthquake in Indonesia (Minahassa Peninsula) on June 20. Earthquake-related deaths were reported in the Western Caucasus (Georgia, USSR) on May 3 and June 15. One earthquake-related death was also reported in El Salvador on June 21.

  1. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Science.gov (United States)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  2. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms~7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms~7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms~6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms~6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms~6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (~25 stations), Azerbaijan (~35 stations), Armenia (~14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  3. Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?

    Science.gov (United States)

    Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.

    2018-02-01

    The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of M_λ ≥ 7.0 and M_λ ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be M_σ ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best-fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with M_λ ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with M_λ ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 M_λ ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 M_λ ≥ 7.0 in each catalog and
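
    A minimal sketch of the natural-time bookkeeping described above: count the number of small (here M ≥ 5.1) events between successive large (M ≥ 7.0) events and fit a Weibull distribution to those counts, with β = 1 indicating a random sequence. The catalog array in the usage note is a placeholder, and fitting details (e.g. handling zero counts) are simplified.

      import numpy as np
      from scipy.stats import weibull_min

      def natural_time_counts(mags, m_small=5.1, m_large=7.0):
          """Interevent counts in natural time: number of events with M >= m_small
          between successive events with M >= m_large (magnitudes in time order)."""
          counts, n = [], 0
          for m in mags:
              if m >= m_large:
                  counts.append(n)
                  n = 0
              elif m >= m_small:
                  n += 1
          return np.array(counts[1:])   # drop the open interval before the first large event

      # usage with a hypothetical chronological magnitude array `catalog_mags`:
      # counts = natural_time_counts(catalog_mags)
      # beta, loc, scale = weibull_min.fit(counts[counts > 0], floc=0)   # beta is the shape exponent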

  4. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed
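
    To make the lower-bound-magnitude sensitivity concrete: under a Gutenberg-Richter recurrence model, dropping the cutoff from 5.0 (EPRI) to 3.75 (LLNL) lets many more events into the hazard integral. The a- and b-values below are arbitrary illustrations; only the ratio between the two cutoffs matters for the point being made, and the actual effect on computed hazard also depends on how strongly the smaller events shake.

      def gr_cumulative_rate(m_min, a=4.0, b=1.0):
          """Annual rate of earthquakes with magnitude >= m_min, N(m) = 10**(a - b*m)."""
          return 10.0 ** (a - b * m_min)

      ratio = gr_cumulative_rate(3.75) / gr_cumulative_rate(5.0)
      print(ratio)   # ~17.8: roughly 18x more events considered with the lower cutoff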

  5. Extreme value distribution of earthquake magnitude

    Science.gov (United States)

    Zi, Jun Gan; Tung, C. C.

    1983-07-01

    Probability distribution of maximum earthquake magnitude is first derived for an unspecified probability distribution of earthquake magnitude. A model for energy release of large earthquakes, similar to that of Adler-Lomnitz and Lomnitz, is introduced from which the probability distribution of earthquake magnitude is obtained. An extensive set of world data for shallow earthquakes, covering the period from 1904 to 1980, is used to determine the parameters of the probability distribution of maximum earthquake magnitude. Because of the special form of probability distribution of earthquake magnitude, a simple iterative scheme is devised to facilitate the estimation of these parameters by the method of least-squares. The agreement between the empirical and derived probability distributions of maximum earthquake magnitude is excellent.
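
    A minimal sketch of the classical extreme-value treatment mentioned above, in the Epstein-Lomnitz spirit rather than the paper's own derived distribution and least-squares scheme: take the largest magnitude in each year and fit a Gumbel distribution to those annual maxima. The synthetic catalog merely stands in for the 1904-1980 world data, and all parameter values are illustrative.

      import numpy as np
      from scipy.stats import gumbel_r

      rng = np.random.default_rng(0)

      # Synthetic stand-in catalog: Gutenberg-Richter (exponential) magnitudes above 4.0,
      # ~300 shallow events per year for 77 "years" (cf. the 1904-1980 study period).
      b_value = 1.0
      annual_maxima = np.array([
          (4.0 + rng.exponential(1.0 / (b_value * np.log(10)), size=300)).max()
          for _ in range(77)
      ])

      loc, scale = gumbel_r.fit(annual_maxima)
      p_exceed = gumbel_r.sf(8.0, loc=loc, scale=scale)   # P(annual maximum exceeds M 8.0)
      print(loc, scale, p_exceed)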

  6. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes: not all plants, however, have been explicitly designed to the same level of earthquake induced forces. Understanding the nature of strong ground motion near the source of the earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times, field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities, and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that for future construction of NPP's near-field strong motion must be explicitly considered in design

  7. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  8. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes received in primary schools is considered…

  9. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    Science.gov (United States)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a "really big one" will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago, than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record, or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
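
    The sensitivity to assumptions that the students explore can be reproduced in a few lines instead of a spreadsheet: under a time-independent (Poisson) model, the 50-year probability depends only on the assumed mean recurrence interval, which changes sharply depending on whether the full paleoearthquake record or only the recent cluster is used. The recurrence intervals below are illustrative, not the values derived from the subsidence and turbidite record.

      import numpy as np

      def poisson_prob(mean_recurrence_yr, window_yr=50.0):
          """P(at least one event in the next `window_yr` years) for a Poisson model."""
          return 1.0 - np.exp(-window_yr / mean_recurrence_yr)

      # Illustrative recurrence intervals only:
      print(poisson_prob(500.0))   # full-record average recurrence    -> ~0.10
      print(poisson_prob(330.0))   # within-cluster average recurrence -> ~0.14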

  10. Do Earthquakes Shake Stock Markets?

    Science.gov (United States)

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  11. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    Science.gov (United States)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 min after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high resolution SEM mesh model is developed for the whole Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  12. Sedimentary Signatures of Submarine Earthquakes: Deciphering the Extent of Sediment Remobilization from the 2011 Tohoku Earthquake and Tsunami and 2010 Haiti Earthquake

    Science.gov (United States)

    McHugh, C. M.; Seeber, L.; Moernaut, J.; Strasser, M.; Kanamatsu, T.; Ikehara, K.; Bopp, R.; Mustaque, S.; Usami, K.; Schwestermann, T.; Kioka, A.; Moore, L. M.

    2017-12-01

    The 2004 Sumatra-Andaman Mw9.3 and the 2011 Tohoku (Japan) Mw9.0 earthquakes and tsunamis were huge geological events with major societal consequences. Both were along subduction boundaries and ruptured portions of these boundaries that had been deemed incapable of such events. Submarine strike-slip earthquakes, such as the 2010 Mw7.0 in Haiti, are smaller but may be closer to population centers and can be similarly catastrophic. Both classes of earthquakes remobilize sediment and leave distinct signatures in the geologic record by a wide range of processes that depends on both environment and earthquake characteristics. Understanding them has the potential of greatly expanding the record of past earthquakes, which is critical for geohazard analysis. Recent events offer precious ground truth about the earthquakes and short-lived radioisotopes offer invaluable tools to identify sediments they remobilized. In the 2011 Mw9 Japan earthquake they document the spatial extent of remobilized sediment from water depths of 626 m in the forearc slope to trench depths of 8000 m. Subbottom profiles, multibeam bathymetry and 40 piston cores collected by the R/V Natsushima and R/V Sonne expeditions to the Japan Trench document multiple turbidites and high-density flows. Core tops enriched in 210Pbxs, 137Cs and 134Cs reveal sediment deposited by the 2011 Tohoku earthquake and tsunami. The thickest deposits (2 m) were documented on a mid-slope terrace and trench (4000-8000 m). Sediment was deposited on some terraces (600-3000 m), but shed from the steep forearc slope (3000-4000 m). The 2010 Haiti mainshock ruptured along the southern flank of Canal du Sud and triggered multiple nearshore sediment failures, generated turbidity currents and stirred fine sediment into suspension throughout this basin. A tsunami was modeled to stem from both sediment failures and tectonics. Remobilized sediment was tracked with short-lived radioisotopes from the nearshore, slope, in fault basins including the

  13. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

    Directory of Open Access Journals (Sweden)

    W. F. Peng

    2012-03-01

    The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn, as follows: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0–2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake. The directions deviating from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugated region. However, the probability of the anomalies' appearance and the extent of the anomalies in the magnetically conjugated region are smaller than for the anomalies near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.
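
    The abstract does not state the exact anomaly-detection rule applied to the GIM TEC series, so the following is only a generic sliding-quartile bound of the kind commonly used in pre-earthquake TEC studies: a value is flagged when it falls outside the median ± k·IQR of the preceding days. The window length and k are illustrative assumptions.

      import numpy as np

      def tec_anomaly_flags(tec, window=15, k=1.5):
          """Flag TEC samples outside median +/- k*IQR of the preceding `window` samples."""
          tec = np.asarray(tec, dtype=float)
          flags = np.zeros(tec.size, dtype=bool)
          for i in range(window, tec.size):
              past = tec[i - window:i]
              median = np.median(past)
              iqr = np.percentile(past, 75) - np.percentile(past, 25)
              flags[i] = abs(tec[i] - median) > k * iqr
          return flags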

  14. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    Science.gov (United States)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not common understanding in China when the M 7.9 Wenchuan earthquake struck the Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save life in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  15. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    Science.gov (United States)

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential - the magnitude of earthquake the region is expected to experience. From this, we introduce a new type of time dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f type, problems can be constrained from scaling relations and finite extents. [Figure: record-breaking hazard map of southern California, 2012-08-06; "warm" colors indicate local acceleration (elevated hazard).]
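
    One of the scaling ingredients named above, the modified Omori law, is easy to state concretely: the aftershock rate decays as K/(t + c)^p, and the expected number of aftershocks in any interval follows from its analytic integral. The parameter values below are generic illustrations; the abstract's point is that K, c and p (and their spatial counterparts) can themselves be tied to the parent earthquake's magnitude.

      def omori_rate(t_days, K=100.0, c=0.05, p=1.1):
          """Modified Omori law: aftershock rate K / (t + c)**p (illustrative parameters)."""
          return K / (t_days + c) ** p

      def omori_count(t1, t2, K=100.0, c=0.05, p=1.1):
          """Expected number of aftershocks between t1 and t2 days (analytic integral, p != 1)."""
          return K * ((t2 + c) ** (1.0 - p) - (t1 + c) ** (1.0 - p)) / (1.0 - p)

      print(omori_count(1.0, 30.0))   # expected events in days 1-30 after the mainshock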

  16. Post-earthquake building safety inspection: Lessons from the Canterbury, New Zealand, earthquakes

    Science.gov (United States)

    Marshall, J.; Jaiswal, Kishor; Gould, N.; Turner, F.; Lizundia, B.; Barnes, J.

    2013-01-01

    The authors discuss some of the unique aspects and lessons of the New Zealand post-earthquake building safety inspection program that was implemented following the Canterbury earthquake sequence of 2010–2011. The post-event safety assessment program was one of the largest and longest programs undertaken in recent times anywhere in the world. The effort engaged hundreds of engineering professionals throughout the country, and also sought expertise from outside, to perform post-earthquake structural safety inspections of more than 100,000 buildings in the city of Christchurch and the surrounding suburbs. While the building safety inspection procedure implemented was analogous to the ATC 20 program in the United States, many modifications were proposed and implemented in order to assess the large number of buildings that were subjected to strong and variable shaking during a period of two years. This note discusses some of the key aspects of the post-earthquake building safety inspection program and summarizes important lessons that can improve future earthquake response.

  17. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.

  18. Countermeasures to earthquakes in nuclear plants

    International Nuclear Information System (INIS)

    Sato, Kazuhide

    1979-01-01

    The contribution of atomic energy to mankind is immeasurable, but the danger posed by radioactivity is of a special nature. Therefore, in the design of nuclear power plants safety has been regarded as important, and in Japan, where earthquakes occur frequently, countermeasures to earthquakes have naturally been incorporated in the examination of safety. The radioactive substances handled in nuclear power stations and spent fuel reprocessing plants are briefly explained. The occurrence of earthquakes cannot be predicted effectively, and the disaster due to earthquakes is apt to be remarkably large. In nuclear plants, the prevention of damage to the facilities and the maintenance of their functions are required at the time of earthquakes. Regarding the location of nuclear plants, the history of earthquakes, the possible magnitude of earthquakes, the properties of the ground and the position of nuclear plants should be examined. After the place of installation has been decided, the earthquake used for design is selected, evaluating active faults and determining the standard earthquakes. As the fundamentals of aseismatic design, the classification according to importance, the earthquakes for design corresponding to the classes of importance, the combination of loads and allowable stress are explained. (Kako, I.)

  19. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is apparently changed after the time series is rearranged. This suggests that the SOC theory should not be used to oppose the efforts of earthquake prediction
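
    The test described above is straightforward to reproduce in outline: compute the first-return-time distribution P_M(T) for events above a magnitude threshold, randomly rearrange the sequence of events, recompute, and compare the two distributions. The sketch below uses a two-sample Kolmogorov-Smirnov comparison as the "apparently changed" criterion, which is a stand-in for whatever statistic the authors used; the catalog arrays in the usage note are hypothetical.

      import numpy as np
      from scipy.stats import ks_2samp

      def return_times(times, mags, m_thresh):
          """Waiting times between successive events with magnitude >= m_thresh."""
          t = np.sort(times[mags >= m_thresh])
          return np.diff(t)

      def shuffle_test(times, mags, m_thresh, seed=0):
          """Compare P_M(T) of the observed catalog with a randomly rearranged one.

          A small p-value means the first-return-time distribution changes after the
          events are rearranged, i.e. the original ordering carries correlations."""
          rng = np.random.default_rng(seed)
          observed = return_times(times, mags, m_thresh)
          rearranged = return_times(times, rng.permutation(mags), m_thresh)
          return ks_2samp(observed, rearranged)

      # usage with hypothetical arrays: result = shuffle_test(catalog_t, catalog_m, 4.0)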

  20. Earthquake precursors: spatial-temporal gravity changes before the great earthquakes in the Sichuan-Yunnan area

    Science.gov (United States)

    Zhu, Yi-Qing; Liang, Wei-Feng; Zhang, Song

    2018-01-01

    Using multiple-scale mobile gravity data in the Sichuan-Yunnan area, we systematically analyzed the relationships between spatial-temporal gravity changes and the 2014 Ludian, Yunnan Province Ms6.5 earthquake and the 2014 Kangding Ms6.3, 2013 Lushan Ms7.0, and 2008 Wenchuan Ms8.0 earthquakes in Sichuan Province. Our main results are as follows. (1) Before the occurrence of large earthquakes, gravity anomalies occur in a large area around the epicenters. The directions of gravity change gradient belts usually agree roughly with the directions of the main fault zones of the study area. Such gravity changes might reflect the increase of crustal stress, as well as the significant active tectonic movements and surface deformations along fault zones, during the period of gestation of great earthquakes. (2) Continuous significant changes of the multiple-scale gravity fields, as well as greater gravity changes with larger time scales, can be regarded as medium-range precursors of large earthquakes. The subsequent large earthquakes always occur in the area where the gravity changes greatly. (3) The spatial-temporal gravity changes are very useful in determining the epicenter of coming large earthquakes. The large gravity networks are useful to determine the general areas of coming large earthquakes. However, the local gravity networks with high spatial-temporal resolution are suitable for determining the location of epicenters. Therefore, denser gravity observation networks are necessary for better forecasts of the epicenters of large earthquakes. (4) Using gravity changes from mobile observation data, we made medium-range forecasts of the Kangding, Ludian, Lushan, and Wenchuan earthquakes, with especially successful forecasts of the location of their epicenters. Based on the above discussions, we emphasize that medium-/long-term potential for large earthquakes might exist nowadays in some areas with significant gravity anomalies in the study region. Thus, the monitoring

  1. Overestimation of the earthquake hazard along the Himalaya: constraints in bracketing of medieval earthquakes from paleoseismic studies

    Science.gov (United States)

    Arora, Shreya; Malik, Javed N.

    2017-12-01

    The Himalaya is one of the most seismically active regions of the world. The occurrence of several large magnitude earthquakes, viz. the 1905 Kangra earthquake (Mw 7.8), 1934 Bihar-Nepal earthquake (Mw 8.2), 1950 Assam earthquake (Mw 8.4), 2005 Kashmir earthquake (Mw 7.6), and 2015 Gorkha earthquake (Mw 7.8), is testimony to ongoing tectonic activity. In the last few decades, tremendous efforts have been made along the Himalayan arc to understand the patterns of earthquake occurrence, size, extent, and return periods. Some of the large magnitude earthquakes produced surface rupture, while others remained blind. Furthermore, owing to the incompleteness of the earthquake catalogue, very few events can be correlated with medieval earthquakes. Based on the existing paleoseismic data, it is certainly difficult to precisely determine the extent of surface rupture of these earthquakes, and also of those events that occurred during historic times. In this paper, we have compiled the paleoseismological data and recalibrated the radiocarbon ages from the trenches excavated by previous workers along the entire Himalaya, and compared the earthquake scenario with the past. Our studies suggest that there were multiple earthquake events with overlapping surface ruptures in small patches, with an average rupture length of 300 km limiting Mw 7.8-8.0 for the Himalayan arc, rather than two or three giant earthquakes rupturing the whole front. It has been identified that large magnitude Himalayan earthquakes, such as the 1905 Kangra, 1934 Bihar-Nepal, and 1950 Assam events, occurred within a time frame of 45 years. If such events are dated, there is a high possibility that, within a range of ±50 years, they may be considered the remnants of one giant earthquake rupturing the entire Himalayan arc, leading to an overestimation of the seismic hazard scenario in the Himalaya.

  2. Stress triggering of the Lushan M7.0 earthquake by the Wenchuan Ms8.0 earthquake

    Directory of Open Access Journals (Sweden)

    Wu Jianchao

    2013-08-01

    Full Text Available The Wenchuan Ms8.0 earthquake and the Lushan M7.0 earthquake occurred in the north and south segments of the Longmenshan nappe tectonic belt, respectively. Based on the focal mechanism and finite fault model of the Wenchuan Ms8.0 earthquake, we calculated the Coulomb failure stress change. The inverted Coulomb stress changes based on the Nishimura and Chenji models both show that the Lushan M7.0 earthquake occurred in the area of increased Coulomb failure stress induced by the Wenchuan Ms8.0 earthquake. The Coulomb failure stress increased by approximately 0.135-0.152 bar at the source of the Lushan M7.0 earthquake, which is far more than the stress triggering threshold. Therefore, the Lushan M7.0 earthquake was most likely triggered by the Coulomb failure stress change.
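
    For orientation, the quantity discussed above is usually written as ΔCFS = Δτ + μ'Δσn. The sketch below evaluates it for illustrative stress changes (the numbers and sign convention are assumptions for demonstration, not the values inverted in the paper).

        def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
            # Coulomb failure stress change (same units as the inputs, e.g. bar):
            #   d_shear  - shear stress change resolved in the receiver fault's slip direction
            #   d_normal - normal stress change, positive for unclamping (tension positive)
            #   mu_eff   - effective friction coefficient (commonly 0.2-0.8)
            return d_shear + mu_eff * d_normal

        # illustrative numbers only, not the values inverted in the paper
        print(coulomb_stress_change(0.10, 0.09))   # ~0.14 bar, above a ~0.1 bar triggering threshold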

  3. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  4. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of scientific content by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what principles underlie that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" earthquakes. We try to implement these results in a choreographic model, with the aim of converting earthquake sound into a visual dance system that could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience in a multisensory exploration of the earthquake phenomenon, through the stimulation of hearing, eyesight, and the perception of movement (the neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  5. Thoracic Injuries in earthquake-related versus non-earthquake-related trauma patients: differentiation via Multi-detector Computed Tomography

    Science.gov (United States)

    Dong, Zhi-hui; Yang, Zhi-gang; Chen, Tian-wu; Chu, Zhi-gang; Deng, Wen; Shao, Heng

    2011-01-01

    PURPOSE: Massive earthquakes are harmful to humankind. This study of a historical cohort aimed to investigate the difference between earthquake-related crush thoracic traumas and thoracic traumas unrelated to earthquakes using multi-detector Computed Tomography (CT). METHODS: We retrospectively compared an earthquake-exposed cohort of 215 thoracic trauma crush victims of the Sichuan earthquake to a cohort of 215 non-earthquake-related thoracic trauma patients, focusing on the lesions and coexisting injuries to the thoracic cage and the pulmonary parenchyma and pleura using a multi-detector CT. RESULTS: The incidence of rib fracture was elevated in the earthquake-exposed cohort (143 vs. 66 patients in the non-earthquake-exposed cohort, Risk Ratio (RR) = 2.2; p<0.001). Among these patients, those with more than 3 fractured ribs (106/143 vs. 41/66 patients, RR = 1.2; p<0.05) or flail chest (45/143 vs. 11/66 patients, RR = 1.9; p<0.05) were more frequently seen in the earthquake cohort. Earthquake-related crush injuries more frequently resulted in bilateral rib fractures (66/143 vs. 18/66 patients, RR = 1.7; p<0.01). Additionally, the incidence of non-rib fracture was higher in the earthquake cohort (85 vs. 60 patients, RR = 1.4; p<0.01). Pulmonary parenchymal and pleural injuries were more frequently seen in earthquake-related crush injuries (117 vs. 80 patients, RR = 1.5 for parenchymal and 146 vs. 74 patients, RR = 2.0 for pleural injuries; p<0.001). Non-rib fractures and pulmonary parenchymal and pleural injuries had a significant positive correlation with rib fractures in these two cohorts. CONCLUSIONS: Thoracic crush traumas resulting from the earthquake were life threatening, with a high incidence of bony thoracic fractures. The ribs were frequently involved in bilateral and severe types of fractures, which were accompanied by non-rib fractures, pulmonary parenchymal and pleural injuries. PMID:21789386

  6. Consideration for standard earthquake vibration (1). The Niigataken Chuetsu-oki Earthquake in 2007

    International Nuclear Information System (INIS)

    Ishibashi, Katsuhiko

    2007-01-01

    An outline of the new guideline for the quakeproof design standard of nuclear power plants and of the standard earthquake vibration is given. The points of improvement in the new guideline are discussed on the basis of the Kashiwazaki-Kariwa Nuclear Power Plant incident, and the fundamental limits of the new guideline are pointed out. The positioning of the quakeproof design standard for nuclear power plants, JEAG4601 of the Japan Electric Association, the new guideline, the standard earthquake vibration of the new guideline, the Niigataken Chuetsu-oki Earthquake in 2007, and the damage to the Kashiwazaki-Kariwa Nuclear Power Plant are discussed. The safety criteria of the safety review system, organization, standards, and guidelines should be improved on the basis of this earthquake and the nuclear plant accident. The general principle that 'a nuclear power plant is not constructed in an area where a large earthquake is expected' has to be realized. The precondition for all nuclear power plants is that they should not cause damage to anything. (S.Y.)

  7. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  8. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    Science.gov (United States)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of megacities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructure, and is undertaken with insufficient knowledge of the peculiarities of the regional seismicity and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios are the first link in the prevention chain and the first step in the evaluation of seismic risk. Earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, the most populated (more than 1.2 million inhabitants), industrial, and cultural region of Bulgaria, which faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake with a higher probability of occurrence that could affect the city with intensity less than or equal to VIII

  9. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    Science.gov (United States)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake-induced surface effects of liquefaction against a traditional pixel-based change detection technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality, and rectangularity. Our results suggest that object-based analysis holds promise for automatically extracting earthquake-induced damage from high-resolution aerial/satellite imagery.
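
    A minimal sketch of the object-property idea using scikit-image: label the objects of a change mask and compute shape descriptors that act as rough proxies for the properties named above. The mask here is synthetic, and solidity/extent are stand-ins for the exact concavity/convexity and rectangularity measures used in the study.

        import numpy as np
        from skimage import measure

        def shape_features(change_mask):
            # per-object shape descriptors from a binary change-detection mask;
            # solidity (area / convex area) is a proxy for concavity/convexity,
            # extent (area / bounding-box area) a rough proxy for rectangularity
            feats = []
            for r in measure.regionprops(measure.label(change_mask)):
                feats.append({"label": r.label, "area": r.area,
                              "solidity": r.solidity, "extent": r.extent,
                              "orientation": r.orientation})
            return feats

        # toy mask: a compact rectangular "building change" and an irregular "liquefaction" blob
        rng = np.random.default_rng(0)
        mask = np.zeros((60, 60), dtype=bool)
        mask[5:20, 5:25] = True
        mask[35:55, 30:55] = rng.random((20, 25)) > 0.5
        for f in shape_features(mask):
            print(f)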

  10. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much former investigation in seismology evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely, the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures, and (3) the wave interaction that connects rupture (1) and failures (2), would therefore be indispensable.

  11. The CATDAT damaging earthquakes database

    Science.gov (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies from, and expand greatly upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. The lack of consistency and the errors in other earthquake loss databases frequently cited and used in analyses were, in the view of the authors, a major shortcoming that needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto earthquake (214 billion USD damage; 2011 HNDECI-adjusted dollars), compared with the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan, and 1995 Kobe earthquakes, shows the increasing concern for economic loss in urban areas, as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  12. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

    Full Text Available The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    The lack of consistency and the errors in other earthquake loss databases frequently cited and used in analyses were, in the view of the authors, a major shortcoming that needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global

  13. Comparison of aftershock sequences between 1975 Haicheng earthquake and 1976 Tangshan earthquake

    Science.gov (United States)

    Liu, B.

    2017-12-01

    The 1975 ML 7.3 Haicheng earthquake and the 1976 ML 7.8 Tangshan earthquake occurred in the same tectonic unit. There are significant differences in the spatial-temporal distribution, number of aftershocks, and time duration of the aftershock sequences that followed these two main shocks. As is well known, aftershocks can be triggered by the change in regional seismicity derived from the main shock, which is caused by the Coulomb stress perturbation. Based on the rate- and state-dependent friction law, we quantitatively estimated the possible aftershock time duration using a combination of seismicity data, and compared the results from different approaches. The results indicate that the aftershock time duration of the Tangshan main shock is several times that of the Haicheng main shock. This can be explained by the significant relationship between aftershock time duration and earthquake nucleation history, normal stress, and shear stress loading rate on the fault. In fact, the obvious difference in earthquake nucleation history between these two main shocks is the foreshocks: the 1975 Haicheng earthquake had clear and long foreshocks, while the 1976 Tangshan earthquake did not. In that case, abundant foreshocks may indicate a long and active nucleation process that may have changed (weakened) the rocks in the source region, so such events should have shorter aftershock sequences because stress in weak rocks decays faster.
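
    The rate- and state-dependent estimate referred to above is often summarized by Dieterich's expression for the characteristic aftershock duration, t_a = A*sigma / tau_dot. A small sketch with illustrative numbers (not the values used for Haicheng or Tangshan):

        def aftershock_duration_years(a_sigma_mpa, stressing_rate_mpa_per_yr):
            # characteristic aftershock duration t_a = A*sigma / tau_dot
            # (rate- and state-dependent friction, after Dieterich, 1994)
            return a_sigma_mpa / stressing_rate_mpa_per_yr

        # illustrative values only: the same A*sigma under two different loading rates
        print(aftershock_duration_years(0.05, 0.005))   # 10.0 years on a slowly loaded fault
        print(aftershock_duration_years(0.05, 0.02))    # 2.5 years on a rapidly loaded fault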

  14. Ionospheric phenomena before strong earthquakes

    Directory of Open Access Journals (Sweden)

    A. S. Silina

    2001-01-01

    Full Text Available A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5 located less than 500 km from an ionospheric vertical sounding station is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h ≤ 33 km) earthquakes were analysed separately. Data of nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs and Es-spread at the middle latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00 LT–06:00 LT) and of fbEs averaged over the nighttime hours (18:00 LT–06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency which characterizes the degree of small-scale turbulence, it was shown that this value increases 1–4 days before "crust" earthquakes, and it does not change before "deep" earthquakes. Studying Es-spread which manifests itself as a diffuse Es track on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1–3 days before the earthquakes; for "deep" earthquakes the effect is more intensive. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere occur for "deep" and "crust" events.

  15. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  16. What Can Sounds Tell Us About Earthquake Interactions?

    Science.gov (United States)

    Aiken, C.; Peng, Z.

    2012-12-01

    It is important not only for seismologists but also for educators to effectively convey information about earthquakes and the influences earthquakes can have on each other. Recent studies using auditory display [e.g. Kilb et al., 2012; Peng et al., 2012] have depicted catastrophic earthquakes and the effects large earthquakes can have on other parts of the world. Auditory display of earthquakes, which combines static images with time-compressed sound of recorded seismic data, is a new approach to disseminating information to a general audience about earthquakes and earthquake interactions. Earthquake interactions are important for understanding the underlying physics of earthquakes and other seismic phenomena such as tremor, in addition to their source characteristics (e.g. frequency content, amplitudes). Earthquake interactions can include, for example, a large, shallow earthquake followed by increased seismicity around the mainshock rupture (i.e. aftershocks) or even a large earthquake triggering earthquakes or tremors several hundreds to thousands of kilometers away [Hill and Prejean, 2007; Peng and Gomberg, 2010]. We use standard tools like MATLAB, QuickTime Pro, and Python to produce animations that illustrate earthquake interactions. Our efforts are focused on producing animations that depict cross-section (side) views of tremors triggered along the San Andreas Fault by distant earthquakes, as well as map (bird's-eye) views of mainshock-aftershock sequences such as the 2011/08/23 Mw5.8 Virginia earthquake sequence. These examples of earthquake interactions include sonifying earthquake and tremor catalogs as musical notes (e.g. piano keys) as well as audifying seismic data using time-compression. Our overall goal is to use auditory display to invigorate a general interest in earthquake seismology that leads to the understanding of how earthquakes occur, how earthquakes influence one another as well as tremors, and what the musical properties of these
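
    Audification by time compression, as mentioned above, amounts to replaying a seismogram's samples fast enough that its frequency content moves into the audible band. A minimal sketch with a synthetic trace (the sampling rate, speed-up factor, and file name are illustrative assumptions, not the authors' workflow):

        import numpy as np
        from scipy.io import wavfile

        def audify(trace, fs_in, speedup=200, out="quake.wav"):
            # time-compress a seismic trace into the audible band by replaying
            # its samples at (speedup x) the original sampling rate
            x = np.asarray(trace, dtype=float)
            peak = np.max(np.abs(x))
            if peak > 0:
                x = x / peak                                   # normalize to [-1, 1]
            wavfile.write(out, int(fs_in * speedup), (x * 32767).astype(np.int16))

        # synthetic stand-in for a 100 Hz broadband recording (10 minutes long)
        fs = 100.0
        t = np.arange(0.0, 600.0, 1.0 / fs)
        trace = np.exp(-t / 120.0) * np.sin(2.0 * np.pi * 0.5 * t)
        audify(trace, fs)   # 10 minutes of data become a 3-second sound clip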

  17. A smartphone application for earthquakes that matter!

    Science.gov (United States)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for the rapid reception of earthquake information by the public, with some of them having been downloaded more than 1 million times! The advantages are obvious: wherever people are, they can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? What are the earthquakes that really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications are related to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter the most to the public (and to authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging, and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre). Potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment) developed and operated at EMSC, which rapidly assesses earthquake impact by comparing the population exposed to each expected

  18. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk

  19. Seismicity map tools for earthquake studies

    Science.gov (United States)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new, online set of tools for use within Google Maps for earthquake research. We demonstrate this server-based, online platform (developed with PHP, Javascript, and MySQL) and the new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis on earthquake data using Google Maps and to plot various seismicity graphs. The tool box has been extended to draw line segments on the map, multiple straight lines horizontally and vertically, as well as multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and earthquake cluster shifts within the segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the 'b' value, etc. What is novel for the platform are the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow the calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
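
    One of the statistics such a platform typically computes is the Gutenberg-Richter 'b' value; a common way to obtain it from a catalog is the maximum-likelihood estimator sketched below (the magnitudes are synthetic and the completeness magnitude is chosen purely for illustration).

        import numpy as np

        def b_value(mags, m_c):
            # maximum-likelihood Gutenberg-Richter b-value (Aki, 1965);
            # for magnitudes binned to width dm, replace m_c by (m_c - dm/2)
            m = np.asarray(mags, dtype=float)
            m = m[m >= m_c]
            return np.log10(np.e) / (m.mean() - m_c)

        # synthetic catalog drawn from b = 1.0 above a completeness magnitude Mc = 2.0
        rng = np.random.default_rng(1)
        mags = 2.0 + rng.exponential(scale=1.0 / np.log(10.0), size=2000)
        print(round(b_value(mags, 2.0), 2))   # close to 1.0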

  20. Earthquake at 40 feet

    Science.gov (United States)

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography, when the earthquake struck.

  1. Earthquakes and economic growth

    OpenAIRE

    Fisker, Peter Simonsen

    2012-01-01

    This study explores the economic consequences of earthquakes. In particular, it investigates how exposure to earthquakes affects economic growth both across and within countries. The key result of the empirical analysis is that while there are no observable effects at the country level, earthquake exposure significantly decreases 5-year economic growth at the local level. Areas at lower stages of economic development suffer more in terms of economic growth than richer areas. In addition,...

  2. Fractal analysis of the spatial distribution of earthquakes along the Hellenic Subduction Zone

    Science.gov (United States)

    Papadakis, Giorgos; Vallianatos, Filippos; Sammonds, Peter

    2014-05-01


  3. Radon anomalies prior to earthquakes (2). Atmospheric radon anomaly observed before the Hyogoken-Nanbu earthquake

    International Nuclear Information System (INIS)

    Ishikawa, Tetsuo; Tokonami, Shinji; Yasuoka, Yumi; Shinogi, Masaki; Nagahama, Hiroyuki; Omori, Yasutaka; Kawada, Yusuke

    2008-01-01

    Before the 1995 Hyogoken-Nanbu earthquake, various geochemical precursors were observed in the aftershock area: chloride ion concentration, groundwater discharge rate, groundwater radon concentration and so on. Kobe Pharmaceutical University (KPU) is located about 25 km northeast from the epicenter and within the aftershock area. Atmospheric radon concentration had been continuously measured from 1984 at KPU, using a flow-type ionization chamber. The radon concentration data were analyzed using the smoothed residual values which represent the daily minimum of radon concentration with the exclusion of normalized seasonal variation. The radon concentration (smoothed residual values) demonstrated an upward trend about two months before the Hyogoken-Nanbu earthquake. The trend can be well fitted to a log-periodic model related to earthquake fault dynamics. As a result of model fitting, a critical point was calculated to be between 13 and 27 January 1995, which was in good agreement with the occurrence date of earthquake (17 January 1995). The mechanism of radon anomaly before earthquakes is not fully understood. However, it might be possible to detect atmospheric radon anomaly as a precursor before a large earthquake, if (1) the measurement is conducted near the earthquake fault, (2) the monitoring station is located on granite (radon-rich) areas, and (3) the measurement is conducted for more than several years before the earthquake to obtain background data. (author)
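
    The critical-point estimate described above comes from fitting a log-periodic trend to the radon residuals. The sketch below fits such a model to synthetic data with scipy; the functional form is a generic log-periodic expression and every parameter value is invented for illustration, not taken from the paper's fit.

        import numpy as np
        from scipy.optimize import curve_fit

        def log_periodic(t, t_c, a, b, m, c, omega, phi):
            # generic log-periodic trend toward a critical time t_c:
            # y(t) = a + b*(t_c - t)**m * (1 + c*cos(omega*ln(t_c - t) + phi))
            dt = t_c - t
            return a + b * dt**m * (1.0 + c * np.cos(omega * np.log(dt) + phi))

        # synthetic daily residuals rising toward an assumed critical time of 100 days
        rng = np.random.default_rng(2)
        t = np.arange(0.0, 95.0)
        y = log_periodic(t, 100.0, 10.0, -0.8, 0.5, 0.2, 6.0, 0.0) + rng.normal(0.0, 0.05, t.size)

        p0 = (98.0, 10.0, -1.0, 0.6, 0.2, 6.0, 0.0)                    # rough starting guess
        lb = (95.0, -100.0, -100.0, 0.1, -1.0, 1.0, -np.pi)
        ub = (120.0, 100.0, 100.0, 1.0, 1.0, 20.0, np.pi)
        popt, _ = curve_fit(log_periodic, t, y, p0=p0, bounds=(lb, ub))
        print("estimated critical time:", round(popt[0], 1), "days")   # should land near 100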

  4. Retrospective stress-forecasting of earthquakes

    Science.gov (United States)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, lower crust, and uppermost ~400 km of the mantle. (The microcracks are intergranular films of hydrolysed melt in the mantle.) Earthquakes release stress, and an appropriate amount of stress for the relevant magnitude must accumulate before each event. Iceland is on an extension of the Mid-Atlantic Ridge, where two transform zones, uniquely, run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes, and are the only place worldwide where SWS can be routinely monitored. Elsewhere, SWS must be monitored above temporally active occasional swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observations of changes in SWS time-delays are attributed to stress-induced changes in crack aspect ratios, allowing stress accumulation and stress relaxation to be identified. Monitoring SWS in SW Iceland in 1988, stress accumulation before an impending earthquake was recognised, and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10th November 1988, EU emailed IMO that a M5 earthquake could occur soon on a seismically active fault plane where seismicity was still continuing following a M5.1 earthquake six months earlier. Three days later, IMO emailed EU that a M5 earthquake had just occurred on the specified fault plane. We suggest this is a successful earthquake stress-forecast, where we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us from monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes which have enabled us to

  5. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    As it is the 200th anniversary of Darwin's birth, 2009 has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked, volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of the great earthquake during and immediately after. These effects sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science, where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the earth evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with results of the last publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  6. Organizational changes at Earthquakes & Volcanoes

    Science.gov (United States)

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  7. Earthquake effect on the geological environment

    International Nuclear Information System (INIS)

    Kawamura, Makoto

    1999-01-01

    Acceleration caused by earthquakes, changes in water pressure, and rock-mass strain were monitored for a series of 344 earthquakes from 1990 to 1998 at the Kamaishi In Situ Test Site. The largest acceleration registered was 57.14 gal, for the earthquake named the 'North Coast of Iwate Earthquake' (M4.4) that occurred in June 1996. Changes in water pressure were recorded for 27 earthquakes; the largest change was -0.35 kgf/cm². The water-pressure change caused by earthquakes was, however, usually smaller than that caused by rainfall in this area. No change in the electric conductivity or pH of ground water was detected before or after earthquakes throughout the entire period of monitoring. The rock-mass strain was measured with an extensometer whose detection limit was of the order of 10⁻⁸ to 10⁻⁹ degrees, and a remaining strain of about 2.5×10⁻⁹ degrees was detected following the 'Offshore Miyagi Earthquake' (M5.1) in October 1997. (H. Baba)

  8. Earthquake predictions using seismic velocity ratios

    Science.gov (United States)

    Sherburne, R. W.

    1979-01-01

    Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions for the reduction of human and economic losses, and the value of long-range earthquake prediction for planning, are obvious. Not as clear are the long-range economic and social impacts of an earthquake prediction on a specific area. The general consensus of opinion among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency.

  9. Earthquakes and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  10. Smoking prevalence increases following Canterbury earthquakes.

    Science.gov (United States)

    Erskine, Nick; Daley, Vivien; Stevenson, Sue; Rhodes, Bronwen; Beckert, Lutz

    2013-01-01

    A magnitude 7.1 earthquake hit Canterbury in September 2010. This earthquake and its associated aftershocks took the lives of 185 people and drastically changed residents' living, working, and social conditions. The aim was to explore the impact of the earthquakes on smoking status and levels of tobacco consumption in the residents of Christchurch. Semistructured interviews were carried out in two city malls and the central bus exchange 15 months after the first earthquake. A total of 1001 people were interviewed. In August 2010, prior to any earthquake, 409 (41%) participants had never smoked, 273 (27%) were currently smoking, and 316 (32%) were ex-smokers. Since the September 2010 earthquake, 76 (24%) of the 316 ex-smokers had smoked at least one cigarette and 29 (38.2%) had smoked more than 100 cigarettes. Of the 273 participants who were current smokers in August 2010, 93 (34.1%) had increased consumption following the earthquake, 94 (34.4%) had not changed, and 86 (31.5%) had decreased their consumption. Of the 93 people whose consumption increased, 53 (57%) reported the earthquake and subsequent lifestyle changes as a reason for increased smoking. In total, 24% of ex-smokers resumed smoking following the earthquake, resulting in increased smoking prevalence, and tobacco consumption levels increased in around one-third of current smokers.

  11. Cyclic migration of weak earthquakes between Lunigiana earthquake of October 10, 1995 and Reggio Emilia earthquake of October 15, 1996 (Northern Italy)

    Science.gov (United States)

    di Giovambattista, R.; Tyupkin, Yu

    The cyclic migration of weak earthquakes (M ≥ 2.2) which occurred during the year prior to the October 15, 1996 (M = 4.9) Reggio Emilia earthquake is discussed in this paper. The onset of this migration was associated with the occurrence of the October 10, 1995 (M = 4.8) Lunigiana earthquake about 90 km southwest from the epicenter of the Reggio Emilia earthquake. At least three series of earthquakes migrating from the epicentral area of the Lunigiana earthquake in the northeast direction were observed. The migration of earthquakes of the first series terminated at a distance of about 30 km from the epicenter of the Reggio Emilia earthquake. The earthquake migration of the other two series halted at about 10 km from the Reggio Emilia epicenter. The average rate of earthquake migration was about 200-300 km/year, while the time of recurrence of the observed cycles varied from 68 to 178 days. Weak earthquakes migrated along the transversal fault zones and sometimes jumped from one fault to another. A correlation between the migrating earthquakes and tidal variations is analysed. We discuss the hypothesis that the analyzed area is in a state of stress approaching the limit of the long-term durability of crustal rocks and that the observed cyclic migration is a result of a combination of a more or less regular evolution of tectonic and tidal variations.

  12. The 2012 Mw5.6 earthquake in Sofia seismogenic zone - is it a slow earthquake

    Science.gov (United States)

    Raykova, Plamena; Solakov, Dimcho; Slavcheva, Krasimira; Simeonova, Stela; Aleksandrova, Irena

    2017-04-01

    Recently our understanding of tectonic faulting has been shaken by the discoveries of seismic tremor, low-frequency earthquakes, slow slip events, and other modes of fault slip. These phenomena represent modes of failure that were thought to be non-existent and theoretically impossible only a few years ago. Slow earthquakes are seismic phenomena in which the rupture of geological faults in the earth's crust occurs gradually without creating strong tremors. Despite the growing number of observations of slow earthquakes, their origin remains unresolved. Studies show that the duration of slow earthquakes ranges from a few seconds to a few hundred seconds. While the regular earthquakes with which most people are familiar release a burst of built-up stress in seconds, slow earthquakes release energy in ways that do little damage. This study focuses on the characteristics of the Mw5.6 earthquake that occurred in the Sofia seismic zone on May 22nd, 2012. The Sofia area is the most populated, industrial, and cultural region of Bulgaria and faces considerable earthquake risk. The Sofia seismic zone is located in south-western Bulgaria, an area with pronounced tectonic activity and proven crustal movement. In the 19th century the city of Sofia (situated in the centre of the Sofia seismic zone) experienced two strong earthquakes with an epicentral intensity of 10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK64). The 2012 quake occurred in an area characterized by a long quiescence (of 95 years) for moderate events. Moreover, a reduced number of small earthquakes have also been registered in the recent past. The Mw5.6 earthquake was felt widely across the territory of Bulgaria and neighbouring countries. No casualties or severe injuries have been reported. Mostly moderate damage was observed in the cities of Pernik and Sofia and their surroundings. These observations could be assumed indicative for a

  13. The severity of an earthquake

    Science.gov (United States)

    ,

    1997-01-01

    The severity of an earthquake can be expressed in terms of both intensity and magnitude. However, the two terms are quite different, and they are often confused. Intensity is based on the observed effects of ground shaking on people, buildings, and natural features. It varies from place to place within the disturbed region depending on the location of the observer with respect to the earthquake epicenter. Magnitude is related to the amount of seismic energy released at the hypocenter of the earthquake. It is based on the amplitude of the earthquake waves recorded on instruments

  14. Thoracic Injuries in earthquake-related versus non-earthquake-related trauma patients: differentiation via Multi-detector Computed Tomography

    Directory of Open Access Journals (Sweden)

    Zhi-hui Dong

    2011-01-01

    Full Text Available PURPOSE: Massive earthquakes are harmful to humankind. This study of a historical cohort aimed to investigate the difference between earthquake-related crush thoracic traumas and thoracic traumas unrelated to earthquakes using a multi-detector Computed Tomography (CT). METHODS: We retrospectively compared an earthquake-exposed cohort of 215 thoracic trauma crush victims of the Sichuan earthquake to a cohort of 215 non-earthquake-related thoracic trauma patients, focusing on the lesions and coexisting injuries to the thoracic cage and the pulmonary parenchyma and pleura using a multi-detector CT. RESULTS: The incidence of rib fracture was elevated in the earthquake-exposed cohort (143 vs. 66 patients in the non-earthquake-exposed cohort, Risk Ratio (RR) = 2.2; p<0.001). Among these patients, those with more than 3 fractured ribs (106/143 vs. 41/66 patients, RR=1.2; p<0.05) or flail chest (45/143 vs. 11/66 patients, RR=1.9; p<0.05) were more frequently seen in the earthquake cohort. Earthquake-related crush injuries more frequently resulted in bilateral rib fractures (66/143 vs. 18/66 patients, RR= 1.7; p<0.01). Additionally, the incidence of non-rib fracture was higher in the earthquake cohort (85 vs. 60 patients, RR= 1.4; p<0.01). Pulmonary parenchymal and pleural injuries were more frequently seen in earthquake-related crush injuries (117 vs. 80 patients, RR=1.5 for parenchymal and 146 vs. 74 patients, RR = 2.0 for pleural injuries; p<0.001). Non-rib fractures, pulmonary parenchymal and pleural injuries had significant positive correlation with rib fractures in these two cohorts. CONCLUSIONS: Thoracic crush traumas resulting from the earthquake were life threatening with a high incidence of bony thoracic fractures. The ribs were frequently involved in bilateral and severe types of fractures, which were accompanied by non-rib fractures, pulmonary parenchymal and pleural injuries.
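
    As a quick arithmetic check of the headline figure above (a sketch only; the cohort sizes and counts are taken from the abstract), the risk ratio for rib fracture follows directly from the two cohort counts:

        def risk_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
            # risk ratio = incidence in the exposed cohort / incidence in the unexposed cohort
            return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

        # rib fracture: 143 of 215 earthquake-exposed vs. 66 of 215 non-exposed patients
        print(round(risk_ratio(143, 215, 66, 215), 1))   # 2.2, matching the reported RR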

  15. Earthquake prediction by Kina Method

    International Nuclear Information System (INIS)

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has been one of the earliest desires of man. Scientists have worked hard to predict earthquakes for a long time. The results of these efforts can generally be divided into two methods of prediction: 1) the statistical method, and 2) the empirical method. In the first method, earthquakes are predicted using statistics and probabilities, while the second method utilizes a variety of precursors for earthquake prediction. The latter method is time consuming and more costly. However, the results of neither method have fully satisfied expectations up to now. In this paper a new method entitled the 'Kiana Method' is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method the electrical and magnetic precursors are measured in an area. Then, the time and the magnitude of a future earthquake are calculated using electrical formulas, and in particular, electrical capacitor formulas. In this method, by daily measurement of the electrical resistance in an area, we determine whether or not the area is capable of producing an earthquake in the future. If the result shows a positive sign, then the occurrence time and the magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  16. Sense of Community and Depressive Symptoms among Older Earthquake Survivors Following the 2008 Earthquake in Chengdu China

    Science.gov (United States)

    Li, Yawen; Sun, Fei; He, Xusong; Chan, Kin Sun

    2011-01-01

    This study examined the impact of an earthquake as well as the role of sense of community as a protective factor against depressive symptoms among older Chinese adults who survived an 8.0 magnitude earthquake in 2008. A household survey of a random sample was conducted 3 months after the earthquake and 298 older earthquake survivors participated…

  17. Precisely locating the Klamath Falls, Oregon, earthquakes

    Science.gov (United States)

    Qamar, A.; Meagher, K.L.

    1993-01-01

    The Klamath Falls earthquakes on September 20, 1993, were the largest earthquakes centered in Oregon in more than 50 yrs. Only the magnitude 5.75 Milton-Freewater earthquake in 1936, which was centered near the Oregon-Washington border and felt in an area of about 190,000 sq km, compares in size with the recent Klamath Falls earthquakes. Although the 1993 earthquakes surprised many local residents, geologists have long recognized that strong earthquakes may occur along potentially active faults that pass through the Klamath Falls area. These faults are geologically related to similar faults in Oregon, Idaho, and Nevada that occasionally spawn strong earthquakes

  18. The mechanism of earthquake

    Science.gov (United States)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    The physical mechanism of earthquakes remains a challenging issue to be clarified. Seismologists used to attribute shallow earthquakes to the elastic rebound of crustal rocks. The seismic energy calculated following the elastic rebound theory and with data from experiments on rocks, however, shows a large discrepancy with measurements, a fact that has been dubbed "the heat flow paradox". For intermediate-focus and deep-focus earthquakes, both occurring in the region of the mantle, there is no reasonable explanation either. This paper discusses the physical mechanism of earthquakes from a new perspective, starting from the fact that both the crust and the mantle are discrete collective systems of matter with slow dynamics, as well as from the basic principles of physics, especially some new concepts of condensed matter physics that have emerged in recent years. (1) Stress distribution in the earth's crust: Without taking the tectonic force into account, according to the rheological principle that "everything flows", the normal stress and transverse stress must be balanced due to the effect of gravitational pressure over a long period of time; thus no differential stress is to be expected in the original crustal rocks. The tectonic force is successively transferred and accumulated via stick-slip motions of rock blocks to squeeze the fault gouge and is then exerted upon other rock blocks. The superposition of such additional lateral tectonic force and the original stress gives rise to the real-time stress in crustal rocks. The mechanical characteristics of fault gouge differ from those of rocks, as it consists of granular matter. The elastic moduli of fault gouges are much smaller than those of rocks, and they become larger with increasing pressure. This peculiarity of the fault gouge leads to a tectonic force that increases with depth in a nonlinear fashion. The distribution and variation of the tectonic stress in the crust are specified. (2) The

  19. Turkish Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  20. Large earthquakes and creeping faults

    Science.gov (United States)

    Harris, Ruth A.

    2017-01-01

    Faults are ubiquitous throughout the Earth's crust. The majority are silent for decades to centuries, until they suddenly rupture and produce earthquakes. With a focus on shallow continental active-tectonic regions, this paper reviews a subset of faults that have a different behavior. These unusual faults slowly creep for long periods of time and produce many small earthquakes. The presence of fault creep and the related microseismicity helps illuminate faults that might not otherwise be located in fine detail, but there is also the question of how creeping faults contribute to seismic hazard. It appears that well-recorded creeping fault earthquakes of up to magnitude 6.6 that have occurred in shallow continental regions produce similar fault-surface rupture areas and similar peak ground shaking as their locked fault counterparts of the same earthquake magnitude. The behavior of much larger earthquakes on shallow creeping continental faults is less well known, because there is a dearth of comprehensive observations. Computational simulations provide an opportunity to fill the gaps in our understanding, particularly of the dynamic processes that occur during large earthquake rupture and arrest.

  1. Global earthquake fatalities and population

    Science.gov (United States)

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
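
    A small sketch of the nonstationary Poisson idea described above, with the rate taken proportional to world population; the rate constant and population trajectory below are placeholders for illustration, not the calibrated values from the study.

        import numpy as np

        def expected_catastrophic_quakes(rate_per_billion_yr, population_by_year):
            # expected count of a nonstationary Poisson process whose annual rate is
            # proportional to world population: lambda = rate * sum(population over years)
            return rate_per_billion_yr * float(np.sum(population_by_year))

        # placeholder population path (billions) for 2001-2100 and an invented rate constant
        population = np.linspace(7.0, 10.1, 100)
        lam = expected_catastrophic_quakes(0.01, population)
        rng = np.random.default_rng(3)
        print(round(lam, 1), rng.poisson(lam, 5))   # expected count and five simulated centuries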

  2. Instruction system upon occurrence of earthquakes

    International Nuclear Information System (INIS)

    Inagaki, Masakatsu; Morikawa, Matsuo; Suzuki, Satoshi; Fukushi, Naomi.

    1987-01-01

    Purpose: To enable rapid restarting of a nuclear reactor after an earthquake by informing operators of the properties of the earthquake experienced and by clearly displaying the state of any damage in comparison with the design standard values of the facilities. Constitution: Even when the maximum accelerations of an earthquake exceed the design standard values, the equipment may still remain intact, depending on the frequency content of the seismic motion and the vibration properties inherent to the equipment. Taking note of this fact, the instruction device comprises a system that indicates the relationship between the recorded seismic waveforms and the scram setting values, a system that compares the floor response spectrum of the recorded waveforms with the floor response spectrum used in the design of the equipment, and a system that indicates which equipment requires inspection after the earthquake. Accordingly, the operability of a nuclear power plant at scram during an earthquake is improved, and power saving and safety are improved by clearly defining which portions must be inspected after the earthquake. (Kawakami, Y.)

  3. How fault geometry controls earthquake magnitude

    Science.gov (United States)

    Bletery, Q.; Thomas, A.; Karlstrom, L.; Rempel, A. W.; Sladen, A.; De Barros, L.

    2016-12-01

    Recent large megathrust earthquakes, such as the Mw9.3 Sumatra-Andaman earthquake in 2004 and the Mw9.0 Tohoku-Oki earthquake in 2011, astonished the scientific community. The first event occurred in a relatively low-convergence-rate subduction zone where events of its size were unexpected. The second event involved 60 m of shallow slip in a region thought to be aseismically creeping and hence incapable of hosting very large magnitude earthquakes. These earthquakes highlight gaps in our understanding of mega-earthquake rupture processes and the factors controlling their global distribution. Here we show that gradients in dip angle exert a primary control on mega-earthquake occurrence. We calculate the curvature along the major subduction zones of the world and show that past mega-earthquakes occurred on flat (low-curvature) interfaces. A simplified analytic model demonstrates that shear strength heterogeneity increases with curvature. Stress loading on flat megathrusts is more homogeneous and hence more likely to be released simultaneously over large areas than on highly curved faults. Therefore, the absence of asperities on large faults might counter-intuitively be a source of higher hazard.
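
    The abstract describes computing curvature along subduction interfaces and associating mega-earthquakes with flat (low-curvature) megathrusts. The sketch below is an assumed, simplified version of that computation for a one-dimensional downdip depth profile; the toy slab geometry is invented for illustration and is not the authors' dataset.

```python
# Sketch (assumptions mine, not the authors' code): curvature of a
# subduction-interface depth profile z(x) along the downdip direction,
# kappa = z'' / (1 + z'^2)^(3/2). Flatter (low-curvature) interfaces are
# the ones the abstract associates with mega-earthquakes.
import numpy as np

x = np.linspace(0.0, 200e3, 201)                                  # downdip distance, m
z = 5e3 + x * np.tan(np.radians(8)) + 8e3 * (x / 200e3) ** 2      # toy slab depth, m

dz = np.gradient(z, x)
d2z = np.gradient(dz, x)
curvature = d2z / (1.0 + dz ** 2) ** 1.5                          # 1/m

print(f"max |curvature| along profile: {np.abs(curvature).max():.2e} 1/m")
```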

  4. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
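
    The empirical PAGER model described above combines population exposure per shaking-intensity level with country-specific, calibrated fatality rates. The sketch below shows that combination in its simplest form; the lognormal-style rate function and all numbers are placeholders, not PAGER's calibrated coefficients or real exposure data.

```python
# Sketch of the empirical loss idea described in the abstract: combine the
# population exposed in each shaking-intensity bin with an intensity-
# dependent fatality rate. The lognormal-style rate function and its
# parameters are placeholders, not PAGER's calibrated country models.
from math import erf, log, sqrt

def fatality_rate(mmi, theta=12.0, beta=0.2):
    """Hypothetical fatality rate as a lognormal CDF in shaking intensity."""
    return 0.5 * (1.0 + erf(log(mmi / theta) / (beta * sqrt(2.0))))

# population exposed at each Modified Mercalli intensity level (made up)
exposure = {6: 2_000_000, 7: 800_000, 8: 150_000, 9: 20_000}

estimated_fatalities = sum(pop * fatality_rate(mmi)
                           for mmi, pop in exposure.items())
print(f"estimated fatalities: {estimated_fatalities:,.0f}")
```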

  5. Measuring the size of an earthquake

    Science.gov (United States)

    Spence, W.; Sipkin, S.A.; Choy, G.L.

    1989-01-01

    Earthquakes range broadly in size. A rock-burst in an Idaho silver mine may involve the fracture of 1 meter of rock; the 1965 Rat Island earthquake in the Aleutian arc involved a 650-kilometer length of the Earth's crust. Earthquakes can be even smaller and even larger. If an earthquake is felt or causes perceptible surface damage, then its intensity of shaking can be subjectively estimated. But many large earthquakes occur in oceanic areas or at great focal depths and are either simply not felt or their felt pattern does not really indicate their true size.

  6. Earthquakes-Rattling the Earth's Plumbing System

    Science.gov (United States)

    Sneed, Michelle; Galloway, Devin L.; Cunningham, William L.

    2003-01-01

    Hydrogeologic responses to earthquakes have been known for decades, and have occurred both close to, and thousands of miles from earthquake epicenters. Water wells have become turbid, dry or begun flowing, discharge of springs and ground water to streams has increased and new springs have formed, and well and surface-water quality have become degraded as a result of earthquakes. Earthquakes affect our Earth’s intricate plumbing system—whether you live near the notoriously active San Andreas Fault in California, or far from active faults in Florida, an earthquake near or far can affect you and the water resources you depend on.

  7. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake

    Science.gov (United States)

    Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten

    2014-05-01

    In recent decades, great efforts have been expended in real-time seismology aiming at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on the rapid estimate of P-wave magnitude, which generally carries large uncertainties and suffers from the known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. The following updates of the magnitude even decreased to M6.3-6.6. Finally, the magnitude estimate stabilized at M8.1 after about two minutes. This consequently led to underestimated tsunami heights. By using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test for the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that we would have been able, in principle, to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after or even during the rupture process. In general, what had happened on the fault could be robustly imaged with a time delay of about 30 s by using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique is helpful for reducing false and missing warnings, and therefore should play an important role in future tsunami early warning and earthquake rapid response systems.

  8. Earthquake forewarning in the Cascadia region

    Science.gov (United States)

    Gomberg, Joan S.; Atwater, Brian F.; Beeler, Nicholas M.; Bodin, Paul; Davis, Earl; Frankel, Arthur; Hayes, Gavin P.; McConnell, Laura; Melbourne, Tim; Oppenheimer, David H.; Parrish, John G.; Roeloffs, Evelyn A.; Rogers, Gary D.; Sherrod, Brian; Vidale, John; Walsh, Timothy J.; Weaver, Craig S.; Whitmore, Paul M.

    2015-08-10

    This report, prepared for the National Earthquake Prediction Evaluation Council (NEPEC), is intended as a step toward improving communications about earthquake hazards between information providers and users who coordinate emergency-response activities in the Cascadia region of the Pacific Northwest. NEPEC charged a subcommittee of scientists with writing this report about forewarnings of increased probabilities of a damaging earthquake. We begin by clarifying some terminology; a “prediction” refers to a deterministic statement that a particular future earthquake will or will not occur. In contrast to the 0- or 100-percent likelihood of a deterministic prediction, a “forecast” describes the probability of an earthquake occurring, which may range from >0 to processes or conditions, which may include Increased rates of M>4 earthquakes on the plate interface north of the Mendocino region 

  9. Links Between Earthquake Characteristics and Subducting Plate Heterogeneity in the 2016 Pedernales Ecuador Earthquake Rupture Zone

    Science.gov (United States)

    Bai, L.; Mori, J. J.

    2016-12-01

    The collision between the Indian and Eurasian plates formed the Himalayas, the largest orogenic belt on the Earth. The entire region accommodates shallow earthquakes, while intermediate-depth earthquakes are concentrated at the eastern and western Himalayan syntaxis. Here we investigate the focal depths, fault plane solutions, and source rupture process for three earthquake sequences, which are located at the western, central and eastern regions of the Himalayan orogenic belt. The Pamir-Hindu Kush region is located at the western Himalayan syntaxis and is characterized by extreme shortening of the upper crust and strong interaction of various layers of the lithosphere. Many shallow earthquakes occur on the Main Pamir Thrust at focal depths shallower than 20 km, while intermediate-deep earthquakes are mostly located below 75 km. Large intermediate-depth earthquakes occur frequently at the western Himalayan syntaxis about every 10 years on average. The 2015 Nepal earthquake is located in the central Himalayas. It is a typical megathrust earthquake that occurred on the shallow portion of the Main Himalayan Thrust (MHT). Many of the aftershocks are located above the MHT and illuminate faulting structures in the hanging wall with dip angles that are steeper than the MHT. These observations provide new constraints on the collision and uplift processes for the Himalaya orogenic belt. The Indo-Burma region is located south of the eastern Himalayan syntaxis, where the strike of the plate boundary suddenly changes from nearly east-west at the Himalayas to nearly north-south at the Burma Arc. The Burma arc subduction zone is a typical oblique plate convergence zone. The eastern boundary is the north-south striking dextral Sagaing fault, which hosts many shallow earthquakes with focal depth less than 25 km. In contrast, intermediate-depth earthquakes along the subduction zone reflect east-west trending reverse faulting.

  10. Earthquakes; May-June 1982

    Science.gov (United States)

    Person, W.J.

    1982-01-01

    There were four major earthquakes (7.0-7.9) during this reporting period: two struck in Mexico, one in El Salvador, and one in the Kuril Islands. Mexico, El Salvador, and China experienced fatalities from earthquakes.

  11. Radon observation for earthquake prediction

    Energy Technology Data Exchange (ETDEWEB)

    Wakita, Hiroshi [Tokyo Univ. (Japan)

    1998-12-31

    Systematic observation of groundwater radon for the purpose of earthquake prediction began in Japan in late 1973. Continuous observations are conducted at fixed stations using deep wells and springs. During the observation period, significant precursory changes, including those before the 1978 Izu-Oshima-kinkai (M7.0) earthquake, as well as numerous coseismic changes were observed. At the time of the 1995 Kobe (M7.2) earthquake, significant changes in chemical components, including radon dissolved in groundwater, were observed near the epicentral region. Precursory changes are presumably caused by permeability changes due to micro-fracturing in basement rock or by migration of water from different sources during the preparation stage of earthquakes. Coseismic changes may be caused by seismic shaking and by changes in regional stress. Significant drops of radon concentration in groundwater have been observed after earthquakes at the KSM site. The occurrence of such drops appears to be time-dependent, and possibly reflects changes in the regional stress state of the observation area. The absence of radon drops seems to be correlated with periods of reduced regional seismic activity. Experience accumulated over the past two decades allows us to reach some conclusions: 1) changes in groundwater radon do occur prior to large earthquakes; 2) some sites are particularly sensitive to earthquake occurrence; and 3) the sensitivity changes over time. (author)

  12. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
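
    A minimal sketch of the kind of comparison the abstract describes: testing whether yearly earthquake counts are better described by a Poisson distribution or by an overdispersed negative-binomial distribution (NBD). The counts below are invented, and the method-of-moments fit is only one simple way to estimate the NBD parameters; it is not the paper's procedure.

```python
# Sketch (not the paper's code): compare a Poisson and a negative-binomial
# description of yearly earthquake counts. With real catalogue counts the
# overdispersion (variance > mean) is what favours the NBD, as the
# abstract describes.
import numpy as np
from scipy import stats

counts = np.array([12, 30, 18, 45, 22, 60, 15, 28, 95, 20, 33, 27])  # made up

mean, var = counts.mean(), counts.var(ddof=1)
print(f"mean = {mean:.1f}, variance = {var:.1f} (Poisson would need them equal)")

# Method-of-moments NBD fit: var = mean + mean^2 / r
r = mean ** 2 / (var - mean)
p = r / (r + mean)
print(f"NBD parameters: r = {r:.2f}, p = {p:.3f}")

# Log-likelihoods under each model
ll_pois = stats.poisson.logpmf(counts, mean).sum()
ll_nbd = stats.nbinom.logpmf(counts, r, p).sum()
print(f"log-likelihood: Poisson {ll_pois:.1f}, NBD {ll_nbd:.1f}")
```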

  13. Active Thrusting Offshore Mount Lebanon: Source of the Tsunamigenic A.D. 551 Beirut-Tripoli Earthquake

    Science.gov (United States)

    Tapponnier, P.; Elias, A.; Singh, S.; King, G.; Briais, A.; Daeron, M.; Carton, H.; Sursock, A.; Jacques, E.; Jomaa, R.; Klinger, Y.

    2007-12-01

    On July 9, AD 551, a large earthquake, followed by a tsunami, destroyed most of the coastal cities of Phoenicia (modern-day Lebanon). This was arguably one of the most devastating historical submarine earthquakes in the eastern Mediterranean. Geophysical data from the Shalimar survey unveil the source of this Mw=7.5 event: rupture of the offshore, hitherto unknown, 100-150 km-long, active, east-dipping Mount Lebanon Thrust (MLT). Deep-towed sonar swaths along the base of prominent bathymetric escarpments reveal fresh, west-facing seismic scarps that cut the sediment-smoothed seafloor. The MLT trace comes closest (~ 8 km) to the coast between Beirut and Enfeh, where, as 13 radiocarbon-calibrated ages indicate, a shoreline-fringing Vermetid bench suddenly emerged by ~ 80 cm in the 6th century AD. At Tabarja, the regular vertical separation (~ 1 m) of higher fossil benches suggests uplift by 3 more comparable-size earthquakes since the Holocene sea level reached a maximum ca. 7-6 ka, implying a 1500-1750 yr recurrence time. Unabated thrusting on the MLT likely orchestrated the growth of Mt. Lebanon since the late Miocene. The newly discovered MLT has been the missing piece in the Dead Sea Transform and eastern Mediterranean tectonic scheme. Identifying the source of the AD 551 event thus completes a reassessment of the sources of the major historical earthquakes on the various faults of the Lebanese Restraining Bend of the Levant Fault System (or Dead Sea Transform).

  14. 33 CFR 222.4 - Reporting earthquake effects.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Reporting earthquake effects. 222..., DEPARTMENT OF DEFENSE ENGINEERING AND DESIGN § 222.4 Reporting earthquake effects. (a) Purpose. This... significant earthquakes. It primarily concerns damage surveys following the occurrences of earthquakes. (b...

  15. Earthquakes - a danger to deep-lying repositories?

    International Nuclear Information System (INIS)

    2012-03-01

    This booklet issued by the Swiss National Cooperative for the Disposal of Radioactive Waste NAGRA takes a look at geological factors concerning earthquakes and the safety of deep-lying repositories for nuclear waste. The geological processes involved in the occurrence of earthquakes are briefly looked at and the definitions for magnitude and intensity of earthquakes are discussed. Examples of damage caused by earthquakes are given. The earthquake situation in Switzerland is looked at and the effects of earthquakes on sub-surface structures and deep-lying repositories are discussed. Finally, the ideas proposed for deep-lying geological repositories for nuclear wastes are discussed

  16. Evidence for Ancient Mesoamerican Earthquakes

    Science.gov (United States)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb, fractures in walls, floors, basal platforms and tableros, toppling of columns, and deformation, settling and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34 %g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Mayan Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period centered in the vicinity of the Chixoy-Polochic and Motagua fault zones could have produced the contemporaneous earthquake damage to the above sites. As a consequence this earthquake may have accelerated the

  17. Earthquake data base for Romania

    International Nuclear Information System (INIS)

    Rizescu, M.; Ghica, D.; Grecu, B.; Popa, M.; Borcia, I. S.

    2002-01-01

    A new earthquake database for Romania is being constructed, comprising complete and up-to-date earthquake information in a user-friendly, rapidly accessible form. One main component of the database is the catalogue of earthquakes that have occurred in Romania from 984 up to the present. The catalogue contains information on locations and other source parameters, when available, and links to waveforms of important earthquakes. The other very important component is the 'strong motion database', developed for strong intermediate-depth Vrancea earthquakes for which instrumental data were recorded. Different parameters characterizing strong motion properties, such as effective peak acceleration, effective peak velocity, corner periods Tc and Td, and global response-spectrum-based intensities, were computed and recorded in this database. Also included is information on the recording seismic stations, such as maps giving their positions, photographs of the instruments, and site conditions (free-field or on buildings). By the huge volume and quality of the gathered data, and by its friendly user interface, the Romanian earthquake database provides a very useful tool for the geosciences and civil engineering in their effort towards reducing seismic risk in Romania. (authors)

  18. Mapping Tectonic Stress Using Earthquakes

    International Nuclear Information System (INIS)

    Arnold, Richard; Townend, John; Vignaux, Tony

    2005-01-01

    An earthquake occurs when the forces acting on a fault overcome its intrinsic strength and cause it to slip abruptly. Understanding more specifically why earthquakes occur at particular locations and times is complicated because in many cases we do not know what these forces actually are, or indeed what processes ultimately trigger slip. The goal of this study is to develop, test, and implement a Bayesian method of reliably determining tectonic stresses using the most abundant stress gauges available - earthquakes themselves. Existing algorithms produce reasonable estimates of the principal stress directions, but yield unreliable error bounds as a consequence of the generally weak constraint on stress imposed by any single earthquake, observational errors, and an unavoidable ambiguity between the fault normal and the slip vector. A statistical treatment of the problem can take into account observational errors, combine data from multiple earthquakes in a consistent manner, and provide realistic error bounds on the estimated principal stress directions. We have developed a realistic physical framework for modelling multiple earthquakes and show how the strong physical and geometrical constraints present in this problem allow inference to be made about the orientation of the principal axes of stress in the earth's crust

  19. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  20. Earthquake cycles and physical modeling of the process leading up to a large earthquake

    Science.gov (United States)

    Ohnaka, Mitiyasu

    2004-08-01

    A thorough discussion is made on what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology of forecasting large earthquakes, the entire process of one cycle for a typical, large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale-dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling, in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step for establishing the methodology for forecasting large earthquakes.

  1. The USGS Earthquake Notification Service (ENS): Customizable notifications of earthquakes around the globe

    Science.gov (United States)

    Wald, Lisa A.; Wald, David J.; Schwarz, Stan; Presgrave, Bruce; Earle, Paul S.; Martinez, Eric; Oppenheimer, David

    2008-01-01

    At the beginning of 2006, the U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) introduced a new automated Earthquake Notification Service (ENS) to take the place of the National Earthquake Information Center (NEIC) "Bigquake" system and the various other individual EHP e-mail list-servers for separate regions in the United States. These included northern California, southern California, and the central and eastern United States. ENS is a "one-stop shopping" system that allows Internet users to subscribe to flexible and customizable notifications for earthquakes anywhere in the world. The customization capability allows users to define the what (magnitude threshold), the when (day and night thresholds), and the where (specific regions) for their notifications. Customization is achieved by employing a per-user based request profile, allowing the notifications to be tailored for each individual's requirements. Such earthquake-parameter-specific custom delivery was not possible with simple e-mail list-servers. Now that event and user profiles are in a structured query language (SQL) database, additional flexibility is possible. At the time of this writing, ENS had more than 114,000 subscribers, with more than 200,000 separate user profiles. On a typical day, more than 188,000 messages get sent to a variety of widely distributed users for a wide range of earthquake locations and magnitudes. The purpose of this article is to describe how ENS works, highlight the features it offers, and summarize plans for future developments.

  2. Stigma in science: the case of earthquake prediction.

    Science.gov (United States)

    Joffe, Helene; Rossetto, Tiziana; Bradley, Caroline; O'Connor, Cliodhna

    2018-01-01

    This paper explores how earthquake scientists conceptualise earthquake prediction, particularly given the conviction of six earthquake scientists for manslaughter (subsequently overturned) on 22 October 2012 for having given inappropriate advice to the public prior to the L'Aquila earthquake of 6 April 2009. In the first study of its kind, semi-structured interviews were conducted with 17 earthquake scientists and the transcribed interviews were analysed thematically. The scientists primarily denigrated earthquake prediction, showing strong emotive responses and distancing themselves from earthquake 'prediction' in favour of 'forecasting'. Earthquake prediction was regarded as impossible and harmful. The stigmatisation of the subject is discussed in the light of research on boundary work and stigma in science. The evaluation reveals how mitigation becomes the more favoured endeavour, creating a normative environment that disadvantages those who continue to pursue earthquake prediction research. Recommendations are made for communication with the public on earthquake risk, with a focus on how scientists portray uncertainty. © 2018 The Author(s). Disasters © Overseas Development Institute, 2018.

  3. Impact of the Christchurch earthquakes on hospital staff.

    Science.gov (United States)

    Tovaranonte, Pleayo; Cawood, Tom J

    2013-06-01

    On September 4, 2010 a major earthquake caused widespread damage, but no loss of life, to Christchurch city and surrounding areas. There were numerous aftershocks, including on February 22, 2011 which, in contrast, caused substantial loss of life and major damage to the city. The research aim was to assess how these two earthquakes affected the staff in the General Medicine Department at Christchurch Hospital. Problem To date there have been no published data assessing the impact of this type of natural disaster on hospital staff in Australasia. A questionnaire that examined seven domains (demographics, personal impact, psychological impact, emotional impact, impact on care for patients, work impact, and coping strategies) was handed out to General Medicine staff and students nine days after the September 2010 earthquake and 14 days after the February 2011 earthquake. Response rates were ≥ 99%. Sixty percent of responders were earthquakes, respectively. A fifth to a third of people had to find an alternative route of transport to get to work but only eight percent to 18% took time off work. Financial impact was more severe following the February earthquake, with 46% reporting damage of >NZ $1,000, compared with 15% following the September earthquake (P earthquake than the September earthquake (42% vs 69%, P earthquake but this rose to 53% after the February earthquake (12/53 vs 45/85, P earthquake but this dropped significantly to 15% following the February earthquake (27/53 vs 13/62, P earthquakes upon General Medicine hospital staff. The effect was widespread with minor financial impact during the first but much more during the second earthquake. Moderate psychological impact was experienced in both earthquakes. This data may be useful to help prepare plans for future natural disasters. .

  4. Physics of Earthquake Rupture Propagation

    Science.gov (United States)

    Xu, Shiqing; Fukuyama, Eiichi; Sagy, Amir; Doan, Mai-Linh

    2018-05-01

    A comprehensive understanding of earthquake rupture propagation requires the study of not only the sudden release of elastic strain energy during co-seismic slip, but also of other processes that operate at a variety of spatiotemporal scales. For example, the accumulation of the elastic strain energy usually takes decades to hundreds of years, and rupture propagation and termination modify the bulk properties of the surrounding medium that can influence the behavior of future earthquakes. To share recent findings in the multiscale investigation of earthquake rupture propagation, we held a session entitled "Physics of Earthquake Rupture Propagation" during the 2016 American Geophysical Union (AGU) Fall Meeting in San Francisco. The session included 46 poster and 32 oral presentations, reporting observations of natural earthquakes, numerical and experimental simulations of earthquake ruptures, and studies of earthquake fault friction. These presentations and discussions during and after the session suggested a need to document more formally the research findings, particularly new observations and views different from conventional ones, complexities in fault zone properties and loading conditions, the diversity of fault slip modes and their interactions, the evaluation of observational and model uncertainties, and comparison between empirical and physics-based models. Therefore, we organize this Special Issue (SI) of Tectonophysics under the same title as our AGU session, hoping to inspire future investigations. Eighteen articles (marked with "this issue") are included in this SI and grouped into the following six categories.

  5. Real Time Earthquake Information System in Japan

    Science.gov (United States)

    Doi, K.; Kato, T.

    2003-12-01

    An early earthquake notification system in Japan had been developed by the Japan Meteorological Agency (JMA) as a governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for prompt provision of a tsunami forecast to the public with locating an earthquake and estimating its magnitude as quickly as possible. Years after, a system for a prompt provision of seismic intensity information as indices of degrees of disasters caused by strong ground motion was also developed so that concerned governmental organizations can decide whether it was necessary for them to launch emergency response or not. At present, JMA issues the following kinds of information successively when a large earthquake occurs. 1) Prompt report of occurrence of a large earthquake and major seismic intensities caused by the earthquake in about two minutes after the earthquake occurrence. 2) Tsunami forecast in around three minutes. 3) Information on expected arrival times and maximum heights of tsunami waves in around five minutes. 4) Information on a hypocenter and a magnitude of the earthquake, the seismic intensity at each observation station, the times of high tides in addition to the expected tsunami arrival times in 5-7 minutes. To issue information above, JMA has established; - An advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation including about 2,800 seismic intensity stations maintained by local governments, - Data telemetry networks via landlines and partly via a satellite communication link, - Real-time data processing techniques, for example, the automatic calculation of earthquake location and magnitude, the database driven method for quantitative tsunami estimation, and - Dissemination networks, via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally

  6. Earthquake precursory events around epicenters and local active faults; the cases of two inland earthquakes in Iran

    Science.gov (United States)

    Valizadeh Alvan, H.; Mansor, S.; Haydari Azad, F.

    2012-12-01

    The possibility of earthquake prediction on a time scale of several days to a few minutes before occurrence has recently stirred interest among researchers. Scientists believe that the new theories and explanations of the mechanism of this natural phenomenon are trustworthy and can be the basis of future prediction efforts. During the last thirty years, experimental research has identified some pre-earthquake events that are now recognized as confirmed warning signs (precursors) of past known earthquakes. With advances in in-situ measurement devices and data analysis capabilities and the emergence of satellite-based data collectors, monitoring the earth's surface is now routine. Data providers are supplying researchers from all over the world with high-quality, validated imagery and non-imagery data. Surface Latent Heat Flux (SLHF), the amount of energy exchanged in the form of water vapor between the earth's surface and the atmosphere, has frequently been reported as an earthquake precursor in past years. The stress accumulated in the earth's crust during the preparation phase of earthquakes is considered the main cause of temperature anomalies weeks to days before the main event and subsequent shocks. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. In addition, the leakage of radon gas as rocks break during earthquake preparation causes the formation of airborne ions and higher Air Temperature (AT) prior to the main event. Although co-analysis of direct and indirect observations of precursory events is considered a promising approach to successful earthquake prediction, without proper and thorough knowledge of the geological setting, atmospheric factors, and geodynamics of earthquake-prone regions we will not be able to identify anomalies due to seismic activity in the earth's crust. Active faulting is a key factor in identification of the

  7. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impact in terms of risk and damage. Countries such as China, Japan, and Indonesia are located on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, using the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and using remote sensing. In applying them, it is necessary to review the effectiveness of each technique in advance. Considering efficiency of time and accuracy of data, remote sensing is used as a reference for assessing earthquake hazard accurately and quickly, since only limited time is available for making the right decisions shortly after a disaster. Exposed areas and areas possibly vulnerable to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, add value and strengthen the use of remote sensing as one of the methods for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing disaster-management policies in particular, and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  8. Earthquake Drill using the Earthquake Early Warning System at an Elementary School

    Science.gov (United States)

    Oki, Satoko; Yazaki, Yoshiaki; Koketsu, Kazuki

    2010-05-01

    Japan frequently suffers from many kinds of disasters such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. On average, about 120 people a year have been lost to natural hazards in this decade. Above all, earthquakes are noteworthy, since they may kill thousands of people in a moment, as in Kobe in 1995. People know that we may have "a big one" some day as long as we live on this land, and they know what to do: retrofit houses, fasten heavy furniture to walls, add latches to kitchen cabinets, and prepare emergency packs. Yet most of them do not take action, which results in the loss of many lives. It is only the victims that learn something from an earthquake, and the lessons have never become the lore of the nation. One of the most essential ways to reduce the damage is to educate the general public to be able to make sound decisions on what to do at the moment an earthquake hits. This requires knowledge of the background of the ongoing phenomenon. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore issued a public call to choose several model areas in which to bring scientific education to local elementary schools. This presentation reports on a year and a half of courses that we held at the model elementary school in the Tokyo Metropolitan Area. The tectonic setting of this area is very complicated; the Pacific and Philippine Sea plates subduct beneath the North America and Eurasia plates. The subduction of the Philippine Sea plate causes megathrust earthquakes such as the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to have a probability of occurrence of 70 % in 30 years. This is of immediate concern for devastating loss of life and property because the Tokyo urban region now has a population of 42 million and is the center of approximately 40 % of the nation's activities, which may cause great global

  9. Book review: Earthquakes and water

    Science.gov (United States)

    Bekins, Barbara A.

    2012-01-01

    It is really nice to see assembled in one place a discussion of the documented and hypothesized hydrologic effects of earthquakes. The book is divided into chapters focusing on particular hydrologic phenomena including liquefaction, mud volcanism, stream discharge increases, groundwater level, temperature and chemical changes, and geyser period changes. These hydrologic effects are inherently fascinating, and the large number of relevant publications in the past decade makes this summary a useful milepost. The book also covers hydrologic precursors and earthquake triggering by pore pressure. A natural need to limit the topics covered resulted in the omission of tsunamis and the vast literature on the role of fluids and pore pressure in frictional strength of faults. Regardless of whether research on earthquake-triggered hydrologic effects ultimately provides insight into the physics of earthquakes, the text provides welcome common ground for interdisciplinary collaborations between hydrologists and seismologists. Such collaborations continue to be crucial for investigating hypotheses about the role of fluids in earthquakes and slow slip. 

  10. Earthquake resistant design of structures

    International Nuclear Information System (INIS)

    Choi, Chang Geun; Kim, Gyu Seok; Lee, Dong Geun

    1990-02-01

    This book covers the occurrence of earthquakes and the analysis of earthquake damage, the equivalent static analysis method and its application, dynamic analysis methods such as time-history analysis by mode superposition and by direct integration, and design spectrum analysis in the context of earthquake-resistant design in Korea, including the analysis model and vibration modes, calculation of base shear, calculation of story seismic loads, and combination of analysis results.
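
    As a rough illustration of the equivalent static analysis mentioned in the summary, the sketch below computes a base shear V = Cs·W and distributes it over the stories in proportion to weight times height. The seismic coefficient and the building data are made-up values, not taken from the Korean code or from the book.

```python
# Sketch of the equivalent static method: base shear V = Cs * W, distributed
# over the stories in proportion to weight times height (a common linear
# distribution). All numerical values are illustrative assumptions.
Cs = 0.12                                  # seismic response coefficient (assumed)
story_weights = [3000.0, 3000.0, 2500.0]   # kN, stories 1..3
story_heights = [4.0, 8.0, 12.0]           # m above base

W = sum(story_weights)
V = Cs * W                                 # total base shear, kN

wh = [w * h for w, h in zip(story_weights, story_heights)]
forces = [V * x / sum(wh) for x in wh]     # story seismic loads, kN

print(f"base shear V = {V:.0f} kN")
for i, f in enumerate(forces, start=1):
    print(f"  story {i}: F = {f:.0f} kN")
```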

  11. Exploring Earthquakes in Real-Time

    Science.gov (United States)

    Bravo, T. K.; Kafka, A. L.; Coleman, B.; Taber, J. J.

    2013-12-01

    Earthquakes capture the attention of students and inspire them to explore the Earth. Adding the ability to view and explore recordings of significant and newsworthy earthquakes in real-time makes the subject even more compelling. To address this opportunity, the Incorporated Research Institutions for Seismology (IRIS), in collaboration with Moravian College, developed 'jAmaSeis', a cross-platform application that enables students to access real-time earthquake waveform data. Students can watch as the seismic waves are recorded on their computer, and can be among the first to analyze the data from an earthquake. jAmaSeis facilitates student-centered investigations of seismological concepts using either a low-cost educational seismograph or streamed data from other educational seismographs or from any seismic station that sends data to the IRIS Data Management System. After an earthquake, students can analyze the seismograms to determine characteristics of earthquakes such as time of occurrence, distance from the epicenter to the station, magnitude, and location. The software has been designed to provide graphical clues to guide students in the analysis and assist in their interpretations. Since jAmaSeis can simultaneously record up to three stations from anywhere on the planet, there are numerous opportunities for student-driven investigations. For example, students can explore differences in the seismograms from different distances from an earthquake and compare waveforms from different azimuthal directions. Students can simultaneously monitor seismicity at a tectonic plate boundary and in the middle of the plate regardless of their school location. This can help students discover for themselves the ideas underlying seismic wave propagation, regional earthquake hazards, magnitude-frequency relationships, and the details of plate tectonics. The real-time nature of the data keeps the investigations dynamic, and offers students countless opportunities to explore.
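
    One classic analysis students can perform with such recordings is estimating epicentral distance from the S-minus-P arrival-time difference. The sketch below assumes constant crustal velocities and straight ray paths; it illustrates the concept and is not part of the jAmaSeis software.

```python
# Sketch of a classroom analysis the abstract alludes to: estimating the
# distance from a station to an earthquake using the S-P arrival-time
# difference. The crustal velocities are typical assumed values.
VP = 6.0   # P-wave speed, km/s (assumed)
VS = 3.5   # S-wave speed, km/s (assumed)

def epicentral_distance_km(sp_seconds):
    """Distance implied by an S-P time, for constant-velocity straight paths."""
    return sp_seconds * (VP * VS) / (VP - VS)

for sp in (5.0, 20.0, 60.0):
    print(f"S-P = {sp:5.1f} s  ->  distance ~ {epicentral_distance_km(sp):7.1f} km")
```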

  12. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against current observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we approach the challenges in spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a decaying rate with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
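
    A minimal sketch of the power-law smoothing idea described above: each simulated earthquake contributes to every grid cell at a rate that decays with epicentral distance, turning fault-confined simulator output into a map that can be compared with off-fault observed epicenters. The grid, decay exponent, and inner smoothing scale are placeholders, not the paper's ETAS-based calibration.

```python
# Sketch of power-law smoothing of simulated seismicity: spread each
# simulated earthquake's contribution over a test grid with a rate that
# decays as a power of epicentral distance. All parameters are placeholders.
import numpy as np

def smoothed_rate_map(epicenters, grid_x, grid_y, q=1.5, d0=5.0):
    """Sum r^-q kernels (softened inside d0 km) from each simulated event."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    rate = np.zeros_like(gx, dtype=float)
    for ex, ey in epicenters:
        r = np.hypot(gx - ex, gy - ey)
        rate += (r + d0) ** (-q)
    return rate / rate.sum()        # normalize to a spatial probability map

# toy simulated epicenters (km coordinates) along a "fault"
events = [(x, 0.3 * x) for x in range(0, 100, 10)]
grid = np.linspace(0.0, 100.0, 101)
prob = smoothed_rate_map(events, grid, grid)
print(f"peak cell probability: {prob.max():.4f}")
```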

  13. Parallel Earthquake Simulations on Large-Scale Multicore Supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-01-01

    Earthquakes are one of the most destructive natural hazards on our planet Earth. Huge earthquakes striking offshore may cause devastating tsunamis, as evidenced by the 11 March 2011 Japan (moment magnitude Mw9.0) and the 26 December 2004 Sumatra (Mw9.1) earthquakes. Earthquake prediction (in terms of the precise time, place, and magnitude of a coming earthquake) is arguably unfeasible in the foreseeable future. To mitigate seismic hazards from future earthquakes in earthquake-prone areas, such as California and Japan, scientists have been using numerical simulations to study earthquake rupture propagation along faults and seismic wave propagation in the surrounding media on ever-advancing modern computers over the past several decades. In particular, ground motion simulations for past and future (possible) significant earthquakes have been performed to understand factors that affect ground shaking in populated areas, and to provide ground shaking characteristics and synthetic seismograms for emergency preparation and design of earthquake-resistant structures. These simulation results can guide the development of more rational seismic provisions, leading to safer, more efficient, and more economical structures in earthquake-prone regions.

  14. Children's emotional experience two years after an earthquake: An exploration of knowledge of earthquakes and associated emotions.

    Science.gov (United States)

    Raccanello, Daniela; Burro, Roberto; Hall, Rob

    2017-01-01

    We explored whether and how the exposure to a natural disaster such as the 2012 Emilia Romagna earthquake affected the development of children's emotional competence in terms of understanding, regulating, and expressing emotions, after two years, when compared with a control group not exposed to the earthquake. We also examined the role of class level and gender. The sample included two groups of children (n = 127) attending primary school: The experimental group (n = 65) experienced the 2012 Emilia Romagna earthquake, while the control group (n = 62) did not. The data collection took place two years after the earthquake, when children were seven or ten-year-olds. Beyond assessing the children's understanding of emotions and regulating abilities with standardized instruments, we employed semi-structured interviews to explore their knowledge of earthquakes and associated emotions, and a structured task on the intensity of some target emotions. We applied Generalized Linear Mixed Models. Exposure to the earthquake did not influence the understanding and regulation of emotions. The understanding of emotions varied according to class level and gender. Knowledge of earthquakes, emotional language, and emotions associated with earthquakes were, respectively, more complex, frequent, and intense for children who had experienced the earthquake, and at increasing ages. Our data extend the generalizability of theoretical models on children's psychological functioning following disasters, such as the dose-response model and the organizational-developmental model for child resilience, and provide further knowledge on children's emotional resources related to natural disasters, as a basis for planning educational prevention programs.

  15. 13 CFR 120.174 - Earthquake hazards.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  16. Earthquake magnitude estimation using the τc and Pd method for earthquake early warning systems

    Science.gov (United States)

    Jin, Xing; Zhang, Hongcai; Li, Jun; Wei, Yongxiang; Ma, Qiang

    2013-10-01

    Earthquake early warning (EEW) systems are one of the most effective ways to reduce earthquake disaster. Earthquake magnitude estimation is one of the most important, and also the most difficult, parts of the entire EEW system. In this paper, based on 142 earthquake events and 253 seismic records recorded by the KiK-net in Japan and during aftershocks of the large Wenchuan earthquake in Sichuan, we obtained earthquake magnitude estimation relationships using the τc and Pd methods. The standard deviations of the magnitude estimates from these two formulas are ±0.65 and ±0.56, respectively. The Pd value can also be used to estimate the peak ground velocity, so that warning information can be released to the public rapidly according to the estimation results. In order to ensure the stability and reliability of the magnitude estimation results, we propose a compatibility test based on the nature of these two parameters. The reliability of the early warning information is significantly improved through this test.
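
    A minimal sketch of the two parameters named in the abstract, computed from the first few seconds of the P wave: τc = 2π/√(∫v²dt / ∫u²dt) and Pd as the peak displacement. The regression coefficients in the magnitude relation below are placeholders, not the values fitted from the KiK-net and Wenchuan data.

```python
# Sketch of tau_c and Pd from the early part of a P-wave record. The
# magnitude relation coefficients are placeholders, not the paper's fit.
import numpy as np

def tau_c_and_pd(displacement, velocity, dt, window_s=3.0):
    """tau_c and Pd from the first window_s seconds of the P wave."""
    n = int(window_s / dt)
    u, v = displacement[:n], velocity[:n]
    r = np.sum(v ** 2) / np.sum(u ** 2)      # dt cancels in the ratio
    return 2.0 * np.pi / np.sqrt(r), np.max(np.abs(u))

def magnitude_from_tau_c(tau_c, a=3.37, b=5.83):
    """Hypothetical linear relation M = a*log10(tau_c) + b (placeholder)."""
    return a * np.log10(tau_c) + b

# synthetic demo trace: a 1 Hz displacement pulse sampled at 100 Hz
dt = 0.01
t = np.arange(0.0, 3.0, dt)
u = 1e-4 * np.sin(2.0 * np.pi * 1.0 * t) * np.exp(-t)
v = np.gradient(u, dt)

tc, pd = tau_c_and_pd(u, v, dt)
print(f"tau_c = {tc:.2f} s, Pd = {pd:.2e} m, M ~ {magnitude_from_tau_c(tc):.1f}")
```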

  17. Earthquake evaluation of a substation network

    International Nuclear Information System (INIS)

    Matsuda, E.N.; Savage, W.U.; Williams, K.K.; Laguens, G.C.

    1991-01-01

    The impact of the occurrence of a large, damaging earthquake on a regional electric power system is a function of the geographical distribution of strong shaking, the vulnerability of various types of electric equipment located within the affected region, and operational resources available to maintain or restore electric system functionality. Experience from numerous worldwide earthquake occurrences has shown that seismic damage to high-voltage substation equipment is typically the reason for post-earthquake loss of electric service. In this paper, the authors develop and apply a methodology to analyze earthquake impacts on Pacific Gas and Electric Company's (PG&E's) high-voltage electric substation network in central and northern California. The authors' objectives are to identify and prioritize ways to reduce the potential impact of future earthquakes on our electric system, refine PG&E's earthquake preparedness and response plans to be more realistic, and optimize seismic criteria for future equipment purchases for the electric system

  18. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) remains, even after 25 years, one of the most painful events etched in Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead; 10,000 injured and 3000 families left homeless). Nowadays, the most frequent and important question that arises is "what if this earthquake were repeated today?" In this study, we simulate the ground motion shaking of an earthquake of the same size as that of 12 October 1992 and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the assessment clearly indicates that the losses and damage could be two or three times greater in Cairo today than in the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at low seismic risk. Moreover, the building damage estimates indicate that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb). Moreover, about 75 % of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  19. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance, and ground acceleration is calculated. We found that most Swedish earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as of interest. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  20. Earthquake damage to underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

    The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document damage and non-damage from earthquakes to underground facilities. Damage from earthquakes to tunnels, shafts, and wells, and damage (rock bursts) from mining operations, were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface, because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrance to these facilities. Data indicate that vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository.

  1. Thermal infrared anomalies of several strong earthquakes.

    Science.gov (United States)

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that significant thermal infrared anomalies occur before and after strong earthquakes; these have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitudes up to Ms 7.0, using satellite infrared remote sensing data. We used new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The overall behaviour of the anomalies comprises two main stages: expansion first and contraction later. Such seismic anomalies are readily extracted and identified with the "time-frequency relative power spectrum" method. (2) Each case shows distinct characteristic periods and magnitudes of anomalous thermal radiation. (3) The thermal radiation anomalies are closely related to geological structure. (4) The anomalies have distinctive durations, spatial extents, and morphologies. In summary, earthquake thermal infrared anomalies can serve as a useful precursor in earthquake prediction and forecasting.
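
    The key processing step named in this abstract is a "time-frequency relative power spectrum" of satellite thermal-radiation time series. As a rough illustration of that general idea only (the authors' exact transform, normalisation, and data are not reproduced here), the sketch below computes a short-time power spectrum of a synthetic daily brightness-temperature anomaly and expresses it relative to the long-term background power; the window length and the series are assumptions.

```python
import numpy as np
from scipy.signal import stft

def relative_power_spectrum(series, fs=1.0, window_days=64):
    """Short-time power spectrum normalised by the long-term mean power at each
    frequency.  `series` is a 1-D daily brightness-temperature anomaly; the
    sampling rate and window length are illustrative assumptions."""
    f, t, z = stft(series, fs=fs, nperseg=window_days)
    power = np.abs(z) ** 2
    background = power.mean(axis=1, keepdims=True) + 1e-12
    return f, t, power / background          # values > 1 mean enhanced power vs. background

# toy usage: a synthetic series with a transient oscillatory "anomaly" near day 500
rng = np.random.default_rng(0)
days = np.arange(1000)
series = rng.normal(0.0, 1.0, days.size)
series[480:520] += 3.0 * np.sin(2 * np.pi * days[480:520] / 8.0)

f, t, rel = relative_power_spectrum(series)
print("maximum relative power:", float(rel.max()))
```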

  2. Accelerations from the September 5, 2012 (Mw=7.6) Nicoya, Costa Rica Earthquake

    Science.gov (United States)

    Simila, G. W.; Quintero, R.; Burgoa, B.; Mohammadebrahim, E.; Segura, J.

    2013-05-01

    Since 1984, the Seismic Network of the Volcanological and Seismological Observatory of Costa Rica, Universidad Nacional (OVSICORI-UNA) has been recording the seismicity of Costa Rica. Before September 2012, the earthquakes registered by this network in northwestern Costa Rica were moderate to small, except for the Cóbano earthquake of March 25, 1990, 13:23, Mw 7.3, lat. 9.648, long. 84.913, depth 20 km, a subduction earthquake at the entrance to the Gulf of Nicoya that generated peak intensities of MM VIII near the epicentral area and VI-VII in the Central Valley of Costa Rica. Six years before the installation of the seismic network, two subduction earthquakes occurred in northwestern Costa Rica, on August 23, 1978, at 00:38:32 and 00:50:29, with magnitudes Mw 7.0 (HRVD), Ms 7.0 (ISC) and depths of 58 and 69 km, respectively (EHB Bulletin). On September 5, 2012, at 14:42:02.8 UTC, the OVSICORI-UNA network registered another large subduction earthquake in the Nicoya Peninsula, northwestern Costa Rica, located 29 km south of Samara, with a depth of 21 km and magnitude Mw 7.6, lat. 9.6392, long. 85.6167. This earthquake was caused by the subduction of the Cocos plate under the Caribbean plate in northwestern Costa Rica. It was felt throughout the country and also in much of Nicaragua. The instrumental intensity map for the Nicoya earthquake indicates that it was felt with an intensity of VII-VIII in the Puntarenas and Nicoya Peninsulas, in an area between Liberia, Cañas, Puntarenas, Cabo Blanco, Carrillo, Garza, Sardinal, and Tamarindo in Guanacaste, with the city of Nicoya reporting the maximum intensity of VIII. An intensity of VIII indicates moderate to severe damage, and intensity VII indicates moderate damage. According to the National Emergency Commission of Costa Rica, 371 affected communities were reported; most

  3. Automatic Earthquake Detection by Active Learning

    Science.gov (United States)

    Bergen, K.; Beroza, G. C.

    2017-12-01

    In recent years, advances in machine learning have transformed fields such as image recognition, natural language processing and recommender systems. Many of these performance gains have relied on the availability of large, labeled data sets to train high-accuracy models; labeled data sets are those for which each sample includes a target class label, such as waveforms tagged as either earthquakes or noise. Earthquake seismologists are increasingly leveraging machine learning and data mining techniques to detect and analyze weak earthquake signals in large seismic data sets. One of the challenges in applying machine learning to seismic data sets is the limited labeled data problem; learning algorithms need to be given examples of earthquake waveforms, but the number of known events, taken from earthquake catalogs, may be insufficient to build an accurate detector. Furthermore, earthquake catalogs are known to be incomplete, resulting in training data that may be biased towards larger events and contain inaccurate labels. This challenge is compounded by the class imbalance problem; the events of interest, earthquakes, are infrequent relative to noise in continuous data sets, and many learning algorithms perform poorly on rare classes. In this work, we investigate the use of active learning for automatic earthquake detection. Active learning is a type of semi-supervised machine learning that uses a human-in-the-loop approach to strategically supplement a small initial training set. The learning algorithm incorporates domain expertise through interaction between a human expert and the algorithm, with the algorithm actively posing queries to the user to improve detection performance. We demonstrate the potential of active machine learning to improve earthquake detection performance with limited available training data.
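
    As a concrete illustration of the pool-based active-learning loop described above, the following sketch uses uncertainty sampling with a logistic-regression detector on synthetic stand-in features. The data, feature dimensions, query budget, and the simulated "analyst" (which simply reveals the true labels) are all illustrative assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Synthetic stand-in for waveform features: class 0 = "noise", class 1 = "earthquake",
# with earthquakes rare (the class-imbalance situation described in the abstract).
X = rng.normal(size=(5000, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 5000) > 2.5).astype(int)

# Small, possibly biased initial training set drawn from a catalog of known events.
labeled = list(rng.choice(np.where(y == 1)[0], 5)) + list(rng.choice(np.where(y == 0)[0], 20))
pool = list(set(range(len(y))) - set(labeled))

clf = LogisticRegression(max_iter=1000)
for _ in range(10):                                   # 10 query rounds
    clf.fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[pool])[:, 1]
    # uncertainty sampling: query the pool samples closest to the decision boundary
    query = [pool[i] for i in np.argsort(np.abs(proba - 0.5))[:10]]
    labeled += query                                  # the "analyst" supplies true labels here
    pool = [i for i in pool if i not in set(query)]

print("final training-set size:", len(labeled))
print("detector accuracy on remaining pool:", round(clf.score(X[pool], y[pool]), 3))
```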

  4. Indoor radon and earthquake

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time, based on experience from the Spitak earthquake (Armenia, December 1988), it is shown that an earthquake causes intense and prolonged radon bursts which, while rapidly dispersing in the open near-surface atmosphere, are strongly expressed in enclosed premises (dwellings, schools, kindergartens), even at considerable distance from the earthquake epicenter, and this multiplies the radiation exposure of the population. The interval of the bursts spans the period from the first foreshock to the last aftershock, i.e. several months. The affected area is larger than Armenia's territory. The scale of this impact on the population is 12 times greater than the number of people injured in Spitak, Leninakan, and other settlements (toll of injured: 25,000 people; radiation-related illness: over 300,000 people). The influence of the radiation correlates directly with the earthquake strength. This conclusion is underpinned by indoor radon monitoring data for Yerevan (120 km from the epicenter) since 1987 (5,450 measurements) and by multivariate analysis identifying cause-and-effect links between the geodynamics of indoor radon under stable and unstable conditions of the Earth's crust, the behavior of radon in different geological media during earthquakes, indoor radon concentrations and effective equivalent dose, the impact of radiation dose on health, and statistical data on public health provided by the Ministry of Health. The following hitherto unexplained facts can be considered consequences of prolonged radiation influence on the human organism: the long-lasting state of apathy and indifference typical of the population of Armenia for more than a year after the earthquake, and the prevalence of malignant cancers, dominated by lung cancer, in the disaster zones. All urban territories of seismically active regions are exposed to the threat of natural earthquake-provoked radiation influence

  5. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    Science.gov (United States)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge; the others are the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity, and the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset date of significant earthquakes, the assumption being that each earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes cluster around particular lunar-solar configurations (Moon conjunct Sun, Moon opposite Sun, Moon conjunct or opposite the North or South Nodes). In order to test improvement of the method we used all magnitude 8+ earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3, i.e. ±1 day of the target date) and for <6.5R. The successes are counted for each one of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is on the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with the earthquake date, in which case the FDL method coincides with the MFDL. Based on the MFDL method we present a prediction method capable of predicting global events or localized earthquakes, and we discuss the accuracy of the method as far as the prediction and location parts are concerned. We show example calendar-style predictions for global events as well as for the Greek region using
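
    The published papers behind this abstract are not reproduced here, so the sketch below is only a guess at the flavour of the procedure it describes: it adds Fibonacci- and Lucas-numbered day offsets to a seed earthquake date and checks whether a target date falls within the ±1 day window used in the hit-rate test. The offsets, the window, and the example dates are assumptions, not the authors' exact recipe.

```python
from datetime import date, timedelta

def fibonacci_numbers(limit):
    seq, a, b = [], 1, 2
    while a <= limit:
        seq.append(a)
        a, b = b, a + b
    return seq                      # 1, 2, 3, 5, 8, ...

def lucas_numbers(limit):
    seq, a, b = [], 2, 1
    while a <= limit:
        seq.append(a)
        a, b = b, a + b
    return seq                      # 2, 1, 3, 4, 7, 11, ...

def fdl_candidate_dates(seed, max_days=3650):
    """Candidate future dates: the seed date plus Fibonacci and Lucas day offsets
    (an assumed reading of the method, not the published rules)."""
    offsets = sorted(set(fibonacci_numbers(max_days) + lucas_numbers(max_days)))
    return [seed + timedelta(days=k) for k in offsets]

def hit(candidates, target, window_days=1):
    """Is the target date within +-window_days of any candidate date?"""
    return any(abs((c - target).days) <= window_days for c in candidates)

seed = date(1906, 4, 18)                        # illustrative seed earthquake date
print(hit(fdl_candidate_dates(seed), date(1906, 5, 22)))   # 34 days later -> True
```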

  6. Earthquake-induced soft-sediment deformations and seismically amplified erosion rates recorded in varved sediments of Köyceğiz Lake (SW Turkey)

    KAUST Repository

    Avsar, Ulas

    2016-06-06

    Earthquake-triggered landslides amplify erosion rates in catchments, i.e. catchment response to seismic shocks (CR). In addition to historical eyewitness accounts of muddy rivers implying CRs after large earthquakes, several studies have quantitatively reported increased sediment concentrations in rivers after earthquakes. However, only a few paleolimnological studies could detect CRs within lacustrine sedimentary sequences as siliciclastic-enriched intercalations within background sedimentation. Since siliciclastic-enriched intercalations can easily be of non-seismic origin, their temporal correlation with nearby earthquakes is crucial to assign a seismic triggering mechanism. In most cases, either uncertainties in dating methods or the lack of recent seismic activity has prevented reliable temporal correlations, making the seismic origin of observed sedimentary events questionable. Here, we attempt to remove this question mark by presenting sedimentary traces of CRs in the 370-year-long varved sequence of Köyceğiz Lake (SW Turkey) that we compare with estimated peak ground acceleration (PGA) values of several nearby earthquakes. We find that earthquakes exceeding estimated PGA values of ca. 20 cm/s² can induce soft-sediment deformations (SSD), while CRs seem only to be triggered by PGA levels higher than 70 cm/s². In Köyceğiz Lake, CRs produce Cr- and Ni-enriched sedimentation due to the seismically mobilized soils derived from ultramafic rocks in the catchment. Given the varve chronology, the residence time of the seismically mobilized material in the catchment is determined to be 5 to 10 years.

  7. Fault geometry and earthquake mechanics

    Directory of Open Access Journals (Sweden)

    D. J. Andrews

    1994-06-01

    Earthquake mechanics may be determined by the geometry of a fault system. Slip on a fractal branching fault surface can explain: (1) regeneration of stress irregularities in an earthquake; (2) the concentration of stress drop in an earthquake into asperities; (3) starting and stopping of earthquake slip at fault junctions; and (4) self-similar scaling of earthquakes. Slip at fault junctions provides a natural realization of barrier and asperity models without appealing to variations of fault strength. Fault systems are observed to have a branching fractal structure, and slip may occur at many fault junctions in an earthquake. Consider the mechanics of slip at one fault junction. In order to avoid a stress singularity of order 1/r, an intersection of faults must be a triple junction, and the Burgers vectors on the three fault segments at the junction must sum to zero. In other words, to lowest order the deformation consists of rigid block displacement, which ensures that the local stress due to the dislocations is zero. The elastic dislocation solution, however, ignores the fact that the configuration of the blocks changes at the scale of the displacement. A volume change occurs at the junction; either a void opens or intense local deformation is required to avoid material overlap. The volume change is proportional to the product of the slip increment and the total slip since the formation of the junction. Energy absorbed at the junction, equal to confining pressure times the volume change, is not large enough to prevent slip at a new junction. The ratio of energy absorbed at a new junction to elastic energy released in an earthquake is no larger than P/µ, where P is confining pressure and µ is the shear modulus. At a depth of 10 km this dimensionless ratio has the value P/µ = 0.01. As slip accumulates at a fault junction in a number of earthquakes, the fault segments are displaced such that they no longer meet at a single point. For this reason the
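
    The closing estimate, P/µ ≈ 0.01 at 10 km depth, is easy to check with generic crustal values; the density and shear modulus below are standard textbook numbers, not figures taken from the article.

```python
# Rough check of the P/mu estimate quoted for 10 km depth.
rho = 2700.0       # crustal density, kg/m^3 (generic assumption)
g = 9.81           # gravitational acceleration, m/s^2
h = 10e3           # depth, m
mu = 30e9          # shear modulus, Pa (generic crustal value)

P = rho * g * h    # lithostatic confining pressure, ~2.6e8 Pa
print(f"P     = {P / 1e6:.0f} MPa")
print(f"P/mu  = {P / mu:.3f}")    # ~0.01, consistent with the value quoted in the abstract
```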

  8. Determination of Design Basis Earthquake ground motion

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Muneaki [Japan Atomic Power Co., Tokyo (Japan)

    1997-03-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, the earthquake response spectrum, and simulated seismic waves. In the appendix, the seismic safety review of nuclear power plants designed before publication of the Examination Guide is summarized in terms of the Check Basis Earthquake. (J.P.N.)

  9. Determination of Design Basis Earthquake ground motion

    International Nuclear Information System (INIS)

    Kato, Muneaki

    1997-01-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, the earthquake response spectrum, and simulated seismic waves. In the appendix, the seismic safety review of nuclear power plants designed before publication of the Examination Guide is summarized in terms of the Check Basis Earthquake. (J.P.N.)

  10. Impact- and earthquake- proof roof structure

    International Nuclear Information System (INIS)

    Shohara, Ryoichi.

    1990-01-01

    The building roof consists of roof slabs, an earthquake-proof layer on their upper surface, and an impact-proof layer of steel-reinforced concrete placed above that. Because the roof forms an earthquake-proof structure in which the concrete layer loads building dampers on the upper surface of the slabs, the seismic input of earthquakes to the building is moderated, while the impact-proof layer ensures safety against external events such as earthquakes or aircraft crashes at important facilities such as reactor buildings. (T.M.)

  11. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    Science.gov (United States)

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
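
    The core PAGER product described here is population exposure per shaking-intensity level. A minimal sketch of that aggregation step is given below, assuming a co-registered shaking-intensity grid and population grid; the toy arrays and intensity bins are illustrative and are not PAGER's actual data model or exposure algorithm.

```python
import numpy as np

def exposure_by_intensity(mmi_grid, pop_grid, bins=range(5, 11)):
    """Sum the population in cells whose estimated MMI rounds to each intensity level."""
    mmi_rounded = np.clip(np.rint(mmi_grid), 1, 10)
    return {int(i): float(pop_grid[mmi_rounded == i].sum()) for i in bins}

# toy grids standing in for a shaking-intensity field and a population raster
rng = np.random.default_rng(2)
mmi = np.clip(rng.normal(6.0, 1.2, size=(200, 200)), 1, 10)
pop = rng.integers(0, 500, size=(200, 200)).astype(float)

for level, people in exposure_by_intensity(mmi, pop).items():
    print(f"MMI {level}: ~{people:,.0f} people exposed")
```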

  12. Application of τc*Pd for identifying damaging earthquakes for earthquake early warning

    Science.gov (United States)

    Huang, P. L.; Lin, T. L.; Wu, Y. M.

    2014-12-01

    An Earthquake Early Warning System (EEWS) is an effective approach to mitigating earthquake damage. In this study, we used records from the Kiban Kyoshin network (KiK-net), because it has dense station coverage and borehole strong-motion seismometers co-located with free-surface strong-motion seismometers. We used inland earthquakes with moment magnitudes (Mw) from 5.0 to 7.3 between 1998 and 2012, choosing 135 events and 10,950 strong-motion accelerograms recorded by 696 accelerographs. Both the free-surface and the borehole data were used to calculate τc and Pd. The results show that τc*Pd correlates well with PGV and is a robust parameter for assessing the damage potential of an earthquake. We propose that the value of τc*Pd determined within a few seconds after the P-wave arrival could serve as a threshold for the on-site type of EEW.
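
    τc and Pd are standard on-site early-warning parameters computed from the initial seconds of the P wave: Pd is the peak displacement in the window, and τc = 2π·sqrt(∫u² dt / ∫u̇² dt) over the same window. The sketch below assumes a commonly used 3 s window and a synthetic record; neither value is taken from the abstract.

```python
import numpy as np

def tau_c_and_pd(displacement, dt, window_s=3.0):
    """tau_c and Pd from the first `window_s` seconds after the P arrival.
    `displacement` is the vertical displacement record starting at the P pick."""
    n = int(window_s / dt)
    u = displacement[:n]
    udot = np.gradient(u, dt)
    tau_c = 2.0 * np.pi * np.sqrt(np.sum(u**2) / np.sum(udot**2))
    pd = np.abs(u).max()
    return tau_c, pd

# toy record: a decaying 1 Hz displacement pulse sampled at 100 Hz (stand-in for a real P wave)
dt = 0.01
t = np.arange(0.0, 5.0, dt)
u = 1e-4 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-t)

tau_c, pd = tau_c_and_pd(u, dt)
print(f"tau_c = {tau_c:.2f} s, Pd = {pd * 100:.4f} cm, tau_c*Pd = {tau_c * pd * 100:.4f} cm*s")
```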

  13. The Road to Total Earthquake Safety

    Science.gov (United States)

    Frohlich, Cliff

    Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.

  14. Earthquake forecasting test for Kanto district to reduce vulnerability of urban mega earthquake disasters

    Science.gov (United States)

    Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

    2012-12-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project on earthquake predictability research. The final goal of this project is to search for the intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, CSEP-Japan. This testing center provides open access to researchers contributing earthquake forecast models applied to Japan. More than 100 earthquake forecast models have now been submitted to the prospective experiment. The models are separated into 4 testing classes (1 day, 3 months, 1 year and 3 years) and 3 testing regions: an area of Japan including offshore areas, the Japanese mainland, and the Kanto district. We evaluate the performance of the models in the official suite of tests defined by CSEP. Approximately 300 rounds of experiments have been carried out. These results provide new knowledge concerning statistical forecasting models. We have started a study on constructing a 3-dimensional earthquake forecasting model for the Kanto district in Japan, based on the CSEP experiments, under the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters. Because seismicity in the area extends from shallow depths down to 80 km, owing to the subducting Philippine Sea and Pacific plates, we need to study the effect of the depth distribution. We will develop forecasting models based on the results of 2-D modeling. We defined the 3-D forecasting area in the Kanto region with test classes of 1 day, 3 months, 1 year and 3 years, and magnitudes from 4.0 to 9.0, as in CSEP-Japan. In the first step of the study, we will install the RI10K model (Nanjo, 2011) and the HIST-ETAS models (Ogata, 2011) to check whether those models perform as well as in the 3-month 2-D CSEP-Japan experiments in the Kanto region before the 2011 Tohoku event (Yokoi et al., in preparation). We use CSEP
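
    One of the simplest tests in the CSEP evaluation suite is the N-test, which asks whether the observed number of target earthquakes is consistent with the forecast's expected number under a Poisson assumption. The sketch below is a minimal illustration of that idea, not the testing centre's actual implementation; the forecast and observation numbers are made up.

```python
from scipy.stats import poisson

def n_test(expected, observed, alpha=0.05):
    """Two one-sided Poisson tail probabilities used in the CSEP-style N-test:
    delta1 = P(X >= observed) and delta2 = P(X <= observed) for X ~ Poisson(expected).
    The forecast is rejected if either tail probability falls below alpha/2."""
    delta1 = 1.0 - poisson.cdf(observed - 1, expected)   # too many events?
    delta2 = poisson.cdf(observed, expected)             # too few events?
    passed = min(delta1, delta2) >= alpha / 2.0
    return delta1, delta2, passed

# illustrative numbers: a 3-month forecast expecting 12.4 target events, 18 observed
d1, d2, ok = n_test(12.4, 18)
print(f"delta1 = {d1:.3f}, delta2 = {d2:.3f}, forecast passes N-test: {ok}")
```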

  15. Temporal stress changes caused by earthquakes: A review

    Science.gov (United States)

    Hardebeck, Jeanne L.; Okada, Tomomi

    2018-01-01

    Earthquakes can change the stress field in the Earth’s lithosphere as they relieve and redistribute stress. Earthquake-induced stress changes have been observed as temporal rotations of the principal stress axes following major earthquakes in a variety of tectonic settings. The stress changes due to the 2011 Mw9.0 Tohoku-Oki, Japan, earthquake were particularly well documented. Earthquake stress rotations can inform our understanding of earthquake physics, most notably addressing the long-standing problem of whether the Earth’s crust at plate boundaries is “strong” or “weak.” Many of the observed stress rotations, including that due to the Tohoku-Oki earthquake, indicate near-complete stress drop in the mainshock. This implies low background differential stress, on the order of earthquake stress drop, supporting the weak crust model. Earthquake stress rotations can also be used to address other important geophysical questions, such as the level of crustal stress heterogeneity and the mechanisms of postseismic stress reloading. The quantitative interpretation of stress rotations is evolving from those based on simple analytical methods to those based on more sophisticated numerical modeling that can capture the spatial-temporal complexity of the earthquake stress changes.

  16. Remote Triggering of the Mw 6.9 Hokkaido Earthquake as a Result of the Mw 6.6 Indonesian Earthquake on September 11, 2008

    Directory of Open Access Journals (Sweden)

    Cheng-Horng Lin

    2012-01-01

    Only recently has the phenomenon of earthquakes being triggered by a distant earthquake become well established, and most of the triggered earthquakes have been limited to small events (M < 3). The exact triggering mechanism is still not clear. Here I show how one strong earthquake (Mw = 6.6) is capable of triggering another (Mw = 6.9) at a remote distance (~4750 km). On September 11, 2008, two strong earthquakes with magnitudes (Mw) of 6.6 and 6.9 struck Indonesia and Japan, respectively, within a short interval of ~21 minutes. Careful examination of broadband seismograms recorded in Japan shows that the Hokkaido earthquake occurred just as the surface waves generated by the Indonesia earthquake arrived. Although the peak dynamic stress estimated at the focus of the Hokkaido earthquake was only just at the lower bound generally required to trigger earthquakes, a more plausible triggering mechanism may be a change in fault properties caused by fluid infiltration. These observations suggest that the Hokkaido earthquake was likely triggered from a remote distance by the surface waves generated by the Indonesia earthquake. If more such cases are observed, temporary warnings of possible interactions between strong earthquakes might be considered in the future.

  17. Living with earthquakes - development and usage of earthquake-resistant construction methods in European and Asian Antiquity

    Science.gov (United States)

    Kázmér, Miklós; Major, Balázs; Hariyadi, Agus; Pramumijoyo, Subagyo; Ditto Haryana, Yohanes

    2010-05-01

    Earthquakes are among the most horrible events of nature because they occur unexpectedly and no spiritual means offer protection; the only way of preserving life and property is to apply earthquake-resistant construction methods. Ancient Greek architects of public buildings applied steel clamps embedded in lead casing to hold together columns and masonry walls during frequent earthquakes in the Aegean region: elastic steel provided strength, while the plastic lead casing absorbed minor shifts of blocks without fracturing the rigid stone. The Romans invented concrete and built buildings of all sizes as single, inflexible units; the masonry surrounding and decorating the concrete core of a wall did not bear load. Concrete resisted minor shaking, yielding only to forces higher than its fracture limit. Roman building traditions survived the Dark Ages, and 12th-century Crusader castles erected in earthquake-prone Syria still stand today in reasonably good condition. Concrete and steel clamping persisted side by side in the Roman Empire. Concrete was used for cheap construction compared with masonry; applying lead-encased steel increased costs and was avoided whenever possible. Columns of the various forums in Italian Pompeii mostly lack steel fittings despite being situated in a well-known earthquake-prone area. Whether the frequent recurrence of earthquakes in the Naples region was known to the inhabitants of Pompeii may be a matter of debate; seemingly the shock of the AD 62 earthquake was not enough to prompt the application of well-known protective engineering methods during the reconstruction of the city before the AD 79 volcanic catastrophe. An independent engineering tradition developed on the island of Java (Indonesia). The mortar-less construction technique of 8-9th century Hindu masonry shrines around Yogyakarta would allow scattering of blocks during earthquakes; to prevent dilapidation, an intricate mortise-and-tenon system was carved into adjacent faces of blocks. Only the

  18. Bam Earthquake in Iran

    CERN Multimedia

    2004-01-01

    Following its request for help from members of international organisations, the Permanent Mission of the Islamic Republic of Iran has provided the following bank account number, to which you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  19. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    Science.gov (United States)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in 2001. The January 13, 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1,300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2×10⁵ m³), produced major damage to buildings and infrastructure and 500 fatalities; a neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13, 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km², mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7×10⁵ m³) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km³ and 12 km³ produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes; most of them occurred in pyroclastic deposits, with volumes of less than 1×10³ m³. The present work aims to define the relationship between the earthquake intensity described above and the size and areal distribution of induced landslides, as well as to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provide useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  20. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  1. Inter-Disciplinary Validation of Pre Earthquake Signals. Case Study for Major Earthquakes in Asia (2004-2010) and for 2011 Tohoku Earthquake

    Science.gov (United States)

    Ouzounov, D.; Pulinets, S.; Hattori, K.; Liu, J.-Y.; Yang. T. Y.; Parrot, M.; Kafatos, M.; Taylor, P.

    2012-01-01

    We carried out multi-sensor observations in our investigation of phenomena preceding major earthquakes. Our approach is based on a systematic analysis of several physical and environmental parameters that we found to be associated with earthquake processes: thermal infrared radiation, temperature and concentration of electrons in the ionosphere, radon/ion activities, and air temperature/humidity in the atmosphere. We used satellite and ground observations and interpreted them with the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model, one of the possible paradigms we study and support. We made two independent, continuous hind-cast investigations in Taiwan and Japan for a total of 102 earthquakes (M > 6) occurring from 2004 to 2011. We analyzed: (1) ionospheric electromagnetic radiation, plasma, and energetic electron measurements from DEMETER; (2) emitted long-wavelength radiation (OLR) from NOAA/AVHRR and NASA/EOS; (3) radon/ion variations (in situ data); and (4) GPS Total Electron Content (TEC) measurements collected from space- and ground-based observations. This joint analysis of ground and satellite data has shown that one to six (or more) days prior to the largest earthquakes there were anomalies in all of the analyzed physical observations. For the March 11, 2011 Tohoku earthquake, our analysis again shows the same relationship between several independent observations characterizing lithosphere/atmosphere coupling. On March 7 we found a rapid increase of emitted infrared radiation in the satellite data, and subsequently an anomaly developed near the epicenter. The GPS/TEC data indicated an increase and variation in electron density reaching a maximum value on March 8; beginning on this day we confirmed an abnormal TEC variation over the epicenter in the lower ionosphere. These findings revealed the existence of atmospheric and ionospheric phenomena occurring prior to the 2011 Tohoku earthquake, which indicated new evidence of a distinct

  2. POST Earthquake Debris Management - AN Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge communities around the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures over a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as of the waste generated during reconstruction, can place significant demands on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous: soil, building material, and green waste, such as trees and shrubs, make up most of its volume. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be disposed of at landfill sites, reused as construction material, or recycled into useful commodities. Therefore, the debris clearance operation should adopt a geotechnical engineering approach as an important post-earthquake issue, to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, taking into account the different criteria related to the execution of the operation, is proposed by highlighting the key issues concerning the handling of the construction

  3. Simulating Earthquakes for Science and Society: Earthquake Visualizations Ideal for use in Science Communication and Education

    Science.gov (United States)

    de Groot, R.

    2008-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  4. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Directory of Open Access Journals (Sweden)

    Mustafa ULAS

    2013-01-01

    A lot of people die because of earthquakes every year, so it is crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are reviewed in order to guide new research towards novel prediction methods.

  5. Building with Earthquakes in Mind

    Science.gov (United States)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in non-traditional earthquake prone areas are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students were challenged in this project to construct a basic structure, using limited resources, that could withstand a simulated tremor through the use of an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Due to the fact that most buildings in New York City were not designed to withstand earthquake shaking, the students were able to gain an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  6. Meeting the Challenge of Earthquake Risk Globalisation: Towards the Global Earthquake Model GEM (Sergey Soloviev Medal Lecture)

    Science.gov (United States)

    Zschau, J.

    2009-04-01

    Earthquake risk, like natural risks in general, has become a highly dynamic and globally interdependent phenomenon. Owing to the "urban explosion" in the Third World, the increasingly complex cross-linking of critical infrastructure and lifelines in the industrial nations, and the growing globalisation of the world's economies, we presently face a dramatic increase in our society's vulnerability to earthquakes in practically all seismic regions of the globe. Such fast and global changes can no longer be captured with conventional earthquake risk models. The sciences in this field are therefore asked to come up with new solutions that are no longer aimed exclusively at the best possible quantification of present risks, but that also keep an eye on how these risks change with time and allow those changes to be projected into the future. This applies not only to the vulnerability component of earthquake risk but also to its hazard component, which has been recognised to be time-dependent as well. The challenges of earthquake risk dynamics and globalisation have recently been taken up by the Global Science Forum of the Organisation for Economic Co-operation and Development (OECD-GSF), which initiated the Global Earthquake Model (GEM), a public-private partnership for establishing an independent standard to calculate, monitor, and communicate earthquake risk globally, raise awareness, and promote mitigation.

  7. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation.

    Science.gov (United States)

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-05-10

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011.

  8. Possible scenarios for occurrence of M ~ 7 interplate earthquakes prior to and following the 2011 Tohoku-Oki earthquake based on numerical simulation

    Science.gov (United States)

    Nakata, Ryoko; Hori, Takane; Hyodo, Mamoru; Ariyoshi, Keisuke

    2016-01-01

    We show possible scenarios for the occurrence of M ~ 7 interplate earthquakes prior to and following the M ~ 9 earthquake along the Japan Trench, such as the 2011 Tohoku-Oki earthquake. One such M ~ 7 earthquake is the so-called Miyagi-ken-Oki earthquake, for which we conducted numerical simulations of earthquake generation cycles by using realistic three-dimensional (3D) geometry of the subducting Pacific Plate. In a number of scenarios, the time interval between the M ~ 9 earthquake and the subsequent Miyagi-ken-Oki earthquake was equal to or shorter than the average recurrence interval during the later stage of the M ~ 9 earthquake cycle. The scenarios successfully reproduced important characteristics such as the recurrence of M ~ 7 earthquakes, coseismic slip distribution, afterslip distribution, the largest foreshock, and the largest aftershock of the 2011 earthquake. Thus, these results suggest that we should prepare for future M ~ 7 earthquakes in the Miyagi-ken-Oki segment even though this segment recently experienced large coseismic slip in 2011. PMID:27161897

  9. Laboratory generated M -6 earthquakes

    Science.gov (United States)

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-01-01

    We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than characteristics of the experimental apparatus. The large size of the experimental apparatus, high fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separates this study from traditional acoustic emission analyses and allows these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.
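
    The claim that mm-scale patches with stress drops of 1-10 MPa produce events around M -7 to M -5.5 can be checked with the standard circular-crack relation M0 = (16/7)·Δσ·r³ and the moment-magnitude definition Mw = (2/3)·log10(M0) - 6.07. The radii and the 3 MPa stress drop below are generic illustrations, not values measured in the experiments.

```python
import numpy as np

def moment_from_patch(stress_drop_pa, radius_m):
    """Seismic moment of a circular crack (Eshelby): M0 = (16/7) * dSigma * r^3."""
    return (16.0 / 7.0) * stress_drop_pa * radius_m**3

def moment_magnitude(m0_newton_m):
    """Moment magnitude (Hanks & Kanamori): Mw = (2/3) * log10(M0) - 6.07, with M0 in N*m."""
    return (2.0 / 3.0) * np.log10(m0_newton_m) - 6.07

for radius_mm in (1.0, 3.0, 10.0):
    m0 = moment_from_patch(3e6, radius_mm * 1e-3)   # assume a 3 MPa stress drop
    print(f"r = {radius_mm:4.1f} mm  ->  M0 = {m0:9.2e} N*m,  Mw = {moment_magnitude(m0):5.2f}")
# mm-to-cm scale patches yield roughly Mw -7.5 to -5.5, consistent with the abstract.
```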

  10. Radon, gas geochemistry, groundwater, and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    King, Chi-Yu [Power Reactor and Nuclear Fuel Development Corp., Tono Geoscience Center, Toki, Gifu (Japan)

    1998-12-31

    Radon monitoring in groundwater, soil air, and atmosphere has been continued in many seismic areas of the world for earthquake-prediction and active-fault studies. Some recent measurements of radon and other geochemical and hydrological parameters have been made for sufficiently long periods, with reliable instruments, and together with measurements of meteorological variables and solid-earth tides. The resultant data are useful in better distinguishing earthquake-related changes from various background noises. Some measurements have been carried out in areas where other geophysical measurements are being made also. Comparative studies of various kinds of geophysical data are helpful in ascertaining the reality of the earthquake-related and fault-related anomalies and in understanding the underlying mechanisms. Spatial anomalies of radon and other terrestrial gasses have been observed for many active faults. Such observations indicate that gas concentrations are very much site dependent, particularly on fault zones where terrestrial fluids may move vertically. Temporal anomalies have been reliably observed before and after some recent earthquakes, including the 1995 Kobe earthquake, and the general pattern of anomaly occurrence remains the same as observed before: They are recorded at only relatively few sensitive sites, which can be at much larger distances than expected from existing earthquake-source models. The sensitivity of a sensitive site is also found to be changeable with time. These results clearly show the inadequacy of the existing dilatancy-fluid diffusion and elastic-dislocation models for earthquake sources to explain earthquake-related geochemical and geophysical changes recorded at large distances. (J.P.N.)

  11. The Christchurch earthquake stroke incidence study.

    Science.gov (United States)

    Wu, Teddy Y; Cheung, Jeanette; Cole, David; Fink, John N

    2014-03-01

    We examined the impact of major earthquakes on acute stroke admissions by a retrospective review of stroke admissions in the 6 weeks following the 4 September 2010 and 22 February 2011 earthquakes. The control period was the corresponding 6 weeks in the previous year. In the 6 weeks following the September 2010 earthquake there were 97 acute stroke admissions, with 79 (81.4%) ischaemic infarctions. This was similar to the 2009 control period which had 104 acute stroke admissions, of whom 80 (76.9%) had ischaemic infarction. In the 6 weeks following the February 2011 earthquake, there were 71 stroke admissions, and 61 (79.2%) were ischaemic infarction. This was less than the 96 strokes (72 [75%] ischaemic infarction) in the corresponding control period. None of the comparisons were statistically significant. There was also no difference in the rate of cardioembolic infarction from atrial fibrillation between the study periods. Patients admitted during the February 2011 earthquake period were less likely to be discharged directly home when compared to the control period (31.2% versus 46.9%, p=0.036). There was no observable trend in the number of weekly stroke admissions between the 2 weeks leading to and 6 weeks following the earthquakes. Our results suggest that severe psychological stress from earthquakes did not influence the subsequent short term risk of acute stroke, but the severity of the earthquake in February 2011 and associated civil structural damages may have influenced the pattern of discharge for stroke patients. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Source modeling of the 2015 Mw 7.8 Nepal (Gorkha) earthquake sequence: Implications for geodynamics and earthquake hazards

    Science.gov (United States)

    McNamara, D. E.; Yeck, W. L.; Barnhart, W. D.; Schulte-Pelkum, V.; Bergman, E.; Adhikari, L. B.; Dixit, A.; Hough, S. E.; Benz, H. M.; Earle, P. S.

    2017-09-01

    The Gorkha earthquake on April 25th, 2015 was a long-anticipated, low-angle thrust-faulting event on the shallow décollement between the India and Eurasia plates. We present a detailed multiple-event hypocenter relocation analysis of the Mw 7.8 Gorkha Nepal earthquake sequence, constrained by local seismic stations, and a geodetic rupture model based on InSAR and GPS data. We integrate these observations to place the Gorkha earthquake sequence into a seismotectonic context and evaluate potential earthquake hazard. Major results from this study include: (1) a comprehensive catalog of calibrated hypocenters for the Gorkha earthquake sequence; (2) the Gorkha earthquake ruptured a 150 × 60 km patch of the Main Himalayan Thrust (MHT), the décollement defining the plate boundary at depth, over an area surrounding but predominantly north of the capital city of Kathmandu; (3) the distribution of aftershock seismicity surrounds the mainshock maximum slip patch; (4) aftershocks occur at or below the mainshock rupture plane with depths generally increasing to the north beneath the higher Himalaya, possibly outlining a 10-15 km thick subduction channel between the overriding Eurasian and subducting Indian plates; (5) the largest Mw 7.3 aftershock and the highest concentration of aftershocks occurred to the southeast of the mainshock rupture, on a segment of the MHT décollement that was positively stressed towards failure; and (6) the near-surface portion of the MHT south of Kathmandu shows no aftershocks or slip during the mainshock. Results from this study characterize the details of the Gorkha earthquake sequence and provide constraints on where earthquake hazard remains high, and thus where future, damaging earthquakes may occur in this densely populated region. Up-dip segments of the MHT should be considered high-hazard for future damaging earthquakes.

  13. Plant state display device after occurrence of earthquake

    International Nuclear Information System (INIS)

    Kitada, Yoshio; Yonekura, Kazuyoshi.

    1992-01-01

    If a nuclear power plant experiences an earthquake, previously stored earthquake response analysis values are compared with the observed ground motions to judge the magnitude of the event. From the result of this judgement, the possibility that an abnormality has appeared in the plant equipment systems after the earthquake is evaluated by comparison with a previously stored seismic fragility database for each equipment system, and the result of the evaluation is displayed in the central control room. Equipment systems judged to have a high probability of abnormality are further evaluated, using a previously stored seismic PSA method, for the influence of the abnormality on plant safety, and that result is also displayed in the central control room. (I.S.)

  14. A minimalist model of characteristic earthquakes

    DEFF Research Database (Denmark)

    Vázquez-Prada, M.; González, Á.; Gómez, J.B.

    2002-01-01

    In a spirit akin to the sandpile model of self-organized criticality, we present a simple statistical model of the cellular-automaton type which simulates the role of an asperity in the dynamics of a one-dimensional fault. This model produces an earthquake spectrum similar to the characteristic-earthquake behaviour of some seismic faults. The model, which has no parameters, is amenable to an algebraic description as a Markov chain. This possibility illuminates some important results obtained by Monte Carlo simulations, such as the earthquake size-frequency relation and the recurrence time of the characteristic earthquake.
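
    The abstract does not reproduce the automaton's update rules, so the sketch below is only a generic toy of the same flavour, not the published model: cells of a 1-D array are occupied one at a time at random, and whenever the designated asperity cell is selected it slips together with the contiguous run of occupied cells next to it, producing an "earthquake" whose size is recorded.

```python
import numpy as np
from collections import Counter

def toy_asperity_automaton(n_cells=64, n_steps=200_000, seed=0):
    """Generic toy automaton loosely inspired by the abstract (NOT the published rules):
    at each step a random cell is selected; ordinary cells simply become occupied,
    while selecting cell 0 (the 'asperity') triggers an event that empties the
    contiguous run of occupied cells adjacent to it."""
    rng = np.random.default_rng(seed)
    occupied = np.zeros(n_cells, dtype=bool)
    sizes = []
    for _ in range(n_steps):
        k = rng.integers(n_cells)
        if k == 0:
            size = 1                              # the asperity cell itself always slips
            while size < n_cells and occupied[size]:
                size += 1
            occupied[1:size] = False              # the contiguous occupied run slips with it
            sizes.append(size)
        else:
            occupied[k] = True
    return Counter(sizes)

hist = toy_asperity_automaton()
for size, count in sorted(hist.items())[:5]:
    print(f"event size {size}: {count} occurrences")
print("largest event size observed:", max(hist))
```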

  15. EARTHQUAKE RESEARCH PROBLEMS OF NUCLEAR POWER GENERATORS

    Energy Technology Data Exchange (ETDEWEB)

    Housner, G. W.; Hudson, D. E.

    1963-10-15

    Earthquake problems associated with the construction of nuclear power generators require a more extensive and a more precise knowledge of earthquake characteristics and the dynamic behavior of structures than was considered necessary for ordinary buildings. Economic considerations indicate the desirability of additional research on the problems of earthquakes and nuclear reactors. The nature of these earthquake-resistant design problems is discussed and programs of research are recommended. (auth)

  16. Seismic-resistant design of nuclear power stations in Japan, earthquake country. Lessons learned from Chuetsu-oki earthquake

    International Nuclear Information System (INIS)

    Irikura, Kojiro

    2008-01-01

    A new assessment (back-check) of seismic safety was being conducted at the Kashiwazaki-Kariwa Nuclear Power Plants of Tokyo Electric Co., in response to a request based on the guideline for seismic-resistant design of reactors revised in 2006, when the 2007 Chuetsu-oki Earthquake occurred and produced unexpectedly strong shaking in the area: although the magnitude of the earthquake was only 6.8, the intensity of ground motion exceeded the assumed level by more than a factor of 2.5. This paper describes how and why the guideline for seismic-resistant design of nuclear facilities was revised in 2006, outlines the Chuetsu-oki Earthquake, and presents preliminary findings and lessons learned from the earthquake. The paper specifically discusses (1) how active geologic faults, such as the one overlooked this time, may be identified in advance, (2) how adequate models of the seismic source can be constructed so that its characteristics can be extracted, and (3) how strong ground motion may be estimated for the shaking level of a possibly overlooked fault. (S. Ohno)

  17. Antioptimization of earthquake excitation and response

    Directory of Open Access Journals (Sweden)

    G. Zuccaro

    1998-01-01

    The paper presents a novel approach to predicting the response of earthquake-excited structures. The earthquake excitation is expanded in terms of a series of deterministic functions, and the coefficients of the series are represented as a point in N-dimensional space. Each available accelerogram at a certain site is then represented as a point in this space, modeling the available fragmentary historical data. The minimum-volume ellipsoid containing all points is constructed, yielding ellipsoidal models of the uncertainty pertinent to the earthquake excitation. The maximum response of a structure subjected to the earthquake excitation, with the excitation modeled ellipsoidally, is then determined. This procedure of determining the least favorable response was termed in the literature (Elishakoff, 1991) antioptimization. It appears that under the inherent uncertainty of earthquake excitation, antioptimization analysis is a viable alternative to the stochastic approach.
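
    When the structural response depends linearly on the expansion coefficients and those coefficients are confined to an ellipsoid, the antioptimized (least favourable) response has a closed form: the maximum of c^T x over {x : (x - x0)^T W (x - x0) <= 1} is c^T x0 + sqrt(c^T W^{-1} c). The sketch below illustrates only this bounding step; the ellipsoid is a random stand-in for one fitted to historical accelerograms, and the linear response functional c is an assumption.

```python
import numpy as np

def worst_case_linear_response(c, center, W):
    """Maximum of c @ x over the ellipsoid (x - center)^T W (x - center) <= 1.
    c maps the excitation coefficients to a scalar response (assumed linear)."""
    return float(c @ center + np.sqrt(c @ np.linalg.solve(W, c)))

rng = np.random.default_rng(3)
n = 6                                    # number of expansion coefficients
A = rng.normal(size=(n, n))
W = A @ A.T + n * np.eye(n)              # stand-in for a fitted ellipsoid shape matrix
center = 0.1 * rng.normal(size=n)        # stand-in for the ellipsoid centre
c = rng.normal(size=n)                   # stand-in for the response functional

print("least favourable response bound:", round(worst_case_linear_response(c, center, W), 3))
```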

  18. Ecological implications of extreme events: footprints of the 2010 earthquake along the Chilean coast.

    Science.gov (United States)

    Jaramillo, Eduardo; Dugan, Jenifer E; Hubbard, David M; Melnick, Daniel; Manzano, Mario; Duarte, Cristian; Campos, Cesar; Sanchez, Roland

    2012-01-01

    Deciphering ecological effects of major catastrophic events such as earthquakes, tsunamis, volcanic eruptions, storms and fires requires rapid interdisciplinary efforts often hampered by a lack of pre-event data. Using results of intertidal surveys conducted shortly before and immediately after Chile's 2010 Mw 8.8 earthquake along the entire rupture zone (ca. 34-38°S), we provide the first quantification of earthquake and tsunami effects on sandy beach ecosystems. Our study incorporated anthropogenic coastal development as a key design factor. Ecological responses of beach ecosystems were strongly affected by the magnitude of land-level change. Along the northern rupture segment, subsidence combined with tsunami-associated disturbance drowned beaches. In contrast, along the co-seismically uplifted southern rupture, beaches widened and flattened, increasing habitat availability. Post-event changes in abundance and distribution of mobile intertidal invertebrates were not uniform, varying with land-level change, tsunami height and coastal development. On beaches where subsidence occurred, intertidal zones and their associated species disappeared. On some beaches, uplift of rocky sub-tidal substrate eliminated low intertidal sand beach habitat for ecologically important species. On others, unexpected interactions of uplift with man-made coastal armouring included restoration of upper and mid-intertidal habitat seaward of armouring, followed by rapid colonization by mobile crustaceans typical of these zones that had formerly been excluded by constraints imposed by the armouring structures. Responses of coastal ecosystems to major earthquakes appear to vary strongly with land-level change, the mobility of the biota and shore type. Our results show that interactions of extreme events with human-altered shorelines can produce surprising ecological outcomes, and suggest these complex responses to landscape alteration can leave lasting footprints in coastal ecosystems.

  19. Modified Mercalli intensities for some recent California earthquakes and historic San Francisco Bay Region earthquakes

    Science.gov (United States)

    Bakun, William H.

    1998-01-01

    Modified Mercalli Intensity (MMI) data for recent California earthquakes were used by Bakun and Wentworth (1997) to develop a strategy for bounding the location and moment magnitude M of earthquakes from MMI observations only. Bakun (Bull. Seismol. Soc. Amer., submitted) used the Bakun and Wentworth (1997) strategy to analyze 19th century and early 20th century San Francisco Bay Region earthquakes. The MMI data and site corrections used in these studies are listed in this Open-file Report. 

  20. <> earthquakes: a growing contribution to the Catalogue of Strong Italian Earthquakes

    Directory of Open Access Journals (Sweden)

    E. Guidoboni

    2000-06-01

    Full Text Available The particular structure of the research into historical seismology behind this catalogue has allowed much information about unknown seismic events to be traced. This new contribution to seismological knowledge mainly consists of: (i) the retrieval and organisation within a coherent framework of documentary evidence of earthquakes that took place between the Middle Ages and the sixteenth century; (ii) improved knowledge of seismic events, even destructive events, which in the past had been "obscured" by large earthquakes; (iii) the identification of earthquakes in "silent" seismic areas. The complex elements to be taken into account when dealing with unknown seismic events have been outlined; much "new" information often falls into one of the following categories: simple chronological errors relative to other well-known events; descriptions of other natural phenomena, though defined in texts as "earthquakes" (landslides, hurricanes, tornadoes, etc.); unknown tremors belonging to known seismic periods; tremors that may be connected with events which have been catalogued under incorrect dates and with very approximate estimates of location and intensity. This shows that there was no real seismic "silence" but rather a research vacuum.

  1. Smartphone MEMS accelerometers and earthquake early warning

    Science.gov (United States)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    The low-cost MEMS accelerometers in smartphones are attracting more and more attention from the science community due to their vast number and potential applications in various areas. We are using the accelerometers inside smartphones to detect earthquakes. We performed shake-table tests showing that these accelerometers are also suitable for recording the strong shaking caused by earthquakes. We developed an Android app - MyShake - which can distinguish earthquake motion from daily human activities in the recordings made by the accelerometers in personal smartphones and upload trigger information/waveforms to our server for further analysis. The data from these smartphones form a unique dataset for seismological applications, such as earthquake early warning. In this talk I will lay out the method we use to recognize earthquake-like movement from a single smartphone, and give an overview of the whole system that harnesses the information from a network of smartphones for rapid earthquake detection. This type of system can be easily deployed and scaled up around the globe and provides additional insight into earthquake hazards.

  2. Ionospheric precursors for crustal earthquakes in Italy

    Directory of Open Access Journals (Sweden)

    L. Perrone

    2010-04-01

    Full Text Available Crustal earthquakes with magnitude 6.0>M≥5.5 observed in Italy for the period 1979–2009, including the most recent one at L'Aquila on 6 April 2009, were considered to check whether the relationships obtained earlier for ionospheric precursors of strong Japanese earthquakes are valid for these moderate Italian earthquakes. The ionospheric precursors are based on the observed variations of the sporadic E-layer parameters (h'Es, fbEs) and foF2 at the ionospheric station Rome. Empirical dependencies for the seismo-ionospheric disturbances relating the earthquake magnitude and the epicentral distance are obtained and have been shown to be similar to those obtained earlier for Japanese earthquakes. The dependencies indicate a process of spreading of the disturbance from the epicenter towards the periphery during the earthquake preparation process. Large lead times for the precursor occurrence (up to 34 days for M=5.8–5.9) indicate a prolonged preparation period. A possibility of using the obtained relationships for earthquake prediction is discussed.

  3. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.
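
    The Gutenberg–Richter branch of such a calculation can be illustrated with a Reasenberg–Jones style aftershock-rate model integrated over a three-day window. The Python sketch below uses generic, illustrative parameter values (assumed here, not the location-specific values of the study), so it will not reproduce the probabilities quoted in the abstract; it only shows the structure of the computation.

      import numpy as np
      from scipy.integrate import quad

      # Reasenberg-Jones (1989) style rate of events with magnitude >= M at time t (days)
      # after a magnitude Mm "mainshock" (here, the candidate foreshock):
      #   lambda(t, M) = 10**(a + b*(Mm - M)) * (t + c)**(-p)
      # The generic-California-like values below are illustrative assumptions.
      a, b, p, c = -1.67, 0.91, 1.08, 0.05

      def prob_of_event(Mm, M, t1=0.0, t2=3.0):
          """Probability of at least one event >= M in (t1, t2) days, Poisson assumption."""
          rate = lambda t: 10.0 ** (a + b * (Mm - M)) * (t + c) ** (-p)
          n_expected, _ = quad(rate, t1, t2)
          return 1.0 - np.exp(-n_expected)

      # probability that an M>=7 event follows an M 4.8 candidate foreshock within 3 days
      print(f"P(M>=7 within 3 days | M4.8): {prob_of_event(4.8, 7.0):.2e}")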

  4. The Pocatello Valley, Idaho, earthquake

    Science.gov (United States)

    Rogers, A. M.; Langer, C.J.; Bucknam, R.C.

    1975-01-01

    A Richter magnitude 6.3 earthquake occurred at 8:31 p.m. Mountain Daylight Time on March 27, 1975, near the Utah-Idaho border in Pocatello Valley. The epicenter of the main shock was located at 42.094° N, 112.478° W, and had a focal depth of 5.5 km. This earthquake was the largest in the continental United States since the destructive San Fernando earthquake of February 1971. The main shock was preceded by a magnitude 4.5 foreshock on March 26.

  5. Evaluating spatial and temporal relationships between an earthquake cluster near Entiat, central Washington, and the large December 1872 Entiat earthquake

    Science.gov (United States)

    Brocher, Thomas M.; Blakely, Richard J.; Sherrod, Brian

    2017-01-01

    We investigate spatial and temporal relations between an ongoing and prolific seismicity cluster in central Washington, near Entiat, and the 14 December 1872 Entiat earthquake, the largest historic crustal earthquake in Washington. A fault scarp produced by the 1872 earthquake lies within the Entiat cluster; the locations and areas of both the cluster and the estimated 1872 rupture surface are comparable. Seismic intensities and the 1–2 m of coseismic displacement suggest a magnitude range between 6.5 and 7.0 for the 1872 earthquake. Aftershock forecast models for (1) the first several hours following the 1872 earthquake, (2) the largest felt earthquakes from 1900 to 1974, and (3) the seismicity within the Entiat cluster from 1976 through 2016 are also consistent with this magnitude range. Based on this aftershock modeling, most of the current seismicity in the Entiat cluster could represent aftershocks of the 1872 earthquake. Other earthquakes, especially those with long recurrence intervals, have long-lived aftershock sequences, including the Mw 7.5 1891 Nobi earthquake in Japan, with aftershocks continuing 100 yrs after the mainshock. Although we do not rule out ongoing tectonic deformation in this region, a long-lived aftershock sequence can account for these observations.
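
    A rough sense of how an aftershock sequence can remain visible for more than a century comes from the modified Omori (Omori-Utsu) decay law. The short Python sketch below compares an assumed aftershock rate roughly 144 years after a mainshock with an assumed background rate; all parameter values are illustrative, not those fitted by the authors.

      # Modified Omori (Omori-Utsu) aftershock rate: n(t) = K / (t + c)**p  (events/day).
      # Illustrative parameter values only; the paper fits its own models to the Entiat data.
      K, c, p = 50.0, 0.1, 1.0          # assumed productivity, offset (days), decay exponent
      background = 1.0e-4               # assumed independent background rate (events/day)

      def omori_rate(t_days):
          return K / (t_days + c) ** p

      t = 144 * 365.25                  # ~144 years after the 1872 mainshock, in days
      print(f"aftershock rate after 144 yr: {omori_rate(t):.2e} events/day")
      print(f"assumed background rate:      {background:.2e} events/day")
      # With p close to 1 the aftershock rate decays slowly enough that it can still
      # exceed a low mid-continent background rate a century or more after the mainshock.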

  6. Prospective testing of Coulomb short-term earthquake forecasts

    Science.gov (United States)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily and allows daily updates of the models. However, much can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be computed without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of
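
    The quantity being forecast in such models is the Coulomb failure stress change, ΔCFS = Δτ + μ′Δσn, resolved on receiver faults. The Python sketch below shows only this final resolution step with assumed stress values; computing the full stress tensor from a finite-fault slip model (e.g., an elastic dislocation calculation) is not shown.

      def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
          """Change in Coulomb failure stress on a receiver fault (MPa).

          d_shear  : shear stress change resolved in the slip direction (MPa, + promotes slip)
          d_normal : normal stress change on the fault plane (MPa, + = unclamping)
          mu_eff   : effective friction coefficient (dimensionless, typically 0.2-0.8)
          """
          return d_shear + mu_eff * d_normal

      # a receiver fault loaded by +0.05 MPa of shear and unclamped by +0.02 MPa
      print(f"dCFS = {coulomb_stress_change(0.05, 0.02):+.3f} MPa")
      # positive dCFS brings the fault closer to failure and, in this class of models,
      # raises the short-term probability of triggered earthquakes there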

  7. Surface latent heat flux as an earthquake precursor

    Directory of Open Access Journals (Sweden)

    S. Dey

    2003-01-01

    Full Text Available The analysis of surface latent heat flux (SLHF) from the epicentral regions of five recent earthquakes that occurred in close proximity to the oceans has been found to show anomalous behavior. The maximum increase of SLHF is found 2–7 days prior to the main earthquake event. This increase is likely due to an ocean-land-atmosphere interaction. The increase of SLHF prior to the main earthquake event is attributed to the increase in infrared thermal (IR) temperature in the epicentral and surrounding region. The anomalous increase in SLHF shows great potential for providing early warning of a disastrous earthquake, provided that there is a better understanding of the background noise due to the tides and monsoon in surface latent heat flux. Efforts have been made to understand the level of background noise in the epicentral regions of the five earthquakes considered in the present paper. A comparison of SLHF from the epicentral regions of the coastal earthquakes and of the earthquakes that occurred far away from the coast has been made, and it has been found that the anomalous behavior of SLHF prior to the main earthquake event is associated only with the coastal earthquakes.
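
    A common way to flag such anomalies is to compare the target year against a multi-year background for the same calendar days. The Python sketch below uses an assumed day-of-year climatology and a 2-sigma threshold on synthetic data; it is an illustration of the anomaly-detection idea, not the authors' processing chain.

      import numpy as np

      def slhf_anomalies(series_by_year, threshold_sigma=2.0):
          """Flag days whose SLHF exceeds the multi-year mean by `threshold_sigma` std devs.

          series_by_year : array of shape (n_years, n_days), daily SLHF at the epicentral
                           grid cell for several background years plus the target year (last row).
          """
          background = series_by_year[:-1]            # earlier years define the climatology
          target = series_by_year[-1]                 # year containing the earthquake
          mean = background.mean(axis=0)
          std = background.std(axis=0)
          z = (target - mean) / np.where(std > 0, std, np.inf)
          return np.flatnonzero(z > threshold_sigma)  # day indices flagged as anomalous

      # synthetic demonstration: 5 background years plus a target year with an injected bump
      rng = np.random.default_rng(2)
      data = rng.normal(100.0, 10.0, size=(6, 365))
      data[-1, 196] += 40.0                           # pre-earthquake-like anomaly on day 196
      print("anomalous days:", slhf_anomalies(data))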

  8. Method to Determine Appropriate Source Models of Large Earthquakes Including Tsunami Earthquakes for Tsunami Early Warning in Central America

    Science.gov (United States)

    Tanioka, Yuichiro; Miranda, Greyving Jose Arguello; Gusman, Aditya Riadi; Fujii, Yushiro

    2017-08-01

    Large earthquakes, such as the Mw 7.7 1992 Nicaragua earthquake, have occurred off the Pacific coasts of El Salvador and Nicaragua in Central America and have generated destructive tsunamis along these coasts. It is necessary to determine appropriate fault models before large tsunamis hit the coast. In this study, fault parameters were first estimated from the W-phase inversion, and then an appropriate fault model was determined from the fault parameters and scaling relationships with a depth-dependent rigidity. The method was tested for four large earthquakes, the 1992 Nicaragua tsunami earthquake (Mw 7.7), the 2001 El Salvador earthquake (Mw 7.7), the 2004 El Astillero earthquake (Mw 7.0), and the 2012 El Salvador-Nicaragua earthquake (Mw 7.3), which occurred off El Salvador and Nicaragua in Central America. Tsunami numerical simulations were carried out using the determined fault models. We found that the observed tsunami heights, run-up heights, and inundation areas were reasonably well explained by the computed ones. Therefore, for tsunami early warning purposes, our method should be able to estimate a fault model that reproduces tsunami heights near the coasts of El Salvador and Nicaragua for large earthquakes in the subduction zone.
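
    The step from a W-phase magnitude to a simple fault model can be illustrated as follows. The Python sketch below converts Mw and centroid depth into a rectangular uniform-slip fault using an assumed area-magnitude scaling and an assumed depth-rigidity profile; both are placeholders for the calibrated relations used in the paper.

      import numpy as np

      def rigidity(depth_km):
          """Illustrative depth-dependent rigidity (Pa): low near the trench, higher at depth."""
          return np.interp(depth_km, [0, 10, 20, 40], [1.0e10, 2.0e10, 4.0e10, 6.5e10])

      def fault_model(Mw, depth_km):
          """Very simple uniform-slip fault model from Mw and centroid depth."""
          M0 = 10 ** (1.5 * Mw + 9.1)                 # seismic moment (N m)
          area_km2 = 10 ** (Mw - 4.0)                 # assumed scaling: A [km^2] ~ 10^(Mw-4)
          L = np.sqrt(2.0 * area_km2)                 # assume L = 2W
          W = L / 2.0
          slip = M0 / (rigidity(depth_km) * area_km2 * 1e6)   # D = M0 / (mu * A)
          return L, W, slip

      for name, Mw, z in [("1992 Nicaragua (tsunami eq.)", 7.7, 10.0),
                          ("2012 El Salvador-Nicaragua", 7.3, 30.0)]:
          L, W, D = fault_model(Mw, z)
          print(f"{name}: L={L:.0f} km, W={W:.0f} km, slip={D:.1f} m")
      # with the low shallow rigidity, the shallow event needs much larger slip for the
      # same moment, which is the qualitative signature of a tsunami earthquake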

  9. Earthquake simulation, actual earthquake monitoring and analytical methods for soil-structure interaction investigation

    Energy Technology Data Exchange (ETDEWEB)

    Tang, H T [Seismic Center, Electric Power Research Institute, Palo Alto, CA (United States)]

    1988-07-01

    Approaches for conducting in-situ soil-structure interaction experiments are discussed. High explosives detonated under the ground can generate strong ground motion to induce soil-structure interaction (SSI). The explosive-induced data are useful in studying the dynamic characteristics of the soil-structure system associated with the inertial aspect of the SSI problem. The plane waves generated by the explosives cannot adequately address the kinematic interaction associated with actual earthquakes because of the difference in wave fields and their effects. Earthquake monitoring is ideal for obtaining SSI data that can address all aspects of the SSI problem. The only limitation is the level of excitation that can be obtained. Neither the simulated earthquake experiments nor the earthquake monitoring experiments can have exact similitude if reduced-scale test structures are used. If gravity effects are small, reasonable correlations between the scaled model and the prototype can be obtained provided that the input motion can be scaled appropriately. The key product of the in-situ experiments is the data base that can be used to qualify analytical methods for prototypical applications. (author)

  10. Real-time earthquake data feasible

    Science.gov (United States)

    Bush, Susan

    Scientists agree that early warning devices and monitoring of both Hurricane Hugo and the Mt. Pinatubo volcanic eruption saved thousands of lives. What would it take to develop this sort of early warning and monitoring system for earthquake activity? Not all that much, claims a panel assigned to study the feasibility, costs, and technology needed to establish a real-time earthquake monitoring (RTEM) system. The panel, drafted by the National Academy of Sciences' Committee on Seismology, has presented its findings in Real-Time Earthquake Monitoring. The recently released report states that “present technology is entirely capable of recording and processing data so as to provide real-time information, enabling people to mitigate somewhat the earthquake disaster.” RTEM systems would consist of two parts—an early warning system that would give a few seconds' warning before severe shaking, and immediate post-quake information within minutes of the quake that would give actual measurements of the magnitude. At this time, however, this type of warning system has not been addressed at the national level for the United States and is not included in the National Earthquake Hazards Reduction Program, according to the report.

  11. Critical behavior in earthquake energy dissipation

    Science.gov (United States)

    Wanliss, James; Muñoz, Víctor; Pastén, Denisse; Toledo, Benjamín; Valdivia, Juan Alejandro

    2017-09-01

    We explore bursty multiscale energy dissipation from earthquakes flanked by latitudes 29° S and 35.5° S, and longitudes 69.501° W and 73.944° W (in the Chilean central zone). Our work compares the predictions of a theory of nonequilibrium phase transitions with nonstandard statistical signatures of earthquake complex scaling behaviors. For temporal scales less than 84 hours, time development of earthquake radiated energy activity follows an algebraic arrangement consistent with estimates from the theory of nonequilibrium phase transitions. There are no characteristic scales for probability distributions of sizes and lifetimes of the activity bursts in the scaling region. The power-law exponents describing the probability distributions suggest that the main energy dissipation takes place due to largest bursts of activity, such as major earthquakes, as opposed to smaller activations which contribute less significantly though they have greater relative occurrence. The results obtained provide statistical evidence that earthquake energy dissipation mechanisms are essentially "scale-free", displaying statistical and dynamical self-similarity. Our results provide some evidence that earthquake radiated energy and directed percolation belong to a similar universality class.
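
    The exponents of such burst-size distributions are usually estimated by maximum likelihood rather than by fitting a line to a log-log histogram. The Python sketch below applies the standard Hill/Clauset-style estimator to synthetic power-law data; the choice of lower cutoff and the synthetic sample are assumptions of this illustration.

      import numpy as np

      def powerlaw_mle(x, xmin):
          """Hill/Clauset-style MLE for the exponent alpha of p(x) ~ x**(-alpha), x >= xmin."""
          tail = np.asarray(x, dtype=float)
          tail = tail[tail >= xmin]
          alpha = 1.0 + tail.size / np.sum(np.log(tail / xmin))
          stderr = (alpha - 1.0) / np.sqrt(tail.size)
          return alpha, stderr

      # synthetic burst sizes drawn from a pure power law with alpha = 1.8
      rng = np.random.default_rng(3)
      alpha_true, xmin = 1.8, 1.0
      u = rng.random(5000)
      sizes = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))   # inverse-CDF sampling
      alpha_hat, err = powerlaw_mle(sizes, xmin)
      print(f"estimated alpha = {alpha_hat:.3f} +/- {err:.3f} (true {alpha_true})")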

  12. Centrality in earthquake multiplex networks

    Science.gov (United States)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

    Seismic time series have been mapped as complex networks, where a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
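
    The single-layer construction that this work generalizes can be sketched in a few lines: grid the region, connect the cells visited by successive events, and compute eigenvector centrality. The Python example below does this with networkx on a synthetic catalogue; the multiplex (layer-per-time-window) extension and the real Iran/California catalogues are not included.

      import numpy as np
      import networkx as nx

      def cell_of(lat, lon, lat0, lon0, size_deg):
          """Map an epicentre to an integer (row, col) grid cell."""
          return (int((lat - lat0) // size_deg), int((lon - lon0) // size_deg))

      # synthetic catalogue: successive epicentres in a 2x2-degree region
      rng = np.random.default_rng(4)
      lats = 34.0 + 2.0 * rng.random(500)
      lons = -120.0 + 2.0 * rng.random(500)

      # build a directed "earthquake network": an edge from the cell of event i
      # to the cell of event i+1 (Abe-Suzuki style construction)
      G = nx.DiGraph()
      cells = [cell_of(la, lo, 34.0, -120.0, 0.25) for la, lo in zip(lats, lons)]
      for a, b in zip(cells[:-1], cells[1:]):
          if a != b:
              G.add_edge(a, b)

      centrality = nx.eigenvector_centrality(G.to_undirected(), max_iter=1000)
      top = sorted(centrality.items(), key=lambda kv: -kv[1])[:5]
      print("most central cells:", top)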

  13. Parallelization of the Coupled Earthquake Model

    Science.gov (United States)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.

    2007-01-01

    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, tsunami prediction over the Internet had never been done before. This new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  14. Ionospheric Anomaly before the Kyushu, Japan Earthquake

    Directory of Open Access Journals (Sweden)

    YANG Li

    2017-05-01

    Full Text Available GIM data released by IGS are used in this article, and a new method combining the Sliding Time Window Method with an ionospheric TEC correlation analysis of adjacent grid points is proposed to study the relationship between pre-earthquake ionospheric anomalies and the earthquake. By analyzing the abnormal change of TEC at the 5 grid points around the seismic region, abnormal changes of ionospheric TEC are found before the earthquake, and the correlation between the TEC sequences at the grid points is shown to be significantly affected by the earthquake. Based on the analysis of the spatial distribution of the TEC anomaly, anomalies lasting 6 h, 12 h and 6 h were found near the epicenter three days before the earthquake. Finally, ionospheric tomography is used to invert the electron density, and the distribution of the electron density within the ionospheric anomaly is further analyzed.
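
    A minimal version of a sliding-time-window test on a TEC series, together with a running correlation between two adjacent grid points, is sketched below in Python. The 27-day window, the quartile-based bounds and the synthetic data are assumptions of this illustration, not the exact procedure of the paper.

      import numpy as np

      def sliding_window_anomalies(tec, window=27, k=1.5):
          """Flag epochs where TEC leaves the inter-quartile bounds of the preceding window."""
          flags = []
          for i in range(window, len(tec)):
              prev = tec[i - window:i]
              q1, q3 = np.percentile(prev, [25, 75])
              lo, hi = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
              if tec[i] < lo or tec[i] > hi:
                  flags.append(i)
          return flags

      def adjacent_correlation(tec_a, tec_b, window=27):
          """Running correlation between TEC at two adjacent grid points."""
          return [np.corrcoef(tec_a[i - window:i], tec_b[i - window:i])[0, 1]
                  for i in range(window, len(tec_a))]

      # synthetic daily TEC at two neighbouring grid points, with an anomaly injected at one
      rng = np.random.default_rng(5)
      base = 20.0 + 5.0 * np.sin(np.linspace(0, 6 * np.pi, 120))
      tec1 = base + rng.normal(0, 1.0, 120)
      tec2 = base + rng.normal(0, 1.0, 120)
      tec1[100] += 8.0                       # pre-earthquake-like enhancement at one point only
      print("anomalous epochs:", sliding_window_anomalies(tec1))
      print("min adjacent correlation:", round(min(adjacent_correlation(tec1, tec2)), 3))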

  15. Ultra-high resolution four dimensional geodetic imaging of engineered structures for stability assessment

    Science.gov (United States)

    Bawden, Gerald W.; Bond, Sandra; Podoski, J. H.; Kreylos, O.; Kellogg, L. H.

    2012-01-01

    We used ground-based Tripod LiDAR (T-LiDAR) to assess the stability of two engineered structures: a bridge spanning the San Andreas fault following the M6.0 Parkfield earthquake in central California, and a newly built coastal breakwater at Kaumālapa`u Harbor, Lana'i, Hawaii. In the 10 weeks following the earthquake, we found that the surface under the bridge shifted 7.1 cm, with an additional 2.6 cm of motion in the subsequent 13 weeks, which deflected the bridge's northern I-beam support 4.3 cm and 2.1 cm, respectively; the bridge integrity remained intact. T-LiDAR imagery was collected after the completion of the armored breakwater, built with 817 35-ton interlocking concrete armor units (Core-Locs®), in the summers of 2007, 2008 and 2010. We found a wide range of motion of individual Core-Locs, from a few centimeters to >110 cm along the ocean side of the breakwater, with lesser movement along the harbor side.

  16. U.S.-Japan Quake Prediction Research

    Science.gov (United States)

    Kisslinger, Carl; Mikumo, Takeshi; Kanamori, Hiroo

    For the seventh time since 1964, a seminar on earthquake prediction has been convened under the U.S.-Japan Cooperation in Science Program. The purpose of the seminar was to provide an opportunity for researchers from the two countries to share recent progress and future plans in the continuing effort to develop the scientific basis for predicting earthquakes and practical means for implementing prediction technology as it emerges. Thirty-six contributors, 15 from Japan and 21 from the U.S., met in Morro Bay, Calif., September 12-14. The following day they traveled to nearby sections of the San Andreas fault, including the site of the Parkfield prediction experiment. The conveners of the seminar were Hiroo Kanamori, Seismological Laboratory, California Institute of Technology (Caltech), for the U.S., and Takeshi Mikumo, Disaster Prevention Research Institute, Kyoto University, for Japan. Funding for the participants came from the U.S. National Science Foundation and the Japan Society for the Promotion of Science, supplemented by other agencies in both countries.

  17. Retrospective analysis of the Spitak earthquake

    Directory of Open Access Journals (Sweden)

    A. K. Tovmassian

    1995-06-01

    Full Text Available Based on the retrospective analysis of numerous data and studies of the Spitak earthquake, the present work attempts to shed light on different aspects of that catastrophic seismic event which occurred in Northern Armenia on December 7, 1988. The authors follow a chronological order of presentation, namely: changes in geosphere, atmosphere, biosphere during the preparation of the Spitak earthquake, foreshocks, main shock, aftershocks, focal mechanisms, historical seismicity; seismotectonic position of the source, strong motion records, site effects; the macroseismic effect, collapse of buildings and structures; rescue activities; earthquake consequences; and the lessons of the Spitak earthquake.

  18. Nonlinear acoustic/seismic waves in earthquake processes

    International Nuclear Information System (INIS)

    Johnson, Paul A.

    2012-01-01

    Nonlinear dynamics induced by seismic sources and seismic waves are common in Earth. Observations range from seismic strong ground motion (the most damaging aspect of earthquakes), intense near-source effects, and distant nonlinear effects from the source that have important consequences. The distant effects include dynamic earthquake triggering—one of the most fascinating topics in seismology today—which may be elastically nonlinearly driven. Dynamic earthquake triggering is the phenomenon whereby seismic waves generated from one earthquake trigger slip events on a nearby or distant fault. Dynamic triggering may take place at distances thousands of kilometers from the triggering earthquake, and includes triggering of the entire spectrum of slip behaviors currently identified. These include triggered earthquakes and triggered slow, silent-slip during which little seismic energy is radiated. It appears that the elasticity of the fault gouge—the granular material located between the fault blocks—is key to the triggering phenomenon.

  19. On the plant operators performance during earthquake

    International Nuclear Information System (INIS)

    Kitada, Y.; Yoshimura, S.; Abe, M.; Niwa, H.; Yoneda, T.; Matsunaga, M.; Suzuki, T.

    1994-01-01

    There is little data on which to judge the performance of plant operators during and after strong earthquakes. In order to obtain such data and enhance the reliability of plant operation, a Japanese utility and a power plant manufacturer carried out a vibration test using a shaking table. The purpose of the test was to investigate operator performance, i.e., the quickness and correctness in switch handling and panel meter read-out. The movement of chairs during an earthquake was also of interest, because if the chairs moved significantly or turned over during a strong earthquake, some arresting mechanism would be required for the chair. Although there were differences between the simulated earthquake motions used and actual earthquakes, mainly due to the specifications of the shaking table, the earthquake motions had almost no influence on the operators' capability (performance) in operating the simulated console and the personal computers.

  20. Data base pertinent to earthquake design basis

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1988-01-01

    Mitigation of earthquake risk from impending strong earthquakes is possible provided the hazard can be assessed and translated into appropriate design inputs. This requires defining the seismic risk problem, isolating the risk factors and quantifying risk in terms of physical parameters which are suitable for application in design. Like all other geological phenomena, past earthquakes hold the key to the understanding of future ones. Quantification of seismic risk at a site calls for investigating the earthquake aspects of the site region and building a data base. The scope of such investigations is illustrated in Figures 1 and 2. A more detailed definition of the earthquake problem in engineering design is given elsewhere (Sharma, 1987). The present document discusses the earthquake data base which is required to support a seismic risk evaluation programme in the context of the existing state of the art. (author). 8 tables, 10 figs., 54 refs

  1. Seismic-electromagnetic precursors of Romania's Vrancea earthquakes

    International Nuclear Information System (INIS)

    Enescu, B.D.; Enescu, C.; Constantin, A. P.

    1999-01-01

    Diagrams were plotted from electromagnetic data recorded at Muntele Rosu Observatory during December 1996 to January 1997 and December 1997 to September 1998. The times when Vrancea earthquakes of magnitudes M ≥ 3.9 occurred within these periods are marked on the diagrams. The parameters of the earthquakes are given in a table which also includes information on the magnetic and electric anomalies (perturbations) preceding these earthquakes. The magnetic data show that Vrancea earthquakes are preceded by magnetic perturbations that may be regarded as their short-term precursors. Perturbations which could likewise be seen as short-term precursors of Vrancea earthquakes are also noticed in the electric records. Still, a number of the electric data cast doubt on their forerunning nature. Some suggestions are made at the end of the paper on how electromagnetic research should proceed to be of use for Vrancea earthquake prediction. (authors)

  2. Earthquake activity along the Himalayan orogenic belt

    Science.gov (United States)

    Bai, L.; Mori, J. J.

    2017-12-01

    The collision between the Indian and Eurasian plates formed the Himalayas, the largest orogenic belt on the Earth. The entire region accommodates shallow earthquakes, while intermediate-depth earthquakes are concentrated at the eastern and western Himalayan syntaxis. Here we investigate the focal depths, fault plane solutions, and source rupture process for three earthquake sequences, which are located at the western, central and eastern regions of the Himalayan orogenic belt. The Pamir-Hindu Kush region is located at the western Himalayan syntaxis and is characterized by extreme shortening of the upper crust and strong interaction of various layers of the lithosphere. Many shallow earthquakes occur on the Main Pamir Thrust at focal depths shallower than 20 km, while intermediate-deep earthquakes are mostly located below 75 km. Large intermediate-depth earthquakes occur frequently at the western Himalayan syntaxis about every 10 years on average. The 2015 Nepal earthquake is located in the central Himalayas. It is a typical megathrust earthquake that occurred on the shallow portion of the Main Himalayan Thrust (MHT). Many of the aftershocks are located above the MHT and illuminate faulting structures in the hanging wall with dip angles that are steeper than the MHT. These observations provide new constraints on the collision and uplift processes for the Himalaya orogenic belt. The Indo-Burma region is located south of the eastern Himalayan syntaxis, where the strike of the plate boundary suddenly changes from nearly east-west at the Himalayas to nearly north-south at the Burma Arc. The Burma arc subduction zone is a typical oblique plate convergence zone. The eastern boundary is the north-south striking dextral Sagaing fault, which hosts many shallow earthquakes with focal depth less than 25 km. In contrast, intermediate-depth earthquakes along the subduction zone reflect east-west trending reverse faulting.

  3. Medical Requirements During a Natural Disaster: A Case Study on WhatsApp Chats Among Medical Personnel During the 2015 Nepal Earthquake.

    Science.gov (United States)

    Basu, Moumita; Ghosh, Saptarshi; Jana, Arnab; Bandyopadhyay, Somprakash; Singh, Ravikant

    2017-12-01

    The objective of this study was to explore a log of WhatsApp messages exchanged among members of the health care group Doctors For You (DFY) while they were providing medical relief in the aftermath of the Nepal earthquake in April 2015. Our motivation was to identify medical resource requirements during a disaster in order to help government agencies and other responding organizations to be better prepared in any upcoming disaster. A large set of WhatsApp (WhatsApp Inc, Mountain View, CA) messages exchanged among DFY members during the Nepal earthquake was collected and analyzed to identify the medical resource requirements during different phases of relief operations. The study revealed detailed phase-wise requirements for various types of medical resources, including medicines, medical equipment, and medical personnel. The data also reflected some of the problems faced by the medical relief workers in the earthquake-affected region. The insights from this study may help not only the Nepalese government, but also authorities in other earthquake-prone regions of the world to better prepare for similar disasters in the future. Moreover, real-time analysis of such online data during a disaster would aid decision-makers in dynamically formulating resource-mapping strategies. (Disaster Med Public Health Preparedness. 2017;11:652-655).

  4. Prediction of site specific ground motion for large earthquake

    International Nuclear Information System (INIS)

    Kamae, Katsuhiro; Irikura, Kojiro; Fukuchi, Yasunaga.

    1990-01-01

    In this paper, we apply the semi-empirical synthesis method of IRIKURA (1983, 1986) to the estimation of site-specific ground motion using accelerograms observed at Kumatori in Osaka prefecture. The target earthquakes used here are a comparatively distant earthquake (Δ=95 km, M=5.6) caused by the YAMASAKI fault and a near earthquake (Δ=27 km, M=5.6). The results obtained are as follows. 1) The accelerograms from the distant earthquake (M=5.6) are synthesized using the aftershock records (M=4.3) of the 1983 YAMASAKI fault earthquake, whose source parameters have been obtained by other authors from the hypocentral distribution of the aftershocks. The resultant synthetic motions show good agreement with the observed ones. 2) The synthesis for a near earthquake (M=5.6, which we call the target earthquake) is made using a small earthquake which occurred in the neighborhood of the target earthquake. Here, we apply two methods for assigning the parameters for synthesis. One method is to use the parameters of the YAMASAKI fault earthquake, which has the same magnitude as the target earthquake, and the other is to use parameters obtained from several existing empirical formulas. The resultant synthetic motion with the former parameters shows good agreement with the observed one, but that with the latter does not. 3) We estimate the source parameters from the source spectra of several earthquakes which have been observed at this site. Consequently we find that small earthquakes (M<4) used as Green's functions should be treated carefully because their stress drops are not constant. 4) We propose that not only the magnitudes but also the seismic moments of the target earthquake and the small earthquake should be specified. (J.P.N.)
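
    The core of the semi-empirical (empirical Green's function) approach is the superposition of delayed, scaled copies of a small-event record over the subfaults of the target rupture. The Python sketch below is a heavily simplified illustration of that summation; it omits the slip-velocity correction function, radiation-pattern and distance corrections of the full Irikura formulation, and every numerical value in it is an assumption rather than a value from the paper.

      import numpy as np

      def egf_synthesis(small_rec, dt, N, rupture_vel=2.8, subfault_len=2.0, rise_time=0.6):
          """Superpose N*N delayed copies of a small-event record (very simplified).

          small_rec   : small-earthquake acceleration record (m/s^2)
          dt          : sample interval (s)
          N           : scaling number, roughly (M0_target / M0_small)**(1/3)
          rupture_vel : assumed rupture velocity (km/s)
          subfault_len: assumed subfault size (km)
          rise_time   : assumed target rise time (s), divided into N slip episodes
          """
          out = np.zeros(len(small_rec) + int(60.0 / dt))
          for i in range(N):                      # subfaults along strike
              for j in range(N):                  # subfaults down dip
                  # rupture propagation delay from the hypocentre (taken at subfault 0,0)
                  t_rup = np.hypot(i, j) * subfault_len / rupture_vel
                  for k in range(N):              # N slip episodes filling the longer rise time
                      s = int(round((t_rup + k * rise_time / N) / dt))
                      out[s:s + len(small_rec)] += small_rec
          return out

      # toy small-event record: 5 s of tapered noise standing in for an M~4 accelerogram
      rng = np.random.default_rng(6)
      dt = 0.01
      small = rng.normal(0, 0.02, int(5.0 / dt)) * np.hanning(int(5.0 / dt))
      synth = egf_synthesis(small, dt, N=4)
      print("peak small-event amplitude :", np.abs(small).max())
      print("peak synthesized amplitude :", np.abs(synth).max())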

  5. Clustered and transient earthquake sequences in mid-continents

    Science.gov (United States)

    Liu, M.; Stein, S. A.; Wang, H.; Luo, G.

    2012-12-01

    Earthquakes result from sudden release of strain energy on faults. On plate boundary faults, strain energy is constantly accumulating from steady and relatively rapid relative plate motion, so large earthquakes continue to occur so long as motion continues on the boundary. In contrast, such steady accumulation of strain energy does not occur on faults in mid-continents, because the far-field tectonic loading is not steadily distributed between faults, and because stress perturbations from complex fault interactions and other stress triggers can be significant relative to the slow tectonic stressing. Consequently, mid-continental earthquakes are often temporally clustered and transient, and spatially migrating. This behavior is well illustrated by large earthquakes in North China in the past two millennia, during which no large earthquake repeated on the same fault segment, but moment release between large fault systems was complementary. Slow tectonic loading in mid-continents also causes long aftershock sequences. We show that the recent small earthquakes in the Tangshan region of North China are aftershocks of the 1976 Tangshan earthquake (M 7.5), rather than indicators of a new phase of seismic activity in North China, as many fear. Understanding the transient behavior of mid-continental earthquakes has important implications for assessing earthquake hazards. The sequence of large earthquakes in the New Madrid Seismic Zone (NMSZ) in the central US, which includes a cluster of M~7 events in 1811-1812 and perhaps a few similar ones in the past millennium, is likely a transient process, releasing previously accumulated elastic strain on recently activated faults. If so, this earthquake sequence will eventually end. Using simple analysis and numerical modeling, we show that the large NMSZ earthquakes may be ending now or in the near future.

  6. Investigating landslides caused by earthquakes - A historical review

    Science.gov (United States)

    Keefer, D.K.

    2002-01-01

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  7. Post-Earthquake Reconstruction — in Context of Housing

    Science.gov (United States)

    Sarkar, Raju

    Comprehensive rescue and relief operations are always launched without loss of time, with the active participation of the Army, governmental agencies, donor agencies, NGOs, and other voluntary organizations, after each natural disaster. Several natural disasters occur throughout the world round the year, and one of them is the earthquake. More than any other natural catastrophe, an earthquake represents the undoing of our most basic preconceptions of the earth as the source of stability, and the first distressing consequence of an earthquake is the collapse of our dwelling units. Earthquakes have affected buildings since people began constructing them. So after each earthquake a housing reconstruction program is essential, since housing is shelter satisfying one of the so-called basic needs next to food and clothing. It is a well-known fact that resettlement (after an earthquake) is often accompanied by the creation of ghettos and ensuing problems in the provision of infrastructure and employment. In fact, a housing project after the Bhuj earthquake in Gujarat, India, illustrates all the negative aspects of resettlement in the context of reconstruction. The main theme of this paper is to consider a few issues associated with post-earthquake reconstruction in the context of housing, all of which are significant to communities that have had to rebuild after catastrophe or that will face such a need in the future. A few of them are as follows: (1) Why are rebuilding opportunities time consuming? (2) What are the causes of failure in post-earthquake resettlement? (3) How can holistic planning after an earthquake be carried out? (4) What criteria should be checked for sustainable building materials? (5) What are the criteria for success in post-earthquake resettlement? (6) How can mitigation in post-earthquake housing be achieved using appropriate repair, restoration, and strengthening concepts?

  8. Measures for groundwater security during and after the Hanshin-Awaji earthquake (1995) and the Great East Japan earthquake (2011), Japan

    Science.gov (United States)

    Tanaka, Tadashi

    2016-03-01

    Many big earthquakes have occurred in the tectonic regions of the world, especially in Japan. Earthquakes often cause damage to crucial lifeline services such as water, gas and electricity supply systems and even the sewage system in urban and rural areas. The most severe problem for people affected by earthquakes is access to water for drinking/cooking and toilet flushing. Securing safe water for daily life in an earthquake emergency requires the establishment of countermeasures, especially in a mega-city like Tokyo. This paper describes some examples of groundwater use in earthquake emergencies, with reference to reports, books and newspapers published in Japan. The consensus is that groundwater, as a source of water, plays a major role in earthquake emergencies, especially where the accessibility of wells coincides with the emergency need. It is also important to introduce a registration system for citizen-owned and company wells that can form the basis of cooperation during a disaster; such a registration system was implemented by many Japanese local governments after the Hanshin-Awaji Earthquake in 1995 and the Great East Japan Earthquake in 2011, and is one of the most effective countermeasures for groundwater use in an earthquake emergency. Emphasis is also placed on the importance of establishing a continuous monitoring system of groundwater conditions, for both quantity and quality, during non-emergency periods.

  9. New geological perspectives on earthquake recurrence models

    International Nuclear Information System (INIS)

    Schwartz, D.P.

    1997-01-01

    In most areas of the world the record of historical seismicity is too short or uncertain to accurately characterize the future distribution of earthquakes of different sizes in time and space. Most faults have not ruptured once, let alone repeatedly. Ultimately, the ability to correctly forecast the magnitude, location, and probability of future earthquakes depends on how well one can quantify the past behavior of earthquake sources. Paleoseismological trenching of active faults, historical surface ruptures, liquefaction features, and shaking-induced ground deformation structures provides fundamental information on the past behavior of earthquake sources. These studies quantify (a) the timing of individual past earthquakes and fault slip rates, which lead to estimates of recurrence intervals and the development of recurrence models and (b) the amount of displacement during individual events, which allows estimates of the sizes of past earthquakes on a fault. When timing and slip per event are combined with information on fault zone geometry and structure, models that define individual rupture segments can be developed. Paleoseismicity data, in the form of timing and size of past events, provide a window into the driving mechanism of the earthquake engine--the cycle of stress build-up and release

  10. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  11. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the Century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti, quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  12. Earthquakes of Garhwal Himalaya region of NW Himalaya, India: A study of relocated earthquakes and their seismogenic source and stress

    Science.gov (United States)

    R, A. P.; Paul, A.; Singh, S.

    2017-12-01

    Since the continent-continent collision 55 Ma, the Himalaya has accommodated 2000 km of convergence along its arc. Strain is accumulating at a rate of 37-44 mm/yr and is released from time to time as earthquakes. The Garhwal Himalaya is located on the western side of a seismic gap, where a great earthquake has been overdue for at least 200 years. This seismic gap (Central Seismic Gap: CSG), with a 52% probability of a future great earthquake, is located between the rupture zones of two significant/great earthquakes, viz. the 1905 Kangra earthquake of M 7.8 and the 1934 Bihar-Nepal earthquake of M 8.0; the most recent one, the 2015 Gorkha earthquake of M 7.8, lies on the eastern side of this seismic gap (CSG). The Garhwal Himalaya is one of the ideal locations in the Himalaya where all the major Himalayan structures and the Himalayan Seismicity Belt (HSB) can ably be described and studied. In the present study, we present a spatio-temporal analysis of relocated local micro-to-moderate earthquakes recorded by a seismicity monitoring network which has been operational since 2007. The earthquake locations are relocated using the HypoDD (double-difference hypocenter method for earthquake relocation) program. The dataset from July 2007 to September 2015 has been used in this study to estimate spatio-temporal relationships, moment tensor (MT) solutions for earthquakes of M>3.0, stress tensors and their interactions. We have also used composite focal mechanism solutions for small earthquakes. The majority of the MT solutions show thrust-type mechanisms and are located near the mid-crustal ramp (MCR) structure of the detachment surface at 8-15 km depth beneath the outer Lesser Himalaya and Higher Himalaya regions. The prevailing stress has been identified to be compressional towards NNE-SSW, which is the direction of relative plate motion between the Indian and Eurasian continental plates. The low friction coefficient estimated along with the stress inversions

  13. Mexican Earthquakes and Tsunamis Catalog Reviewed

    Science.gov (United States)

    Ramirez-Herrera, M. T.; Castillo-Aja, R.

    2015-12-01

    Today the availability of information on the internet makes online catalogs very easy to access by both scholars and the public in general. The catalog in the "Significant Earthquake Database", managed by the National Centers for Environmental Information (NCEI, formerly NCDC), NOAA, allows access by deploying tabular and cartographic data related to earthquakes and tsunamis contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in places where a disaster occurred and on whether the description has been preserved in documents and oral tradition. In the case of instrumental data, availability depends on the distribution and quality of seismic stations. Therefore, the availability of information for the first half of the 20th century can be improved by careful analysis of the available information and by searching for and resolving inconsistencies. This study shows the advances we have made in upgrading and refining data for the earthquake and tsunami catalog of Mexico from 1500 CE until today, presented in the form of a table and a map. Data analysis allowed us to identify the following sources of error in the location of epicenters in existing catalogs: incorrect coordinate entry; erroneous or mistaken place names; data too general to locate the epicenter, mainly for older earthquakes; and inconsistency between earthquake and tsunami occurrence, e.g., an earthquake epicenter located too far inland reported as tsunamigenic. The process of completing the catalogs depends directly on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and

  14. Earthquake risk assessment of Alexandria, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and from the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in roughly the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that of the second (African continental margin). Running the first scenario, an expected 2.27 % of Alexandria's total constructions (12.9 million, 2006 census) would be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) are at high seismic risk, two districts (Gharb and Wasat) at moderate risk, and two districts (Al-Gomrok and Burg El-Arab) at low seismic risk. Moreover, the building damage estimations show that Al-Montazah is the most vulnerable district, with 73 % of the expected damage reported there. The undertaken analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risk. For instance, more than 90 % of the estimated earthquake risk (building damage) is concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.

  15. Towards the Future "Earthquake" School in the Cloud: Near-real Time Earthquake Games Competition in Taiwan

    Science.gov (United States)

    Chen, K. H.; Liang, W. T.; Wu, Y. F.; Yen, E.

    2014-12-01

    To prevent future threats from natural disasters, it is important to understand how a disaster happened, why lives were lost, and what lessons have been learned. In this way, the attitude of society toward natural disasters can be transformed from training to learning. The citizen-seismologists-in-Taiwan project is designed to elevate the quality of earthquake science education by incorporating earthquake/tsunami stories and a near-real time earthquake games competition into the traditional school curricula. Through pilot courses and professional development workshops, we have worked closely with teachers from elementary, junior high, and senior high schools to design workable teaching plans built around practical operation of seismic monitoring at home or school. We will introduce how 9-year-olds do P- and S-wave picking and measure seismic intensity through an interactive learning platform, how scientists and school teachers work together, and how we create an environment that facilitates continuous learning (i.e., the near-real time earthquake games competition), to make earthquake science fun.

  16. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake

    Science.gov (United States)

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.

    2014-01-01

    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
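
    A minimal sketch of the kind of server-side trigger association described above is given below, assuming each volunteer host reports a station identifier, trigger time and peak acceleration. The 10 s window and 5-station threshold are illustrative values, not the operational QCN settings.

```python
# Sketch of trigger association: declare a detection when enough distinct
# stations trigger within a short time window (parameters are illustrative).
from collections import deque

def associate_triggers(triggers, window_s=10.0, min_stations=5):
    """triggers: iterable of (station_id, trigger_time_s, peak_acceleration)."""
    triggers = sorted(triggers, key=lambda t: t[1])   # sort by trigger time
    recent = deque()
    for station, t, pga in triggers:
        recent.append((station, t, pga))
        while recent and t - recent[0][1] > window_s:  # drop triggers older than the window
            recent.popleft()
        stations = {s for s, _, _ in recent}
        if len(stations) >= min_stations:
            return {"detection_time": t, "stations": sorted(stations)}
    return None
```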

  17. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified "bins" with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
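
    A minimal sketch of the likelihood scoring underlying such tests is shown below, assuming independent Poisson rates per space-magnitude bin (a common convention in the RELM literature); the array names and example numbers are illustrative only.

```python
# Sketch of a joint log-likelihood score for a gridded forecast, assuming
# independent Poisson rates per bin. `forecast` and `observed` are parallel
# arrays of expected and observed event counts per bin.
import numpy as np
from scipy.stats import poisson

def joint_log_likelihood(forecast, observed):
    """Sum of log Poisson probabilities of the observed counts under the forecast rates."""
    forecast = np.asarray(forecast, dtype=float)
    observed = np.asarray(observed, dtype=int)
    return poisson.logpmf(observed, forecast).sum()

# Example: compare two forecasts against the same observed catalog.
obs = [0, 1, 0, 2, 0]
print(joint_log_likelihood([0.1, 0.8, 0.2, 1.5, 0.1], obs))
print(joint_log_likelihood([0.5, 0.5, 0.5, 0.5, 0.5], obs))
```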

  18. Prediction of spectral acceleration response ordinates based on PGA attenuation

    Science.gov (United States)

    Graizer, V.; Kalkan, E.

    2009-01-01

    Developed herein is a new peak ground acceleration (PGA)-based predictive model for 5% damped pseudospectral acceleration (SA) ordinates of the free-field horizontal component of ground motion from shallow-crustal earthquakes. The predictive model of ground motion spectral shape (i.e., the normalized spectrum) is generated as a continuous function of a few parameters. The proposed model eliminates the classical exhaustive matrix of estimator coefficients and provides significant ease of implementation. It is built on the Next Generation Attenuation (NGA) database with a number of additions from recent Californian events, including the 2003 San Simeon and 2004 Parkfield earthquakes. A unique feature of the model is its new functional form explicitly integrating PGA as a scaling factor. The spectral shape model is parameterized within an approximation function using moment magnitude, closest distance to the fault (fault distance) and VS30 (average shear-wave velocity in the upper 30 m) as independent variables. Mean values of its estimator coefficients were computed by fitting an approximation function to the spectral shape of each record using robust nonlinear optimization. The proposed spectral shape model is independent of the PGA attenuation relation, allowing various PGA attenuation relations to be used to estimate the response spectrum of earthquake recordings.
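
    The sketch below illustrates only the model architecture described above, namely SA(T) = PGA x shape(T; M, R, Vs30), so that any PGA attenuation relation can supply the scaling factor. The shape function and its coefficients are placeholders, not the published Graizer-Kalkan functional form.

```python
# Illustrative structure only: the actual spectral-shape function of the paper
# is not reproduced. The point is the architecture SA(T) = PGA * shape(T; M, R, Vs30).
import numpy as np

def spectral_shape(T, mag, r_km, vs30):
    """Placeholder normalized shape (distance and Vs30 dependence omitted here)."""
    T0 = 0.05 * 10 ** (0.3 * (mag - 5.0))            # hypothetical predominant period
    return (1.0 + 2.0 * np.exp(-(np.log(T / T0)) ** 2)) / 3.0

def predict_sa(pga_g, periods, mag, r_km, vs30):
    """Pseudo-spectral acceleration from a PGA estimate and a normalized shape."""
    periods = np.asarray(periods, dtype=float)
    return pga_g * spectral_shape(periods, mag, r_km, vs30)

print(predict_sa(0.25, [0.1, 0.3, 1.0], mag=6.0, r_km=10.0, vs30=360.0))
```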

  19. The use of radon as an earthquake precursor

    International Nuclear Information System (INIS)

    Ramola, R.C.; Singh, M.; Sandhu, A.S.; Singh, S.; Virk, H.S.

    1990-01-01

    Radon monitoring for earthquake prediction is part of an integral approach since the discovery of coherent and time anomalous radon concentrations prior to, during and after the 1966 Tashkent earthquake. In this paper some studies of groundwater and soil gas radon content in relation to earthquake activities are reviewed. Laboratory experiments and the development of groundwater and soil gas radon monitoring systems are described. In addition, radon monitoring studies conducted at the Guru Nanak Dev University Campus since 1986 are presented in detail. During these studies some anomalous changes in radon concentration were recorded before earthquakes occurred in the region. The anomalous radon increases are independent of meteorological conditions and appear to be caused by strain changes, which precede the earthquake. Anomalous changes in radon concentration before an earthquake suggest that radon monitoring can serve as an additional technique in the earthquake prediction programme in India. (author)

  20. Modeling fast and slow earthquakes at various scales.

    Science.gov (United States)

    Ide, Satoshi

    2014-01-01

    Earthquake sources represent dynamic rupture within rocky materials at depth and often can be modeled as propagating shear slip controlled by friction laws. These laws provide boundary conditions on fault planes embedded in elastic media. Recent developments in observation networks, laboratory experiments, and methods of data analysis have expanded our knowledge of the physics of earthquakes. Newly discovered slow earthquakes are qualitatively different phenomena from ordinary fast earthquakes and provide independent information on slow deformation at depth. Many numerical simulations have been carried out to model both fast and slow earthquakes, but problems remain, especially with scaling laws. Some mechanisms are required to explain the power-law nature of earthquake rupture and the lack of characteristic length. Conceptual models that include a hierarchical structure over a wide range of scales would be helpful for characterizing diverse behavior in different seismic regions and for improving probabilistic forecasts of earthquakes.
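
    As an example of the friction-law boundary conditions mentioned above, a standard Dieterich-Ruina rate-and-state formulation with the aging law is sketched below. It is representative of laws commonly used in such simulations, not necessarily the specific law of the paper, and the parameter values are illustrative.

```python
# Dieterich-Ruina rate-and-state friction with the aging law (illustrative parameters).
import numpy as np

def rs_friction(v, theta, mu0=0.6, a=0.010, b=0.015, v0=1e-6, dc=1e-2):
    """Friction coefficient for slip rate v [m/s] and state variable theta [s]."""
    return mu0 + a * np.log(v / v0) + b * np.log(v0 * theta / dc)

def state_rate(v, theta, dc=1e-2):
    """Aging-law state evolution: d(theta)/dt = 1 - v * theta / dc."""
    return 1.0 - v * theta / dc

# Steady state at v = v0 gives theta = dc / v0 and friction mu0 = 0.6.
print(rs_friction(1e-6, 1e-2 / 1e-6))
```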

  1. Geodetic Finite-Fault-based Earthquake Early Warning Performance for Great Earthquakes Worldwide

    Science.gov (United States)

    Ruhl, C. J.; Melgar, D.; Grapenthin, R.; Allen, R. M.

    2017-12-01

    GNSS-based earthquake early warning (EEW) algorithms estimate fault finiteness and unsaturated moment magnitude for the largest, most damaging earthquakes. Because large events are infrequent, the algorithms are not regularly exercised and are insufficiently tested on the few available datasets. The Geodetic Alarm System (G-larmS) is a GNSS-based finite-fault algorithm developed as part of the ShakeAlert EEW system in the western US. Performance evaluations using synthetic earthquakes offshore Cascadia showed that G-larmS satisfactorily recovers magnitude and fault length, providing useful alerts 30-40 s after origin time and timely warnings of ground motion for onshore urban areas. An end-to-end test of the ShakeAlert system demonstrated the need for GNSS data to accurately estimate ground motions in real time. We replay real data from several subduction-zone earthquakes worldwide to demonstrate the value of GNSS-based EEW for the largest, most damaging events. We compare peak ground accelerations (PGA) predicted from the first-alert solutions with those recorded in major urban areas. In addition, where applicable, we compare observed tsunami heights to those predicted from the G-larmS solutions. We show that finite-fault inversion based on GNSS data is essential to achieving the goals of EEW.
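
    The unsaturated moment magnitude referred to above follows from standard seismic-moment bookkeeping once a slip distribution is available. The sketch below shows that bookkeeping generically and is not G-larmS code; the rigidity and patch geometry are illustrative assumptions.

```python
# Mw from a slip model: M0 = mu * sum(A_i * s_i), Mw = (2/3) * (log10(M0) - 9.1).
import numpy as np

def moment_magnitude(slip_m, patch_area_m2, rigidity_pa=3.0e10):
    """Moment magnitude from per-patch slip [m] and patch areas [m^2]."""
    m0 = rigidity_pa * np.sum(np.asarray(slip_m) * np.asarray(patch_area_m2))
    return (2.0 / 3.0) * (np.log10(m0) - 9.1)

# Example: 200 patches of 20 km x 20 km with 5 m slip each -> Mw ~ 8.7
print(moment_magnitude([5.0] * 200, [20e3 * 20e3] * 200))
```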

  2. Safety and survival in an earthquake

    Science.gov (United States)

    ,

    1969-01-01

    Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations.

  3. Low cost earthquake resistant ferrocement small house

    International Nuclear Information System (INIS)

    Saleem, M.A.; Ashraf, M.; Ashraf, M.

    2008-01-01

    The greatest humanitarian challenge faced even today, one year after the Kashmir (Hazara) earthquake, is that of providing shelter. Currently, one in seven people on the globe lives in a slum or refugee camp. The earthquake of October 2005 resulted in a great loss of life and property. This research work is mainly focused on developing a design for a small, low-cost and earthquake-resistant house. Ferrocement panels are recommended as the main structural elements, with a lightweight truss roofing system. Earthquake resistance is ensured by analyzing the structure in ETABS for seismic zone 4. The behavior of the structure is found to be satisfactory under earthquake loading. An estimate of cost is also presented, which shows that it is an economical solution. (author)

  4. Geological and historical evidence of irregular recurrent earthquakes in Japan.

    Science.gov (United States)

    Satake, Kenji

    2015-10-28

    Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres releasing strains accumulated from decades to centuries of plate motions. Assuming a simple 'characteristic earthquake' model that similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes including geological traces from giant (M∼9) earthquakes indicate a variety of size and recurrence interval of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that average recurrence interval of great earthquakes is approximately 100 years, but the tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake with an interval of approximately 600 years is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence interval of a few hundred years and a few thousand years had been recognized, but studies show that the recent three Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for the long-term forecast, and several attempts such as use of geological data for the evaluation of future earthquake probabilities or the estimation of maximum earthquake size in each subduction zone are being conducted by government committees. © 2015 The Author(s).
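
    For readers unfamiliar with how a 'characteristic earthquake' model turns a recurrence interval into a probability, the sketch below shows a generic conditional-probability calculation with a lognormal renewal distribution. The distribution choice and parameter values are illustrative, not those used by the government committee.

```python
# Conditional probability of rupture in the next t_window years, given t_elapsed
# years of quiescence, under a lognormal renewal model (illustrative parameters).
from scipy.stats import lognorm

def conditional_probability(t_elapsed, t_window, median_interval, aleatory_sigma=0.4):
    """P(next event within t_window | no event for t_elapsed years)."""
    dist = lognorm(s=aleatory_sigma, scale=median_interval)
    survive = dist.sf(t_elapsed)
    return (dist.cdf(t_elapsed + t_window) - dist.cdf(t_elapsed)) / survive

# e.g. median recurrence 100 yr, 80 yr elapsed, probability within the next 30 yr
print(conditional_probability(80.0, 30.0, 100.0))
```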

  5. Designing an Earthquake-Resistant Building

    Science.gov (United States)

    English, Lyn D.; King, Donna T.

    2016-01-01

    How do cross-bracing, geometry, and base isolation help buildings withstand earthquakes? These important structural design features involve fundamental geometry that elementary school students can readily model and understand. The problem activity, Designing an Earthquake-Resistant Building, was undertaken by several classes of sixth-grade…

  6. The HayWired Earthquake Scenario

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    ForewordThe 1906 Great San Francisco earthquake (magnitude 7.8) and the 1989 Loma Prieta earthquake (magnitude 6.9) each motivated residents of the San Francisco Bay region to build countermeasures to earthquakes into the fabric of the region. Since Loma Prieta, bay-region communities, governments, and utilities have invested tens of billions of dollars in seismic upgrades and retrofits and replacements of older buildings and infrastructure. Innovation and state-of-the-art engineering, informed by science, including novel seismic-hazard assessments, have been applied to the challenge of increasing seismic resilience throughout the bay region. However, as long as people live and work in seismically vulnerable buildings or rely on seismically vulnerable transportation and utilities, more work remains to be done.With that in mind, the U.S. Geological Survey (USGS) and its partners developed the HayWired scenario as a tool to enable further actions that can change the outcome when the next major earthquake strikes. By illuminating the likely impacts to the present-day built environment, well-constructed scenarios can and have spurred officials and citizens to take steps that change the outcomes the scenario describes, whether used to guide more realistic response and recovery exercises or to launch mitigation measures that will reduce future risk.The HayWired scenario is the latest in a series of like-minded efforts to bring a special focus onto the impacts that could occur when the Hayward Fault again ruptures through the east side of the San Francisco Bay region as it last did in 1868. Cities in the east bay along the Richmond, Oakland, and Fremont corridor would be hit hardest by earthquake ground shaking, surface fault rupture, aftershocks, and fault afterslip, but the impacts would reach throughout the bay region and far beyond. The HayWired scenario name reflects our increased reliance on the Internet and telecommunications and also alludes to the

  7. Design basis earthquakes for critical industrial facilities and their characteristics, and the Southern Hyogo prefecture earthquake, 17 January 1995

    Energy Technology Data Exchange (ETDEWEB)

    Shibata, Heki

    1998-12-01

    This paper deals with how to establish the concept of the design basis earthquake (DBE) for critical industrial facilities such as nuclear power plants in consideration of disasters such as the Southern Hyogo prefecture earthquake, the so-called Kobe earthquake of 1995. The author once discussed various DBEs at the 7th World Conference on Earthquake Engineering. At that time, the author assumed that the strongest effective PGA would be 0.7 g, and compared the values of structural accelerations obtained by various codes in Japan and other countries. The maximum PGA observed instrumentally in the Southern Hyogo prefecture earthquake of 1995 exceeded the author's previous assumption, even though the results of the previous paper had been pessimistic. Based on the experience of the Kobe event, the author points out the necessity of a third design earthquake, Ss, in addition to the S1 and S2 of previous DBEs.

  8. The threat of silent earthquakes

    Science.gov (United States)

    Cervelli, Peter

    2004-01-01

    Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise seem ready to snap.

  9. Napa earthquake: An earthquake in a highly connected world

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

    The Napa earthquake recently occurred close to Silicon Valley. This makes it a good candidate for studying what social networks, wearable devices and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the ratio of people publishing tweets and the ratio of people visiting the EMSC (European Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves and that 2 minutes of website traffic is sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare these with the publication times of messages on Twitter. Finally, we check whether the numbers of tweets and of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together these results will tell us whether the reaction of eyewitnesses to ground shaking as observed through Twitter and the EMSC website analysis is tool-specific (i.e. specific to Twitter or the EMSC website) or whether it reflects people's actual reactions.

  10. Electromagnetic Manifestation of Earthquakes

    OpenAIRE

    Uvarov Vladimir

    2017-01-01

    In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.

  11. Tidal controls on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, S.; Yabe, S.; Tanaka, Y.

    2016-12-01

    The possibility that tidal stresses can trigger earthquakes is a long-standing issue in seismology. Except in some special cases, a causal relationship between seismicity and the phase of tidal stress has been rejected on the basis of studies using many small events. However, recently discovered deep tectonic tremors are highly sensitive to tidal stress levels, with the relationship being governed by a nonlinear law according to which the tremor rate increases exponentially with increasing stress; thus, slow deformation (and the probability of earthquakes) may be enhanced during periods of large tidal stress. Here, we show the influence of tidal stress on seismicity by calculating histories of tidal shear stress during the 2-week period before earthquakes. Very large earthquakes tend to occur near the time of maximum tidal stress, but this tendency is not obvious for small earthquakes. Rather, we found that tidal stress controls the earthquake size-frequency statistics; i.e., the fraction of large events increases (i.e. the b-value of the Gutenberg-Richter relation decreases) as the tidal shear stress increases. This correlation is apparent in data from the global catalog and in relatively homogeneous regional catalogues of earthquakes in Japan. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. Our findings indicate that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. This finding has clear implications for probabilistic earthquake forecasting.
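
    The b-value variations discussed above are conventionally measured with Aki's maximum-likelihood estimator; a minimal version (without binning corrections) is sketched below, with the example magnitudes being illustrative.

```python
# Aki's maximum-likelihood b-value estimator: b = log10(e) / (mean(M) - Mc).
import numpy as np

def b_value(magnitudes, completeness_mag):
    """b-value for events at or above the completeness magnitude Mc."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= completeness_mag]
    return np.log10(np.e) / (m.mean() - completeness_mag)

# Comparing b-values between high- and low-tidal-stress subsets of a catalog
# would follow the approach described above; the magnitudes here are made up.
print(b_value([2.1, 2.4, 2.0, 3.3, 2.2, 2.8, 2.05, 2.6], completeness_mag=2.0))
```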

  12. Earthquake-triggered landslides in southwest China

    OpenAIRE

    X. L. Chen; Q. Zhou; H. Ran; R. Dong

    2012-01-01

    Southwest China is located on the southeastern margin of the Tibetan Plateau and is a region of high seismic activity. Historically, strong earthquakes occurring here have usually generated numerous landslides and caused destructive damage. This paper introduces several earthquake-triggered landslide events in this region and describes their characteristics. Also, the historical data of earthquakes with a magnitude of 7.0 or greater, having occurred in this region, is col...

  13. Marmara Island earthquakes, of 1265 and 1935; Turkey

    Directory of Open Access Journals (Sweden)

    Y. Altınok

    2006-01-01

    Full Text Available The long-term seismicity of the Marmara Sea region in northwestern Turkey is relatively well recorded. Some large and some of the smaller events are clearly associated with fault zones known to be seismically active, which have distinct morphological expressions and have generated damaging earthquakes both before and since. Some less common, moderate-size earthquakes have occurred in the vicinity of the Marmara Islands in the western Marmara Sea. This paper presents an extended summary of the most important of these earthquakes, which occurred in 1265 and 1935 and have since been known as the Marmara Island earthquakes. The data and the approaches used therefore have the potential to document earthquake ruptures of fault segments and may extend the earthquake record far beyond known history, including rock falls and abnormal sea waves observed during these events, thus improving hazard evaluations and the fundamental understanding of the earthquake process.

  14. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  15. Distribution of incremental static stress caused by earthquakes

    Directory of Open Access Journals (Sweden)

    Y. Y. Kagan

    1994-01-01

    Full Text Available Theoretical calculations, simulations and measurements of rotation of earthquake focal mechanisms suggest that the stress in earthquake focal zones follows the Cauchy distribution, which is one of the stable probability distributions (with the value of the exponent α equal to 1). We review the properties of the stable distributions and show that the Cauchy distribution is expected to approximate the stress caused by earthquakes occurring over geologically long intervals of fault zone development. However, the stress caused by recent earthquakes recorded in instrumental catalogues should follow symmetric stable distributions with the value of α significantly less than one. This is explained by a fractal distribution of earthquake hypocentres: the dimension of a hypocentre set, δ, is close to zero for short-term earthquake catalogues and asymptotically approaches 2¼ for long time intervals. We use the Harvard catalogue of seismic moment tensor solutions to investigate the distribution of incremental static stress caused by earthquakes. The stress measured in the focal zone of each event is approximated by stable distributions. In agreement with theoretical considerations, the exponent value of the distribution approaches zero as the time span of the earthquake catalogue (ΔT) decreases. For large stress values α increases. We surmise that this is caused by the increase of δ for small inter-earthquake distances due to location errors.
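
    To illustrate the heavy tails that distinguish the stable family discussed above, the sketch below samples symmetric stable variates for several values of the exponent α, including the Cauchy case α = 1; it is a generic illustration, not a reproduction of the paper's analysis.

```python
# Symmetric stable samples (beta = 0); tails grow heavier as alpha decreases.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)
for alpha in (0.5, 1.0, 1.5):                       # 1.0 is the Cauchy case
    x = levy_stable.rvs(alpha, 0.0, size=5000, random_state=rng)
    print(alpha, np.percentile(np.abs(x), 99))      # 99th percentile of |x|
```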

  16. Summary of earthquake experience database

    International Nuclear Information System (INIS)

    1999-01-01

    Strong-motion earthquakes frequently occur throughout the Pacific Basin, where power plants or industrial facilities are included in the affected areas. By studying the performance of these earthquake-affected (or database) facilities, a large inventory of various types of equipment installations can be compiled that have experienced substantial seismic motion. The primary purposes of the seismic experience database are summarized as follows: to determine the most common sources of seismic damage, or adverse effects, on equipment installations typical of industrial facilities; to determine the thresholds of seismic motion corresponding to various types of seismic damage; to determine the general performance of equipment during earthquakes, regardless of the levels of seismic motion; to determine minimum standards in equipment construction and installation, based on past experience, to assure the ability to withstand anticipated seismic loads. To summarize, the primary assumption in compiling an experience database is that the actual seismic hazard to industrial installations is best demonstrated by the performance of similar installations in past earthquakes

  17. A new way of telling earthquake stories: MOBEE - the MOBile Earthquake Exhibition

    Science.gov (United States)

    Tataru, Dragos; Toma-Danila, Dragos; Nastase, Eduard

    2016-04-01

    In the last decades, the demand for and acknowledged importance of science outreach in general, and geophysics outreach in particular, has grown, as demonstrated by many international and national projects and other activities performed by research institutes. The National Institute for Earth Physics (NIEP) of Romania is the leading national institution for earthquake monitoring and research, with a declared focus on informing and educating a wide audience about geosciences and especially seismology. This is more than welcome, since Romania is a very active country from a seismological point of view, but not very proactive when it comes to diminishing the possible effects of a major earthquake. Over the last few decades, the country has experienced several major earthquakes which have claimed thousands of lives and millions in property damage (the 1940, 1977, 1986 and 1990 Vrancea earthquakes). In this context, during a partnership started in 2014 together with the National Art University and the Siveco IT company, a group of researchers from NIEP initiated the MOBile Earthquake Exhibition (MOBEE) project. The main goal was to design a portable museum to take educational activities focused on seismology, seismic hazard and Earth science on the road. The exhibition is mainly aimed at school students of all ages, as it explains the main topics of geophysics through a unique combination of posters, digital animations and apps, large markets and exciting hands-on experiments, 3D printed models and posters. This project is singular in Romania and aims to transmit properly reviewed, current information regarding the definition of earthquakes, the way natural hazards can affect people, buildings and the environment, and the measures to be taken to mitigate their aftermath. Many of the presented concepts can be used by teachers as a complementary way of demonstrating physics facts and concepts and explaining processes that shape the dynamic features of the Earth. It also involves

  18. Ecological implications of extreme events: footprints of the 2010 earthquake along the Chilean coast.

    Directory of Open Access Journals (Sweden)

    Eduardo Jaramillo

    Full Text Available Deciphering the ecological effects of major catastrophic events such as earthquakes, tsunamis, volcanic eruptions, storms and fires requires rapid interdisciplinary efforts, often hampered by a lack of pre-event data. Using results of intertidal surveys conducted shortly before and immediately after Chile's 2010 Mw 8.8 earthquake along the entire rupture zone (ca. 34-38°S), we provide the first quantification of earthquake and tsunami effects on sandy beach ecosystems. Our study incorporated anthropogenic coastal development as a key design factor. Ecological responses of beach ecosystems were strongly affected by the magnitude of land-level change. Along the northern rupture segment, subsidence combined with tsunami-associated disturbance drowned beaches. In contrast, along the co-seismically uplifted southern rupture, beaches widened and flattened, increasing habitat availability. Post-event changes in the abundance and distribution of mobile intertidal invertebrates were not uniform, varying with land-level change, tsunami height and coastal development. On beaches where subsidence occurred, intertidal zones and their associated species disappeared. On some beaches, uplift of rocky subtidal substrate eliminated low-intertidal sand beach habitat for ecologically important species. On others, unexpected interactions of uplift with man-made coastal armouring included restoration of upper- and mid-intertidal habitat seaward of the armouring, followed by rapid colonization by mobile crustaceans typical of these zones, formerly excluded by constraints imposed by the armouring structures. Responses of coastal ecosystems to major earthquakes appear to vary strongly with land-level change, the mobility of the biota and shore type. Our results show that interactions of extreme events with human-altered shorelines can produce surprising ecological outcomes, and suggest these complex responses to landscape alteration can leave lasting footprints in coastal

  19. Clinical characteristics of patients with seizure following the 2016 Kumamoto earthquake.

    Science.gov (United States)

    Inatomi, Yuichiro; Nakajima, Makoto; Yonehara, Toshiro; Ando, Yukio

    2017-06-01

    To investigate the clinical characteristics of patients with seizure following the 2016 Kumamoto earthquake. We retrospectively studied patients with seizure admitted to our hospital for 12 weeks following the earthquake. We compared the clinical backgrounds and characteristics of the patients: before (the same period of the previous 3 years) and after the earthquake; and in the early (first 2 weeks) and late (subsequent 10 weeks) phases. A total of 60 patients with seizure were admitted to the emergency room after the earthquake, while 175 (58.3/year) patients had been admitted before the earthquake. Of these, 35 patients with seizure were hospitalized in the Department of Neurology after the earthquake, compared with 96 (32/year) patients before the earthquake. In patients after the earthquake, males and non-cerebrovascular diseases as the epileptogenic disease were seen more frequently than before the earthquake. During the early phase after the earthquake, female, first-attack, and non-focal-type patients were seen more frequently than during the late phase. These characteristics of patients with seizure during the early phase after the earthquake suggest that many patients had non-epileptic seizures. To prevent seizures following earthquakes, the mental stress and physical status of evacuees must be assessed. Copyright © 2017. Published by Elsevier Ltd.

  20. Small discussion of electromagnetic wave anomalies preceding earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    Six brief pieces on various aspects of electromagnetic wave anomalies are presented. They cover: earthquake electromagnetic emanations; the use of magnetic induction information for earthquake forecasting; electromagnetic pulse emissions as pre-earthquake indicators; the use of magnetic sensors to determine medium-wavelength field strength for earthquake prediction purposes; magnetic deviation indicators inside reinforced-concrete buildings; and a discussion of the general physical principles involved.

  1. Ionospheric precursors to large earthquakes: A case study of the 2011 Japanese Tohoku Earthquake

    Science.gov (United States)

    Carter, B. A.; Kellerman, A. C.; Kane, T. A.; Dyson, P. L.; Norman, R.; Zhang, K.

    2013-09-01

    Researchers have reported ionospheric electron distribution abnormalities, such as electron density enhancements and/or depletions, that they claimed were related to forthcoming earthquakes. In this study, the Tohoku earthquake is examined using ionosonde data to establish whether any otherwise unexplained ionospheric anomalies were detected in the days and hours prior to the event. As the choices for the ionospheric baseline are generally different between previous works, three separate baselines for the peak plasma frequency of the F2 layer, foF2, are employed here; the running 30-day median (commonly used in other works), the International Reference Ionosphere (IRI) model and the Thermosphere Ionosphere Electrodynamic General Circulation Model (TIE-GCM). It is demonstrated that the classification of an ionospheric perturbation is heavily reliant on the baseline used, with the 30-day median, the IRI and the TIE-GCM generally underestimating, approximately describing and overestimating the measured foF2, respectively, in the 1-month period leading up to the earthquake. A detailed analysis of the ionospheric variability in the 3 days before the earthquake is then undertaken, where a simultaneous increase in foF2 and the Es layer peak plasma frequency, foEs, relative to the 30-day median was observed within 1 h before the earthquake. A statistical search for similar simultaneous foF2 and foEs increases in 6 years of data revealed that this feature has been observed on many other occasions without related seismic activity. Therefore, it is concluded that one cannot confidently use this type of ionospheric perturbation to predict an impending earthquake. It is suggested that in order to achieve significant progress in our understanding of seismo-ionospheric coupling, better account must be taken of other known sources of ionospheric variability in addition to solar and geomagnetic activity, such as the thermospheric coupling.
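
    A minimal sketch of the simplest baseline discussed above, a running 30-day median of foF2 with anomalies flagged by a fractional deviation threshold, is given below; the window length, threshold and data layout are illustrative assumptions.

```python
# Running 30-day median baseline for daily foF2 values (at a fixed local time),
# with anomalies flagged when the fractional deviation exceeds a threshold.
import numpy as np

def running_median_baseline(fof2, window_days=30):
    """Trailing-window median for each day (NaN until a full window exists)."""
    x = np.asarray(fof2, dtype=float)
    out = np.full_like(x, np.nan)
    for i in range(window_days, len(x)):
        out[i] = np.nanmedian(x[i - window_days:i])
    return out

def flag_anomalies(fof2, threshold_frac=0.25):
    """Indices of days whose foF2 departs from the running median by > threshold_frac."""
    x = np.asarray(fof2, dtype=float)
    base = running_median_baseline(x)
    with np.errstate(invalid="ignore"):
        dev = (x - base) / base
    return np.where(np.abs(dev) > threshold_frac)[0]
```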

  2. Investigating Landslides Caused by Earthquakes A Historical Review

    Science.gov (United States)

    Keefer, David K.

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  3. Aseismic blocks and destructive earthquakes in the Aegean

    Science.gov (United States)

    Stiros, Stathis

    2017-04-01

    Aseismic areas are identified not only in vast, geologically stable regions, but also within regions of active, intense, distributed deformation such as the Aegean. In the latter, "aseismic blocks" about 200 km across were recognized in the 1990s on the basis of the absence of instrumentally derived earthquake foci, in contrast to surrounding areas. This pattern was supported by the available historical seismicity data, as well as by geologic evidence. Interestingly, GPS evidence indicates that such blocks are among the areas characterized by small deformation rates relative to surrounding areas of higher deformation. Still, the largest and most destructive earthquake of the 1990s, the 1995 M6.6 earthquake, occurred at the center of one of these "aseismic" zones in northern Greece, an area left unprotected against seismic hazard. This case was indeed a repeat of the tsunami-associated 1956 Amorgos Island M7.4 earthquake, the largest 20th-century event in the Aegean back-arc region: the 1956 earthquake occurred at the center of a geologically distinct region (the Cyclades Massif in the central Aegean), till then assumed aseismic. Interestingly, after 1956 the overall idea of aseismic regions remained valid, though a "promontory" of earthquake-prone areas intruding into the aseismic central Aegean was assumed. Exploitation of archaeological excavation evidence and careful, combined analysis of historical, archaeological and other palaeoseismic, mostly coastal, data indicated that destructive and major earthquakes have left their traces in previously assumed aseismic blocks. In the latter, earthquakes typically occur with relatively long recurrence intervals (>200-300 years), much longer than in adjacent active areas. Interestingly, areas assumed aseismic in antiquity are among the most active in the last centuries, while areas hit by major earthquakes in the past are usually classified as areas of low seismic risk in official maps. Some reasons

  4. Tsunami evacuation plans for future megathrust earthquakes in Padang, Indonesia, considering stochastic earthquake scenarios

    Directory of Open Access Journals (Sweden)

    A. Muhammad

    2017-12-01

    Full Text Available This study develops tsunami evacuation plans for Padang, Indonesia, using a stochastic tsunami simulation method. The stochastic results are based on multiple earthquake scenarios for different magnitudes (Mw 8.5, 8.75, and 9.0) that reflect asperity characteristics of the 1797 historical event in the same region. The generation of the earthquake scenarios involves probabilistic models of earthquake source parameters and stochastic synthesis of earthquake slip distributions. In total, 300 source models are generated to produce comprehensive tsunami evacuation plans for Padang. The tsunami hazard assessment results show that Padang may face significant tsunamis, with maximum tsunami wave height and inundation depth of 15 and 10 m, respectively. A comprehensive tsunami evacuation plan – including horizontal evacuation area maps, assessment of temporary shelters considering the impact of ground shaking and tsunami, and integrated horizontal–vertical evacuation time maps – has been developed based on the stochastic tsunami simulation results. The developed evacuation plans highlight that comprehensive mitigation policies can be produced from stochastic tsunami simulation for future tsunamigenic events.
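
    One ingredient of such stochastic scenario generation is converting a sampled magnitude into a seismic moment, fault area and mean slip. The sketch below shows that step with a generic, placeholder magnitude-area scaling; it is not the probabilistic source model used in the study.

```python
# Convert a scenario magnitude to moment, area and mean slip. The magnitude-area
# scaling coefficients are placeholders for illustration only.
import numpy as np

rng = np.random.default_rng(1)

def scenario(mw, rigidity_pa=3.0e10):
    m0 = 10 ** (1.5 * mw + 9.1)                      # seismic moment [N m]
    area_km2 = 10 ** (mw - 4.0)                      # placeholder scaling Mw ~ log10(A) + 4
    mean_slip_m = m0 / (rigidity_pa * area_km2 * 1e6)
    return {"Mw": float(mw), "area_km2": area_km2, "mean_slip_m": round(mean_slip_m, 2)}

for mw in rng.choice([8.5, 8.75, 9.0], size=3):
    print(scenario(mw))
```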

  5. POST Earthquake Debris Management — AN Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction works, can place significant demands on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also subsequently impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as materials for construction or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, which takes into account the different criteria related to the operation execution, is proposed by highlighting the key issues concerning the handling of the construction

  6. A Decade of Giant Earthquakes - What does it mean?

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Terry C. Jr. [Los Alamos National Laboratory

    2012-07-16

    On December 26, 2004 the largest earthquake since 1964 occurred near Aceh, Indonesia. The magnitude 9.2 earthquake and subsequent tsunami killed a quarter of a million people; it also marked the beginning of a period of extraordinary seismicity. Since the Aceh earthquake there have been 16 magnitude 8 earthquakes globally, including 2 this past April. For the 100 years previous to 2004 there was an average of 1 magnitude 8 earthquake every 2.2 years; since 2004 there have been 2 per year. Since magnitude 8 earthquakes dominate global seismic energy release, this period of seismicity has seismologists rethinking what they understand about plate tectonics and the connectivity between giant earthquakes. This talk will explore this remarkable period of time and its possible implications.

  7. Electromagnetic Manifestation of Earthquakes

    Directory of Open Access Journals (Sweden)

    Uvarov Vladimir

    2017-01-01

    Full Text Available In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.

  8. Relationship of heat and cold to earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y.

    1980-06-26

    An analysis of 54 earthquakes of magnitude 7 and above, including 13 of magnitude 8 and above, between 780 BC and the present, shows that the vast majority of them fell in the four major cool periods during this time span, or on the boundaries of these periods. Between 1800 and 1876, four periods of earthquake activity in China can be recognized, and these tend to correspond to relatively cold periods over that time span. An analysis of earthquakes of magnitude 6 or above over the period 1951 to 1965 gives the following results: earthquakes in north and southwest China tended to occur when the preceding year had an above-average annual temperature and winter temperature; in the northeast they tended to occur in a year after a year with an above-average winter temperature; in the northwest there was also a connection with a preceding warm winter, but to a less pronounced degree. The few earthquakes in South China seemed to follow cold winters. Both the Tangshan and Yongshan Pass earthquakes were preceded by unusually warm years and relatively high winter temperatures.

  9. Studies of the subsurface effects of earthquakes

    International Nuclear Information System (INIS)

    Marine, I.W.

    1980-01-01

    As part of the National Terminal Waste Storage Program, the Savannah River Laboratory is conducting a series of studies on the subsurface effects of earthquakes. This report summarizes three subcontracted studies. (1) Earthquake damage to underground facilities: the purpose of this study was to document damage and nondamage caused by earthquakes to tunnels and shallow underground openings; to mines and other deep openings; and to wells, shafts, and other vertical facilities. (2) Earthquake related displacement fields near underground facilities: the study included an analysis of block motion, an analysis of the dependence of displacement on the orientation and distance of joints from the earthquake source, and displacement related to distance and depth near a causative fault as a result of various shapes, depths, and senses of movement on the causative fault. (3) Numerical simulation of earthquake effects on tunnels for generic nuclear waste repositories: the objective of this study was to use numerical modeling to determine under what conditions seismic waves might cause instability of an underground opening or create fracturing that would increase the permeability of the rock mass

  10. Earthquake risk assessment of building structures

    International Nuclear Information System (INIS)

    Ellingwood, Bruce R.

    2001-01-01

    During the past two decades, probabilistic risk analysis tools have been applied to assess the performance of new and existing building structural systems. Structural design and evaluation of buildings and other facilities with regard to their ability to withstand the effects of earthquakes requires special considerations that are not normally a part of such evaluations for other occupancy, service and environmental loads. This paper reviews some of these special considerations, specifically as they pertain to probability-based codified design and reliability-based condition assessment of existing buildings. Difficulties experienced in implementing probability-based limit states design criteria for earthquake are summarized. Comparisons of predicted and observed building damage highlight the limitations of using current deterministic approaches for post-earthquake building condition assessment. The importance of inherent randomness and modeling uncertainty in forecasting building performance is examined through a building fragility assessment of a steel frame with welded connections that was damaged during the Northridge Earthquake of 1994. The prospects for future improvements in earthquake-resistant design procedures based on a more rational probability-based treatment of uncertainty are examined
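
    Building fragility assessments of the kind mentioned above are commonly expressed as lognormal fragility curves; the sketch below shows the generic form, with the median capacity and dispersion values as illustrative assumptions rather than values from the Northridge case study.

```python
# Generic lognormal fragility curve: P(damage state exceeded | spectral acceleration).
from math import log
from scipy.stats import norm

def damage_probability(sa_g, median_capacity_g=0.9, beta=0.5):
    """Exceedance probability for demand sa_g [g], median capacity and dispersion beta."""
    return norm.cdf(log(sa_g / median_capacity_g) / beta)

for sa in (0.3, 0.6, 0.9, 1.5):
    print(sa, round(damage_probability(sa), 3))
```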

  11. Evaluation of Earthquake-Induced Effects on Neighbouring Faults and Volcanoes: Application to the 2016 Pedernales Earthquake

    Science.gov (United States)

    Bejar, M.; Alvarez Gomez, J. A.; Staller, A.; Luna, M. P.; Perez Lopez, R.; Monserrat, O.; Chunga, K.; Herrera, G.; Jordá, L.; Lima, A.; Martínez-Díaz, J. J.

    2017-12-01

    It has long been recognized that earthquakes change the stress in the upper crust around the fault rupture and can influence the short-term behaviour of neighbouring faults and volcanoes. Rapid estimates of these stress changes can provide the authorities managing the post-disaster situation with a useful tool to identify and monitor potential threats and to update the estimates of seismic and volcanic hazard in a region. Space geodesy is now routinely used following an earthquake to image the displacement of the ground and estimate the rupture geometry and the distribution of slip. Using the obtained source model, it is possible to evaluate the remaining moment deficit and to infer the stress changes on nearby faults and volcanoes produced by the earthquake, which can be used to identify which faults and volcanoes are brought closer to failure or activation. Although these procedures are commonly used today, the transfer of these results to the authorities managing the post-disaster situation is not straightforward, and thus their usefulness is reduced in practice. Here we propose a methodology to evaluate the potential influence of an earthquake on nearby faults and volcanoes and create easy-to-understand maps for decision-making support after an earthquake. We apply this methodology to the Mw 7.8, 2016 Ecuador earthquake. Using Sentinel-1 SAR and continuous GPS data, we measure the coseismic ground deformation and estimate the distribution of slip. Then we use this model to evaluate the moment deficit on the subduction interface and the changes of stress on the surrounding faults and volcanoes. The results are compared with the seismic and volcanic events that have occurred after the earthquake. We discuss the potential and limits of the methodology and the lessons learnt from discussion with local authorities.
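
    The stress changes referred to above are usually summarized with the Coulomb failure stress change resolved on a receiver fault, dCFS = d(tau) - mu' * d(sigma_n). The sketch below shows only this final bookkeeping step, since computing the stress components from a slip model requires elastic dislocation Green's functions (e.g. Okada-type), which are omitted here; the numbers are illustrative.

```python
# Coulomb failure stress change on a receiver fault, compression positive.
def coulomb_stress_change(d_shear_mpa, d_normal_mpa, effective_friction=0.4):
    """dCFS in MPa; positive values bring the receiver fault closer to failure."""
    return d_shear_mpa - effective_friction * d_normal_mpa

print(coulomb_stress_change(0.15, 0.10))   # +0.11 MPa -> loaded toward failure
```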

  12. Soil structure interactions of eastern U.S. type earthquakes

    International Nuclear Information System (INIS)

    Chang Chen; Serhan, S.

    1991-01-01

    Two types of earthquakes have occurred in the eastern US in the past. One was the infrequent major event such as the 1811-1812 New Madrid earthquakes or the 1886 Charleston earthquake. The other was the frequent shallow earthquake with high frequency content, short duration and high accelerations. Two eastern US nuclear power plants, V.C. Summer and Perry, went through extensive licensing efforts to obtain fuel load licenses after this type of earthquake was recorded on site and exceeded the design bases in the region beyond 10 hertz. This paper discusses the soil-structure interactions for the latter type of earthquake.

  13. Has El Salvador Fault Zone produced M ≥ 7.0 earthquakes? The 1719 El Salvador earthquake

    Science.gov (United States)

    Canora, C.; Martínez-Díaz, J.; Álvarez-Gómez, J.; Villamor, P.; Ínsua-Arévalo, J.; Alonso-Henar, J.; Capote, R.

    2013-05-01

    Historically, large earthquakes, Mw ≥ 7.0, in the El Salvador area have been attributed to activity in the Cocos-Caribbean subduction zone. This is correct for most of the earthquakes of magnitude greater than 6.5. However, recent paleoseismic evidence points to the existence of large earthquakes associated with rupture of the El Salvador Fault Zone, an E-W oriented strike-slip fault system that extends for 150 km through central El Salvador. To calibrate our results from paleoseismic studies, we have analyzed the historical seismicity of the area. In particular, we suggest that the 1719 earthquake can be associated with paleoseismic activity evidenced in the El Salvador Fault Zone. A reinterpreted isoseismal map for this event suggests that the damage reported could have been a consequence of rupture of the El Salvador Fault Zone, rather than rupture of the subduction zone. The isoseismal pattern is not different from those of other upper-crustal earthquakes in similar tectono-volcanic environments. We thus challenge the traditional assumption that only the subduction zone is capable of generating earthquakes of magnitude greater than 7.0 in this region. This result has broad implications for future risk management in the region. The potential occurrence of strong ground motion, significantly higher and closer to Salvadoran populations than assumed to date, must be considered in seismic hazard assessment studies in this area.

  14. Earthquake prediction theory and its relation to precursors

    International Nuclear Information System (INIS)

    Negarestani, A.; Setayeshi, S.; Ghannadi-Maragheh, M.; Akasheh, B.

    2001-01-01

    Since we do not have enough knowledge about the physics of earthquakes, the study of seismic precursors plays an important role in earthquake prediction. Earthquake prediction is a science which discusses precursory phenomena during the seismogenic process, then investigates the correlations and associations among them and the intrinsic relation between precursors and the seismogenic process, at the end judges comprehensively the seismic status, and finally makes the earthquake prediction. There are two approaches to earthquake prediction. The first is to study the physics of the seismogenic process and to determine the parameters of the process based on source theories, and the second is to use seismic precursors. In this paper the theory of earthquakes is reviewed. We also study earthquake theory using models of earthquake origin and the relation between the seismogenic process and various accompanying precursory phenomena. Earthquake prediction is divided into three categories: long-term, medium-term and short-term. We study anomalous seismic behavior, electric field, crustal deformation, gravity, magnetism of the Earth, changes in groundwater level, groundwater geochemistry and changes in radon gas emission. Finally, it is concluded that there is a correlation between radon gas emission and earthquake phenomena. Meanwhile, some examples from actual data processing in this area are given.

  15. Global Significant Earthquake Database, 2150 BC to present

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Significant Earthquake Database is a global listing of over 5,700 earthquakes from 2150 BC to the present. A significant earthquake is classified as one that...

  16. The 2010 Chile Earthquake: Rapid Assessments of Tsunami

    OpenAIRE

    Michelini, A.; Lauciani, V.; Selvaggi, G.; Lomax, A.

    2010-01-01

    After an earthquake underwater, rapid real-time assessment of earthquake parameters is important for emergency response related to infrastructure damage and, perhaps more exigently, for issuing warnings of the possibility of an impending tsunami. Since 2005, the Istituto Nazionale di Geofisica e Vulcanologia (INGV) has worked on the rapid quantification of earthquake magnitude and tsunami potential, especially for the Mediterranean area. This work includes quantification of earthquake size fr...

  17. Collaboratory for the Study of Earthquake Predictability

    Science.gov (United States)

    Schorlemmer, D.; Jordan, T. H.; Zechar, J. D.; Gerstenberger, M. C.; Wiemer, S.; Maechling, P. J.

    2006-12-01

    Earthquake prediction is one of the most difficult problems in physical science and, owing to its societal implications, one of the most controversial. The study of earthquake predictability has been impeded by the lack of an adequate experimental infrastructure---the capability to conduct scientific prediction experiments under rigorous, controlled conditions and evaluate them using accepted criteria specified in advance. To remedy this deficiency, the Southern California Earthquake Center (SCEC) is working with its international partners, which include the European Union (through the Swiss Seismological Service) and New Zealand (through GNS Science), to develop a virtual, distributed laboratory with a cyberinfrastructure adequate to support a global program of research on earthquake predictability. This Collaboratory for the Study of Earthquake Predictability (CSEP) will extend the testing activities of SCEC's Working Group on Regional Earthquake Likelihood Models, from which we will present first results. CSEP will support rigorous procedures for registering prediction experiments on regional and global scales, community-endorsed standards for assessing probability-based and alarm-based predictions, access to authorized data sets and monitoring products from designated natural laboratories, and software to allow researchers to participate in prediction experiments. CSEP will encourage research on earthquake predictability by supporting an environment for scientific prediction experiments that allows the predictive skill of proposed algorithms to be rigorously compared with standardized reference methods and data sets. It will thereby reduce the controversies surrounding earthquake prediction, and it will allow the results of prediction experiments to be communicated to the scientific community, governmental agencies, and the general public in an appropriate research context.
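
    As a concrete illustration of the kind of evaluation such an infrastructure standardizes, the sketch below implements a simple Poisson number test (an "N-test") that compares an observed earthquake count against a forecast rate. This is a minimal sketch in Python of one ingredient of forecast testing, not the CSEP software itself, and the example numbers are invented.

        import math

        def poisson_cdf(k, lam):
            # P(N <= k) for a Poisson variable with rate lam (empty sum if k < 0)
            return sum(math.exp(-lam) * lam**n / math.factorial(n) for n in range(k + 1))

        def n_test(n_obs, forecast_rate):
            """Two quantiles used in Poisson number tests: the probability of observing
            at least, and at most, n_obs events if the forecast rate were correct."""
            delta1 = 1.0 - poisson_cdf(n_obs - 1, forecast_rate)  # P(N >= n_obs)
            delta2 = poisson_cdf(n_obs, forecast_rate)            # P(N <= n_obs)
            return delta1, delta2

        # illustrative values: 12 observed events against a forecast of 8.5
        print(n_test(n_obs=12, forecast_rate=8.5))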

  18. Simulated earthquake ground motions

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.; Gasparini, D.A.

    1977-01-01

    The paper reviews current methods for generating synthetic earthquake ground motions. Emphasis is on the special requirements demanded of procedures to generate motions for use in nuclear power plant seismic response analysis. Specifically, very close agreement is usually sought between the response spectra of the simulated motions and prescribed, smooth design response spectra. The features and capabilities of the computer program SIMQKE, which has been widely used in power plant seismic work, are described. Problems and pitfalls associated with the use of synthetic ground motions in seismic safety assessment are also pointed out. The limitations and paucity of recorded accelerograms, together with the widespread use of time-history dynamic analysis for obtaining structural and secondary systems' response, have motivated the development of earthquake simulation capabilities. A common model for synthesizing earthquakes is that of superposing sinusoidal components with random phase angles. The input parameters for such a model are, then, the amplitudes and phase angles of the contributing sinusoids as well as the characteristics of the variation of motion intensity with time, especially the duration of the motion. The amplitudes are determined from estimates of the Fourier spectrum or the spectral density function of the ground motion. These amplitudes may be assumed to be varying in time or constant for the duration of the earthquake. In the nuclear industry, the common procedure is to specify a set of smooth response spectra for use in aseismic design. This development and the need for time histories have generated much practical interest in synthesizing earthquakes whose response spectra 'match', or are compatible with a set of specified smooth response spectra
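
    The superposition-of-sinusoids model described above can be sketched in a few lines. The Python fragment below adds sinusoids with random phases and shapes them with a simple intensity envelope; it omits the iterative amplitude adjustment that SIMQKE-type programs perform to match a target response spectrum, and all frequencies, amplitudes and envelope parameters are illustrative assumptions.

        import numpy as np

        def simulate_accelerogram(freqs, amplitudes, duration=20.0, dt=0.01,
                                  rise=2.0, strong=10.0, seed=0):
            """Superpose sinusoids with random phases and shape them with a
            build-up / strong-phase / decay intensity envelope."""
            rng = np.random.default_rng(seed)
            t = np.arange(0.0, duration, dt)
            phases = rng.uniform(0.0, 2.0 * np.pi, size=len(freqs))
            a = np.zeros_like(t)
            for f, amp, phi in zip(freqs, amplitudes, phases):
                a += amp * np.sin(2.0 * np.pi * f * t + phi)
            env = np.ones_like(t)
            env[t < rise] = t[t < rise] / rise                      # linear build-up
            decay = t > (rise + strong)
            env[decay] = np.exp(-0.5 * (t[decay] - rise - strong))  # exponential decay
            return t, a * env

        # flat illustrative spectrum between 0.5 and 25 Hz
        t, acc = simulate_accelerogram(freqs=np.linspace(0.5, 25.0, 50),
                                       amplitudes=np.ones(50))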

  19. Seismic methodology in determining basis earthquake for nuclear installation

    International Nuclear Information System (INIS)

    Ameli Zamani, Sh.

    2008-01-01

    Design basis earthquake ground motions for nuclear installations should be determined to assure the design purpose of reactor safety: that reactors should be built and operated to pose no undue risk to public health and safety from earthquakes and other hazards. Regarding the seismic hazard at a site, large numbers of earthquake ground motions can be predicted considering the possible variability among the source, path, and site parameters. However, seismic safety design using all predicted ground motions is practically impossible. In the determination of design basis earthquake ground motions it is therefore important to represent the influence of the large numbers of earthquake ground motions derived from ground motion prediction methods for the surrounding seismic sources. Viewing the relations between current design basis earthquake ground motion determination and modern earthquake ground motion estimation, the development of a risk-informed design basis earthquake ground motion methodology is discussed, providing insight into the ongoing modernization of the Examination Guide for Seismic Design of NPPs

  20. Land-Ocean-Atmospheric Coupling Associated with Earthquakes

    Science.gov (United States)

    Prasad, A. K.; Singh, R. P.; Kumar, S.; Cervone, G.; Kafatos, M.; Zlotnicki, J.

    2007-12-01

    Earthquakes are well known to occur along plate boundaries and also on stable shields. Recent studies have shown the existence of strong coupling between land, ocean and atmospheric parameters associated with earthquakes. We have carried out a detailed analysis of multi-sensor data (optical and microwave remote sensing) to show the existence of strong coupling between land, ocean and atmospheric parameters associated with earthquakes with focal depths up to 30 km and magnitudes greater than 5.5. The complementary nature of the various land, ocean and atmospheric parameters will be demonstrated for obtaining early warning information about an impending earthquake.

  1. a Collaborative Cyberinfrastructure for Earthquake Seismology

    Science.gov (United States)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Lefebvre, S.; Steed, R.

    2013-12-01

    One of the challenges in real-time seismology is the prediction of an earthquake's impact. This is particularly true for moderate earthquakes (around magnitude 6) located close to urbanised areas, where the slightest uncertainty in event location, depth or magnitude estimates, and/or misevaluation of propagation characteristics, site effects and building vulnerability, can dramatically change the impact scenario. The Euro-Med Seismological Centre (EMSC) has developed a cyberinfrastructure to collect observations from eyewitnesses in order to provide in-situ constraints on actual damage. This cyberinfrastructure takes advantage of the natural convergence of earthquake eyewitnesses on the EMSC website (www.emsc-csem.org), the second most visited global earthquake information website within tens of seconds of the occurrence of a felt event. It includes classical crowdsourcing tools, such as online questionnaires available in 39 languages, and tools to collect geolocated pictures. It also comprises information derived from the real-time analysis of the traffic on the EMSC website, a method named flashsourcing: in the case of a felt earthquake, eyewitnesses reach the EMSC website within tens of seconds to find out the cause of the shaking they have just been through. By analysing their geographical origin through their IP addresses, we automatically detect felt earthquakes and in some cases map the damaged areas through the loss of Internet visitors. We recently implemented a Quake Catcher Network (QCN) server in collaboration with Stanford University and the USGS to collect ground-motion records made by volunteers, and we are also involved in a project to detect earthquakes from ground-motion sensors in smartphones. Strategies have been developed for several social media (Facebook, Twitter...) not only to distribute earthquake information, but also to engage with citizens and optimise data collection. A smartphone application is currently under development. We will present an overview of this

  2. Pain after earthquake

    Directory of Open Access Journals (Sweden)

    Angeletti Chiara

    2012-06-01

    Full Text Available Abstract Introduction On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people and induced significant damage to more than 10,000 buildings in the L'Aquila region. Objectives This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results A third of patients reported pain, a prevalence of 34.6%. More than half of the pain patients reported severe pain (58.8%). Analgesic agents were limited to available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth weeks, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and current drug treatments in this region may be adequate for emergency situations.

  3. Strong motion duration and earthquake magnitude relationships

    International Nuclear Information System (INIS)

    Salmon, M.W.; Short, S.A.; Kennedy, R.P.

    1992-06-01

    Earthquake duration is the total time of ground shaking from the arrival of seismic waves until the return to ambient conditions. Much of this time is at relatively low shaking levels which have little effect on seismic structural response and on earthquake damage potential. As a result, a parameter termed ''strong motion duration'' has been defined by a number of investigators to be used for the purpose of evaluating seismic response and assessing the potential for structural damage due to earthquakes. This report presents methods for determining strong motion duration and a time history envelope function appropriate for various evaluation purposes, for earthquake magnitude and distance, and for site soil properties. There are numerous definitions of strong motion duration. For most of these definitions, empirical studies have been completed which relate duration to earthquake magnitude and distance and to site soil properties. Each of these definitions recognizes that only the portion of an earthquake record which has sufficiently high acceleration amplitude, energy content, or some other parameter significantly affects seismic response. Studies have been performed which indicate that the portion of an earthquake record in which the power (average rate of energy input) is maximum correlates most closely with potential damage to stiff nuclear power plant structures. Hence, this report will concentrate on energy-based strong motion duration definitions
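
    One widely used energy-based definition, significant duration measured between fixed fractions of the cumulative squared acceleration (proportional to Arias intensity), can be computed as sketched below. It is only one of the several definitions the report discusses, and the 5%-95% bounds are a common but assumed choice.

        import numpy as np

        def significant_duration(acc, dt, lo=0.05, hi=0.95):
            """Time between the instants at which the normalized cumulative squared
            acceleration (a Husid plot) reaches the lo and hi fractions."""
            cum = np.cumsum(np.asarray(acc, dtype=float) ** 2)
            cum /= cum[-1]
            t_lo = np.searchsorted(cum, lo) * dt
            t_hi = np.searchsorted(cum, hi) * dt
            return t_hi - t_lo

        # synthetic example: modulated noise standing in for a recorded accelerogram
        rng = np.random.default_rng(0)
        dt = 0.01
        t = np.arange(0.0, 40.0, dt)
        acc = rng.normal(size=t.size) * np.exp(-((t - 10.0) / 6.0) ** 2)
        print(significant_duration(acc, dt))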

  4. Time-Reversal Study of the Hemet (CA) Tremor Source

    Science.gov (United States)

    Larmat, C. S.; Johnson, P. A.; Guyer, R. A.

    2010-12-01

    Since its first observation by Nadeau & Dolenc (2005) and Gomberg et al. (2008), tremor along the San Andreas fault system is thought to be a probe into the frictional state of the deep part of the fault (e.g. Shelly et al., 2007). Tremor is associated with slow, otherwise deep, aseismic slip events that may be triggered by faint signals such as passing waves from remote earthquakes or solid Earth tides. Well-resolved tremor source location is key to constraining frictional models of the fault. However, tremor source location is challenging because of the high-frequency and highly scattered nature of the tremor signal, characterized by the lack of isolated phase arrivals. Time Reversal (TR) methods are emerging as a useful tool for location. The unique requirement is a good velocity model for the different time-reversed phases to arrive coherently onto the source point. We present location results for a tremor source near the town of Hemet, CA, which was triggered by the 2002 M 7.9 Denali Fault earthquake (Gomberg et al., 2008) and by the 2009 M 6.9 Gulf of California earthquake. We performed TR in a volume model of 88 (N-S) x 70 (W-E) x 60 km (Z) using the full-wave 3D wave-propagation package SPECFEM3D (Komatitsch et al., 2002). The results for the 2009 episode indicate a deep source (at about 22 km depth) located about 4 km SW of the fault surface scarp. We perform STA/LTA and correlation analyses in order to have independent confirmation of the Hemet tremor source. We gratefully acknowledge the support of the U.S. Department of Energy through the LANL/LDRD Program for this work.
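
    For readers unfamiliar with the detection side of such studies, the fragment below sketches the classical STA/LTA energy-ratio detector that is commonly used to flag emergent tremor bursts before correlation analysis. The window lengths and threshold are placeholder assumptions, not values used by the authors, and centered moving averages are used for simplicity.

        import numpy as np

        def sta_lta(x, dt, sta_win=0.5, lta_win=10.0):
            """Short-term-average / long-term-average ratio on a squared trace.
            Peaks in the ratio flag emergent energy bursts such as tremor."""
            e = np.asarray(x, dtype=float) ** 2
            n_sta = max(1, int(sta_win / dt))
            n_lta = max(1, int(lta_win / dt))
            sta = np.convolve(e, np.ones(n_sta) / n_sta, mode="same")
            lta = np.convolve(e, np.ones(n_lta) / n_lta, mode="same")
            return sta / (lta + 1e-20)

        # synthetic trace with an emergent burst; threshold of 3.0 is an assumption
        rng = np.random.default_rng(1)
        dt = 0.01
        trace = rng.normal(size=6000) * 0.1
        trace[3000:3500] += rng.normal(size=500)
        detections = np.where(sta_lta(trace, dt) > 3.0)[0]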

  5. Visible Earthquakes: a web-based tool for visualizing and modeling InSAR earthquake data

    Science.gov (United States)

    Funning, G. J.; Cockett, R.

    2012-12-01

    InSAR (Interferometric Synthetic Aperture Radar) is a technique for measuring the deformation of the ground using satellite radar data. One of the principal applications of this method is in the study of earthquakes; in the past 20 years over 70 earthquakes have been studied in this way, and forthcoming satellite missions promise to enable the routine and timely study of events in the future. Despite the utility of the technique and its widespread adoption by the research community, InSAR does not feature in the teaching curricula of most university geoscience departments. This is, we believe, due to a lack of accessibility to software and data. Existing tools for the visualization and modeling of interferograms are often research-oriented, command line-based and/or prohibitively expensive. Here we present a new web-based interactive tool for comparing real InSAR data with simple elastic models. The overall design of this tool was focused on ease of access and use. This tool should allow interested nonspecialists to gain a feel for the use of such data and greatly facilitate integration of InSAR into upper division geoscience courses, giving students practice in comparing actual data to modeled results. The tool, provisionally named 'Visible Earthquakes', uses web-based technologies to instantly render the displacement field that would be observable using InSAR for a given fault location, geometry, orientation, and slip. The user can adjust these 'source parameters' using a simple, clickable interface, and see how these affect the resulting model interferogram. By visually matching the model interferogram to a real earthquake interferogram (processed separately and included in the web tool) a user can produce their own estimates of the earthquake's source parameters. Once satisfied with the fit of their models, users can submit their results and see how they compare with the distribution of all other contributed earthquake models, as well as the mean and median

  6. Earthquake Preparedness Among Japanese Hemodialysis Patients in Prefectures Heavily Damaged by the 2011 Great East Japan Earthquake.

    Science.gov (United States)

    Sugisawa, Hidehiro; Shimizu, Yumiko; Kumagai, Tamaki; Sugisaki, Hiroaki; Ohira, Seiji; Shinoda, Toshio

    2017-08-01

    The purpose of this study was to explore the factors related to earthquake preparedness in Japanese hemodialysis patients. We focused on three aspects of the related factors: health condition factors, social factors, and the experience of disasters. A mail survey of all the members of the Japan Association of Kidney Disease Patients in three Japanese prefectures (N = 4085) was conducted in March, 2013. We obtained 1841 valid responses for analysis. The health factors covered were: activities of daily living (ADL), mental distress, primary renal diseases, and the duration of dialysis. The social factors were: socioeconomic status, family structure, informational social support, and the provision of information regarding earthquake preparedness from dialysis facilities. The results show that the average percentage of participants that had met each criterion of earthquake preparedness in 2013 was 53%. Hemodialysis patients without disabled ADL, without mental distress, and requiring longer periods of dialysis, were likely to meet more of the earthquake preparedness criteria. Hemodialysis patients who had received informational social support from family or friends, had lived with spouse and children in comparison to living alone, and had obtained information regarding earthquake preparedness from dialysis facilities, were also likely to meet more of the earthquake preparedness criteria. © 2017 International Society for Apheresis, Japanese Society for Apheresis, and Japanese Society for Dialysis Therapy.

  7. Combining multiple earthquake models in real time for earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Wu, Stephen; Beck, James L; Heaton, Thomas H.

    2017-01-01

    The ultimate goal of earthquake early warning (EEW) is to provide local shaking information to users before the strong shaking from an earthquake reaches their location. This is accomplished by operating one or more real‐time analyses that attempt to predict shaking intensity, often by estimating the earthquake’s location and magnitude and then predicting the ground motion from that point source. Other EEW algorithms use finite rupture models or may directly estimate ground motion without first solving for an earthquake source. EEW performance could be improved if the information from these diverse and independent prediction models could be combined into one unified, ground‐motion prediction. In this article, we set the forecast shaking at each location as the common ground to combine all these predictions and introduce a Bayesian approach to creating better ground‐motion predictions. We also describe how this methodology could be used to build a new generation of EEW systems that provide optimal decisions customized for each user based on the user’s individual false‐alarm tolerance and the time necessary for that user to react.
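
    A minimal sketch of the underlying idea, assuming each algorithm's prediction of (log) shaking at a site can be summarized as a Gaussian, is the precision-weighted combination below. The published framework is richer (model weights, user-specific cost functions, real-time updating), so this is illustrative only and all numbers are invented.

        import numpy as np

        def combine_predictions(means, sigmas):
            """Combine independent Gaussian predictions of log ground motion at one
            site into a single mean and standard deviation by precision weighting."""
            means = np.asarray(means, dtype=float)
            w = 1.0 / np.asarray(sigmas, dtype=float) ** 2   # precisions
            post_var = 1.0 / w.sum()
            post_mean = post_var * (w * means).sum()
            return post_mean, np.sqrt(post_var)

        # e.g. point-source, finite-fault and direct ground-motion estimates of log PGA
        print(combine_predictions(means=[-1.2, -0.9, -1.05], sigmas=[0.4, 0.3, 0.5]))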

  8. Earthquake Complex Network Analysis Before and After the Mw 8.2 Earthquake in Iquique, Chile

    Science.gov (United States)

    Pasten, D.

    2017-12-01

    Earthquake complex networks have been shown to reveal specific features in seismic data sets. In space, these networks show scale-free behavior of the connectivity distribution for directed networks, and small-world behavior for undirected networks. In this work, we present an earthquake complex network analysis for the large Mw 8.2 earthquake in the north of Chile (near Iquique) in April 2014. An earthquake complex network is built by dividing the three-dimensional space into cubic cells; if a cell contains a hypocenter, that cell is named a node. The connections between nodes are generated in time: following the time sequence of seismic events, we make the connections between nodes. This yields two different networks, one directed and one undirected. The directed network takes into consideration the time direction of the connections, which is very important for the connectivity of the network: we consider the connectivity ki of the i-th node to be the number of connections going out of node i plus its self-connections (if two seismic events occur successively in time in the same cubic cell, we have a self-connection). The undirected network is made by removing the direction of the connections and the self-connections from the directed network; for undirected networks, we consider only whether two nodes are connected or not. We have built a directed complex network and an undirected complex network before and after the large Iquique earthquake, using magnitudes greater than Mw = 1.0 and Mw = 3.0. We found that this method can recognize the influence of these small seismic events on the behavior of the network, and that the size of the cell used to build the network is another important factor in recognizing the influence of the large earthquake on this complex system. The method also shows a difference in the values of the critical exponent γ (for the probability
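
    The construction described above can be sketched directly from a time-ordered hypocenter catalog, as below. The cell size, the example coordinates and the treatment of self-connections follow the description in the abstract, but the code is an illustrative reconstruction rather than the authors' implementation.

        import numpy as np
        from collections import defaultdict

        def earthquake_network(hypocenters, cell_size):
            """Map each hypocenter (in time order) to its cubic cell ('node') and
            connect the cells of successive events. Connectivity k_i counts links
            going out of node i, including self-connections (i -> i)."""
            nodes = [tuple(np.floor(np.asarray(h) / cell_size).astype(int))
                     for h in hypocenters]
            k = defaultdict(int)
            edges = []
            for a, b in zip(nodes[:-1], nodes[1:]):
                edges.append((a, b))
                k[a] += 1
            return edges, dict(k)

        # hypocenters as (x, y, z) in km, already sorted by origin time (made-up values)
        events = [(10.2, 5.1, 30.0), (10.4, 5.0, 31.0), (50.0, 20.0, 45.0), (10.3, 5.2, 30.5)]
        edges, connectivity = earthquake_network(events, cell_size=5.0)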

  9. Make an Earthquake: Ground Shaking!

    Science.gov (United States)

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  10. Engineering geological aspect of Gorkha Earthquake 2015, Nepal

    Science.gov (United States)

    Adhikari, Basanta Raj; Andermann, Christoff; Cook, Kristen

    2016-04-01

    Strong shaking by earthquakes causes massive landsliding with severe effects on infrastructure and human lives. The distribution of landslides and other hazards depends on the combination of earthquake and local characteristics, which influence the dynamic response of hillslopes. The Himalayas are one of the most active mountain belts, with several kilometers of relief, and are very prone to catastrophic mass failure. Strong and shallow earthquakes are very common and cause widespread collapse of hillslopes, increasing the background landslide rate by several orders of magnitude. The Himalaya has experienced many small and large earthquakes in the past, e.g., the Bihar-Nepal earthquake of 1934 (Ms 8.2), the large Kangra earthquake of 1905 (Ms 7.8) and the Gorkha earthquake of 2015 (Mw 7.8). The Mw 7.9 Gorkha earthquake occurred on and around the Main Himalayan Thrust with a hypocentral depth of 15 km (GEER 2015) and was followed by a Mw 7.3 aftershock near Kodari, causing 8700+ deaths and leaving hundreds of thousands homeless. Most of the 3000 aftershocks located by the National Seismological Center (NSC) within the first 45 days following the Gorkha earthquake are concentrated in a narrow 40 km-wide band at midcrustal to shallow depth along the strike of the southern slope of the high Himalaya (Adhikari et al. 2015), and the ground shaking was substantially lower in the short-period range than would be expected for an earthquake of this magnitude (Moss et al. 2015). The effects of this earthquake were distinctive in the affected areas, with topographic effects, liquefaction and land subsidence. More than 5000 landslides were triggered by this earthquake (Earthquakes without Frontiers, 2015). Most of the landslides are shallow, occurred in weathered bedrock and appear to have mobilized primarily as raveling failures, rock slides and rock falls. The majority of landslides are limited to a zone that runs east-west, approximately parallel to the Lesser and Higher Himalaya. There are numerous cracks in

  11. Salient Features of the 2015 Gorkha, Nepal Earthquake in Relation to Earthquake Cycle and Dynamic Rupture Models

    Science.gov (United States)

    Ampuero, J. P.; Meng, L.; Hough, S. E.; Martin, S. S.; Asimaki, D.

    2015-12-01

    Two salient features of the 2015 Gorkha, Nepal, earthquake provide new opportunities to evaluate models of earthquake cycle and dynamic rupture. The Gorkha earthquake broke only partially across the seismogenic depth of the Main Himalayan Thrust: its slip was confined in a narrow depth range near the bottom of the locked zone. As indicated by the belt of background seismicity and decades of geodetic monitoring, this is an area of stress concentration induced by deep fault creep. Previous conceptual models attribute such intermediate-size events to rheological segmentation along-dip, including a fault segment with intermediate rheology in between the stable and unstable slip segments. We will present results from earthquake cycle models that, in contrast, highlight the role of stress loading concentration, rather than frictional segmentation. These models produce "super-cycles" comprising recurrent characteristic events interspersed by deep, smaller non-characteristic events of overall increasing magnitude. Because the non-characteristic events are an intrinsic component of the earthquake super-cycle, the notion of Coulomb triggering or time-advance of the "big one" is ill-defined. The high-frequency (HF) ground motions produced in Kathmandu by the Gorkha earthquake were weaker than expected for such a magnitude and such close distance to the rupture, as attested by strong motion recordings and by macroseismic data. Static slip reached close to Kathmandu but had a long rise time, consistent with control by the along-dip extent of the rupture. Moreover, the HF (1 Hz) radiation sources, imaged by teleseismic back-projection of multiple dense arrays calibrated by aftershock data, were deep and far from Kathmandu. We argue that HF rupture imaging provided a better predictor of shaking intensity than finite source inversion. The deep location of HF radiation can be attributed to rupture over heterogeneous initial stresses left by the background seismic activity

  12. Global volcanic earthquake swarm database and preliminary analysis of volcanic earthquake swarm duration

    Directory of Open Access Journals (Sweden)

    S. R. McNutt

    1996-06-01

    Full Text Available Global data from 1979 to 1989 pertaining to volcanic earthquake swarms have been compiled into a custom-designed relational database. The database is composed of three sections: (1) a section containing general information on volcanoes, (2) a section containing earthquake swarm data (such as dates of swarm occurrence and durations), and (3) a section containing eruption information. The most abundant and reliable parameter, the duration of volcanic earthquake swarms, was chosen for preliminary analysis. The distribution of all swarm durations was found to have a geometric mean of 5.5 days. Precursory swarms were then separated from those not associated with eruptions. The geometric mean precursory swarm duration was 8 days, whereas the geometric mean duration of swarms not associated with eruptive activity was 3.5 days. Two groups of precursory swarms are apparent when duration is compared with the eruption repose time. Swarms with durations shorter than 4 months showed no clear relationship with the eruption repose time. However, the second group, lasting longer than 4 months, showed a significant positive correlation with the log10 of the eruption repose period. The two groups suggest that different suites of physical processes are involved in the generation of volcanic earthquake swarms.
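
    The duration statistics quoted above are geometric means; a minimal sketch of that calculation, with invented duration values standing in for the database entries, is given below.

        import numpy as np

        def geometric_mean(durations_days):
            # exp of the mean log duration; appropriate for log-normally distributed durations
            d = np.asarray(durations_days, dtype=float)
            return float(np.exp(np.mean(np.log(d))))

        # illustrative durations (days); real values would come from the swarm database
        precursory = [2.0, 15.0, 40.0, 6.0, 120.0]
        non_precursory = [1.0, 3.0, 7.0, 2.5, 10.0]
        print(geometric_mean(precursory), geometric_mean(non_precursory))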

  13. Sensitivity of tsunami wave profiles and inundation simulations to earthquake slip and fault geometry for the 2011 Tohoku earthquake

    KAUST Repository

    Goda, Katsuichiro; Mai, Paul Martin; Yasuda, Tomohiro; Mori, Nobuhito

    2014-01-01

    In this study, we develop stochastic random-field slip models for the 2011 Tohoku earthquake and conduct a rigorous sensitivity analysis of tsunami hazards with respect to the uncertainty of earthquake slip and fault geometry. Synthetic earthquake slip distributions generated from the modified Mai-Beroza method captured key features of inversion-based source representations of the mega-thrust event, which were calibrated against rich geophysical observations of this event. Using original and synthesised earthquake source models (varied for strike, dip, and slip distributions), tsunami simulations were carried out and the resulting variability in tsunami hazard estimates was investigated. The results highlight significant sensitivity of the tsunami wave profiles and inundation heights to the coastal location and the slip characteristics, and indicate that earthquake slip characteristics are a major source of uncertainty in predicting tsunami risks due to future mega-thrust events.
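
    A schematic of how a single stochastic slip realization can be generated by spectral synthesis is sketched below: white noise is filtered with an anisotropic, von Karman-like wavenumber spectrum and rescaled to a target mean slip. The grid, correlation lengths and Hurst exponent are placeholder assumptions, and the modified Mai-Beroza method involves further steps (e.g., calibration against the inverted source models) not shown here.

        import numpy as np

        def random_slip_field(nx=64, nz=32, dx=2.0, corr_x=40.0, corr_z=20.0,
                              mean_slip=5.0, seed=1):
            """One synthetic slip distribution on an nx-by-nz fault grid (spacing dx km),
            obtained by spectral filtering of white noise."""
            rng = np.random.default_rng(seed)
            kx = np.fft.fftfreq(nx, d=dx) * 2.0 * np.pi
            kz = np.fft.fftfreq(nz, d=dx) * 2.0 * np.pi
            KX, KZ = np.meshgrid(kx, kz, indexing="ij")
            hurst = 0.75                                   # assumed Hurst exponent
            power = (1.0 + (KX * corr_x) ** 2 + (KZ * corr_z) ** 2) ** (-(hurst + 1.0))
            noise = rng.normal(size=(nx, nz))
            field = np.real(np.fft.ifft2(np.fft.fft2(noise) * np.sqrt(power)))
            field -= field.min()                           # non-negative slip
            field *= mean_slip / field.mean()              # rescale to target mean slip (m)
            return field

        slip = random_slip_field()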

  15. Enhanced Earthquake-Resistance on the High Level Radioactive Waste Canister

    International Nuclear Information System (INIS)

    Choi, Youngchul; Yoon, Chanhoon; Lee, Jeaowan; Kim, Jinsup; Choi, Heuijoo

    2014-01-01

    In this paper, an earthquake-resistant buffer was developed as a means of protecting the canister against earthquakes. The main parameter affecting the earthquake-resistant performance was analyzed and an earthquake-resistant buffer material was designed. A shear analysis model was developed and the performance of the earthquake-resistant buffer material was evaluated. The dynamic behavior of the radioactive waste disposal canister was analyzed for the case in which an earthquake occurs; in that case, the disposal canister can suffer serious damage. The earthquake-resistant buffer material was developed in order to prevent this damage. By placing a low-density buffer layer between the canister and the main buffer, the earthquake-resistant performance was improved by about 80%

  16. A Case Study of the Bam Earthquake to Establish a Pattern for Earthquake Management in Iran

    Directory of Open Access Journals (Sweden)

    Keramatollah Ziari

    2015-03-01

    Full Text Available The field of crisis management knowledge and expertise is associated with a wide range of disciplines. Knowledge-based crisis management is a combination of science, art and practice. Iran is an earthquake-prone country; over the years several earthquakes have occurred there, resulting in many human and financial losses. According to scientific standards, the first 24 hours following an earthquake are the most valuable time for saving victims, yet in the case of Bam only 5% of the victims were rescued within the first 48 hours. The success of disaster management is evaluated in terms of planning, raising public participation, organizing and hiring manpower, and supervising the management process. In this study disaster management is divided into three stages in which different actions are required. The stages and actions are explained in detail. Moreover, the features, effects and losses of the earthquake are described.

  17. The survey and mapping of sand-boil landforms related to the Emilia 2012 earthquakes: preliminary results

    Directory of Open Access Journals (Sweden)

    Andrea Ninfo

    2012-10-01

    Full Text Available Sand boils, which are also known as sand blows or sand volcanoes, are among the most common superficial effects induced by high-magnitude earthquakes. These generally occur in or close to alluvial plains when a strong earthquake (M > 5) strikes on a lens of saturated and unconsolidated sand deposits that is constrained between silt-clay layers [Ambraseys 1988, Carter and Seed 1988, Galli 2000, Tuttle 2001, Obermeier et al. 2005], where the sediments are converted into a fluid suspension. The liquefaction phenomenon requires the presence of saturated and uncompacted sand, and a groundwater table near the ground surface. This geological-geomorphological setting is common and widespread in the Po Plain (Italy) [Castiglioni et al. 1997]. The Po Plain (ca. 46,000 km2) represents 15% of the Italian territory. It hosts a population of about 20 million people (mean density of 450 people/km2) and many infrastructures. Thus, the Po Plain is an area of high vulnerability when considering the liquefaction potential in the case of a strong earthquake. Despite this potential, such phenomena are rarely observed in northern Italy [Cavallin et al. 1977, Galli 2000], because strong earthquakes are not frequent in this region; e.g., historical data report soil liquefaction near Ferrara in 1570 (M 5.3) and in Argenta in 1624 (M 5.5) [Prestininzi and Romeo 2000, Galli 2000]. In the Emilia earthquakes of May 20 and 29, 2012, the most widespread coseismic effects were soil liquefaction and ground cracks, which occurred over wide areas in the Provinces of Modena, Ferrara, Bologna, Reggio Emilia and Mantova (Figure 1). […

  18. 75 FR 66388 - Scientific Earthquake Studies Advisory Committee

    Science.gov (United States)

    2010-10-28

    ..., including the multi-hazards demonstration project and earthquake early warning prototype development. The... DEPARTMENT OF THE INTERIOR U.S. Geological Survey [USGS-GX11GG009950000] Scientific Earthquake... Public Law 106-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next...

  19. 75 FR 2159 - Scientific Earthquake Studies Advisory Committee

    Science.gov (United States)

    2010-01-14

    ... DEPARTMENT OF THE INTERIOR U.S. Geological Survey Scientific Earthquake Studies Advisory Committee... Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next meeting at the U.S. Geological... participation in the National Earthquake Hazards Reduction Program. The Committee will receive updates and...

  20. Earthquake geology of the Bulnay Fault (Mongolia)

    Science.gov (United States)

    Rizza, Magali; Ritz, Jean-Franciois; Prentice, Carol S.; Vassallo, Ricardo; Braucher, Regis; Larroque, Christophe; Arzhannikova, A.; Arzhanikov, S.; Mahan, Shannon; Massault, M.; Michelot, J-L.; Todbileg, M.

    2015-01-01

    The Bulnay earthquake of July 23, 1905 (Mw 8.3-8.5), in north-central Mongolia, is one of the world's largest recorded intracontinental earthquakes and one of four great earthquakes that occurred in the region during the 20th century. The 375-km-long surface rupture of the left-lateral, strike-slip, N095°E trending Bulnay Fault associated with this earthquake is remarkable for its pronounced expression across the landscape and for the size of features produced by previous earthquakes. Our field observations suggest that in many areas the width and geometry of the rupture zone are the result of repeated earthquakes; however, in those areas where it is possible to determine that the geomorphic features are the result of the 1905 surface rupture alone, the size of the features produced by this single earthquake is singular in comparison to most other historical strike-slip surface ruptures worldwide. Along the 80 km stretch between 97.18°E and 98.33°E, the fault zone is several meters wide and the mean left-lateral 1905 offset is 8.9 ± 0.6 m, with two measured cumulative offsets that are twice the 1905 slip. These observations suggest that the displacement produced during the penultimate event was similar to the 1905 slip. Morphotectonic analyses carried out at three sites along the eastern part of the Bulnay fault allow us to estimate a mean horizontal slip rate of 3.1 ± 1.7 mm/yr over the Late Pleistocene-Holocene period. In parallel, paleoseismological investigations show evidence for two earthquakes prior to the 1905 event with recurrence intervals of ~2700-4000 years.

  1. Method for forecasting an earthquake from precursor signals

    International Nuclear Information System (INIS)

    Farnworth, D.F.

    1996-01-01

    A method for forecasting an earthquake from precursor signals by employing characteristic first electromagnetic signals, second, seismically induced electromagnetic signals, seismically induced mechanical signals, and infrasonic acoustic signals which have been observed to precede an earthquake. From a first electromagnetic signal, a magnitude, depth beneath the surface of the earth, distance, latitude, longitude, and first and second forecasts of the time of occurrence of the impending earthquake may be derived. From a second, seismically induced electromagnetic signal and the mechanical signal, third and fourth forecasts of the time of occurrence of an impending earthquake determined from the analysis above, a magnitude, depth beneath the surface of the earth and fourth and fifth forecasts of the time of occurrence of the impending earthquake may be derived. The forecasts of time available from the above analyses range from up to five weeks to substantially within one hour in advance of the earthquake. (author)

  2. Unbonded Prestressed Columns for Earthquake Resistance

    Science.gov (United States)

    2012-05-01

    Modern structures are able to survive significant shaking caused by earthquakes. By implementing unbonded post-tensioned tendons in bridge columns, the damage caused by an earthquake can be significantly lower than that of a standard reinforced concr...

  3. Incorporating human-triggered earthquake risks into energy and water policies

    Science.gov (United States)

    Klose, C. D.; Seeber, L.; Jacob, K. H.

    2010-12-01

    A comprehensive understanding of earthquake risks in urbanized regions requires an accurate assessment of both urban vulnerabilities and hazards from earthquakes, including ones whose timing might be affected by human activities. Socioeconomic risks associated with human-triggered earthquakes are often misconstrued and receive little scientific, legal, and public attention. Worldwide, more than 200 damaging earthquakes associated with industrialization and urbanization have been documented since the beginning of the 20th century. Geomechanical pollution due to large-scale geoengineering activities can advance the clock of earthquakes, trigger new seismic events or even shut down natural background seismicity. Activities include mining, hydrocarbon production, fluid injections, water reservoir impoundments and deep-well geothermal energy production. This type of geohazard has impacts on human security at a regional and national level. Some planned or considered future engineering projects raise particularly strong concerns about triggered earthquakes, such as, for instance, sequestration of carbon dioxide by injecting it deep underground and large-scale natural gas production in the Marcellus shale in the Appalachian basin. Worldwide examples of earthquakes are discussed, including their associated losses of human life and monetary losses (e.g., the 1989 Newcastle and Volkershausen earthquakes, the 1993 Killari earthquake, the 2006 Basel earthquake, the 2008 Wenchuan earthquake). An overview is given of global statistics of human-triggered earthquakes, including depths and time delay of triggering. Lastly, strategies are described, including risk mitigation measures such as urban planning adaptations and seismic hazard mapping.

  4. Oklahoma’s recent earthquakes and saltwater disposal

    Science.gov (United States)

    Walsh, F. Rall; Zoback, Mark D.

    2015-01-01

    Over the past 5 years, parts of Oklahoma have experienced marked increases in the number of small- to moderate-sized earthquakes. In three study areas that encompass the vast majority of the recent seismicity, we show that the increases in seismicity follow 5- to 10-fold increases in the rates of saltwater disposal. Adjacent areas where there has been relatively little saltwater disposal have had comparatively few recent earthquakes. In the areas of seismic activity, the saltwater disposal principally comes from “produced” water, saline pore water that is coproduced with oil and then injected into deeper sedimentary formations. These formations appear to be in hydraulic communication with potentially active faults in crystalline basement, where nearly all the earthquakes are occurring. Although most of the recent earthquakes have posed little danger to the public, the possibility of triggering damaging earthquakes on potentially active basement faults cannot be discounted. PMID:26601200

  5. Implications of fault constitutive properties for earthquake prediction.

    Science.gov (United States)

    Dieterich, J H; Kilgore, B

    1996-04-30

    The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, scaling of Dc is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.
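
    A minimal numerical sketch of the rate- and state-dependent formulation referred to above, using the aging (Dieterich) state-evolution law and generic laboratory-scale parameter values, is given below. It reproduces the direct velocity effect and the subsequent evolution over the characteristic distance Dc, but it is an illustration, not the models used in the paper.

        import numpy as np

        def rate_state_friction(v, t_max=10.0, dt=1e-3, mu0=0.6, a=0.010, b=0.015,
                                dc=1e-5, v0=1e-6):
            """Integrate the aging-law state variable for a prescribed slip-rate history
            v(t) and return mu = mu0 + a*ln(V/V0) + b*ln(V0*theta/Dc)."""
            n = int(t_max / dt)
            theta = dc / v(0.0)                 # start at steady state for the initial rate
            mu = np.empty(n)
            for i in range(n):
                vel = v(i * dt)
                theta += dt * (1.0 - vel * theta / dc)   # aging (Dieterich) state evolution
                mu[i] = mu0 + a * np.log(vel / v0) + b * np.log(v0 * theta / dc)
            return mu

        # a velocity step from 1 to 10 micrometres/s at t = 5 s shows the direct effect
        # followed by evolution of friction over the characteristic distance Dc
        mu = rate_state_friction(lambda t: 1e-6 if t < 5.0 else 1e-5)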

  6. Statistical aspects and risks of human-caused earthquakes

    Science.gov (United States)

    Klose, C. D.

    2013-12-01

    The seismological community invests ample human capital and financial resources to research and predict risks associated with earthquakes. Industries such as the insurance and re-insurance sector are equally interested in using probabilistic risk models developed by the scientific community to transfer risks. These models are used to predict expected losses due to naturally occurring earthquakes. But what about the risks associated with human-caused earthquakes? Such risk models are largely absent from both industry and academic discourse. In countries around the world, informed citizens are becoming increasingly aware and concerned that this economic bias is not sustainable for long-term economic growth, environmental and human security. Ultimately, citizens look to their government officials to hold industry accountable. In the Netherlands, for example, the hydrocarbon industry is held accountable for causing earthquakes near Groningen. In Switzerland, geothermal power plants were shut down or suspended because they caused earthquakes in the cantons of Basel and St. Gallen. The public and the private non-extractive industry need access to information about earthquake risks in connection with sub/urban geoengineering activities, including natural gas production through fracking, geothermal energy production, carbon sequestration, mining and water irrigation. This presentation illuminates statistical aspects of human-caused earthquakes with respect to different geologic environments. Statistical findings are based on the first catalog of human-caused earthquakes (in Klose 2013). The findings discussed include the odds of dying during a medium-sized earthquake set off by geomechanical pollution. Any kind of geoengineering activity causes this type of pollution and increases the likelihood of triggering nearby faults to rupture.

  7. Historical earthquake investigations in Greece

    Directory of Open Access Journals (Sweden)

    K. Makropoulos

    2004-06-01

    Full Text Available The active tectonics of the area of Greece and its seismic activity have always been present in the country's history. Many researchers, tempted to work on Greek historical earthquakes, have realized that this is a task not easily fulfilled. The existing catalogues of strong historical earthquakes are useful tools to perform general SHA studies. However, a variety of supporting datasets, non-uniformly distributed in space and time, need to be further investigated. In the present paper, a review of historical earthquake studies in Greece is attempted. The seismic history of the country is divided into four main periods. In each one of them, characteristic examples, studies and approaches are presented.

  8. Cyclic characteristics of earthquake time histories

    International Nuclear Information System (INIS)

    Hall, J.R. Jr; Shukla, D.K.; Kissenpfennig, J.F.

    1977-01-01

    From an engineering standpoint, an earthquake record may be characterized by a number of parameters, one of which is its 'cyclic characteristics'. The cyclic characteristics are most significant in fatigue analysis of structures and liquefaction analysis of soils where, in addition to the peak motion, cyclic buildup is significant. Whereas duration, peak amplitude and response spectra for earthquakes have been studied extensively, the cyclic characteristics of earthquake records have not received equivalent attention. Present procedures to define the cyclic characteristics are generally based upon counting the number of peaks in various amplitude ranges on a record. This paper presents a computer approach which describes a time history by an amplitude envelope and a phase curve. Using Fast Fourier Transform techniques, an earthquake time history is represented as the projection along the x-axis of a rotating vector: the length of the vector is given by the amplitude envelope, and the angle between the vector and the x-axis is given by the phase curve. Thus one cycle is completed when the vector makes a full rotation. Based upon Miner's cumulative damage concept, the computer code automatically combines the cycles of various amplitudes to obtain the equivalent number of cycles of a given amplitude. To illustrate the overall results, the cyclic characteristics of several real and synthetic earthquake time histories have been studied and are presented in the paper, with the conclusion that this procedure provides a physical interpretation of the cyclic characteristics of earthquakes. (Auth.)
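
    The rotating-vector representation described above amounts to forming the analytic signal of the record: its magnitude gives the amplitude envelope, its unwrapped angle gives the phase curve, and each advance of the phase by 2*pi counts as one cycle. The following sketch computes both with an FFT-based Hilbert transform; combining cycles of different amplitude into an equivalent number of uniform cycles via Miner's rule is not shown, and the toy record is an assumption.

        import numpy as np

        def envelope_and_phase(x):
            """Amplitude envelope, unwrapped phase and cycle count from the analytic
            signal of a real-valued time history."""
            x = np.asarray(x, dtype=float)
            n = len(x)
            X = np.fft.fft(x)
            h = np.zeros(n)
            h[0] = 1.0
            if n % 2 == 0:
                h[n // 2] = 1.0
                h[1:n // 2] = 2.0
            else:
                h[1:(n + 1) // 2] = 2.0
            analytic = np.fft.ifft(X * h)
            amplitude = np.abs(analytic)
            phase = np.unwrap(np.angle(analytic))
            n_cycles = (phase[-1] - phase[0]) / (2.0 * np.pi)
            return amplitude, phase, n_cycles

        # toy 1.5 Hz decaying motion: roughly 1.5 Hz * 20 s = ~30 cycles expected
        t = np.arange(0.0, 20.0, 0.01)
        record = np.exp(-0.1 * t) * np.sin(2.0 * np.pi * 1.5 * t)
        amp, phase, cycles = envelope_and_phase(record)
        print(cycles)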

  9. A Deterministic Approach to Earthquake Prediction

    Directory of Open Access Journals (Sweden)

    Vittorio Sgrigna

    2012-01-01

    Full Text Available The paper aims at giving suggestions for a deterministic approach to investigate possible earthquake prediction and warning. A fundamental contribution can come from observations and physical modeling of earthquake precursors, aiming at seeing the earthquake phenomenon in perspective within the framework of a unified theory able to explain the causes of its genesis and the dynamics, rheology and microphysics of its preparation, occurrence, postseismic relaxation and interseismic phases. Studies based on combined ground and space observations of earthquake precursors are essential to address the issue. Unfortunately, up to now, what is lacking is the demonstration of a causal relationship (with explained physical processes) and a correlation between data gathered simultaneously and continuously by space observations and ground-based measurements. In doing this, modern and/or new methods and technologies have to be adopted to try to solve the problem. Coordinated space- and ground-based observations imply available test sites on the Earth's surface to correlate ground data, collected by appropriate networks of instruments, with space data detected on board Low-Earth-Orbit (LEO) satellites. Moreover, a strong new theoretical scientific effort is necessary to try to understand the physics of the earthquake.

  10. 78 FR 19004 - Scientific Earthquake Studies Advisory Committee

    Science.gov (United States)

    2013-03-28

    ... Hazards Program. Focus topics for this meeting include induced seismicity, earthquake early warning and... DEPARTMENT OF THE INTERIOR U.S. Geological Survey [GX13GG009950000] Scientific Earthquake Studies... Law 106-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next meeting...

  11. The role of post-earthquake structural safety in pre-earthquake retrofit decisions: guidelines and applications

    International Nuclear Information System (INIS)

    Bazzurro, P.; Telleen, K.; Maffei, J.; Yin, J.; Cornell, C.A.

    2009-01-01

    Critical structures such as hospitals, police stations, local administrative office buildings, and critical lifeline facilities are expected to be operational immediately after earthquakes. Any rational decision about whether these structures are strong enough to meet this goal, or whether pre-emptive retrofitting is needed, cannot be made without an explicit consideration of post-earthquake safety and functionality with respect to aftershocks. Advanced Seismic Assessment Guidelines offer an improvement over previous methods for seismic evaluation of buildings where post-earthquake safety and usability is a concern. This new method allows engineers to evaluate the likelihood that a structure may have restricted access or no access after an earthquake. The building performance is measured in terms of the post-earthquake occupancy classifications Green Tag, Yellow Tag, and Red Tag, defining these performance levels quantitatively, based on the structure's remaining capacity to withstand aftershocks. These color-coded placards, which constitute an established practice in the US, could be replaced by the standard results of inspections (A to E) performed by the Italian Dept. of Civil Protection after an event. The article also shows some applications of these Guidelines to buildings of the largest utility company in California, Pacific Gas and Electric Company (PGE). [it

  12. Tilt Precursors before Earthquakes on the San Andreas Fault, California.

    Science.gov (United States)

    Johnston, M J; Mortensen, C E

    1974-12-13

    An array of 14 biaxial shallow-borehole tiltmeters (at 10^-7 radian sensitivity) has been installed along 85 kilometers of the San Andreas fault during the past year. Earthquake-related changes in tilt have been simultaneously observed on up to four independent instruments. At earthquake distances greater than 10 earthquake source dimensions, there are few clear indications of tilt change. For the four instruments with the longest records (> 10 months), 26 earthquakes have occurred since July 1973 with at least one instrument closer than 10 source dimensions and 8 earthquakes with more than one instrument within that distance. Precursors in tilt direction have been observed before more than 10 earthquakes or groups of earthquakes, and no similar effect has yet been seen without the occurrence of an earthquake.

  13. The Differences in Source Dynamics Between Intermediate-Depth and Deep EARTHQUAKES:A Comparative Study Between the 2014 Rat Islands Intermediate-Depth Earthquake and the 2015 Bonin Islands Deep Earthquake

    Science.gov (United States)

    Twardzik, C.; Ji, C.

    2015-12-01

    It has been proposed that the mechanisms for intermediate-depth and deep earthquakes might be different. While previous extensive seismological studies suggested that such potential differences do not significantly affect the scaling relationships of earthquake parameters, there have been only a few investigations of their dynamic characteristics, especially fracture energy. In this work, the 2014 Mw 7.9 Rat Islands intermediate-depth (105 km) earthquake and the 2015 Mw 7.8 Bonin Islands deep (680 km) earthquake are studied from two different perspectives. First, their kinematic rupture models are constrained using teleseismic body waves. Our analysis reveals that the Rat Islands earthquake broke the entire cold core of the subducting slab, defined by the depth of the 650°C isotherm. The inverted stress drop is 4 MPa, comparable to that of intra-plate earthquakes at shallow depths. On the other hand, the kinematic rupture model of the Bonin Islands earthquake, which occurred in a region lacking seismicity for the past forty years according to the GCMT catalog, exhibits an energetic rupture within a 35 km by 30 km slip patch and a high stress drop of 24 MPa. It is of interest to note that although complex rupture patterns are allowed to match the observations, the inverted slip distributions of these two earthquakes are simple enough to be approximated as the summation of a few circular/elliptical slip patches. Thus, we subsequently investigate their dynamic rupture models. We use a simple modelling approach in which we assume that the dynamic rupture propagation obeys a slip-weakening friction law, and we describe the distribution of stress and friction on the fault as a set of elliptical patches. We constrain the three dynamic parameters, namely the yield stress, the background stress prior to rupture and the slip-weakening distance, as well as the shape of the elliptical patches, directly from teleseismic body-wave observations. The study would help us
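
    For reference, the slip-weakening friction law assumed in such dynamic models relates the three inverted parameters through a simple expression. The sketch below evaluates the linear version of the law and the corresponding fracture (breakdown) energy G = 0.5*(tau_y - tau_f)*Dc, with purely illustrative numbers rather than values inverted in the study.

        import numpy as np

        def slip_weakening_stress(slip, tau_yield, tau_final, d_c):
            """Linear slip-weakening law: shear stress falls from the yield stress to the
            final (frictional) stress over the slip-weakening distance d_c, then stays flat."""
            slip = np.asarray(slip, dtype=float)
            return tau_yield - (tau_yield - tau_final) * np.clip(slip / d_c, 0.0, 1.0)

        def fracture_energy(tau_yield, tau_final, d_c):
            # breakdown work per unit area for the linear law
            return 0.5 * (tau_yield - tau_final) * d_c

        # illustrative values only (MPa and metres)
        print(slip_weakening_stress([0.0, 0.25, 1.0], tau_yield=30.0, tau_final=6.0, d_c=0.5))
        print(fracture_energy(tau_yield=30.0, tau_final=6.0, d_c=0.5))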

  14. Earthquakes Threaten Many American Schools

    Science.gov (United States)

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  15. Solar eruptions - soil radon - earthquakes

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time a new natural phenomenon has been established: a contrasting increase in the soil radon level under the influence of solar flares. Such an increase is one of the geochemical indicators of earthquakes. Most researchers consider this a phenomenon of exclusively terrestrial processes. Investigations of the link between earthquakes and solar activity carried out during the last decade in different countries are based on the analysis of statistical data ΣE(t) and W(t). As established, the overall seismicity of the Earth and of its separate regions depends on the 11-year cycle of solar activity. The data provided in the paper, based on experimental studies, serve as a first step toward revealing cause-and-effect solar-terrestrial bonds in the series solar eruption - lithosphere radon - earthquakes; they need further collection of experimental data. For the first time, through the radon constituent of terrestrial radiation, an objectification has been made of the elementary lattice of the Hartmann network contoured by the biolocation method. As found, radon concentration variations in the Hartmann network nodes determine the dynamics of solar-terrestrial relationships. Of the three types of rapidly running processes conditioned by solar-terrestrial bonds, earthquakes are attributed to rapidly running destructive processes that occur most intensely at the junctures of tectonic massifs, along transform and deep faults. The basic factors provoking the earthquakes are both magnetic-structural effects and a long-term (over 5 months) bombardment of the surface of the lithosphere by highly energetic particles of corpuscular solar flows, this being proved by photometry. As a result of the solar flares that occurred from 29 October to 4 November 2003, a sharply contrasting increase in soil radon was established, which is an earthquake indicator on the territory of Yerevan City. A month and a half later, earthquakes occurred in San Francisco, Iran, Turkey

  16. Do I Really Sound Like That? Communicating Earthquake Science Following Significant Earthquakes at the NEIC

    Science.gov (United States)

    Hayes, G. P.; Earle, P. S.; Benz, H.; Wald, D. J.; Yeck, W. L.

    2017-12-01

    The U.S. Geological Survey's National Earthquake Information Center (NEIC) responds to about 160 magnitude 6.0 and larger earthquakes every year and is regularly inundated with information requests following earthquakes that cause significant impact. These requests often start within minutes after the shaking occurs and come from a wide user base including the general public, media, emergency managers, and government officials. Over the past several years, the NEIC's earthquake response has evolved its communications strategy to meet the changing needs of users and the evolving media landscape. The NEIC produces a cascade of products starting with basic hypocentral parameters and culminating with estimates of fatalities and economic loss. We speed the delivery of content by prepositioning and automatically generating products such as aftershock plots, regional tectonic summaries, maps of historical seismicity, and event summary posters. Our goal is to have information immediately available so we can quickly address the response needs of a particular event or sequence. This information is distributed to hundreds of thousands of users through social media, email alerts, programmatic data feeds, and webpages. Many of our products are included in event summary posters that can be downloaded and printed for local display. After significant earthquakes, keeping up with direct inquiries and interview requests from TV, radio, and print reporters is always challenging. The NEIC works with the USGS Office of Communications and the USGS Science Information Services to organize and respond to these requests. Written executive summary reports are produced and distributed to USGS personnel and collaborators throughout the country. These reports are updated during the response to keep our message consistent and information up to date. This presentation will focus on communications during NEIC's rapid earthquake response but will also touch on the broader USGS traditional and

  17. Earthquake safety program at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Freeland, G.E.

    1985-01-01

    Within three minutes on the morning of January 24, 1980, an earthquake and three aftershocks, with Richter magnitudes of 5.8, 5.1, 4.0, and 4.2, respectively, struck the Livermore Valley. Two days later, a Richter magnitude 5.4 earthquake occurred, which had its epicenter about 4 miles northwest of the Lawrence Livermore National Laboratory (LLNL). Although no one at the Lab was seriously injured, these earthquakes caused considerable damage and disruption. Masonry and concrete structures cracked and broke, trailers shifted and fell off their pedestals, office ceilings and overhead lighting fell, and bookcases overturned. The Laboratory was suddenly immersed in a site-wide program of repairing earthquake-damaged facilities, and protecting our many employees and the surrounding community from future earthquakes. Over the past five years, LLNL has spent approximately $10 million on its earthquake restoration effort for repairs and upgrades. The discussion in this paper centers upon the earthquake damage that occurred, the clean-up and restoration efforts, the seismic review of LLNL facilities, our site-specific seismic design criteria, computer-floor upgrades, ceiling-system upgrades, unique building seismic upgrades, geologic and seismologic studies, and seismic instrumentation. 10 references

  18. A contrast study of the traumatic condition between the wounded in 5.12 Wenchuan earthquake and 4.25 Nepal earthquake.

    Science.gov (United States)

    Ding, Sheng; Hu, Yonghe; Zhang, Zhongkui; Wang, Ting

    2015-01-01

    The 5.12 Wenchuan earthquake and the 4.25 Nepal earthquake were of similar magnitude, but the climate and geographic environment were totally different. Our team carried out medical rescue in both disasters, so we compared the traumatic conditions of the wounded in the two earthquakes. The clinical data of the wounded in the 5.12 Wenchuan earthquake and the 4.25 Nepal earthquake rescued by Chengdu Military General Hospital were retrospectively analyzed. A contrast study between the wounded was then conducted in terms of age, sex, injury mechanisms, traumatic conditions, complications and prognosis. Three days after the 5.12 Wenchuan earthquake, 465 cases of the wounded were hospitalized in Chengdu Military General Hospital, including 245 males (52.7%) and 220 females (47.3%) with an average age of (47.6±22.7) years. Our team carried out humanitarian relief in Katmandu after the 4.25 Nepal earthquake. Three days after this disaster, 71 cases were treated in our field hospital, including 37 males (52.1%) and 34 females (47.9%) with a mean age of (44.8±22.9) years. There was no obvious difference in sex and mean age between the two groups, but the age distribution was somewhat different: there were more wounded people over 60 years of age in the 4.25 Nepal earthquake (p<0.05). The two earthquakes also differed in injury mechanism, with a higher rate of bruise and crush injuries in one and a higher rate of falling injuries in the other (p<0.05). Compared with the 5.12 Wenchuan earthquake, the 4.25 Nepal earthquake had a much higher incidence of limb fractures (p<0.05). Earthquakes of similar magnitude can cause different injury mechanisms, traumatic conditions and complications in the wounded under different climate and geographic environments. When an earthquake occurs in a poor-traffic area of high altitude and large temperature differences, early medical rescue, injury control and wounded evacuation as well as sufficient warmth retention and food supply are of vital significance.

  19. Earthquake free design of pipe lines

    International Nuclear Information System (INIS)

    Kurihara, Chizuko; Sakurai, Akio

    1974-01-01

    Long structures such as the cooling sea water pipe lines of nuclear power plants extend over a wide range along the ground surface and are subjected during earthquakes not only to inertia forces but also to forces due to ground deformations and seismic wave propagation. Since previous reports dealt with the earthquake free design of underground pipe lines, this report discusses the behavior of pipe lines on the ground during earthquakes and proposes an aseismic design of pipe lines that considers the effects of both inertia forces and ground deformations. (author)

  20. Toward standardization of slow earthquake catalog -Development of database website-

    Science.gov (United States)

    Kano, M.; Aso, N.; Annoura, S.; Arai, R.; Ito, Y.; Kamaya, N.; Maury, J.; Nakamura, M.; Nishimura, T.; Obana, K.; Sugioka, H.; Takagi, R.; Takahashi, T.; Takeo, A.; Yamashita, Y.; Matsuzawa, T.; Ide, S.; Obara, K.

    2017-12-01

    Slow earthquakes have now been widely discovered in the world based on the recent development of geodetic and seismic observations. Many researchers detect a wide frequency range of slow earthquakes, including low frequency tremors, low frequency earthquakes, very low frequency earthquakes and slow slip events, by using various methods. Catalogs of the detected slow earthquakes are available in different formats in the respective papers or through websites (e.g., Wech 2010; Idehara et al. 2014). However, we need to download catalogs from different sources, to deal with unformatted catalogs and to understand the characteristics of the different catalogs, which may be somewhat complex, especially for those who are not familiar with slow earthquakes. In order to standardize slow earthquake catalogs and to make such complicated work easier, the Scientific Research on Innovative Areas project "Science of Slow Earthquakes" has been developing a slow earthquake catalog website. On the website, we can plot locations of various slow earthquakes via Google Maps by compiling a variety of slow earthquake catalogs, including slow slip events. This enables us to clearly visualize spatial relations among slow earthquakes at a glance and to compare the regional activities of slow earthquakes or the locations of different catalogs. In addition, we can download catalogs in a unified format and refer to the information on each catalog on a single website. Such standardization will make it more convenient for users to utilize previous achievements and will promote research on slow earthquakes, which eventually leads to collaborations with researchers in various fields and further understanding of the mechanisms, environmental conditions, and underlying physics of slow earthquakes. Furthermore, we expect that the website will play a leading role in the international standardization of slow earthquake catalogs. We report the overview of the website and the progress of construction. Acknowledgment: This
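
    As a rough illustration of the standardization idea described above, the sketch below maps two hypothetical catalogs with different headers onto one unified schema using pandas; the column names and the unified format here are assumptions, not the website's actual specification.

```python
# Hedged sketch: merging heterogeneous slow-earthquake catalogs into a single,
# unified table. The source column names and the unified schema are invented
# for illustration; they are not the formats used by the catalog website.
import pandas as pd

UNIFIED_COLUMNS = ["time", "latitude", "longitude", "depth_km", "event_type", "source_catalog"]

def to_unified(df, column_map, event_type, source_name):
    """Rename source-specific columns to the unified schema and tag provenance."""
    out = df.rename(columns=column_map)[["time", "latitude", "longitude", "depth_km"]].copy()
    out["event_type"] = event_type          # e.g. "tremor", "VLFE", "SSE"
    out["source_catalog"] = source_name
    out["time"] = pd.to_datetime(out["time"])
    return out[UNIFIED_COLUMNS]

# Two hypothetical catalogs with different headers:
tremor = pd.DataFrame({"origin_time": ["2016-04-01T02:00:00"], "lat": [33.1],
                       "lon": [136.6], "dep": [35.0]})
sse = pd.DataFrame({"start": ["2016-03-20"], "latitude": [33.4],
                    "longitude": [136.2], "depth": [30.0]})

merged = pd.concat([
    to_unified(tremor, {"origin_time": "time", "lat": "latitude",
                        "lon": "longitude", "dep": "depth_km"}, "tremor", "catalog_A"),
    to_unified(sse, {"start": "time", "depth": "depth_km"}, "SSE", "catalog_B"),
], ignore_index=True)
print(merged)
```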

  1. Stress triggering and the Canterbury earthquake sequence

    Science.gov (United States)

    Steacy, Sandy; Jiménez, Abigail; Holden, Caroline

    2014-01-01

    The Canterbury earthquake sequence, which includes the devastating Christchurch event of 2011 February, has to date led to losses of around 40 billion NZ dollars. The location and severity of the earthquakes were a surprise to most inhabitants, as the seismic hazard model was dominated by an expected Mw > 8 earthquake on the Alpine fault and an Mw 7.5 earthquake on the Porters Pass fault, 150 and 80 km to the west of Christchurch. The sequence to date has included an Mw = 7.1 earthquake and 3 Mw ≥ 5.9 events which migrated from west to east. Here we investigate whether the later events are consistent with stress triggering and whether a simple stress map produced shortly after the first earthquake would have accurately indicated the regions where the subsequent activity occurred. We find that 100 per cent of M > 5.5 earthquakes occurred in positive stress areas computed using a slip model for the first event that was available within 10 d of its occurrence. We further find that the stress changes at the starting points of major slip patches of post-Darfield main events are consistent with triggering, although this is not always true at the hypocentral locations. Our results suggest that Coulomb stress changes contributed to the evolution of the Canterbury sequence, and we note additional areas of increased stress in the Christchurch region and on the Porters Pass fault.
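
    For readers unfamiliar with the metric used above, the minimal sketch below shows how a Coulomb failure stress change is typically computed from shear and normal stress changes resolved on a receiver fault. The friction coefficient and stress values are placeholders, and the elastic dislocation modelling that produces the resolved stresses (and the slip model used in the study) is not reproduced.

```python
# Minimal sketch of the Coulomb failure stress change used in stress-triggering
# studies: dCFS = d_tau + mu_eff * d_sigma_n, with d_sigma_n positive for
# unclamping. The resolved stress changes would normally come from an elastic
# dislocation model (e.g., Okada-type solutions), which is not reproduced here.

def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    """d_tau: shear stress change in the receiver's slip direction (Pa);
    d_sigma_n: normal stress change, positive = unclamping (Pa);
    mu_eff: effective friction coefficient (dimensionless)."""
    return d_tau + mu_eff * d_sigma_n

# Hypothetical resolved stress changes at two receiver points (Pa):
for name, dt, dn in [("point A", 0.05e6, -0.02e6), ("point B", -0.03e6, 0.01e6)]:
    print(name, "dCFS =", coulomb_stress_change(dt, dn) / 1e6, "MPa")
```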

  2. USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization

    Science.gov (United States)

    Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.

    2012-12-01

    The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a
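
    The detection idea, flagging a sudden jump in the rate of "earthquake" tweets, can be illustrated with a generic short-term/long-term average comparison. The sketch below is a toy detector with invented window lengths and thresholds; it is not the actual USGS TED algorithm.

```python
# Toy keyword-rate spike detector (STA/LTA style). This is NOT the USGS TED
# algorithm, only an illustration of flagging a sudden increase in the number
# of "earthquake" tweets per time bin.
from collections import deque

def spike_detector(counts_per_minute, short_win=2, long_win=60,
                   ratio_threshold=10.0, min_count=20):
    """Yield (minute index, short-term mean, long-term mean) whenever the
    short-term mean rate exceeds ratio_threshold times the long-term mean."""
    short_q, long_q = deque(maxlen=short_win), deque(maxlen=long_win)
    for i, c in enumerate(counts_per_minute):
        short_q.append(c)
        long_q.append(c)
        if len(long_q) == long_win:
            sta = sum(short_q) / len(short_q)
            lta = sum(long_q) / len(long_q)
            if lta > 0 and sta / lta >= ratio_threshold and sta >= min_count:
                yield i, sta, lta

# Hypothetical per-minute tweet counts with a felt event near the end:
counts = [2, 3, 1, 2, 2] * 12 + [2, 150, 400]
for minute, sta, lta in spike_detector(counts):
    print(f"possible felt event at minute {minute}: STA={sta:.1f}, LTA={lta:.1f}")
```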

  3. Public perceptions and acceptance of induced earthquakes related to energy development

    International Nuclear Information System (INIS)

    McComas, Katherine A.; Lu, Hang; Keranen, Katie M.; Furtney, Maria A.; Song, Hwansuck

    2016-01-01

    Growing awareness of the potential for some energy-related activities to induce earthquakes has created a need to understand how the public evaluates the risks of induced earthquakes versus the benefits of energy development. To address this need, this study presents a web survey that used a between-subjects factorial experimental design to explore the views of 325 U.S. adults, who were asked about their experiences with earthquakes; risk perceptions related to different causes of earthquakes (e.g., natural versus induced); and acceptability of earthquakes depending on the benefits, beneficiaries, and decision making process. The results found that participants had more negative feelings toward induced versus naturally occurring earthquakes. Although they judged no earthquake as “acceptable,” participants rated induced earthquakes significantly less acceptable than naturally occurring ones. Attributing the benefits to the provision of renewable energy or climate change mitigation did not increase induced earthquake acceptability, and no particular beneficiary made earthquakes more acceptable, although private companies as beneficiaries made earthquakes less acceptable. Finally, induced earthquake acceptability was significantly higher when people believed that people like them had a voice in the decision to implement the technology that caused the earthquake, underscoring the importance of public engagement in the development of energy technologies. - Highlights: • Human induced earthquakes were perceived as more negative than natural earthquakes. • Attributing benefits to renewable energy did not increase earthquake acceptability. • Acceptability was highest after a procedurally fair decision making process. • Acceptability was lowest following an expert-driven decision.

  4. Simulation analysis of earthquake response of nuclear power plant to the 2003 Miyagi-Oki earthquake

    International Nuclear Information System (INIS)

    Yoshihiro Ogata; Kiyoshi Hirotani; Masayuki Higuchi; Shingo Nakayama

    2005-01-01

    On May 26, 2003 an earthquake of magnitude 7.1 (Japan Meteorological Agency scale) occurred just offshore of Miyagi Prefecture. This was the largest earthquake ever experienced by the nuclear power plant of Tohoku Electric Power Co. in Onagawa (hereafter the Onagawa Nuclear Power Plant) during the 19 years since it started operation in 1984. In this report, we review the vibration characteristics of the reactor building of Onagawa Nuclear Power Plant Unit 1 based on acceleration records observed at the building, and give an account of a simulation analysis of the earthquake response carried out to ascertain the appropriateness of the design procedure and the seismic safety of the building. (authors)

  5. Simulation of the earthquake-induced collapse of a school building in Turkey in 2011 Van Earthquake

    NARCIS (Netherlands)

    Bal, Ihsan Engin; Smyrou, Eleni

    2016-01-01

    Collapses of school or dormitory buildings experienced in recent earthquakes raise the issue of safety as a major challenge for decision makers. A school building is 'just another structure', technically speaking; however, the consequences of a collapse in an earthquake could lead to social reactions

  6. Fault Branching and Long-Term Earthquake Rupture Scenario for Strike-Slip Earthquake

    Science.gov (United States)

    Klinger, Y.; CHOI, J. H.; Vallage, A.

    2017-12-01

    Careful examination of surface rupture for large continental strike-slip earthquakes reveals that for the majority of earthquakes, at least one major branch is involved in the rupture pattern. Often, branching might be either related to the location of the epicenter or located toward the end of the rupture, and possibly related to the stopping of the rupture. In this work, we examine large continental earthquakes that show significant branches at different scales and for which ground surface rupture has been mapped in great detail. In each case, rupture conditions are described, including dynamic parameters, past earthquake history, and regional stress orientation, to see if the dynamic stress field would a priori favor branching. In one case we show that rupture propagation and branching are directly impacted by preexisting geological structures. These structures serve as pathways for the rupture attempting to propagate out of its shear plane. At larger scale, we show that in some cases, rupturing a branch might be systematic, hampering possibilities for the development of a larger seismic rupture. Long-term geomorphology hints at the existence of a strong asperity in the zone where the rupture branched off the main fault. There, no evidence of throughgoing rupture could be seen along the main fault, while the branch is well connected to the main fault. This set of observations suggests that for specific configurations, some rupture scenarios involving systematic branching are more likely than others.

  7. Earthquake location in island arcs

    Science.gov (United States)

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high
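
    As background to the location problem discussed above, the toy sketch below implements a Geiger-style linearized least-squares location in a uniform-velocity half-space, i.e. exactly the kind of laterally homogeneous assumption whose bias the study quantifies. The network geometry and velocity are hypothetical; the study's three-dimensional ray tracing and JHD procedures are not reproduced.

```python
# Toy Geiger-style hypocenter location with straight rays in a uniform medium.
# This illustrates the spherically symmetric / laterally homogeneous assumption
# discussed above; it is not the study's ray-tracing or JHD method.
import numpy as np

VP = 6.5  # assumed uniform P velocity, km/s

def travel_time(hypo, stations):
    """Straight-ray travel times (s) from hypocenter [x, y, z, t0] to surface stations (km)."""
    d = np.linalg.norm(stations - hypo[:3], axis=1)
    return hypo[3] + d / VP

def locate(stations, observed_tt, initial=(0.0, 0.0, 10.0, 0.0), n_iter=10):
    hypo = np.array(initial, dtype=float)
    for _ in range(n_iter):
        d = np.linalg.norm(stations - hypo[:3], axis=1)
        # Partial derivatives of travel time with respect to x, y, z and origin time:
        G = np.hstack([-(stations - hypo[:3]) / (VP * d[:, None]),
                       np.ones((len(stations), 1))])
        residuals = observed_tt - travel_time(hypo, stations)
        dm, *_ = np.linalg.lstsq(G, residuals, rcond=None)
        hypo += dm
    return hypo

# Hypothetical network and a synthetic event at (5, -3, 40) km, origin time 0 s:
stations = np.array([[0, 0, 0], [30, 5, 0], [-20, 25, 0], [10, -35, 0], [-15, -15, 0]], float)
true_hypo = np.array([5.0, -3.0, 40.0, 0.0])
print("recovered hypocenter:", np.round(locate(stations, travel_time(true_hypo, stations)), 2))
```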

  8. Application of τc*Pd in earthquake early warning

    Science.gov (United States)

    Huang, Po-Lun; Lin, Ting-Li; Wu, Yih-Min

    2015-03-01

    Rapid assessment of the damage potential and size of an earthquake at the station is in high demand for onsite earthquake early warning. We study the application of τc*Pd to estimating earthquake size using 123 events recorded by the borehole stations of KiK-net in Japan. The measure of earthquake size determined by τc*Pd is more closely related to the damage potential. We find that τc*Pd provides another parameter for measuring the size of an earthquake and a threshold for warning of strong ground motion.
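
    The parameters named above are commonly computed from the first few seconds of the P wave. The sketch below shows one standard formulation of τc and Pd (a 3-second window on high-pass-filtered displacement and velocity), applied to a synthetic wavelet; the preprocessing details and any regression against magnitude used in the study are not reproduced.

```python
# Hedged sketch of the onsite EEW parameters tau_c and Pd computed from the
# first few seconds of the P wave. Real records would first be integrated from
# acceleration and high-pass filtered; that preprocessing is assumed here.
import numpy as np

def tau_c_and_pd(displacement, velocity, dt, window_sec=3.0):
    """displacement, velocity: arrays starting at the P arrival (m, m/s); dt: sample interval (s)."""
    n = int(window_sec / dt)
    u = np.asarray(displacement[:n])
    v = np.asarray(velocity[:n])
    r = np.trapz(v ** 2, dx=dt) / np.trapz(u ** 2, dx=dt)  # squared predominant angular frequency
    tau_c = 2.0 * np.pi / np.sqrt(r)
    pd = np.max(np.abs(u))
    return tau_c, pd

# Synthetic 1 Hz damped wavelet as a stand-in for a real P-wave record:
dt = 0.01
t = np.arange(0.0, 3.0, dt)
u = 1e-3 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-t)
v = np.gradient(u, dt)
tc, pd = tau_c_and_pd(u, v, dt)
print(f"tau_c = {tc:.2f} s, Pd = {pd * 100:.3f} cm, tau_c*Pd = {tc * pd:.2e} m*s")
```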

  9. Environmental Health assessment 200 Days after Earthquake-Affected Region in East Azerbaijan Earthquake, North-Western of Iran, 2012

    Directory of Open Access Journals (Sweden)

    Alihossein Zeinalzadeh

    2017-04-01

    Full Text Available Evaluating health status and exploring the health problems that threaten human life following disasters and major earthquakes provides a window of opportunity for health care providers in future disaster planning. The main purpose of this report was to survey the environmental sanitation status of the affected populations 200 days after the 2012 East Azerbaijan earthquakes, north-western Iran. The survey was carried out in the earthquake zones 200 days after the occurrence of the earthquake. A single-stage cluster sample was selected from among 95 villages damaged in the 2012 East Azerbaijan earthquake in the three towns of Ahar, Varzeghan and Heris. The data were collected with a questionnaire, site visits and evaluation of water and sanitation. In the East Azerbaijan twin earthquakes, 399 villages of the Ahar, Varzeghan, Heris, Tabriz and Kaleibar districts were affected and 356 (89.2%) villages were damaged to between 30 and 100%. Evaluation of the water and sanitation infrastructure after 200 days showed that only half of these villages consumed safe water with high and adequate coverage. Half of the villages were covered by safe (treated) drinking water 200 days after the earthquake. The bacteriological quality of the drinking-water supply of the affected area was assessed in 146 randomly collected samples from this region, and ten (6.8%) were reported as unsuitable. Solid waste management facilities in the settlements were not acceptable, which affects public health. Solid waste disposal was carried out by the district residents themselves (with resident cooperation rates of 68.4%, 36.8% and 76.3% in Ahar, Varzeghan and Heris, respectively). Overall, the impact of infectious and communicable diseases after the earthquake was reported in 42.1% (16 villages) in Varzeghan. The lack of a geographical overview focused on mountainous and rural areas, partial support and the dispersion of earthquake-stricken people among the affected villages, and lack of participatory need

  10. Leveraging geodetic data to reduce losses from earthquakes

    Science.gov (United States)

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.

    2018-04-23

    Seismic hazard assessments that are based on a variety of data and the best available science, coupled with rapid synthesis of real-time information from continuous monitoring networks to guide post-earthquake response, form a solid foundation for effective earthquake loss reduction. With this in mind, the Earthquake Hazards Program (EHP) of the U.S. Geological Survey (USGS) Natural Hazards Mission Area (NHMA) engages in a variety of undertakings, both established and emergent, in order to provide high quality products that enable stakeholders to take action in advance of and in response to earthquakes. Examples include the National Seismic Hazard Model (NSHM), development of tools for improved situational awareness such as earthquake early warning (EEW) and operational earthquake forecasting (OEF), research about induced seismicity, and new efforts to advance comprehensive subduction zone science and monitoring. Geodetic observations provide unique and complementary information directly relevant to advancing many aspects of these efforts (fig. 1). EHP scientists have long leveraged geodetic data for a range of influential studies, and they continue to develop innovative observation and analysis methods that push the boundaries of the field of geodesy as applied to natural hazards research. Given the ongoing, rapid improvement in availability, variety, and precision of geodetic measurements, considering ways to fully utilize this observational resource for earthquake loss reduction is timely and essential. This report presents strategies, and the underlying scientific rationale, by which the EHP could achieve the following outcomes: The EHP is an authoritative source for the interpretation of geodetic data and its use for earthquake loss reduction throughout the United States and its territories. The USGS consistently provides timely, high quality geodetic data to stakeholders. Significant earthquakes are better characterized by incorporating geodetic data into USGS

  11. [Medium- and long-term health effects of the L'Aquila earthquake (Central Italy, 2009) and of other earthquakes in high-income Countries: a systematic review].

    Science.gov (United States)

    Ripoll Gallardo, Alba; Alesina, Marta; Pacelli, Barbara; Serrone, Dario; Iacutone, Giovanni; Faggiano, Fabrizio; Della Corte, Francesco; Allara, Elias

    2016-01-01

    to compare the methodological characteristics of the studies investigating the middle- and long-term health effects of the L'Aquila earthquake with the features of studies conducted after other earthquakes occurred in high-income Countries. a systematic comparison between the studies which evaluated the health effects of the L'Aquila earthquake (Central Italy, 6th April 2009) and those conducted after other earthquakes occurred in comparable settings. Medline, Scopus, and 6 sources of grey literature were systematically searched. Inclusion criteria comprised measurement of health outcomes at least one month after the earthquake, investigation of earthquakes occurred in high-income Countries, and presence of at least one temporal or geographical control group. out of 2,976 titles, 13 studies regarding the L'Aquila earthquake and 51 studies concerning other earthquakes were included. The L'Aquila and the Kobe/Hanshin-Awaji (Japan, 17th January 1995) earthquakes were the most investigated. Studies on the L'Aquila earthquake had a median sample size of 1,240 subjects, a median duration of 24 months, and used most frequently a cross sectional design (7/13). Studies on other earthquakes had a median sample size of 320 subjects, a median duration of 15 months, and used most frequently a time series design (19/51). the L'Aquila studies often focussed on mental health, while the earthquake effects on mortality, cardiovascular outcomes, and health systems were less frequently evaluated. A more intensive use of routine data could benefit future epidemiological surveillance in the aftermath of earthquakes.

  12. Time-dependent earthquake probability calculations for southern Kanto after the 2011 M9.0 Tohoku earthquake

    Science.gov (United States)

    Nanjo, K. Z.; Sakai, S.; Kato, A.; Tsuruoka, H.; Hirata, N.

    2013-05-01

    Seismicity in southern Kanto was activated by the 2011 March 11 Tohoku earthquake of magnitude M9.0, but does this cause a significant difference in the probability of more earthquakes at present or in the future? To answer this question, we examine the effect of a change in the seismicity rate on the probability of earthquakes. Our data set is from the Japan Meteorological Agency earthquake catalogue, downloaded on 2012 May 30. Our approach is based on time-dependent earthquake probability calculations, often used for aftershock hazard assessment, and relies on two statistical laws: the Gutenberg-Richter (GR) frequency-magnitude law and the Omori-Utsu (OU) aftershock-decay law. We first confirm that the seismicity following a quake of M4 or larger is well modelled by the GR law with b ˜ 1. Then, there is good agreement with the OU law with p ˜ 0.5, which indicates a notably slow decay. Based on these results, we then calculate the most probable estimates of future M6-7-class events for various periods, all with a starting date of 2012 May 30. The estimates are higher than pre-quake levels if we consider a period of 3-yr duration or shorter. However, for statistics-based forecasting such as this, errors that arise from parameter estimation must be considered. Taking into account the contribution of these errors to the probability calculations, we conclude that any increase in the probability of earthquakes is insignificant. Although we try to avoid overstating the change in probability, our observations combined with results from previous studies support the likelihood that afterslip (fault creep) in southern Kanto will slowly relax a stress step caused by the Tohoku earthquake. This afterslip in turn reminds us of the potential for stress redistribution to the surrounding regions. We note the importance of varying hazards not only in time but also in space to improve the probabilistic seismic hazard assessment for southern Kanto.
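
    The probability calculation described above combines the Gutenberg-Richter and Omori-Utsu laws in the usual Reasenberg-and-Jones form; the sketch below shows that combination with an analytic time integral. All parameter values are placeholders rather than the estimates obtained for the Kanto data, and the parameter-estimation errors that the authors emphasize are not propagated here.

```python
# Hedged sketch of a Reasenberg-and-Jones-style probability calculation that
# combines the Gutenberg-Richter and Omori-Utsu laws. Parameter values are
# placeholders, not the estimates obtained in the study.
import math

def expected_number(t1, t2, mag, mainshock_mag, a=-4.0, b=1.0, c=0.05, p=0.5):
    """Expected number of events with magnitude >= mag between t1 and t2 days after the mainshock."""
    k = 10 ** (a + b * (mainshock_mag - mag))
    if abs(p - 1.0) < 1e-9:
        integral = math.log((t2 + c) / (t1 + c))
    else:
        integral = ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
    return k * integral

def prob_one_or_more(t1, t2, mag, mainshock_mag, **kw):
    """Poisson probability of at least one qualifying event in [t1, t2]."""
    return 1.0 - math.exp(-expected_number(t1, t2, mag, mainshock_mag, **kw))

# Placeholder example: an M >= 7 event in a 3-year window starting ~440 days
# after a hypothetical M9.0 mainshock, using the slow decay p = 0.5 noted above.
print(prob_one_or_more(440.0, 440.0 + 3 * 365.0, mag=7.0, mainshock_mag=9.0))
```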

  13. Electrostatically actuated resonant switches for earthquake detection

    KAUST Repository

    Ramini, Abdallah H.

    2013-04-01

    The modeling and design of electrostatically actuated resonant switches (EARS) for earthquake and seismic applications are presented. The basic concept is to operate an electrically actuated resonator close to instability bands of frequency, where it is forced to collapse (pull in) if operated within these bands. By careful tuning, the resonator can be made to enter the instability zone upon detection of the earthquake signal, thereby pulling in as a switch. Such a switching action can be exploited for useful functions, such as shutting off gas pipelines in the case of earthquakes, or can be used to activate a network of sensors for seismic activity recording in health monitoring applications. By placing a resonator on a printed circuit board (PCB) with a natural frequency close to that of the earthquake's frequency, we show a significant improvement in the detection limit of the EARS, lowering it considerably to less than 60% of that of the EARS by itself without the PCB. © 2013 IEEE.

  14. Chile Earthquake: U.S. and International Response

    Science.gov (United States)

    2010-03-11

    most regions far from the epicenter did not experience any serious damage. A tsunami caused significant damage to the city of Hilo, Hawaii ... Tsunami Warning Center for Hawaii, Japan, and other regions bordering the Pacific Ocean that may have been vulnerable to a damaging tsunami, although ... earthquake. Why the 1960 earthquake generated a tsunami that caused damage and fatalities in Hawaii, Japan, and the Philippines while the 2010 earthquake did

  15. Assessment of precast beam-column using capacity demand response spectrum subject to design basis earthquake and maximum considered earthquake

    Science.gov (United States)

    Ghani, Kay Dora Abd.; Tukiar, Mohd Azuan; Hamid, Nor Hayati Abdul

    2017-08-01

    Malaysia is surrounded by the tectonic features of the Sumatera area, which consists of two seismically active inter-plate boundaries, namely the Indo-Australian and Eurasian Plates on the west and the Philippine Plate on the east. Hence, Malaysia experiences tremors from distant earthquakes occurring in Banda Aceh, Nias Island, Padang and other parts of Sumatera, Indonesia. In order to assess the safety of precast buildings in Malaysia under near-field ground motion, response spectrum analysis can be used to deal with future earthquakes whose specific nature is unknown. This paper aims to develop capacity-demand response spectra for the Design Basis Earthquake (DBE) and the Maximum Considered Earthquake (MCE) in order to assess the performance of precast beam-column joints. From the capacity-demand response spectrum analysis, it can be concluded that the precast beam-column joints would not survive when subjected to earthquake excitation with a surface-wave magnitude of more than 5.5 (Type 1 spectra). This means that a beam-column joint designed using the current code of practice (BS8110) would be severely damaged when subjected to high earthquake excitation. The capacity-demand response spectrum analysis also shows that the precast beam-column joints in the prototype studied would be severely damaged when subjected to the Maximum Considered Earthquake (MCE) with PGA = 0.22g and a surface-wave magnitude of more than 5.5 (Type 1 spectra).
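
    The demand side of such a capacity-demand comparison is usually an elastic design response spectrum. The sketch below builds a Eurocode-8-style Type 1 elastic spectrum anchored at the quoted PGA of 0.22g; the ground-type parameters (type A) and damping are assumptions, and the paper's exact demand-spectrum construction is not reproduced.

```python
# Hedged sketch of a Eurocode-8-style Type 1 elastic acceleration response
# spectrum. Ground type A corner periods are assumed (S=1.0, TB=0.15 s,
# TC=0.4 s, TD=2.0 s); eta=1.0 corresponds to 5% damping.
import numpy as np

def ec8_type1_elastic_spectrum(T, ag, S=1.0, TB=0.15, TC=0.4, TD=2.0, eta=1.0):
    """Spectral acceleration Se(T) in the same units as ag (e.g., g)."""
    T = np.asarray(T, dtype=float)
    se = np.empty_like(T)
    low = T <= TB
    plateau = (T > TB) & (T <= TC)
    falling = (T > TC) & (T <= TD)
    tail = T > TD
    se[low] = ag * S * (1.0 + T[low] / TB * (eta * 2.5 - 1.0))
    se[plateau] = ag * S * eta * 2.5
    se[falling] = ag * S * eta * 2.5 * TC / T[falling]
    se[tail] = ag * S * eta * 2.5 * TC * TD / T[tail] ** 2
    return se

periods = np.linspace(0.01, 4.0, 400)
sa = ec8_type1_elastic_spectrum(periods, ag=0.22)   # PGA = 0.22 g, as in the MCE case above
print("plateau Sa =", round(float(sa.max()), 3), "g;",
      "Sa(T=1 s) =", round(float(ec8_type1_elastic_spectrum(np.array([1.0]), 0.22)[0]), 3), "g")
```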

  16. A Comparison of Geodetic and Geologic Rates Prior to Large Strike-Slip Earthquakes: A Diversity of Earthquake-Cycle Behaviors?

    Science.gov (United States)

    Dolan, James F.; Meade, Brendan J.

    2017-12-01

    Comparison of preevent geodetic and geologic rates in three large-magnitude (Mw = 7.6-7.9) strike-slip earthquakes reveals a wide range of behaviors. Specifically, geodetic rates of 26-28 mm/yr for the North Anatolian fault along the 1999 MW = 7.6 Izmit rupture are ˜40% faster than Holocene geologic rates. In contrast, geodetic rates of ˜6-8 mm/yr along the Denali fault prior to the 2002 MW = 7.9 Denali earthquake are only approximately half as fast as the latest Pleistocene-Holocene geologic rate of ˜12 mm/yr. In the third example where a sufficiently long pre-earthquake geodetic time series exists, the geodetic and geologic rates along the 2001 MW = 7.8 Kokoxili rupture on the Kunlun fault are approximately equal at ˜11 mm/yr. These results are not readily explicable with extant earthquake-cycle modeling, suggesting that they may instead be due to some combination of regional kinematic fault interactions, temporal variations in the strength of lithospheric-scale shear zones, and/or variations in local relative plate motion rate. Whatever the exact causes of these variable behaviors, these observations indicate that either the ratio of geodetic to geologic rates before an earthquake may not be diagnostic of the time to the next earthquake, as predicted by many rheologically based geodynamic models of earthquake-cycle behavior, or different behaviors characterize different fault systems in a manner that is not yet understood or predictable.

  17. USGS Earthquake Program GPS Use Case : Earthquake Early Warning

    Science.gov (United States)

    2015-03-12

    USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...

  18. An interdisciplinary approach for earthquake modelling and forecasting

    Science.gov (United States)

    Han, P.; Zhuang, J.; Hattori, K.; Ogata, Y.

    2016-12-01

    Earthquakes are among the most serious disasters and may cause heavy casualties and economic losses. Especially in the past two decades, huge/mega earthquakes have hit many countries. Effective earthquake forecasting (including time, location, and magnitude) has therefore become extremely important and urgent. To date, various heuristically derived algorithms have been developed for forecasting earthquakes. Generally, they can be classified into two types: catalog-based approaches and non-catalog-based approaches. Thanks to the rapid development of statistical seismology in the past 30 years, we are now able to evaluate the performance of these earthquake forecast approaches quantitatively. Although a certain amount of precursory information is available in both earthquake catalogs and non-catalog observations, earthquake forecasting is still far from satisfactory. In most cases, the precursory phenomena were studied individually. An earthquake model that combines self-exciting and mutually exciting elements was developed by Ogata and Utsu from the Hawkes process. The core idea of this combined model is that the status of the event at present is controlled by the event itself (self-exciting) and all the external factors (mutually exciting) in the past. In essence, the conditional intensity function is a time-varying Poisson process with rate λ(t), which is composed of the background rate, the self-exciting term (the information from past seismic events), and the external excitation term (the information from past non-seismic observations). This model shows us a way to integrate catalog-based and non-catalog-based forecasts. Against this background, we are trying to develop a new earthquake forecast model which combines catalog-based and non-catalog-based approaches.
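
    The combined conditional-intensity idea sketched above (a background rate plus a self-exciting term from past earthquakes and an externally excited term from past non-seismic observations) can be written compactly. The toy below uses exponential kernels and invented parameter values purely for illustration; it is not the authors' model.

```python
# Schematic conditional intensity: background + self-exciting + external terms.
# Exponential kernels and all parameter values are assumptions for illustration.
import numpy as np

def conditional_intensity(t, quake_times, anomaly_times,
                          mu=0.05, k_self=0.8, tau_self=5.0, k_ext=0.3, tau_ext=10.0):
    """Event rate at time t (events/day) given past earthquakes and past non-seismic anomalies (days)."""
    quake_times = np.asarray(quake_times, dtype=float)
    anomaly_times = np.asarray(anomaly_times, dtype=float)
    past_q = quake_times[quake_times < t]
    past_a = anomaly_times[anomaly_times < t]
    self_term = k_self / tau_self * np.sum(np.exp(-(t - past_q) / tau_self))
    ext_term = k_ext / tau_ext * np.sum(np.exp(-(t - past_a) / tau_ext))
    return mu + self_term + ext_term

quakes = [3.0, 20.0, 21.5]      # hypothetical past event times (days)
anomalies = [18.0]              # hypothetical non-seismic anomaly time (days)
for t in (10.0, 22.0, 40.0):
    print(f"lambda({t:.0f} d) = {conditional_intensity(t, quakes, anomalies):.3f} events/day")
```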

  19. Hotspots, Lifelines, and the Safrr Haywired Earthquake Sequence

    Science.gov (United States)

    Ratliff, J. L.; Porter, K.

    2014-12-01

    Though California has experienced many large earthquakes (San Francisco, 1906; Loma Prieta, 1989; Northridge, 1994), the San Francisco Bay Area has not had a damaging earthquake for 25 years. Earthquake risk and surging reliance on smartphones and the Internet to handle everyday tasks raise the question: is an increasingly technology-reliant Bay Area prepared for potential infrastructure impacts caused by a major earthquake? How will a major earthquake on the Hayward Fault affect lifelines (roads, power, water, communication, etc.)? The U.S. Geological Survey Science Application for Risk Reduction (SAFRR) program's Haywired disaster scenario, a hypothetical two-year earthquake sequence triggered by a M7.05 mainshock on the Hayward Fault, addresses these and other questions. We explore four geographic aspects of lifeline damage from earthquakes: (1) geographic lifeline concentrations, (2) areas where lifelines pass through high shaking or potential ground-failure zones, (3) areas with diminished lifeline service demand due to severe building damage, and (4) areas with increased lifeline service demand due to displaced residents and businesses. Potential mainshock lifeline vulnerability and spatial demand changes will be discerned by superimposing earthquake shaking, liquefaction probability, and landslide probability damage thresholds with lifeline concentrations and with large-capacity shelters. Intersecting high hazard levels and lifeline clusters represent potential lifeline susceptibility hotspots. We will also analyze possible temporal vulnerability and demand changes using an aftershock shaking threshold. The results of this analysis will inform regional lifeline resilience initiatives and response and recovery planning, as well as reveal potential redundancies and weaknesses for Bay Area lifelines. Identified spatial and temporal hotspots can provide stakeholders with a reference for possible systemic vulnerability resulting from an earthquake sequence.

  20. The 1976 Tangshan earthquake

    Science.gov (United States)

    Fang, Wang

    1979-01-01

    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m., Beijing (Peking) local time, and had a magnitude of 7.8, a focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt.

  1. Earthquake Safety Tips in the Classroom

    Science.gov (United States)

    Melo, M. O.; Maciel, B. A. P. C.; Neto, R. P.; Hartmann, R. P.; Marques, G.; Gonçalves, M.; Rocha, F. L.; Silveira, G. M.

    2014-12-01

    The catastrophes induced by earthquakes are among the most devastating ones, causing an elevated number of human losses and economic damages. But we have to keep in mind that earthquakes don't kill people, buildings do. Earthquakes can't be predicted, and the only way of dealing with their effects is to teach society how to be prepared for them and how to deal with their consequences. In spite of being exposed to moderate and large earthquakes, most of the Portuguese are little aware of seismic risk, mainly due to the long recurrence intervals between strong events. The acquisition of safe and correct attitudes before, during and after an earthquake is relevant for human security. Children play a determinant role in the establishment of a real and long-lasting "culture of prevention", both through action and new attitudes. On the other hand, when children assume correct behaviors, their relatives often change their incorrect behaviors to mimic the correct behaviors of their kids. In the framework of a Parents-in-Science initiative, we started with bi-monthly sessions for children aged 5 - 6 years old and 9 - 10 years old. These sessions, in which parents, teachers and high-school students participate, became part of the school's permanent activities. We start with a short introduction to the Earth and to earthquakes through storytelling and use simple science activities to trigger children's curiosity. For safety purposes, we focus on how crucial it is to know basic information about themselves and to define, with their families, an emergency communications plan in case family members are separated. Using a shaking table we teach them how to protect themselves during an earthquake. We then finish with the preparation of an individual emergency kit. This presentation will highlight the importance of encouraging preventive actions in order to reduce the impact of earthquakes on society. This project is developed by science high-school students and teachers, in

  2. Shallow moonquakes - How they compare with earthquakes

    Science.gov (United States)

    Nakamura, Y.

    1980-01-01

    Of three types of moonquakes strong enough to be detectable at large distances - deep moonquakes, meteoroid impacts and shallow moonquakes - only shallow moonquakes are similar in nature to earthquakes. A comparison of various characteristics of moonquakes with those of earthquakes indeed shows a remarkable similarity between shallow moonquakes and intraplate earthquakes: (1) their occurrences are not controlled by tides; (2) they appear to occur in locations where there is evidence of structural weaknesses; (3) the relative abundances of small and large quakes (b-values) are similar, suggesting similar mechanisms; and (4) even the levels of activity may be close. The shallow moonquakes may be quite comparable in nature to intraplate earthquakes, and they may be of similar origin.

  3. Gas and Dust Phenomena of Mega-earthquakes and the Cause

    Science.gov (United States)

    Yue, Z.

    2013-12-01

    A mega-earthquake suddenly releases a large to extremely large amount of kinetic energy within a few tens to two hundred seconds and over distances of ten to hundreds of kilometers in the Earth's crust and on the ground surface. It also generates seismic waves that can be recorded globally and co-seismic ground damage such as surface ruptures and landslides. However, such vast, dramatic and devastating kinetic actions in the Earth's crustal rocks and ground soils cannot be known or predicted a few weeks, days, hours, or minutes before they happen. Although seismologists can develop and use seismometers to report the locations and magnitudes of earthquakes within minutes of their occurrence, they cannot predict earthquakes at present. Therefore, damaging earthquakes have caused and will continue to cause huge disasters, fatalities and injuries. This problem may indicate that it is necessary to re-examine the cause of mega-earthquakes in addition to the conventional explanation of elastic rebound on active faults. In the last ten years, many mega-earthquakes occurred in China and around the Pacific Ocean and caused many casualties and devastating environmental disasters. The author will give a brief review of the impacts of the mega-earthquakes that happened in recent years. He will then present many gas- and dust-related phenomena associated with the sudden occurrences of these mega-earthquakes. They include the 2001 Kunlunshan Earthquake M8.1, 2008 Wenchuan Earthquake M8.0 and the 2010 Yushu Earthquake M7.1 in China, the 2010 Haiti Earthquake M7.0, the 2010 Mexicali Earthquake M7.2, the 2010 Chile Earthquake M8.8, the 2011 Christchurch earthquake M6.3 and the 2011 Japan Earthquake M9.0 around the Pacific Ocean. He will discuss the cause of these gas- and dust-related phenomena. He will use these phenomena and their common cause to show that the earthquakes were caused by the rapid migration and expansion of highly compressed and

  4. Evaluating real-time air-quality data as earthquake indicator

    International Nuclear Information System (INIS)

    Hsu, Shih-Chieh; Huang, Yi-Tang; Huang, Jr-Chung; Tu, Jien-Yi; Engling, Guenter; Lin, Chuan-Yao; Lin, Fei-Jan; Huang, Chao-Hao

    2010-01-01

    A catastrophic earthquake, namely the 921-earthquake, occurred with a magnitude of ML = 7.3 in Taiwan on September 21, 1999, causing severe disaster. The evaluation of real-time air-quality data, obtained by the Taiwan Environmental Protection Administration (EPA), revealed a staggering increase in ambient SO2 concentrations by more than one order of magnitude across the island several hours prior to the earthquake, particularly at background stations. The abrupt increase in SO2 concentrations likely resulted from seismic-triggered degassing instead of air pollution. An additional case of a large earthquake (ML = 6.8), occurring on March 31, 2002, was examined to confirm our observations of significantly enhanced SO2 concentrations in ambient air prior to large earthquakes. The coincidence between large earthquakes and increases in trace gases during the pre-quake period (several hours) indicates the potential of employing air-quality monitoring data to forecast catastrophic earthquakes.
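
    A simple way to screen such monitoring data is to flag hours whose concentration exceeds the recent background by roughly an order of magnitude, as in the sketch below. This is a generic anomaly screen with invented thresholds, not the analysis performed in the paper, and real data would need additional quality control for pollution episodes and instrument problems.

```python
# Generic order-of-magnitude exceedance screen for an hourly SO2 series.
# Thresholds and the synthetic data are invented for illustration only.
import numpy as np

def flag_anomalies(hourly_so2, baseline_hours=24 * 7, factor=10.0):
    """Return indices of hours whose value exceeds `factor` times the median
    of the preceding `baseline_hours` samples."""
    x = np.asarray(hourly_so2, dtype=float)
    flags = []
    for i in range(baseline_hours, len(x)):
        baseline = np.median(x[i - baseline_hours:i])
        if baseline > 0 and x[i] > factor * baseline:
            flags.append(i)
    return flags

# Synthetic hourly series (ppb) with a short spike near the end:
rng = np.random.default_rng(0)
series = np.concatenate([rng.uniform(1.0, 3.0, 24 * 8), [2.0, 35.0, 40.0, 3.0]])
print("flagged hours:", flag_anomalies(series))
```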

  5. Research in historical earthquakes in the Korean peninsula and its circumferential regions

    Institute of Scientific and Technical Information of China (English)

    翟文杰; 吴戈; 韩绍欣

    2004-01-01

    The historical earthquake data is one of the important foundations for seismic monitoring, earthquake forecasting and seismic safety evaluation. However, the recognition of earthquakes is limited by the scientific and technological level of the time. Earthquakes could therefore be described using complete earthquake catalogues only after the seismograph was invented. Before that time, earthquake parameters were determined according to the earthquake damage observed at the surface and to written historical records, and earthquake size was measured using earthquake intensity.

  6. Earthquake Risk Mitigation in the Tokyo Metropolitan area

    Science.gov (United States)

    Hirata, N.; Sakai, S.; Kasahara, K.; Nakagawa, S.; Nanjo, K.; Panayotopoulos, Y.; Tsuruoka, H.

    2010-12-01

    Seismic disaster risk mitigation in urban areas constitutes a challenge that requires collaboration among scientific, engineering, and social-science fields. Examples of collaborative efforts include research on detailed plate structure with identification of all significant faults; development of dense seismic networks; strong ground motion prediction, which uses information on near-surface seismic site effects and fault models; earthquake-resistant and earthquake-proof structures; and cross-discipline infrastructure for effective risk mitigation just after catastrophic events. The risk mitigation strategy for the next great earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern because this plate boundary produced past mega-thrust earthquakes, such as the 1703 Genroku earthquake (magnitude M8.0) and the 1923 Kanto earthquake (M7.9), which had 105,000 fatalities. An M7 or greater (M7+) earthquake in this area at present has a high potential to produce devastating loss of life and property, with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that such an M7+ earthquake would cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) of economic loss. This earthquake is evaluated by the Earthquake Research Committee of Japan to have a 70% probability of occurrence within 30 years. In order to mitigate the disaster for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area (2007-2011) was launched in collaboration with scientists, engineers, and social-scientists from institutions nationwide. The results obtained in the respective fields will be integrated by project termination to improve information on the strategy assessment for seismic risk mitigation in the Tokyo metropolitan area. In this talk, we give an outline of our project as an example of collaborative research on earthquake risk mitigation. Discussion is extended to our effort in progress and

  7. Steep-dip seismic imaging of the shallow San Andreas fault near Parkfield.

    Science.gov (United States)

    Hole, J A; Catchings, R D; St Clair, K C; Rymer, M J; Okaya, D A; Carney, B J

    2001-11-16

    Seismic reflection and refraction images illuminate the San Andreas Fault to a depth of 1 kilometer. The prestack depth-migrated reflection image contains near-vertical reflections aligned with the active fault trace. The fault is vertical in the upper 0.5 kilometer, then dips about 70 degrees to the southwest to at least 1 kilometer subsurface. This dip reconciles the difference between the computed locations of earthquakes and the surface fault trace. The seismic velocity cross section shows strong lateral variations. Relatively low velocity (10 to 30%), high electrical conductivity, and low density indicate a 1-kilometer-wide vertical wedge of porous sediment or fractured rock immediately southwest of the active fault trace.

  8. Earthquakes trigger the loss of groundwater biodiversity

    Science.gov (United States)

    Galassi, Diana M. P.; Lombardo, Paola; Fiasca, Barbara; di Cioccio, Alessia; di Lorenzo, Tiziana; Petitta, Marco; di Carlo, Piero

    2014-09-01

    Earthquakes are among the most destructive natural events. The 6 April 2009, 6.3-Mw earthquake in L'Aquila (Italy) markedly altered the karstic Gran Sasso Aquifer (GSA) hydrogeology and geochemistry. The GSA groundwater invertebrate community is mainly comprised of small-bodied, colourless, blind microcrustaceans. We compared abiotic and biotic data from two pre-earthquake and one post-earthquake complete but non-contiguous hydrological years to investigate the effects of the 2009 earthquake on the dominant copepod component of the obligate groundwater fauna. Our results suggest that the massive earthquake-induced aquifer strain biotriggered a flushing of groundwater fauna, with a dramatic decrease in subterranean species abundance. Population turnover rates appeared to have crashed, no longer replenishing the long-standing communities from aquifer fractures, and the aquifer became almost totally deprived of animal life. Groundwater communities are notorious for their low resilience. Therefore, any major disturbance that negatively impacts survival or reproduction may lead to local extinction of species, most of them being the only survivors of phylogenetic lineages extinct at the Earth surface. Given the ecological key role played by the subterranean fauna as decomposers of organic matter and "ecosystem engineers", we urge more detailed, long-term studies on the effect of major disturbances to groundwater ecosystems.

  9. The Northern Rupture of the 1762 Arakan Meghathrust Earthquake and other Potential Earthquake Sources in Bangladesh.

    Science.gov (United States)

    Akhter, S. H.; Seeber, L.; Steckler, M. S.

    2015-12-01

    Bangladesh is one of the most densely populated countries in the world. It occupies a major part of the Bengal Basin, which contains the Ganges-Brahmaputra Delta (GBD), the largest and one of the most active of the world's deltas, and is located along the Alpine-Himalayan seismic belt. As such it is vulnerable to many natural hazards, especially earthquakes. The country sits at the junction of three tectonic plates - Indian, Eurasian, and the Burma 'sliver' of the Sunda plate. These form two boundaries where plates converge: the India-Eurasia plate boundary to the north, forming the Himalaya Arc, and the India-Burma plate boundary to the east, forming the Indo-Burma Arc. The India-Burma plate boundary is exceptionally wide because collision with the GBD feeds an exceptional amount of sediment into the subduction zone. Thus the Himalayan continental collision orogeny, along with its syntaxes to the N and NE of Bangladesh, and the Burma Arc subduction boundary surround Bangladesh on two sides with active faults of regional scale, raising the potential for high-magnitude earthquakes. In recent years Bangladesh has experienced minor to moderate earthquakes. Historical records show that major and great earthquakes have ravaged the country and the neighboring region several times over the last 450 years. Field observations of Tertiary structures along the Chittagong-Teknaf coast reveal that the rupture of the 1762 Arakan megathrust earthquake extended as far north as the Sitakund anticline, north of the city of Chittagong. This earthquake brought changes to the landscape, uplifting the Teknaf peninsula and St. Martin's Island by about 2-2.5 m, and activated two mud volcanos along the axis of the Sitakund anticline, where large tabular blocks of exotic crystalline limestone were tectonically transported from a deep-seated formation along with the eruptive mud. A vast area of the coast, including inland areas east of the lower Meghna River, was inundated. More than 500 people died near

  10. From Multi-Sensors Observations Towards Cross-Disciplinary Study of Pre-Earthquake Signals. What have We Learned from the Tohoku Earthquake?

    Science.gov (United States)

    Ouzounov, D.; Pulinets, S.; Papadopoulos, G.; Kunitsyn, V.; Nesterov, I.; Hayakawa, M.; Mogi, K.; Hattori, K.; Kafatos, M.; Taylor, P.

    2012-01-01

    The lessons we have learned from the Great Tohoku earthquake (Japan, 2011) and how this knowledge will affect our future observations and analysis are the main focus of this presentation. We present multi-sensor observations and multidisciplinary research in our investigation of phenomena preceding major earthquakes. These observations revealed the existence of atmospheric and ionospheric phenomena occurring prior to the M9.0 Tohoku earthquake of March 11, 2011, which provide new evidence of a distinct coupling between the lithosphere and the atmosphere/ionosphere related to underlying tectonic activity. Similar results have been reported before the catastrophic events in Chile (M8.8, 2010), Italy (M6.3, 2009) and Sumatra (M9.3, 2004). For the Tohoku earthquake, our analysis shows a synergy between several independent observations characterizing the state of lithosphere/atmosphere coupling several days before the onset of the earthquake, namely: (i) foreshock sequence changes (rate, space and time); (ii) outgoing longwave radiation (OLR) measured at the top of the atmosphere; and (iii) anomalous variations of ionospheric parameters revealed by multi-sensor observations. We present a cross-disciplinary analysis of the observed pre-earthquake anomalies and discuss current research on the detection of these signals in Japan. We expect that our analysis will shed light on the underlying physics of pre-earthquake signals associated with some of the largest earthquake events

  11. Large earthquake rates from geologic, geodetic, and seismological perspectives

    Science.gov (United States)

    Jackson, D. D.

    2017-12-01

    Earthquake rate and recurrence information comes primarily from geology, geodesy, and seismology. Geology gives the longest temporal perspective, but it reveals only surface deformation, relatable to earthquakes only with many assumptions. Geodesy is also limited to surface observations, but it detects evidence of the processes leading to earthquakes, again subject to important assumptions. Seismology reveals actual earthquakes, but its history is too short to capture important properties of very large ones. Unfortunately, the ranges of these observation types barely overlap, so that integrating them into a consistent picture adequate to infer future prospects requires a great deal of trust. Perhaps the most important boundary is the temporal one at the beginning of the instrumental seismic era, about a century ago. We have virtually no seismological or geodetic information on large earthquakes before then, and little geological information after. Virtually all modern forecasts of large earthquakes assume some form of equivalence between tectonic and seismic moment rates as functions of location, time, and magnitude threshold. That assumption links geology, geodesy, and seismology, but it invokes a host of other assumptions and incurs very significant uncertainties. Questions include the temporal behavior of seismic and tectonic moment rates; the shape of the earthquake magnitude distribution; the upper magnitude limit; scaling between rupture length, width, and displacement; the depth dependence of stress coupling; the value of crustal rigidity; and the relation between faults at depth and their surface fault traces, to name just a few. In this report I estimate the quantitative implications of these assumptions for estimating large earthquake rates. Global studies like the GEAR1 project suggest that surface deformation from geology and geodesy best shows the geography of very large, rare earthquakes in the long term, while seismological observations of small earthquakes best forecast moderate earthquakes
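
    The tectonic-versus-seismic moment-rate equivalence mentioned above can be made concrete with a back-of-the-envelope balance: moment accumulates at rigidity times fault area times slip rate, and in the simplest case is released entirely in characteristic events of a single magnitude. The sketch below uses hypothetical fault parameters and a textbook rigidity value; real forecasts, and the uncertainties listed in the abstract, involve a full magnitude distribution and assumptions about coupling depth and rigidity.

```python
# Back-of-the-envelope moment-rate balance. Fault dimensions, slip rate and the
# rigidity are hypothetical placeholders for illustration.
MU = 3.0e10  # assumed crustal rigidity, Pa

def tectonic_moment_rate(length_km, seismogenic_depth_km, slip_rate_mm_yr):
    """Moment accumulation rate in N*m per year."""
    area_m2 = length_km * 1e3 * seismogenic_depth_km * 1e3
    return MU * area_m2 * slip_rate_mm_yr * 1e-3

def moment_of_magnitude(mw):
    """Seismic moment (N*m) for moment magnitude mw."""
    return 10 ** (1.5 * mw + 9.1)

m_dot = tectonic_moment_rate(length_km=300, seismogenic_depth_km=15, slip_rate_mm_yr=25)
mw_char = 7.8
print("moment rate: %.2e N*m/yr" % m_dot)
print("implied mean recurrence of Mw %.1f events: %.0f yr" % (mw_char, moment_of_magnitude(mw_char) / m_dot))
```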

  12. Earthquake Swarm in Armutlu Peninsula, Eastern Marmara Region, Turkey

    Science.gov (United States)

    Yavuz, Evrim; Çaka, Deniz; Tunç, Berna; Serkan Irmak, T.; Woith, Heiko; Cesca, Simone; Lühr, Birger-Gottfried; Barış, Şerif

    2015-04-01

    The most active fault system in Turkey is the North Anatolian Fault Zone, which caused two large earthquakes in 1999. These two earthquakes affected the eastern Marmara region destructively. The unbroken part of the North Anatolian Fault Zone crosses the north of the Armutlu Peninsula in an east-west direction. This branch is also located quite close to Istanbul, a megacity in terms of its population and its economic and social importance. A new cluster of microseismic activity occurred in the direct vicinity southeast of the Yalova Termal area. Activity started on August 2, 2014 with a series of micro events; then on August 3, 2014 a local magnitude 4.1 event occurred, and more than 1000 events followed until August 31, 2014. Thus we tentatively call this a swarm-like activity. Therefore, investigation of the micro-earthquake activity of the Armutlu Peninsula has become important to understand the relationship between the occurrence of micro-earthquakes and the tectonic structure of the region. For these reasons, the Armutlu Network (ARNET), installed at the end of 2005, currently equipped with 27 active seismic stations and operated by Kocaeli University Earth and Space Sciences Research Center (ESSRC) and Helmholtz-Zentrum Potsdam Deutsches GeoForschungsZentrum (GFZ), is a very dense network able to record even micro-earthquakes in this region. In the 30-day period of August 02 to 31, 2014, Kandilli Observatory and Earthquake Research Institute (KOERI) announced 120 local earthquakes with magnitudes ranging between 0.7 and 4.1, but ARNET provided more than 1000 earthquakes for analysis in the same time period. In this study, earthquakes of the swarm area and neighbouring regions determined by ARNET were investigated. The focal mechanism of the August 03, 2014 22:22:42 (GMT) earthquake with local magnitude (Ml) 4.0 was obtained by moment tensor solution. According to the solution, it indicates normal faulting with a dextral component. The obtained focal mechanism solution is

  13. Using remote sensing to predict earthquake impacts

    Science.gov (United States)

    Fylaktos, Asimakis; Yfantidou, Anastasia

    2017-09-01

    Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), which led to substantial loss of life and extensive damage to properties, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel 1A and optical imagery from Landsat 8 to examine the differences in topography with the aid of the multi-temporal monitoring technique. This technique is well suited to the observation of any surface deformation. This database is a cluster of information regarding the consequences of the earthquakes, grouped into categories such as property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformations amongst others, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of these types of earthquakes. In the future, we can enrich this database by including more regions and enhance the variety of its applications. For instance, we could predict the future impacts of any type of earthquake in several areas, and design a preliminary emergency model for immediate evacuation and quick recovery response. It is important to know how the surface moves, particularly in geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.

  14. The 2007 Mentawai earthquake sequence on the Sumatra megathrust

    Science.gov (United States)

    Konca, A.; Avouac, J.; Sladen, A.; Meltzner, A. J.; Kositsky, A. P.; Sieh, K.; Fang, P.; Li, Z.; Galetzka, J.; Genrich, J.; Chlieh, M.; Natawidjaja, D. H.; Bock, Y.; Fielding, E. J.; Helmberger, D. V.

    2008-12-01

    The Sumatra Megathrust has recently produced a flurry of large interplate earthquakes starting with the giant Mw 9.15 Aceh earthquake of 2004. All of these earthquakes occurred within the area monitored by the Sumatra Geodetic Array (SuGAr), which provided exceptional records of near-field co-seismic and postseismic ground displacements. The most recent of these major earthquakes, an Mw 8.4 earthquake and an Mw 7.9 earthquake twelve hours later, occurred in the Mentawai islands area where devastating historical earthquakes had happened in 1797 and 1833. The 2007 earthquake sequence provides an exceptional opportunity to understand the variability of earthquakes along megathrusts and their relation to interseismic coupling. The InSAR, GPS and teleseismic modeling shows that the 2007 earthquakes ruptured a fraction of the strongly coupled Mentawai patch of the megathrust, which is itself only a fraction of the 1833 rupture area. The sequence also released a much smaller moment than the one released in 1833, or than the deficit of moment that has accumulated since. Both earthquakes of 2007 consist of two sub-events located 50 to 100 km apart. On the other hand, the northernmost slip patch of the Mw 8.4 earthquake and the southern slip patch of the Mw 7.9 earthquake abut each other, yet they ruptured 12 hours apart. Sunda megathrust earthquakes of recent years include a rupture of a strongly coupled patch that closely mimics a prior rupture of that patch and which is well correlated with the interseismic coupling pattern (Nias-Simeulue section), as well as a rupture sequence of a strongly coupled patch that differs substantially in its details from its most recent predecessors (Mentawai section). We conclude that (1) seismic asperities are probably persistent features which arise from heterogeneous strain build-up in the interseismic period; and (2) the same portion of a megathrust can rupture in different ways depending on whether asperities break as isolated events or cooperate to produce

  15. Earthquake response observation of isolated buildings

    International Nuclear Information System (INIS)

    Harada, O.; Kawai, N.; Ishii, T.; Sawada, Y.; Shiojiri, H.; Mazda, T.

    1989-01-01

    A base isolation system is expected to be a technology enabling the rational design of FBR plants. In order to apply this system to important structures, accumulation of verification data is necessary. From this point of view, vibration tests and earthquake response observation of an actual isolated building using laminated rubber bearings and elasto-plastic steel dampers were conducted for the purpose of investigating its dynamic behavior and proving the reliability of the base isolation system. Since September 1986, more than thirty earthquakes have been observed. This paper presents the results of the earthquake response observation

  16. Future Developments for the Earthquake Early Warning System following the 2011 off the Pacific Coast of Tohoku Earthquake

    Science.gov (United States)

    Yamada, M.; Mori, J. J.

    2011-12-01

    The 2011 off the Pacific Coast of Tohoku Earthquake (Mw 9.0) caused significant damage over a large area of northeastern Honshu. An earthquake early warning was issued to the public in the Tohoku region about 8 seconds after the first P-arrival, which is 31 seconds after the origin time. There was no 'blind zone', and warnings were received at all locations before S-wave arrivals, since the earthquake was fairly far offshore. Although the early warning message was properly reported in the Tohoku region, which was the most severely affected area, a message was not sent to the more distant Tokyo region because the intensity was underestimated. This underestimation occurred because the magnitude determined in the first few seconds was relatively small (Mj 8.1), and there was no consideration of a finite fault with a long rupture length. Another significant issue is that warnings were sometimes not properly provided for aftershocks. Immediately following the earthquake, the waveforms of some large aftershocks were contaminated by long-period surface waves from the mainshock, which made it difficult to pick P-wave arrivals. Also, correctly distinguishing and locating later aftershocks was sometimes difficult when multiple events occurred within a short period of time. This mainshock began with relatively small moment release for the first 10 s. Since the amplitude of the initial waveforms is small, most methods that use amplitudes and periods of the P-wave (e.g. Wu and Kanamori, 2005) cannot correctly determine the size of the earthquake in the first several seconds. The current JMA system uses the peak displacement amplitude for the magnitude estimation, and the magnitude saturated at about M8 one minute after the first P-wave arrival. Magnitudes of smaller earthquakes can be correctly identified from the first few seconds of P- or S-wave arrivals, but this M9 event could not be characterized in such a short time. The only way to correctly characterize the size of the Tohoku
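
    To make the amplitude-based magnitude estimation discussed above concrete, the Python sketch below inverts a generic regression of peak initial P-wave displacement (Pd) against magnitude and hypocentral distance. The functional form follows the general style of Pd-based methods, but the coefficients are placeholders of my own choosing, not the operational JMA values or those of Wu and Kanamori (2005).

```python
import numpy as np

# Illustrative early-warning magnitude proxy from peak initial P displacement.
# Assumed regression form: log10(Pd) = A + B*M + C*log10(R); the coefficients
# below are placeholders, not calibrated values from any published study.
A, B, C = -3.5, 0.8, -1.4

def magnitude_from_pd(pd_cm, hypocentral_distance_km):
    """Invert the assumed regression for magnitude given Pd (cm) and distance (km)."""
    return (np.log10(pd_cm) - A - C * np.log10(hypocentral_distance_km)) / B

if __name__ == "__main__":
    # Hypothetical observations: small, moderate, and large initial P displacements
    for pd, r in [(0.02, 60.0), (0.2, 80.0), (1.5, 100.0)]:
        print(f"Pd = {pd:5.2f} cm at {r:5.1f} km -> M ~ {magnitude_from_pd(pd, r):.1f}")
```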

  17. The EM Earthquake Precursor

    Science.gov (United States)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and, in turn, to warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta Earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate with an identifiable precursor. Secondly, there is neither a networked array for finding any epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances can now achieve. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February, 2013, detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  18. From Tornadoes to Earthquakes: Forecast Verification for Binary Events Applied to the 1999 Chi-Chi, Taiwan,Earthquake

    Directory of Open Access Journals (Sweden)

    Chien-Chih Chen

    2006-01-01

    Forecast verification procedures for statistical events with binary outcomes typically rely on the use of contingency tables and Relative Operating Characteristic (ROC) diagrams. Originally developed for the statistical evaluation of tornado forecasts on a county-by-county basis, these methods can be adapted to the evaluation of competing earthquake forecasts. Here we apply these methods retrospectively to two forecasts for the M 7.3 1999 Chi-Chi, Taiwan, earthquake. We show that a previously proposed forecast method that is based on evaluating changes in seismic intensity on a regional basis is superior to a forecast based only on the magnitude of seismic intensity in the same region. Our results confirm earlier suggestions that the earthquake preparation process for events such as the Chi-Chi earthquake involves anomalous activation or quiescence, and that signatures of these processes can be detected in seismicity data using appropriate methods.
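
    As a minimal sketch of the verification machinery described above, the Python code below builds a contingency table (hits, false alarms, misses, correct negatives) for binary forecasts and traces an ROC curve by sweeping an alarm threshold. The forecast scores and outcomes are synthetic, not the Chi-Chi data.

```python
import numpy as np

def contingency(forecast, observed):
    """Return (hits, false alarms, misses, correct negatives) for binary arrays."""
    f, o = np.asarray(forecast, bool), np.asarray(observed, bool)
    return ((f & o).sum(), (f & ~o).sum(), (~f & o).sum(), (~f & ~o).sum())

def roc_curve(scores, observed, thresholds):
    """Hit rate and false-alarm rate as the alarm threshold is varied."""
    points = []
    for t in thresholds:
        hits, fa, miss, cn = contingency(scores >= t, observed)
        hit_rate = hits / (hits + miss) if hits + miss else 0.0
        fa_rate = fa / (fa + cn) if fa + cn else 0.0
        points.append((fa_rate, hit_rate))
    return points

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scores = rng.random(200)                            # e.g. cell-by-cell forecast scores
    observed = scores + 0.3 * rng.random(200) > 0.8     # synthetic binary outcomes
    for fa_rate, hit_rate in roc_curve(scores, observed, np.linspace(0, 1, 6)):
        print(f"false-alarm rate {fa_rate:.2f} -> hit rate {hit_rate:.2f}")
```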

  19. Radon anomalies prior to earthquakes (1). Review of previous studies

    International Nuclear Information System (INIS)

    Ishikawa, Tetsuo; Tokonami, Shinji; Yasuoka, Yumi; Shinogi, Masaki; Nagahama, Hiroyuki; Omori, Yasutaka; Kawada, Yusuke

    2008-01-01

    The relationship between radon anomalies and earthquakes has been studied for more than 30 years. However, most of the studies dealt with radon in soil gas or in groundwater. Before the 1995 Hyogoken-Nanbu earthquake, an anomalous increase of atmospheric radon was observed at Kobe Pharmaceutical University. The increase was well fitted with a mathematical model related to earthquake fault dynamics. This paper reports the significance of this observation, reviewing previous studies on radon anomalies before earthquakes. Groundwater/soil radon measurements for earthquake prediction began in the 1970s in Japan as well as in other countries. One of the most famous studies in Japan is the groundwater radon anomaly before the 1978 Izu-Oshima-kinkai earthquake. We have recognized the significance of radon in earthquake prediction research, but recently its limitations have also been pointed out. Some researchers are looking for better precursory indicators; simultaneous measurements of radon and other gases are new trials in recent studies. In contrast to soil/groundwater radon, not much attention has been paid to atmospheric radon before earthquakes. However, it might be possible to detect precursors in atmospheric radon before a large earthquake. In the next issues, we will discuss the details of the anomalous atmospheric radon data observed before the Hyogoken-Nanbu earthquake. (author)

  20. Coulomb Failure Stress Accumulation in Nepal After the 2015 Mw 7.8 Gorkha Earthquake: Testing Earthquake Triggering Hypothesis and Evaluating Seismic Hazards

    Science.gov (United States)

    Xiong, N.; Niu, F.

    2017-12-01

    A Mw 7.8 earthquake struck Gorkha, Nepal, on April 25, 2015, resulting in more than 8000 deaths and 3.5 million people left homeless. The earthquake initiated 70 km west of Kathmandu and propagated eastward, rupturing an area approximately 150 km by 60 km in size. However, the earthquake failed to fully rupture the locked fault beneath the Himalaya, suggesting that the regions south of Kathmandu and west of the current rupture are still locked and a much more powerful earthquake might occur in the future. Therefore, the seismic hazard of the unruptured region is of great concern. In this study, we investigated the Coulomb failure stress (CFS) accumulation on the unruptured fault transferred by the Gorkha earthquake and some nearby historical great earthquakes. First, we calculated the co-seismic CFS changes of the Gorkha earthquake on the nodal planes of 16 large aftershocks to quantitatively examine whether they were brought closer to failure by the mainshock. It is shown that at least 12 of the 16 aftershocks were encouraged by an increase of CFS of 0.1-3 MPa. The correspondence between the distribution of off-fault aftershocks and the pattern of increased CFS also validates the applicability of the earthquake triggering hypothesis in the thrust regime of Nepal. With this validation as confidence, we calculated the co-seismic CFS change on the locked region imparted by the Gorkha earthquake and historical great earthquakes. A newly proposed ramp-flat-ramp-flat fault geometry model was employed, and the source parameters of historical earthquakes were computed with an empirical scaling relationship. A broad region south of Kathmandu and west of the current rupture was shown to be positively stressed, with CFS changes roughly ranging between 0.01 and 0.5 MPa. The maximum CFS increase (>1 MPa) was found in the updip segment south of the current rupture, implying a high seismic hazard. Since the locked region may be additionally stressed by the post-seismic relaxation of the lower
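
    The CFS calculation discussed above reduces, for a given receiver fault, to combining the shear and normal stress changes with an effective friction coefficient. The Python sketch below shows that combination for a few hypothetical receivers; the stress changes and friction value are illustrative, not results from the Gorkha study, and a full analysis resolves the stress tensor from a slip model onto each fault plane.

```python
# Coulomb failure stress change on a receiver fault: dCFS = d_tau + mu_eff * d_sigma_n,
# with the shear stress change resolved in the slip direction and the normal stress
# change taken positive for unclamping. Inputs below are illustrative only.

def coulomb_stress_change(d_shear_mpa, d_normal_mpa, effective_friction=0.4):
    """CFS change in MPa; positive values bring the receiver fault closer to failure."""
    return d_shear_mpa + effective_friction * d_normal_mpa

if __name__ == "__main__":
    # Hypothetical receiver faults: (shear change, normal change) in MPa
    receivers = [(0.25, 0.10), (0.05, -0.30), (1.20, 0.40)]
    for d_tau, d_sig in receivers:
        dcfs = coulomb_stress_change(d_tau, d_sig)
        status = "encouraged" if dcfs > 0 else "discouraged"
        print(f"d_tau={d_tau:+.2f} MPa, d_sigma_n={d_sig:+.2f} MPa -> dCFS={dcfs:+.2f} MPa ({status})")
```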

  1. Spatial Distribution of earthquakes off the coast of Fukushima Two Years after the M9 Earthquake: the Southern Area of the 2011 Tohoku Earthquake Rupture Zone

    Science.gov (United States)

    Yamada, T.; Nakahigashi, K.; Shinohara, M.; Mochizuki, K.; Shiobara, H.

    2014-12-01

    Huge earthquakes cause vast stress-field changes around their rupture zones, and many aftershocks and other related geophysical phenomena, such as geodetic movements, have been observed. It is important to determine the spatiotemporal distribution of seismicity during the relaxation process in order to understand the cycle of giant earthquakes. In this study, we focus on the southern rupture area of the 2011 Tohoku earthquake (M9.0). The seismicity rate there remains high compared with that before the 2011 earthquake. Many studies using ocean bottom seismometers (OBSs) have been carried out since soon after the 2011 Tohoku earthquake in order to characterize the aftershock activity precisely. Here we show one of these studies, off the coast of Fukushima, which is located in the southern part of the rupture area of the 2011 Tohoku earthquake. We deployed 4 broadband-type OBSs (BBOBSs) and 12 short-period-type OBSs (SOBSs) in August 2012. Another 4 BBOBSs equipped with absolute pressure gauges and 20 SOBSs were added in November 2012. We recovered 36 OBSs, including 8 BBOBSs, in November 2013. We selected 1,000 events in the vicinity of the OBS network based on a hypocenter catalog published by the Japan Meteorological Agency, and extracted the data after correcting for each instrument's internal clock. P- and S-wave arrival times, P-wave polarities and maximum amplitudes were picked manually on a computer display. We assumed a one-dimensional velocity structure based on the result from an active-source experiment across our network, and applied station corrections to remove the ambiguity of the assumed structure. We then adopted a maximum-likelihood estimation technique and calculated the hypocenters. The results show intensive activity near the Japan Trench, while there was a quiet seismic zone between the trench zone and the landward high-activity zone.
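
    The location workflow above (manual picks, a 1-D velocity model, station corrections, maximum-likelihood hypocenters) can be caricatured with a much simpler least-squares grid search in a homogeneous half-space. The Python sketch below is only illustrative: the velocity, station geometry and picks are synthetic assumptions, and a real analysis would use a layered model and a proper likelihood.

```python
import numpy as np

VP = 6.0  # assumed uniform P velocity, km/s

def travel_time(station, source):
    """Straight-ray P travel time (s) between two (x, y, z) points in km."""
    return np.linalg.norm(np.asarray(station) - np.asarray(source)) / VP

def locate(stations, picks, grid):
    """Return the grid node (x, y, z), origin time and rms minimizing pick residuals."""
    best = None
    for node in grid:
        tt = np.array([travel_time(s, node) for s in stations])
        t0 = np.mean(np.asarray(picks) - tt)           # best origin time for this node
        rms = np.sqrt(np.mean((np.asarray(picks) - (t0 + tt)) ** 2))
        if best is None or rms < best[2]:
            best = (node, t0, rms)
    return best

if __name__ == "__main__":
    stations = [(0, 0, 0), (30, 5, 0), (10, 40, 0), (-20, 25, 0)]   # km, seafloor sensors
    true_src, true_t0 = (12.0, 18.0, 15.0), 3.0
    picks = np.array([true_t0 + travel_time(s, true_src) for s in stations])
    grid = [(x, y, z) for x in range(0, 31, 2) for y in range(0, 41, 2) for z in range(5, 26, 2)]
    node, t0, rms = locate(stations, picks, grid)
    print(f"located at {node}, origin time {t0:.2f} s, rms {rms:.3f} s")
```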

  2. Extending the ISC-GEM Global Earthquake Instrumental Catalogue

    Science.gov (United States)

    Di Giacomo, Domenico; Engdhal, Bob; Storchak, Dmitry; Villaseñor, Antonio; Harris, James

    2015-04-01

    After a 27-month project funded by the GEM Foundation (www.globalquakemodel.org), in January 2013 we released the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009) (www.isc.ac.uk/iscgem/index.php) as a special product for use in seismic hazard studies. The new catalogue was necessary because improved seismic hazard studies require earthquake catalogues that are homogeneous (to the largest extent possible) over time in their fundamental parameters, such as location and magnitude. Due to time and resource limitations, the ISC-GEM catalogue (1900-2009) included earthquakes selected according to the following time-variable cut-off magnitudes: Ms=7.5 for earthquakes occurring before 1918; Ms=6.25 between 1918 and 1963; and Ms=5.5 from 1964 onwards. Because of the importance of having a reliable seismic input for seismic hazard studies, funding from GEM and two commercial companies in the US and UK allowed us to start working on the extension of the ISC-GEM catalogue, both for earthquakes that occurred after 2009 and for earthquakes listed in the International Seismological Summary (ISS) which fell below the cut-off magnitude of 6.25. This extension is part of a four-year program that aims at including in the ISC-GEM catalogue large global earthquakes that occurred before the beginning of the ISC Bulletin in 1964. In this contribution we present the updated ISC-GEM catalogue, which will include over 1000 more earthquakes that occurred in 2010-2011 and several hundred more between 1950 and 1959. The catalogue extension between 1935 and 1949 is currently underway. The extension of the ISC-GEM catalogue will also be helpful for regional cross-border seismic hazard studies, as the ISC-GEM catalogue should be used as the basis for cross-checking the consistency in location and magnitude of those earthquakes listed both in the ISC-GEM global catalogue and in regional catalogues.
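
    A minimal sketch of the time-variable cut-off selection quoted above (Ms >= 7.5 before 1918, >= 6.25 between 1918 and 1963, >= 5.5 from 1964 onwards), applied to a few hypothetical events:

```python
# Time-variable magnitude cut-off used to select events for the ISC-GEM catalogue.
# The example events are hypothetical, not entries from the catalogue itself.

def passes_cutoff(year, ms):
    if year < 1918:
        return ms >= 7.5
    if year <= 1963:
        return ms >= 6.25
    return ms >= 5.5

if __name__ == "__main__":
    events = [(1906, 7.8), (1912, 6.9), (1950, 6.4), (1957, 6.0), (1972, 5.6)]
    selected = [(y, m) for y, m in events if passes_cutoff(y, m)]
    print(selected)   # [(1906, 7.8), (1950, 6.4), (1972, 5.6)]
```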

  3. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    Science.gov (United States)

    Serata, S.

    2006-12-01

    The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. The development work carried out over the past 25 years has established the Stressmeter as an automatic stress measurement system for studying the timing of forthcoming major earthquakes, in support of current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area above a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress condition of the mine, the direct cause of the earthquakes was disclosed. The cause was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development, and the Stressmeter was introduced to the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of maximum and minimum lateral stresses in excellent agreement with the theoretical value, i.e., G = 0.25. All the conventional methods of overcoring, hydrofracturing and deformation, which were introduced to compete with the Serata method, failed, demonstrating the fundamental difficulties of those conventional methods. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be consistent with all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained by the major international stress measurement work in the Hot Dry Rock Projects conducted in the USA, England and Germany are found to be in good agreement with the Stressmeter results obtained in Japan. Based on this broad agreement, a solid geomechanical

  4. Network similarity and statistical analysis of earthquake seismic data

    OpenAIRE

    Deyasi, Krishanu; Chakraborty, Abhijit; Banerjee, Anirban

    2016-01-01

    We study the structural similarity of earthquake networks constructed from seismic catalogs of different geographical regions. A hierarchical clustering of underlying undirected earthquake networks is shown using Jensen-Shannon divergence in graph spectra. The directed nature of links indicates that each earthquake network is strongly connected, which motivates us to study the directed version statistically. Our statistical analysis of each earthquake region identifies the hub regions. We cal...
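
    The clustering step above compares earthquake networks through the Jensen-Shannon divergence of their graph spectra. A minimal Python sketch of that distance for two small synthetic graphs follows, assuming the normalized Laplacian eigenvalue spectrum is used as the probability distribution (the study may use a different spectral quantity); the adjacency matrices are toy examples, not real earthquake networks.

```python
import numpy as np

def spectral_distribution(adjacency):
    """Normalize the graph Laplacian eigenvalue spectrum into a probability vector."""
    a = np.asarray(adjacency, float)
    laplacian = np.diag(a.sum(axis=1)) - a
    eig = np.abs(np.linalg.eigvalsh(laplacian))
    return eig / eig.sum()

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence (bits) between two equal-length distributions."""
    p, q = np.asarray(p) + eps, np.asarray(q) + eps
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log2(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

if __name__ == "__main__":
    g1 = [[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]]
    g2 = [[0, 1, 0, 0], [1, 0, 1, 1], [0, 1, 0, 1], [0, 1, 1, 0]]
    d = js_divergence(spectral_distribution(g1), spectral_distribution(g2))
    print(f"Jensen-Shannon divergence between spectra: {d:.4f}")
```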

  5. Guidelines for nuclear plant response to an earthquake

    International Nuclear Information System (INIS)

    1989-12-01

    Guidelines have been developed to assist nuclear plant personnel in the preparation of earthquake response procedures for nuclear power plants. The objectives of the earthquake response procedures are to determine (1) the immediate effects of an earthquake on the physical condition of the nuclear power plant, (2) if shutdown of the plant is appropriate based on the observed damage to the plant or because the OBE has been exceeded, and (3) the readiness of the plant to resume operation following shutdown due to an earthquake. Readiness of a nuclear power plant to restart is determined on the basis of visual inspections of nuclear plant equipment and structures, and the successful completion of surveillance tests which demonstrate that the limiting conditions for operation as defined in the plant Technical Specifications are met. The guidelines are based on information obtained from a review of earthquake response procedures from numerous US and foreign nuclear power plants, interviews with nuclear plant operations personnel, and a review of reports of damage to industrial equipment and structures in actual earthquakes. 7 refs., 4 figs., 4 tabs

  6. High resolution measurement of earthquake impacts on rock slope stability and damage using pre- and post-earthquake terrestrial laser scans

    Science.gov (United States)

    Hutchinson, Lauren; Stead, Doug; Rosser, Nick

    2017-04-01

    Understanding the behaviour of rock slopes in response to earthquake shaking is instrumental to response and relief efforts following large earthquakes, as well as to ongoing risk management in earthquake-affected areas. Assessment of the effects of seismic shaking on rock slope kinematics requires detailed surveys of the pre- and post-earthquake condition of the slope; however, at present there is a lack of high-resolution monitoring data from before and after earthquakes to facilitate characterization of seismically induced slope damage and to validate models used to back-analyze rock slope behaviour during and following earthquake shaking. Therefore, there is a need for additional research where pre- and post-earthquake monitoring data are available. This paper presents the results of a direct comparison between terrestrial laser scans (TLS) collected in 2014, the year prior to the 2015 earthquake sequence, and scans collected 18 months after the earthquakes and two monsoon cycles. The two datasets were collected using Riegl VZ-1000 and VZ-4000 full-waveform laser scanners at high resolution (c. 0.1 m point spacing as a minimum). The scans cover the full landslide-affected slope from the toe to the crest. The slope is located in Sindhupalchok District, Central Nepal, which experienced some of the highest co-seismic and post-seismic landslide intensities across Nepal due to its proximity to the epicenters (<20 km) of both of the main aftershocks on April 26, 2015 (M6.7) and May 12, 2015 (M7.3). During the 2015 earthquakes and the subsequent 2015 and 2016 monsoons, the slope experienced rockfall and debris flows which are evident in satellite imagery and field photographs. Fracturing of the rock mass associated with the seismic shaking is also evident at scales not accessible through satellite and field observations. The results of change detection between the TLS datasets, with an emphasis on quantification of seismically induced slope damage, are presented. Patterns in the

  7. Strong ground motion prediction using virtual earthquakes.

    Science.gov (United States)

    Denolle, M A; Dunham, E M; Prieto, G A; Beroza, G C

    2014-01-24

    Sedimentary basins increase the damaging effects of earthquakes by trapping and amplifying seismic waves. Simulations of seismic wave propagation in sedimentary basins capture this effect; however, there exists no method to validate these results for earthquakes that have not yet occurred. We present a new approach for ground motion prediction that uses the ambient seismic field. We apply our method to a suite of magnitude 7 scenario earthquakes on the southern San Andreas fault and compare our ground motion predictions with simulations. Both methods find strong amplification and coupling of source and structure effects, but they predict substantially different shaking patterns across the Los Angeles Basin. The virtual earthquake approach thus provides a new framework for predicting long-period strong ground motion.

  8. Earthquake Damping Device for Steel Frame

    Science.gov (United States)

    Zamri Ramli, Mohd; Delfy, Dezoura; Adnan, Azlan; Torman, Zaida

    2018-04-01

    Structures such as buildings, bridges and towers are prone to collapse when natural phenomena like earthquakes occur. Therefore, many design codes are reviewed and new technologies are introduced to resist earthquake energy, especially in buildings, to avoid collapse. The tuned mass damper is one of the earthquake-reduction devices introduced into structures to minimise the earthquake effect. This study aims to analyse the effectiveness of a tuned mass damper through experimental work and finite element modelling. Comparisons are made between these two models under harmonic excitation. Based on the results, it is shown that installing a tuned mass damper reduces the dynamic response of the frame, but only at certain input frequencies. At the highest input frequency applied, the tuned mass damper failed to reduce the responses. In conclusion, in order to use a properly designed damper, detailed analysis must be carried out so that the design is adequate for the location of the structure and its specific ground accelerations.
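
    The frequency-dependent behaviour described above can be reproduced with a simple two-degree-of-freedom model: a primary mass-spring-dashpot representing the frame with an attached tuned mass damper, driven by a harmonic force. The Python sketch below uses hypothetical structural parameters (not the tested frame) and shows the response dropping sharply near resonance while changing little, or even growing slightly, away from it.

```python
import numpy as np

def primary_amplitude(omega, m1, k1, c1, m2=0.0, k2=0.0, c2=0.0, f0=1.0):
    """Steady-state displacement amplitude (m) of the primary mass under F0*sin(wt)."""
    if m2 == 0.0:  # bare frame, single degree of freedom
        return abs(f0 / (-m1 * omega**2 + 1j * c1 * omega + k1))
    coupling = 1j * c2 * omega + k2
    z1 = -m1 * omega**2 + 1j * (c1 + c2) * omega + k1 + k2
    z2 = -m2 * omega**2 + 1j * c2 * omega + k2
    return abs(f0 * z2 / (z1 * z2 - coupling**2))

if __name__ == "__main__":
    m1, k1, c1 = 1000.0, 4.0e5, 800.0      # hypothetical frame: kg, N/m, N*s/m
    m2 = 0.05 * m1                          # 5% damper mass ratio
    wn = np.sqrt(k1 / m1)                   # frame natural frequency, rad/s
    k2, c2 = m2 * wn**2, 120.0              # damper tuned near the frame frequency
    for w in np.linspace(0.5 * wn, 1.5 * wn, 7):
        bare = primary_amplitude(w, m1, k1, c1, f0=1000.0)
        damped = primary_amplitude(w, m1, k1, c1, m2, k2, c2, f0=1000.0)
        print(f"w/wn = {w/wn:.2f}: bare {bare*1e3:6.1f} mm, with TMD {damped*1e3:6.1f} mm")
```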

  9. Fault failure with moderate earthquakes

    Science.gov (United States)

    Johnston, M. J. S.; Linde, A. T.; Gladwin, M. T.; Borcherdt, R. D.

    1987-12-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake ( ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake ( ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake ( ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake ( ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake ( ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10 -8), with borehole dilatometers (resolution 10 -10) and a 3-component borehole strainmeter (resolution 10 -9). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure.

  10. GIS BASED SYSTEM FOR POST-EARTHQUAKE CRISIS MANAGMENT USING CELLULAR NETWORK

    OpenAIRE

    Raeesi, M.; Sadeghi-Niaraki, A.

    2013-01-01

    Earthquakes are among the most destructive natural disasters. Earthquakes happen mainly near the edges of tectonic plates, but they may happen just about anywhere. Earthquakes cannot be predicted. Quick response after disasters like earthquakes decreases loss of life and costs. Massive earthquakes often cause structures to collapse, trapping victims under dense rubble for long periods of time. After an earthquake has destroyed some areas, several teams are sent to find the location of the d...

  11. Understanding Great Earthquakes in Japan's Kanto Region

    Science.gov (United States)

    Kobayashi, Reiji; Curewitz, Daniel

    2008-10-01

    Third International Workshop on the Kanto Asperity Project; Chiba, Japan, 16-19 February 2008; The 1703 (Genroku) and 1923 (Taisho) earthquakes in Japan's Kanto region (M 8.2 and M 7.9, respectively) caused severe damage in the Tokyo metropolitan area. These great earthquakes occurred along the Sagami Trough, where the Philippine Sea slab is subducting beneath Japan. Historical records, paleoseismological research, and geophysical/geodetic monitoring in the region indicate that such great earthquakes will repeat in the future.

  12. Magnitudes and frequencies of earthquakes in relation to seismic risk

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1989-01-01

    Estimating the frequencies of occurrence of earthquakes of different magnitudes on a regional basis is an important task in estimating seismic risk at a construction site. Analysis of global earthquake data provides insight into the magnitude-frequency relationship in a statistical manner. It turns out that, whereas a linear relationship between the logarithm of earthquake occurrence rates and the corresponding earthquake magnitudes fits well in the magnitude range between 5 and 7, a second-degree polynomial in the earthquake magnitude M provides a better description of the frequencies of earthquakes over a much wider range of magnitudes. It may be possible to adopt such magnitude-frequency relations for regions for which adequate earthquake data are not available, in order to carry out seismic risk calculations. (author). 32 refs., 8 tabs., 7 figs
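
    The comparison described above can be sketched numerically: fit the same annual rates with the linear Gutenberg-Richter form log10(N) = a - b*M and with a second-degree polynomial in M, then compare the predicted rates. The rates below are invented for illustration and are not the data analysed in the paper.

```python
import numpy as np

# Synthetic annual occurrence rates (log10 events/yr) at magnitudes 5.0-8.0
magnitudes = np.arange(5.0, 8.1, 0.5)
log_rates = np.array([1.05, 0.55, 0.02, -0.52, -1.15, -1.85, -2.60])

# (a) linear Gutenberg-Richter fit: log10(N) = a - b*M
b_lin, a_lin = np.polyfit(magnitudes, log_rates, 1)
# (b) quadratic fit: log10(N) = c2*M^2 + c1*M + c0
c2, c1, c0 = np.polyfit(magnitudes, log_rates, 2)

print(f"G-R fit      : log10(N) = {a_lin:.2f} - {abs(b_lin):.2f} M")
print(f"quadratic fit: log10(N) = {c2:.3f} M^2 + {c1:.2f} M + {c0:.2f}")
for m in (6.0, 7.0, 8.0):
    n_lin = 10 ** (a_lin + b_lin * m)
    n_quad = 10 ** (c2 * m**2 + c1 * m + c0)
    print(f"M{m:.1f}: {n_lin:.3f} vs {n_quad:.3f} events/yr")
```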

  13. The 1985 México earthquake

    Directory of Open Access Journals (Sweden)

    Moreno Murillo Juan Manuel

    1995-10-01

    This paper includes a bibliographic review with descriptions of various aspects of the (Ms = 8.1) Michoacan, Mexico earthquake, which comprised three events. The main shock of the September 19, 1985 earthquake occurred on Thursday at 7h 17m 46.6s local time in Mexico City and had Ms = 8.1. The focus of the event was at a depth of approximately 18 km. A second shock occurred on Friday evening, 21 September, at 7h 38m p.m. local time. The last aftershock occurred on 30 April 1986 (Ms = 7.0). An event prior to the September 1985 earthquake, which occurred on 28 May 1985 (mb = 5.2), is also described. This event was a terrible natural disaster for that country: at least 9,500 people were killed, about 30,000 were injured, more than 100,000 were left homeless, and severe damage occurred in many parts of Mexico City and several states of central Mexico. According to some sources, it is estimated that the earthquake seriously affected an area of approximately 825,000 square kilometers. This paper presents a summary of the global tectonic setting, the genesis and location of the epicenter, an interpretation of the source mechanism, and an analysis of records from some of the stations that recorded this earthquake, together with a comparison between the two largest earthquakes of 1985. Moreover, this paper describes the principal resulting damage and the effects of the tsunami produced by the earthquake. The 1985 Mexico earthquake occurred as a result of slip in the subduction process between the Cocos and American plates. This was a shallow interplate thrust-type event which occurred at the intersection of the Orozco fracture zone with the Middle America trench.

  14. Global variations of large megathrust earthquake rupture characteristics

    Science.gov (United States)

    Kanamori, Hiroo

    2018-01-01

    Despite the surge of great earthquakes along subduction zones over the last decade and advances in observations and analysis techniques, it remains unclear whether earthquake complexity is primarily controlled by persistent fault properties or by dynamics of the failure process. We introduce the radiated energy enhancement factor (REEF), given by the ratio of an event’s directly measured radiated energy to the calculated minimum radiated energy for a source with the same seismic moment and duration, to quantify the rupture complexity. The REEF measurements for 119 large [moment magnitude (Mw) 7.0 to 9.2] megathrust earthquakes distributed globally show marked systematic regional patterns, suggesting that the rupture complexity is strongly influenced by persistent geological factors. We characterize this as the existence of smooth and rough rupture patches with varying interpatch separation, along with failure dynamics producing triggering interactions that augment the regional influences on large events. We present an improved asperity scenario incorporating both effects and categorize global subduction zones and great earthquakes based on their REEF values and slip patterns. Giant earthquakes rupturing over several hundred kilometers can occur in regions with low-REEF patches and small interpatch spacing, such as for the 1960 Chile, 1964 Alaska, and 2011 Tohoku earthquakes, or in regions with high-REEF patches and large interpatch spacing as in the case for the 2004 Sumatra and 1906 Ecuador-Colombia earthquakes. Thus, combining seismic magnitude Mw and REEF, we provide a quantitative framework to better represent the span of rupture characteristics of great earthquakes and to understand global seismicity. PMID:29750186

  15. Is earthquake rate in south Iceland modified by seasonal loading?

    Science.gov (United States)

    Jonsson, S.; Aoki, Y.; Drouin, V.

    2017-12-01

    Several temporally varying processes have the potential of modifying the rate of earthquakes in the south Iceland seismic zone, one of the two most active seismic zones in Iceland. These include solid earth tides, seasonal meteorological effects and the influence of passing weather systems, and variations in snow and glacier loads. In this study we investigate the influence these processes may have on crustal stresses and stressing rates in the seismic zone and assess whether they appear to be influencing the earthquake rate. While historical earthquakes in south Iceland have preferentially occurred in early summer, this tendency is less clear for small earthquakes. The local earthquake catalogue (going back to 1991) is strongly influenced by the aftershocks of the M6+ earthquakes which occurred in June 2000 and May 2008. Standard Reasenberg earthquake declustering or more involved model-independent stochastic declustering algorithms are not capable of fully eliminating the aftershocks from the catalogue. We therefore inspected the catalogue for the time period before 2000, and it shows a limited seasonal tendency in earthquake occurrence. Our preliminary results show no clear correlation between earthquake rates and short-term stressing variations induced by solid earth tides or passing storms. Seasonal meteorological effects also appear to be too small to influence the earthquake activity. Snow and glacier load variations induce significant vertical motions in the area, with peak loading occurring in spring (April-May) and maximum unloading in fall (Sept.-Oct.). The early-summer occurrence of historical earthquakes therefore correlates with early unloading rather than with the peak unloading or the unloading rate, which appears to indicate a limited influence of this seasonal process on the earthquake activity.

  16. Correlating precursory declines in groundwater radon with earthquake magnitude.

    Science.gov (United States)

    Kuo, T

    2014-01-01

    Both studies at the Antung hot spring in eastern Taiwan and at the Paihe spring in southern Taiwan confirm that groundwater radon can be a consistent tracer for strain changes in the crust preceding an earthquake when observed in a low-porosity fractured aquifer surrounded by a ductile formation. Recurrent anomalous declines in groundwater radon were observed at the Antung D1 monitoring well in eastern Taiwan prior to the five earthquakes of magnitude (Mw) 6.8, 6.1, 5.9, 5.4, and 5.0 that occurred on December 10, 2003; April 1, 2006; April 15, 2006; February 17, 2008; and July 12, 2011, respectively. For earthquakes occurring on the longitudinal valley fault in eastern Taiwan, the observed radon minima decrease as the earthquake magnitude increases. This correlation has proven useful for early warning of large local earthquakes. In southern Taiwan, anomalous radon declines prior to the 2010 Mw 6.3 Jiasian, 2012 Mw 5.9 Wutai, and 2012 ML 5.4 Kaohsiung earthquakes were also recorded at the Paihe spring. For earthquakes occurring on different faults in southern Taiwan, a correlation between the observed radon minima and the earthquake magnitude is not yet possible. © 2013, National Ground Water Association.

  17. Earthquake outlook for the San Francisco Bay region 2014–2043

    Science.gov (United States)

    Aagaard, Brad T.; Blair, James Luke; Boatwright, John; Garcia, Susan H.; Harris, Ruth A.; Michael, Andrew J.; Schwartz, David P.; DiLeo, Jeanne S.; Jacques, Kate; Donlin, Carolyn

    2016-06-13

    Using information from recent earthquakes, improved mapping of active faults, and a new model for estimating earthquake probabilities, the 2014 Working Group on California Earthquake Probabilities updated the 30-year earthquake forecast for California. They concluded that there is a 72 percent probability (or likelihood) of at least one earthquake of magnitude 6.7 or greater striking somewhere in the San Francisco Bay region before 2043. Earthquakes this large are capable of causing widespread damage; therefore, communities in the region should take simple steps to help reduce injuries, damage, and disruption, as well as accelerate recovery from these earthquakes.

  18. Surface Rupture Effects on Earthquake Moment-Area Scaling Relations

    Science.gov (United States)

    Luo, Yingdi; Ampuero, Jean-Paul; Miyakoshi, Ken; Irikura, Kojiro

    2017-09-01

    Empirical earthquake scaling relations play a central role in fundamental studies of earthquake physics and in current practice of earthquake hazard assessment, and are being refined by advances in earthquake source analysis. A scaling relation between seismic moment (M0) and rupture area (A) currently in use for ground motion prediction in Japan features a transition regime of the form M0-A^2, between the well-recognized small (self-similar) and very large (W-model) earthquake regimes, which has counter-intuitive attributes and uncertain theoretical underpinnings. Here, we investigate the mechanical origin of this transition regime via earthquake cycle simulations, analytical dislocation models and numerical crack models on strike-slip faults. We find that, even if stress drop is assumed constant, the properties of the transition regime are controlled by surface rupture effects, comprising an effective rupture elongation along-dip due to a mirror effect and systematic changes of the shape factor relating slip to stress drop. Based on this physical insight, we propose a simplified formula to account for these effects in M0-A scaling relations for strike-slip earthquakes.
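
    A numerical caricature of the three regimes named above is given below in Python: M0 ~ A^(3/2) for small (self-similar) ruptures, a transition regime with M0 ~ A^2, and M0 ~ A for very large (W-model) ruptures. The crossover areas and the prefactor are purely hypothetical, chosen only so the branches join continuously; this is not the calibrated relation used for ground motion prediction in Japan.

```python
import math

A1, A2 = 500.0, 4000.0   # hypothetical crossover rupture areas, km^2
C = 1.0e16               # hypothetical prefactor for the self-similar branch, N*m/km^3

def moment_from_area(a_km2):
    """Piecewise moment-area scaling: A^1.5, then A^2, then A, joined continuously."""
    if a_km2 <= A1:
        return C * a_km2 ** 1.5
    if a_km2 <= A2:
        return C * A1 ** -0.5 * a_km2 ** 2      # matches the first branch at A1
    return C * A1 ** -0.5 * A2 * a_km2          # matches the second branch at A2

if __name__ == "__main__":
    for a in (100, 500, 1500, 4000, 10000):
        m0 = moment_from_area(a)
        mw = (2.0 / 3.0) * (math.log10(m0) - 9.05)   # Hanks-Kanamori
        print(f"A = {a:6d} km^2 -> M0 = {m0:.2e} N*m, Mw = {mw:.2f}")
```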

  19. Roaming earthquakes in China highlight midcontinental hazards

    Science.gov (United States)

    Liu, Mian; Wang, Hui

    2012-11-01

    Before dawn on 28 July 1976, a magnitude (M) 7.8 earthquake struck Tangshan, a Chinese industrial city only 150 kilometers from Beijing (Figure 1a). In a brief moment, the earthquake destroyed the entire city and killed more than 242,000 people [Chen et al., 1988]. More than 30 years have passed, and upon the ruins a new Tangshan city has been built. However, the memory of devastation remains fresh. For this reason, a sequence of recent small earthquakes in the Tangshan region, including an M 4.8 event on 28 May and an M 4.0 event on 18 June 2012, has caused widespread concerns and heated debate in China. In the science community, the debate is whether the recent Tangshan earthquakes are the aftershocks of the 1976 earthquake despite the long gap in time since the main shock or harbingers of a new period of active seismicity in Tangshan and the rest of North China, where seismic activity seems to fluctuate between highs and lows over periods of a few decades [Ma, 1989].

  20. Multiple Spectral Ratio Analyses Reveal Earthquake Source Spectra of Small Earthquakes and Moment Magnitudes of Microearthquakes

    Science.gov (United States)

    Uchide, T.; Imanishi, K.

    2016-12-01

    Spectral studies of macroscopic earthquake source parameters are helpful for characterizing the earthquake rupture process and hence for understanding earthquake source physics and fault properties. Those studies require us to remove wave propagation path and site effects from the spectra of seismograms in order to accentuate the source effect. We have recently developed the multiple spectral ratio method [Uchide and Imanishi, BSSA, 2016], employing many empirical Green's function (EGF) events to reduce errors from the choice of EGF events. This method helps us estimate source spectra more accurately, as well as moment ratios among reference and EGF events, which are useful for constraining the seismic moment of microearthquakes. First, we focus on earthquake source spectra. Source spectra have generally been thought to obey the omega-square model with a single corner frequency. However, recent studies imply the existence of another corner frequency for some earthquakes. We analyzed small, shallow inland earthquakes (Mw 3.5 and above) by multiple spectral ratio analyses. For 20000 microearthquakes in the Fukushima Hamadori and northern Ibaraki prefecture area, we found that the JMA magnitudes (Mj), based on displacement or velocity amplitude, are systematically below Mw. The slope of the Mj-Mw relation is 0.5 for small Mj and approaches 1 above about Mj 5. We propose a fitting curve for the obtained relationship as Mw = (1/2)Mj + (1/2)(Mj^γ + Mcor^γ)^(1/γ) + c, where Mcor is a corner magnitude, γ determines the sharpness of the corner, and c denotes an offset. We obtained Mcor = 4.1, γ = 5.6, and c = -0.47 to fit the observations. These parameters are useful for characterizing the Mj-Mw relationship. This non-linear relationship affects the b-value of the Gutenberg-Richter law. Quantitative discussions of b-values are affected by the definition of the magnitude used.
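
    The fitting curve quoted above can be evaluated directly. The Python sketch below implements Mw = (1/2)Mj + (1/2)(Mj^γ + Mcor^γ)^(1/γ) + c with the reported parameters Mcor = 4.1, γ = 5.6 and c = -0.47; the sample Mj values are arbitrary (and must be positive for the power to be defined).

```python
# Non-linear Mj-Mw conversion curve with the parameters reported in the abstract.
MCOR, GAMMA, C = 4.1, 5.6, -0.47

def mw_from_mj(mj):
    """Moment magnitude from a (positive) JMA magnitude using the fitted curve."""
    return 0.5 * mj + 0.5 * (mj**GAMMA + MCOR**GAMMA) ** (1.0 / GAMMA) + C

if __name__ == "__main__":
    for mj in (1.0, 2.0, 3.0, 4.0, 5.0, 6.0):
        print(f"Mj {mj:.1f} -> Mw {mw_from_mj(mj):.2f}")
```

    For Mj well below Mcor the curve gives a slope of 0.5, and for Mj well above Mcor it approaches a slope of 1 with offset c, which is the behaviour described in the abstract.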

  1. The Klamath Falls, Oregon, earthquakes on September 20, 1993

    Science.gov (United States)

    Brantley, S.R.

    1993-01-01

    The strongest earthquake to strike Oregon in more than 50 yrs struck the southern part of the State on September 20, 1993. These shocks, a magnitude 5.9 earthquake at 8:28pm and a magnitude 6.0 earthquake at 10:45pm, were the opening salvo in a swarm of earthquakes that continued for more than three months. During this period, several thousand aftershocks, many strong enough to be felt, were recorded by seismographs.

  2. The limits of earthquake early warning: Timeliness of ground motion estimates

    OpenAIRE

    Minson, Sarah E.; Meier, Men-Andrin; Baltay, Annemarie S.; Hanks, Thomas C.; Cochran, Elizabeth S.

    2018-01-01

    The basic physics of earthquakes is such that strong ground motion cannot be expected from an earthquake unless the earthquake itself is very close or has grown to be very large. We use simple seismological relationships to calculate the minimum time that must elapse before such ground motion can be expected at a distance from the earthquake, assuming that the earthquake magnitude is not predictable. Earthquake early warning (EEW) systems are in operation or development for many regions aroun...
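
    A minimal numerical illustration of the timeliness limit discussed above: assume an alert can only be issued after the P wave reaches a nearby station and a fixed processing delay elapses, while damaging shaking travels outward at roughly the S-wave speed, so warning time grows with distance and is negative inside a blind zone. The velocities, trigger distance and delay in the Python sketch are assumed round numbers, not values from the paper.

```python
VP, VS = 6.0, 3.5          # assumed crustal P and S wave speeds, km/s
PROCESSING_DELAY = 4.0     # assumed data accumulation + latency before an alert, s

def warning_time(epicentral_distance_km, trigger_distance_km=20.0):
    """Seconds between the alert and S-wave arrival at the user (negative = too late)."""
    alert_time = trigger_distance_km / VP + PROCESSING_DELAY
    s_arrival = epicentral_distance_km / VS
    return s_arrival - alert_time

if __name__ == "__main__":
    for d in (10, 25, 50, 100, 200):
        wt = warning_time(d)
        label = "blind zone" if wt <= 0 else f"{wt:4.1f} s of warning"
        print(f"distance {d:3d} km: {label}")
```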

  3. Experimental study of structural response to earthquakes

    International Nuclear Information System (INIS)

    Clough, R.W.; Bertero, V.V.; Bouwkamp, J.G.; Popov, E.P.

    1975-01-01

    The objectives, methods, and some of the principal results obtained from experimental studies of the behavior of structures subjected to earthquakes are described. Although such investigations are being conducted in many laboratories throughout the world, the information presented deals specifically with projects being carried out at the Earthquake Engineering Research Center (EERC) of the University of California, Berkeley. A primary purpose of these investigations is to obtain detailed information on the inelastic response mechanisms in typical structural systems so that the experimentally observed performance can be compared with computer-generated analytical predictions. Only by such comparisons can the mathematical models used in dynamic nonlinear analyses be verified and improved. Two experimental procedures for investigating earthquake structural response are discussed: the earthquake simulator facility, which subjects the base of the test structure to acceleration histories similar to those recorded in actual earthquakes, and systems of hydraulic rams, which impose specified displacement histories on the test components, equivalent to motions developed in structures subjected to actual earthquakes. The general concept and performance of the 20-ft-square EERC earthquake simulator are described, and the testing of a two-story concrete frame building is outlined. Correlation of the experimental results with analytical predictions demonstrates that satisfactory agreement can be obtained only if the mathematical model incorporates a stiffness deterioration mechanism which simulates the cracking and other damage suffered by the structure

  4. When it happens again: impact of future San Francisco Bay area earthquakes

    Science.gov (United States)

    Zoback, M.; Boatwright, J.; Kornfield, L.; Scawthorn, C.; Rojahn, C.

    2005-12-01

    Hayward Fault, an Association of Bay Area Governments study concluded that the two quakes would produce similar numbers of displaced households, between 155,000 and 160,000 in the 9-county Bay area (www.abag.ca.gov/bayarea/eqmaps/eqhouse.html). A recent vulnerability analysis for the city and county of San Francisco by the Applied Technology Council found that a repeat of a 1906-size quake and subsequent fires could completely destroy 35% of the privately owned buildings (~45,000 buildings out of a total of ~122,000). These losses in the City are comparable to those in 1906. This CAPSS (Community Action Plan for Seismic Safety) analysis was conducted using HAZUS modified for detailed ground motion inputs and a building inventory at neighborhood scale and of unprecedented detail. It concluded that the most serious impact of future quakes in San Francisco would be damage to and loss of residential housing, due largely to the construction style and age of the housing stock. More than 122,000 displaced households were estimated for a repeat of 1906 (out of a total of ~329,000; 42%). Defining the scope of the earthquake risk and exposure is a necessary step in developing long-term mitigation plans, including criteria for the evaluation and repair of damaged buildings, incentives, and education. As Katrina taught us, recovery from disasters producing massive loss of housing and personal property is a formidable challenge.

  5. Four Examples of Short-Term and Imminent Prediction of Earthquakes

    Science.gov (United States)

    zeng, zuoxun; Liu, Genshen; Wu, Dabin; Sibgatulin, Victor

    2014-05-01

    We show here four examples of short-term and imminent prediction of earthquakes in China last year. They are the Nima earthquake (Ms5.2), the Minxian earthquake (Ms6.6), the Nantou earthquake (Ms6.7) and the Dujiangyan earthquake (Ms4.1). Imminent prediction of the Nima earthquake (Ms5.2): Based on the comprehensive analysis of the prediction by Victor Sibgatulin using natural electromagnetic pulse anomalies, the prediction by Song Song and Song Kefu using observation of a precursory halo, and an observation of the locations of degasification of the earth in Naqu, Tibet by Zeng Zuoxun himself, the first author made a prediction for an earthquake of around Ms 6 within 10 days in the area of the degasification point (31.5N, 89.0E) at 0:54 on May 8th, 2013. He supplied another degasification point (31N, 86E) for the epicenter prediction at 8:34 of the same day. At 18:54:30 on May 15th, 2013, an earthquake of Ms5.2 occurred in Nima County, Naqu, China. Imminent prediction of the Minxian earthquake (Ms6.6): At 7:45 on July 22nd, 2013, an earthquake of magnitude Ms6.6 occurred at the border between Minxian and Zhangxian of Dingxi City (34.5N, 104.2E), Gansu province. We review the imminent prediction process and the basis for this earthquake using the fingerprint method. Curves of 9 or 15 channels of anomalous components versus time can be output from the SW monitor for earthquake precursors. These components include geomagnetism, geoelectricity, crustal stresses, resonance, and crustal inclination. When we compress the time axis, the output curves become different geometric images. The precursor images are different for earthquakes in different regions. Alike or similar images correspond to earthquakes in a certain region. According to seven years of observation of the precursor images and their corresponding earthquakes, we usually obtain the fingerprint 6 days before the corresponding earthquakes. The magnitude prediction needs a comparison between the amplitudes of the fingerprints from the same

  6. Earthquake clustering in modern seismicity and its relationship with strong historical earthquakes around Beijing, China

    Science.gov (United States)

    Wang, Jian; Main, Ian G.; Musson, Roger M. W.

    2017-11-01

    Beijing, China's capital city, is located in a typical intraplate seismic belt, with relatively high-quality instrumental catalogue data available since 1970. The Chinese historical earthquake catalogue contains six strong historical earthquakes of Ms ≥ 6 around Beijing, the earliest in 294 AD. This poses a significant potential hazard to one of the most densely populated and economically active parts of China. In some intraplate areas, persistent clusters of events associated with historical events can occur over centuries, for example, the ongoing sequence in the New Madrid zone of the eastern US. Here we will examine the evidence for such persistent clusters around Beijing. We introduce a metric known as the `seismic density index' that quantifies the degree of clustering of seismic energy release. For a given map location, this multi-dimensional index depends on the number of events, their magnitudes, and the distances to the locations of the surrounding population of earthquakes. We apply the index to modern instrumental catalogue data between 1970 and 2014, and identify six clear candidate zones. We then compare these locations to earthquake epicentre and seismic intensity data for the six largest historical earthquakes. Each candidate zone contains one of the six historical events, and the location of peak intensity is within 5 km or so of the reported epicentre in five of these cases. In one case—the great Ms 8 earthquake of 1679—the peak is closer to the area of strongest shaking (Intensity XI or more) than the reported epicentre. The present-day event rates are similar to those predicted by the modified Omori law but there is no evidence of ongoing decay in event rates. Accordingly, the index is more likely to be picking out the location of persistent weaknesses in the lithosphere. Our results imply that zones of high seismic density index could in principle be used to indicate the location of unrecorded historical or palaeoseismic events, in China and
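
    The abstract does not give the exact form of the seismic density index, but its ingredients (event counts, magnitudes, and distances to a map location) can be illustrated with a very simple distance-weighted sum of log-moments. The Python sketch below is one possible form of my own construction, not the authors' definition, and the catalogue is synthetic.

```python
import numpy as np

def density_index(site_xy, event_xy, event_mw, r0_km=5.0):
    """Illustrative clustering metric: distance-weighted sum of log-moment at a site."""
    d = np.linalg.norm(np.asarray(event_xy) - np.asarray(site_xy), axis=1)
    log_moment = 1.5 * np.asarray(event_mw) + 9.05    # Hanks-Kanamori log10(M0), N*m
    return float(np.sum(log_moment / (1.0 + d / r0_km)))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    events = rng.uniform(0, 100, size=(300, 2))       # synthetic epicentres, km
    mags = rng.uniform(1.0, 4.5, size=300)            # synthetic magnitudes
    for site in [(20.0, 20.0), (50.0, 50.0), (80.0, 30.0)]:
        print(f"site {site}: density index {density_index(site, events, mags):.1f}")
```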

  7. U.S. Geological Survey (USGS) Earthquake Web Applications

    Science.gov (United States)

    Fee, J.; Martinez, E.

    2015-12-01

    USGS Earthquake web applications provide access to earthquake information from USGS and other Advanced National Seismic System (ANSS) contributors. One of the primary goals of these applications is to provide a consistent experience for accessing both near-real time information as soon as it is available and historic information after it is thoroughly reviewed. Millions of people use these applications every month including people who feel an earthquake, emergency responders looking for the latest information about a recent event, and scientists researching historic earthquakes and their effects. Information from multiple catalogs and contributors is combined by the ANSS Comprehensive Catalog into one composite catalog, identifying the most preferred information from any source for each event. A web service and near-real time feeds provide access to all contributed data, and are used by a number of users and software packages. The Latest Earthquakes application displays summaries of many events, either near-real time feeds or custom searches, and the Event Page application shows detailed information for each event. Because all data is accessed through the web service, it can also be downloaded by users. The applications are maintained as open source projects on github, and use mobile-first and responsive-web-design approaches to work well on both mobile devices and desktop computers. http://earthquake.usgs.gov/earthquakes/map/
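
    As an illustration of programmatic access to the composite catalog and web service mentioned above, the Python sketch below queries the public event service for recent earthquakes as GeoJSON. The endpoint and parameter names follow the publicly documented fdsnws event interface; the query values are arbitrary examples, and production code should consult the service documentation and handle errors and paging.

```python
import json
import urllib.parse
import urllib.request

BASE = "https://earthquake.usgs.gov/fdsnws/event/1/query"

def recent_earthquakes(min_magnitude=5.0, starttime="2024-01-01", limit=10):
    """Fetch a small GeoJSON sample of events from the USGS/ANSS event service."""
    params = urllib.parse.urlencode({
        "format": "geojson",
        "starttime": starttime,
        "minmagnitude": min_magnitude,
        "orderby": "time",
        "limit": limit,
    })
    with urllib.request.urlopen(f"{BASE}?{params}", timeout=30) as resp:
        return json.load(resp)

if __name__ == "__main__":
    catalog = recent_earthquakes()
    for feature in catalog["features"]:
        props = feature["properties"]
        print(f'{props["mag"]:.1f}  {props["place"]}')
```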

  8. Calibration and validation of earthquake catastrophe models. Case study: Impact Forecasting Earthquake Model for Algeria

    Science.gov (United States)

    Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.

    2012-04-01

    Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry, in order to assure a model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those that could happen in the future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) the seismic intensity attenuation model, using macroseismic observations and maps from past earthquakes in Algeria; (2) calculation of country-specific vulnerability modifiers, using past damage observations in the country. The Benouar (1994) ground motion prediction relationship proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to 10% to 40% larger vulnerability indexes for different building types compared to average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, considering the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by means of "as-if" historical scenario simulations of three past earthquakes in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli events. The calculated return periods of the losses for the client market portfolio align with the
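
    The rebuilding cost factors quoted above lend themselves to a simple worked example: once the probabilities of the five EMS-98 damage grades are known for a building class at a given intensity, the expected repair cost follows as a weighted sum. In the Python sketch below, only the cost factors (10%, 20%, 35%, 75%, 100%) come from the abstract; the damage-grade probabilities are invented for illustration.

      # Rebuilding cost factors per EMS-98 damage grade, as quoted in the abstract.
      COST_FACTOR = {1: 0.10, 2: 0.20, 3: 0.35, 4: 0.75, 5: 1.00}

      def mean_damage_ratio(grade_probabilities):
          """Expected repair cost as a fraction of the replacement value.

          grade_probabilities -- dict mapping damage grade (1-5) to probability
          """
          return sum(p * COST_FACTOR[g] for g, p in grade_probabilities.items())

      # Hypothetical damage-grade distribution for one building class at one intensity.
      p_grades = {1: 0.30, 2: 0.25, 3: 0.20, 4: 0.15, 5: 0.05}   # remaining 5% undamaged
      print(f"Mean damage ratio: {mean_damage_ratio(p_grades):.4f}")  # 0.3125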

  9. VLF/LF Radio Sounding of Ionospheric Perturbations Associated with Earthquakes

    Directory of Open Access Journals (Sweden)

    Masashi Hayakawa

    2007-07-01

    It is recently recognized that the ionosphere is very sensitive to seismic effects, and the detection of ionospheric perturbations associated with earthquakes seems to be very promising for short-term earthquake prediction. We have proposed a possible use of VLF/LF (very low frequency, 3-30 kHz / low frequency, 30-300 kHz) radio sounding of the seismo-ionospheric perturbations. A brief history of the use of subionospheric VLF/LF propagation for short-term earthquake prediction is given, followed by a significant finding of ionospheric perturbation for the Kobe earthquake in 1995. After showing previous VLF/LF results, we present the latest VLF/LF findings: one is the statistical correlation of the ionospheric perturbation with earthquakes, and the second is a case study for the Sumatra earthquake of December 2004, indicating the spatial scale and dynamics of the ionospheric perturbation for this earthquake.
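
    In studies of this kind, seismo-ionospheric perturbations are typically sought as anomalies in the nighttime amplitude of a fixed VLF/LF transmitter recorded along a given propagation path. The Python sketch below illustrates one simple anomaly criterion, flagging nights that depart from a trailing mean by more than two standard deviations; the 15-day window and the 2-sigma threshold are assumed values for illustration and are not taken from the paper.

      import numpy as np

      def flag_vlf_anomalies(nightly_amplitude, window=15, n_sigma=2.0):
          """Flag nights whose amplitude departs from the trailing mean by more
          than n_sigma standard deviations (illustrative criterion only)."""
          x = np.asarray(nightly_amplitude, dtype=float)
          flags = np.zeros(len(x), dtype=bool)
          for i in range(window, len(x)):
              reference = x[i - window:i]
              flags[i] = abs(x[i] - reference.mean()) > n_sigma * reference.std()
          return flags

      # Synthetic nighttime amplitude record (dB) with one depressed night near the end.
      rng = np.random.default_rng(0)
      amplitude = 50.0 + rng.normal(0.0, 0.5, 40)
      amplitude[35] -= 4.0
      print(np.where(flag_vlf_anomalies(amplitude))[0])   # expected to include night 35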

  10. Earthquake Loss Scenarios in the Himalayas

    Science.gov (United States)

    Wyss, M.; Gupta, S.; Rosset, P.; Chamlagain, D.

    2017-12-01

    We estimate quantitatively that in repeats of the 1555 and 1505 great Himalayan earthquakes the fatalities may range from 51K to 549K, the injured from 157K to 1,700K, and the strongly affected population (Intensity ≥ VI) from 15 to 75 million, depending on the details of the assumed earthquake parameters. For up-dip ruptures in the stressed segments of the M7.8 Gorkha 2015, the M7.9 Subansiri 1947 and the M7.8 Kangra 1905 earthquakes, we estimate 62K, 100K and 200K fatalities, respectively. We estimate the numbers of strongly affected people as 8, 12 and 33 million in these cases, respectively. These loss calculations are based on verifications of the QLARM algorithms and data set in the cases of the M7.8 Gorkha 2015, the M7.8 Kashmir 2005, the M6.6 Chamoli 1999, the M6.8 Uttarkashi 1991 and the M7.8 Kangra 1905 earthquakes. The requirement of verification that was fulfilled in these test cases was that the reported intensity field and the fatality count had to match approximately, using the known parameters of the earthquakes. The apparent attenuation factor was a free parameter and ranged within acceptable values. Population numbers were adjusted for the years in question from the latest census. The hour of day was assumed to be at night, with maximum occupation. The assumption that the upper half of the Main Frontal Thrust (MFT) will rupture in companion earthquakes to historic earthquakes in the down-dip half is based on observations of several meters of displacement in trenches across the MFT outcrop. Among mitigation measures, awareness with training and adherence to construction codes rank highest. Retrofitting of schools and hospitals would save lives and prevent injuries. Preparation plans for helping millions of strongly affected people should be put in place. These mitigation efforts should focus on an approximately 7 km wide strip along the MFT on the up-thrown side, because the strong motions are likely to be doubled there. We emphasize that our estimates

  11. Quantitative Earthquake Prediction on Global and Regional Scales

    International Nuclear Information System (INIS)

    Kossobokov, Vladimir G.

    2006-01-01

    The Earth is a hierarchy of volumes of different size. Driven by planetary convection, these volumes are involved in joint and relative movement. The movement is controlled by a wide variety of processes on and around the fractal mesh of boundary zones, and does produce earthquakes. This hierarchy of movable volumes composes a large non-linear dynamical system. Prediction of such a system in the sense of extrapolating its trajectory into the future is futile. However, upon coarse-graining, integral empirical regularities emerge, opening possibilities of prediction in the sense of the commonly accepted consensus definition worked out in 1976 by the US National Research Council. Implications of understanding the hierarchical nature of the lithosphere and its dynamics, based on systematic monitoring and evidence of its unified space-energy similarity at different scales, help avoid basic errors in earthquake prediction claims. They suggest rules and recipes for adequate earthquake prediction classification, comparison and optimization. The approach has already led to the design of a reproducible intermediate-term, middle-range earthquake prediction technique. Its real-time testing, aimed at prediction of the largest earthquakes worldwide, has proved beyond any reasonable doubt the effectiveness of practical earthquake forecasting. In the first approximation, the accuracy is about 1-5 years and 5-10 times the anticipated source dimension. Further analysis allows reducing the spatial uncertainty down to 1-3 source dimensions, although at a cost of additional failures-to-predict. Despite the limited accuracy, considerable damage could be prevented by timely, knowledgeable use of the existing predictions and earthquake prediction strategies. The December 26, 2004 Indian Ocean Disaster seems to be the first indication that the methodology, designed for prediction of M8.0+ earthquakes, can be rescaled for prediction of both smaller magnitude earthquakes (e.g., down to M5.5+ in Italy) and

  13. Identified EM Earthquake Precursors

    Science.gov (United States)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and to warn the public in turn. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate for creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating long-wave generation with sufficient amplitude and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field, occurring with associated phases, using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February 2013. The antennae have mobility, and observations were noted for

  14. Wood-framed houses for earthquake zones

    DEFF Research Database (Denmark)

    Hansen, Klavs Feilberg

    Wood-framed houses with sheathing are suitable for use in earthquake zones. The Direction describes a method of determining the earthquake forces in a house and shows how these forces can be resisted by diaphragm action in the walls, floors, and roof of the house. An appendix explains how...
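
    The general approach of determining earthquake forces and resisting them through diaphragm action can be illustrated with an equivalent-static calculation: a base shear proportional to the building weight, distributed to the sheathed walls in proportion to their length. The Python sketch below is only an illustration under that assumption; the seismic coefficient and wall lengths are invented numbers, not values from the Direction.

      def wall_shears(total_weight_kN, seismic_coefficient, wall_lengths_m):
          """Distribute an equivalent-static base shear to parallel shear walls
          in proportion to wall length (uniform sheathing assumed)."""
          base_shear = seismic_coefficient * total_weight_kN
          total_length = sum(wall_lengths_m)
          return base_shear, [base_shear * L / total_length for L in wall_lengths_m]

      # One-storey house of 400 kN, assumed seismic coefficient 0.2,
      # three sheathed walls acting in the direction considered.
      V, per_wall = wall_shears(400.0, 0.2, [6.0, 4.0, 2.0])
      print(V, [round(v, 1) for v in per_wall])   # 80.0 kN -> [40.0, 26.7, 13.3]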

  15. Money matters: Rapid post-earthquake financial decision-making

    Science.gov (United States)

    Wald, David J.; Franco, Guillermo

    2016-01-01

    Post-earthquake financial decision-making is a realm unfamiliar to most people. In the immediate aftermath of a damaging earthquake, billions of dollars of relief, recovery, and insurance funds are in the balance through new financial instruments that allow those with resources to hedge against disasters and those at risk to limit their earthquake losses and receive funds for response and recovery.

  16. Earthquake Loss Scenarios: Warnings about the Extent of Disasters

    Science.gov (United States)

    Wyss, M.; Tolis, S.; Rosset, P.

    2016-12-01

    It is imperative that losses expected from future earthquakes be estimated. Officials and the public need to be aware of what disaster is likely in store for them in order to reduce the fatalities and efficiently help the injured. Scenarios for earthquake parameters can be constructed to a reasonable accuracy in highly active earthquake belts, based on knowledge of seismotectonics and history. Because of the inherent uncertainties of loss estimates, however, it would be desirable that more than one group calculate an estimate for the same area. By discussing these estimates, one may find a consensus on the range of the potential disasters and persuade officials and residents of the reality of the earthquake threat. Modelling a scenario and estimating earthquake losses require sufficiently accurate data sets for the number of people present, the built environment, and, if possible, the transmission of seismic waves. As examples, we use loss estimates for possible repeats of historic earthquakes in Greece that occurred between -464 and 700. We model future large Greek earthquakes as having M6.8 and rupture lengths of 60 km. In four locations where historic earthquakes with serious losses have occurred, we estimate that 1,000 to 1,500 people might perish, with roughly four times as many injured. Defining the area of influence of these earthquakes as that with shaking intensities greater than or equal to V, we estimate that 1.0 to 2.2 million people in about 2,000 settlements may be affected. We calibrate the QLARM tool for calculating intensities and losses in Greece using the M6 1999 Athens earthquake and by matching the isoseismal information for six earthquakes which occurred in Greece during the last 140 years. Comparing fatality numbers that would occur theoretically today with the numbers reported, and correcting for the increase in population, we estimate that the improvement of the building stock has reduced the mortality and injury rate in Greek
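
    Loss scenarios of the kind described above reduce, at their core, to summing settlement populations weighted by intensity-dependent casualty rates, with the affected population counted above an intensity threshold. The Python sketch below shows that aggregation; the intensity threshold of V for the affected count follows the abstract, while the casualty rates and settlement data are placeholders rather than QLARM's calibrated values.

      # Placeholder fatality rates per intensity degree (fraction of population).
      FATALITY_RATE = {6: 0.0001, 7: 0.001, 8: 0.01, 9: 0.05}

      def scenario_losses(settlements):
          """settlements -- list of (population, predicted intensity as an integer degree)."""
          affected = sum(pop for pop, I in settlements if I >= 5)               # Intensity >= V
          fatalities = sum(pop * FATALITY_RATE.get(I, 0.0) for pop, I in settlements)
          return affected, fatalities

      # Invented settlement data for one scenario rupture.
      towns = [(120_000, 8), (45_000, 7), (300_000, 6), (20_000, 5), (500_000, 4)]
      affected, deaths = scenario_losses(towns)
      print(f"Affected (I >= V): {affected:,}   Estimated fatalities: {deaths:,.0f}")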

  17. Observation of earthquake in the neighborhood of a large underground cavity. The Izu-Hanto-Toho-Oki earthquake, June 29, 1980

    Energy Technology Data Exchange (ETDEWEB)

    Komada, H; Hayashi, M [Central Research Inst. of Electric Power Industry, Abiko, Chiba (Japan). Civil Engineering Lab.]

    1980-12-01

    Studies on the earthquake-resistant design of underground sites for such large, important structures as nuclear power plants, high-level radioactive waste repositories, LNG tanks, petroleum tanks, big power transmission installations and compressed air energy storage installations have been carried out at our research institute. Earthquake observations have been made at the Shiroyama underground hydroelectric power station since July 1976 as one demonstration of earthquake resistance, and the first report has already been published. Accelerometers and dynamic strain meters were subsequently installed in addition. Good acceleration waves and dynamic strain waves of the Izu-Hanto-Toho-Oki Earthquake of June 29, 1980 were observed at the Shiroyama site, for which the hypocentral distance is 77 km and the seismic intensity was about 4. In this report, the characteristics of the oscillation waves in the neighborhood of the underground cavity and the relationships among accelerations, velocities, deformations and dynamic strains are studied in detail using the above earthquake data.
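
    One of the relationships examined in the report, between velocities and dynamic strains, can be illustrated with the standard plane-wave approximation in which peak dynamic strain is roughly the peak particle velocity divided by the apparent wave propagation velocity. The numbers in the Python sketch below are assumed for illustration and are not the Shiroyama observations.

      def plane_wave_strain(peak_velocity_m_s, wave_speed_m_s):
          """Peak dynamic strain ~ peak particle velocity / apparent wave speed
          (plane-wave approximation; an order-of-magnitude check only)."""
          return peak_velocity_m_s / wave_speed_m_s

      # Assumed values: 2 cm/s peak particle velocity, ~2.5 km/s apparent wave speed in rock.
      strain = plane_wave_strain(0.02, 2500.0)
      print(f"{strain:.1e}")   # 8.0e-06, i.e. about 8 microstrain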

  18. The earthquake lights (EQL) of the 6 April 2009 Aquila earthquake in Central Italy

    Directory of Open Access Journals (Sweden)

    C. Fidani

    2010-05-01

    A seven-month collection of testimonials about the 6 April 2009 earthquake in Aquila, Abruzzo region, Italy, was compiled into a catalogue of non-seismic phenomena. Luminous phenomena were often reported, starting about nine months before the strong shock and continuing until about five months after it. A summary and list of the characteristics of these sightings was made according to 20th-century classifications, and a comparison was made with the Galli outcomes. The sightings were distributed over a large area around the city of Aquila, with a major extension to the north, up to 50 km. Various earthquake lights were correlated with several landscape characteristics and with the source and dynamics of the earthquake. Some preliminary considerations on the locations of the sightings suggest a correlation between electrical discharges and asperities, while flames were mostly seen along the Aterno Valley.

  19. 78 FR 64973 - Scientific Earthquake Studies Advisory Committee (SESAC)

    Science.gov (United States)

    2013-10-30

    ... DEPARTMENT OF THE INTERIOR Geological Survey [GX14GG009950000] Scientific Earthquake Studies... Public Law 106-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next... Survey (USGS) on matters relating to the USGS's participation in the National Earthquake Hazards...

  20. 76 FR 69761 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2011-11-09

    ... DEPARTMENT OF THE INTERIOR U.S. Geological Survey National Earthquake Prediction Evaluation... 96-472, the National Earthquake Prediction Evaluation Council (NEPEC) will hold a 1 1/2-day meeting.... Geological Survey on proposed earthquake predictions, on the completeness and scientific validity of the...