WorldWideScience

Sample records for california earthquake center

  1. Building the Southern California Earthquake Center

    Science.gov (United States)

    Jordan, T. H.; Henyey, T.; McRaney, J. K.

    2004-12-01

    Kei Aki was the founding director of the Southern California Earthquake Center (SCEC), a multi-institutional collaboration formed in 1991 as a Science and Technology Center (STC) under the National Science Foundation (NSF) and the U.S. Geological Survey (USGS). Aki and his colleagues articulated a system-level vision for the Center: investigations by disciplinary working groups would be woven together into a "Master Model" for Southern California. In this presentation, we will outline how the Master-Model concept has evolved and how SCEC's structure has adapted to meet the scientific challenges of system-level earthquake science. In its first decade, SCEC conducted two regional imaging experiments (LARSE I and II); published the "Phase-N" reports on (1) the Landers earthquake, (2) a new earthquake rupture forecast for Southern California, and (3) new models for seismic attenuation and site effects; developed two prototype "Community Models" (the Crustal Motion Map and the Community Velocity Model); and, perhaps most important, sustained a long-term, multi-institutional, interdisciplinary collaboration. The latter fostered pioneering numerical simulations of earthquake ruptures, fault interactions, and wave propagation. These accomplishments provided the impetus for a successful proposal in 2000 to reestablish SCEC as a "stand-alone" center under NSF/USGS auspices. SCEC remains consistent with the founders' vision: it continues to advance seismic hazard analysis through a system-level synthesis based on community models and an ever-expanding array of information technology. SCEC now represents a fully articulated "collaboratory" for earthquake science, and many of its features are extensible to other active-fault systems and other system-level collaborations. We will discuss the implications of the SCEC experience for EarthScope, the USGS's program in seismic hazard analysis, NSF's nascent Cyberinfrastructure Initiative, and other large collaboratory programs.

  2. Accessing northern California earthquake data via Internet

    Science.gov (United States)

    Romanowicz, Barbara; Neuhauser, Douglas; Bogaert, Barbara; Oppenheimer, David

    The Northern California Earthquake Data Center (NCEDC) provides easy access to central and northern California digital earthquake data. It is located at the University of California, Berkeley, and is operated jointly with the U.S. Geological Survey (USGS) in Menlo Park, Calif., and funded by the University of California and the National Earthquake Hazard Reduction Program. It has been accessible to users in the scientific community through the Internet since mid-1992. The data center provides an on-line archive for parametric and waveform data from two regional networks: the Northern California Seismic Network (NCSN) operated by the USGS and the Berkeley Digital Seismic Network (BDSN) operated by the Seismographic Station at the University of California, Berkeley.

  3. Archiving and Distributing Seismic Data at the Southern California Earthquake Data Center (SCEDC)

    Science.gov (United States)

    Appel, V. L.

    2002-12-01

    The Southern California Earthquake Data Center (SCEDC) archives and provides public access to earthquake parametric and waveform data gathered by the Southern California Seismic Network and, since January 1, 2001, the TriNet seismic network, southern California's earthquake monitoring network. The parametric data in the archive include earthquake locations, magnitudes, moment-tensor solutions, and phase picks. The SCEDC waveform archive prior to TriNet consists primarily of short-period, 100-samples-per-second waveforms from the SCSN. The addition of the TriNet array added continuous recordings of 155 broadband stations (20 samples per second or less), and triggered seismograms from 200 accelerometers and 200 short-period instruments. Since the Data Center and TriNet use the same Oracle database system, new earthquake data are available to the seismological community in near real time. Primary access to the database and waveforms is through the Seismogram Transfer Program (STP) interface. The interface enables users to search the database for earthquake information, phase picks, and continuous and triggered waveform data. Output is available in SAC, miniSEED, and other formats. Both the raw counts format (V0) and the gain-corrected format (V1) of COSMOS (Consortium of Organizations for Strong-Motion Observation Systems) are now supported by STP. EQQuest is an interface to prepackaged waveform data sets for select earthquakes in Southern California stored at the SCEDC. Waveform data for large-magnitude events have been prepared, and new data sets will be available for download in near real time following major events. The parametric data from 1981 to the present have been loaded into the Oracle 9.2.0.1 database system, and the waveforms for that time period have been converted to miniSEED format and are accessible through the STP interface.
The DISC optical-disk system (the "jukebox") that currently serves as the mass-storage for the SCEDC is in the process of being replaced

  4. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    Science.gov (United States)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data requests delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These
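
    The REST query pattern described above can be sketched in a few lines. The base URL and parameter names below follow the common FDSN dataselect convention and are assumptions for illustration, not a statement of the NCEDC's exact interface:

```python
from urllib.parse import urlencode

def build_dataselect_url(base, network, station, channel, start, end):
    """Compose an FDSN-style dataselect query URL for MiniSEED time series."""
    params = {
        "net": network,      # SEED network code, e.g. "BK"
        "sta": station,      # station code
        "cha": channel,      # channel code, e.g. "BHZ"
        "starttime": start,  # ISO 8601 start of the requested window
        "endtime": end,      # ISO 8601 end of the requested window
    }
    return base + "?" + urlencode(params)

# Hypothetical example; the host, path, and station are assumed for illustration.
url = build_dataselect_url(
    "https://service.ncedc.org/fdsnws/dataselect/1/query",
    "BK", "BKS", "BHZ",
    "2012-01-01T00:00:00", "2012-01-01T00:10:00",
)
```

    Fetching such a URL with any HTTP client would return waveform bytes directly, which is the point of the REST design over batch or email-based requests.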

  5. Earthquakes and faults in southern California (1970-2010)

    Science.gov (United States)

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.

    2012-01-01

    The map depicts both active and inactive faults and earthquakes of magnitude 1.5 to 7.3 in southern California (1970–2010). The bathymetry was generated from digital files from the California Department of Fish and Game, Marine Region, Coastal Bathymetry Project. Elevation data are from the U.S. Geological Survey National Elevation Database. The Landsat satellite image is from fourteen Landsat 5 Thematic Mapper scenes collected between 2009 and 2010. Fault data are reproduced with permission from 2006 California Geological Survey and U.S. Geological Survey data. The earthquake data are from the U.S. Geological Survey National Earthquake Information Center.

  6. THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People

    Science.gov (United States)

    Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.

    2008-12-01

    Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, the Southern California Earthquake Center, and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008 as the ShakeOut Scenario, a description of the impacts and consequences of a M7.8 earthquake on the Southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce the impact on their lives. Residents, organizations, schools and businesses registered to participate in the drill through www.shakeout.org, where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a reality that is regularly discussed. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels, from individuals and families to corporations and governments. Actions taken as a result of the ShakeOut include the adoption of earthquake

  7. Long Period Earthquakes Beneath California's Young and Restless Volcanoes

    Science.gov (United States)

    Pitt, A. M.; Dawson, P. B.; Shelly, D. R.; Hill, D. P.; Mangan, M.

    2013-12-01

    The newly established USGS California Volcano Observatory has the broad responsibility of monitoring and assessing hazards at California's potentially threatening volcanoes, most notably Mount Shasta, Medicine Lake, Clear Lake Volcanic Field, and Lassen Volcanic Center in northern California; and Long Valley Caldera, Mammoth Mountain, and Mono-Inyo Craters in east-central California. Volcanic eruptions occur in California about as frequently as the largest San Andreas Fault Zone earthquakes: more than ten eruptions have occurred in the last 1,000 years, most recently at Lassen Peak (1666 C.E. and 1914-1917 C.E.) and Mono-Inyo Craters (c. 1700 C.E.). The Long Valley region (Long Valley caldera and Mammoth Mountain) underwent several episodes of heightened unrest over the last three decades, including intense swarms of volcano-tectonic (VT) earthquakes, rapid caldera uplift, and hazardous CO2 emissions. Both Medicine Lake and Lassen are subsiding at appreciable rates, and along with Clear Lake, Long Valley Caldera, and Mammoth Mountain, sporadically experience long period (LP) earthquakes related to migration of magmatic or hydrothermal fluids. Worldwide, the last two decades have shown the importance of tracking LP earthquakes beneath young volcanic systems, as they often provide an indication of impending unrest or eruption. Herein we document the occurrence of LP earthquakes at several of California's young volcanoes, updating a previous study (Pitt et al., 2002, SRL). All events were detected and located using data from stations within the Northern California Seismic Network (NCSN). Event detection was spatially and temporally uneven across the NCSN in the 1980s and 1990s, but additional stations, adoption of the Earthworm processing system, and heightened vigilance by seismologists have improved the catalog over the last decade.
LP earthquakes are now relatively well-recorded under Lassen (~150 events since 2000), Clear Lake (~60 events), Mammoth Mountain

  8. Web Services and Data Enhancements at the Northern California Earthquake Data Center

    Science.gov (United States)

    Neuhauser, D. S.; Zuzlewski, S.; Lombard, P. N.; Allen, R. M.

    2013-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC offers the following web services that are compliant with the International Federation of Digital Seismograph Networks (FDSN) web services specifications: (1) fdsn-dataselect: time series data delivered in MiniSEED format, (2) fdsn-station: station and channel metadata and time series availability delivered in StationXML format, (3) fdsn-event: earthquake event information delivered in QuakeML format. In addition, the NCEDC offers the following IRIS-compatible web services: (1) sacpz: provide channel gains, poles, and zeros in SAC format, (2) resp: provide channel response information in RESP format, (3) dataless: provide station and channel metadata in Dataless SEED format. The NCEDC is also developing a web service to deliver time series from pre-assembled event waveform gathers. The NCEDC has waveform gathers for ~750,000 northern and central California events from 1984 to the present, many of which were created by the USGS NCSN prior to the establishment of the joint NCSS (Northern California Seismic System). We are currently adding waveforms to these older event gathers with time series from the UCB networks and other networks with waveforms archived at the NCEDC, and ensuring that the waveforms for each channel in the event gathers have the highest
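
    As a sketch of what consuming the fdsn-event QuakeML output might look like, the fragment below is a minimal hand-written QuakeML-like document (real responses are far richer); only the namespace URIs follow the QuakeML 1.2 schema, and the event values are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Minimal illustrative QuakeML-like fragment, not a real fdsn-event response.
QUAKEML = """<q:quakeml xmlns:q="http://quakeml.org/xmlns/quakeml/1.2"
              xmlns="http://quakeml.org/xmlns/bed/1.2">
  <eventParameters publicID="smi:local/catalog">
    <event publicID="smi:local/event/1">
      <magnitude publicID="smi:local/mag/1"><mag><value>4.7</value></mag></magnitude>
      <origin publicID="smi:local/origin/1">
        <latitude><value>37.87</value></latitude>
        <longitude><value>-122.25</value></longitude>
      </origin>
    </event>
  </eventParameters>
</q:quakeml>"""

NS = {"bed": "http://quakeml.org/xmlns/bed/1.2"}

def summarize(quakeml_text):
    """Extract one (magnitude, latitude, longitude) tuple per event."""
    root = ET.fromstring(quakeml_text)
    out = []
    for event in root.iter("{http://quakeml.org/xmlns/bed/1.2}event"):
        mag = float(event.find("bed:magnitude/bed:mag/bed:value", NS).text)
        lat = float(event.find("bed:origin/bed:latitude/bed:value", NS).text)
        lon = float(event.find("bed:origin/bed:longitude/bed:value", NS).text)
        out.append((mag, lat, lon))
    return out
```

    A client would feed the body of an fdsn-event response into summarize() the same way, after checking which optional elements the service actually populated.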

  9. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    Science.gov (United States)

    Yu, E.; Bhaskaran, A.; Chen, S.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2010-12-01

    In cooperation with QCN and CSN, the SCEDC is exploring ways to archive and distribute data from high-density, low-cost networks. As a starting point the SCEDC will store a dataset from QCN and CSN and distribute it through a separate STP client. New archival methods: ● The SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client developed to access it in a manner similar to STP. XML formats: ● The SCEDC is now distributing earthquake parameter data through web services in QuakeML format. ● The SCEDC, in collaboration with the Northern California Earthquake Data Center (NCEDC) and USGS Golden, has reviewed and revised the StationXML format to produce version 2.0. The new version includes rules on extending the schema, use of named complex types, and greater consistency in naming conventions. Based on this work we plan to develop readers and writers of the StationXML format.

  10. The October 1992 Parkfield, California, earthquake prediction

    Science.gov (United States)

    Langbein, J.

    1992-01-01

    A magnitude 4.7 earthquake occurred near Parkfield, California, on October 20, 1992, at 05:28 UTC (October 19 at 10:28 p.m. local or Pacific Daylight Time). This moderate shock, interpreted as the potential foreshock of a damaging earthquake on the San Andreas fault, triggered long-standing federal, state, and local government plans to issue a public warning of an imminent magnitude 6 earthquake near Parkfield. Although the predicted earthquake did not take place, sophisticated suites of instruments deployed as part of the Parkfield Earthquake Prediction Experiment recorded valuable data associated with an unusual series of events. This article describes the geological aspects of these events, which occurred near Parkfield in October 1992. The accompanying article, an edited version of a press conference by Richard Andrews, the Director of the California Office of Emergency Services (OES), describes governmental response to the prediction.

  11. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period 1 January 2006 to 31 December 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
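
    The cell bookkeeping implied by such a test can be sketched as follows; the convention of indexing a 0.1° × 0.1° cell by the floor of its coordinates is an assumption for illustration, not the RELM definition:

```python
import math

def cell_index(lat, lon, cell_size=0.1):
    """Return the (row, col) index of the grid cell containing an epicenter,
    counting cells of `cell_size` degrees from the equator and prime
    meridian (southwest-corner convention)."""
    return (math.floor(lat / cell_size), math.floor(lon / cell_size))

# Two nearby epicenters fall in the same 0.1-degree cell, so they would be
# scored against the same forecast probability; a more distant one does not.
a = cell_index(32.26, -115.29)
b = cell_index(32.21, -115.27)
c = cell_index(33.95, -117.43)
```

    Scoring a forecast then reduces to comparing the observed event count per cell index against the submitted probability for that cell.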

  12. UCERF3: A new earthquake forecast for California's complex fault system

    Science.gov (United States)

    Field, Edward H.; ,

    2015-01-01

    With innovations, fresh data, and lessons learned from recent earthquakes, scientists have developed a new earthquake forecast model for California, a region under constant threat from potentially damaging events. The new model, referred to as the third Uniform California Earthquake Rupture Forecast, or "UCERF3" (http://www.WGCEP.org/UCERF3), provides authoritative estimates of the magnitude, location, and likelihood of earthquake fault rupture throughout the state. Overall the results confirm previous findings, but with some significant changes because of model improvements. For example, compared to the previous forecast (Uniform California Earthquake Rupture Forecast 2), the likelihood of moderate-sized earthquakes (magnitude 6.5 to 7.5) is lower, whereas that of larger events is higher. This is because of the inclusion of multifault ruptures, where earthquakes are no longer confined to separate, individual faults, but can occasionally rupture multiple faults simultaneously. The public-safety implications of this and other model improvements depend on several factors, including site location and type of structure (for example, family dwelling compared to a long-span bridge). Building codes, earthquake insurance products, emergency plans, and other risk-mitigation efforts will be updated accordingly. This model also serves as a reminder that damaging earthquakes are inevitable for California. Fortunately, there are many simple steps residents can take to protect lives and property.

  13. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    Science.gov (United States)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  14. Preparing a population for an earthquake like Chi-Chi: The Great Southern California ShakeOut

    Science.gov (United States)

    Jones, Lucile M.; ,

    2009-01-01

    The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. On November 13, 2008, over 5 million southern Californians pretended that a magnitude-7.8 earthquake had occurred and practiced actions that could reduce its impact on their lives. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The drill was based on a scenario of the impacts and consequences of such an earthquake on the Southern San Andreas Fault, developed by over 300 experts led by the U.S. Geological Survey in partnership with the California Geological Survey, the Southern California Earthquake Center, the Earthquake Engineering Research Institute, lifeline operators, emergency services, and many other organizations. The ShakeOut campaign was designed and implemented by earthquake scientists, emergency managers, sociologists, art designers and community participants. The means of communication were developed using results from sociological research on what encouraged people to take action. This was structured around four objectives: 1) consistent messages – people are more inclined to believe something when they hear the same thing from multiple sources; 2) visual reinforcement – people are more inclined to do something they see other people doing; 3) encourage “milling” or discussing contemplated action – people need to discuss an action with others they care about before committing to undertaking it; and 4) focus on concrete actions – people are more likely to prepare for a set of concrete consequences of a particular hazard than for an abstract concept of risk. The goals of the ShakeOut, established in spring 2008, were: 1) to register 5 million people to participate in the drill; 2) to change the culture of earthquake preparedness in southern California; and 3) to reduce earthquake losses in southern California. All of these

  15. Estimated airborne release of plutonium from the 102 Building at the General Electric Vallecitos Nuclear Center, Vallecitos, California, as a result of postulated damage from severe wind and earthquake hazard

    International Nuclear Information System (INIS)

    Mishima, J.; Ayer, J.E.; Hays, I.D.

    1980-12-01

    This report estimates the potential airborne releases of plutonium as a consequence of various severities of earthquake and wind hazard postulated for the 102 Building at the General Electric Vallecitos Nuclear Center in California. The releases are based on damage scenarios developed by other specialists. The hazard severities considered range up to a nominal wind velocity of 230 mph and to linear accelerations in excess of 0.8 g for earthquakes. The consequences of thrust faulting are considered. The approaches and factors used to estimate the releases are discussed. Release estimates range from 0.003 to 3 g Pu.

  16. Modified Mercalli intensities for some recent California earthquakes and historic San Francisco Bay Region earthquakes

    Science.gov (United States)

    Bakun, William H.

    1998-01-01

    Modified Mercalli Intensity (MMI) data for recent California earthquakes were used by Bakun and Wentworth (1997) to develop a strategy for bounding the location and moment magnitude M of earthquakes from MMI observations only. Bakun (Bull. Seismol. Soc. Amer., submitted) used the Bakun and Wentworth (1997) strategy to analyze 19th century and early 20th century San Francisco Bay Region earthquakes. The MMI data and site corrections used in these studies are listed in this Open-file Report. 

  17. Keeping the History in Historical Seismology: The 1872 Owens Valley, California Earthquake

    International Nuclear Information System (INIS)

    Hough, Susan E.

    2008-01-01

    The importance of historical earthquakes is being increasingly recognized. Careful investigations of key pre-instrumental earthquakes can provide critical information and insights not only for seismic hazard assessment but also for earthquake science. In recent years, with the explosive growth in computational sophistication in Earth sciences, researchers have developed increasingly sophisticated methods to analyze macroseismic data quantitatively. These methodological developments can be extremely useful to exploit fully the temporally and spatially rich information source that seismic intensities often represent. For example, the exhaustive and painstaking investigations done by Ambraseys and his colleagues of early Himalayan earthquakes provide information that can be used to map out site response in the Ganges basin. In any investigation of macroseismic data, however, one must stay mindful that intensity values are not data but rather interpretations. The results of any subsequent analysis, regardless of the degree of sophistication of the methodology, will be only as reliable as the interpretations of available accounts - and only as complete as the research done to ferret out, and in many cases translate, these accounts. When intensities are assigned without an appreciation of historical setting and context, seemingly careful subsequent analysis can yield grossly inaccurate results. As a case study, I report here on the results of a recent investigation of the 1872 Owens Valley, California earthquake. Careful consideration of macroseismic observations reveals that this event was probably larger than the great San Francisco earthquake of 1906, and possibly the largest historical earthquake in California. The results suggest that some large earthquakes in California will generate significantly larger ground motions than San Andreas fault events of comparable magnitude.

  18. FORESHOCKS AND TIME-DEPENDENT EARTHQUAKE HAZARD ASSESSMENT IN SOUTHERN CALIFORNIA.

    Science.gov (United States)

    Jones, Lucile M.

    1985-01-01

    The probability that an earthquake in southern California (M ≥ 3.0) will be followed by an earthquake of larger magnitude within 5 days and 10 km (i.e., will be a foreshock) is 6 ± 0.5 percent (1 s.d.), and is not significantly dependent on the magnitude of the possible foreshock between M = 3 and M = 5. The probability that an earthquake will be followed by an M ≥ 5.0 main shock, however, increases with the magnitude of the foreshock, from less than 1 percent at M ≥ 3 to 6.5 ± 2.5 percent (1 s.d.) at M ≥ 5. The main shock will most likely occur in the first hour after the foreshock, and the probability of a main shock decreases with elapsed time from the occurrence of the possible foreshock by approximately the inverse of time. Thus, the occurrence of an earthquake of M ≥ 3.0 in southern California increases the earthquake hazard within a small space-time window several orders of magnitude above the normal background level.
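
    The inverse-of-time falloff quoted in the abstract can be illustrated with a small sketch. The 1/t hazard rate and the cutoff values below are assumptions chosen only to reproduce the qualitative behavior described (most main shocks arriving in the first hour of a 5-day window), not values from the study:

```python
import math

def fraction_within(t_hours, window_hours=120.0, t_min_hours=0.01):
    """Fraction of main shocks expected within the first `t_hours` of the
    alert window, assuming a hazard rate proportional to 1/t truncated to
    [t_min_hours, window_hours]. Both cutoffs are illustrative only."""
    t = min(max(t_hours, t_min_hours), window_hours)
    return math.log(t / t_min_hours) / math.log(window_hours / t_min_hours)

# Under these assumptions, roughly half of all main shocks arrive in the
# first hour of the 120-hour (5-day) window.
first_hour = fraction_within(1.0)
```

    The logarithmic form follows directly from integrating a 1/t rate, which is why the first hour dominates even though the window spans five days.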

  19. Deterministic Earthquake Hazard Assessment by Public Agencies in California

    Science.gov (United States)

    Mualchin, L.

    2005-12-01

    Even in its short recorded history, California has experienced a number of damaging earthquakes that have resulted in new codes and other legislation for public safety. In particular, the 1971 San Fernando earthquake produced some of the most lasting results, such as the Hospital Safety Act, the Strong Motion Instrumentation Program, the Alquist-Priolo Special Studies Zone Act, and the California Department of Transportation (Caltrans') fault-based deterministic seismic hazard (DSH) map. The latter product provides values for earthquake ground motions based on Maximum Credible Earthquakes (MCEs), defined as the largest earthquakes that can reasonably be expected on faults in the current tectonic regime. For surface fault rupture displacement hazards, detailed study of the same faults applies. Originally, hospital, dam, and other critical facilities used seismic design criteria based on deterministic seismic hazard analyses (DSHA). However, probabilistic methods grew and took hold by introducing earthquake design criteria based on time factors and quantifying "uncertainties" by procedures such as logic trees. These probabilistic seismic hazard analyses (PSHA) ignored the DSH approach. Some agencies were influenced to adopt only the PSHA method. However, deficiencies in the PSHA method are becoming recognized, and the use of the method is now a focus of strong debate. Caltrans is in the process of producing the fourth edition of its DSH map. The reason for preferring the DSH method is that Caltrans believes it is more realistic than the probabilistic method for assessing earthquake hazards that may affect critical facilities, and is the best available method for ensuring public safety. Its time-invariant values help to produce robust design criteria that are soundly based on physical evidence. And it is the method for which there is the least opportunity for unwelcome surprises.

  20. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    Science.gov (United States)

    Chen, S. E.; Yu, E.; Bhaskaran, A.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2011-12-01

    Currently, the SCEDC archives continuous and triggered data from nearly 8400 data channels from 425 SCSN recorded stations, processing and archiving an average of 6.4 TB of continuous waveforms and 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC during 2011. New website design: the SCEDC has revamped its website, making it easier for users to search the archive and discover updates and new content, and improving our ability to manage and update the site. New data holdings: post-processing of the El Mayor-Cucapah 7.2 sequence continues; to date 11,847 events have been reviewed, and updates are available in the earthquake catalog immediately. A double-difference catalog (Hauksson et al., 2011) spanning 1981 to 6/30/11 will be available for download at www.data.scec.org and via STP. A focal mechanism catalog determined by Yang et al. (2011) is available for distribution at www.data.scec.org. Waveforms from Southern California NetQuake stations are now stored in the SCEDC archive and are available via STP as event-associated waveforms; amplitudes from these stations are also stored in the archive and used by ShakeMap. As part of a NASA/AIST project in collaboration with JPL and SIO, the SCEDC will receive real-time 1 sps streams of GPS displacement solutions from the California Real Time Network (http://sopac.ucsd.edu/projects/realtime; Genrich and Bock, 2006, J. Geophys. Res.). These channels will be archived at the SCEDC as miniSEED waveforms, which can then be distributed to the user community via applications such as STP. Improvements in the user tool STP: STP SAC output now includes picks from the SCSN. New archival methods: the SCEDC is exploring the feasibility of archiving and distributing

  1. Post-Earthquake Traffic Capacity of Modern Bridges in California

    Science.gov (United States)

    2010-03-01

    Evaluation of the capacity of a bridge to carry self-weight and traffic loads after an earthquake is essential for a safe and timely re-opening of the bridge. In California, modern highway bridges designed using the Caltrans Seismic Design Criter...

  2. Comprehensive analysis of earthquake source spectra in southern California

    OpenAIRE

    Shearer, Peter M.; Prieto, Germán A.; Hauksson, Egill

    2006-01-01

    We compute and analyze P wave spectra from earthquakes in southern California between 1989 and 2001 using a method that isolates source-, receiver-, and path-dependent terms. We correct observed source spectra for attenuation using both fixed and spatially varying empirical Green's function methods. Estimated Brune-type stress drops for over 60,000 M_L = 1.5 to 3.1 earthquakes range from 0.2 to 20 MPa with no dependence on moment or local b value. Median computed stress drop increases with de...

  3. Simulating Earthquakes for Science and Society: Earthquake Visualizations Ideal for use in Science Communication and Education

    Science.gov (United States)

    de Groot, R.

    2008-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, the Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), visualization should, in theory, make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  4. Analysis of Earthquake Source Spectra in Salton Trough

    Science.gov (United States)

    Chen, X.; Shearer, P. M.

    2009-12-01

    Previous studies of the source spectra of small earthquakes in southern California show that average Brune-type stress drops vary among different regions, with particularly low stress drops observed in the Salton Trough (Shearer et al., 2006). The Salton Trough marks the southern end of the San Andreas Fault and is prone to earthquake swarms, some of which are driven by aseismic creep events (Lohman and McGuire, 2007). In order to learn the stress state and understand the physical mechanisms of swarms and slow slip events, we analyze the source spectra of earthquakes in this region. We obtain Southern California Seismic Network (SCSN) waveforms for earthquakes from 1977 to 2009 archived at the Southern California Earthquake Center (SCEC) data center, which includes over 17,000 events. After resampling the data to a uniform 100 Hz sample rate, we compute spectra for both signal and noise windows for each seismogram, and select traces with a P-wave signal-to-noise ratio greater than 5 between 5 Hz and 15 Hz. Using selected displacement spectra, we isolate the source spectra from station terms and path effects using an empirical Green’s function approach. From the corrected source spectra, we compute corner frequencies and estimate moments and stress drops. Finally we analyze spatial and temporal variations in stress drop in the Salton Trough and compare them with studies of swarms and creep events to assess the evolution of faulting and stress in the region. References: Lohman, R. B., and J. J. McGuire (2007), Earthquake swarms driven by aseismic creep in the Salton Trough, California, J. Geophys. Res., 112, B04405, doi:10.1029/2006JB004596 Shearer, P. M., G. A. Prieto, and E. Hauksson (2006), Comprehensive analysis of earthquake source spectra in southern California, J. Geophys. Res., 111, B06303, doi:10.1029/2005JB003979.
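
The stress-drop step described above can be sketched from the standard Brune (1970) relations; the magnitude, corner frequency, and shear-wave speed below are illustrative values chosen for this sketch, not results from the study.

```python
import math

def moment_from_magnitude(mw: float) -> float:
    """Seismic moment M0 in N*m from moment magnitude (Hanks & Kanamori, 1979)."""
    return 10.0 ** (1.5 * mw + 9.05)

def brune_stress_drop(m0: float, fc_hz: float, beta_m_s: float = 3500.0) -> float:
    """Brune (1970) stress drop in Pa from moment (N*m) and corner frequency (Hz).

    Source radius: r = 0.37 * beta / fc
    Stress drop:   delta_sigma = (7/16) * M0 / r**3
    """
    r = 0.37 * beta_m_s / fc_hz  # source radius in meters
    return (7.0 / 16.0) * m0 / r ** 3

# Illustrative: an M 2.5 event with a 10 Hz corner frequency
m0 = moment_from_magnitude(2.5)
ds = brune_stress_drop(m0, fc_hz=10.0)
print(f"M0 = {m0:.2e} N*m, stress drop = {ds / 1e6:.2f} MPa")
```

With these inputs the result is on the order of 1 MPa, inside the 0.2-20 MPa range reported for small southern California earthquakes.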

  5. Self-potential variations preceding earthquakes in central california

    International Nuclear Information System (INIS)

    Corwin, R.F.; Morrison, H.G.

    1977-01-01

    Two earthquakes in central California were preceded by anomalous variations in the horizontal electric field (self-potential) of the earth. The first variation was an anomaly of 90 mV amplitude across electrode dipoles of 630 and 640 m, which began 55 days before an earthquake of M=5, located 37 km NW of the dipoles. The second variation had an amplitude of 4 mV across a 300 m dipole, and began 110 hours before an event of M=2.4 located on the San Andreas fault, 2.5 km from the dipole. Streaming potentials generated by the flow of groundwater into a dilatant zone are proposed as a possible mechanism for the observed variations

  6. Aftershocks and triggered events of the Great 1906 California earthquake

    Science.gov (United States)

    Meltzner, A.J.; Wald, D.J.

    2003-01-01

    The San Andreas fault is the longest fault in California and one of the longest strike-slip faults in the world, yet little is known about the aftershocks following the most recent great event on the San Andreas, the Mw 7.8 San Francisco earthquake on 18 April 1906. We conducted a study to locate and to estimate magnitudes for the largest aftershocks and triggered events of this earthquake. We examined existing catalogs and historical documents for the period April 1906 to December 1907, compiling data on the first 20 months of the aftershock sequence. We grouped felt reports temporally and assigned modified Mercalli intensities for the larger events based on the descriptions judged to be the most reliable. For onshore and near-shore events, a grid-search algorithm (derived from empirical analysis of modern earthquakes) was used to find the epicentral location and magnitude most consistent with the assigned intensities. For one event identified as far offshore, the event's intensity distribution was compared with those of modern events in order to constrain the event's location and magnitude. The largest aftershock within the study period, an M ~6.7 event, occurred ~100 km west of Eureka on 23 April 1906. Although not within our study period, another M ~6.7 aftershock occurred near Cape Mendocino on 28 October 1909. Other significant aftershocks included an M ~5.6 event near San Juan Bautista on 17 May 1906 and an M ~6.3 event near Shelter Cove on 11 August 1907. An M ~4.9 aftershock occurred on the creeping segment of the San Andreas fault (southeast of the mainshock rupture) on 6 July 1906. The 1906 San Francisco earthquake also triggered events in southern California (including separate events in or near the Imperial Valley, the Pomona Valley, and Santa Monica Bay), in western Nevada, in southern central Oregon, and in western Arizona, all within 2 days of the mainshock. Of these triggered events, the largest were an M ~6.1 earthquake near Brawley
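
The grid-search step described above can be sketched as follows; the intensity-attenuation relation, its coefficients, and the site data are hypothetical placeholders for illustration, not the calibrated relation used in the study.

```python
import math
from itertools import product

def flat_dist_km(lat1, lon1, lat2, lon2):
    """Flat-earth distance in km (adequate for a regional sketch)."""
    return 111.0 * math.hypot(lat1 - lat2, (lon1 - lon2) * math.cos(math.radians(lat1)))

def predicted_mmi(mag, dist_km):
    """Hypothetical intensity-attenuation relation; coefficients are illustrative."""
    return 1.5 * mag - 3.0 * math.log10(max(dist_km, 1.0)) - 0.004 * dist_km + 1.0

def grid_search(sites, mags, lats, lons):
    """Return (rms, lat, lon, mag) minimizing RMS misfit to assigned intensities."""
    best = None
    for lat, lon, mag in product(lats, lons, mags):
        resid = [obs - predicted_mmi(mag, flat_dist_km(lat, lon, slat, slon))
                 for slat, slon, obs in sites]
        rms = math.sqrt(sum(r * r for r in resid) / len(resid))
        if best is None or rms < best[0]:
            best = (rms, lat, lon, mag)
    return best

# Synthetic check: felt reports generated from a "true" M 6.0 source at (36.0, -121.0)
true_lat, true_lon, true_mag = 36.0, -121.0, 6.0
sites = [(la, lo, predicted_mmi(true_mag, flat_dist_km(true_lat, true_lon, la, lo)))
         for la, lo in [(36.2, -120.8), (35.7, -121.1), (36.1, -121.4), (35.9, -120.6)]]
lats = [35.5 + 0.25 * i for i in range(5)]
lons = [-121.5 + 0.25 * i for i in range(5)]
best = grid_search(sites, [5.5, 6.0, 6.5], lats, lons)
print(best)  # the trial matching the true epicenter and magnitude minimizes the misfit
```

Because the synthetic intensities come from the same relation, the search recovers the true source exactly; with real felt reports the misfit surface defines a confidence region instead.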

  7. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    Science.gov (United States)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, seismic hazard is assessed for the Maximum Credible Earthquake (MCE) magnitude on each of the known seismogenic faults within and near the state. The likely occurrence of the MCE has been assumed qualitatively by using late Quaternary and younger faults that are presumed to be seismogenic, without specifying when or within what time interval the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example, dams and nuclear power plants, continued to challenge the map(s). The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published

  8. [Engineering aspects of seismic behavior of health-care facilities: lessons from California earthquakes].

    Science.gov (United States)

    Rutenberg, A

    1995-03-15

    The construction of health-care facilities is similar to that of other buildings. Yet the need to function immediately after an earthquake, the helplessness of the many patients and the high and continuous occupancy of these buildings, require that special attention be paid to their seismic performance. Here the lessons from the California experience are invaluable. In this paper the behavior of California hospitals during destructive earthquakes is briefly described. Adequate structural design and execution, and securing of nonstructural elements are required to ensure both safety of occupants, and practically uninterrupted functioning of equipment, mechanical and electrical services and other vital systems. Criteria for post-earthquake functioning are listed. In view of the hazards to Israeli hospitals, in particular those located along the Jordan Valley and the Arava, a program for the seismic evaluation of medical facilities should be initiated. This evaluation should consider the hazards from nonstructural elements, the safety of equipment and systems, and their ability to function after a severe earthquake. It should not merely concentrate on safety-related structural behavior.

  9. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    Science.gov (United States)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground
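
The role of spatial correlation described above can be illustrated with a toy Monte Carlo sketch: correlated lognormal ground-motion residuals widen the portfolio loss distribution relative to independent residuals. The correlation model, fragility function, and portfolio here are invented assumptions, not the models used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)

def portfolio_losses(n_sims, site_xy, values, corr_len_km, sigma=0.6):
    """Monte Carlo portfolio loss with spatially correlated ground-motion residuals.

    Residuals are lognormal with an exponential spatial correlation model
    (length scale corr_len_km); the damage ratio is a toy linear fragility.
    Every ingredient is an illustrative assumption.
    """
    d = np.linalg.norm(site_xy[:, None, :] - site_xy[None, :, :], axis=-1)
    cov = sigma ** 2 * np.exp(-d / corr_len_km)        # exponential correlogram
    eps = rng.multivariate_normal(np.zeros(len(values)), cov, size=n_sims)
    shaking = 0.3 * np.exp(eps)                        # toy median shaking of 0.3 g
    damage_ratio = np.clip(shaking / 1.0, 0.0, 1.0)    # toy fragility: full loss at 1 g
    return damage_ratio @ values                       # loss per simulation

sites = rng.uniform(0, 50, size=(20, 2))               # 20 sites in a 50 km box
values = np.full(20, 1.0)                              # unit insured value each
corr = portfolio_losses(20000, sites, values, corr_len_km=40.0)
indep = portfolio_losses(20000, sites, values, corr_len_km=0.01)
print("loss std with correlation:", corr.std(), "nearly independent:", indep.std())
```

The mean loss is nearly the same in both cases, but correlation fattens the tail of the portfolio loss distribution, which is exactly why the spatial correlation of ground motions and damage matters for ratemaking.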

  10. Napa Earthquake impact on water systems

    Science.gov (United States)

    Wang, J.

    2014-12-01

    The South Napa earthquake, magnitude 6.0, struck Napa, California, on August 24, 2014, at 3 a.m. local time. It was the largest earthquake in the San Francisco Bay Area since the 1989 Loma Prieta earthquake, and economic losses topped $1 billion. Winemakers cleaned up and estimated the damage to tourism; around 15,000 cases of cabernet poured into the garden at the Hess Collection. Earthquakes can raise water pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so understanding how to prevent pollution of groundwater and surface water by earthquakes would be helpful. This research gives a clear view of the drinking water system in California and of pollution of river systems, as well as an estimate of earthquake impacts on water supply. The Sacramento-San Joaquin River Delta (close to Napa) is the center of the state's water distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and saltwater intrusion has reduced freshwater outflows. Strong shaking from a nearby earthquake can cause liquefaction of saturated, loose, sandy soils and could potentially damage major Delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: a future event could damage the freshwater supply system.

  11. Earthquake potential in California-Nevada implied by correlation of strain rate and seismicity

    Science.gov (United States)

    Zeng, Yuehua; Petersen, Mark D.; Shen, Zheng-Kang

    2018-01-01

    Rock mechanics studies and dynamic earthquake simulations show that patterns of seismicity evolve with time through (1) an accumulation phase, (2) a localization phase, and (3) a rupture phase. We observe a similar pattern of changes in seismicity during the past century across California and Nevada. To quantify these changes, we correlate GPS strain rates with seismicity. Earthquakes of M > 6.5 are collocated with regions of highest strain rates. By contrast, smaller magnitude earthquakes of M ≥ 4 show clear spatiotemporal changes. From 1933 to the late 1980s, earthquakes of M ≥ 4 were more diffuse and broadly distributed in both high and low strain rate regions (accumulation phase). From the late 1980s to 2016, earthquakes were more concentrated within the high strain rate areas focused on the major fault strands (localization phase). In the same time period, the rate of M > 6.5 events also increased significantly in the high strain rate areas. The strong correlation between the current strain rate and the later period of seismicity indicates that seismicity is closely related to the strain rate. The spatial patterns suggest that before the late 1980s, the strain rate field was also broadly distributed because of the stress shadows from previous large earthquakes. As the deformation field evolved out of the shadow in the late 1980s, strain has refocused on the major fault systems and we are entering a period of increased risk for large earthquakes in California.
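
The localization measure described above can be sketched with synthetic data: grid a strain-rate field, bin epicenters into cells, and compare the fraction of events falling in high-strain cells between a diffuse and a localized catalog. The field, threshold, and catalogs below are toy stand-ins, not the GPS strain rates or seismicity used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def fraction_in_high_strain(quake_xy, strain_grid, extent, threshold):
    """Fraction of epicenters in cells whose strain rate exceeds threshold.

    strain_grid: 2-D array covering extent = (xmin, xmax, ymin, ymax).
    A toy stand-in for correlating a strain-rate field with a catalog.
    """
    ny, nx = strain_grid.shape
    xmin, xmax, ymin, ymax = extent
    ix = np.clip(((quake_xy[:, 0] - xmin) / (xmax - xmin) * nx).astype(int), 0, nx - 1)
    iy = np.clip(((quake_xy[:, 1] - ymin) / (ymax - ymin) * ny).astype(int), 0, ny - 1)
    return np.mean(strain_grid[iy, ix] > threshold)

# Toy field: strain rate high along a vertical "fault" at x = 50 km
x = np.linspace(0, 100, 50)
strain = np.exp(-((x - 50.0) ** 2) / (2 * 10.0 ** 2))  # Gaussian profile across strike
grid = np.tile(strain, (50, 1))                        # uniform along strike

diffuse = rng.uniform(0, 100, size=(500, 2))           # "accumulation phase" catalog
localized = np.column_stack([rng.normal(50, 8, 500),   # "localization phase" catalog
                             rng.uniform(0, 100, 500)])
ext = (0, 100, 0, 100)
f1 = fraction_in_high_strain(diffuse, grid, ext, threshold=0.5)
f2 = fraction_in_high_strain(localized, grid, ext, threshold=0.5)
print(f"diffuse: {f1:.2f}, localized: {f2:.2f}")
```

The localized catalog puts a much larger fraction of events in high-strain cells, which is the kind of shift in the statistic that distinguishes the two phases.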

  12. Hydrothermal response to a volcano-tectonic earthquake swarm, Lassen, California

    Science.gov (United States)

    Ingebritsen, Steven E.; Shelly, David R.; Hsieh, Paul A.; Clor, Laura; P.H. Seward,; Evans, William C.

    2015-01-01

    The increasing capability of seismic, geodetic, and hydrothermal observation networks allows recognition of volcanic unrest that could previously have gone undetected, creating an imperative to diagnose and interpret unrest episodes. A November 2014 earthquake swarm near Lassen Volcanic National Park, California, which included the largest earthquake in the area in more than 60 years, was accompanied by a rarely observed outburst of hydrothermal fluids. Although the earthquake swarm likely reflects upward migration of endogenous H2O-CO2 fluids in the source region, there is no evidence that such fluids emerged at the surface. Instead, shaking from the modest sized (moment magnitude 3.85) but proximal earthquake caused near-vent permeability increases that triggered increased outflow of hydrothermal fluids already present and equilibrated in a local hydrothermal aquifer. Long-term, multiparametric monitoring at Lassen and other well-instrumented volcanoes enhances interpretation of unrest and can provide a basis for detailed physical modeling.

  13. Foreshocks and aftershocks of the Great 1857 California earthquake

    Science.gov (United States)

    Meltzner, A.J.; Wald, D.J.

    1999-01-01

    The San Andreas fault is the longest fault in California and one of the longest strike-slip faults anywhere in the world, yet we know little about many aspects of its behavior before, during, and after large earthquakes. We conducted a study to locate and to estimate magnitudes for the largest foreshocks and aftershocks of the 1857 M 7.9 Fort Tejon earthquake on the central and southern segments of the fault. We began by searching archived first-hand accounts from 1857 through 1862, by grouping felt reports temporally, and by assigning modified Mercalli intensities to each site. We then used a modified form of the grid-search algorithm of Bakun and Wentworth, derived from empirical analysis of modern earthquakes, to find the location and magnitude most consistent with the assigned intensities for each of the largest events. The result confirms a conclusion of Sieh that at least two foreshocks ('dawn' and 'sunrise') located on or near the Parkfield segment of the San Andreas fault preceded the mainshock. We estimate their magnitudes to be M ~ 6.1 and M ~ 5.6, respectively. The aftershock rate was below average but within one standard deviation of the number of aftershocks expected based on statistics of modern southern California mainshock-aftershock sequences. The aftershocks included two significant events during the first eight days of the sequence, with magnitudes M ~ 6.25 and M ~ 6.7, near the southern half of the rupture; later aftershocks included a M ~ 6 event near San Bernardino in December 1858 and a M ~ 6.3 event near the Parkfield segment in April 1860. From earthquake logs at Fort Tejon, we conclude that the aftershock sequence lasted a minimum of 3.75 years.

  14. Injuries and Traumatic Psychological Exposures Associated with the South Napa Earthquake - California, 2014.

    Science.gov (United States)

    Attfield, Kathleen R; Dobson, Christine B; Henn, Jennifer B; Acosta, Meileen; Smorodinsky, Svetlana; Wilken, Jason A; Barreau, Tracy; Schreiber, Merritt; Windham, Gayle C; Materna, Barbara L; Roisman, Rachel

    2015-09-11

    On August 24, 2014, at 3:20 a.m., a magnitude 6.0 earthquake struck California, with its epicenter in Napa County (1). The earthquake was the largest to affect the San Francisco Bay area in 25 years and caused significant damage in Napa and Solano counties, including widespread power outages, five residential fires, and damage to roadways, waterlines, and 1,600 buildings (2). Two deaths resulted (2). On August 25, Napa County Public Health asked the California Department of Public Health (CDPH) for assistance in assessing postdisaster health effects, including earthquake-related injuries and effects on mental health. On September 23, Solano County Public Health requested similar assistance. A household-level Community Assessment for Public Health Emergency Response (CASPER) was conducted for these counties in two cities (Napa, 3 weeks after the earthquake, and Vallejo, 6 weeks after the earthquake). Among households reporting injuries, a substantial proportion (48% in Napa and 37% in western Vallejo) reported that the injuries occurred during the cleanup period, suggesting that increased messaging on safety precautions after a disaster might be needed. One fifth of respondents overall (27% in Napa and 9% in western Vallejo) reported one or more traumatic psychological exposures in their households. These findings were used by Napa County Mental Health to guide immediate-term mental health resource allocations and to conduct public training sessions and education campaigns to support persons with mental health risks following the earthquake. In addition, to promote community resilience and future earthquake preparedness, Napa County Public Health subsequently conducted community events on the earthquake anniversary and provided outreach workers with psychological first aid training.

  15. On the reported ionospheric precursor of the 1999 Hector Mine, California earthquake

    Science.gov (United States)

    Thomas, Jeremy N.; Love, Jeffrey J.; Komjathy, Attila; Verkhoglyadova, Olga P.; Butala, Mark; Rivera, Nicholas

    2012-01-01

    Using Global Positioning System (GPS) data from sites near the 16 Oct. 1999 Hector Mine, California earthquake, Pulinets et al. (2007) identified anomalous changes in the ionospheric total electron content (TEC) starting one week prior to the earthquake. Pulinets (2007) suggested that precursory phenomena of this type could be useful for predicting earthquakes. On the other hand, and in a separate analysis, Afraimovich et al. (2004) concluded that TEC variations near the epicenter were controlled by solar and geomagnetic activity that were unrelated to the earthquake. In an investigation of these very different results, we examine TEC time series of long duration from GPS stations near and far from the epicenter of the Hector Mine earthquake, and long before and long after the earthquake. While we can reproduce the essential time series results of Pulinets et al., we find that the signal they identify as anomalous is not actually anomalous. Instead, it is just part of normal global-scale TEC variation. We conclude that the TEC anomaly reported by Pulinets et al. is unrelated to the Hector Mine earthquake.

  16. Liquefaction-induced lateral spreading in Oceano, California, during the 2003 San Simeon Earthquake

    Science.gov (United States)

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.; Di Alessandro, Carola; Boatwright, John; Tinsley, John C.; Sell, Russell W.; Rosenberg, Lewis I.

    2004-01-01

    The December 22, 2003, San Simeon, California, (M6.5) earthquake caused damage to houses, road surfaces, and underground utilities in Oceano, California. The community of Oceano is approximately 50 miles (80 km) from the earthquake epicenter. Damage at this distance from a M6.5 earthquake is unusual. To understand the causes of this damage, the U.S. Geological Survey conducted extensive subsurface exploration and monitoring of aftershocks in the months after the earthquake. The investigation included 37 seismic cone penetration tests, 5 soil borings, and aftershock monitoring from January 28 to March 7, 2004. The USGS investigation identified two earthquake hazards in Oceano that explain the San Simeon earthquake damage: site amplification and liquefaction. Site amplification is a phenomenon observed in many earthquakes where the strength of the shaking increases abnormally in areas where the seismic-wave velocity of shallow geologic layers is low. As a result, earthquake shaking is felt more strongly than in surrounding areas without similar geologic conditions. Site amplification in Oceano is indicated by the physical properties of the geologic layers beneath Oceano and was confirmed by monitoring aftershocks. Liquefaction, which is also commonly observed during earthquakes, is a phenomenon where saturated sands lose their strength during an earthquake and become fluid-like and mobile. As a result, the ground may undergo large permanent displacements that can damage underground utilities and well-built surface structures. The type of displacement of major concern associated with liquefaction is lateral spreading because it involves displacement of large blocks of ground down gentle slopes or towards stream channels. The USGS investigation indicates that the shallow geologic units beneath Oceano are very susceptible to liquefaction. They include young sand dunes and clean sandy artificial fill that was used to bury and convert marshes into developable lots. 
Most of

  17. Impact of a Large San Andreas Fault Earthquake on Tall Buildings in Southern California

    Science.gov (United States)

    Krishnan, S.; Ji, C.; Komatitsch, D.; Tromp, J.

    2004-12-01

    In 1857, an earthquake of magnitude 7.9 occurred on the San Andreas fault, starting at Parkfield and rupturing in a southeasterly direction for more than 300 km. Such a unilateral rupture produces significant directivity toward the San Fernando and Los Angeles basins. The strong shaking in the basins due to this earthquake would have had a significant long-period content (2-8 s). If such motions were to happen today, they could have a serious impact on tall buildings in Southern California. In order to study the effects of large San Andreas fault earthquakes on tall buildings in Southern California, we use the finite source of the magnitude 7.9 2001 Denali fault earthquake in Alaska and map it onto the San Andreas fault with the rupture originating at Parkfield and proceeding southward over a distance of 290 km. Using the SPECFEM3D spectral element seismic wave propagation code, we simulate a Denali-like earthquake on the San Andreas fault and compute ground motions at sites located on a grid with a 2.5-5.0 km spacing in the greater Southern California region. We subsequently analyze 3D structural models of an existing tall steel building designed in 1984 as well as one designed according to the current building code (Uniform Building Code, 1997) subjected to the computed ground motion. We use a sophisticated nonlinear building analysis program, FRAME3D, that has the ability to simulate damage in buildings due to three-component ground motion. We summarize the performance of these structural models on contour maps of carefully selected structural performance indices. This study could benefit the city in laying out emergency response strategies in the event of an earthquake on the San Andreas fault, in undertaking appropriate retrofit measures for tall buildings, and in formulating zoning regulations for new construction. 
In addition, the study would provide risk data associated with existing and new construction to insurance companies, real estate developers, and

  18. Triggered surface slips in southern California associated with the 2010 El Mayor-Cucapah, Baja California, Mexico, earthquake

    Science.gov (United States)

    Rymer, Michael J.; Treiman, Jerome A.; Kendrick, Katherine J.; Lienkaemper, James J.; Weldon, Ray J.; Bilham, Roger; Wei, Meng; Fielding, Eric J.; Hernandez, Janis L.; Olson, Brian P.E.; Irvine, Pamela J.; Knepprath, Nichole; Sickler, Robert R.; Tong, Xiaopeng; Siem, Martin E.

    2011-01-01

    The April 4, 2010 (Mw7.2), El Mayor-Cucapah, Baja California, Mexico, earthquake is the strongest earthquake to shake the Salton Trough area since the 1992 (Mw7.3) Landers earthquake. Similar to the Landers event, ground-surface fracturing occurred on multiple faults in the trough. However, the 2010 event triggered surface slip on more faults in the central Salton Trough than previous earthquakes, including multiple faults in the Yuha Desert area, the southwestern section of the Salton Trough. In the central Salton Trough, surface fracturing occurred along the southern San Andreas, Coyote Creek, Superstition Hills, Wienert, Kalin, and Imperial Faults and along the Brawley Fault Zone, all of which are known to have slipped in historical time, either in primary (tectonic) slip and/or in triggered slip. Surface slip in association with the El Mayor-Cucapah earthquake is at least the eighth time in the past 42 years that a local or regional earthquake has triggered slip along faults in the central Salton Trough. In the southwestern part of the Salton Trough, surface fractures (triggered slip) occurred in a broad area of the Yuha Desert. This is the first time that triggered slip has been observed in the southwestern Salton Trough.

  19. A 30-year history of earthquake crisis communication in California and lessons for the future

    Science.gov (United States)

    Jones, L.

    2015-12-01

    The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985, concerning the probability (approximately 5%) that an M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the 30 years since, aftershock advisories have become routine, and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. The time to create and release a statement became shorter with experience after the first public advisory (for the 1988 Lake Elsman earthquake), which was released 18 hours after the triggering event, but the process was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (the National Earthquake Prediction Evaluation Council) so that statements can be sent to the public automatically. This talk will review the advisories, the variations in wording, and the public response, and compare these with social science research about successful crisis communication, to create recommendations for future advisories.

  20. Triggered seismicity and deformation between the Landers, California, and Little Skull Mountain, Nevada, earthquakes

    Science.gov (United States)

    Bodin, Paul; Gomberg, Joan

    1994-01-01

    This article presents evidence for the channeling of strain energy released by the Ms = 7.4 Landers, California, earthquake within the eastern California shear zone (ECSZ). We document an increase in seismicity levels during the 22-hr period starting with the Landers earthquake and culminating 22 hr later with the Ms = 5.4 Little Skull Mountain (LSM), Nevada, earthquake. We evaluate the completeness of regional seismicity catalogs during this period and find that the continuity of post-Landers strain release within the ECSZ is even more pronounced than is evident from the catalog data. We hypothesize that regional-scale connectivity of faults within the ECSZ and LSM region is a critical ingredient in the unprecedented scale and distribution of remotely triggered earthquakes and geodetically manifest strain changes that followed the Landers earthquake. The viability of static strain changes as triggering agents is tested using numerical models. Modeling results illustrate that regional-scale fault connectivity can increase the static strain changes by approximately an order of magnitude at distances of at least 280 km, the distance between the Landers and LSM epicenters. This is possible for models that include both a network of connected faults that slip “sympathetically” and realistic levels of tectonic prestrain. Alternatively, if dynamic strains are a more significant triggering agent than static strains, ECSZ structure may still be important in determining the distribution of triggered seismic and aseismic deformation.

  1. The Pacific Tsunami Warning Center's Response to the Tohoku Earthquake and Tsunami

    Science.gov (United States)

    Weinstein, S. A.; Becker, N. C.; Shiro, B.; Koyanagi, K. K.; Sardina, V.; Walsh, D.; Wang, D.; McCreery, C. S.; Fryer, G. J.; Cessaro, R. K.; Hirshorn, B. F.; Hsu, V.

    2011-12-01

    The largest Pacific basin earthquake in 47 years, and the largest anywhere since the 2004 Sumatra earthquake, struck off the east coast of the Tohoku region of Honshu, Japan at 5:46 UTC on 11 March 2011. The Tohoku earthquake (Mw 9.0) generated a massive tsunami with runups of up to 40 m along the Tohoku coast. The tsunami waves crossed the Pacific Ocean causing significant damage as far away as Hawaii, California, and Chile, thereby becoming the largest, most destructive tsunami in the Pacific Basin since 1960. Triggers on the seismic stations at Erimo, Hokkaido (ERM) and Matsushiro, Honshu (MAJO) alerted Pacific Tsunami Warning Center (PTWC) scientists 90 seconds after the earthquake began. Four minutes after its origin, and about one minute after the earthquake's rupture ended, PTWC issued an observatory message reporting a preliminary magnitude of 7.5. Eight minutes after origin time, the Japan Meteorological Agency (JMA) issued its first international tsunami message in its capacity as the Northwest Pacific Tsunami Advisory Center. In accordance with international tsunami warning system protocols, PTWC then followed with its first international tsunami warning message using JMA's earthquake parameters, including an Mw of 7.8. Additional Mwp, mantle wave, and W-phase magnitude estimations based on the analysis of later-arriving seismic data at PTWC revealed that the earthquake magnitude reached at least 8.8, and that a destructive tsunami would likely be crossing the Pacific Ocean. The earthquake damaged the nearest coastal sea-level station, located 90 km from the epicenter in Ofunato, Japan. The NOAA DART sensor situated 600 km off the coast of Sendai, Japan, at a depth of 5.6 km recorded a tsunami wave amplitude of nearly two meters, making it by far the largest tsunami wave ever recorded by a DART sensor.
Thirty minutes later, a coastal sea-level station at Hanasaki, Japan, 600 km from the epicenter, recorded a tsunami wave amplitude of

  2. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in spring 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and the Department of Homeland Security's Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU), on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs.
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  3. The USGS Earthquake Notification Service (ENS): Customizable notifications of earthquakes around the globe

    Science.gov (United States)

    Wald, Lisa A.; Wald, David J.; Schwarz, Stan; Presgrave, Bruce; Earle, Paul S.; Martinez, Eric; Oppenheimer, David

    2008-01-01

    At the beginning of 2006, the U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) introduced a new automated Earthquake Notification Service (ENS) to take the place of the National Earthquake Information Center (NEIC) "Bigquake" system and the various other individual EHP e-mail list-servers for separate regions in the United States. These included northern California, southern California, and the central and eastern United States. ENS is a "one-stop shopping" system that allows Internet users to subscribe to flexible and customizable notifications for earthquakes anywhere in the world. The customization capability allows users to define the what (magnitude threshold), the when (day and night thresholds), and the where (specific regions) for their notifications. Customization is achieved by employing a per-user request profile, allowing the notifications to be tailored to each individual's requirements. Such earthquake-parameter-specific custom delivery was not possible with simple e-mail list-servers. Now that event and user profiles are in a structured query language (SQL) database, additional flexibility is possible. At the time of this writing, ENS had more than 114,000 subscribers, with more than 200,000 separate user profiles. On a typical day, more than 188,000 messages are sent to a variety of widely distributed users for a wide range of earthquake locations and magnitudes. The purpose of this article is to describe how ENS works, highlight the features it offers, and summarize plans for future developments.

  4. Radiated Seismic Energy of Earthquakes in the South-Central Region of the Gulf of California, Mexico

    Science.gov (United States)

    Castro, Raúl R.; Mendoza-Camberos, Antonio; Pérez-Vertti, Arturo

    2018-05-01

    We estimated the radiated seismic energy (ES) of 65 earthquakes located in the south-central region of the Gulf of California. Most of these events occurred along active transform faults that define the Pacific-North America plate boundary and have magnitudes between M3.3 and M5.9. We corrected the spectral records for attenuation using nonparametric S-wave attenuation functions determined with the whole data set. The path effects were isolated from the seismic source using a spectral inversion. We computed the radiated seismic energy of the earthquakes by integrating the squared velocity source spectrum and estimated their apparent stresses. We found that most events have apparent stress between 3 × 10⁻⁴ and 3 MPa. Model-independent estimates of the ratio between seismic energy and moment (ES/M0) indicate that this ratio is independent of earthquake size. We conclude that in general the apparent stress is low (σa < 3 MPa) in the south-central and southern Gulf of California.
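
    The energy estimate described above can be sketched under a Brune omega-square source assumption. The function below is illustrative only (its name, parameter values, and spectral model are assumptions, not taken from the paper): it integrates the squared velocity source spectrum to obtain Es, then forms the apparent stress σa = μ·Es/M0 with rigidity μ = ρβ².

```python
import numpy as np

def apparent_stress(m0, stress_drop, rho=2700.0, beta=3500.0):
    """Illustrative sketch: radiated S-wave energy for a Brune omega-square
    source, via numerical integration of the squared velocity spectrum.
    SI units: m0 in N*m, stress_drop in Pa, rho in kg/m^3, beta in m/s."""
    mu = rho * beta ** 2
    # Brune corner frequency
    fc = 0.49 * beta * (stress_drop / m0) ** (1.0 / 3.0)
    f = np.geomspace(fc / 1000.0, fc * 1000.0, 20000)
    moment_rate = m0 / (1.0 + (f / fc) ** 2)       # omega-square moment-rate spectrum
    integrand = (f * moment_rate) ** 2             # squared velocity source spectrum
    integral = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(f))
    es = 4.0 * np.pi / (5.0 * rho * beta ** 5) * integral
    return es, mu * es / m0
```

    For a Brune source this recovers the classical result σa ≈ 0.23 Δσ regardless of M0, consistent with the scale-independence of ES/M0 noted in the abstract.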

  5. Frequency-Dependent Tidal Triggering of Low Frequency Earthquakes Near Parkfield, California

    Science.gov (United States)

    Xue, L.; Burgmann, R.; Shelly, D. R.

    2017-12-01

    The effect of small periodic stress perturbations on earthquake generation is not well understood; however, the rate of low-frequency earthquakes (LFEs) near Parkfield, California has been found to be strongly correlated with solid earth tides. Laboratory experiments and theoretical analyses show that the period of imposed forcing and source properties affect the sensitivity to triggering and the phase relation of the peak seismicity rate and the periodic stress, but frequency-dependent triggering has not been quantitatively explored in the field. Tidal forcing acts over a wide range of frequencies, so the sensitivity to tidal triggering of LFEs provides a good probe of the physical mechanisms affecting earthquake generation. In this study, we consider the tidal triggering of LFEs near Parkfield, California since 2001. We find the LFE rate is correlated with tidal shear stress, normal stress rate and shear stress rate. The occurrence of LFEs can also be independently modulated by groups of tidal constituents at semi-diurnal, diurnal and fortnightly frequencies. The strength of the response of LFEs to the different tidal constituents varies between LFE families. Each LFE family has an optimal triggering frequency, which does not appear to be depth dependent or systematically related to other known properties. This suggests the period of the applied forcing plays an important role in the triggering process, and the interaction of periods of loading history and source region properties, such as friction, effective normal stress and pore fluid pressure, produces the observed frequency-dependent tidal triggering of LFEs.
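
    A common way to quantify modulation of event rates at a single tidal constituent is the Schuster test. The sketch below is a standard significance test, not the study's specific code; the function name and inputs are assumed for illustration.

```python
import numpy as np

def schuster_p(event_times, period):
    """Schuster test p-value: probability that the observed clustering of
    event phases at the given period could arise from uniformly random
    times (smaller p = stronger periodic modulation). Times and period
    must share the same unit (e.g., days)."""
    phases = 2.0 * np.pi * (np.asarray(event_times, dtype=float) % period) / period
    r2 = np.sum(np.cos(phases)) ** 2 + np.sum(np.sin(phases)) ** 2
    return float(np.exp(-r2 / len(phases)))
```

    Applying the test separately at semi-diurnal, diurnal, and fortnightly periods gives a per-constituent measure of triggering sensitivity of the kind discussed above.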

  6. The Loma Prieta, California, Earthquake of October 17, 1989: Strong Ground Motion and Ground Failure

    Science.gov (United States)

    Coordinated by Holzer, Thomas L.

    1992-01-01

    Professional Paper 1551 describes the effects at the land surface caused by the Loma Prieta earthquake. These effects include the pattern and characteristics of strong ground shaking, liquefaction of both floodplain deposits along the Pajaro and Salinas Rivers in the Monterey Bay region and sandy artificial fills along the margins of San Francisco Bay, landslides in the epicentral region, and increased stream flow. Some significant findings and their impacts were:
    * Strong shaking that was amplified by a factor of about two by soft soils caused damage at up to 100 kilometers (60 miles) from the epicenter.
    * Instrumental recordings of the ground shaking have been used to improve how building codes consider site amplification effects from soft soils.
    * Liquefaction at 134 locations caused $99.2 million of the total earthquake loss of $5.9 billion. Liquefaction of floodplain deposits and sandy artificial fills was similar in nature to that which occurred in the 1906 San Francisco earthquake and indicated that many areas remain susceptible to liquefaction damage in the San Francisco and Monterey Bay regions.
    * Landslides caused $30 million in earthquake losses, damaging at least 200 residences. Many landslides showed evidence of movement in previous earthquakes.
    * Recognition of the similarities between liquefaction and landslides in 1906 and 1989, and research in intervening years that established methodologies to map liquefaction and landslide hazards, prompted the California legislature to pass in 1990 the Seismic Hazards Mapping Act, which required the California Geological Survey to delineate regulatory zones of areas potentially susceptible to these hazards.
    * The earthquake caused the flow of many streams in the epicentral region to increase. Effects were noted up to 88 km from the epicenter.
    * Post-earthquake studies of the Marina District of San Francisco provide perhaps the most comprehensive case history of earthquake effects at a specific site developed for

  7. The ShakeOut Earthquake Scenario - A Story That Southern Californians Are Writing

    Science.gov (United States)

    Perry, Suzanne; Cox, Dale; Jones, Lucile; Bernknopf, Richard; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne

    2008-01-01

    The question is not if but when southern California will be hit by a major earthquake - one so damaging that it will permanently change lives and livelihoods in the region. How severe the changes will be depends on the actions that individuals, schools, businesses, organizations, communities, and governments take to get ready. To help prepare for this event, scientists of the U.S. Geological Survey (USGS) have changed the way that earthquake scenarios are done, uniting a multidisciplinary team that spans an unprecedented number of specialties. The team includes the California Geological Survey, Southern California Earthquake Center, and nearly 200 other partners in government, academia, emergency response, and industry, working to understand the long-term impacts of an enormous earthquake on the complicated social and economic interactions that sustain southern California society. This project, the ShakeOut Scenario, has applied the best current scientific understanding to identify what can be done now to avoid an earthquake catastrophe. More information on the science behind this project will be available in The ShakeOut Scenario (USGS Open-File Report 2008-1150; http://pubs.usgs.gov/of/2008/1150/). The 'what if?' earthquake modeled in the ShakeOut Scenario is a magnitude 7.8 on the southern San Andreas Fault. Geologists selected the details of this hypothetical earthquake by considering the amount of stored strain on that part of the fault with the greatest risk of imminent rupture. From this, seismologists and computer scientists modeled the ground shaking that would occur in this earthquake. Engineers and other professionals used the shaking to produce a realistic picture of this earthquake's damage to buildings, roads, pipelines, and other infrastructure. From these damages, social scientists projected casualties, emergency response, and the impact of the scenario earthquake on southern California's economy and society. The earthquake, its damages, and

  8. Evaluation of Real-Time Performance of the Virtual Seismologist Earthquake Early Warning Algorithm in Switzerland and California

    Science.gov (United States)

    Behr, Y.; Cua, G. B.; Clinton, J. F.; Heaton, T. H.

    2012-12-01

    The Virtual Seismologist (VS) method is a Bayesian approach to regional network-based earthquake early warning (EEW) originally formulated by Cua and Heaton (2007). Implementation of VS into real-time EEW codes has been an on-going effort of the Swiss Seismological Service at ETH Zürich since 2006, with support from ETH Zürich, various European projects, and the United States Geological Survey (USGS). VS is one of three EEW algorithms - the other two being ElarmS (Allen and Kanamori, 2003) and On-Site (Wu and Kanamori, 2005; Boese et al., 2008) - that form the basis of the California Integrated Seismic Network (CISN) ShakeAlert system, a USGS-funded prototype end-to-end EEW system that could potentially be implemented in California. In Europe, VS is currently operating as a real-time test system in Switzerland. As part of the on-going EU project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction), VS will be installed and tested at other European networks. VS has been running in real-time on stations of the Southern California Seismic Network (SCSN) since July 2008, and on stations of the Berkeley Digital Seismic Network (BDSN) and the USGS Menlo Park strong motion network in northern California since February 2009. In Switzerland, VS has been running in real-time on stations monitored by the Swiss Seismological Service (including stations from Austria, France, Germany, and Italy) since 2010. We present summaries of the real-time performance of VS in Switzerland and California over the past two and three years, respectively. The empirical relationships used by VS to estimate magnitudes and ground motion, originally derived from southern California data, are demonstrated to perform well in northern California and Switzerland. Implementation in real-time and off-line testing in Europe will potentially be extended to southern Italy, western Greece, Istanbul, Romania, and Iceland. Integration of the VS algorithm into both the CISN Advanced

  9. Final Environmental Assessment for the California Space Center at Vandenberg Air Force Base, California

    Science.gov (United States)

    2010-06-02

    Chapter 3. Affected Environment, Final Environmental Assessment - California Space Center, Vandenberg Air Force Base: ...rooted, mesophytic plant species... the root and debris zone of the... protruding objects, slippery soils or mud, and biological hazards including vegetation (i.e. poison oak and stinging nettle), animals (i.e. insects

  10. Conditional Probabilities of Large Earthquake Sequences in California from the Physics-based Rupture Simulator RSQSim

    Science.gov (United States)

    Gilchrist, J. J.; Jordan, T. H.; Shaw, B. E.; Milner, K. R.; Richards-Dinger, K. B.; Dieterich, J. H.

    2017-12-01

    Within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM), we are developing physics-based forecasting models for earthquake ruptures in California. We employ the 3D boundary element code RSQSim (Rate-State Earthquake Simulator of Dieterich & Richards-Dinger, 2010) to generate synthetic catalogs with tens of millions of events that span up to a million years each. This code models rupture nucleation by rate- and state-dependent friction and Coulomb stress transfer in complex, fully interacting fault systems. The Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) fault and deformation models are used to specify the fault geometry and long-term slip rates. We have employed the Blue Waters supercomputer to generate long catalogs of simulated California seismicity from which we calculate the forecasting statistics for large events. We have performed probabilistic seismic hazard analysis with RSQSim catalogs that were calibrated with system-wide parameters and found a remarkably good agreement with UCERF3 (Milner et al., this meeting). We build on this analysis, comparing the conditional probabilities of sequences of large events from RSQSim and UCERF3. In making these comparisons, we consider the epistemic uncertainties associated with the RSQSim parameters (e.g., rate- and state-frictional parameters), as well as the effects of model-tuning (e.g., adjusting the RSQSim parameters to match UCERF3 recurrence rates). The comparisons illustrate how physics-based rupture simulators might assist forecasters in understanding the short-term hazards of large aftershocks and multi-event sequences associated with complex, multi-fault ruptures.
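
    As a rough illustration of how conditional probabilities of large-event sequences can be read off a long synthetic catalog, a minimal sketch might look like the following. The function name, thresholds, and catalog format are hypothetical; this is not the CISM/RSQSim analysis code.

```python
import numpy as np

def prob_followup(times, mags, m_trigger=7.0, m_target=7.0, dt_years=1.0):
    """Fraction of M>=m_trigger events in a synthetic catalog that are
    followed by another M>=m_target event within dt_years: an empirical
    conditional probability P(follow-up | large event)."""
    times = np.asarray(times, dtype=float)
    mags = np.asarray(mags, dtype=float)
    order = np.argsort(times)
    times, mags = times[order], mags[order]
    triggers = np.flatnonzero(mags >= m_trigger)
    if triggers.size == 0:
        return float("nan")
    hits = 0
    for i in triggers:
        window = (times > times[i]) & (times <= times[i] + dt_years) & (mags >= m_target)
        hits += bool(window.any())
    return hits / triggers.size
```

    With catalogs spanning up to a million years, such empirical fractions become stable enough to compare against the corresponding UCERF3 conditional probabilities.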

  11. The Napa (California, US) earthquake of 24 August 2014 (10.24 UT) Magnitude = 6.0

    International Nuclear Information System (INIS)

    Scotti, Oona

    2014-01-01

    This publication briefly presents the characteristics of an earthquake which occurred in California in August 2014, indicates some data recorded by local seismic stations, and gives a brief overview of human and economic damages. It analyses the geological location of the earthquake, recalls previous events and outlines the local seismic risk. After noting that there were no consequences for the closest nuclear power station (300 km away), it presents lessons learned from this event concerning surface rupture, in order to better assess the risk of surface faulting.

  12. Where was the 1898 Mare Island Earthquake? Insights from the 2014 South Napa Earthquake

    Science.gov (United States)

    Hough, S. E.

    2014-12-01

    The 2014 South Napa earthquake provides an opportunity to reconsider the Mare Island earthquake of 31 March 1898, which caused severe damage to buildings at a Navy yard on the island. Reviewing archival accounts of the 1898 earthquake, I estimate a lower intensity magnitude, 5.8, than the value in the current Uniform California Earthquake Rupture Forecast (UCERF) catalog (6.4). However, I note that intensity magnitude can differ from Mw by upwards of half a unit depending on stress drop, which for a historical earthquake is unknowable. In the aftermath of the 2014 earthquake, there has been speculation that apparently severe effects on Mare Island in 1898 were due to the vulnerability of local structures. No surface rupture has ever been identified from the 1898 event, which is commonly associated with the Hayward-Rodgers Creek fault system, some 10 km west of Mare Island (e.g., Parsons et al., 2003). Reconsideration of detailed archival accounts of the 1898 earthquake, together with a comparison of the intensity distributions for the two earthquakes, points to genuinely severe, likely near-field ground motions on Mare Island. The 2014 earthquake did cause significant damage to older brick buildings on Mare Island, but the level of damage does not match the severity of documented damage in 1898. The high-intensity fields for the two earthquakes are, moreover, spatially shifted, with the centroid of the 2014 distribution near the town of Napa and that of the 1898 distribution near Mare Island, east of the Hayward-Rodgers Creek system. I conclude that the 1898 Mare Island earthquake was centered on or near Mare Island, possibly involving rupture of one or both strands of the Franklin fault, a low-slip-rate fault sub-parallel to the Rodgers Creek fault to the west and the West Napa fault to the east. I estimate Mw5.8 assuming an average stress drop; data are also consistent with Mw6.4 if stress drop was a factor of ≈3 lower than average for California earthquakes. I

  13. Hospital compliance with a state unfunded mandate: the case of California's Earthquake Safety Law.

    Science.gov (United States)

    McCue, Michael J; Thompson, Jon M

    2012-01-01

    In recent years, community hospitals have experienced heightened regulation with many unfunded mandates. The authors assessed the market, organizational, operational, and financial characteristics of general acute care hospitals in California that have a main acute care hospital building that is noncompliant with state requirements and at risk of major structural collapse from earthquakes. Using California hospital data from 2007 to 2009, and employing logistic regression analysis, the authors found that hospitals having buildings that are at the highest risk of collapse are located in larger population markets, possess smaller market share, have a higher percentage of Medicaid patients, and have less liquidity.

  14. Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT): Towards the Next Generation of Internship

    Science.gov (United States)

    Perry, S.; Benthien, M.; Jordan, T. H.

    2005-12-01

    The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), a 3D visualization software that was prototyped by interns last year, using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

  15. GPS Time Series Analysis of Southern California Associated with the 2010 M7.2 El Mayor/Cucapah Earthquake

    Science.gov (United States)

    Granat, Robert; Donnellan, Andrea

    2011-01-01

    The magnitude 7.2 El Mayor/Cucapah earthquake that occurred in Mexico on April 4, 2010 was well instrumented with continuous GPS stations in California. Large offsets were observed at the GPS stations as a result of deformation from the earthquake, providing information about the co-seismic fault slip as well as fault slip from large aftershocks. Information can also be obtained from the position time series at each station.

  16. Recent Achievements of the Collaboratory for the Study of Earthquake Predictability

    Science.gov (United States)

    Jackson, D. D.; Liukis, M.; Werner, M. J.; Schorlemmer, D.; Yu, J.; Maechling, P. J.; Zechar, J. D.; Jordan, T. H.

    2015-12-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) supports a global program to conduct prospective earthquake forecasting experiments. CSEP testing centers are now operational in California, New Zealand, Japan, China, and Europe with 435 models under evaluation. The California testing center, operated by SCEC, has been operational since Sept 1, 2007, and currently hosts 30-minute, 1-day, 3-month, 1-year and 5-year forecasts, both alarm-based and probabilistic, for California, the Western Pacific, and worldwide. We have reduced testing latency, implemented prototype evaluation of M8 forecasts, and are currently developing formats and procedures to evaluate externally-hosted forecasts and predictions. These efforts are related to CSEP support of the USGS program in operational earthquake forecasting and a DHS project to register and test external forecast procedures from experts outside seismology. A retrospective experiment for the 2010-2012 Canterbury earthquake sequence has been completed, and the results indicate that some physics-based and hybrid models outperform purely statistical (e.g., ETAS) models. The experiment also demonstrates the power of the CSEP cyberinfrastructure for retrospective testing. Our current development includes evaluation strategies that increase computational efficiency for high-resolution global experiments, such as the evaluation of the Global Earthquake Activity Rate (GEAR) model. We describe the open-source CSEP software that is available to researchers as they develop their forecast models (http://northridge.usc.edu/trac/csep/wiki/MiniCSEP). We also discuss applications of CSEP infrastructure to geodetic transient detection and how CSEP procedures are being
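
    The kind of forecast evaluation CSEP performs can be illustrated with the simplest of its standard consistency tests, the number test (N-test). The sketch below conveys the idea under a Poisson assumption on event counts; it is not the testing centers' implementation, and the function names are assumed.

```python
import math

def n_test(lam, n_obs):
    """N-test sketch: given a forecast expected count lam, return the
    Poisson tail probabilities of seeing at least (delta1) and at most
    (delta2) n_obs events. Very small values of either tail indicate the
    forecast rate is inconsistent with the observed count."""
    def poisson_cdf(n):
        return math.exp(-lam) * sum(lam ** k / math.factorial(k) for k in range(n + 1))
    delta1 = 1.0 if n_obs == 0 else 1.0 - poisson_cdf(n_obs - 1)  # P(N >= n_obs)
    delta2 = poisson_cdf(n_obs)                                   # P(N <= n_obs)
    return delta1, delta2
```

    Alarm-based forecasts are evaluated with different statistics, but the probabilistic grid-based forecasts hosted at the testing centers are scored with families of tests built on comparisons of this type.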

  17. Unusual downhole and surface free-field records near the Carquinez Strait bridges during the 24 August 2014 Mw6.0 South Napa, California earthquake

    Science.gov (United States)

    Çelebi, Mehmet; Ghahari, S. Farid; Taciroglu, Ertugrul

    2015-01-01

    This paper reports the results of Part A of a study of the recorded strong-motion accelerations at the well-instrumented network of the two side-by-side parallel bridges over the Carquinez Strait during the 24 August 2014 (Mw 6.0) South Napa, Calif. earthquake that occurred at 03:20:44 PDT with epicentral coordinates 38.22N, 122.31W (http://earthquake.usgs.gov/earthquakes/eqarchives/poster/2014/20140824.php, last accessed on October 17, 2014). Both bridges and two boreholes were instrumented by the California Strong Motion Instrumentation Program (CSMIP) of the California Geological Survey (CGS) (Shakal et al., 2014). A comprehensive comparison of several ground motion prediction equations as they relate to recorded ground motions of the earthquake is provided by Baltay and Boatwright (2015).

  18. Characterizing potentially induced earthquake rate changes in the Brawley Seismic Zone, southern California

    Science.gov (United States)

    Llenos, Andrea L.; Michael, Andrew J.

    2016-01-01

    The Brawley seismic zone (BSZ), in the Salton trough of southern California, has a history of earthquake swarms and geothermal energy exploitation. Some earthquake rate changes may have been induced by fluid extraction and injection activity at local geothermal fields, particularly at the North Brawley Geothermal Field (NBGF) and at the Salton Sea Geothermal Field (SSGF). We explore this issue by examining earthquake rate changes and interevent distance distributions in these fields. In Oklahoma and Arkansas, where considerable wastewater injection occurs, increases in background seismicity rate and aftershock productivity and decreases in interevent distance were indicative of fluid‐injection‐induced seismicity. Here, we test if similar changes occur that may be associated with fluid injection and extraction in geothermal areas. We use stochastic epidemic‐type aftershock sequence models to detect changes in the underlying seismogenic processes, shown by statistically significant changes in the model parameters. The most robust model changes in the SSGF roughly occur when large changes in net fluid production occur, but a similar correlation is not seen in the NBGF. Also, although both background seismicity rate and aftershock productivity increased for fluid‐injection‐induced earthquake rate changes in Oklahoma and Arkansas, the background rate increases significantly in the BSZ only, roughly corresponding with net fluid production rate increases. Moreover, in both fields the interevent spacing does not change significantly during active energy projects. This suggests that, although geothermal field activities in a tectonically active region may not significantly change the physics of earthquake interactions, earthquake rates may still be driven by fluid injection or extraction rates, particularly in the SSGF.
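
    The epidemic-type aftershock sequence (ETAS) framework used in this analysis rests on a conditional intensity that can be sketched in a few lines. The parameter values below are generic textbook illustrations, not the values the authors estimated for the Brawley seismic zone, and the function name is assumed.

```python
import numpy as np

def etas_rate(t, event_times, event_mags, mu=0.2, k=0.05, alpha=1.0,
              c=0.01, p=1.2, m_min=2.5):
    """Minimal temporal ETAS conditional intensity (events per day):
    lambda(t) = mu + sum over past events i of
                k * 10**(alpha * (M_i - m_min)) / (t - t_i + c)**p,
    i.e., a background rate plus Omori-law aftershock triggering scaled
    by each parent's magnitude."""
    event_times = np.asarray(event_times, dtype=float)
    event_mags = np.asarray(event_mags, dtype=float)
    past = event_times < t
    trig = (k * 10.0 ** (alpha * (event_mags[past] - m_min))
            / (t - event_times[past] + c) ** p)
    return float(mu + trig.sum())
```

    Changes in the fitted background rate mu and the productivity parameters of this form are the "statistically significant changes in the model parameters" used above to flag potentially induced rate changes.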

  19. A spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3‐ETAS): Toward an operational earthquake forecast

    Science.gov (United States)

    Field, Edward; Milner, Kevin R.; Hardebeck, Jeanne L.; Page, Morgan T.; van der Elst, Nicholas; Jordan, Thomas H.; Michael, Andrew J.; Shaw, Bruce E.; Werner, Maximilian J.

    2017-01-01

    We, the ongoing Working Group on California Earthquake Probabilities, present a spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3), with the goal being to represent aftershocks, induced seismicity, and otherwise triggered events as a potential basis for operational earthquake forecasting (OEF). Specifically, we add an epidemic‐type aftershock sequence (ETAS) component to the previously published time‐independent and long‐term time‐dependent forecasts. This combined model, referred to as UCERF3‐ETAS, collectively represents a relaxation of segmentation assumptions, the inclusion of multifault ruptures, an elastic‐rebound model for fault‐based ruptures, and a state‐of‐the‐art spatiotemporal clustering component. It also represents an attempt to merge fault‐based forecasts with statistical seismology models, such that information on fault proximity, activity rate, and time since last event are considered in OEF. We describe several unanticipated challenges that were encountered, including a need for elastic rebound and characteristic magnitude–frequency distributions (MFDs) on faults, both of which are required to get realistic triggering behavior. UCERF3‐ETAS produces synthetic catalogs of M≥2.5 events, conditioned on any prior M≥2.5 events that are input to the model. We evaluate results with respect to both long‐term (1000 year) simulations as well as for 10‐year time periods following a variety of hypothetical scenario mainshocks. Although the results are very plausible, they are not always consistent with the simple notion that triggering probabilities should be greater if a mainshock is located near a fault. Important factors include whether the MFD near faults includes a significant characteristic earthquake component, as well as whether large triggered events can nucleate from within the rupture zone of the mainshock. Because UCERF3‐ETAS has many sources of uncertainty, as
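    A toy, single-generation ETAS-style simulator conveys how such synthetic catalogs are built: Poisson background events, Gutenberg-Richter magnitudes, and Omori-law aftershock delays. All parameter values and names below are illustrative; the real UCERF3-ETAS model is multi-generational, fault-based, and far richer:

```python
import math
import random

def simulate_etas(t_end, mu, k, alpha, c, p, b=1.0, m_min=2.5, seed=1):
    """Toy ETAS-style catalog: stationary background seismicity (rate mu
    per day) plus one generation of Omori-law aftershocks. Magnitudes are
    Gutenberg-Richter with b-value b above m_min. Returns (time, mag, kind)
    tuples sorted by time."""
    rng = random.Random(seed)

    def gr_magnitude():
        # inverse-CDF sample of an exponential magnitude distribution
        return m_min - math.log10(rng.random()) / b

    catalog = []
    t = 0.0
    while True:
        t += rng.expovariate(mu)            # next background event
        if t > t_end:
            break
        m = gr_magnitude()
        catalog.append((t, m, "background"))
        # expected aftershock productivity, truncated to an integer
        n_aft = int(k * 10.0 ** (alpha * (m - m_min)))
        for _ in range(n_aft):
            u = rng.random()
            # inverse-CDF sample of the Omori-law delay (assumes p > 1)
            dt = c * ((1.0 - u) ** (1.0 / (1.0 - p)) - 1.0)
            if t + dt <= t_end:
                catalog.append((t + dt, gr_magnitude(), "aftershock"))
    catalog.sort()
    return catalog
```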

  20. Rapid finite-fault inversions in Southern California using Cybershake Green's functions

    Science.gov (United States)

    Thio, H. K.; Polet, J.

    2017-12-01

    We have developed a system for rapid finite fault inversion for intermediate and large Southern California earthquakes using local, regional and teleseismic seismic waveforms as well as geodetic data. For modeling the local seismic data, we use 3D Green's functions from the Cybershake project, which were made available to us courtesy of the Southern California Earthquake Center (SCEC). The use of 3D Green's functions allows us to extend the inversion to higher frequency waveform data and smaller magnitude earthquakes, in addition to achieving improved solutions in general. The ultimate aim of this work is to develop the ability to provide high quality finite fault models within a few hours after any damaging earthquake in Southern California, so that they may be used as input to various post-earthquake assessment tools such as ShakeMap, as well as by the scientific community and other interested parties. Additionally, a systematic determination of finite fault models has value as a resource for scientific studies on detailed earthquake processes, such as rupture dynamics and scaling relations. We are using an established least-squares finite fault inversion method that has been applied extensively both on large as well as smaller regional earthquakes, in conjunction with the 3D Green's functions, where available, as well as 1D Green's functions for areas for which the Cybershake library has not yet been developed. We are carrying out validation and calibration of this system using significant earthquakes that have occurred in the region over the last two decades, spanning a range of locations and magnitudes (5.4 and higher).
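    The linear least-squares step at the heart of such finite-fault inversions can be illustrated with a small damped normal-equations solver. This is a generic sketch, not the authors' code; real slip inversions add smoothing and positivity constraints on top of this:

```python
def damped_least_squares(G, d, eps=0.1):
    """Solve min ||G m - d||^2 + eps^2 ||m||^2 via the normal equations
    (G^T G + eps^2 I) m = G^T d, using Gaussian elimination with partial
    pivoting. G is a list of rows; d is the data vector."""
    n = len(G[0])
    A = [[sum(G[k][i] * G[k][j] for k in range(len(G)))
          + (eps * eps if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    b = [sum(G[k][i] * d[k] for k in range(len(G))) for i in range(n)]
    # forward elimination
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # back substitution
    m = [0.0] * n
    for i in range(n - 1, -1, -1):
        m[i] = (b[i] - sum(A[i][j] * m[j] for j in range(i + 1, n))) / A[i][i]
    return m
```

    In a slip inversion, each row of G would hold a (1D or 3D) Green's function sampled at the data points and each element of m a subfault slip.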

  1. Responses of a tall building in Los Angeles, California as inferred from local and distant earthquakes

    Science.gov (United States)

    Çelebi, Mehmet; Ulusoy, Hasan; Nakata, Nori

    2016-01-01

    The increasing inventory of tall buildings in the United States and elsewhere may be subjected to motions generated by near and distant seismic sources that cause long-period effects. Multiple sets of records that exhibited such effects were retrieved from tall buildings in Tokyo and Osaka, ~350 km and 770 km from the epicenter of the 2011 Tohoku earthquake. In California, very few tall buildings have been instrumented. An instrumented 52-story building in downtown Los Angeles recorded seven local and distant earthquakes. Spectral and system identification methods reveal significant low frequencies of interest (~0.17 Hz, 0.56 Hz and 1.05 Hz). These frequencies compare well with those computed by transfer functions; however, small variations are observed in the significant low frequencies among the seven earthquakes. The torsional and translational frequencies are very close and are coupled. A beating effect is observed in the data from at least two of the seven earthquakes.
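    The spectral analysis used to identify such modal frequencies can be approximated by picking the largest peak of a discrete Fourier transform of a recorded channel. This is a minimal stand-in, not the authors' system-identification procedure:

```python
import cmath
import math

def dominant_frequency(x, dt):
    """Frequency (Hz) of the largest spectral peak of real signal x sampled
    at interval dt, via a direct O(N^2) DFT (fine for short records).
    Skips the DC bin; resolution is 1/(N*dt) Hz."""
    n = len(x)
    best_k, best_amp = 1, 0.0
    for k in range(1, n // 2):
        s = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        if abs(s) > best_amp:
            best_k, best_amp = k, abs(s)
    return best_k / (n * dt)
```

    Applying this to roof and ground channels separately (or to their transfer function) is the usual first step before more formal system identification.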

  2. Acceleration and volumetric strain generated by the Parkfield 2004 earthquake on the GEOS strong-motion array near Parkfield, California

    Science.gov (United States)

    Borcherdt, Rodger D.; Johnston, Malcolm J.S.; Dietel, Christopher; Glassmoyer, Gary; Myren, Doug; Stephens, Christopher

    2004-01-01

    An integrated array of 11 General Earthquake Observation System (GEOS) stations installed near Parkfield, CA, provided on-scale, broadband, wide-dynamic-range measurements of acceleration and volumetric strain for the Parkfield earthquake (M 6.0) of September 28, 2004. Three-component measurements of acceleration were obtained at each of the stations. Measurements of collocated acceleration and volumetric strain were obtained at four of the stations. Measurements of velocity at most sites were on scale only for the initial P-wave arrival. When considered in the context of the extensive set of strong-motion recordings obtained on more than 40 analog stations by the California Strong-Motion Instrumentation Program (Shakal et al., 2004; http://www.quake.ca.gov/cisn-edc) and those on the dense array of Spudich et al. (1988), these recordings provide an unprecedented document of the nature of the near-source strong motion generated by a M 6.0 earthquake. The data set reported herein provides the most extensive set of near-field, broadband, wide-dynamic-range measurements of acceleration and volumetric strain for an earthquake as large as M 6 of which the authors are aware. As a result, considerable interest has been expressed in these data. This report is intended to describe the data and facilitate its use to resolve a number of scientific and engineering questions concerning earthquake rupture processes and resultant near-field motions and strains. This report provides a description of the array, its scientific objectives, and the strong-motion recordings obtained of the main shock. The report provides copies of the uncorrected and corrected data. Copies of the inferred velocities, displacements, and pseudo-velocity response spectra are provided.
Digital versions of these recordings are accessible with information available through the internet at several locations: the National Strong-Motion Program web site (http://agram.wr.usgs.gov/), the COSMOS Virtual Data Center Web site
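    The velocities and displacements inferred from acceleration records are obtained by time integration. A minimal cumulative-trapezoid sketch is below; real strong-motion processing also applies baseline correction and filtering, omitted here:

```python
def integrate_trapezoid(a, dt):
    """Cumulative trapezoidal integration of a uniformly sampled series:
    acceleration -> velocity, or velocity -> displacement. Returns a series
    of the same length, starting from zero."""
    v = [0.0]
    for a0, a1 in zip(a, a[1:]):
        v.append(v[-1] + 0.5 * (a0 + a1) * dt)
    return v
```

    Applying it twice (with baseline correction between passes) yields displacement from a corrected accelerogram.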

  3. Spatial-temporal variation of low-frequency earthquake bursts near Parkfield, California

    Science.gov (United States)

    Wu, Chunquan; Guyer, Robert; Shelly, David R.; Trugman, D.; Frank, William; Gomberg, Joan S.; Johnson, P.

    2015-01-01

    Tectonic tremor (TT) and low-frequency earthquakes (LFEs) have been found in the deeper crust of various tectonic environments globally in the last decade. The spatial-temporal behaviour of LFEs provides insight into deep fault zone processes. In this study, we examine recurrence times from a 12-yr catalogue of 88 LFE families with ∼730 000 LFEs in the vicinity of the Parkfield section of the San Andreas Fault (SAF) in central California. We apply an automatic burst detection algorithm to the LFE recurrence times to identify the clustering behaviour of LFEs (LFE bursts) in each family. We find that the burst behaviours of the northern and southern LFE groups differ. Generally, the northern group has longer burst durations but fewer LFEs per burst, while the southern group has shorter burst durations but more LFEs per burst. The southern group's LFE bursts are generally more correlated than the northern group's, suggesting more coherent deep fault slip and a relatively simpler deep fault structure beneath the locked section of the SAF. We also find that the 2004 Parkfield earthquake clearly increased the number of LFEs per burst and the average burst duration for both the northern and the southern groups, with a relatively larger effect on the northern group. This could be due to the weakness of the northern part of the fault, or to the northwesterly rupture direction of the Parkfield earthquake.
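    A burst detector over recurrence times can be as simple as grouping events whose interevent times fall below a gap threshold. The sketch below is our illustration of the idea, not the paper's algorithm:

```python
def detect_bursts(times, max_gap):
    """Group sorted event times into bursts: consecutive events separated
    by less than max_gap belong to the same burst. Returns a list of
    (start_time, end_time, event_count) for bursts with >= 2 events."""
    bursts = []
    start = prev = times[0]
    count = 1
    for t in times[1:]:
        if t - prev < max_gap:
            count += 1
        else:
            if count >= 2:
                bursts.append((start, prev, count))
            start, count = t, 1
        prev = t
    if count >= 2:
        bursts.append((start, prev, count))
    return bursts
```

    Burst duration (end minus start) and events-per-burst, the two quantities contrasted between the northern and southern groups, fall straight out of the returned tuples.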

  4. Varenna workshop report. Operational earthquake forecasting and decision making

    Directory of Open Access Journals (Sweden)

    Warner Marzocchi

    2015-09-01

    A workshop on operational earthquake forecasting and decision making was convened in Varenna, Italy, on June 8-11, 2014, under the sponsorship of the EU FP7 REAKT project (Strategies and tools for Real-time EArthquake risK reducTion), the Seismic Hazard Center at the Istituto Nazionale di Geofisica e Vulcanologia (INGV), and the Southern California Earthquake Center (SCEC). The main goal was to survey the interdisciplinary issues of operational earthquake forecasting (OEF), including the problems that OEF raises for decision making and risk communication. The workshop was attended by 64 researchers from universities, research centers, and governmental institutions in 11 countries. Participants and the workshop agenda are listed in the appendix. The workshop comprised six topical sessions structured around three main themes: the science of operational earthquake forecasting, decision making in a low-probability environment, and communicating hazard and risk. Each topic was introduced by a moderator and surveyed by a few invited speakers, who were then empaneled for an open discussion. The presentations were followed by poster sessions. During a wrap-up session on the last day, the reporters for each topical session summarized the main points that they had gleaned from the talks and open discussions. This report attempts to distill this workshop record into a brief overview of the workshop themes and to describe the range of opinions expressed during the discussions.

  5. Earthquake prediction in California using regression algorithms and cloud-based big data infrastructure

    Science.gov (United States)

    Asencio-Cortés, G.; Morales-Esteban, A.; Shang, X.; Martínez-Álvarez, F.

    2018-06-01

    Earthquake magnitude prediction is a challenging problem that has been widely studied during the last decades. Statistical, geophysical and machine learning approaches can be found in the literature, with no particularly satisfactory results. In recent years, powerful computational techniques to analyze big data have emerged, making the analysis of massive datasets possible. These new methods make use of physical resources like cloud-based architectures. California is known as one of the regions with the highest seismic activity in the world, and many data are available. In this work, the use of several regression algorithms combined with ensemble learning is explored in the context of big data (a 1 GB catalog is used), in order to predict earthquake magnitudes within the next seven days. The Apache Spark framework, the H2O library in the R language and Amazon cloud infrastructure were used, reporting very promising results.
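    The core idea of combining regressors by ensemble averaging can be sketched without any big-data machinery. Function names and data below are illustrative only; the paper's Spark/H2O setup is far richer:

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit y ~ a*x + b; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def ensemble_predict(models, x):
    """Average the predictions of several base regressors: the simplest
    form of ensemble learning (no stacking or weighting)."""
    preds = [f(x) for f in models]
    return sum(preds) / len(preds)
```

    In the magnitude-prediction setting, each base model would map seismicity features for a region to the largest expected magnitude over the next seven days.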

  6. Uniform California earthquake rupture forecast, version 3 (UCERF3): the time-independent model

    Science.gov (United States)

    Field, Edward H.; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David D.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin R.; Page, Morgan T.; Parsons, Thomas; Powers, Peter M.; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua; ,

    2013-01-01

    In this report we present the time-independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation assumptions and to include multifault ruptures, both limitations of the previous model (UCERF2). The rates of all earthquakes are solved for simultaneously, and from a broader range of data, using a system-level "grand inversion" that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (for example, magnitude-frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1,440 alternative logic tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of M≥5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip-rate constraints on faults previously excluded because of lack of geologic data. The grand inversion constitutes a system-level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg-Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (for example, constrained to stay close to UCERF2). Nevertheless, UCERF3 removes the apparent UCERF2 overprediction of

  7. TPMG Northern California appointments and advice call center.

    Science.gov (United States)

    Conolly, Patricia; Levine, Leslie; Amaral, Debra J; Fireman, Bruce H; Driscoll, Tom

    2005-08-01

    Kaiser Permanente (KP) has been developing its use of call centers as a way to provide an expansive set of healthcare services to KP members efficiently and cost effectively. Since 1995, when The Permanente Medical Group (TPMG) began to consolidate primary care phone services into three physical call centers, the TPMG Appointments and Advice Call Center (AACC) has become the "front office" for primary care services across approximately 89% of Northern California. The AACC provides primary care phone service for approximately 3 million Kaiser Foundation Health Plan members in Northern California and responds to approximately 1 million calls per month across the three AACC sites. A database records each caller's identity as well as the day, time, and duration of each call; reason for calling; services provided to callers as a result of calls; and clinical outcomes of calls. We here summarize this information for the period 2000 through 2003.

  8. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  9. Fluid-faulting interactions: Fracture-mesh and fault-valve behavior in the February 2014 Mammoth Mountain, California, earthquake swarm

    Science.gov (United States)

    Shelly, David R.; Taira, Taka’aki; Prejean, Stephanie; Hill, David P.; Dreger, Douglas S.

    2015-01-01

    Faulting and fluid transport in the subsurface are highly coupled processes, which may manifest seismically as earthquake swarms. A swarm in February 2014 beneath densely monitored Mammoth Mountain, California, provides an opportunity to witness these interactions in high resolution. Toward this goal, we employ massive waveform-correlation-based event detection and relative relocation, which quadruples the swarm catalog to more than 6000 earthquakes and produces high-precision locations even for very small events. The swarm's main seismic zone forms a distributed fracture mesh, with individual faults activated in short earthquake bursts. The largest event of the sequence, M 3.1, apparently acted as a fault valve and was followed by a distinct wave of earthquakes propagating ~1 km westward from the updip edge of rupture, 1–2 h later. Late in the swarm, multiple small, shallower subsidiary faults activated with pronounced hypocenter migration, suggesting that a broader fluid pressure pulse propagated through the subsurface.
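    Waveform-correlation-based detection of the kind used here rests on sliding normalized cross-correlation of a template event against continuous data. A bare-bones sketch (our illustration, not the study's pipeline):

```python
import math

def normalized_xcorr(template, signal):
    """Sliding normalized cross-correlation of a short template against a
    longer signal. Values near 1 mark candidate repeats of the template
    waveform; matched-filter detectors threshold this series."""
    m = len(template)
    tm = sum(template) / m
    t0 = [t - tm for t in template]
    tn = math.sqrt(sum(v * v for v in t0))
    out = []
    for i in range(len(signal) - m + 1):
        w = signal[i:i + m]
        wm = sum(w) / m
        w0 = [v - wm for v in w]
        wn = math.sqrt(sum(v * v for v in w0))
        denom = tn * wn
        out.append(sum(a * b for a, b in zip(t0, w0)) / denom if denom > 0 else 0.0)
    return out
```

    Running every cataloged event as a template over years of continuous data is what "quadruples the swarm catalog" in studies like this one.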

  10. Examining the Use of the Cloud for Seismic Data Centers

    Science.gov (United States)

    Yu, E.; Meisenhelter, S.; Clayton, R. W.

    2011-12-01

    The Southern California Earthquake Data Center (SCEDC) archives seismic and station sensor metadata related to earthquake activity in southern California. It currently archives nearly 8400 data streams continuously from over 420 stations in near real time, at a rate of 584 GB/month, to a repository approximately 18 TB in size. Triggered waveform data from an average of 12,000 earthquakes/year are also archived. Data are archived on mirrored disk arrays that are maintained and backed up locally. These data are served over the Internet to scientists and the general public in many countries. The data demand has a steady component, largely needed for ambient noise correlation studies, and an impulsive component that is driven by earthquake activity. Designing a reliable, cost-effective system architecture equipped to handle periods of relatively low steady demand punctuated by unpredictable sharp spikes in demand immediately following a felt earthquake remains a major challenge. To explore an alternative paradigm, we have put one month of the data in the "cloud" and have developed a user interface with the Google App Engine. The purpose is to assess the modifications in data structures that are necessary to make efficient searches. To date we have determined that the database schema must be "denormalized" to take advantage of the dynamic computational capabilities, and that it is likely advantageous to preprocess the waveform data to remove overlaps, gaps, and other artifacts. The final purpose of this study is to compare the cost of the cloud with that of ground-based centers. The major motivations for this study are the security and dynamic load capabilities of the cloud. In the cloud, multiple copies of the data are held in distributed centers, thus eliminating the single point of failure associated with one center. The cloud can dynamically increase the level of computational resources during an earthquake, and the major tasks of managing a disk farm are eliminated.
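    The schema "denormalization" mentioned above amounts to embedding joined fields directly in each record, so a key-value datastore can answer queries without joins. A toy illustration (station name and coordinate values below are ours, purely for demonstration):

```python
# Normalized form: waveform rows reference a station table by key,
# and a query needs a join to recover station coordinates.
stations = {"PAS": {"net": "CI", "lat": 34.148, "lon": -118.172}}
waveform_row = {"sta": "PAS", "chan": "BHZ", "start": "2011-07-01T00:00:00"}

def denormalize(row, stations):
    """Embed the joined station fields directly in each waveform record,
    trading storage for join-free lookups in a key-value datastore."""
    doc = dict(row)                 # copy; leave the normalized row intact
    doc.update(stations[row["sta"]])
    return doc
```

    The cost is duplicated station metadata in every waveform document, which is exactly the trade a join-free datastore asks you to make.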

  11. Effects of November 8, 1980 earthquake on Humboldt Bay Power Plant and Eureka, California area. Reconnaissance report 13 Nov-14 Nov 80

    International Nuclear Information System (INIS)

    Herring, K.S.; Rooney, V.; Chokshi, N.C.

    1981-06-01

    On November 8, 1980, an earthquake with a reported surface-wave magnitude of 7.0 occurred off the coast of California, west of Eureka and the Humboldt Bay Power Plant. Three NRC staff members visited the site the following week to survey any damage associated with the earthquake, with the objective of using the collected data to assist the NRR staff in ongoing seismic evaluations of older operating nuclear power plant facilities. This report contains their observations. They concluded that the effects of the earthquake on Humboldt Bay Power Plant Unit 3 were minimal and did not endanger the health and safety of the public. They recommended that improvements be made to seismic recording equipment and that generic preparation for future post-earthquake reconnaissance trips be made before the actual occurrence of earthquakes.

  12. Lessons Learned from Creating the Public Earthquake Resource Center at CERI

    Science.gov (United States)

    Patterson, G. L.; Michelle, D.; Johnston, A.

    2004-12-01

    The Center for Earthquake Research and Information (CERI) at the University of Memphis opened the Public Earthquake Resource Center (PERC) in May 2004. The PERC is an interactive display area that was designed to increase awareness of seismology, Earth science, earthquake hazards, and earthquake engineering among the general public and K-12 teachers and students. Funding for the PERC is provided by the US Geological Survey, the NSF-funded Mid-America Earthquake Center, and the University of Memphis, with input from the Incorporated Research Institutions for Seismology. Additional space at the facility houses local offices of the US Geological Survey. PERC exhibits are housed in a remodeled residential structure at CERI that was donated by the University of Memphis and the State of Tennessee. Exhibits were designed and built by CERI and US Geological Survey staff and faculty with the help of experienced museum display subcontractors. The 600-square-foot display area interactively introduces the basic concepts of seismology, real-time seismic information, seismic network operations, paleoseismology, building response, and historical earthquakes. Display components include three 22-inch flat-screen monitors, a touch-sensitive monitor, three helicorder elements, an oscilloscope, an AS-1 seismometer, a life-sized liquefaction trench, a liquefaction shake table, and a building-response shake table. All displays include custom graphics, text, and handouts. The PERC website at www.ceri.memphis.edu/perc also provides useful information such as tour scheduling, "ask a geologist," and links to other institutions, and will soon include a virtual tour of the facility. Special consideration was given to addressing State science standards for teaching and learning in the design of the displays and handouts. We feel this consideration is pivotal to the success of any grass-roots Earth science education and outreach program and represents a valuable lesson that has been learned at CERI over the last several

  13. Earthquake outlook for the San Francisco Bay region 2014–2043

    Science.gov (United States)

    Aagaard, Brad T.; Blair, James Luke; Boatwright, John; Garcia, Susan H.; Harris, Ruth A.; Michael, Andrew J.; Schwartz, David P.; DiLeo, Jeanne S.; Jacques, Kate; Donlin, Carolyn

    2016-06-13

    Using information from recent earthquakes, improved mapping of active faults, and a new model for estimating earthquake probabilities, the 2014 Working Group on California Earthquake Probabilities updated the 30-year earthquake forecast for California. They concluded that there is a 72 percent probability (or likelihood) of at least one earthquake of magnitude 6.7 or greater striking somewhere in the San Francisco Bay region before 2043. Earthquakes this large are capable of causing widespread damage; therefore, communities in the region should take simple steps to help reduce injuries, damage, and disruption, as well as accelerate recovery from these earthquakes.
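    If the forecast is read as an equivalent Poisson process (a simplification: the actual UCERF3 model is time-dependent), the 72 percent probability over the roughly 30-year window can be converted to a yearly rate and back:

```python
import math

def prob_at_least_one(rate_per_year, years):
    """Poisson probability of at least one event in the window:
    P = 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-rate_per_year * years)

def equivalent_rate(prob, years):
    """Invert the above: the constant yearly rate that would give the
    stated window probability. For P = 0.72 over 30 years this is
    -ln(0.28)/30, about 0.042 M>=6.7 events per year."""
    return -math.log(1.0 - prob) / years
```

    The round trip is exact, which makes the pair handy for sanity-checking published window probabilities against quoted rates.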

  14. Reply to “Comment on “Should Memphis build for California's earthquakes?” From A.D. Frankel”

    Science.gov (United States)

    Stein, Seth; Tomasello, Joseph; Newman, Andrew

    Carl Sagan observed that “extraordinary claims require extraordinary evidence.” In our view, A.D. Frankel's arguments (see accompanying Comment piece) do not reach the level required to demonstrate the counter-intuitive propositions that the earthquake hazard in the New Madrid Seismic Zone (NMSZ) is comparable to that in coastal California, and that buildings should be built to similar standards. This interchange is the latest in an ongoing debate beginning with Newman et al.'s [1999a] recommendation, based on analysis of Global Positioning System and earthquake data, that Frankel et al.'s [1996] estimate of California-level seismic hazard for the NMSZ should be reduced. Most points at issue, except for those related to the costs and benefits of the proposed new International Building Code 2000, have already been argued at length by both sides in the literature [e.g., Schweig et al., 1999; Newman et al., 1999b, 2001; Cramer, 2001]. Hence, rather than rehash these points, we will try here to provide readers not enmeshed in this morass with an overview of the primary differences between our view and that of Frankel.

  15. Studies of earthquakes stress drops, seismic scattering, and dynamic triggering in North America

    Science.gov (United States)

    Escudero Ayala, Christian Rene

    I use the Relative Source Time Function (RSTF) method to determine the source properties of earthquakes within southeastern Alaska-northwestern Canada in the first part of the project, and earthquakes within the Denali fault in the second part. I deconvolve a small event's P-arrival signal from a larger event's by the following method: select arrivals with a tapered cosine window, fast Fourier transform to obtain the spectrum, apply a water-level deconvolution technique, and bandpass filter before inverse transforming the result to obtain the RSTF. I compare the source processes of earthquakes within the area to determine stress-drop differences and their relation to the tectonic setting of the earthquake locations. Results are consistent with previous work: stress drop is independent of moment, implying self-similarity; stress drop correlates with tectonic regime; stress drop is independent of depth; stress drop depends on focal mechanism, with strike-slip events presenting larger stress drops; and stress drop decreases as a function of time. I determine seismic wave attenuation in the central western United States using coda waves. I select approximately 40 moderate earthquakes (magnitude between 5.5 and 6.5) located along the California-Baja California, California-Nevada, Eastern Idaho, Gulf of California, Hebgen Lake, Montana, Nevada, New Mexico, off coast of Northern California, off coast of Oregon, southern California, southern Illinois, Vancouver Island, Washington, and Wyoming regions. These events were recorded by the EarthScope transportable array (TA) network from 2005 to 2009. We obtain the data from the Incorporated Research Institutions for Seismology (IRIS). In this study we implement a method, based on the assumption that coda waves are single backscattered waves from randomly distributed heterogeneities, to calculate the coda Q. The frequencies studied lie between 1 and 15 Hz.
The scattering attenuation is calculated for frequency bands centered
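    Under the single-backscattering model, the coda-Q estimate reduces to a straight-line fit of ln(A·t) against lapse time, with slope -πf/Qc. A sketch with synthetic data (our illustration of the standard Aki-Chouet-style procedure, not the thesis code):

```python
import math

def coda_q(times, amps, freq):
    """Estimate coda Q at centre frequency freq (Hz) from envelope
    amplitudes A(t) at lapse times t, using the single-backscattering
    model A(t) ~ t^-1 * exp(-pi*f*t/Qc): a least-squares fit of
    ln(A*t) = const - (pi*f/Qc)*t."""
    ys = [math.log(a * t) for t, a in zip(times, amps)]
    n = len(times)
    mt, my = sum(times) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(times, ys))
             / sum((t - mt) ** 2 for t in times))
    return -math.pi * freq / slope
```

    In practice the amplitudes come from smoothed, bandpass-filtered coda envelopes, and the fit window starts at roughly twice the S-wave travel time.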

  16. The Non-Regularity of Earthquake Recurrence in California: Lessons From Long Paleoseismic Records in Simple vs Complex Fault Regions (Invited)

    Science.gov (United States)

    Rockwell, T. K.

    2010-12-01

    A long paleoseismic record at Hog Lake on the central San Jacinto fault (SJF) in southern California documents evidence for 18 surface ruptures in the past 3.8-4 ka. This yields a long-term recurrence interval of about 210 years, consistent with its slip rate of ~16 mm/yr and field observations of 3-4 m of displacement per event. However, during the past 3800 years, the fault has switched from a quasi-periodic mode of earthquake production, during which the recurrence interval is similar to the long-term average, to clustered behavior with inter-event periods as short as a few decades. There are also some periods as long as 450 years during which there were no surface ruptures, and these periods are commonly followed by one to several closely timed ruptures. The coefficient of variation (CV) for the timing of these earthquakes is about 0.6 for the past 4000 years (17 intervals). Similar behavior has been observed on the San Andreas Fault (SAF) south of the Transverse Ranges, where clusters of earthquakes have been followed by periods of lower seismic production, and the CV is as high as 0.7 for some portions of the fault. In contrast, the central North Anatolian Fault (NAF) in Turkey, which ruptured in 1944, appears to have produced ruptures with similar displacement at fairly regular intervals for the past 1600 years. With a CV of 0.16 for timing, and close to 0.1 for displacement, the 1944 rupture segment near Gerede appears to have been both periodic and characteristic. The SJF and SAF are part of a broad plate boundary system with multiple parallel strands with significant slip rates. Additional faults lie to the east (Eastern California shear zone) and west (faults of the LA basin and southern California Borderland), which makes the southern SAF system a complex and broad plate boundary zone. In comparison, the 1944 rupture section of the NAF is simple, straight and highly localized, which contrasts with the complex system of parallel faults in southern
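    The coefficient of variation quoted above is simply the standard deviation of the interevent times divided by their mean; a small helper makes the periodic-versus-clustered contrast concrete (event times below are invented for illustration):

```python
import math

def recurrence_cv(event_times):
    """Coefficient of variation of interevent times for a sorted list of
    event times: CV ~ 0 for clock-like (periodic) rupture, ~ 1 for a
    Poisson process, and > 1 for strongly clustered behavior."""
    gaps = [b - a for a, b in zip(event_times, event_times[1:])]
    n = len(gaps)
    mean = sum(gaps) / n
    var = sum((g - mean) ** 2 for g in gaps) / n
    return math.sqrt(var) / mean
```

    A perfectly periodic record with 210-year gaps gives CV = 0, while a cluster of events followed by a long hiatus pushes CV well above 1, the regime described for Hog Lake.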

  17. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992), and fatality rates are likely to continue to rise with the increased population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that large fractions of the world's population still reside in informal, poorly constructed, non-engineered dwellings that have high susceptibility to collapse during earthquakes. Moreover, with increasing urbanization, half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use for other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the
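    The three-step estimate can be sketched as a sum over building types of exposure × collapse probability × fatality rate given collapse. Every number and type label below is illustrative, not a calibrated PAGER value:

```python
def expected_fatalities(exposure_by_type, collapse_prob, fatality_rate):
    """Expected fatalities for one shaking level: for each building type,
    multiply (1) the exposed population, (2) the shaking-dependent collapse
    probability, and (3) the fatality rate given collapse, then sum.
    Real loss models calibrate (2) and (3) empirically per region."""
    return sum(pop * collapse_prob[bt] * fatality_rate[bt]
               for bt, pop in exposure_by_type.items())
```

    In a full model this sum is evaluated per shaking-intensity bin from the hazard map and then accumulated, which is why the building inventory is the critical missing input the abstract describes.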

  18. The pathway to earthquake early warning in the US

    Science.gov (United States)

    Allen, R. M.; Given, D. D.; Heaton, T. H.; Vidale, J. E.; West Coast Earthquake Early Warning Development Team

    2013-05-01

    The development of earthquake early warning capabilities in the United States is now accelerating and expanding as the technical capability to provide warning is demonstrated and as additional funding makes it possible to expand the current testing region to the entire west coast (California, Oregon, and Washington). Over the course of the next two years we plan to build a prototype system that will provide a blueprint for a full public system in the US. California currently has a demonstration warning system, ShakeAlert, that provides alerts to a group of test users from the public and private sectors. These include biotech companies, technology companies, the entertainment industry, the transportation sector, and the emergency planning and response community. Most groups are currently in an evaluation mode, receiving the alerts and developing protocols for future response. The Bay Area Rapid Transit (BART) system is the first of these groups to have implemented an automated response to the warning system: BART now stops trains when an earthquake of sufficient size is detected. Research and development also continue on improved early warning algorithms to better predict the distribution of shaking in large earthquakes when the finiteness of the source becomes important. The algorithms under development include the use of both seismic and GPS instrumentation and integration with existing point-source algorithms. At the same time, initial testing and development of algorithms in and for the Pacific Northwest is underway. In this presentation we will review the current status of the systems, highlight the new research developments, and lay out a pathway to a full public system for the US west coast.
The research and development described is ongoing at Caltech, UC Berkeley, University of Washington, ETH Zurich, Southern California Earthquake Center, and the US Geological Survey, and is funded by the Gordon and Betty Moore Foundation and the US Geological
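
    The value of such a system comes down to a race between the alert and the damaging S-wave. A back-of-the-envelope sketch, with wave speeds, detection distance, and latency as rough assumptions rather than actual ShakeAlert parameters:

```python
# Back-of-the-envelope warning time: the alert must beat the S-wave.
# Wave speeds, detection distance, and latency below are rough
# illustrative assumptions, not actual ShakeAlert parameters.

VP, VS = 6.0, 3.5      # km/s: typical crustal P- and S-wave speeds
detect_dist = 20.0     # km: epicenter to the nearest stations
latency = 4.0          # s: picking, association, and alert delivery

def warning_time(city_dist_km: float) -> float:
    """Seconds of warning before the S-wave reaches the city."""
    alert_time = detect_dist / VP + latency
    return city_dist_km / VS - alert_time

for d in (40, 80, 160):
    print(f"{d:4d} km from epicenter: {warning_time(d):5.1f} s of warning")
```

    The sketch shows why warning times grow with distance and why dense station coverage and low latency matter most for users close to the epicenter.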

  19. Monitoring reservoir response to earthquakes and fluid extraction, Salton Sea geothermal field, California

    Science.gov (United States)

    Taira, Taka’aki; Nayak, Avinash; Brenguier, Florent; Manga, Michael

    2018-01-01

    Continuous monitoring of in situ reservoir responses to stress transients provides insights into the evolution of geothermal reservoirs. By exploiting the stress dependence of seismic velocity changes, we investigate the temporal evolution of the reservoir stress state of the Salton Sea geothermal field (SSGF), California. We find that the SSGF experienced a number of sudden velocity reductions (~0.035 to 0.25%) that are most likely caused by openings of fractures due to dynamic stress transients (as small as 0.08 MPa and up to 0.45 MPa) from local and regional earthquakes. Depths of velocity changes are estimated to be about 0.5 to 1.5 km, similar to the depths of the injection and production wells. We derive an empirical in situ stress sensitivity of seismic velocity changes by relating velocity changes to dynamic stresses. We also observe systematic velocity reductions (0.04 to 0.05%) during earthquake swarms in mid-November 2009 and late-December 2010. On the basis of volumetric static and dynamic stress changes, the expected velocity reductions from the largest earthquakes with magnitude ranging from 3 to 4 in these swarms are less than 0.02%, which suggests that these earthquakes are likely not responsible for the velocity changes observed during the swarms. Instead, we argue that velocity reductions may have been induced by poroelastic opening of fractures due to aseismic deformation. We also observe a long-term velocity increase (~0.04%/year) that is most likely due to poroelastic contraction caused by the geothermal production. Our observations demonstrate that seismic interferometry provides insights into in situ reservoir response to stress changes. PMID:29326977
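
    One common way to measure such small velocity changes with seismic interferometry is the coda-wave "stretching" method: a uniform velocity change dv/v rescales the coda's time axis by the same relative amount. A minimal synthetic sketch (invented waveform, with an imposed 0.3% velocity drop; this is not the authors' exact processing):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(5.0, 30.0, 2001)                  # coda lapse-time window (s)
ref = np.convolve(rng.standard_normal(t.size),
                  np.ones(25) / 25, mode="same")  # smooth synthetic "coda"

true_dvv = -0.003                                 # imposed 0.3% velocity drop
cur = np.interp(t * (1.0 + true_dvv), t, ref)     # slower medium -> stretched coda

# Grid search over stretch factors: the best-correlating stretch of the
# reference trace against the current trace estimates dv/v.
trials = np.linspace(-0.01, 0.01, 1001)
cc = [np.corrcoef(np.interp(t * (1.0 + e), t, ref), cur)[0, 1] for e in trials]
dvv_est = trials[int(np.argmax(cc))]
print(f"estimated dv/v = {dvv_est:.4f}")
```

    Because the time shift grows linearly with lapse time, late coda windows make sub-percent velocity changes measurable even when individual arrival-time shifts are a fraction of a sample.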

  1. Experimental study of structural response to earthquakes

    International Nuclear Information System (INIS)

    Clough, R.W.; Bertero, V.V.; Bouwkamp, J.G.; Popov, E.P.

    1975-01-01

    The objectives, methods, and some of the principal results obtained from experimental studies of the behavior of structures subjected to earthquakes are described. Although such investigations are being conducted in many laboratories throughout the world, the information presented deals specifically with projects being carried out at the Earthquake Engineering Research Center (EERC) of the University of California, Berkeley. A primary purpose of these investigations is to obtain detailed information on the inelastic response mechanisms in typical structural systems so that the experimentally observed performance can be compared with computer-generated analytical predictions. Only by such comparisons can the mathematical models used in dynamic nonlinear analyses be verified and improved. Two experimental procedures for investigating earthquake structural response are discussed: the earthquake simulator facility, which subjects the base of the test structure to acceleration histories similar to those recorded in actual earthquakes, and systems of hydraulic rams, which impose specified displacement histories on the test components, equivalent to motions developed in structures subjected to actual earthquakes. The general concept and performance of the 20-ft-square EERC earthquake simulator is described, and the testing of a two-story concrete frame building is outlined. Correlation of the experimental results with analytical predictions demonstrates that satisfactory agreement can be obtained only if the mathematical model incorporates a stiffness deterioration mechanism which simulates the cracking and other damage suffered by the structure
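
    The stiffness-deterioration point can be illustrated with a toy single-degree-of-freedom oscillator whose stiffness is reduced after each yield excursion. This is a crude, hypothetical sketch, not a model of the EERC test structure; all parameters are invented:

```python
import numpy as np

# Toy degrading-stiffness oscillator: once the restoring force reaches
# yield, the tangent stiffness is cut, mimicking cracking damage.

m, k0, fy = 1.0, 100.0, 1.5       # mass, initial stiffness, yield force
c, dt = 0.6, 0.005                # viscous damping, time step (s)
t = np.arange(0.0, 10.0, dt)
ag = 3.0 * np.sin(2 * np.pi * t) * np.exp(-0.3 * t)   # decaying input motion

u, v, fs, k = 0.0, 0.0, 0.0, k0
history = []
for a in ag:
    acc = -a - (fs + c * v) / m      # equation of motion
    v += acc * dt                    # semi-implicit Euler update
    u += v * dt
    fs_trial = fs + k * v * dt       # incremental restoring force
    if abs(fs_trial) > fy:           # yielding: cap the force and...
        fs = np.sign(fs_trial) * fy
        k = max(0.5 * k, 0.2 * k0)   # ...degrade the stiffness (floored)
    else:
        fs = fs_trial
    history.append(u)

print(f"peak displacement: {max(map(abs, history)):.4f}")
```

    As the stiffness degrades, the effective period lengthens and residual displacement accumulates, which is why an elastic model cannot reproduce the measured response of a cracked specimen.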

  2. Survey of strong motion earthquake effects on thermal power plants in California with emphasis on piping systems. Volume 1, Main report

    International Nuclear Information System (INIS)

    Stevenson, J.D.

    1995-11-01

    Since 1982, there has been a major effort expended to evaluate the susceptibility of nuclear power plant equipment to failure and significant damage during seismic events. This was done by making use of data on the performance of electrical and mechanical equipment in conventional power plants and other similar industrial facilities during strong motion earthquakes. This report is intended as an extension of the seismic experience data collection effort and a compilation of experience data specific to power plant piping and supports, designed and constructed to US power piping code requirements, which have experienced strong motion earthquakes. Eight damaging California earthquakes (Richter magnitude 5.5 to 7.7) and their effects on eight power generating facilities in California were reviewed. All of these facilities were visited and evaluated. Seven fossil-fueled (dual use, natural gas and oil) plants and one nuclear-fueled plant, consisting of a total of 36 individual boiler or reactor units, were investigated. Peak horizontal ground accelerations that either had been recorded on site at these facilities or were considered applicable to these power plants on the basis of nearby recordings ranged between 0.20g and 0.51g, with strong motion durations which varied from 3.5 to 15 seconds. Most US nuclear power plants are designed for a safe shutdown earthquake peak ground acceleration equal to 0.20g or less, with strong motion durations which vary from 10 to 15 seconds

  3. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) changes appreciably after the time series is rearranged. This suggests that SOC theory should not be used to argue against efforts at earthquake prediction
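
    The shuffling test can be sketched on a synthetic clustered catalog. The statistic below (coefficient of variation of waiting times between large events) is a simple stand-in for the paper's P_M(T), and all catalog parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic clustered catalog: Poissonian mainshocks, each followed by
# a burst of lower-magnitude aftershocks.
mains = np.cumsum(rng.exponential(100.0, size=500))
afts = np.concatenate([m + rng.exponential(2.0, size=10) for m in mains])
times = np.concatenate([mains, afts])
mags = np.concatenate([rng.uniform(5.0, 7.0, mains.size),
                       rng.uniform(2.0, 4.0, afts.size)])
order = np.argsort(times)
times, mags = times[order], mags[order]

def return_time_cv(mags, m_min=5.0):
    """Coefficient of variation of waiting times between events >= m_min."""
    gaps = np.diff(times[mags >= m_min])
    return gaps.std() / gaps.mean()

cv_real = return_time_cv(mags)
cv_shuffled = return_time_cv(rng.permutation(mags))  # rearranged catalog
print(f"CV real: {cv_real:.2f}  CV shuffled: {cv_shuffled:.2f}")
```

    If the catalog's ordering carried no information, the two statistics would agree; here the magnitude-conditioned waiting times change under shuffling because magnitudes and temporal clustering are correlated, which is the kind of signature the authors exploit.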

  4. Broadband records of earthquakes in deep gold mines and a comparison with results from SAFOD, California

    Science.gov (United States)

    McGarr, Arthur F.; Boettcher, M.; Fletcher, Jon Peter B.; Sell, Russell; Johnston, Malcolm J.; Durrheim, R.; Spottiswoode, S.; Milev, A.

    2009-01-01

    For one week during September 2007, we deployed a temporary network of field recorders and accelerometers at four sites within two deep, seismically active mines. The ground-motion data, recorded at 200 samples/sec, are well suited to determining source and ground-motion parameters for the mining-induced earthquakes within and adjacent to our network. Four earthquakes with magnitudes close to 2 were recorded with high signal/noise at all four sites. Analysis of seismic moments and peak velocities, in conjunction with the results of laboratory stick-slip friction experiments, was used to estimate source processes that are key to understanding source physics and to assessing underground seismic hazard. The maximum displacements on the rupture surfaces can be estimated from the parameter Rv, where v is the peak ground velocity at a given recording site and R is the hypocentral distance. For each earthquake, the maximum slip and seismic moment can be combined with results from laboratory friction experiments to estimate the maximum slip rate within the rupture zone. Analysis of the four M 2 earthquakes recorded during our deployment and one of special interest recorded by the in-mine seismic network in 2004 revealed maximum slips ranging from 4 to 27 mm and maximum slip rates from 1.1 to 6.3 m/sec. Applying the same analyses to an M 2.1 earthquake within a cluster of repeating earthquakes near the San Andreas Fault Observatory at Depth site, California, yielded similar results for maximum slip and slip rate, 14 mm and 4.0 m/sec.

  5. Multifractal Omori law for earthquake triggering: new tests on the California, Japan and worldwide catalogues

    Science.gov (United States)

    Ouillon, G.; Sornette, D.; Ribeiro, E.

    2009-07-01

    The Multifractal Stress-Activated model is a statistical model of triggered seismicity based on mechanical and thermodynamic principles. It predicts that, above a triggering magnitude cut-off M0, the exponent p of the Omori law for the time decay of the rate of aftershocks is a linearly increasing function p(M) = a0M + b0 of the main shock magnitude M. We previously reported empirical support for this prediction using the Southern California Earthquake Center (SCEC) catalogue. Here, we confirm this observation using an updated, longer version of the same catalogue, as well as new methods to estimate p. One of these methods is the newly defined Scaling Function Analysis (SFA), adapted from the wavelet transform. This method is able to measure a mathematical singularity (hence a p-value) while erasing the possible regular part of a time series. The SFA also proves particularly efficient at revealing the coexistence and superposition of several types of relaxation laws (typical Omori sequences and short-lived swarm sequences) that can be mixed within the same catalogue. Another new method consists of monitoring the largest aftershock magnitude observed in successive time intervals, which circumvents the problem of missing small-magnitude events in aftershock catalogues. The same methods are applied to data from the worldwide Harvard Centroid Moment Tensor (CMT) catalogue and show results compatible with those for Southern California. For the Japan Meteorological Agency (JMA) catalogue, we still observe a linear dependence of p on M, but with a smaller slope. The SFA shows, however, that results for this catalogue may be biased by numerous swarm sequences, despite our efforts to remove them before the analysis.
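
    The predicted magnitude dependence p(M) = a0M + b0 can be illustrated by simulating Omori-law sequences and recovering p from binned rates; a0, b0, and the productivity constants below are invented, and this simple log-log fit is not the paper's SFA method:

```python
import numpy as np

rng = np.random.default_rng(2)
a0, b0, c = 0.1, 0.3, 0.1   # invented constants in p(M) = a0*M + b0

def omori_times(p, k=200.0, c=0.1, t_max=1000.0):
    """Sample an inhomogeneous Poisson process with rate k/(c+t)^p, p < 1."""
    q = 1.0 - p
    n = rng.poisson(k * ((c + t_max)**q - c**q) / q)  # expected total count
    u = rng.random(n)
    return np.sort((c**q + u * ((c + t_max)**q - c**q))**(1.0 / q) - c)

results = {}
for M in (4.0, 5.0, 6.0):
    p_true = a0 * M + b0
    ts = omori_times(p_true)
    edges = np.logspace(-1, 3, 30)            # log-spaced time bins
    counts, _ = np.histogram(ts, edges)
    rates = counts / np.diff(edges)
    mid = np.sqrt(edges[:-1] * edges[1:])     # geometric bin centers
    ok = counts > 5
    slope, _ = np.polyfit(np.log(mid[ok] + c), np.log(rates[ok]), 1)
    results[M] = (p_true, -slope)
    print(f"M={M}: p_true={p_true:.2f}, p_est={-slope:.2f}")
```

    The inverse-CDF sampling works because the Omori rate integrates in closed form for p < 1, and the recovered exponents track the imposed linear trend in M.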

  6. Survey of strong motion earthquake effects on thermal power plants in California with emphasis on piping systems. Volume 2, Appendices

    International Nuclear Information System (INIS)

    Stevenson, J.D.

    1995-11-01

    Volume 2 of the ''Survey of Strong Motion Earthquake Effects on Thermal Power Plants in California with Emphasis on Piping Systems'' contains appendices which detail the design and seismic response of several power plants subjected to strong motion earthquakes. The particular plants considered include the Ormond Beach, Long Beach and Seal Beach, Burbank, El Centro, Glendale, Humboldt Bay, Kern Valley, Pasadena, and Valley power plants. Included are a typical power plant piping specification and photographs of typical piping and support installations for the plants surveyed. Detailed piping support spacing data are also included

  7. Collaboratory for the Study of Earthquake Predictability

    Science.gov (United States)

    Schorlemmer, D.; Jordan, T. H.; Zechar, J. D.; Gerstenberger, M. C.; Wiemer, S.; Maechling, P. J.

    2006-12-01

    Earthquake prediction is one of the most difficult problems in physical science and, owing to its societal implications, one of the most controversial. The study of earthquake predictability has been impeded by the lack of an adequate experimental infrastructure---the capability to conduct scientific prediction experiments under rigorous, controlled conditions and evaluate them using accepted criteria specified in advance. To remedy this deficiency, the Southern California Earthquake Center (SCEC) is working with its international partners, which include the European Union (through the Swiss Seismological Service) and New Zealand (through GNS Science), to develop a virtual, distributed laboratory with a cyberinfrastructure adequate to support a global program of research on earthquake predictability. This Collaboratory for the Study of Earthquake Predictability (CSEP) will extend the testing activities of SCEC's Working Group on Regional Earthquake Likelihood Models, from which we will present first results. CSEP will support rigorous procedures for registering prediction experiments on regional and global scales, community-endorsed standards for assessing probability-based and alarm-based predictions, access to authorized data sets and monitoring products from designated natural laboratories, and software to allow researchers to participate in prediction experiments. CSEP will encourage research on earthquake predictability by supporting an environment for scientific prediction experiments that allows the predictive skill of proposed algorithms to be rigorously compared with standardized reference methods and data sets. It will thereby reduce the controversies surrounding earthquake prediction, and it will allow the results of prediction experiments to be communicated to the scientific community, governmental agencies, and the general public in an appropriate research context.
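
    A minimal example of the kind of standardized evaluation CSEP supports is the Poisson "N-test" on a forecast's expected event count; the forecast rate and observed count below are invented:

```python
from math import exp

# Sketch of an N-test: is the observed number of earthquakes consistent
# with a forecast's expected count under a Poisson assumption?

def poisson_cdf(k: int, lam: float) -> float:
    """P(X <= k) for X ~ Poisson(lam), by summing terms recursively."""
    term = total = exp(-lam)
    for i in range(1, k + 1):
        term *= lam / i
        total += term
    return total

forecast_rate = 8.5    # expected events in the testing period (invented)
observed = 14          # events actually observed (invented)

delta1 = 1.0 - poisson_cdf(observed - 1, forecast_rate)  # P(X >= observed)
delta2 = poisson_cdf(observed, forecast_rate)            # P(X <= observed)
print(f"P(X >= {observed}) = {delta1:.3f}, P(X <= {observed}) = {delta2:.3f}")
# a very small delta1 suggests under-prediction; small delta2, over-prediction
```

    Registering the forecast rate before the testing period begins is what makes such a tail probability a legitimate, prospective consistency test rather than a retrospective fit.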

  8. Satellite Geodetic Constraints On Earthquake Processes: Implications of the 1999 Turkish Earthquakes for Fault Mechanics and Seismic Hazards on the San Andreas Fault

    Science.gov (United States)

    Reilinger, Robert

    2005-01-01

    Our principal activities during the initial phase of this project include: 1) Continued monitoring of postseismic deformation for the 1999 Izmit and Duzce, Turkey earthquakes from repeated GPS survey measurements and expansion of the Marmara Continuous GPS Network (MAGNET), 2) Establishing three North Anatolian fault crossing profiles (10 sitedprofile) at locations that experienced major surface-fault earthquakes at different times in the past to examine strain accumulation as a function of time in the earthquake cycle (2004), 3) Repeat observations of selected sites in the fault-crossing profiles (2005), 4) Repeat surveys of the Marmara GPS network to continue to monitor postseismic deformation, 5) Refining block models for the Marmara Sea seismic gap area to better understand earthquake hazards in the Greater Istanbul area, 6) Continuing development of models for afterslip and distributed viscoelastic deformation for the earthquake cycle. We are keeping close contact with MIT colleagues (Brad Hager, and Eric Hetland) who are developing models for S. California and for the earthquake cycle in general (Hetland, 2006). In addition, our Turkish partners at the Marmara Research Center have undertaken repeat, micro-gravity measurements at the MAGNET sites and have provided us estimates of gravity change during the period 2003 - 2005.

  9. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California (ver. 2.0, January 2018)

    Science.gov (United States)

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-06-30

    As part of the U.S. Geological Survey’s (USGS) multi-hazards project in the Long Valley Caldera-Mono Lake area, the California Geological Survey (CGS) developed several earthquake scenarios and evaluated potential seismic hazards, including ground shaking, surface fault rupture, liquefaction, and landslide hazards associated with these earthquake scenarios. The results of these analyses can be useful in estimating the extent of potential damage and economic losses from potential earthquakes and in preparing emergency response plans. The Long Valley Caldera-Mono Lake area has numerous active faults. Five of these faults or fault zones are considered capable of producing magnitude ≥6.7 earthquakes according to the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2), developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP) and the USGS National Seismic Hazard Mapping Program. These five faults are the Fish Slough, Hartley Springs, Hilton Creek, Mono Lake, and Round Valley Faults. CGS developed earthquake scenarios for these five faults in the study area and for the White Mountains Fault Zone to the east of the study area. In this report, an earthquake scenario is intended to depict the potential consequences of significant earthquakes. A scenario earthquake is not necessarily the largest or most damaging earthquake possible on a recognized fault. Rather, it is both large enough and likely enough that emergency planners should consider it in regional emergency response plans. In particular, the ground motion predicted for a given scenario earthquake does not represent a full probabilistic hazard assessment, and thus it does not provide the basis for hazard zoning and earthquake-resistant building design. Earthquake scenarios presented here are based on fault geometry and activity data developed by the WGCEP, and are consistent with the 2008 Update of the United States National Seismic Hazard Maps (NSHM). Alternatives

  10. The Road to Total Earthquake Safety

    Science.gov (United States)

    Frohlich, Cliff

    Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.

  11. Long‐term time‐dependent probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3)

    Science.gov (United States)

    Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua

    2015-01-01

    The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for unsegmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too short to produce M≥6.7 events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region, and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the
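
    The "implied gain" relative to a Poisson model can be sketched with a simple renewal calculation. UCERF3 itself uses Brownian Passage Time distributions and many additional ingredients; the sketch below substitutes a lognormal renewal model for simplicity, and every number is invented:

```python
from math import erf, exp, log, sqrt

# Conditional probability of rupture in the next 30 years, given the
# elapsed open interval, under a lognormal renewal model.

mean_ri = 150.0   # mean recurrence interval, years (invented)
aper = 0.5        # aperiodicity (coefficient of variation, invented)
open_int = 140.0  # years since the last event (invented)
horizon = 30.0

# lognormal parameters matching the given mean and CV
sigma = sqrt(log(1 + aper**2))
mu = log(mean_ri) - 0.5 * sigma**2

def lognorm_cdf(t: float) -> float:
    return 0.5 * (1 + erf((log(t) - mu) / (sigma * sqrt(2))))

def cond_prob(t0: float, dt: float) -> float:
    """P(event in (t0, t0+dt] | no event by t0)."""
    return (lognorm_cdf(t0 + dt) - lognorm_cdf(t0)) / (1 - lognorm_cdf(t0))

poisson = 1 - exp(-horizon / mean_ri)
print(f"Poisson 30-yr: {poisson:.1%}, renewal 30-yr: {cond_prob(open_int, horizon):.1%}")
```

    With the fault most of the way through its average cycle, the renewal probability exceeds the Poisson value, which is the elastic-rebound "gain" the abstract refers to; early in the cycle the same model would imply a gain below one.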

  12. EFFECTS OF THE 1983 COALINGA, CALIFORNIA, EARTHQUAKE ON CREEP ALONG THE SAN ANDREAS FAULT.

    Science.gov (United States)

    Mavko, Gerald M.; Schulz, Sandra; Brown, Beth D.

    1985-01-01

    The ML ~6.5 earthquake that occurred near Coalinga, California, on May 2, 1983, induced changes in near-surface fault slip along the San Andreas fault. Coseismic steps were observed by creepmeters along a 200-km section of the San Andreas. Some of the larger aftershocks induced additional steps, both right-lateral and left-lateral, and in general the sequence disrupted the long-term preseismic creep patterns observed at several sites. Static dislocation models can approximately explain the magnitudes and distribution of the larger coseismic steps on May 2. The smaller, more distant steps appear to be the abrupt release of accumulated slip, triggered by the coseismic strain changes but independent of the strain change amplitudes.

  13. Real-Time Data Processing Systems and Products at the Alaska Earthquake Information Center

    Science.gov (United States)

    Ruppert, N. A.; Hansen, R. A.

    2007-05-01

    The Alaska Earthquake Information Center (AEIC) receives data from over 400 seismic sites located within the state boundaries and the surrounding regions and serves as a regional data center. In 2007, the AEIC reported ~20,000 seismic events, the largest being an M6.6 event in the Andreanof Islands. The real-time earthquake detection and data processing systems at AEIC are based on the Antelope system from BRTT, Inc. This modular and extensible processing platform allows an integrated system complete from data acquisition to catalog production. Multiple additional modules constructed with the Antelope toolbox have been developed to fit the particular needs of the AEIC. Real-time earthquake locations and magnitudes are determined within 2-5 minutes of event occurrence. AEIC maintains a 24/7 seismologist-on-duty schedule, and earthquake alarms are based on the real-time earthquake detections. Significant events are reviewed by the seismologist on duty within 30 minutes of occurrence, with information releases issued for significant events. This information is disseminated immediately via the AEIC website, via the ANSS website through QDDS submissions, and through e-mail, cell phone, and pager notifications, fax broadcasts, and recorded voice-mail messages. In addition, automatic regional moment tensors are determined for events with M>=4.0. This information is posted on the public website. ShakeMaps are calculated in real time, with the information currently accessible via a password-protected website. AEIC is designing an alarm system targeted at critical lifeline operations in Alaska. AEIC maintains an extensive computer network to provide adequate support for data processing and archival. For real-time processing, AEIC operates two identical, interoperable computer systems in parallel.

  14. A comparison among observations and earthquake simulator results for the allcal2 California fault model

    Science.gov (United States)

    Tullis, Terry. E.; Richards-Dinger, Keith B.; Barall, Michael; Dieterich, James H.; Field, Edward H.; Heien, Eric M.; Kellogg, Louise; Pollitz, Fred F.; Rundle, John B.; Sachs, Michael K.; Turcotte, Donald L.; Ward, Steven N.; Yikilmaz, M. Burak

    2012-01-01

    In order to understand earthquake hazards we would ideally have a statistical description of earthquakes for tens of thousands of years. Unfortunately, the ∼100‐year instrumental, several-hundred-year historical, and few-thousand-year paleoseismological records are woefully inadequate to provide a statistically significant record. Physics‐based earthquake simulators can generate arbitrarily long histories of earthquakes; thus they can provide a statistically meaningful history of simulated earthquakes. The question is, how realistic are these simulated histories? The purpose of this paper is to begin to answer that question. We compare the results between different simulators and with information that is known from the limited instrumental, historic, and paleoseismological data. As expected, the results from all the simulators show that the observational record is too short to properly represent the system behavior; therefore, although tests of the simulators against the limited observations are necessary, they are not a sufficient test of the simulators’ realism. The simulators appear to pass this necessary test. In addition, the physics‐based simulators show similar behavior even though there are large differences in the methodology. This suggests that they represent realistic behavior. Different assumptions concerning the constitutive properties of the faults do result in enhanced capabilities of some simulators. However, it appears that the similar behavior of the different simulators may result from the fault‐system geometry, slip rates, and assumed strength drops, along with the shared physics of stress transfer. This paper describes the results of running four earthquake simulators that are described elsewhere in this issue of Seismological Research Letters. The simulators ALLCAL (Ward, 2012), VIRTCAL (Sachs et al., 2012), RSQSim (Richards‐Dinger and Dieterich, 2012), and ViscoSim (Pollitz, 2012) were run on our most recent all‐California fault

  15. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of
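
    The 33-percent-in-30-years figure can be re-expressed as an equivalent annual rate under a simple Poisson assumption (a simplification of the WGCEP's actual time-dependent calculation):

```python
from math import exp, log

# Convert a 30-year probability to an equivalent Poisson annual rate,
# then project that rate onto other time horizons.

p30 = 0.33
annual_rate = -log(1 - p30) / 30          # events per year
print(f"annual rate ~ {annual_rate:.4f} (about 1 in {1 / annual_rate:.0f} years)")

for yrs in (1, 10, 50):
    print(f"{yrs:2d}-yr probability: {1 - exp(-annual_rate * yrs):.1%}")
```

    The conversion uses P(T years) = 1 - exp(-rate * T); note that probabilities over different horizons do not scale linearly, which is why "33 percent in 30 years" cannot simply be divided by 30.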

  16. Environmental Assessment for the California Space Center at Vandenberg Air Force Base, California

    Science.gov (United States)

    2010-04-08

    shallow-rooted, mesophilic plant species that Chapter 3. Affected Environment Final Draft Environmental Assessment - California Space Center...buckwheat flowers and buds where the larvae feed until maturation. Upon maturation, larvae burrow into the soil and pupate, usually within the root and...terrain, sharp or protruding objects, slippery soils or mud, and biological hazards including vegetation (i.e., poison oak and stinging nettle

  17. An overview of the National Earthquake Information Center acquisition software system, Edge/Continuous Waveform Buffer

    Science.gov (United States)

    Patton, John M.; Ketchum, David C.; Guy, Michelle R.

    2015-11-02

    This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.

  18. Earthquakes-Rattling the Earth's Plumbing System

    Science.gov (United States)

    Sneed, Michelle; Galloway, Devin L.; Cunningham, William L.

    2003-01-01

    Hydrogeologic responses to earthquakes have been known for decades, and have occurred both close to, and thousands of miles from earthquake epicenters. Water wells have become turbid, dry or begun flowing, discharge of springs and ground water to streams has increased and new springs have formed, and well and surface-water quality have become degraded as a result of earthquakes. Earthquakes affect our Earth’s intricate plumbing system—whether you live near the notoriously active San Andreas Fault in California, or far from active faults in Florida, an earthquake near or far can affect you and the water resources you depend on.

  19. Preliminary Results on Earthquake Recurrence Intervals, Rupture Segmentation, and Potential Earthquake Moment Magnitudes along the Tahoe-Sierra Frontal Fault Zone, Lake Tahoe, California

    Science.gov (United States)

    Howle, J.; Bawden, G. W.; Schweickert, R. A.; Hunter, L. E.; Rose, R.

    2012-12-01

    Utilizing high-resolution bare-earth LiDAR topography, field observations, and earlier results of Howle et al. (2012), we estimate latest Pleistocene/Holocene earthquake-recurrence intervals, propose scenarios for earthquake-rupture segmentation, and estimate potential earthquake moment magnitudes for the Tahoe-Sierra frontal fault zone (TSFFZ), west of Lake Tahoe, California. We have developed a new technique to estimate the vertical separation for the most recent and the previous ground-rupturing earthquakes at five sites along the Echo Peak and Mt. Tallac segments of the TSFFZ. These sites have fault scarps with two bevels separated by an inflection point (compound fault scarps), indicating that the cumulative vertical separation (VS) across the scarp resulted from two events. This technique, modified from the modeling methods of Howle et al. (2012), uses the far-field plunge of the best-fit footwall vector and the fault-scarp morphology from high-resolution LiDAR profiles to estimate the per-event VS. From these data, we conclude that the adjacent and overlapping Echo Peak and Mt. Tallac segments have ruptured coseismically twice during the Holocene. The right-stepping, en echelon range-front segments of the TSFFZ show progressively greater VS rates and shorter earthquake-recurrence intervals from southeast to northwest. Our preliminary estimates suggest latest Pleistocene/Holocene earthquake-recurrence intervals of 4.8±0.9×10³ years for a coseismic rupture of the Echo Peak and Mt. Tallac segments, located at the southeastern end of the TSFFZ. For the Rubicon Peak segment, northwest of the Echo Peak and Mt. Tallac segments, our preliminary estimate of the maximum earthquake-recurrence interval is 2.8±1.0×10³ years, based on data from two sites. The correspondence between high VS rates and short recurrence intervals suggests that earthquake sequences along the TSFFZ may initiate in the northwest part of the zone and then occur to the southeast with a lower
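
The potential moment magnitudes mentioned above are conventionally derived from the Hanks-Kanamori relation, with seismic moment built from rupture area and per-event slip (here, slip inferred from LiDAR-derived vertical separations). A minimal sketch, using entirely illustrative rupture parameters rather than the study's values:

```python
import math

def moment_magnitude(rupture_area_m2, avg_slip_m, rigidity_pa=3.0e10):
    """Moment magnitude via the standard Hanks-Kanamori relation:
    seismic moment M0 = rigidity * area * slip (in N*m), and
    Mw = (2/3) * (log10(M0) - 9.05)."""
    m0 = rigidity_pa * rupture_area_m2 * avg_slip_m  # seismic moment, N*m
    return (2.0 / 3.0) * (math.log10(m0) - 9.05)

# e.g., a hypothetical 25 km long x 15 km deep rupture with 1.5 m average slip
mw = moment_magnitude(25e3 * 15e3, 1.5)  # roughly Mw 6.8
```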

  20. The California Hazards Institute

    Science.gov (United States)

    Rundle, J. B.; Kellogg, L. H.; Turcotte, D. L.

    2006-12-01

    California's abundant resources are linked with its natural hazards. Earthquakes, landslides, wildfires, floods, tsunamis, volcanic eruptions, severe storms, fires, and droughts afflict the state regularly. These events have the potential to become great disasters, like the San Francisco earthquake and fire of 1906, that overwhelm the capacity of society to respond. At such times, the fabric of civic life is frayed, political leadership is tested, economic losses can dwarf available resources, and full recovery can take decades. A patchwork of Federal, state, and local programs is in place to address individual hazards, but California lacks effective coordination to forecast, prevent, prepare for, mitigate, respond to, and recover from the harmful effects of natural disasters. Moreover, we do not know enough about the frequency, size, timing, or locations where they may strike, nor about how the natural environment and man-made structures would respond. As California's population grows and becomes more interdependent, even moderate events have the potential to trigger catastrophes. Natural hazards need not become natural disasters if they are addressed proactively and effectively, rather than reactively. The University of California, with 10 campuses distributed across the state, has world-class faculty and students engaged in research and education in all fields of direct relevance to hazards. For that reason, the UC can become a world leader in anticipating and managing natural hazards in order to prevent loss of life and property and degradation of environmental quality. The University of California, Office of the President, has therefore established a new system-wide Multicampus Research Project, the California Hazards Institute (CHI), as a mechanism to research innovative, effective solutions for California. The CHI will build on the rich intellectual capital and expertise of the Golden State to provide the best available science, knowledge and tools for

  1. Tilt Precursors before Earthquakes on the San Andreas Fault, California.

    Science.gov (United States)

    Johnston, M J; Mortensen, C E

    1974-12-13

    An array of 14 biaxial shallow-borehole tiltmeters (at 10^-7 radian sensitivity) has been installed along 85 kilometers of the San Andreas fault during the past year. Earthquake-related changes in tilt have been simultaneously observed on up to four independent instruments. At earthquake distances greater than 10 earthquake source dimensions, there are few clear indications of tilt change. For the four instruments with the longest records (> 10 months), 26 earthquakes have occurred since July 1973 with at least one instrument closer than 10 source dimensions and 8 earthquakes with more than one instrument within that distance. Precursors in tilt direction have been observed before more than 10 earthquakes or groups of earthquakes, and no similar effect has yet been seen without the occurrence of an earthquake.

  2. California quake assessed

    Science.gov (United States)

    Wuethrich, Bernice

    On January 17, 1994, at 4:31 A.M., a magnitude 6.6 earthquake hit the Los Angeles area, crippling much of the local infrastructure and claiming 51 lives. Members of the Southern California Earthquake Network, a consortium of scientists at universities and the United States Geological Survey (USGS), entered a controlled crisis mode. Network scientists, including David Wald, Susan Hough, Kerry Sieh, and a half dozen others, went into the field to gather information on the earthquake, which apparently ruptured an unmapped fault.

  3. Kinematics of the 2015 San Ramon, California earthquake swarm: Implications for fault zone structure and driving mechanisms

    Science.gov (United States)

    Xue, Lian; Bürgmann, Roland; Shelly, David R.; Johnson, Christopher W.; Taira, Taka'aki

    2018-05-01

    Earthquake swarms represent a sudden increase in seismicity that may indicate a heterogeneous fault zone, the involvement of crustal fluids, and/or slow fault slip. Swarms sometimes precede major earthquake ruptures. An earthquake swarm occurred in October 2015 near San Ramon, California, in an extensional right step-over region between the northern Calaveras Fault and the Concord-Mt. Diablo fault zone, which has hosted ten major swarms since 1970. The 2015 San Ramon swarm is examined here from 11 October through 18 November using template matching analysis. The relocated seismicity catalog contains ∼4000 events with magnitudes between - 0.2
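
The template matching analysis used here cross-correlates waveforms of known events against continuous data to detect smaller, similar events that a standard catalog misses. A minimal single-channel sketch (real implementations scan many stations and channels; the function below is illustrative, not the study's code):

```python
import numpy as np

def match_template(trace, template, threshold=0.8):
    """Slide a waveform template along a continuous trace and return
    (sample offset, correlation) pairs where the normalized
    cross-correlation exceeds the detection threshold."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    detections = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n]
        sd = w.std()
        if sd == 0:
            continue  # skip flat (zero-variance) windows
        cc = np.sum(t * (w - w.mean()) / sd)  # Pearson correlation in [-1, 1]
        if cc >= threshold:
            detections.append((i, cc))
    return detections
```

Detections at high correlation can then be assigned relative magnitudes and relocated, yielding enlarged catalogs like the ∼4000-event catalog described above.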

  4. Foreshock occurrence before large earthquakes

    Science.gov (United States)

    Reasenberg, P.A.

    1999-01-01

    Rates of foreshock occurrence involving shallow M ≥ 6 and M ≥ 7 mainshocks and M ≥ 5 foreshocks were measured in two worldwide catalogs over ~20-year intervals. The overall rates observed are similar to ones measured in previous worldwide and regional studies when they are normalized for the ranges of magnitude difference they each span. The observed worldwide rates were compared to a generic model of earthquake clustering based on patterns of small and moderate aftershocks in California. The aftershock model was extended to the case of moderate foreshocks preceding large mainshocks. Overall, the observed worldwide foreshock rates exceed the extended California generic model by a factor of ~2. Significant differences in foreshock rate were found among subsets of earthquakes defined by their focal mechanism and tectonic region, with the rate before thrust events higher and the rate before strike-slip events lower than the worldwide average. Among the thrust events, a large majority, composed of events located in shallow subduction zones, had a high foreshock rate, while a minority, located in continental thrust belts, had a low rate. These differences may explain why previous surveys have found low foreshock rates among thrust events in California (especially southern California), while the worldwide observations suggest the opposite: California, lacking an active subduction zone in most of its territory, and including a region of mountain-building thrusts in the south, reflects the low rate apparently typical for continental thrusts, while the worldwide observations, dominated by shallow subduction zone events, are foreshock-rich. If this is so, then the California generic model may significantly underestimate the conditional probability for a very large (M ≥ 8) earthquake following a potential (M ≥ 7) foreshock in Cascadia.
The magnitude differences among the identified foreshock-mainshock pairs in the Harvard catalog are consistent with a uniform

  5. Potential Effects of a Scenario Earthquake on the Economy of Southern California: Labor Market Exposure and Sensitivity Analysis to a Magnitude 7.8 Earthquake

    Science.gov (United States)

    Sherrouse, Benson C.; Hester, David J.; Wein, Anne M.

    2008-01-01

    The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report contains an exposure and sensitivity analysis of economic Super Sectors in terms of labor and employment statistics. Exposure is measured as the absolute counts of labor market variables anticipated to experience each level of Instrumental Intensity (a proxy measure of damage). Sensitivity is the percentage of the exposure of each Super Sector to each Instrumental Intensity level. The analysis concerns the direct effect of the scenario earthquake on economic sectors and provides a baseline for the indirect and interactive analysis of an input-output model of the regional economy. The analysis is inspired by the Bureau of Labor Statistics (BLS) report that analyzed the labor market losses (exposure) of a M6.9 earthquake on the Hayward fault by overlaying geocoded labor market data on Instrumental Intensity values. The method used here is influenced by the ZIP-code-level data provided by the California Employment Development Department (CA EDD), which requires the assignment of Instrumental Intensities to ZIP codes. The ZIP-code-level labor market data includes the number of business establishments, employees, and quarterly payroll categorized by the North American Industry Classification System. According to the analysis results, nearly 225,000 business

  6. Changes in state of stress on the southern San Andreas fault resulting from the California earthquake sequence of April to June 1992.

    Science.gov (United States)

    Jaumé, S C; Sykes, L R

    1992-11-20

    The April to June 1992 Landers earthquake sequence in southern California modified the state of stress along nearby segments of the San Andreas fault, causing a 50-kilometer segment of the fault to move significantly closer to failure where it passes through a compressional bend near San Gorgonio Pass. The decrease in compressive normal stress may also have reduced fluid pressures along that fault segment. As pressures are reequilibrated by diffusion, that fault segment should move closer to failure with time. That fault segment and another to the southeast probably have not ruptured in a great earthquake in about 300 years.

  7. Earthquake early Warning ShakeAlert system: West coast wide production prototype

    Science.gov (United States)

    Kohler, Monica D.; Cochran, Elizabeth S.; Given, Douglas; Guiwits, Stephen; Neuhauser, Doug; Hensen, Ivan; Hartog, Renate; Bodin, Paul; Kress, Victor; Thompson, Stephen; Felizardo, Claude; Brody, Jeff; Bhadha, Rayo; Schwarz, Stan

    2017-01-01

    Earthquake early warning (EEW) is an application of seismological science that can give people, as well as mechanical and electrical systems, up to tens of seconds to take protective actions before peak earthquake shaking arrives at a location. Since 2006, the U.S. Geological Survey has been working in collaboration with several partners to develop EEW for the United States. The goal is to create and operate an EEW system, called ShakeAlert, for the highest risk areas of the United States, starting with the West Coast states of California, Oregon, and Washington. In early 2016, the Production Prototype v.1.0 was established for California; then, in early 2017, v.1.2 was established for the West Coast, with earthquake notifications being distributed to a group of beta users in California, Oregon, and Washington. The new ShakeAlert Production Prototype was an outgrowth from an earlier demonstration EEW system that began sending test notifications to selected users in California in January 2012. ShakeAlert leverages the considerable physical, technical, and organizational earthquake monitoring infrastructure of the Advanced National Seismic System, a nationwide federation of cooperating seismic networks. When fully implemented, the ShakeAlert system may reduce damage and injury caused by large earthquakes, improve the nation’s resilience, and speed recovery.
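
The "up to tens of seconds" figure follows from simple kinematics: damaging S waves travel at roughly 3.5 km/s, so the warning time at a site is the S-wave travel time minus the time needed to detect the earthquake and issue the alert. A back-of-envelope sketch (uniform straight-line S-wave speed is an assumption):

```python
def warning_time_s(epicentral_distance_km, alert_latency_s, s_speed_km_s=3.5):
    """Approximate EEW warning time at a site: S-wave travel time
    minus the delay between earthquake origin and alert issuance."""
    return epicentral_distance_km / s_speed_km_s - alert_latency_s

# e.g., a site 100 km from the epicenter, alert issued 10 s after origin
wt = warning_time_s(100.0, 10.0)  # about 18.6 s of warning
```

Sites closer to the epicenter than roughly s_speed_km_s * alert_latency_s receive no warning, which is why reducing alert latency matters so much.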

  8. The 2010 Mw 7.2 El Mayor-Cucapah Earthquake Sequence, Baja California, Mexico and Southernmost California, USA: Active Seismotectonics along the Mexican Pacific Margin

    Science.gov (United States)

    Hauksson, Egill; Stock, Joann; Hutton, Kate; Yang, Wenzheng; Vidal-Villegas, J. Antonio; Kanamori, Hiroo

    2011-08-01

    The El Mayor-Cucapah earthquake sequence started with a few foreshocks in March 2010, and a second sequence of 15 foreshocks of M > 2 (up to M4.4) that occurred during the 24 h preceding the mainshock. The foreshocks occurred along a north-south trend near the mainshock epicenter. The Mw 7.2 mainshock on April 4 exhibited complex faulting, possibly starting with a ~M6 normal faulting event, followed ~15 s later by the main event, which included simultaneous normal and right-lateral strike-slip faulting. The aftershock zone extends for 120 km from the south end of the Elsinore fault zone north of the US-Mexico border almost to the northern tip of the Gulf of California. The waveform-relocated aftershocks form two abutting clusters, each about 50 km long, as well as a 10 km north-south aftershock zone just north of the epicenter of the mainshock. Even though the Baja California data are included, the magnitude of completeness and the hypocentral errors increase gradually with distance south of the international border. The spatial distribution of large aftershocks is asymmetric, with five M5+ aftershocks located to the south of the mainshock, and only one M5.7 aftershock, but numerous smaller aftershocks, to the north. Further, the northwest aftershock cluster exhibits complex faulting on both northwest and northeast planes. Thus, the aftershocks also express a complex pattern of stress release along strike. The overall rate of decay of the aftershocks is similar to the rate of decay of a generic California aftershock sequence. In addition, some triggered seismicity was recorded along the Elsinore and San Jacinto faults to the north, but significant northward migration of aftershocks has not occurred. The synthesis of the El Mayor-Cucapah sequence reveals transtensional regional tectonics, including the westward growth of the Mexicali Valley and the transfer of Pacific-North America plate motion from the Gulf of California in the south into the southernmost San

  9. Data Delivery Latency Improvements And First Steps Towards The Distributed Computing Of The Caltech/USGS Southern California Seismic Network Earthquake Early Warning System

    Science.gov (United States)

    Stubailo, I.; Watkins, M.; Devora, A.; Bhadha, R. J.; Hauksson, E.; Thomas, V. I.

    2016-12-01

    The USGS/Caltech Southern California Seismic Network (SCSN) is a modern digital ground-motion seismic network. It develops and maintains Earthquake Early Warning (EEW) data collection and delivery systems in southern California, as well as real-time EEW algorithms. Recently, Behr et al. (SRL, 2016) analyzed data from several regional seismic networks deployed around the globe and showed that the SCSN was the network with the smallest data communication delays, or latency. Since then, we have further reduced the telemetry delays for many of the 330 current sites: average latency has been reduced from 2-6 s to 0.4 s by tuning datalogger parameters and/or deploying software upgrades. Recognizing latency as one of the crucial parameters in EEW, we have started archiving the per-packet latencies in mseed format for all participating sites, much as is traditionally done for the seismic waveform data. The archived latency values enable us to understand and document long-term changes in the performance of the telemetry links. We can also retroactively investigate how latent the waveform data were during a specific event or time period. In addition, the near-real-time latency values are useful for monitoring and displaying real-time station latency, in particular to compare different telemetry technologies. A future step to reduce latency is to deploy the algorithms on the dataloggers at the seismic stations and transmit either the final solutions or intermediate parameters to a central processing center. To implement this approach, we are developing a stand-alone version of the OnSite algorithm to run on the dataloggers in the field. This will increase the resiliency of the SCSN to potential telemetry restrictions in the immediate aftermath of a large earthquake, either by allowing local alarming by the single station, or permitting transmission of lightweight parametric information rather than continuous
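
A per-packet latency value of the kind archived here is simply the difference between the wall-clock time a packet arrives at the data center and the time of the packet's last data sample. A minimal sketch (field names are illustrative, not the SCSN's actual schema):

```python
from datetime import datetime, timedelta, timezone

def packet_latency_s(last_sample_time, receipt_time):
    """Telemetry latency for one data packet, in seconds: receipt
    time at the data center minus the packet's last sample time."""
    return (receipt_time - last_sample_time).total_seconds()

# e.g., a packet whose newest sample is 400 ms old on arrival
t0 = datetime(2016, 7, 1, 12, 0, 0, tzinfo=timezone.utc)
latency = packet_latency_s(t0, t0 + timedelta(milliseconds=400))  # 0.4 s
```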

  10. Moment-ratio imaging of seismic regions for earthquake prediction

    Science.gov (United States)

    Lomnitz, Cinna

    1993-10-01

    An algorithm for predicting large earthquakes is proposed. The reciprocal ratio (mri) of the residual seismic moment to the total moment release in a region is used for imaging seismic moment precursors. Peaks in mri predict recent major earthquakes, including the 1985 Michoacan, 1985 central Chile, and 1992 Eureka, California earthquakes.
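
The abstract defines mri only as the reciprocal ratio of residual seismic moment to total moment release. One plausible reading is sketched below, under the assumption that the residual is a regional moment budget minus the cumulative release; the paper's exact formulation may differ:

```python
import numpy as np

def moment_ratio_series(event_moments, regional_moment_budget):
    """Ratio of cumulative moment release to the residual (unreleased)
    moment after each event; the ratio peaks as the regional moment
    budget is spent, which is the precursor signal imaged."""
    released = np.cumsum(event_moments)
    residual = regional_moment_budget - released
    return released / residual

# e.g., three 1e18 N*m events against a hypothetical 1e19 N*m regional budget
mri = moment_ratio_series(np.full(3, 1e18), 1e19)
```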

  11. An information infrastructure for earthquake science

    Science.gov (United States)

    Jordan, T. H.; Scec/Itr Collaboration

    2003-04-01

    The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, IRIS, and the USGS, has received a large five-year grant from the NSF's ITR Program and its Geosciences Directorate to build a new information infrastructure for earthquake science. In many respects, the SCEC/ITR Project presents a microcosm of the IT efforts now being organized across the geoscience community, including the EarthScope initiative. The purpose of this presentation is to discuss the experience gained by the project thus far and lay out the challenges that lie ahead; our hope is to encourage cross-discipline collaboration in future IT advancements. Project goals have been formulated in terms of four "computational pathways" related to seismic hazard analysis (SHA). For example, Pathway 1 involves the construction of an open-source, object-oriented, and web-enabled framework for SHA computations that can incorporate a variety of earthquake forecast models, intensity-measure relationships, and site-response models, while Pathway 2 aims to utilize the predictive power of wavefield simulation in modeling time-dependent ground motion for scenario earthquakes and constructing intensity-measure relationships. The overall goal is to create a SCEC "community modeling environment" or collaboratory that will comprise the curated (on-line, documented, maintained) resources needed by researchers to develop and use these four computational pathways. Current activities include (1) the development and verification of the computational modules, (2) the standardization of data structures and interfaces needed for syntactic interoperability, (3) the development of knowledge representation and management tools, (4) the construction of SCEC computational and data grid testbeds, and (5) the creation of user interfaces for knowledge-acquisition, code execution, and visualization.
I will emphasize the increasing role of standardized

  12. Geodetic Imaging for Rapid Assessment of Earthquakes: Airborne Laser Scanning (ALS)

    Science.gov (United States)

    Carter, W. E.; Shrestha, R. L.; Glennie, C. L.; Sartori, M.; Fernandez-Diaz, J.; National Center for Airborne Laser Mapping Operational Center

    2010-12-01

    To the residents of an area struck by a strong earthquake, quantitative information on damage to the infrastructure, and its attendant impact on relief and recovery efforts, is urgent and of primary concern. To earth scientists, a strong earthquake offers an opportunity to learn more about earthquake mechanisms and to compare their models with the real world, in hopes of one day being able to accurately predict the precise locations, magnitudes, and times of large (and potentially disastrous) earthquakes. Airborne laser scanning (also referred to as airborne LiDAR or Airborne Laser Swath Mapping) is particularly well suited for rapid assessment of earthquakes, both for immediately estimating the damage to infrastructure and for providing information for the scientific study of earthquakes. ALS observations collected at low altitude (500-1000 m) from a relatively slow (70-100 m/sec) aircraft can provide dense (5-15 points/m2) sets of surface features (buildings, vegetation, ground), extending over hundreds of square kilometers, with turnaround times of several hours to a few days. The actual response time to any given event depends on several factors, including bureaucratic issues such as approval of funds, export license formalities, and clearance to fly over the area to be mapped; operational factors, such as the deployment of the aircraft and ground teams, may also take a number of days for remote locations. Of course, the need for immediate mapping of earthquake damage generally is not as urgent in remote regions with less infrastructure and few inhabitants. During August 16-19, 2010, the National Center for Airborne Laser Mapping (NCALM) mapped the area affected by the magnitude 7.2 El Mayor-Cucapah Earthquake (Northern Baja California Earthquake), which occurred on April 4, 2010, and was felt throughout southern California, Arizona, Nevada, and Baja California North, Mexico. From initial ground observations the fault rupture appeared to extend 75 km

  13. Foreshock occurrence rates before large earthquakes worldwide

    Science.gov (United States)

    Reasenberg, P.A.

    1999-01-01

    Global rates of foreshock occurrence involving shallow M ≥ 6 and M ≥ 7 mainshocks and M ≥ 5 foreshocks were measured using earthquakes listed in the Harvard CMT catalog for the period 1978-1996. These rates are similar to those measured in previous worldwide and regional studies when they are normalized for the ranges of magnitude difference they each span. The observed worldwide rates were compared to a generic model of earthquake clustering, which is based on patterns of small and moderate aftershocks in California, and were found to exceed the California model by a factor of approximately 2. Significant differences in foreshock rate were found among subsets of earthquakes defined by their focal mechanism and tectonic region, with the rate before thrust events higher and the rate before strike-slip events lower than the worldwide average. Among the thrust events, a large majority, composed of events located in shallow subduction zones, had a high foreshock rate, while a minority, located in continental thrust belts, had a low rate. These differences may explain why previous surveys have revealed low foreshock rates among thrust events in California (especially southern California), while the worldwide observations suggest the opposite: California, lacking an active subduction zone in most of its territory, and including a region of mountain-building thrusts in the south, reflects the low rate apparently typical for continental thrusts, while the worldwide observations, dominated by shallow subduction zone events, are foreshock-rich.

  14. The SAFRR Tsunami Scenario: Improving Resilience for California from a Plausible M9 Earthquake near the Alaska Peninsula

    Science.gov (United States)

    Ross, S.; Jones, L.; Wilson, R. I.; Bahng, B.; Barberopoulou, A.; Borrero, J. C.; Brosnan, D.; Bwarie, J.; Geist, E. L.; Johnson, L.; Kirby, S. H.; Knight, W.; Long, K.; Lynett, P. J.; Miller, K.; Mortensen, C. E.; Nicolsky, D.; Oglesby, D. D.; Perry, S. C.; Plumlee, G. S.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Suleimani, E.; Thio, H. K.; Titov, V.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2013-12-01

    The SAFRR Tsunami Scenario models a hypothetical but plausible tsunami, created by an Mw9.1 earthquake occurring offshore from the Alaskan peninsula, and its impacts on the California coast. We present the likely inundation areas, current velocities in key ports and harbors, physical damage and repair costs, economic consequences, environmental impacts, social vulnerability, emergency management, and policy implications for California associated with the tsunami scenario. The intended users are those who must make mitigation decisions before, and rapid decisions during, future tsunamis. Around a half million people would be present in the scenario's inundation area in residences, businesses, public venues, parks and beaches. Evacuation would likely be ordered for the State of California's maximum mapped tsunami inundation zone, evacuating an additional quarter million people from residences and businesses. Some island and peninsula communities would face particular evacuation challenges because of limited access options and the short warning time afforded by the distance between Alaska and California. Evacuations may also be a challenge for certain dependent-care populations. One third of the boats in California's marinas could be damaged or sunk, costing at least $700 million in repairs to boats and docks, and potentially much more to address serious issues due to sediment transport and environmental contamination. Fires would likely start at many sites where fuel and petrochemicals are stored in ports and marinas. Tsunami surges and bores may travel several miles inland up coastal rivers. Debris clean-up and recovery of inundated and damaged areas will take days, months, or years depending on the severity of impacts and the available resources for recovery. The Ports of Los Angeles and Long Beach (POLA/LB) would be shut down for a minimum of two days due to strong currents. Inundation of dry land in the ports would result in $100 million in damage to cargo and additional

  15. 3-D P- and S-wave velocity structure and low-frequency earthquake locations in the Parkfield, California region

    Science.gov (United States)

    Zeng, Xiangfang; Thurber, Clifford H.; Shelly, David R.; Harrington, Rebecca M.; Cochran, Elizabeth S.; Bennington, Ninfa L.; Peterson, Dana; Guo, Bin; McClement, Kara

    2016-01-01

    To refine the 3-D seismic velocity model in the greater Parkfield, California region, a new data set including regular earthquakes, shots, quarry blasts, and low-frequency earthquakes (LFEs) was assembled. Hundreds of traces from each LFE family at two temporary arrays were stacked with a time-frequency domain phase-weighted stacking method to improve the signal-to-noise ratio. We extend our model resolution to lower crustal depths with the LFE data. Our result images not only previously identified features but also low-velocity zones (LVZs) in the area around the LFEs and in the lower crust beneath the southern Rinconada Fault. The former LVZ is consistent with high fluid pressure that can account for several aspects of LFE behaviour. The latter LVZ is consistent with a high-conductivity zone seen in magnetotelluric studies. A new Vs model was developed with S picks obtained with a new autopicker. At shallow depth, the low-Vs areas underlie the strongest shaking areas of the 2004 Parkfield earthquake. We relocate LFE families and analyse the location uncertainties with the NonLinLoc and tomoDD codes. The two methods yield similar results.
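
Phase-weighted stacking down-weights samples where instantaneous phases disagree across traces, suppressing incoherent noise that a plain linear stack retains. A time-domain sketch in the style of Schimmel and Paulssen (the study used a time-frequency domain variant):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via a one-sided FFT spectrum (a plain
    Hilbert-transform construction), computed along the last axis."""
    n = x.shape[-1]
    spec = np.fft.fft(x, axis=-1)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spec * h, axis=-1)

def phase_weighted_stack(traces, nu=2):
    """Linear stack of traces (rows) weighted by inter-trace phase
    coherence raised to the power nu: coherent arrivals have aligned
    instantaneous phases and survive, incoherent noise is damped."""
    phasors = np.exp(1j * np.angle(analytic_signal(traces)))
    coherence = np.abs(phasors.mean(axis=0)) ** nu  # in [0, 1] per sample
    return traces.mean(axis=0) * coherence
```

For perfectly coherent traces the weight is 1 everywhere and the result equals the linear stack; for uncorrelated noise the weight shrinks toward zero.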

  16. Earthquakes and faults in the San Francisco Bay area (1970-2003)

    Science.gov (United States)

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.; Wong, Florence L.; Saucedo, George J.

    2004-01-01

    The map depicts both active and inactive faults and earthquakes of magnitude 1.5 to 7.0 in the greater San Francisco Bay area. Twenty-two earthquakes of magnitude 5.0 and greater are indicated on the map and listed chronologically in an accompanying table. The data are compiled from records from 1970-2003. The bathymetry was generated from a digital version of NOAA maps and hydrographic data for San Francisco Bay. Elevation data are from the USGS National Elevation Database. The Landsat satellite image is from seven Landsat 7 Enhanced Thematic Mapper Plus scenes. Fault data are reproduced with permission from the California Geological Survey. The earthquake data are from the Northern California Earthquake Catalog.

  17. Earthquake Preparedness and Education: A Collective Impact Approach to Improving Awareness and Resiliency

    Science.gov (United States)

    Benthien, M. L.; Wood, M. M.; Ballmann, J. E.; DeGroot, R. M.

    2017-12-01

    The Southern California Earthquake Center (SCEC), headquartered at the University of Southern California, is a collaboration of more than 1000 scientists and students from 70+ institutions. SCEC's Communication, Education, and Outreach (CEO) program translates earthquake science into products and activities in order to increase scientific literacy, develop a diverse scientific workforce, and reduce earthquake risk to life and property. SCEC CEO staff coordinate these efforts through partnerships established to engage subject matter experts, reduce duplication of effort, and achieve greater results. Several of SCEC's collaborative networks began within Southern California and have since grown to statewide (the Earthquake Country Alliance, a public-private-grassroots partnership), national (the "EPIcenter" Network of museums, parks, libraries, etc.), and international scale (Great ShakeOut Earthquake Drills, with millions of participants each year). These networks have benefitted greatly from partnerships with national (FEMA), state, and local emergency managers. Other activities leverage SCEC's networks in new ways and with national earth science organizations, such as the EarthConnections Program (with IRIS, NAGT, and many others), the Quake Catcher Network (with IRIS), and the GeoHazards Messaging Collaboratory (with IRIS, UNAVCO, and USGS). Each of these partnerships shares a commitment to service, collaborative development, and the application of research (including social science theory for motivating preparedness behaviors). SCEC CEO is developing new evaluative structures and adapting the Collective Impact framework to better understand what has worked well and what can be improved, according to the framework's five key elements: create a common agenda; share common indicators and measurement; engage diverse stakeholders to coordinate mutually reinforcing activities; initiate continuous communication; and provide "backbone" support.
This presentation will provide

  18. Impact of the May 12, 2008, Earthquake on blood donations across five Chinese blood centers.

    Science.gov (United States)

    Liu, Jing; Huang, Yi; Wang, Jingxing; Bi, Xinhong; Li, Julin; Lu, Yunlai; Wen, Xiuqiong; Yao, Fuzhu; Dong, Xiangdong; He, Weilan; Huang, Mei; Ma, Hongli; Mei, Heili; King, Melissa; Wright, David J; Ness, Paul M; Shan, Hua

    2010-09-01

    On May 12, 2008, a severe earthquake struck China's Sichuan Province. The nationwide outpouring of charity resulted in a surge of subsequent blood donations. The quantity and quality of these donations were examined in comparison with routine donations. Whole blood and apheresis donations from five geographically different blood centers collected within 1 week postearthquake were compared with those collected during the rest of the year. Regional differences, demographic characteristics, first-time and repeat donor status, and infectious disease screening markers associated with these donations were compared by earthquake status using chi-square statistics. Poisson regression analysis examined the number of daily donations by earthquake status after adjusting for center, day of week, and seasonal variations. The number of daily donations across the five blood centers increased from 685 on a typical day to 1151 in the postearthquake week. The surge was observed in both sexes, across different education levels, age, and ethnicity groups, and at three of the blood centers, and remained significant after adjusting for confounding covariates. The influx of first-time donors (89.5%) was higher than that of repeat donors (34%). There was a significant change in the overall screening reactive marker rates excluding alanine aminotransferase (2.06% vs. 1.72% vs. 4.96%). However, when the individual screening tests were analyzed separately, no significant differences were found. Timely donations in response to a disaster are crucial to ensure emergency blood transfusion. The dramatically increased postearthquake donations suggest that Chinese blood centers are capable of handling emergency blood needs. Measures to maintain blood safety should be taken in times of emergency. © 2010 American Association of Blood Banks.
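
    The core comparison can be sketched as a two-sample Poisson rate test. The paper's actual analysis is a Poisson regression adjusting for center, weekday, and season, so this is only the unadjusted idea; the function name is mine and the counts in the usage below are illustrative figures built from the abstract's daily averages.

```python
import numpy as np
from scipy.stats import chi2

def poisson_rate_test(count1, days1, count2, days2):
    """Likelihood-ratio test that two Poisson rates are equal.

    Returns (rate ratio, p-value). Exposure terms cancel because the
    MLE rates reproduce the observed counts exactly.
    """
    r1, r2 = count1 / days1, count2 / days2
    r0 = (count1 + count2) / (days1 + days2)      # pooled rate under H0
    ll_alt = count1 * np.log(r1) + count2 * np.log(r2)
    ll_null = (count1 + count2) * np.log(r0)
    stat = 2 * (ll_alt - ll_null)                 # ~ chi-square(1) under H0
    return r2 / r1, chi2.sf(stat, df=1)
```

    For example, 685 donations/day over the rest of the year versus 1151/day in the postearthquake week gives a rate ratio of about 1.68 with a vanishingly small p-value.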

  19. Comparison of four moderate-size earthquakes in southern California using seismology and InSAR

    Science.gov (United States)

    Mellors, R.J.; Magistrale, H.; Earle, P.; Cogbill, A.H.

    2004-01-01

    Source parameters determined from interferometric synthetic aperture radar (InSAR) measurements and from seismic data are compared from four moderate-size (less than M 6) earthquakes in southern California. The goal is to verify approximate detection capabilities of InSAR, assess differences in the results, and test how the two results can be reconciled. First, we calculated the expected surface deformation from all earthquakes greater than magnitude 4 in areas with available InSAR data (347 events). A search for deformation from the events in the interferograms yielded four possible events with magnitudes less than 6. The search for deformation was based on a visual inspection as well as cross-correlation in two dimensions between the measured signal and the expected signal. A grid-search algorithm was then used to estimate focal mechanism and depth from the InSAR data. The results were compared with locations and focal mechanisms from published catalogs. An independent relocation using seismic data was also performed. The seismic locations fell within the area of the expected rupture zone for the three events that show clear surface deformation. Therefore, the technique shows the capability to resolve locations with high accuracy and is applicable worldwide. The depths determined by InSAR agree with well-constrained seismic locations determined in a 3D velocity model. Depth control for well-imaged shallow events using InSAR data is good, and better than the seismic constraints in some cases. A major difficulty for InSAR analysis is the poor temporal coverage of InSAR data, which may make it impossible to distinguish deformation due to different earthquakes at the same location.
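
    At zero lag, the 2-D cross-correlation detection step reduces to a normalized correlation coefficient between the observed interferogram patch and the predicted deformation pattern. A minimal sketch of that comparison (zero-lag only; the function name is mine, and the paper's search also scans spatial offsets):

```python
import numpy as np

def correlation_score(observed, predicted):
    """Zero-mean normalized correlation between two 2-D arrays (range [-1, 1])."""
    o = observed - observed.mean()
    p = predicted - predicted.mean()
    return float((o * p).sum() / np.sqrt((o ** 2).sum() * (p ** 2).sum()))
```

    A score near 1 flags a candidate event whose modeled surface deformation matches the interferogram; visual inspection then screens out atmospheric artifacts.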

  20. From Data-Sharing to Model-Sharing: SCEC and the Development of Earthquake System Science (Invited)

    Science.gov (United States)

    Jordan, T. H.

    2009-12-01

    Earthquake system science seeks to construct system-level models of earthquake phenomena and use them to predict emergent seismic behavior—an ambitious enterprise that requires high degree of interdisciplinary, multi-institutional collaboration. This presentation will explore model-sharing structures that have been successful in promoting earthquake system science within the Southern California Earthquake Center (SCEC). These include disciplinary working groups to aggregate data into community models; numerical-simulation working groups to investigate system-specific phenomena (process modeling) and further improve the data models (inverse modeling); and interdisciplinary working groups to synthesize predictive system-level models. SCEC has developed a cyberinfrastructure, called the Community Modeling Environment, that can distribute the community models; manage large suites of numerical simulations; vertically integrate the hardware, software, and wetware needed for system-level modeling; and promote the interactions among working groups needed for model validation and refinement. Various socio-scientific structures contribute to successful model-sharing. Two of the most important are “communities of trust” and collaborations between government and academic scientists on mission-oriented objectives. The latter include improvements of earthquake forecasts and seismic hazard models and the use of earthquake scenarios in promoting public awareness and disaster management.

  1. Earthquake and ambient vibration monitoring of the steel-frame UCLA factor building

    Science.gov (United States)

    Kohler, M.D.; Davis, P.M.; Safak, E.

    2005-01-01

    Dynamic property measurements of the moment-resisting steel-frame University of California, Los Angeles, Factor building are being made to assess how forces are distributed over the building. Fourier amplitude spectra have been calculated from several intervals of ambient vibrations, from a 24-hour period of strong winds, and from the 28 March 2003 Encino, California (ML = 2.9), the 3 September 2002 Yorba Linda, California (ML = 4.7), and the 3 November 2002 Central Alaska (Mw = 7.9) earthquakes. Measurements made from the ambient vibration records show that the first-mode frequency of horizontal vibration is between 0.55 and 0.6 Hz. The second horizontal mode has a frequency between 1.6 and 1.9 Hz. In contrast, the first-mode frequencies measured from earthquake data are about 0.05 to 0.1 Hz lower than those corresponding to ambient vibration recordings, indicating softening of the soil-structure system as amplitudes become larger. The frequencies revert to pre-earthquake levels within five minutes of the Yorba Linda earthquake. Shaking due to strong winds that occurred during the Encino earthquake dominates the frequency decrease, which correlates in time with the duration of the strong winds. The first shear wave recorded from the Encino and Yorba Linda earthquakes takes about 0.4 sec to travel up the 17-story building. © 2005, Earthquake Engineering Research Institute.
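
    The reported travel time and first-mode frequency are mutually consistent under a simple idealization: for a uniform shear beam, the fundamental frequency is the quarter-wavelength value f1 = 1/(4τ), where τ is the one-way shear-wave travel time up the structure. The shear-beam idealization here is mine, not the paper's:

```python
def fundamental_frequency(one_way_travel_time_s):
    """Quarter-wavelength rule for a uniform shear beam: f1 = 1 / (4 * tau)."""
    return 1.0 / (4.0 * one_way_travel_time_s)

# tau ~ 0.4 s measured for the Factor building gives ~0.625 Hz, close to the
# observed 0.55-0.6 Hz first mode (a real building is not a uniform beam,
# so exact agreement is not expected).
```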

  2. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance, and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities, which has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities, and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests.
We show how the Reliability and
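
    The counting idea behind these seismicity-based forecasts can be sketched as a Weibull hazard in "natural time", i.e., the number of small earthquakes since the last large one. The functional form follows the NTW description above, but the function name, parameters, and shape value are illustrative, not those used at www.openhazards.com:

```python
import math

def large_eq_probability(n_small, n_expected, beta=1.5):
    """Weibull hazard in natural time: probability that the next large event
    has occurred by the time n_small small earthquakes have accumulated,
    where n_expected is the typical count between large events."""
    return 1.0 - math.exp(-((n_small / n_expected) ** beta))
```

    The probability climbs monotonically with the small-earthquake count, reaching 1 - 1/e when the count equals its expected value.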

  3. A prototype operational earthquake loss model for California based on UCERF3-ETAS – A first look at valuation

    Science.gov (United States)

    Field, Edward; Porter, Keith; Milner, Kevin

    2017-01-01

    We present a prototype operational loss model based on UCERF3-ETAS, which is the third Uniform California Earthquake Rupture Forecast with an Epidemic Type Aftershock Sequence (ETAS) component. As such, UCERF3-ETAS represents the first earthquake forecast to relax fault segmentation assumptions and to include multi-fault ruptures, elastic rebound, and spatiotemporal clustering, all of which seem important for generating realistic and useful aftershock statistics. UCERF3-ETAS is nevertheless an approximation of the system, so its usefulness will vary and its potential value needs to be ascertained in the context of each application. We examine this question with respect to statewide loss estimates, exemplifying how risk can be elevated by orders of magnitude due to triggered events following various scenario earthquakes. Two important considerations are the probability gains, relative to loss likelihoods in the absence of main shocks, and the rapid decay of those gains with time. Significant uncertainties and model limitations remain, so we hope this paper will inspire similar analyses with respect to other risk metrics to help ascertain whether operationalization of UCERF3-ETAS would be worth the considerable resources required.
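
    The two considerations named above, large initial gains and their rapid decay, can be illustrated with an Omori-type gain curve. The functional form and every parameter value below are hypothetical placeholders for illustration only, not UCERF3-ETAS output:

```python
def probability_gain(t_days, g0=1000.0, c=0.05, p=1.0):
    """Illustrative gain over background rate at t_days after a main shock.

    g0: peak gain immediately after the event; c, p: Omori-style constants.
    Decays toward 1 (no gain) as time passes.
    """
    return 1.0 + (g0 - 1.0) * (c / (t_days + c)) ** p
```

    With these placeholder values the gain is three orders of magnitude at t = 0 but nearly gone within a year, which is why the timing of any loss-mitigation action matters.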

  4. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

    Science.gov (United States)

    Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

    2012-01-01

    The behavior of individual events in repeating earthquake sequences in California, Taiwan, and Japan is better predicted by a model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript that shows similar results for laboratory experiments, we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, but its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the time and slip of these events are predicted quite well by fixed slip and fixed recurrence models, so in some sense they are time- and slip-predictable. While fixed recurrence and slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.

  5. Seasonal water storage, stress modulation and California seismicity

    Science.gov (United States)

    Johnson, C. W.; Burgmann, R.; Fu, Y.

    2017-12-01

    Establishing what controls the timing of earthquakes is fundamental to understanding the nature of the earthquake cycle and critical to determining time-dependent earthquake hazard. Seasonal loading provides a natural laboratory to explore the crustal response to a quantifiable transient force. In California, the accumulation of winter snowpack in the Sierra Nevada, surface water in lakes and reservoirs, and groundwater in sedimentary basins follows the annual cycle of wet winters and dry summers. The surface loads resulting from the seasonal changes in water storage produce elastic deformation of the Earth's crust. We used 9 years of global positioning system (GPS) vertical deformation time series to constrain models of monthly hydrospheric loading and the resulting stress changes on fault planes of small earthquakes. Previous studies posit that temperature, atmospheric pressure, or hydrologic changes may strain the lithosphere and promote additional earthquakes above background levels. Depending on fault geometry, the addition or removal of water increases the Coulomb failure stress. The largest stress amplitudes occur on dipping reverse faults in the Coast Ranges and along the eastern Sierra Nevada range front. We analyze 9 years of M≥2.0 earthquakes with known focal mechanisms in northern and central California to resolve fault-normal and fault-shear stresses for each focal geometry. Our results reveal 10% more earthquakes occurring during slip-encouraging fault-shear stress conditions and suggest that earthquake populations are modulated at the periods of natural loading cycles, which promote failure by stress changes on the order of 1-5 kPa. We infer that California seismicity rates are modestly modulated by natural hydrological loading cycles.
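
    Resolving the stress perturbation onto a fault plane comes down to the Coulomb failure criterion. A one-line sketch (the sign convention and the effective friction default of 0.4 are common choices in Coulomb stress studies, not values quoted by this abstract):

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Coulomb failure stress change: dCFS = d_tau + mu' * d_sigma_n.

    d_shear: shear stress change in the slip direction (positive promotes slip).
    d_normal: normal stress change (positive = unclamping here).
    Positive dCFS brings the fault closer to failure.
    """
    return d_shear + mu_eff * d_normal
```

    The 1-5 kPa changes cited above are tiny compared with earthquake stress drops (MPa scale), which is why the modulation of seismicity rates is modest rather than dominant.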

  6. Earthquakes in Action: Incorporating Multimedia, Internet Resources, Large-scale Seismic Data, and 3-D Visualizations into Innovative Activities and Research Projects for Today's High School Students

    Science.gov (United States)

    Smith-Konter, B.; Jacobs, A.; Lawrence, K.; Kilb, D.

    2006-12-01

    The most effective means of communicating science to today's "high-tech" students is through the use of visually attractive and animated lessons, hands-on activities, and interactive Internet-based exercises. To address these needs, we have developed Earthquakes in Action, a summer high school enrichment course offered through the California State Summer School for Mathematics and Science (COSMOS) Program at the University of California, San Diego. The summer course consists of classroom lectures, lab experiments, and a final research project designed to foster geophysical innovations, technological inquiries, and effective scientific communication (http://topex.ucsd.edu/cosmos/earthquakes). Course content includes lessons on plate tectonics, seismic wave behavior, seismometer construction, fault characteristics, California seismicity, global seismic hazards, earthquake stress triggering, tsunami generation, and geodetic measurements of the Earth's crust. Students are introduced to these topics through lectures-made-fun using a range of multimedia, including computer animations, videos, and interactive 3-D visualizations. These lessons are further reinforced through both hands-on lab experiments and computer-based exercises. Lab experiments included building hand-held seismometers, simulating the frictional behavior of faults using bricks and sandpaper, simulating tsunami generation in a mini-wave pool, and using the Internet to collect global earthquake data on a daily basis and map earthquake locations using a large classroom map. Students also use Internet resources like Google Earth and UNAVCO/EarthScope's Jules Verne Voyager Jr. interactive mapping tool to study Earth Science on a global scale. All computer-based exercises and experiments developed for Earthquakes in Action have been distributed to teachers participating in the 2006 Earthquake Education Workshop, hosted by the Visualization Center at Scripps Institution of Oceanography (http

  7. Safety and survival in an earthquake

    Science.gov (United States)

    ,

    1969-01-01

    Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations.

  8. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  9. Calculation of earthquake rupture histories using a hybrid global search algorithm: Application to the 1992 Landers, California, earthquake

    Science.gov (United States)

    Hartzell, S.; Liu, P.

    1996-01-01

    A method is presented for the simultaneous calculation of slip amplitudes and rupture times for a finite fault using a hybrid global search algorithm. The method we use combines simulated annealing with the downhill simplex method to produce a more efficient search algorithm than either of its two constituent parts. This formulation has advantages over traditional iterative or linearized approaches to the problem because it is able to escape local minima in its search through model space for the global optimum. We apply this global search method to the calculation of the rupture history for the Landers, California, earthquake. The rupture is modeled using three separate finite-fault planes to represent the three main fault segments that failed during this earthquake. Both the slip amplitude and the time of slip are calculated for a grid work of subfaults. The data used consist of digital, teleseismic P and SH body waves. Long-period, broadband, and short-period records are utilized to obtain a wideband characterization of the source. The results of the global search inversion are compared with a more traditional linear-least-squares inversion for only slip amplitudes. We use a multi-time-window linear analysis to relax the constraints on rupture time and rise time in the least-squares inversion. Both inversions produce similar slip distributions, although the linear-least-squares solution has a 10% larger moment (7.3 × 10^26 dyne-cm compared with 6.6 × 10^26 dyne-cm). Both inversions fit the data equally well and point out the importance of (1) using a parameterization with sufficient spatial and temporal flexibility to encompass likely complexities in the rupture process, (2) including suitable physically based constraints on the inversion to reduce instabilities in the solution, and (3) focusing on those robust rupture characteristics that rise above the details of the parameterization and data set.
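
    A minimal sketch of a hybrid search in this spirit: Metropolis-style annealing hops between basins, and each candidate is polished with a Nelder–Mead (downhill simplex) run via SciPy. The cooling schedule, step size, and acceptance rule are illustrative choices, not the authors' implementation:

```python
import numpy as np
from scipy.optimize import minimize

def hybrid_anneal_simplex(f, x0, n_iter=50, t0=1.0, step=0.5, seed=0):
    """Simulated annealing with downhill-simplex refinement of each candidate."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    best_x, best_f = x.copy(), fx
    for k in range(n_iter):
        t = t0 * 0.95 ** k                                # geometric cooling
        cand = x + rng.normal(scale=step, size=x.size)    # random basin hop
        res = minimize(f, cand, method="Nelder-Mead")     # local simplex polish
        # Metropolis rule: always accept improvements; sometimes accept worse
        if res.fun < fx or rng.random() < np.exp((fx - res.fun) / t):
            x, fx = res.x, res.fun
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f
```

    The simplex step makes each annealing move land at a local minimum, so the annealing loop only has to search over basins rather than over the full model space, which is the efficiency gain the abstract describes.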

  10. Seismic Response and Evaluation of SDOF Self-Centering Friction Damping Braces Subjected to Several Earthquake Ground Motions

    Directory of Open Access Journals (Sweden)

    Jong Wan Hu

    2015-01-01

    This paper deals with the seismic response and performance of self-centering friction damping braces (SFDBs) subjected to several maximum- or design-level earthquake ground motions. The self-centering friction damping brace members consist of core recentering components fabricated with superelastic shape memory alloy wires and energy dissipation devices achieved through a shear friction mechanism. Compared with conventional brace members used in steel concentrically braced frame structures, these self-centering members minimize residual deformations and withstand earthquake loads without member replacement. The configuration and response mechanism of self-centering friction damping brace systems are first described in this study, and parametric investigations are then conducted through nonlinear time-history analyses performed on numerical single degree-of-freedom spring models. Based on the analysis results, design methodologies that optimally account for recentering capability and energy dissipation according to their comparative parameters are suggested, in order to take advantage of energy capacity and to minimize residual deformation simultaneously.

  11. Forging successful academic-community partnerships with community health centers: the California statewide Area Health Education Center (AHEC) experience.

    Science.gov (United States)

    Fowkes, Virginia; Blossom, H John; Mitchell, Brenda; Herrera-Mata, Lydia

    2014-01-01

    Increased access to insurance under the Affordable Care Act will increase demands for clinical services in community health centers (CHCs). CHCs also have an increasingly important educational role to train clinicians who will remain to practice in community clinics. CHCs and Area Health Education Centers (AHECs) are logical partners to prepare the health workforce for the future. Both are sponsored by the Health Resources and Services Administration, and they share a mission to improve quality of care in medically underserved communities. AHECs emphasize the educational side of the mission, and CHCs the service side. Building stronger partnerships between them can facilitate a balance between education and service needs. From 2004 to 2011, the California Statewide AHEC program and its 12 community AHECs (centers) reorganized to align training with CHC workforce priorities. Eight centers merged into CHC consortia; others established close partnerships with CHCs in their respective regions. The authors discuss issues considered and approaches taken to make these changes. Collaborative innovative processes with program leadership, staff, and center directors revised the program mission, developed common training objectives with an evaluation plan, and defined organizational, functional, and impact characteristics for successful AHECs in California. During this planning, centers gained confidence as educational arms for the safety net and began collaborations with statewide programs as well as among themselves. The AHEC reorganization and the processes used to develop, strengthen, and identify standards for centers forged the development of new partnerships and established academic-community trust in planning and implementing programs with CHCs.

  12. Precisely locating the Klamath Falls, Oregon, earthquakes

    Science.gov (United States)

    Qamar, A.; Meagher, K.L.

    1993-01-01

    The Klamath Falls earthquakes on September 20, 1993, were the largest earthquakes centered in Oregon in more than 50 yrs. Only the magnitude 5.75 Milton-Freewater earthquake in 1936, which was centered near the Oregon-Washington border and felt in an area of about 190,000 sq km, compares in size with the recent Klamath Falls earthquakes. Although the 1993 earthquakes surprised many local residents, geologists have long recognized that strong earthquakes may occur along potentially active faults that pass through the Klamath Falls area. These faults are geologically related to similar faults in Oregon, Idaho, and Nevada that occasionally spawn strong earthquakes.

  13. Epistemic uncertainty in California-wide synthetic seismicity simulations

    Science.gov (United States)

    Pollitz, Fred F.

    2011-01-01

    The generation of seismicity catalogs on synthetic fault networks holds the promise of providing key inputs into probabilistic seismic-hazard analysis, for example, the coefficient of variation, mean recurrence time as a function of magnitude, the probability of fault-to-fault ruptures, and conditional probabilities for foreshock–mainshock triggering. I employ a seismicity simulator that includes the following ingredients: static stress transfer, viscoelastic relaxation of the lower crust and mantle, and vertical stratification of elastic and viscoelastic material properties. A cascade mechanism combined with a simple Coulomb failure criterion is used to determine the initiation, propagation, and termination of synthetic ruptures. It is employed on a 3D fault network provided by Steve Ward (unpublished data, 2009) for the Southern California Earthquake Center (SCEC) Earthquake Simulators Group. This all-California fault network, initially consisting of 8000 patches, each ∼12 square kilometers in size, has been rediscretized into finer patches, each ∼1 square kilometer in size, in order to simulate the evolution of California seismicity and crustal stress at magnitude M∼5–8. Resulting synthetic seismicity catalogs spanning 30,000 yr and about one-half million events are evaluated with magnitude-frequency and magnitude-area statistics. For a priori choices of fault-slip rates and mean stress drops, I explore the sensitivity of various constructs to input parameters, particularly mantle viscosity. Slip maps obtained for the southern San Andreas fault show that the ability of segment boundaries to inhibit slip across the boundaries (e.g., to prevent multisegment ruptures) is systematically affected by mantle viscosity.
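
    The magnitude-frequency evaluation mentioned above typically comes down to fitting a Gutenberg-Richter b-value to the synthetic catalog. The Aki (1965) maximum-likelihood estimator is the standard tool for that step; this is a generic sketch, not code from the study:

```python
import math

def b_value(magnitudes, m_min):
    """Aki maximum-likelihood b-value: b = log10(e) / (mean(M) - Mc),
    using only events at or above the completeness magnitude m_min."""
    ms = [m for m in magnitudes if m >= m_min]
    return math.log10(math.e) / (sum(ms) / len(ms) - m_min)
```

    A synthetic catalog whose magnitudes follow log10 N = a - bM should return the input b (commonly near 1 for natural seismicity), which is one sanity check on a simulator's output.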

  14. Earthquake evaluation of a substation network

    International Nuclear Information System (INIS)

    Matsuda, E.N.; Savage, W.U.; Williams, K.K.; Laguens, G.C.

    1991-01-01

    The impact of the occurrence of a large, damaging earthquake on a regional electric power system is a function of the geographical distribution of strong shaking, the vulnerability of various types of electric equipment located within the affected region, and the operational resources available to maintain or restore electric system functionality. Experience from numerous worldwide earthquake occurrences has shown that seismic damage to high-voltage substation equipment is typically the reason for post-earthquake loss of electric service. In this paper, the authors develop and apply a methodology to analyze earthquake impacts on Pacific Gas and Electric Company's (PG and E's) high-voltage electric substation network in central and northern California. The authors' objectives are to identify and prioritize ways to reduce the potential impact of future earthquakes on the electric system, refine PG and E's earthquake preparedness and response plans to be more realistic, and optimize seismic criteria for future equipment purchases for the electric system.

  15. Earthquake Early Warning: A Prospective User's Perspective (Invited)

    Science.gov (United States)

    Nishenko, S. P.; Savage, W. U.; Johnson, T.

    2009-12-01

    With more than 25 million people at risk from high hazard faults in California alone, Earthquake Early Warning (EEW) presents a promising public safety and emergency response tool. EEW represents the real-time end of an earthquake information spectrum which also includes near real-time notifications of earthquake location, magnitude, and shaking levels; as well as geographic information system (GIS)-based products for compiling and visually displaying processed earthquake data such as ShakeMap and ShakeCast. Improvements to and increased multi-national implementation of EEW have stimulated interest in how such information products could be used in the future. Lifeline organizations, consisting of utilities and transportation systems, can use both onsite and regional EEW information as part of their risk management and public safety programs. Regional EEW information can provide improved situational awareness to system operators before automatic system protection devices activate, and allow trained personnel to take precautionary measures. On-site EEW is used for earthquake-actuated automatic gas shutoff valves, triggered garage door openers at fire stations, system controls, etc. While there is no public policy framework for preemptive, precautionary electricity or gas service shutdowns by utilities in the United States, gas shut-off devices are being required at the building owner level by some local governments. In the transportation sector, high-speed rail systems have already demonstrated the ‘proof of concept’ for EEW in several countries, and more EEW systems are being installed. Recently the Bay Area Rapid Transit District (BART) began collaborating with the California Integrated Seismic Network (CISN) and others to assess the potential benefits of EEW technology to mass transit operations and emergency response in the San Francisco Bay region. A key issue in this assessment is that significant earthquakes are likely to occur close to or within the BART

  16. Centrality in earthquake multiplex networks

    Science.gov (United States)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

Seismic time series have been mapped as complex networks, in which a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
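The centrality measure described above is straightforward to prototype. As a rough single-layer illustration (not the authors' multiplex code), eigenvector centrality can be computed by power iteration over a cell-to-cell adjacency matrix; the 4-cell graph below is hypothetical:

```python
import numpy as np

def eigenvector_centrality(adj, iters=200, tol=1e-10):
    """Power iteration for the leading eigenvector of an adjacency matrix."""
    n = adj.shape[0]
    c = np.ones(n) / n
    for _ in range(iters):
        nxt = adj @ c
        norm = np.linalg.norm(nxt)
        if norm == 0:
            return c
        nxt /= norm
        if np.linalg.norm(nxt - c) < tol:
            return nxt
        c = nxt
    return c

# Toy example: 4 grid cells; edges link cells visited by successive events
adj = np.array([[0, 1, 1, 1],
                [1, 0, 1, 0],
                [1, 1, 0, 0],
                [1, 0, 0, 0]], dtype=float)
cent = eigenvector_centrality(adj)
print(cent.argmax())  # cell 0, the most connected node
```

In the multiplex setting, one such adjacency matrix per time window is stacked into layers and the centralities are tracked layer by layer.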

  17. Rapid Response Products of The ARIA Project for the M6.0 August 24, 2014 South Napa Earthquake

    Science.gov (United States)

    Yun, S. H.; Owen, S. E.; Hua, H.; Milillo, P.; Fielding, E. J.; Hudnut, K. W.; Dawson, T. E.; Mccrink, T. P.; Jo, M. J.; Barnhart, W. D.; Manipon, G. J. M.; Agram, P. S.; Moore, A. W.; Jung, H. S.; Webb, F.; Milillo, G.; Rosinski, A.

    2014-12-01

A magnitude 6.0 earthquake struck southern Napa County northeast of San Francisco, California, on Aug. 24, 2014, causing significant damage in the city of Napa and nearby areas. One day after the earthquake, the Advanced Rapid Imaging and Analysis (ARIA) team produced and released observations of coseismic ground displacement measured with continuous GPS stations of the Plate Boundary Observatory (operated by UNAVCO for the National Science Foundation) and the Bay Area Rapid Deformation network (operated by the Berkeley Seismological Laboratory). Three days after the earthquake (Aug. 27), the Italian Space Agency's (ASI) COSMO-SkyMed (CSK) satellite acquired its first post-event data. On the same day, the ARIA team, in collaboration with ASI and the University of Basilicata, produced and released a coseismic interferogram that revealed ground deformation and surface rupture. The depiction of the surface rupture (discontinuities of color fringes in the CSK interferogram) helped guide field geologists from the US Geological Survey and the California Geological Survey (CGS) to features that may have otherwise gone undetected. Small-scale cracks were found on a runway of the Napa County Airport, along with bridge damage and damaged roads. ARIA's response to this event highlighted the importance of timeliness in mapping surface deformation features. ARIA's rapid response products were shared through the Southern California Earthquake Center's response website and the California Earthquake Clearinghouse. A damage proxy map derived from InSAR coherence of CSK data was produced and distributed on Aug. 27. Field crews from the CGS identified true and false positives, including mobile home damage, newly planted grape vines, and a cripple-wall failure of a house. Finite fault slip models constrained by CSK interferograms and continuous GPS observations reveal a north-propagating rupture with well-resolved slip from 0-10.5 km depth. We also measured along-track coseismic

  18. Quasi-periodic recurrence of large earthquakes on the southern San Andreas fault

    Science.gov (United States)

    Scharer, Katherine M.; Biasi, Glenn P.; Weldon, Ray J.; Fumal, Tom E.

    2010-01-01

    It has been 153 yr since the last large earthquake on the southern San Andreas fault (California, United States), but the average interseismic interval is only ~100 yr. If the recurrence of large earthquakes is periodic, rather than random or clustered, the length of this period is notable and would generally increase the risk estimated in probabilistic seismic hazard analyses. Unfortunately, robust characterization of a distribution describing earthquake recurrence on a single fault is limited by the brevity of most earthquake records. Here we use statistical tests on a 3000 yr combined record of 29 ground-rupturing earthquakes from Wrightwood, California. We show that earthquake recurrence there is more regular than expected from a Poisson distribution and is not clustered, leading us to conclude that recurrence is quasi-periodic. The observation of unimodal time dependence is persistent across an observationally based sensitivity analysis that critically examines alternative interpretations of the geologic record. The results support formal forecast efforts that use renewal models to estimate probabilities of future earthquakes on the southern San Andreas fault. Only four intervals (15%) from the record are longer than the present open interval, highlighting the current hazard posed by this fault.
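The core statistical idea above, testing whether inter-event intervals are more regular than a Poisson process, can be sketched with the coefficient of variation of the intervals (≈1 for a Poisson process, <1 for quasi-periodic behavior, >1 for clustering). The dates below are hypothetical, not the Wrightwood record:

```python
import numpy as np

def interval_cv(event_years):
    """Coefficient of variation of inter-event intervals:
    ~1 for a Poisson process, <1 quasi-periodic, >1 clustered."""
    intervals = np.diff(np.sort(np.asarray(event_years, dtype=float)))
    return intervals.std(ddof=1) / intervals.mean()

# Hypothetical paleoseismic dates (years CE), roughly every ~100 yr
dates = [850, 960, 1048, 1162, 1251, 1360, 1457, 1560, 1660, 1766, 1857]
print(round(float(interval_cv(dates)), 2))  # well below 1: quasi-periodic
```

The paper's actual analysis is more careful, propagating radiocarbon dating uncertainty through the tests, but the low-cv signature is the same.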

  19. Earthquake-induced water-level fluctuations at Yucca Mountain, Nevada, June 1992

    International Nuclear Information System (INIS)

    O'Brien, G.M.

    1993-01-01

This report presents earthquake-induced water-level and fluid-pressure data for wells in the Yucca Mountain area, Nevada, during June 1992. Three earthquakes occurred which caused significant water-level and fluid-pressure responses in wells. Wells USW H-5 and USW H-6 are continuously monitored to detect short-term responses caused by earthquakes. Two wells, monitored hourly, had significant, longer-term responses in water level following the earthquakes. On June 28, 1992, a 7.5-magnitude earthquake occurred near Landers, California, causing an estimated maximum water-level change of 90 centimeters in well USW H-5. Three hours later, a 6.6-magnitude earthquake occurred near Big Bear Lake, California; the maximum water-level fluctuation was 20 centimeters in well USW H-5. A 5.6-magnitude earthquake occurred at Little Skull Mountain, Nevada, on June 29, approximately 23 kilometers from Yucca Mountain. The maximum estimated short-term water-level fluctuation from the Little Skull Mountain earthquake was 40 centimeters in well USW H-5. The water level in well UE-25p #1, monitored hourly, decreased approximately 50 centimeters over 3 days following the Little Skull Mountain earthquake. The water level in UE-25p #1 returned to pre-earthquake levels in approximately 6 months. The water level in the lower interval of well USW H-3 increased 28 centimeters following the Little Skull Mountain earthquake. The Landers and Little Skull Mountain earthquakes caused responses in 17 intervals of 14 hourly monitored wells; however, most responses were small and of short duration. For several days following the major earthquakes, many smaller magnitude aftershocks occurred, causing measurable responses in the continuously monitored wells.

  20. Comparison Of Seismic Performance Of Erciş Cultural Center Building With Observed And Calculated By Turkish Earthquake Code-2007

    Directory of Open Access Journals (Sweden)

    Recep Ali Dedecan

    2013-08-01

The goal of this paper is to review the validity of the seismic assessment procedure given in the Turkish Earthquake Code by comparing the assessment results with real structures from Eastern Turkey, where the 2011 Van earthquake occurred. To test the analysis methods on a typical suitable structure, the three-story cultural center building at Erciş is selected. In order to compare the results of the three different analysis techniques for an identical earthquake, the ground motion used in the analyses was characterized by equivalent elastic earthquake spectra, developed from the time history available at the nearest construction site. It was found that the damage predictions obtained using the Turkish Earthquake Code procedures indicate different levels of damage, and that nonlinear time history analysis gave the best estimate of the damage observed at the site.

  1. Charter Schools Indicators: A Report from the Center on Educational Governance, University of Southern California. CSI-USC 2008

    Science.gov (United States)

    Center on Educational Governance, 2008

    2008-01-01

    This report, which is the second annual report on charter schools in California by the University of Southern California's (USC's) Center on Educational Governance, offers a unique view of charter school performance. Using both financial and academic data submitted by school districts to the state of California, this report looks well beyond test…

  2. Deformation from the 1989 Loma Prieta earthquake near the southwest margin of the Santa Clara Valley, California

    Science.gov (United States)

    Schmidt, Kevin M.; Ellen, Stephen D.; Peterson, David M.

    2014-01-01

    Damage to pavement and near-surface utility pipes, caused by the 17 October 1989, Loma Prieta earthquake, provides evidence for ground deformation in a 663 km2 area near the southwest margin of the Santa Clara Valley, California (USA). A total of 1427 damage sites, collected from more than 30 sources, are concentrated in four zones, three of which lie near previously mapped faults. In one of these zones, the channel lining of Los Gatos Creek, a 2-km-long concrete strip trending perpendicular to regional geologic structure, was broken by thrusts that were concentrated in two belts, each several tens of meters wide, separated by more than 300 m of relatively undeformed concrete.

  3. Detailed observations of California foreshock sequences: Implications for the earthquake initiation process

    Science.gov (United States)

    Dodge, D.A.; Beroza, G.C.; Ellsworth, W.L.

    1996-01-01

We find that foreshocks provide clear evidence for an extended nucleation process before some earthquakes. In this study, we examine in detail the evolution of six California foreshock sequences: the 1986 Mount Lewis (ML = 5.5), the 1986 Chalfant (ML = 6.4), the 1986 Stone Canyon (ML = 4.7), the 1990 Upland (ML = 5.2), the 1992 Joshua Tree (MW = 6.1), and the 1992 Landers (MW = 7.3) sequences. Typically, uncertainties in hypocentral parameters are too large to establish the geometry of foreshock sequences and hence to understand their evolution. However, the similarity of location and focal mechanisms for the events in these sequences leads to similar foreshock waveforms that we cross correlate to obtain extremely accurate relative locations. We use these results to identify small-scale fault zone structures that could influence nucleation and to determine the stress evolution leading up to the mainshock. In general, these foreshock sequences are not compatible with a cascading failure nucleation model in which the foreshocks all occur on a single fault plane and trigger the mainshock by static stress transfer. Instead, the foreshocks seem to concentrate near structural discontinuities in the fault and may themselves be a product of an aseismic nucleation process. Fault zone heterogeneity may also be important in controlling the number of foreshocks, i.e., the stronger the heterogeneity, the greater the number of foreshocks. The size of the nucleation region, as measured by the extent of the foreshock sequence, appears to scale with mainshock moment in the same manner as determined independently by measurements of the seismic nucleation phase. We also find evidence for slip localization as predicted by some models of earthquake nucleation. Copyright 1996 by the American Geophysical Union.
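The relative-location technique above rests on measuring differential arrival times between highly similar waveforms by cross-correlation. A minimal sketch with synthetic wavelets (not the authors' data or code):

```python
import numpy as np

def cc_lag(w1, w2, dt):
    """Lag (s) of w2 relative to w1 at the peak of the cross-correlation."""
    cc = np.correlate(w2, w1, mode="full")
    return (int(cc.argmax()) - (len(w1) - 1)) * dt

# Synthetic "foreshock doublet": same wavelet, second arrival delayed 0.05 s
dt = 0.01
t = np.arange(0, 2, dt)
wavelet = np.exp(-((t - 1.00) / 0.05) ** 2) * np.sin(2 * np.pi * 10 * t)
delayed = np.exp(-((t - 1.05) / 0.05) ** 2) * np.sin(2 * np.pi * 10 * (t - 0.05))
print(cc_lag(wavelet, delayed, dt))  # ~0.05 s
```

In practice such differential times, measured to sub-sample precision across many station pairs, are inverted for relative hypocenter positions.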

  4. Status of Public Earthquake Early Warning in the U.S

    Science.gov (United States)

    Given, D. D.

    2013-12-01

    Earthquake Early Warning (EEW) is a proven use of seismological science that can give people and businesses outside the epicentral area of a large earthquake up to a minute to take protective actions before the most destructive shaking hits them. Since 2006 several organizations have been collaborating to create such a system in the United States. These groups include the US Geological Survey, Caltech, UC Berkeley, the University of Washington, the Southern California Earthquake Center, the Swiss Federal Institute of Technology, Zürich, the California Office of Emergency Services, and the California Geological Survey. A demonstration version of the system, called ShakeAlert, began sending test notifications to selected users in California in January 2012. In August 2012 San Francisco's Bay Area Rapid Transit district began slowing and stopping trains in response to strong ground shaking. The next step in the project is to progress to a production prototype for the west coast. The system is built on top of the considerable technical and organizational earthquake monitoring infrastructure of the Advanced National Seismic System (ANSS). While a fully functional, robust, public EEW system will require significant new investment and development in several major areas, modest progress is being made with current resources. First, high-quality sensors must be installed with sufficient density, particularly near source faults. Where possible, we are upgrading and augmenting the existing ANSS networks on the west coast. Second, data telemetry from those sensors must be engineered for speed and reliability. Next, robust central processing infrastructure is being designed and built. Also, computer algorithms to detect and characterize the evolving earthquake must be further developed and tested. Last year the Gordon and Betty Moore Foundation funded USGS, Caltech, UCB and UW to accelerate R&D efforts. Every available means of distributing alerts must be used to insure the

  5. New fault picture points toward San Francisco Bay area earthquakes

    Science.gov (United States)

    Kerr, R. A.

    1989-01-01

Recent earthquakes and a new way of looking at faults suggest that damaging earthquakes are closing in on the San Francisco area. Earthquake Awareness Week 1989 in northern California started off with a bang on Monday, 3 April, when a magnitude 4.8 earthquake struck 15 kilometers northeast of San Jose. The relatively small shock (its primary damage was the shattering of an air-control tower window) got the immediate attention of three U.S. Geological Survey seismologists in Menlo Park near San Francisco. David Oppenheimer, William Bakun, and Allan Lindh had forecast a nearby earthquake in a just-completed report, and this, they thought, might be it.

  6. Impact of the Northridge earthquake on the mental health of veterans: results from a panel study.

    Science.gov (United States)

    Dobalian, Aram; Stein, Judith A; Heslin, Kevin C; Riopelle, Deborah; Venkatesh, Brinda; Lanto, Andrew B; Simon, Barbara; Yano, Elizabeth M; Rubenstein, Lisa V

    2011-09-01

    The 1994 earthquake that struck Northridge, California, led to the closure of the Veterans Health Administration Medical Center at Sepulveda. This article examines the earthquake's impact on the mental health of an existing cohort of veterans who had previously used the Sepulveda Veterans Health Administration Medical Center. From 1 to 3 months after the disaster, trained interviewers made repeated attempts to contact participants by telephone to administer a repeated measures follow-up design survey based on a survey that had been done preearthquake. Postearthquake data were obtained on 1144 of 1800 (64%) male veterans for whom there were previous data. We tested a predictive latent variable path model of the relations between sociodemographic characteristics, predisaster physical and emotional health measures, and postdisaster emotional health and perceived earthquake impact. Perceived earthquake impact was predicted by predisaster emotional distress, functional limitations, and number of health conditions. Postdisaster emotional distress was predicted by preexisting emotional distress and earthquake impact. The regression coefficient from earthquake impact to postearthquake emotional distress was larger than that of the stability coefficient from preearthquake emotional distress. Postearthquake emotional distress also was affected indirectly by preearthquake emotional distress, health conditions, younger age, and lower socioeconomic status. The postdisaster emotional health of veterans who experienced greater earthquake impact would have likely benefited from postdisaster intervention, regardless of their predisaster emotional health. Younger veterans and veterans with generally poor physical and emotional health were more vulnerable to greater postearthquake emotional distress. Veterans of lower socioeconomic status were disproportionately likely to experience more effects of the disaster because they had more predisaster emotional distress, more functional

  7. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes at ever larger scales. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.
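TS-AWP itself is a 3-D staggered-grid anelastic code; as a toy illustration of the kind of explicit finite-difference update such codes are built on, here is a 1-D scalar wave propagation loop with hypothetical grid parameters (no attenuation, fixed boundaries):

```python
import numpy as np

# Toy 1-D wave propagation by explicit finite differences
# (illustration only; TS-AWP is a 3-D staggered-grid anelastic code).
nx, nt = 400, 800
dx, dt, c = 10.0, 1e-3, 3000.0     # grid step (m), time step (s), velocity (m/s)
assert c * dt / dx <= 1.0          # CFL stability condition
r2 = (c * dt / dx) ** 2

u_prev = np.zeros(nx)
u = np.zeros(nx)
u[nx // 2] = 1.0                   # impulsive source in the middle

for _ in range(nt):
    u_next = np.zeros(nx)
    # second-order update: u_tt = c^2 u_xx, discretized on interior points
    u_next[1:-1] = 2 * u[1:-1] - u_prev[1:-1] + r2 * (u[2:] - 2 * u[1:-1] + u[:-2])
    u_prev, u = u, u_next

print(np.isfinite(u).all())
```

Parallel versions of such loops decompose the grid across processors and exchange boundary planes each step, which is where the strong-scaling engineering effort described above is spent.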

  8. Seismogeodetic monitoring techniques for tsunami and earthquake early warning and rapid assessment of structural damage

    Science.gov (United States)

    Haase, J. S.; Bock, Y.; Saunders, J. K.; Goldberg, D.; Restrepo, J. I.

    2016-12-01

As part of an effort to promote the use of NASA-sponsored Earth science information for disaster risk reduction, real-time high-rate seismogeodetic data are being incorporated into early warning and structural monitoring systems. Seismogeodesy combines seismic acceleration and GPS displacement measurements using a tightly coupled Kalman filter to provide absolute estimates of seismic acceleration, velocity and displacement. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and tide gauges and deep-ocean buoys for direct measurement of tsunami waves. Real-time seismogeodetic observations at subduction zones allow for more robust and rapid magnitude and slip estimation that increase warning time in the near-source region. A NASA-funded effort to utilize GPS and seismogeodesy in NOAA's Tsunami Warning Centers in Alaska and Hawaii integrates new modules for picking, locating, and estimating magnitudes and moment tensors for earthquakes into the USGS earthworm environment at the TWCs. In a related project, NASA supports the transition of this research to seismogeodetic tools for disaster preparedness, specifically by implementing GPS and low-cost MEMS accelerometers for structural monitoring in partnership with earthquake engineers. Real-time high-rate seismogeodetic structural monitoring has been implemented on two structures. The first is a parking garage at the Autonomous University of Baja California Faculty of Medicine in Mexicali, not far from the rupture of the 2010 Mw 7.2 El Mayor-Cucapah earthquake; the installation was enabled through a UC MEXUS collaboration. The second is the 8-story Geisel Library at the University of California, San Diego (UCSD). The system has also been installed for several proof-of-concept experiments at the UCSD Network for Earthquake Engineering Simulation (NEES) Large High Performance Outdoor Shake Table.
We present MEMS-based seismogeodetic observations from the 10 June
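A tightly coupled seismogeodetic filter of the kind described above can be sketched in one dimension: accelerometer samples drive the state prediction and GPS displacements supply the measurement update. The noise parameters and synthetic signal below are hypothetical, not the operational filter:

```python
import numpy as np

def seismogeodetic_kf(accel, gps_disp, dt, q=1e-4, r=1e-4):
    """Minimal 1-D Kalman filter: accelerometer drives the prediction,
    GPS displacement is the update. State x = [displacement, velocity]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    H = np.array([[1.0, 0.0]])              # GPS observes displacement
    Q = q * np.eye(2)
    x = np.zeros(2)
    P = np.eye(2)
    out = []
    for a, d in zip(accel, gps_disp):
        # predict with the accelerometer input
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        # update with the GPS displacement observation
        K = P @ H.T / (H @ P @ H.T + r)
        x = x + (K * (d - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

# Synthetic test: constant 1 m/s^2 acceleration, noisy GPS of d = 0.5 t^2
dt = 0.01
t = np.arange(0, 5, dt)
rng = np.random.default_rng(0)
gps = 0.5 * t**2 + 0.01 * rng.standard_normal(t.size)
disp = seismogeodetic_kf(np.ones(t.size), gps, dt)
```

The operational filters are more elaborate (bias states, tuned noise models, full 3-D), but the fusion principle is the same: drift-free GPS anchors the doubly integrated accelerometer.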

  9. Adaptively smoothed seismicity earthquake forecasts for Italy

    Directory of Open Access Journals (Sweden)

    Yan Y. Kagan

    2010-11-01

We present a model for estimation of the probabilities of future earthquakes of magnitudes m ≥ 4.95 in Italy. This model is a modified version of that proposed for California, USA, by Helmstetter et al. [2007] and Werner et al. [2010a], and it approximates seismicity using a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled. Magnitudes are independently and identically distributed according to a tapered Gutenberg-Richter magnitude distribution. We have estimated the spatial distribution of future seismicity by smoothing the locations of past earthquakes listed in two Italian catalogs: a short instrumental catalog, and a longer instrumental and historic catalog. The bandwidth of the adaptive spatial kernel is estimated by optimizing the predictive power of the kernel estimate of the spatial earthquake density in retrospective forecasts. When available and reliable, we used small earthquakes of m ≥ 2.95 to reveal active fault structures and probable future epicenters. By calibrating the model with these two catalogs of different durations to create two forecasts, we intend to quantify the loss (or gain) of predictability incurred when only a short, but recent, data record is available. Both forecasts were scaled to five and ten years, and have been submitted to the Italian prospective forecasting experiment of the global Collaboratory for the Study of Earthquake Predictability (CSEP). An earlier forecast from the model was submitted by Helmstetter et al. [2007] to the Regional Earthquake Likelihood Model (RELM) experiment in California, and with more than half of the five-year experimental period over, the forecast has performed better than the others.
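The adaptive kernel described above can be sketched by giving each epicenter a Gaussian bandwidth set by the distance to its k-th nearest neighboring event, so dense clusters are smoothed sharply and sparse regions broadly. This is an illustrative reimplementation with hypothetical epicenters, not the authors' optimized code:

```python
import numpy as np

def adaptive_rate(epicenters, grid, k=2):
    """Smoothed seismicity: each event gets a 2-D Gaussian kernel whose
    bandwidth is the distance to its k-th nearest neighboring event."""
    epicenters = np.asarray(epicenters, dtype=float)
    d = np.linalg.norm(epicenters[:, None, :] - epicenters[None, :, :], axis=-1)
    band = np.sort(d, axis=1)[:, k]        # k-th nearest neighbor distance
    band = np.maximum(band, 1e-6)          # guard against coincident events
    rate = np.zeros(len(grid))
    for c, h in zip(epicenters, band):
        r2 = np.sum((np.asarray(grid, dtype=float) - c) ** 2, axis=1)
        rate += np.exp(-r2 / (2 * h**2)) / (2 * np.pi * h**2)
    return rate

# Hypothetical epicenters clustered near (0, 0), one isolated at (5, 5)
events = [(0, 0), (0.1, 0.1), (-0.1, 0.05), (5, 5)]
grid = np.array([(0.0, 0.0), (5.0, 5.0), (2.5, 2.5)])
rate = adaptive_rate(events, grid)
print(rate.argmax())  # highest forecast rate at the cluster near the origin
```

In the actual model, k (and hence the bandwidth) is chosen by maximizing the retrospective predictive likelihood, as the abstract describes.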

  10. The Landers earthquake; preliminary instrumental results

    Science.gov (United States)

    Jones, L.; Mori, J.; Hauksson, E.

    1992-01-01

Early on the morning of June 28, 1992, millions of people in southern California were awakened by the largest earthquake to occur in the western United States in the past 40 yrs. At 4:58 a.m. PDT (local time), faulting associated with the magnitude 7.3 earthquake broke through to the Earth's surface near the town of Landers, California. The surface rupture then propagated 70 km (45 mi) to the north and northwest along a band of faults passing through the middle of the Mojave Desert. Fortunately, the strongest shaking occurred in uninhabited regions of the Mojave Desert. Still, one child was killed in Yucca Valley, and about 400 people were injured in the surrounding area. The desert communities of Landers, Yucca Valley, and Joshua Tree in San Bernardino County suffered considerable damage to buildings and roads. Damage to water and power lines caused problems in many areas.

  11. Parallel Earthquake Simulations on Large-Scale Multicore Supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-01-01

Earthquakes are one of the most destructive natural hazards on our planet Earth. Huge earthquakes striking offshore may cause devastating tsunamis, as evidenced by the 11 March 2011 Japan (moment magnitude Mw 9.0) and the 26 December 2004 Sumatra (Mw 9.1) earthquakes. Earthquake prediction (in terms of the precise time, place, and magnitude of a coming earthquake) is arguably unfeasible in the foreseeable future. To mitigate seismic hazards from future earthquakes in earthquake-prone areas, such as California and Japan, scientists have been using numerical simulations to study earthquake rupture propagation along faults and seismic wave propagation in the surrounding media on ever-advancing modern computers over the past several decades. In particular, ground motion simulations for past and future (possible) significant earthquakes have been performed to understand factors that affect ground shaking in populated areas, and to provide ground shaking characteristics and synthetic seismograms for emergency preparation and design of earthquake-resistant structures. These simulation results can guide the development of more rational seismic provisions, leading to safer, more efficient, and economical structures in earthquake-prone regions.

  12. Electrical resistivity variations associated with earthquakes on the san andreas fault.

    Science.gov (United States)

    Mazzella, A; Morrison, H F

    1974-09-06

    A 24 percent precursory change in apparent electrical resistivity was observed before a magnitude 3.9 earthquake of strike-slip nature on the San Andreas fault in central California. The experimental configuration and numerical calculations suggest that the change is associated with a volume at depth rather than some near-surface phenomenon. The character and duration of the precursor period agree well with those of other earthquake studies and support a dilatant earthquake mechanism model.

  13. Seismic experience in power and industrial facilities as it relates to small magnitude earthquakes

    International Nuclear Information System (INIS)

    Swan, S.W.; Horstman, N.G.

    1987-01-01

The database on the performance of power and industrial facilities in small magnitude earthquakes (M = 4.0-5.5) is potentially very large. In California alone, many earthquakes in this magnitude range occur every year, often near industrial areas. In 1986, for example, there were 76 earthquakes between Richter magnitude 4.0 and 5.5 in northern California alone. Experience has shown that the effects of small magnitude earthquakes are seldom significant to well-engineered facilities. (The term well-engineered is here defined to include most modern industrial installations, as well as power plants and substations.) Therefore, detailed investigations of small magnitude earthquakes are normally not considered worthwhile. The purpose of this paper is to review the tendency toward seismic damage of equipment installations representative of nuclear power plant safety systems. Estimates are made of the thresholds of seismic damage to certain types of equipment in terms of conventional means of measuring the damage potential of an earthquake. The objective is to define thresholds of damage that can be correlated with Richter magnitude. In this manner, an earthquake magnitude might be chosen below which damage to nuclear plant safety systems is not considered credible.

  14. Earthquakes: Risk, Monitoring, Notification, and Research

    Science.gov (United States)

    2008-06-19

States are as much as 30% lower for certain types of ground motion, called long-period seismic waves, which affect taller, multistory buildings. Ground ... jump between connected faults. Earthquakes that occur along the Sierra Madre Fault in southern California, for example, could trigger a series of

  15. A record of large earthquakes during the past two millennia on the southern Green Valley Fault, California

    Science.gov (United States)

    Lienkaemper, James J.; Baldwin, John N.; Turner, Robert; Sickler, Robert R.; Brown, Johnathan

    2013-01-01

We document evidence for surface-rupturing earthquakes (events) at two trench sites on the southern Green Valley fault, California (SGVF). The 75-80 km long dextral SGVF creeps ~1-4 mm/yr. We identify stratigraphic horizons disrupted by upward-flowering shears and in-filled fissures unlikely to have formed from creep alone. The Mason Rd site exhibits four events from ~1013 CE to the present. The Lopes Ranch site (LR, 12 km to the south) exhibits three events from 18 BCE to the present, including the most recent event (MRE), 1610 ±52 yr CE (1σ), and a two-event interval (18 BCE-238 CE) isolated by a millennium of low deposition. Using OxCal to model the timing of the 4-event earthquake sequence from radiocarbon data and the LR MRE yields a mean recurrence interval (RI or μ) of 199 ±82 yr (1σ) and ±35 yr (standard error of the mean), the first based on geologic data. The time since the most recent earthquake (open interval since the MRE) is 402 ±52 yr, well past μ ~200 yr. The shape of the probability density function (pdf) of the average RI from OxCal resembles a Brownian Passage Time (BPT) pdf (rather than a normal one) that permits rarer, longer ruptures potentially involving the Berryessa and Hunting Creek sections of the northernmost GVF. The model coefficient of variation (cv, σ/μ) is 0.41, but a larger value (cv ~0.6) fits better when using BPT. A BPT pdf with μ of 250 yr and cv of 0.6 yields 30-yr rupture probabilities of 20-25%, versus a Poisson probability of 11-17%.
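The closing comparison can be illustrated from the abstract's own numbers (μ = 250 yr, cv = 0.6, 402-yr open interval) using the closed-form CDF of the BPT (inverse Gaussian) distribution. This is an illustrative recomputation, not the authors' code:

```python
from math import erf, exp, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bpt_cdf(t, mean, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean `mean` and aperiodicity (cv) `alpha`."""
    a = sqrt(t / mean)
    return (norm_cdf((a - 1 / a) / alpha)
            + exp(2.0 / alpha**2) * norm_cdf(-(a + 1 / a) / alpha))

mean, cv, elapsed, window = 250.0, 0.6, 402.0, 30.0

# Conditional probability of rupture in the next 30 yr given 402 open years
num = bpt_cdf(elapsed + window, mean, cv) - bpt_cdf(elapsed, mean, cv)
p_bpt = num / (1.0 - bpt_cdf(elapsed, mean, cv))
p_poisson = 1.0 - exp(-window / mean)
print(round(p_bpt, 2), round(p_poisson, 2))
```

With these single parameter values the BPT probability lands near the bottom of the quoted 20-25% range and the Poisson value near 11%; the ranges in the abstract presumably reflect the uncertainty in μ.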

  16. The Loma Prieta, California, Earthquake of October 17, 1989: Societal Response

    Science.gov (United States)

    Coordinated by Mileti, Dennis S.

    1993-01-01

Professional Paper 1553 describes how people and organizations responded to the earthquake and how the earthquake impacted people and society. The investigations evaluate the tools available to the research community to measure the nature, extent, and causes of damage and losses. They describe human behavior during and immediately after the earthquake and how citizens participated in emergency response. They review the challenges confronted by police and fire departments and disruptions to transbay transportation systems. And they survey the challenges of post-earthquake recovery. Some significant findings were: * Loma Prieta provided the first test of ATC-20, the red, yellow, and green tagging of buildings. Its successful application has led to widespread use in other disasters, including the September 11, 2001, New York City terrorist incident. * Most people responded calmly and without panic to the earthquake and acted to get themselves to a safe location. * Actions by people to help alleviate emergency conditions were proportional to the level of need at the community level. * Some solutions caused problems of their own. The police perimeter around the Cypress Viaduct isolated businesses from their customers, leading to a loss of business, and the evacuation of employees from those businesses hindered the movement of supplies to the disaster scene. * Emergency transbay ferry service was established 6 days after the earthquake, but required constant revision of service contracts and schedules. * The Loma Prieta earthquake produced minimal disruption to the regional economy. The total economic disruption resulted in maximum losses to the Gross Regional Product of $725 million in 1 month and $2.9 billion in 2 months, but 80% of the loss was recovered during the first 6 months of 1990. Approximately 7,100 workers were laid off.

  17. Designing and Implementing a Retrospective Earthquake Detection Framework at the U.S. Geological Survey National Earthquake Information Center

    Science.gov (United States)

    Patton, J.; Yeck, W.; Benz, H.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking, and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data, along with a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival-time picking, social-media-based event detection, and automatic association of different seismic detection data into earthquake events. In addition, this framework enables retrospective detection processing, such as automated S-wave arrival-time picking for a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and verification of aftershock and induced-sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. The same infrastructure also provides an improved and convenient structure for access to automatic detection data for both research and algorithm development.
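
    One basic task the framework supports, automatic arrival-time picking, is commonly bootstrapped with a short-term-average/long-term-average (STA/LTA) trigger on waveform energy. A minimal single-trace sketch on synthetic data; the window lengths and threshold are illustrative assumptions, not NEIC operational values:

```python
import numpy as np

def sta_lta(trace, fs, sta_win=1.0, lta_win=10.0):
    """Return the STA/LTA ratio of a 1-D waveform sampled at fs Hz."""
    nsta = int(sta_win * fs)
    nlta = int(lta_win * fs)
    energy = trace.astype(float) ** 2
    # A cumulative sum gives each trailing-window average in O(1).
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    ratio = np.zeros(len(trace))
    for i in range(nlta, len(trace)):
        sta = (csum[i + 1] - csum[i + 1 - nsta]) / nsta
        lta = (csum[i + 1] - csum[i + 1 - nlta]) / nlta
        ratio[i] = sta / lta if lta > 0 else 0.0
    return ratio

# Synthetic trace: background noise with a higher-amplitude arrival at sample 2000.
rng = np.random.default_rng(0)
fs = 100.0
trace = np.concatenate([rng.normal(0, 1, 2000),
                        rng.normal(0, 10, 500),
                        rng.normal(0, 1, 1500)])
ratio = sta_lta(trace, fs)
trigger = int(np.argmax(ratio > 5.0))   # first sample above threshold
```

    Here the trigger lands within a few tens of samples of the onset; operational pickers refine such triggers with multi-band filtering and phase identification, as the abstract describes.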

  18. Identified EM Earthquake Precursors

    Science.gov (United States)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound earthquake forecasting method and, in turn, warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement, either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field, occurring with associated phases, using custom-built loop antennae. Field testing was conducted in Southern California from 2006 to 2011, and outside the NE Texas town of Timpson in February 2013. The antennae have mobility and observations were noted for

  19. Parallel Earthquake Simulations on Large-Scale Multicore Supercomputers

    KAUST Repository

    Wu, Xingfu; Duan, Benchun; Taylor, Valerie

    2011-01-01

    , such as California and Japan, scientists have been using numerical simulations to study earthquake rupture propagation along faults and seismic wave propagation in the surrounding media on ever-advancing modern computers over the past several decades. In particular

  20. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Lifelines

    Science.gov (United States)

    Schiff, Anshel J.

    1998-01-01

    To the general public who had their televisions tuned to watch the World Series, the 1989 Loma Prieta earthquake was a lifelines earthquake. It was the images seen around the world of the collapsed Cypress Street viaduct, with the frantic and heroic efforts to pull survivors from the structure that was billowing smoke; the collapsed section of the San Francisco-Oakland Bay Bridge and subsequent home video of a car plunging off the open span; and the spectacular fire in the Marina District of San Francisco fed by a broken gas line. To many of the residents of the San Francisco Bay region, the relation of lifelines to the earthquake was characterized by sitting in the dark because of power outage, the inability to make telephone calls because of network congestion, and the slow and snarled traffic. Had the public been aware of the actions of the engineers and tradespeople working for the utilities and other lifeline organizations on the emergency response and restoration of lifelines, the lifeline characteristics of this earthquake would have been even more significant. Unobserved by the public were the warlike devastation in several electrical-power substations, the 13 miles of gas-distribution lines that had to be replaced in several communities, and the more than 1,200 leaks and breaks in water mains and service connections that had to be excavated and repaired. Like the 1971 San Fernando, Calif., earthquake, which was a seminal event for activity to improve the earthquake performance of lifelines, the 1989 Loma Prieta earthquake demonstrated that the tasks of preparing lifelines in 'earthquake country' were incomplete; indeed, new lessons had to be learned.

  1. Earthquake experience suggests new approach to seismic criteria

    International Nuclear Information System (INIS)

    Knox, R.

    1983-01-01

    Progress in seismic qualification of nuclear power plants as reviewed at the 4th Pacific Basin Nuclear Conference in Vancouver, September 1983, is discussed. The lack of experience of earthquakes in existing nuclear plants can be compensated for by the growing experience of actual earthquake effects in conventional power plants and similar installations. A survey of the effects on four power stations, with a total of twenty generating units, in the area strongly shaken by the San Fernando earthquake in California in 1971 is reported. The Canadian approach to seismic qualification, international criteria, Canadian/Korean experience, safety-related equipment, the Tadotsu test facility and seismic tests are discussed. (U.K.)

  2. The SCEC/USGS dynamic earthquake rupture code verification exercise

    Science.gov (United States)

    Harris, R.A.; Barall, M.; Archuleta, R.; Dunham, E.; Aagaard, Brad T.; Ampuero, J.-P.; Bhat, H.; Cruz-Atienza, Victor M.; Dalguer, L.; Dawson, P.; Day, S.; Duan, B.; Ely, G.; Kaneko, Y.; Kase, Y.; Lapusta, N.; Liu, Yajing; Ma, S.; Oglesby, D.; Olsen, K.; Pitarka, A.; Song, S.; Templeton, E.

    2009-01-01

    Numerical simulations of earthquake rupture dynamics are now common, yet it has been difficult to test the validity of these simulations because there have been few field observations and no analytic solutions with which to compare the results. This paper describes the Southern California Earthquake Center/U.S. Geological Survey (SCEC/USGS) Dynamic Earthquake Rupture Code Verification Exercise, where codes that simulate spontaneous rupture dynamics in three dimensions are evaluated and the results produced by these codes are compared using Web-based tools. This is the first time that a broad and rigorous examination of numerous spontaneous rupture codes has been performed, a significant advance in this science. The automated process developed to attain this achievement provides for a future where testing of codes is easily accomplished. Scientists who use computer simulations to understand earthquakes utilize a range of techniques. Most of these assume that earthquakes are caused by slip at depth on faults in the Earth, but beyond that the strategies vary. Among the methods used in earthquake mechanics studies are kinematic approaches and dynamic approaches. The kinematic approach uses a computer code that prescribes the spatial and temporal evolution of slip on the causative fault (or faults). These types of simulations are very helpful, especially since they can be used in seismic data inversions to relate the ground motions recorded in the field to slip on the fault(s) at depth. However, these kinematic solutions generally provide no insight into the physics driving the fault slip or information about why the involved fault(s) slipped that much (or that little). In other words, these kinematic solutions may lack information about the physical dynamics of earthquake rupture that will be most helpful in forecasting future events. To help address this issue, some researchers use computer codes to numerically simulate earthquakes and construct dynamic, spontaneous

  3. The 2015 Fillmore earthquake swarm and possible crustal deformation mechanisms near the bottom of the eastern Ventura Basin, California

    Science.gov (United States)

    Hauksson, Egill; Andrews, Jennifer; Plesch, Andreas; Shaw, John H.; Shelly, David R.

    2016-01-01

    The 2015 Fillmore swarm occurred about 6 km west of the city of Fillmore in Ventura County, California, and was located beneath the eastern part of the actively subsiding Ventura basin at depths from 11.8 to 13.8 km, similar to two previous swarms in the area. Template‐matching event detection showed that it started on 5 July 2015 at 2:21 UTC with an M∼1.0 earthquake. The swarm exhibited unusual episodic spatial and temporal migrations and unusual diversity in the nodal planes of the focal mechanisms as compared to the simple hypocenter‐defined plane. It was also noteworthy because it consisted of >1400 events of M≥0.0, with M 2.8 being the largest event. We suggest that fluids released by metamorphic dehydration processes, migration of fluids along a detachment zone, and cascading asperity failures caused this prolific earthquake swarm, but other mechanisms (such as simple mainshock–aftershock stress triggering or a regional aseismic creep event) are less likely. Dilatant strengthening may be the mechanism behind the temporal decay of the swarm, as the pore-pressure drop increases the effective normal stress and counteracts the instability driving the swarm.
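
    Template matching of the kind used to build the swarm catalog is, at its core, a sliding normalized cross-correlation of a known event waveform against continuous data. A minimal single-channel sketch on synthetic data; the threshold and waveform are illustrative, and production detectors correlate many stations and channels and stack the results:

```python
import numpy as np

def match_template(data, template, threshold=0.6):
    """Return (offset, correlation) pairs where the normalized
    cross-correlation of the template against the data exceeds threshold."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    hits = []
    for i in range(len(data) - n + 1):
        w = data[i:i + n]
        ws = w.std()
        if ws == 0:
            continue
        cc = float(np.dot((w - w.mean()) / ws, t) / n)
        if cc >= threshold:
            hits.append((i, cc))
    return hits

# Synthetic data: two scaled copies of the template buried in noise.
rng = np.random.default_rng(1)
template = np.sin(np.linspace(0, 8 * np.pi, 200)) * np.hanning(200)
data = rng.normal(0, 0.2, 5000)
data[1000:1200] += template
data[3000:3200] += 0.7 * template
hits = match_template(data, template)   # detections cluster near 1000 and 3000
```

    Because the statistic is normalized, the smaller second event is detected as readily as the first, which is why template matching can push catalogs well below the network's nominal detection threshold.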

  4. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    Science.gov (United States)

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential - the magnitude of the earthquake the region is expected to experience. From this, we introduce a new type of time-dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS-type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f-type, problems can be constrained from scaling relations and finite extents. [Figure: Record-breaking hazard map of southern California, 2012-08-06. "Warm" colors indicate local acceleration (elevated hazard
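
    The Gutenberg-Richter relation underlying this kind of catalog partitioning can be used directly to relate a study region's event rate to its magnitude potential. A small sketch; the a- and b-values are hypothetical, not taken from the abstract:

```python
import math

def gr_rate(m, a, b):
    """Gutenberg-Richter: annual rate of events with magnitude >= m."""
    return 10.0 ** (a - b * m)

def magnitude_potential(a, b, years):
    """Magnitude for which one event of that size or larger is expected
    within the given window: solve 10**(a - b*m) * years = 1 for m."""
    return (a + math.log10(years)) / b

# Hypothetical regional values a = 4.0, b = 1.0:
m30 = magnitude_potential(4.0, 1.0, 30.0)   # about M 5.5 expected in 30 yr
```

    Shrinking the spatial bin lowers a (fewer events), which lowers the magnitude potential of that bin; this is the sense in which the size of a study area constrains the magnitude it can be expected to produce.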

  5. Potential Effects of a Scenario Earthquake on the Economy of Southern California: Baseline County-Level Migration Characteristics and Trends 1995-2000 and 2001-2010

    Science.gov (United States)

    Sherrouse, Benson C.; Hester, David J.

    2008-01-01

    The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards. In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report uses historical, estimated, and projected population data from several Federal and State data sources to estimate baseline characteristics and trends of the region's population migration (that is, changes in a person's place of residence over time). The analysis characterizes migration by various demographic, economic, family, and household variables for the period 1995-2000. It also uses existing estimates (beginning in 2001) of the three components of population change - births, deaths, and migration - to extrapolate near-term projections of county-level migration trends through 2010. The 2010 date was chosen to provide baseline projections corresponding to a two-year recovery period following the November 2008 date that was selected for the occurrence of the ShakeOut Scenario earthquake. The baseline characteristics and projections will assist in evaluating the effects of inflow and outflow migration trends for alternative futures in which the simulated M7.8 earthquake either does or does not occur, and the impact of the event on housing and jobs, as well as changes in community composition and the regional economy based on the dispersion of intellectual, physical, economic, and cultural capital.

  6. Presentation of the National Center for Research in Vocational Education [Berkeley, California] at the AVA Annual Conference.

    Science.gov (United States)

    National Center for Research in Vocational Education, Berkeley, CA.

    This collection contains the following conference presentations about the National Center for Research in Vocational Education at the University of California at Berkeley: "Visions and Principles" (Charles Benson); "How the Center Sees Its Role" (Gordon Swanson); "The Research Agenda" (Sue Berryman); "The Service…

  7. State Emergency Response and Field Observation Activities in California (USA) during the March 11, 2011, Tohoku Tsunami

    Science.gov (United States)

    Miller, K. M.; Wilson, R. I.; Goltz, J.; Fenton, J.; Long, K.; Dengler, L.; Rosinski, A.; California Tsunami Program

    2011-12-01

    This poster will present an overview of successes and challenges observed by the authors during this major tsunami response event. The Tohoku, Japan, tsunami was the most costly to affect California since the 1964 Alaskan earthquake and ensuing tsunami. The Tohoku tsunami caused at least $50 million in damage to public facilities in harbors and marinas along the coast of California, and resulted in one fatality. It was generated by a magnitude 9.0 earthquake that occurred at 9:46 PM PST on Thursday, March 10, 2011, in the sea off northern Japan. The tsunami was recorded at tide gages monitored by the West Coast/Alaska Tsunami Warning Center (WCATWC), which projected tsunami surges would reach California in approximately 10 hours. At 12:51 AM on March 11, 2011, based on forecasted tsunami amplitudes, the WCATWC placed the California coast north of Point Conception (Santa Barbara County) in a Tsunami Warning, and the coast south of Point Conception to the Mexican border in a Tsunami Advisory. The California Emergency Management Agency (CalEMA) activated two Regional Emergency Operation Centers (REOCs) and the State Operation Center (SOC). The California Geological Survey (CGS) deployed a field team which collected data before, during and after the event through an information clearinghouse. Conference calls were conducted hourly between the WCATWC and State Warning Center, as well as with emergency managers in the 20 coastal counties. Coordination focused on local response measures, public information messaging, assistance needs, evacuations, emergency shelters, damage, and recovery issues. In the early morning hours, some communities in low-lying areas recommended evacuation for their citizens, and the fishing fleet at Crescent City evacuated to sea. The greatest damage occurred in the harbors of Crescent City and Santa Cruz. As with any emergency, there were lessons learned and important successes in managing this event. Forecasts by the WCATWC were highly accurate

  8. The 2014 update to the National Seismic Hazard Model in California

    Science.gov (United States)

    Powers, Peter; Field, Edward H.

    2015-01-01

    The 2014 update to the U. S. Geological Survey National Seismic Hazard Model in California introduces a new earthquake rate model and new ground motion models (GMMs) that give rise to numerous changes to seismic hazard throughout the state. The updated earthquake rate model is the third version of the Uniform California Earthquake Rupture Forecast (UCERF3), wherein the rates of all ruptures are determined via a self-consistent inverse methodology. This approach accommodates multifault ruptures and reduces the overprediction of moderate earthquake rates exhibited by the previous model (UCERF2). UCERF3 introduces new faults, changes to slip or moment rates on existing faults, and adaptively smoothed gridded seismicity source models, all of which contribute to significant changes in hazard. New GMMs increase ground motion near large strike-slip faults and reduce hazard over dip-slip faults. The addition of very large strike-slip ruptures and decreased reverse fault rupture rates in UCERF3 further enhances these effects.

  9. Future Earth: Reducing Loss By Automating Response to Earthquake Shaking

    Science.gov (United States)

    Allen, R. M.

    2014-12-01

    Earthquakes pose a significant threat to society in the U.S. and around the world. The risk is easily forgotten given the infrequent recurrence of major damaging events, yet the likelihood of a major earthquake in California in the next 30 years is greater than 99%. As our societal infrastructure becomes ever more interconnected, the potential impacts of these future events are difficult to predict. Yet the same interconnected infrastructure also allows us to rapidly detect earthquakes as they begin and provide seconds, tens of seconds, or a few minutes of warning. A demonstration earthquake early warning system is now operating in California and is being expanded to the west coast (www.ShakeAlert.org). In recent earthquakes in the Los Angeles region, alerts were generated that could have provided warning to the vast majority of Angelenos who experienced the shaking. Efforts are underway to build a public system. Smartphone technology will be used not only to issue the alerts but could also be used to collect data and improve the warnings. The MyShake project at UC Berkeley is currently testing an app that attempts to turn millions of smartphones into earthquake detectors. As our development of the technology continues, we can anticipate ever-more-automated response to earthquake alerts. Already, the BART system in the San Francisco Bay Area automatically stops trains based on the alerts. In the future, elevators will stop, machinery will pause, hazardous materials will be isolated, and self-driving cars will pull over to the side of the road. In this presentation we will review the current status of the earthquake early warning system in the US. We will illustrate how smartphones can contribute to the system. Finally, we will review applications of the information to reduce future losses.
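
    The warning time such a system can deliver is set by simple kinematics: alerts derived from the first detections race the slower, damaging S waves to the user. A back-of-the-envelope sketch; the shear-wave speed and alerting latency are typical assumed values, not figures from the abstract:

```python
def warning_time(dist_km, vs=3.5, latency_s=4.0):
    """Seconds of warning before the S wave arrives at a site dist_km
    from the source, given a fixed detection-plus-alert latency.
    vs is a typical crustal shear-wave speed in km/s."""
    return max(0.0, dist_km / vs - latency_s)

w = warning_time(100.0)   # roughly 24.6 s of warning at 100 km
```

    Sites closer than about vs * latency_s (14 km with these values) fall in the "blind zone" and receive no warning, which is why dense station coverage and fast processing matter so much.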

  10. Lessons learned from the 1994 Northridge Earthquake

    International Nuclear Information System (INIS)

    Eli, M.W.; Sommer, S.C.

    1995-01-01

    Southern California has a history of major earthquakes and also has one of the largest metropolitan areas in the United States. The 1994 Northridge Earthquake challenged the industrial facilities and lifeline infrastructure in the northern Los Angeles (LA) area. Lawrence Livermore National Laboratory (LLNL) sent a team of engineers to conduct an earthquake damage investigation in the Northridge area, on a project funded jointly by the United States Nuclear Regulatory Commission (USNRC) and the United States Department of Energy (USDOE). Many of the structures, systems, and components (SSCs) and lifelines that suffered damage are similar to those found in nuclear power plants and in USDOE facilities. Lessons learned from these experiences can have some applicability at commercial nuclear power plants

  11. Surface to 90 km winds for Kennedy Space Center, Florida, and Vandenberg AFB, California

    Science.gov (United States)

    Johnson, D. L.; Brown, S. C.

    1979-01-01

    Bivariate normal wind statistics for a 90 degree flight azimuth, from 0 through 90 km altitude, for Kennedy Space Center, Florida, and Vandenberg AFB, California are presented. Wind probability distributions and statistics for any rotation of axes can be computed from the five given parameters.
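
    Assuming the five given parameters are the two component means, the two standard deviations, and the correlation coefficient, statistics for any rotation of axes follow from the usual linear transformation of a bivariate normal (new covariance matrix R Σ Rᵀ for rotation matrix R). A sketch of that computation:

```python
import math

def rotate_wind_stats(mu_u, mu_v, s_u, s_v, r, theta_deg):
    """Rotate bivariate-normal wind statistics (component means, standard
    deviations, and correlation) into axes turned by theta_deg."""
    th = math.radians(theta_deg)
    c, s = math.cos(th), math.sin(th)
    cov = r * s_u * s_v
    mu1 = mu_u * c + mu_v * s
    mu2 = -mu_u * s + mu_v * c
    # New covariance matrix is R * Sigma * R^T for the rotation matrix R.
    var1 = s_u**2 * c**2 + 2.0 * cov * s * c + s_v**2 * s**2
    var2 = s_u**2 * s**2 - 2.0 * cov * s * c + s_v**2 * c**2
    cov12 = (s_v**2 - s_u**2) * s * c + cov * (c**2 - s**2)
    s1, s2 = math.sqrt(var1), math.sqrt(var2)
    return mu1, mu2, s1, s2, cov12 / (s1 * s2)

# A 90-degree rotation swaps the components (and flips one sign):
stats_90 = rotate_wind_stats(2.0, 3.0, 1.5, 2.5, 0.4, 90.0)
```

    The total variance s1**2 + s2**2 is invariant under rotation, a convenient check on any implementation.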

  12. Slip rate on the San Diego trough fault zone, inner California Borderland, and the 1986 Oceanside earthquake swarm revisited

    Science.gov (United States)

    Ryan, Holly F.; Conrad, James E.; Paull, C.K.; McGann, Mary

    2012-01-01

    The San Diego trough fault zone (SDTFZ) is part of a 90-km-wide zone of faults within the inner California Borderland that accommodates motion between the Pacific and North American plates. Along with most faults offshore southern California, the slip rate and paleoseismic history of the SDTFZ are unknown. We present new seismic reflection data that show that the fault zone steps across a 5-km-wide stepover to continue for an additional 60 km north of its previously mapped extent. The 1986 Oceanside earthquake swarm is located within the 20-km-long restraining stepover. Farther north, at the latitude of Santa Catalina Island, the SDTFZ bends 20° to the west and may be linked via a complex zone of folds with the San Pedro basin fault zone (SPBFZ). In a cooperative program between the U.S. Geological Survey (USGS) and the Monterey Bay Aquarium Research Institute (MBARI), we measure and date the coseismic offset of a submarine channel that intersects the fault zone near the SDTFZ–SPBFZ junction. We estimate a horizontal slip rate of about 1.5 ± 0.3 mm/yr over the past 12,270 yr.
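
    The slip rate is simply the dated coseismic offset divided by its age, so the quoted 1.5 ± 0.3 mm/yr over 12,270 yr implies roughly 18 m of channel offset (the offset itself is not quoted in this abstract). A quick consistency check:

```python
def slip_rate_mm_per_yr(offset_m, age_yr):
    """Average slip rate in mm/yr from an offset in meters and its age."""
    return offset_m * 1000.0 / age_yr

implied_offset_m = 1.5e-3 * 12270   # 18.4 m implied by 1.5 mm/yr
rate = slip_rate_mm_per_yr(implied_offset_m, 12270)   # recovers 1.5 mm/yr
```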

  13. Earthquake Forecasting Methodology Catalogue - A collection and comparison of the state-of-the-art in earthquake forecasting and prediction methodologies

    Science.gov (United States)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Earthquake forecasting and prediction has been one of the key struggles of modern geosciences for the last few decades. A large number of approaches for various time periods have been developed for different locations around the world. A categorization and review of more than 20 new and old methods was undertaken to develop a state-of-the-art catalogue of forecasting algorithms and methodologies. The methods have been categorized into time-independent, time-dependent, and hybrid methods, the last group comprising methods that use additional data beyond historical earthquake statistics. Such a categorization distinguishes purely statistical approaches, for which historical earthquake data are the only direct data source, from algorithms that incorporate further information, e.g., spatial data on fault distributions, or physical models such as static triggering, to indicate future earthquakes. Furthermore, the location of application has been taken into account to identify methods that can be applied, e.g., in active tectonic regions like California or in less active continental regions. In general, most of the methods cover well-known high-seismicity regions like Italy, Japan, or California. Many more elements have been reviewed, including the application of established theories and methods, e.g., for the determination of the completeness magnitude, or whether the modified Omori law was used or not. Target temporal scales are identified, as well as the publication history. All these aspects have been reviewed and catalogued to provide an easy-to-use tool for the development of earthquake forecasting algorithms and to give an overview of the state of the art.

  14. USGS SAFRR Tsunami Scenario: Potential Impacts to the U.S. West Coast from a Plausible M9 Earthquake near the Alaska Peninsula

    Science.gov (United States)

    Ross, S.; Jones, L. M.; Wilson, R. I.; Bahng, B.; Barberopoulou, A.; Borrero, J. C.; Brosnan, D.; Bwarie, J. T.; Geist, E. L.; Johnson, L. A.; Hansen, R. A.; Kirby, S. H.; Knight, E.; Knight, W. R.; Long, K.; Lynett, P. J.; Miller, K. M.; Mortensen, C. E.; Nicolsky, D.; Oglesby, D. D.; Perry, S. C.; Porter, K. A.; Real, C. R.; Ryan, K. J.; Suleimani, E. N.; Thio, H. K.; Titov, V. V.; Wein, A. M.; Whitmore, P.; Wood, N. J.

    2012-12-01

    inform decision makers. The SAFRR Tsunami Scenario is organized by a coordinating committee with several working groups, including Earthquake Source, Paleotsunami/Geology Field Work, Tsunami Modeling, Engineering and Physical Impacts, Ecological Impacts, Emergency Management and Education, Social Vulnerability, Economic and Business Impacts, and Policy. In addition, the tsunami scenario process is being assessed and evaluated by researchers from the Natural Hazards Center at the University of Colorado at Boulder. The source event, defined by the USGS' Tsunami Source Working Group, is an earthquake similar to the 2011 Tohoku event, but set in the Semidi subduction sector, between Kodiak Island and the Shumagin Islands off the Pacific coast of the Alaska Peninsula. The Semidi sector is probably late in its earthquake cycle and comparisons of the geology and tectonic settings between Tohoku and the Semidi sector suggest that this location is appropriate. Tsunami modeling and inundation results have been generated for many areas along the California coast and elsewhere, including current velocity modeling for the ports of Los Angeles, Long Beach, and San Diego, and Ventura Harbor. Work on impacts to Alaska and Hawaii will follow. Note: Costas Synolakis (USC) is also an author of this abstract.

  15. Experimental Study on a Self-Centering Earthquake-Resistant Masonry Pier with a Structural Concrete Column

    Directory of Open Access Journals (Sweden)

    Lijun Niu

    2017-01-01

    This paper proposes a slotting construction strategy to avoid shear behavior of multistory masonry buildings. The aspect ratio of masonry piers increases via slotting between spandrels and piers, so that the limit state of piers under an earthquake may be altered from shear to rocking. Rocking piers with a structural concrete column (SCC) form a self-centering earthquake-resistant system. The in-plane lateral rocking behavior of masonry piers subjected to an axial force is predicted, and an experimental study is conducted on two full-scale masonry piers with an SCC, consisting of a slotting pier and an original pier. A comparison of the rocking modes of masonry piers with and without an SCC is also presented. Experimental verification indicates that the slotting strategy achieves a change of failure modes from shear to rocking, and this resistant system with an SCC incorporates self-centering and high energy-dissipation properties. For the slotting pier, a lateral story drift ratio of 2.5% and a high displacement ductility of approximately 9.7 are obtained in the test, although the lateral strength decreased by 22.3% after slotting. The predicted lateral strength of the rocking pier with an SCC has a margin of error of 5.3%.

  16. Real-time earthquake monitoring: Early warning and rapid response

    Science.gov (United States)

    1991-01-01

    A panel was established to investigate the subject of real-time earthquake monitoring (RTEM) and make recommendations on the feasibility of using a real-time earthquake warning system to mitigate earthquake damage in regions of the United States. The findings of the investigation and the related recommendations are described in this report. A brief review of existing real-time seismic systems is presented, with particular emphasis given to the current California seismic networks. Specific applications of a real-time monitoring system are discussed, along with issues related to system deployment and technical feasibility. In addition, several non-technical considerations are addressed, including cost-benefit analysis, public perceptions, safety, and liability.

  17. Non-Stationary Modelling and Simulation of Near-Source Earthquake Ground Motion

    DEFF Research Database (Denmark)

    Skjærbæk, P. S.; Kirkegaard, Poul Henning; Fouskitakis, G. N.

    1997-01-01

    This paper is concerned with modelling and simulation of near-source earthquake ground motion. Recent studies have revealed that these motions show heavy non-stationary behaviour with very low frequencies dominating parts of the earthquake sequence. Modelling and simulation of this behaviour...... by an epicentral distance of 16 km and measured during the 1979 Imperial Valley earthquake in California (U.S.A.). The results of the study indicate that while all three approaches can successfully predict near-source ground motions, the Neural Network based one gives somewhat poorer simulation results....

  19. Measuring Aseismic Slip through Characteristically Repeating Earthquakes at the Mendocino Triple Junction, Northern California

    Science.gov (United States)

    Materna, K.; Taira, T.; Burgmann, R.

    2016-12-01

The Mendocino Triple Junction (MTJ), at the transition point between the San Andreas fault system, the Mendocino Transform Fault, and the Cascadia Subduction Zone, undergoes rapid tectonic deformation and produces more large (M>6.0) earthquakes than any other region in California. Most of the active faults of the triple junction are located offshore, making it difficult to characterize both seismic slip and aseismic creep. In this work, we study aseismic creep rates near the MTJ using characteristically repeating earthquakes (CREs) as indicators of creep rate. CREs are generally interpreted as repeated failures of the same seismic patch within an otherwise creeping fault zone; as a consequence, the magnitude and recurrence time of the CREs can be used to determine a fault's creep rate through empirically calibrated scaling relations. Using seismic data from 2010-2016, we identify CREs as recorded by an array of eight 100-Hz PBO borehole seismometers deployed in the Cape Mendocino area. For each event pair with epicenters less than 30 km apart, we compute the cross-spectral coherence of 20 seconds of data starting one second before the P-wave arrival. We then select pairs with high coherence in an appropriate frequency band, which is determined uniquely for each event pair based on event magnitude, station distance, and signal-to-noise ratio. The most similar events (with median coherence above 0.95 at two or more stations) are selected as CREs and then grouped into CRE families, and each family is used to infer a local creep rate. On the Mendocino Transform Fault, we find relatively high creep rates of >5 cm/year that increase closer to the Gorda Ridge. Closer to shore and to the MTJ itself, we find many families of repeaters on and off the transform fault with highly variable creep rates, indicative of the complex deformation that takes place there.
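The conversion from a CRE family's magnitude and recurrence interval to a creep rate can be sketched as follows. This is a minimal illustration assuming the widely used Nadeau and Johnson (1998) slip-moment scaling and the standard Hanks-Kanamori moment-magnitude relation; the coefficients and the example family are illustrative, not values calibrated in this study.

```python
import math

def moment_from_magnitude(mw):
    """Seismic moment M0 in N*m from moment magnitude (Hanks-Kanamori)."""
    return 10 ** (1.5 * mw + 9.05)

def slip_per_event(m0_nm):
    """Slip per repeat (cm) from the Nadeau & Johnson (1998) scaling
    d = 10^(-2.36) * M0^0.17, with M0 in dyne-cm and d in cm."""
    m0_dyne_cm = m0_nm * 1e7  # convert N*m to dyne-cm
    return 10 ** (-2.36) * m0_dyne_cm ** 0.17

def creep_rate_cm_per_yr(mw, recurrence_yr):
    """Fault creep rate implied by one CRE family."""
    return slip_per_event(moment_from_magnitude(mw)) / recurrence_yr

# e.g., a hypothetical M 2.0 family repeating every 2 years implies ~3.8 cm/yr
```

The weak (0.17) exponent is what makes small repeaters useful: slip per event varies little with magnitude, so the recurrence interval carries most of the creep-rate signal.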

  20. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed M>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
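The power-law smoothing described in this record can be sketched as a kernel that spreads each simulated event's rate over the whole test region with distance-decaying weight. The exponent and distance cutoff below are illustrative placeholders, not the calibrated ETAS parameters of the paper.

```python
import numpy as np

def powerlaw_rate_map(event_xy, grid_xy, q=1.5, r0=5.0):
    """Distribute each simulated event's seismicity over all grid cells with
    a kernel ~ (r + r0)^(-q), mimicking ETAS-style spatial decay.
    event_xy: list of (x, y) epicenters; grid_xy: (n_cells, 2) array (km)."""
    rates = np.zeros(len(grid_xy))
    for ex, ey in event_xy:
        r = np.hypot(grid_xy[:, 0] - ex, grid_xy[:, 1] - ey)
        kernel = (r + r0) ** (-q)
        rates += kernel / kernel.sum()  # each event contributes one unit of rate
    return rates
```

Because each event's kernel is normalized, the rate map conserves the total simulated event count while assigning nonzero rate off the modeled faults, which is what allows comparison against off-fault observed epicenters.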

  1. Great earthquakes along the Western United States continental margin: implications for hazards, stratigraphy and turbidite lithology

    Directory of Open Access Journals (Sweden)

    C. H. Nelson

    2012-11-01

Full Text Available We summarize the importance of great earthquakes (Mw ≳ 8) for hazards, stratigraphy of basin floors, and turbidite lithology along the active tectonic continental margins of the Cascadia subduction zone and the northern San Andreas Transform Fault by utilizing studies of swath bathymetry, visual core descriptions, grain-size analysis, X-ray radiographs, and physical properties. Recurrence times of Holocene turbidites as proxies for earthquakes on the Cascadia and northern California margins are analyzed using two methods: (1) radiometric dating (14C method), and (2) relative dating, using hemipelagic sediment thickness and sedimentation rates (H method). The H method provides (1) the best estimate of minimum recurrence times, which are the most important for seismic hazards risk analysis, and (2) the most complete dataset of recurrence times, which shows a normal distribution pattern for paleoseismic turbidite frequencies. We observe that, on these tectonically active continental margins, during the sea-level highstand of Holocene time, triggering of turbidity currents is controlled dominantly by earthquakes, and paleoseismic turbidites have an average recurrence time of ~550 yr in northern Cascadia Basin and ~200 yr along the northern California margin. The minimum recurrence times for great earthquakes are approximately 300 yr for the Cascadia subduction zone and 130 yr for the northern San Andreas Fault, which indicates both fault systems are in (Cascadia) or very close to (San Andreas) the early window for another great earthquake.

On active tectonic margins with great earthquakes, the volumes of mass transport deposits (MTDs) are limited on basin floors along the margins. The maximum run-out distances of MTD sheets across abyssal-basin floors along active margins are an order of magnitude less (~100 km) than on passive margins (~1000 km). The great earthquakes along the Cascadia and northern California margins
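The "H method" in this record is arithmetically simple: the time between two turbidites is the thickness of hemipelagic sediment deposited between them divided by the hemipelagic sedimentation rate. A minimal sketch; the example numbers are illustrative, chosen only to reproduce the quoted ~550 yr Cascadia average, not measured core values.

```python
def recurrence_time_h(hemipelagic_thickness_cm, sed_rate_cm_per_kyr):
    """Turbidite recurrence time (years) from the hemipelagic sediment
    thickness between two turbidites and the local sedimentation rate."""
    return hemipelagic_thickness_cm / sed_rate_cm_per_kyr * 1000.0

# e.g., 5.5 cm of hemipelagic mud at 10 cm/kyr implies 550 yr between events
```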

2. A change in fault-plane orientation between foreshocks and aftershocks of the Galway Lake earthquake, ML = 5.2, 1975, Mojave Desert, California

    Science.gov (United States)

    Fuis, G.S.; Lindh, A.G.

    1979-01-01

/pcsp, are observed, and these changes accompany the changes in P/SV. Observations for the Galway Lake earthquake are similar to observations for the Oroville, California, earthquake (ML = 5.7) of August 1, 1975, and the Briones Hills, California, earthquake (ML = 4.3) of January 8, 1977 (Lindh et al., Science Vol. 201, pp. 56-59). A change in fault-plane orientation between foreshocks and aftershocks may be understandable in terms of early en-echelon cracking (foreshocks) giving way to shear on the main fault plane (main shock plus aftershocks). Recent laboratory data (Byerlee et al., Tectonophysics, Vol. 44, pp. 161-171) tend to support this view. ?? 1979.

  3. Coping with earthquakes induced by fluid injection

    Science.gov (United States)

    McGarr, Arthur F.; Bekins, Barbara; Burkardt, Nina; Dewey, James W.; Earle, Paul S.; Ellsworth, William L.; Ge, Shemin; Hickman, Stephen H.; Holland, Austin F.; Majer, Ernest; Rubinstein, Justin L.; Sheehan, Anne

    2015-01-01

    Large areas of the United States long considered geologically stable with little or no detected seismicity have recently become seismically active. The increase in earthquake activity began in the mid-continent starting in 2001 (1) and has continued to rise. In 2014, the rate of occurrence of earthquakes with magnitudes (M) of 3 and greater in Oklahoma exceeded that in California (see the figure). This elevated activity includes larger earthquakes, several with M > 5, that have caused significant damage (2, 3). To a large extent, the increasing rate of earthquakes in the mid-continent is due to fluid-injection activities used in modern energy production (1, 4, 5). We explore potential avenues for mitigating effects of induced seismicity. Although the United States is our focus here, Canada, China, the UK, and others confront similar problems associated with oil and gas production, whereas quakes induced by geothermal activities affect Switzerland, Germany, and others.

  4. Modernization of the Caltech/USGS Southern California Seismic Network

    Science.gov (United States)

    Bhadha, R.; Devora, A.; Hauksson, E.; Johnson, D.; Thomas, V.; Watkins, M.; Yip, R.; Yu, E.; Given, D.; Cone, G.; Koesterer, C.

    2009-12-01

The USGS/ANSS/ARRA program is providing Government Furnished Equipment (GFE) and two-year funding for upgrading the Caltech/USGS Southern California Seismic Network (SCSN). The SCSN is the modern digital ground-motion seismic network in southern California that monitors seismicity and provides real-time earthquake information products such as rapid notifications, moment tensors, and ShakeMap. The SCSN has evolved through the years and now consists of several well-integrated components, such as short-period analog, TERRAscope, digital, and real-time strong-motion stations, totaling about 300 stations. In addition, the SCSN records data from about 100 stations provided by partner networks. To strengthen the ability of the SCSN to meet the ANSS performance standards, we will install GFE and carry out the following upgrades and improvements of the various components of the SCSN: 1) upgrade of dataloggers at seven TERRAscope stations; 2) upgrade of dataloggers at 131 digital stations and upgrade of broadband sensors at 25 stations; 3) upgrade of SCSN metadata capabilities; 4) upgrade of telemetry capabilities for both seismic and GPS data; and 5) upgrade of balers at stations with existing Q330 dataloggers. These upgrades will enable the SCSN to meet the ANSS Performance Standards more consistently than before. The new equipment will improve station uptimes and reduce maintenance costs. It will also provide improved waveform data quality and consequently superior data products. Data gaps due to various outages will be minimized, and 'late' data will be readily available through retrieval from on-site storage. Compared to the outdated equipment, the new equipment will speed up data delivery by about 10 sec, which is fast enough for earthquake early warning applications. The new equipment also consumes about a factor of ten less power. We will also upgrade the SCSN data acquisition and data center facilities, which will improve the SCSN

  5. Re-centering variable friction device for vibration control of structures subjected to near-field earthquakes

    Science.gov (United States)

    Ozbulut, Osman E.; Hurlebaus, Stefan

    2011-11-01

    This paper proposes a re-centering variable friction device (RVFD) for control of civil structures subjected to near-field earthquakes. The proposed hybrid device has two sub-components. The first sub-component of this hybrid device consists of shape memory alloy (SMA) wires that exhibit a unique hysteretic behavior and full recovery following post-transformation deformations. The second sub-component of the hybrid device consists of variable friction damper (VFD) that can be intelligently controlled for adaptive semi-active behavior via modulation of its voltage level. In general, installed SMA devices have the ability to re-center structures at the end of the motion and VFDs can increase the energy dissipation capacity of structures. The full realization of these devices into a singular, hybrid form which complements the performance of each device is investigated in this study. A neuro-fuzzy model is used to capture rate- and temperature-dependent nonlinear behavior of the SMA components of the hybrid device. An optimal fuzzy logic controller (FLC) is developed to modulate voltage level of VFDs for favorable performance in a RVFD hybrid application. To obtain optimal controllers for concurrent mitigation of displacement and acceleration responses, tuning of governing fuzzy rules is conducted by a multi-objective heuristic optimization. Then, numerical simulation of a multi-story building is conducted to evaluate the performance of the hybrid device. Results show that a re-centering variable friction device modulated with a fuzzy logic control strategy can effectively reduce structural deformations without increasing acceleration response during near-field earthquakes.

  6. Analysis of the burns profile and the admission rate of severely burned adult patient to the National Burn Center of Chile after the 2010 earthquake.

    Science.gov (United States)

    Albornoz, Claudia; Villegas, Jorge; Sylvester, Marilu; Peña, Veronica; Bravo, Iside

    2011-06-01

Chile is located in the Ring of Fire, in South America. An 8.8-magnitude earthquake affected 80% of the population on February 27th, 2010. This study was conducted to assess any change in the burns profile caused by the earthquake. This was an ecologic study. We compared the 4 months following the earthquake date in 2009 and 2010 with respect to age, TBSA, deep TBSA, agent, specific mortality rate, and rate of admissions to the National Burn Center of Chile. A Mann-Whitney test and a Poisson regression were performed. Age, agent, TBSA, and deep TBSA percentages did not show any difference. The mortality rate was lower in 2010 (0.52 versus 1.22 per 1,000,000 inhabitants), but no meaningful difference was found (Poisson regression, p = 0.06). The admission rate was lower in 2010 (4.6 versus 5.6 per 1,000,000 inhabitants), but no differences were found (p = 0.26). There were no admissions directly related to the earthquake. As we do not have incidence registries in Chile, we propose using the rate of admission to the National Burn Reference Center as an incidence estimator. There was no significant difference in the burn profile, probably because of the time of the earthquake (3 am). We conclude the earthquake did not affect the way the Chilean people get burned. Copyright © 2011 Elsevier Ltd and ISBI. All rights reserved.

  7. Recovering from the ShakeOut earthquake

    Science.gov (United States)

    Wein, Anne; Johnson, Laurie; Bernknopf, Richard

    2011-01-01

Recovery from an earthquake like the M7.8 ShakeOut Scenario will be a major endeavor taking many years to complete. Hundreds of Southern California municipalities will be affected; most lack recovery plans or previous disaster experience. To support recovery planning, this paper 1) extends the regional ShakeOut Scenario analysis into the recovery period using a recovery model, 2) localizes analyses to identify longer-term impacts and issues in two communities, and 3) considers the regional context of local recovery. Key community insights about preparing for post-disaster recovery include the need to: geographically diversify city procurement; set earthquake mitigation priorities for critical infrastructure (e.g., the airport); plan to replace mobile homes with earthquake safety measures; consider post-earthquake redevelopment opportunities ahead of time; and develop post-disaster recovery management and governance structures. This work also showed that communities with minor damage are still sensitive to regional infrastructure damage and its potential long-term impacts on community recovery. This highlights the importance of community and infrastructure resilience strategies as well.

  8. Along-strike variations in fault frictional properties along the San Andreas Fault near Cholame, California from joint earthquake and low-frequency earthquake relocations

    Science.gov (United States)

    Harrington, Rebecca M.; Cochran, Elizabeth S.; Griffiths, Emily M.; Zeng, Xiangfang; Thurber, Clifford H.

    2016-01-01

Recent observations of low-frequency earthquakes (LFEs) and tectonic tremor along the Parkfield-Cholame segment of the San Andreas fault suggest slow-slip earthquakes occur in a transition zone between the shallow fault, which accommodates slip by a combination of aseismic creep and earthquakes, and the deep fault, which accommodates slip by stable sliding (>35 km depth). However, the spatial relationship between shallow earthquakes and LFEs remains unclear. Here, we present precise relocations of 34 earthquakes and 34 LFEs recorded during a temporary deployment of 13 broadband seismic stations from May 2010 to July 2011. We use the temporary array waveform data, along with data from permanent seismic stations and a new high-resolution 3D velocity model, to illuminate the fine-scale details of the seismicity distribution near Cholame and the relation to the distribution of LFEs. The depth of the boundary between earthquake and LFE hypocenters changes along strike and roughly follows the 350°C isotherm, suggesting frictional behavior may be, in part, thermally controlled. We observe no overlap in the depth of earthquakes and LFEs, with an ~5 km separation between the deepest earthquakes and shallowest LFEs. In addition, clustering in the relocated seismicity near the 2004 Mw 6.0 Parkfield earthquake hypocenter and near the northern boundary of the 1857 Mw 7.8 Fort Tejon rupture may highlight areas of frictional heterogeneities on the fault where earthquakes tend to nucleate.

9. The role of post-earthquake structural safety in pre-earthquake retrofit decisions: guidelines and applications

    International Nuclear Information System (INIS)

    Bazzurro, P.; Telleen, K.; Maffei, J.; Yin, J.; Cornell, C.A.

    2009-01-01

Critical structures such as hospitals, police stations, local administrative office buildings, and critical lifeline facilities are expected to be operational immediately after earthquakes. Any rational decision about whether these structures are strong enough to meet this goal or whether pre-emptive retrofitting is needed cannot be made without an explicit consideration of post-earthquake safety and functionality with respect to aftershocks. Advanced Seismic Assessment Guidelines offer an improvement over previous methods for seismic evaluation of buildings where post-earthquake safety and usability is a concern. This new method allows engineers to evaluate the likelihood that a structure may have restricted access or no access after an earthquake. The building performance is measured in terms of the post-earthquake occupancy classifications Green Tag, Yellow Tag, and Red Tag, defining these performance levels quantitatively, based on the structure's remaining capacity to withstand aftershocks. These color-coded placards, which constitute an established practice in the US, could be replaced by the standard results of inspections (A to E) performed by the Italian Dept. of Civil Protection after an event. The article also shows some applications of these Guidelines to buildings of the largest utility company in California, Pacific Gas and Electric Company (PG&E). [it]

  10. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    Science.gov (United States)

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with the Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i.e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and
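The final combination step, turning per-rupture probabilities and simulated intensity measures into a site hazard curve, can be sketched as below. This is a generic probabilistic-seismic-hazard combination under an independence assumption, not SCEC's actual CyberShake code; the array shapes and numbers are illustrative.

```python
import numpy as np

def hazard_curve(rupture_probs, exceed_probs):
    """Probability of exceeding each intensity-measure (IM) level at a site,
    combining per-rupture occurrence probabilities with the conditional
    exceedance probabilities measured from the synthetic seismograms.
    rupture_probs: shape (n_rup,); exceed_probs: shape (n_rup, n_im).
    Assumes ruptures occur independently (a standard PSHA simplification)."""
    p_no_exceed = 1.0 - rupture_probs[:, None] * exceed_probs
    return 1.0 - np.prod(p_no_exceed, axis=0)
```

Sampling `exceed_probs` from many rupture variations per source is what lets the site-based calculation capture ground-motion variability directly instead of borrowing it from a GMPE's ergodic scatter.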

  11. Periodic, chaotic, and doubled earthquake recurrence intervals on the deep San Andreas fault.

    Science.gov (United States)

    Shelly, David R

    2010-06-11

    Earthquake recurrence histories may provide clues to the timing of future events, but long intervals between large events obscure full recurrence variability. In contrast, small earthquakes occur frequently, and recurrence intervals are quantifiable on a much shorter time scale. In this work, I examine an 8.5-year sequence of more than 900 recurring low-frequency earthquake bursts composing tremor beneath the San Andreas fault near Parkfield, California. These events exhibit tightly clustered recurrence intervals that, at times, oscillate between approximately 3 and approximately 6 days, but the patterns sometimes change abruptly. Although the environments of large and low-frequency earthquakes are different, these observations suggest that similar complexity might underlie sequences of large earthquakes.
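The recurrence-interval bookkeeping behind such an analysis is straightforward; the sketch below computes inter-event times for a burst family and tallies how many fall nearer ~3 versus ~6 days. The event times are synthetic examples, not data from the study.

```python
import numpy as np

def recurrence_intervals(event_times_days):
    """Inter-event times (days) of a recurring LFE burst family."""
    t = np.sort(np.asarray(event_times_days, dtype=float))
    return np.diff(t)

def mode_counts(intervals, modes=(3.0, 6.0)):
    """Count how many intervals fall nearest each candidate mode,
    e.g., the ~3-day versus ~6-day recurrence states."""
    modes = np.asarray(modes)
    nearest = np.abs(intervals[:, None] - modes[None, :]).argmin(axis=1)
    return np.bincount(nearest, minlength=len(modes))
```

Tracking how the counts shift in a sliding window is one simple way to see the abrupt pattern changes the record describes.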

  12. Periodic, chaotic, and doubled earthquake recurrence intervals on the deep San Andreas Fault

    Science.gov (United States)

    Shelly, David R.

    2010-01-01

    Earthquake recurrence histories may provide clues to the timing of future events, but long intervals between large events obscure full recurrence variability. In contrast, small earthquakes occur frequently, and recurrence intervals are quantifiable on a much shorter time scale. In this work, I examine an 8.5-year sequence of more than 900 recurring low-frequency earthquake bursts composing tremor beneath the San Andreas fault near Parkfield, California. These events exhibit tightly clustered recurrence intervals that, at times, oscillate between ~3 and ~6 days, but the patterns sometimes change abruptly. Although the environments of large and low-frequency earthquakes are different, these observations suggest that similar complexity might underlie sequences of large earthquakes.

  13. Final Report Feasibility Study for the California Wave Energy Test Center (CalWavesm)

    Energy Technology Data Exchange (ETDEWEB)

    Blakeslee, Samuel Norman [California Polytechnic State Univ. (CalPoly), San Luis Obispo, CA (United States). Inst. for Advanced Technology and Public Policy; Toman, William I. [Protean Wave Energy Ltd., Los Osos, CA (United States); Williams, Richard B. [Leidos Maritime Solutions, Reston, VA (United States); Davy, Douglas M. [CH2M, Sacramento, CA (United States); West, Anna [Kearns and West, Inc., San Francisco, CA (United States); Connet, Randy M. [Omega Power Engineers, LLC, Anaheim, CA (United States); Thompson, Janet [Kearns and West, Inc., San Francisco, CA (United States); Dolan, Dale [California Polytechnic State Univ. (CalPoly), San Luis Obispo, CA (United States); Baltimore, Craig [California Polytechnic State Univ. (CalPoly), San Luis Obispo, CA (United States); Jacobson, Paul [Electric Power Research Inst. (EPRI), Knoxville, TN (United States); Hagerman, George [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Potter, Chris [California Natural Resources Agency, Sacramento, CA (United States); Dooher, Brendan [Pacific Gas and Electric Company, San Francisco, CA (United States); Wendt, Dean [California Polytechnic State Univ. (CalPoly), San Luis Obispo, CA (United States); Sheppard, Colin [Humboldt State Univ., Arcata, CA (United States); Harris, Andrew [Humboldt State Univ., Arcata, CA (United States); Lawson, W. Graham [Power Delivery Consultants, Inc., Albany, NY (United States)

    2017-07-31

    The California Wave Energy Test Center (CalWave) Feasibility Study project was funded over multiple phases by the Department of Energy to perform an interdisciplinary feasibility assessment to analyze the engineering, permitting, and stakeholder requirements to establish an open water, fully energetic, grid connected, wave energy test center off the coast of California for the purposes of advancing U.S. wave energy research, development, and testing capabilities. Work under this grant included wave energy resource characterization, grid impact and interconnection requirements, port infrastructure and maritime industry capability/suitability to accommodate the industry at research, demonstration and commercial scale, and macro and micro siting considerations. CalWave Phase I performed a macro-siting and down-selection process focusing on two potential test sites in California: Humboldt Bay and Vandenberg Air Force Base. This work resulted in the Vandenberg Air Force Base site being chosen as the most favorable site based on a peer reviewed criteria matrix. CalWave Phase II focused on four siting location alternatives along the Vandenberg Air Force Base coastline and culminated with a final siting down-selection. Key outcomes from this work include completion of preliminary engineering and systems integration work, a robust turnkey cost estimate, shoreside and subsea hazards assessment, storm wave analysis, lessons learned reports from several maritime disciplines, test center benchmarking as compared to existing international test sites, analysis of existing applicable environmental literature, the completion of a preliminary regulatory, permitting and licensing roadmap, robust interaction and engagement with state and federal regulatory agency personnel and local stakeholders, and the population of a Draft Federal Energy Regulatory Commission (FERC) Preliminary Application Document (PAD). 
Analysis of existing offshore oil and gas infrastructure was also performed.

  14. Deeper penetration of large earthquakes on seismically quiescent faults.

    Science.gov (United States)

    Jiang, Junle; Lapusta, Nadia

    2016-06-10

    Why many major strike-slip faults known to have had large earthquakes are silent in the interseismic period is a long-standing enigma. One would expect small earthquakes to occur at least at the bottom of the seismogenic zone, where deeper aseismic deformation concentrates loading. We suggest that the absence of such concentrated microseismicity indicates deep rupture past the seismogenic zone in previous large earthquakes. We support this conclusion with numerical simulations of fault behavior and observations of recent major events. Our modeling implies that the 1857 Fort Tejon earthquake on the San Andreas Fault in Southern California penetrated below the seismogenic zone by at least 3 to 5 kilometers. Our findings suggest that such deeper ruptures may occur on other major fault segments, potentially increasing the associated seismic hazard. Copyright © 2016, American Association for the Advancement of Science.

  15. Holocene slip rates along the San Andreas Fault System in the San Gorgonio Pass and implications for large earthquakes in southern California

    Science.gov (United States)

    Heermance, Richard V.; Yule, Doug

    2017-06-01

The San Gorgonio Pass (SGP) in southern California contains a 40 km long region of structural complexity where the San Andreas Fault (SAF) bifurcates into a series of oblique-slip faults with unknown slip history. We combine new 10Be exposure ages (Qt4: 8600 (+2100, -2200) and Qt3: 5700 (+1400, -1900) years B.P.) and a radiocarbon age (1260 ± 60 years B.P.) from late Holocene terraces with scarp displacement of these surfaces to document a Holocene slip rate of 5.7 (+2.7, -1.5) mm/yr combined across two faults. Our preferred slip rate is 37-49% of the average slip rates along the SAF outside the SGP (i.e., the Coachella Valley and San Bernardino sections) and implies that strain is transferred off the SAF in this area. Earthquakes here most likely occur in very large, throughgoing SAF events at a lower recurrence than elsewhere on the SAF, so that only approximately one third of SAF ruptures penetrate or originate in the pass. Plain Language Summary: How large are earthquakes on the southern San Andreas Fault? The answer to this question depends on whether the earthquake is contained only along individual fault sections, such as the Coachella Valley section north of Palm Springs, or the rupture crosses multiple sections including the area through the San Gorgonio Pass. We have determined the age and offset of faulted stream deposits within the San Gorgonio Pass to document slip rates of these faults over the last 10,000 years. Our results indicate a long-term slip rate of 6 mm/yr, which is almost 1/2 of the rates east and west of this area. These new rates, combined with faulted geomorphic surfaces, imply that large-magnitude earthquakes must occasionally rupture a 300 km length of the San Andreas Fault from the Salton Sea to the Mojave Desert. Although many (~65%) earthquakes along the southern San Andreas Fault likely do not rupture through the pass, our new results suggest that large (>Mw 7.5) earthquakes are possible on the southern San Andreas Fault and likely
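The core slip-rate arithmetic in this record is offset divided by surface age, with asymmetric bounds from the dating and displacement uncertainties. A minimal sketch; the 49 m offset and 5 m uncertainty below are hypothetical values chosen only so the example reproduces the quoted 5.7 mm/yr rate on the 8600-year surface, not the measured displacements.

```python
def slip_rate_mm_per_yr(offset_m, age_yr):
    """Average fault slip rate from a displaced, dated geomorphic surface."""
    return offset_m * 1000.0 / age_yr

def slip_rate_bounds(offset_m, d_offset_m, age_yr, age_plus_yr, age_minus_yr):
    """(min, best, max) rates: the minimum pairs the smallest offset with the
    oldest age, the maximum pairs the largest offset with the youngest age."""
    return (slip_rate_mm_per_yr(offset_m - d_offset_m, age_yr + age_plus_yr),
            slip_rate_mm_per_yr(offset_m, age_yr),
            slip_rate_mm_per_yr(offset_m + d_offset_m, age_yr - age_minus_yr))
```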

  16. Echo-sounding method aids earthquake hazard studies

    Science.gov (United States)

    ,

    1995-01-01

Dramatic examples of catastrophic damage from an earthquake occurred in 1989, when the M 7.1 Loma Prieta earthquake rocked the San Francisco Bay area, and in 1994, when the M 6.6 Northridge earthquake jolted southern California. The surprising amount and distribution of damage to private property and infrastructure emphasizes the importance of seismic-hazard research in urbanized areas, where the potential for damage and loss of life is greatest. During April 1995, a group of scientists from the U.S. Geological Survey and the University of Tennessee, using an echo-sounding method described below, is collecting data in San Antonio Park, California, to examine the Monte Vista fault which runs through this park. The Monte Vista fault in this vicinity shows evidence of movement within the last 10,000 years or so. The data will give them a "picture" of the subsurface rock deformation near this fault. The data will also be used to help locate a trench that will be dug across the fault by scientists from William Lettis & Associates.

  17. Hippotherapy: Remuneration issues impair the offering of this therapeutic strategy at Southern California rehabilitation centers.

    Science.gov (United States)

    Pham, Christine; Bitonte, Robert

    2016-04-06

Hippotherapy is the use of equine movement in physical, occupational, or speech therapy in order to obtain functional improvements in patients. Studies show improvement in motor function and sensory processing for patients with a variety of neuromuscular disabilities, developmental disorders, or skeletal impairments as a result of using hippotherapy. The primary objective of this study is to identify the pervasiveness of hippotherapy in Southern California, and any factors that impair its utilization. One hundred and fifty-two rehabilitation centers in the Southern California counties of Los Angeles, San Diego, Orange, Riverside, San Bernardino, San Luis Obispo, Santa Barbara, Ventura, and Kern were identified and surveyed to ascertain if hippotherapy is utilized, and if not, why not. Through a review of forty facilities that responded to our inquiry, our study indicates that the majority of rehabilitation centers are familiar with hippotherapy; however, only seven reported that hippotherapy is indeed available as an option in therapy at their centers. It is concluded that hippotherapy, used in a broad-based array of physical and sensory disorders, is limited in its ability to be utilized, primarily due to remuneration issues.

  18. Earthquake Resilient Bridge Columns Utilizing Damage Resistant Hybrid Fiber Reinforced Concrete

    OpenAIRE

    Trono, William Dean

    2014-01-01

    Modern reinforced concrete bridges are designed to avoid collapse and to prevent loss of life during earthquakes. To meet these objectives, bridge columns are typically detailed to form ductile plastic hinges when large displacements occur. California seismic design criteria acknowledges that damage such as concrete cover spalling and reinforcing bar yielding may occur in columns during a design-level earthquake. The seismic resilience of bridge columns can be improved through the use of a da...

  19. What Can We Learn from a Simple Physics-Based Earthquake Simulator?

    Science.gov (United States)

    Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele

    2018-03-01

    Physics-based earthquake simulators are becoming a popular tool for investigating the earthquake occurrence process. So far, the development of earthquake simulators has commonly been led by the approach "the more physics, the better". However, this approach may hamper comprehension of the simulator's outcomes; in fact, within complex models, it may be difficult to understand which physical parameters are most relevant to the features of the seismic catalog in which we are interested. For this reason, here we take the opposite approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that a simple simulator may be more informative than a complex one for some specific scientific objectives, because it is more understandable. Our earthquake simulator has three main components: the first is a realistic tectonic setting, i.e., a fault data set of California; the second is the application of quantitative laws for earthquake generation on each single fault; and the last is fault interaction modeled through the Coulomb failure function. The analysis of this simple simulator shows that: (1) short-term clustering can be reproduced by a set of faults with almost periodic behavior that interact according to a Coulomb failure function model; (2) long-term behavior showing supercycles of seismic activity exists only in a markedly deterministic framework and quickly disappears when a small degree of stochasticity is introduced in the recurrence of earthquakes on a fault; (3) faults that are strongly coupled in terms of the Coulomb failure function model are synchronized in time only in a markedly deterministic framework and, as before, such synchronization disappears when a small degree of stochasticity is introduced in the recurrence of earthquakes on a fault. Overall, the results show that even in a simple and perfectly known earthquake occurrence world, introducing a small degree of
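The three ingredients described above (near-periodic recurrence on individual faults, Coulomb-style interaction between neighbors, and an optional degree of stochasticity) can be sketched as a clock-advance toy model. Everything below — fault count, recurrence time, coupling strength, jitter — is an illustrative assumption, not the paper's California fault data set:

```python
import numpy as np

def simulate(n_faults=5, recurrence=100.0, coupling=5.0, jitter=0.0,
             t_end=2000.0, seed=0):
    """Toy renewal-plus-interaction earthquake simulator."""
    rng = np.random.default_rng(seed)
    clock = rng.uniform(0.0, recurrence, n_faults)  # time to next failure
    t, catalog = 0.0, []
    while True:
        i = int(np.argmin(clock))      # fault that fails next
        dt = float(clock[i])
        t += dt
        if t >= t_end:
            break
        clock -= dt                    # advance time on every fault
        catalog.append((t, i))
        # reset the failed fault; jitter > 0 adds stochastic recurrence
        clock[i] = max(recurrence + jitter * rng.standard_normal(), 1e-6)
        # Coulomb-style stress transfer brings neighbors closer to failure
        for j in ((i - 1) % n_faults, (i + 1) % n_faults):
            clock[j] = max(clock[j] - coupling, 0.0)
    return catalog

cat = simulate()
times = np.array([t for t, _ in cat])
print(len(cat), round(np.diff(times).mean(), 1))
```

Comparing catalogs for `jitter=0.0` versus a small positive `jitter` reproduces, in miniature, the paper's point that supercycle-like regularity survives only in the deterministic limit.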

  20. Stability and uncertainty of finite-fault slip inversions: Application to the 2004 Parkfield, California, earthquake

    Science.gov (United States)

    Hartzell, S.; Liu, P.; Mendoza, C.; Ji, C.; Larson, K.M.

    2007-01-01

    The 2004 Parkfield, California, earthquake is used to investigate stability and uncertainty aspects of the finite-fault slip inversion problem with different a priori model assumptions. We utilize records from 54 strong ground motion stations and 13 continuous, 1-Hz sampled, geodetic instruments. Two inversion procedures are compared: a linear least-squares subfault-based methodology and a nonlinear global search algorithm. These two methods encompass a wide range of the different approaches that have been used to solve the finite-fault slip inversion problem. For the Parkfield earthquake and the inversion of velocity or displacement waveforms, near-surface related site response (top 100 m, frequencies above 1 Hz) is shown to not significantly affect the solution. Results are also insensitive to selection of slip rate functions with similar duration and to subfault size if proper stabilizing constraints are used. The linear and nonlinear formulations yield consistent results when the same limitations in model parameters are in place and the same inversion norm is used. However, the solution is sensitive to the choice of inversion norm, the bounds on model parameters, such as rake and rupture velocity, and the size of the model fault plane. The geodetic data set for Parkfield gives a slip distribution different from that of the strong-motion data, which may be due to the spatial limitation of the geodetic stations and the bandlimited nature of the strong-motion data. Cross validation and the bootstrap method are used to set limits on the upper bound for rupture velocity and to derive mean slip models and standard deviations in model parameters. This analysis shows that slip on the northwestern half of the Parkfield rupture plane from the inversion of strong-motion data is model dependent and has a greater uncertainty than slip near the hypocenter.
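The linear subfault formulation discussed above reduces to a damped least-squares problem d = Gm, with a stabilizing constraint of the kind the authors note is required. A minimal synthetic sketch follows; the Green's functions, noise level, and damping value are invented for illustration and are not the Parkfield data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_data, n_sub = 120, 20
G = rng.standard_normal((n_data, n_sub))              # synthetic Green's functions
m_true = np.clip(rng.standard_normal(n_sub) + 1.0, 0.0, None)  # slip on each subfault
d = G @ m_true + 0.01 * rng.standard_normal(n_data)   # noisy "waveform" data

lam = 0.1                                  # damping (stabilizing constraint)
A = G.T @ G + lam * np.eye(n_sub)          # normal equations, Tikhonov-regularized
m_est = np.linalg.solve(A, G.T @ d)
rel_err = np.linalg.norm(m_est - m_true) / np.linalg.norm(m_true)
print(round(rel_err, 3))
```

Raising `lam` smooths the recovered slip at the cost of bias, which is one concrete way the solution becomes sensitive to a priori model choices, as the abstract emphasizes.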

  1. The Relationship between Knowledge and Attitude of Managers with Preparedness of Healthcare Centers in Rey Health Network against Earthquake Risk - 2013

    Directory of Open Access Journals (Sweden)

    Mohammad Asadzadeh

    2014-06-01

    Conclusions: Managers' knowledge was rather low, and preparedness among the centers was correspondingly low. Given the low knowledge and inadequate preparedness, more theoretical and practical training sessions and drills on earthquake preparedness should be held for managers.

  2. Seismogeodesy of the 2014 Mw6.1 Napa earthquake, California: Rapid response and modeling of fast rupture on a dipping strike-slip fault

    Science.gov (United States)

    Melgar, Diego; Geng, Jianghui; Crowell, Brendan W.; Haase, Jennifer S.; Bock, Yehuda; Hammond, William C.; Allen, Richard M.

    2015-07-01

    Real-time high-rate geodetic data have been shown to be useful for rapid earthquake response systems during medium to large events. The 2014 Mw6.1 Napa, California earthquake is important because it provides an opportunity to study an event at the lower threshold of what can be detected with GPS. We show the results of GPS-only earthquake source products such as peak ground displacement magnitude scaling, centroid moment tensor (CMT) solution, and static slip inversion. We also highlight the retrospective real-time combination of GPS and strong motion data to produce seismogeodetic waveforms that have higher precision and longer period information than GPS-only or seismic-only measurements of ground motion. We show their utility for rapid kinematic slip inversion and conclude that it would have been possible, with current real-time infrastructure, to determine the basic features of the earthquake source. We supplement the analysis with strong motion data collected close to the source to obtain an improved postevent image of the source process. The model reveals unilateral fast propagation of slip to the north of the hypocenter with a delayed onset of shallow slip. The source model suggests that the multiple strands of observed surface rupture are controlled by the shallow soft sediments of Napa Valley and do not necessarily represent the intersection of the main faulting surface and the free surface. We conclude that the main dislocation plane is westward dipping and should intersect the surface to the east, either where the easternmost strand of surface rupture is observed or at the location where the West Napa fault has been mapped in the past.
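Peak-ground-displacement magnitude scaling of the kind mentioned above is commonly written as log10(PGD) = A + B·Mw + C·Mw·log10(R), which can be inverted for magnitude in real time. The functional form is standard in the GPS rapid-response literature, but the coefficient values below are illustrative placeholders, not the published regression:

```python
import math

# Illustrative (not published) coefficients for the PGD scaling law.
A, B, C = -5.0, 1.05, -0.14

def pgd_from_magnitude(mw, dist_km):
    """Forward model: peak ground displacement (cm) at hypocentral distance R."""
    return 10 ** (A + B * mw + C * mw * math.log10(dist_km))

def magnitude_from_pgd(pgd_cm, dist_km):
    """Invert the scaling law for moment magnitude."""
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(dist_km))

# Round trip at Mw 6.1 (the Napa event's magnitude) and 30 km distance:
mw = magnitude_from_pgd(pgd_from_magnitude(6.1, 30.0), 30.0)
print(round(mw, 2))
```

Because the inversion is a closed-form rearrangement of the forward model, a single displacement observation with a distance estimate yields a magnitude within seconds of the data arriving.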

  3. Detecting Faults in Southern California using Computer-Vision Techniques and Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) Interferometry

    Science.gov (United States)

    Barba, M.; Rains, C.; von Dassow, W.; Parker, J. W.; Glasscoe, M. T.

    2013-12-01

    Knowing the location and behavior of active faults is essential for earthquake hazard assessment and disaster response. In Interferometric Synthetic Aperture Radar (InSAR) images, faults are revealed as linear discontinuities. Currently, interferograms are manually inspected to locate faults. During the summer of 2013, the NASA-JPL DEVELOP California Disasters team contributed to the development of a method to expedite fault detection in California using remote-sensing technology. The team utilized InSAR images created from polarimetric L-band data from NASA's Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) project. A computer-vision technique known as 'edge-detection' was used to automate the fault-identification process. We tested and refined an edge-detection algorithm under development through NASA's Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) project. To optimize the algorithm we used both UAVSAR interferograms and synthetic interferograms generated through Disloc, a web-based modeling program available through NASA's QuakeSim project. The edge-detection algorithm detected seismic, aseismic, and co-seismic slip along faults that were identified and compared with databases of known fault systems. Our optimization process was the first step toward integration of the edge-detection code into E-DECIDER to provide decision support for earthquake preparation and disaster management. E-DECIDER partners that will use the edge-detection code include the California Earthquake Clearinghouse and the US Department of Homeland Security through delivery of products using the Unified Incident Command and Decision Support (UICDS) service. Through these partnerships, researchers, earthquake disaster response teams, and policy-makers will be able to use this new methodology to examine the details of ground and fault motions for moderate to large earthquakes. 
Following an earthquake, the newly discovered faults can
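The core idea — a fault shows up as a linear discontinuity that a gradient-based edge detector can flag — can be demonstrated on a synthetic interferogram. The Sobel operator below is a generic stand-in for the E-DECIDER algorithm, whose actual implementation is not described here:

```python
import numpy as np

def sobel_edges(img):
    """Edge magnitude via 3x3 Sobel gradients (NumPy only, no SciPy)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    for i in range(3):
        for j in range(3):
            sub = pad[i:i + img.shape[0], j:j + img.shape[1]]
            gx += kx[i, j] * sub
            gy += ky[i, j] * sub
    return np.hypot(gx, gy)

# Synthetic interferogram: smooth deformation ramp plus a "fault" step at column 40.
x = np.linspace(0, 1, 80)
img = np.tile(x, (80, 1))
img[:, 40:] += 0.5
edges = sobel_edges(img)
col = int(np.argmax(edges.mean(axis=0)))   # column of strongest discontinuity
print(col)
```

On real UAVSAR interferograms the same gradient map would be thresholded and vectorized into candidate fault traces before comparison with known fault databases.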

  4. 77 FR 53225 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2012-08-31

    ... DEPARTMENT OF THE INTERIOR Geological Survey [USGS-GX12GG00995NP00] National Earthquake Prediction... meeting. SUMMARY: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council... National Earthquake Information Center (NEIC), 1711 Illinois Avenue, Golden, Colorado 80401. The Council is...

  5. Earthquake Swarm Along the San Andreas Fault near Palmdale, Southern California, 1976 to 1977.

    Science.gov (United States)

    McNally, K C; Kanamori, H; Pechmann, J C; Fuis, G

    1978-09-01

    Between November 1976 and November 1977 a swarm of small earthquakes occurred on the San Andreas fault near Palmdale. The swarm exhibited characteristics that have been associated with some foreshock sequences, such as tight clustering of hypocenters and time-dependent rotations of stress axes inferred from focal mechanisms. However, because of our present lack of understanding of the processes that precede earthquake faulting, the implications of the swarm for future large earthquakes on the San Andreas fault are unknown.

  6. Sonoma Ecology Center Northern California Arundo Distribution Data

    Data.gov (United States)

    California Natural Resource Agency — The Arundo Distribution layer is a compilation of Arundo donax observations in northern and central California, obtained from numerous sources, including Arundo...

  7. Site response, shallow shear-wave velocity, and damage in Los Gatos, California, from the 1989 Loma Prieta earthquake

    Science.gov (United States)

    Hartzell, S.; Carver, D.; Williams, R.A.

    2001-01-01

    Aftershock records of the 1989 Loma Prieta earthquake are used to calculate site response in the frequency band of 0.5-10 Hz at 24 locations in Los Gatos, California, on the edge of the Santa Clara Valley. Two different methods are used: spectral ratios relative to a reference site on rock and a source/site spectral inversion method. These two methods complement each other and give consistent results. Site amplification factors are compared with surficial geology, thickness of alluvium, shallow shear-wave velocity measurements, and ground deformation and structural damage resulting from the Loma Prieta earthquake. Higher values of site amplification are seen on Quaternary alluvium compared with older Miocene and Cretaceous units of Monterey and Franciscan Formation. However, other more detailed correlations with surficial geology are not evident. A complex pattern of alluvial sediment thickness, caused by crosscutting thrust faults, is interpreted as contributing to the variability in site response and the presence of spectral resonance peaks between 2 and 7 Hz at some sites. Within the range of our field measurements, there is a correlation between lower average shear-wave velocity of the top 30 m and 50% higher values of site amplification. An area of residential homes thrown from their foundations correlates with high site response. This damage may also have been aggravated by local ground deformation. Severe damage to commercial buildings in the business district, however, is attributed to poor masonry construction.
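The first of the two methods above — the spectral ratio relative to a rock reference site — divides the smoothed amplitude spectrum of a soil-site recording by that of a nearby rock site recording the same event. A self-contained sketch with synthetic signals (the 3 Hz resonance is an invented site effect, not a Los Gatos measurement):

```python
import numpy as np

def amp_spectrum(sig, dt):
    """Frequencies and amplitude spectrum of a real-valued record."""
    return np.fft.rfftfreq(len(sig), dt), np.abs(np.fft.rfft(sig))

dt = 0.01                                   # 100 samples/s
t = np.arange(0, 20, dt)
rng = np.random.default_rng(2)
rock = rng.standard_normal(t.size)          # reference-site "aftershock" record

# Soil site: same input motion, amplified by an assumed 3 Hz site resonance.
freqs = np.fft.rfftfreq(t.size, dt)
resonance = 1.0 + 4.0 * np.exp(-((freqs - 3.0) ** 2))
soil = np.fft.irfft(np.fft.rfft(rock) * resonance, n=t.size)

f, s_rock = amp_spectrum(rock, dt)
_, s_soil = amp_spectrum(soil, dt)
ratio = s_soil / s_rock                     # site amplification estimate
peak_f = f[np.argmax(ratio)]
print(round(peak_f, 1))
```

In practice many aftershock records are averaged and the spectra smoothed before dividing, so that source and path effects common to both sites cancel and only the site term survives.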

  8. Evaluation of Seismic Hazards at California Department of Transportation (CALTRANS)Structures

    Science.gov (United States)

    Merriam, M. K.

    2005-12-01

    The California Department of Transportation (CALTRANS) has responsibility for design, construction, and maintenance of approximately 12,000 state bridges. CALTRANS also provides oversight for similar activities for 12,200 bridges owned by local agencies throughout the state. California experiences an M6 or greater seismic event every few years. Recent earthquakes include the 1971 Mw6.6 San Fernando earthquake, which struck north of Los Angeles and prompted engineers to begin retrofitting existing bridges and to re-examine the way bridges are detailed to improve their response to earthquakes; the 1989 Mw6.9 Loma Prieta earthquake, which destroyed the Cypress Freeway and damaged the San Francisco-Oakland Bay Bridge; and the 1994 Mw6.7 Northridge earthquake in the Los Angeles area, which heavily damaged four major freeways. Since CALTRANS' seismic performance goal is to ensure that life-safety needs of the traveling public are met during an earthquake, estimating earthquake magnitude and peak bedrock acceleration and determining whether special seismic considerations are needed at specific bridge sites are critical. CALTRANS is currently developing a fourth-generation seismic hazard map to be used for estimating these parameters. A deterministic approach has been used to develop this map. Late-Quaternary-age faults are defined as the expected seismic sources. CALTRANS requires site-specific studies to determine potential for liquefaction, seismically induced landslides, and surface fault rupture. If potential for one of these seismic hazards exists, the hazard is mitigated by avoidance or removal, or accommodated through design. The action taken, while complying with the Department's "no collapse" requirement, depends upon many factors, including cost.

  9. The accommodation of relative motion at depth on the San Andreas fault system in California

    Science.gov (United States)

    Prescott, W. H.; Nur, A.

    1981-01-01

    Plate motion below the seismogenic layer along the San Andreas fault system in California may be accommodated either by aseismic slip along a deeper extension of the fault or by laterally distributed deformation below the seismogenic layer. The shallow depth of California earthquakes, the depth of coseismic slip during the 1906 San Francisco earthquake, and the presence of widely separated parallel faults indicate that relative motion is distributed below the seismogenic zone, occurring by inelastic flow rather than by aseismic slip on discrete fault planes.

  10. The 2007 Nazko, British Columbia, earthquake sequence: Injection of magma deep in the crust beneath the Anahim volcanic belt

    Science.gov (United States)

    Cassidy, J.F.; Balfour, N.; Hickson, C.; Kao, H.; White, Rickie; Caplan-Auerbach, J.; Mazzotti, S.; Rogers, Gary C.; Al-Khoubbi, I.; Bird, A.L.; Esteban, L.; Kelman, M.; Hutchinson, J.; McCormack, D.

    2011-01-01

    On 9 October 2007, an unusual sequence of earthquakes began in central British Columbia about 20 km west of the Nazko cone, the most recent (circa 7200 yr) volcanic center in the Anahim volcanic belt. Within 25 hr, eight earthquakes of magnitude 2.3-2.9 occurred in a region where no earthquakes had previously been recorded. During the next three weeks, more than 800 microearthquakes were located (and many more detected), most at a depth of 25-31 km and within a radius of about 5 km. After about two months, almost all activity ceased. The clear P- and S-wave arrivals indicated that these were high-frequency (volcanic-tectonic) earthquakes and the b value of 1.9 that we calculated is anomalous for crustal earthquakes but consistent with volcanic-related events. Analysis of receiver functions at a station immediately above the seismicity indicated a Moho near 30 km depth. Precise relocation of the seismicity using a double-difference method suggested a horizontal migration at the rate of about 0.5 km/d, with almost all events within the lowermost crust. Neither harmonic tremor nor long-period events were observed; however, some spasmodic bursts were recorded and determined to be colocated with the earthquake hypocenters. These observations are all very similar to a deep earthquake sequence recorded beneath Lake Tahoe, California, in 2003-2004. Based on these remarkable similarities, we interpret the Nazko sequence as an indication of an injection of magma into the lower crust beneath the Anahim volcanic belt. This magma injection fractures rock, producing high-frequency, volcanic-tectonic earthquakes and spasmodic bursts.
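The b value quoted above is the slope of the Gutenberg-Richter frequency-magnitude relation, conventionally estimated with the Aki (1965) maximum-likelihood formula b = log10(e) / (mean(M) - Mc). A sketch on a synthetic catalog with a known b of 1.9 (the catalog itself is simulated, not the Nazko data):

```python
import numpy as np

def b_value(mags, m_c, dm=0.0):
    """Aki (1965) maximum-likelihood b-value; dm/2 corrects for magnitude binning."""
    m = np.asarray(mags, float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

# Synthetic Gutenberg-Richter catalog: magnitudes above Mc=2.0 are
# exponentially distributed with scale log10(e)/b.
rng = np.random.default_rng(3)
b_true = 1.9
mags = 2.0 + rng.exponential(np.log10(np.e) / b_true, 5000)
b_est = b_value(mags, 2.0)
print(round(b_est, 2))
```

Tectonic crustal sequences typically yield b near 1, so recovering a value near 1.9 from a catalog is exactly the kind of anomaly the authors use to argue for a volcanic origin.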

  11. Great earthquakes along the Western United States continental margin: implications for hazards, stratigraphy and turbidite lithology

    Science.gov (United States)

    Nelson, C. H.; Gutiérrez Pastor, J.; Goldfinger, C.; Escutia, C.

    2012-11-01

    We summarize the importance of great earthquakes (Mw ≳ 8) for hazards, stratigraphy of basin floors, and turbidite lithology along the active tectonic continental margins of the Cascadia subduction zone and the northern San Andreas Transform Fault by utilizing studies of swath bathymetry, visual core descriptions, grain size analysis, X-ray radiographs, and physical properties. Recurrence times of Holocene turbidites as proxies for earthquakes on the Cascadia and northern California margins are analyzed using two methods: (1) radiometric dating (14C method), and (2) relative dating, using hemipelagic sediment thickness and sedimentation rates (H method). The H method provides (1) the best estimate of minimum recurrence times, which are the most important for seismic hazards risk analysis, and (2) the most complete dataset of recurrence times, which shows a normal distribution pattern for paleoseismic turbidite frequencies. We observe that, on these tectonically active continental margins, during the sea-level highstand of Holocene time, triggering of turbidity currents is controlled dominantly by earthquakes, and paleoseismic turbidites have an average recurrence time of ~550 yr in northern Cascadia Basin and ~200 yr along the northern California margin. The minimum recurrence times for great earthquakes are approximately 300 yr for the Cascadia subduction zone and 130 yr for the northern San Andreas Fault, which indicates that both fault systems are in (Cascadia) or very close to (San Andreas) the early window for another great earthquake. On active tectonic margins with great earthquakes, the volumes of mass transport deposits (MTDs) are limited on basin floors along the margins. The maximum run-out distances of MTD sheets across abyssal-basin floors along active margins are an order of magnitude less (~100 km) than on passive margins (~1000 km). The great earthquakes along the Cascadia and northern California margins cause seismic strengthening of the sediment, which
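The relative-dating "H method" above is simple arithmetic: the time between two earthquake-triggered turbidites is the thickness of hemipelagic sediment deposited between them divided by the hemipelagic sedimentation rate. A sketch with illustrative input values (the thickness and rate below are invented, chosen only to land on the ~550 yr Cascadia average quoted in the abstract):

```python
def recurrence_years(hemipelagic_thickness_cm, sed_rate_cm_per_kyr):
    """H-method inter-event time: hemipelagic thickness / sedimentation rate."""
    return 1000.0 * hemipelagic_thickness_cm / sed_rate_cm_per_kyr

# e.g. 11 cm of hemipelagic mud between turbidites at 20 cm/kyr -> 550 yr
print(recurrence_years(11.0, 20.0))
```

Because it needs no radiocarbon dates, the H method can be applied between every turbidite pair in a core, which is why it yields the more complete recurrence dataset the abstract describes.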

  12. Slip in the 1857 and earlier large earthquakes along the Carrizo Plain, San Andreas Fault.

    Science.gov (United States)

    Zielke, Olaf; Arrowsmith, J Ramón; Grant Ludwig, Lisa; Akçiz, Sinan O

    2010-02-26

    The moment magnitude (Mw) 7.9 Fort Tejon earthquake of 1857, with an approximately 350-kilometer-long surface rupture, was the most recent major earthquake along the south-central San Andreas Fault, California. Based on previous measurements of its surface slip distribution, rupture along the approximately 60-kilometer-long Carrizo segment was thought to control the recurrence of 1857-like earthquakes. New high-resolution topographic data show that the average slip along the Carrizo segment during the 1857 event was 5.3 ± 1.4 meters, eliminating the core assumption for a linkage between Carrizo segment rupture and recurrence of major earthquakes along the south-central San Andreas Fault. Earthquake slip along the Carrizo segment may recur in earthquake clusters with cumulative slip of approximately 5 meters.

  13. Economic consequences of earthquakes: bridging research and practice with HayWired

    Science.gov (United States)

    Wein, A. M.; Kroll, C.

    2016-12-01

    The U.S. Geological Survey partners with organizations and experts to develop multiple hazard scenarios. The HayWired earthquake scenario refers to a rupture of the Hayward fault in the Bay Area of California and addresses the potential chaos related to interconnectedness at many levels: the fault afterslip and aftershocks, interdependencies of lifelines, wired/wireless technology, communities at risk, and ripple effects throughout today's digital economy. The scenario is intended for diverse audiences. HayWired analyses translate earthquake hazards (surface rupture, ground shaking, liquefaction, landslides) into physical engineering and environmental health impacts, and into societal consequences. Damages to life and property and lifeline service disruptions are direct causes of business interruption. Economic models are used to estimate the economic impacts and resilience in the regional economy. The objective of the economic analysis is to inform policy discourse about economic resilience at all three levels of the economy: macro, meso, and micro. Stakeholders include businesses, economic development, and community leaders. Previous scenario analyses indicate the size of an event: large earthquakes and large winter storms are both "big ones" for California. They motivate actions to reduce the losses from fire following earthquake and water supply outages. They show the effect that resilience can have on reducing economic losses. Evaluators find that stakeholders learned the most about the economic consequences.

  14. Earthquake Strong Ground Motion Scenario at the 2008 Olympic Games Sites, Beijing, China

    Science.gov (United States)

    Liu, L.; Rohrbach, E. A.; Chen, Q.; Chen, Y.

    2006-12-01

    The historic earthquake record indicates that moderate to strong earthquakes have frequently struck the greater Beijing metropolitan area, which will host the 2008 Summer Olympic Games. As part of readiness preparation for emergency response to earthquake shaking during a mega event in a mega city such as Beijing in summer 2008, this paper constructs strong ground motion scenarios at a number of gymnasium sites for the 2008 Olympic Games. During the last 500 years (the Ming and Qing Dynasties), for which the historic earthquake record is thorough and complete, at least 12 earthquakes with maximum intensity VI or greater occurred within a 100 km radius centered at Tiananmen Square, the center of Beijing City. Numerical simulation of seismic wave propagation and surface strong ground motion is carried out with pseudospectral time-domain methods using viscoelastic material properties. To improve modeling efficiency and accuracy, a multi-scale approach is adopted: the seismic wave propagation originating from an earthquake rupture source is first simulated in a larger physical domain with coarser grids; the wavefield at a given plane is then taken as the source input for a small-scale, fine-grid model of the strong ground motion at the sites. The earthquake source rupture scenarios are based on two historic earthquakes: the Great 1679 Sanhe-Pinggu earthquake (M~8, maximum intensity XI at the epicenter and intensity VIII in the city center), whose epicenter was about 60 km ENE of the city center, and the 1730 Haidian earthquake (M~6, maximum intensity IX at the epicenter and intensity VIII in the city center), with an epicentral distance of less than 20 km from the city center in the NW Haidian District. 
The presence of thick Tertiary-Quaternary sediments (maximum thickness ~2 km) in the Beijing area plays a critical role in estimating surface ground motion at the Olympic Games sites, which

  15. National Earthquake Information Center Seismic Event Detections on Multiple Scales

    Science.gov (United States)

    Patton, J.; Yeck, W. L.; Benz, H.; Earle, P. S.; Soto-Cordero, L.; Johnson, C. E.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (NEIC) monitors seismicity on local, regional, and global scales using automatic picks from more than 2,000 near-real time seismic stations. This presents unique challenges in automated event detection due to the high variability in data quality, network geometries and density, and distance-dependent variability in observed seismic signals. To lower the overall detection threshold while minimizing false detection rates, NEIC has begun to test the incorporation of new detection and picking algorithms, including multiband (Lomax et al., 2012) and kurtosis (Baillard et al., 2014) pickers, and a new Bayesian associator (Glass 3.0). The Glass 3.0 associator allows for simultaneous processing of variably scaled detection grids, each with a unique set of nucleation criteria (e.g., nucleation threshold, minimum associated picks, nucleation phases) to meet specific monitoring goals. We test the efficacy of these new tools on event detection in networks of various scales and geometries, compare our results with previous catalogs, and discuss lessons learned. For example, we find that on local and regional scales, rapid nucleation of small events may require event nucleation with both P and higher-amplitude secondary phases (e.g., S or Lg). We provide examples of the implementation of a scale-independent associator for an induced seismicity sequence (local-scale), a large aftershock sequence (regional-scale), and for monitoring global seismicity. Baillard, C., Crawford, W. C., Ballu, V., Hibert, C., & Mangeney, A. (2014). An automatic kurtosis-based P-and S-phase picker designed for local seismic networks. Bulletin of the Seismological Society of America, 104(1), 394-409. Lomax, A., Satriano, C., & Vassallo, M. (2012). Automatic picker developments and optimization: FilterPicker - a robust, broadband picker for real-time seismic monitoring and earthquake early-warning, Seism. Res. Lett., 83, 531-540, doi: 10
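The kurtosis pickers cited above exploit the fact that the sliding-window kurtosis of a seismogram jumps when an impulsive arrival enters the window. The sketch below is a generic illustration of that principle on a synthetic trace, not the Baillard et al. implementation; the window length, noise level, and signal shape are all assumptions:

```python
import numpy as np

def kurtosis_pick(trace, win):
    """Pick = sample where the sliding-window kurtosis increases most sharply."""
    n = len(trace)
    k = np.zeros(n)
    for i in range(win, n):
        w = trace[i - win:i]
        c = w - w.mean()
        m2 = np.mean(c ** 2)
        k[i] = np.mean(c ** 4) / (m2 ** 2 + 1e-12)
    dk = np.diff(k[win:])              # skip the start-up region
    return win + 1 + int(np.argmax(dk))

rng = np.random.default_rng(4)
trace = 0.1 * rng.standard_normal(1000)            # background noise
onset = 600                                        # true P-arrival sample
t = np.arange(400)
trace[onset:] += np.cos(2 * np.pi * 0.05 * t) * np.exp(-t / 100.0)
pick = kurtosis_pick(trace, win=100)
print(pick)
```

Production pickers add band-pass prefiltering and pick refinement, but the core statistic is this same windowed fourth moment.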

  16. Chile Earthquake: U.S. and International Response

    Science.gov (United States)

    2010-03-11

    most regions far from the epicenter did not experience any serious damage. A tsunami caused significant damage to the city of Hilo, Hawaii ... Tsunami Warning Center for Hawaii, Japan, and other regions bordering the Pacific Ocean that may have been vulnerable to a damaging tsunami, although ... earthquake. Why the 1960 earthquake generated a tsunami that caused damage and fatalities in Hawaii, Japan, and the Philippines while the 2010 earthquake did

  17. Earthquake geology and paleoseismology of major strands of the San Andreas fault system: Chapter 38

    Science.gov (United States)

    Rockwell, Thomas; Scharer, Katherine M.; Dawson, Timothy E.

    2016-01-01

    The San Andreas fault system in California is one of the best-studied faults in the world, both in terms of the long-term geologic history and paleoseismic study of past surface ruptures. In this paper, we focus on the Quaternary to historic data that have been collected from the major strands of the San Andreas fault system, both on the San Andreas Fault itself, and the major subparallel strands that comprise the plate boundary, including the Calaveras-Hayward-Rogers Creek-Maacama fault zone and the Concord-Green Valley-Bartlett Springs fault zone in northern California, and the San Jacinto and Elsinore faults in southern California. The majority of the relative motion between the Pacific and North American lithospheric plates is accommodated by these faults, with the San Andreas slipping at about 34 mm/yr in central California, decreasing to about 20 mm/yr in northern California north of its juncture with the Calaveras and Concord faults. The Calaveras-Hayward-Rogers Creek-Maacama fault zone exhibits a slip rate of 10-15 mm/yr, whereas the rate along the Concord-Green Valley-Bartlett Springs fault zone is lower at about 5 mm/yr. In southern California, the San Andreas exhibits a slip rate of about 35 mm/yr along the Mojave section, decreasing to as low as 10-15 mm/yr along its juncture with the San Jacinto fault, and about 20 mm/yr in the Coachella Valley. The San Jacinto and Elsinore fault zones exhibit rates of about 15 and 5 mm/yr, respectively. The average recurrence interval for surface-rupturing earthquakes along individual elements of the San Andreas fault system ranges from 100 to 500 years and is consistent with the slip rate at those sites: higher slip rates produce more frequent or larger earthquakes. There is also evidence of short-term variations in strain release (slip rate) along various fault sections, as expressed as “flurries” or clusters of earthquakes as well as periods of relatively fewer surface ruptures in these relatively short records. This
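The stated consistency between recurrence interval and slip rate is a back-of-the-envelope balance: if a fault releases a characteristic slip per event, the recurrence interval is roughly that slip divided by the long-term slip rate. A one-line check using values quoted in the abstract (the ~5 m characteristic slip is an illustrative choice):

```python
def recurrence_yr(slip_per_event_m, slip_rate_mm_per_yr):
    """Recurrence interval implied by characteristic slip and slip rate."""
    return slip_per_event_m * 1000.0 / slip_rate_mm_per_yr

# e.g. ~5 m events on a 34 mm/yr section of the San Andreas
print(round(recurrence_yr(5.0, 34.0)))
```

The result (~150 yr) falls inside the 100-500 yr recurrence range the abstract reports for the fastest-slipping sections.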

  18. Shallow Crustal Structure in the Northern Salton Trough, California: Insights from a Detailed 3-D Velocity Model

    Science.gov (United States)

    Ajala, R.; Persaud, P.; Stock, J. M.; Fuis, G. S.; Hole, J. A.; Goldman, M.; Scheirer, D. S.

    2017-12-01

    The Coachella Valley is the northern extent of the Gulf of California-Salton Trough. It contains the southernmost segment of the San Andreas Fault (SAF) for which a magnitude 7.8 earthquake rupture was modeled to help produce earthquake planning scenarios. However, discrepancies in ground motion and travel-time estimates from the current Southern California Earthquake Center (SCEC) velocity model of the Salton Trough highlight inaccuracies in its shallow velocity structure. An improved 3-D velocity model that better defines the shallow basin structure and enables the more accurate location of earthquakes and identification of faults is therefore essential for seismic hazard studies in this area. We used recordings of 126 explosive shots from the 2011 Salton Seismic Imaging Project (SSIP) to SSIP receivers and Southern California Seismic Network (SCSN) stations. A set of 48,105 P-wave travel time picks constituted the highest-quality input to a 3-D tomographic velocity inversion. To improve the ray coverage, we added network-determined first arrivals at SCSN stations from 39,998 recently relocated local earthquakes, selected to a maximum focal depth of 10 km, to develop a detailed 3-D P-wave velocity model for the Coachella Valley with 1-km grid spacing. Our velocity model shows good resolution ( 50 rays/cubic km) down to a minimum depth of 7 km. Depth slices from the velocity model reveal several interesting features. At shallow depths ( 3 km), we observe an elongated trough of low velocity, attributed to sediments, located subparallel to and a few km SW of the SAF, and a general velocity structure that mimics the surface geology of the area. The persistence of the low-velocity sediments to 5-km depth just north of the Salton Sea suggests that the underlying basement surface, shallower to the NW, dips SE, consistent with interpretation from gravity studies (Langenheim et al., 2005). 
On the western side of the Coachella Valley, we detect depth-restricted regions of
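The travel-time inversion described in this abstract can be illustrated with a toy example: each ray's travel time is the sum over layers of path length times slowness, t = Ls, and the slowness is recovered from the picks. The three-layer geometry, path lengths, and velocities below are invented for illustration and are not from the SSIP data; the rays are chosen so the system is triangular, a simplified special case of the least-squares tomography the abstract describes.

```python
# Toy straight-ray travel-time inversion (illustrative; not the SSIP model).
# Three layers; a ray's travel time is sum over layers of
# (path length in layer) * (layer slowness).
true_v = [2.0, 4.0, 6.0]                 # km/s, invented layer velocities
true_s = [1.0 / v for v in true_v]       # slowness, s/km

# path lengths (km) of three rays through the three layers
L = [
    [4.0, 0.0, 0.0],   # shallow ray: layer 1 only
    [4.0, 6.0, 0.0],   # ray bottoming in layer 2
    [4.0, 6.0, 8.0],   # deep ray sampling all layers
]
t_obs = [sum(l * s for l, s in zip(row, true_s)) for row in L]  # "picks"

# invert by forward substitution: each successive ray adds one new layer,
# so its slowness follows from the residual travel time
s_est = []
for i, row in enumerate(L):
    residual = t_obs[i] - sum(row[j] * s_est[j] for j in range(i))
    s_est.append(residual / row[i])
v_est = [1.0 / s for s in s_est]
print([round(v, 2) for v in v_est])  # recovers [2.0, 4.0, 6.0]
```

Real tomography solves a large, overdetermined, noisy version of this system by damped least squares, with resolution judged by ray coverage per cell, as in the abstract's ≥50 rays per cubic km criterion.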

  19. ShakeAlert—An earthquake early warning system for the United States west coast

    Science.gov (United States)

    Burkett, Erin R.; Given, Douglas D.; Jones, Lucile M.

    2014-08-29

    Earthquake early warning systems use earthquake science and the technology of monitoring systems to alert devices and people when shaking waves generated by an earthquake are expected to arrive at their location. The seconds to minutes of advance warning can allow people and systems to take actions to protect life and property from destructive shaking. The U.S. Geological Survey (USGS), in collaboration with several partners, has been working to develop an early warning system for the United States. ShakeAlert, a system currently under development, is designed to cover the West Coast States of California, Oregon, and Washington.

  20. Report on the 2010 Chilean earthquake and tsunami response

    Science.gov (United States)


    2011-01-01

    In July 2010, in an effort to reduce future catastrophic natural disaster losses for California, the American Red Cross coordinated and sent a delegation of 20 multidisciplinary experts on earthquake response and recovery to Chile. The primary goal was to understand how the Chilean society and relevant organizations responded to the magnitude 8.8 Maule earthquake that struck the region on February 27, 2010, as well as how an application of these lessons could better prepare California communities, response partners and state emergency partners for a comparable situation. Similarities in building codes, socioeconomic conditions, and broad extent of the strong shaking make the Chilean earthquake a very close analog to the impact of future great earthquakes on California. To withstand and recover from natural and human-caused disasters, it is essential for citizens and communities to work together to anticipate threats, limit effects, and rapidly restore functionality after a crisis. The delegation was hosted by the Chilean Red Cross and received extensive briefings from both national and local Red Cross officials. During nine days in Chile, the delegation also met with officials at the national, regional, and local government levels. Technical briefings were received from the President’s Emergency Committee, emergency managers from ONEMI (comparable to FEMA), structural engineers, a seismologist, hospital administrators, firefighters, and the United Nations team in Chile. Cities visited include Santiago, Talca, Constitución, Concepción, Talcahuano, Tumbes, and Cauquenes. The American Red Cross Multidisciplinary Team consisted of subject matter experts, who carried out special investigations in five Teams on the (1) science and engineering findings, (2) medical services, (3) emergency services, (4) volunteer management, and (5) executive and management issues (see appendix A for a full list of participants and their titles and teams). While developing this

  1. Seismic resistance of equipment and building service systems: review of earthquake damage design requirements, and research applications in the USA

    International Nuclear Information System (INIS)

    Skjei, R.E.; Chakravartula, B.C.; Yanev, P.I.

    1979-01-01

    The history of earthquake damage and the resulting code design requirements for earthquake hazard mitigation for equipment in the USA is reviewed. Earthquake damage to essential service systems is summarized; observations for the 1964 Alaska and the 1971 San Fernando, California, earthquakes are stressed, and information from other events is included. USA building codes that reflect lessons learned from these earthquakes are discussed; brief summaries of widely used codes are presented. In conclusion there is a discussion of the desirability of adapting advanced technological concepts from the nuclear industry to equipment in conventional structures. (author)

  2. A Kinesthetic Demonstration for Locating Earthquake Epicenters

    Science.gov (United States)

    Keyantash, J.; Sperber, S.

    2005-12-01

During Spring 2005, an inquiry-based curriculum for plate tectonics was developed for implementation in sixth-grade classrooms within the Los Angeles Unified School District (LAUSD). Two cohorts of LAUSD teachers received training and orientation to the plate tectonics unit during one-week workshops in July 2005. However, during the training workshops, it was observed that there was considerable confusion among the teachers as to how the traditional "textbook" explanation of the time lag between P and S waves on a seismogram could possibly be used to determine the epicenter of an earthquake. One of the State of California science content standards for sixth-grade students is that they understand how the epicenters of earthquakes are determined, so it was critical that the teachers themselves grasped the concept. In response to the adult learner difficulties, the classroom explanation of earthquake epicenter location was supplemented with an outdoor kinesthetic activity. Based upon the experience of the kinesthetic model, it was found that the hands-on model greatly cemented the teachers' understanding of the underlying theory. This paper details the steps of the kinesthetic demonstration for earthquake epicenter identification, as well as offering extended options for its classroom implementation.
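The "textbook" S-minus-P method the teachers wrestled with reduces to a one-line calculation: because S waves travel more slowly than P waves, the lag between the two arrivals grows with distance from the epicenter. A minimal sketch, assuming typical crustal velocities of 6 km/s (P) and 3.5 km/s (S):

```python
# Epicentral distance from the S-minus-P lag, as in the textbook method
# discussed in the workshops. Velocities are typical crustal values (assumed).
vp, vs = 6.0, 3.5   # km/s, illustrative P and S wave speeds

def distance_from_sp_lag(lag_s):
    """Distance (km) at which S arrives lag_s seconds after P."""
    return lag_s / (1.0 / vs - 1.0 / vp)

# A 10 s S-P lag puts the epicenter 84 km from the station.
d = distance_from_sp_lag(10.0)
print(round(d, 1))  # 84.0
```

Distances from three or more stations are then intersected as circles on a map, which is exactly the step the kinesthetic activity has participants act out in the schoolyard.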

  3. Earthquake swarms and the semidiurnal solid earth tide

    Energy Technology Data Exchange (ETDEWEB)

    Klein, F W

    1976-01-01

    Several correlations between peak earthquake activity during swarms and the phase and stress orientation of the calculated solid earth tide are described. The events correlating with the tide are clusters of swarm earthquakes. Swarm clusters from many sequences recorded over several years are used. Significant tidal correlations (which have less than a 5% chance of being observed if earthquakes were random) are found in the Reykjanes Peninsula in Iceland, the central Mid-Atlantic Ridge, the Imperial Valley and northern Gulf of California, and larger (m/sub b/ greater than or equal to 5.0) aftershocks of the 1965 Rat Islands earthquake. In addition, sets of larger single earthquakes on Atlantic and north-east Pacific fracture zones are significantly correlated with the calculated solid tide. No tidal correlation, however, could be found for the Matsushiro Japan swarm of 1965 to 1967. The earthquake-tide correlations other than those of the Reykjanes Peninsula and Mid-Atlantic Ridge can be interpreted as triggering caused by enhancement of the tectonic stress by tidal stress, i.e. the alignment of fault and tidal principal stresses. All tidal correlations except in the Aleutians are associated with oceanic rifts or their landward extensions. If lithospheric plates are decoupled at active rifts, then tidal stresses channeled along the lithospheric stress guide may be concentrated at ridge-type plate boundaries. Tidal triggering of earthquakes at rifts may reflect this possible amplification of tidal strains in the weakened lithosphere at ridges. 25 figures, 2 tables.
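A standard way to quantify whether events cluster at a particular tidal phase (the "less than a 5% chance of being observed if earthquakes were random" criterion) is a Schuster test, which sums unit vectors at each event's tidal phase and compares the resultant length to the random-walk expectation. The sketch below is a generic illustration of that test, not necessarily Klein's exact procedure:

```python
import math, random

def schuster_p(phases_rad):
    """Schuster test: probability that the observed clustering of tidal
    phases would arise if phases were uniformly random."""
    n = len(phases_rad)
    rc = sum(math.cos(p) for p in phases_rad)
    rs = sum(math.sin(p) for p in phases_rad)
    return math.exp(-(rc * rc + rs * rs) / n)

random.seed(1)
# 200 events with random phases vs. 200 events concentrated near phase 0
uniform = [random.uniform(0, 2 * math.pi) for _ in range(200)]
clustered = [random.gauss(0.0, 0.5) for _ in range(200)]

p_uniform = schuster_p(uniform)
p_clustered = schuster_p(clustered)
print(p_uniform, p_clustered)
```

A p-value below 0.05 for the clustered set corresponds to the 5% significance threshold quoted in the abstract; the uniform set yields an unremarkable p-value.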

  4. Determining on-fault earthquake magnitude distributions from integer programming

    Science.gov (United States)

    Geist, Eric L.; Parsons, Thomas E.

    2018-01-01

Earthquake magnitude distributions among faults within a fault system are determined from regional seismicity and fault slip rates using binary integer programming. A synthetic earthquake catalog (i.e., list of randomly sampled magnitudes) that spans millennia is first formed, assuming that regional seismicity follows a Gutenberg-Richter relation. Each earthquake in the synthetic catalog can occur on any fault and at any location. The objective is to minimize misfits in the target slip rate for each fault, where slip for each earthquake is scaled from its magnitude. The decision vector consists of binary variables indicating which locations are optimal among all possibilities. Uncertainty estimates in fault slip rates provide explicit upper and lower bounding constraints to the problem. An implicit constraint is that an earthquake can only be located on a fault if it is long enough to contain that earthquake. A general mixed-integer programming solver, consisting of a number of different algorithms, is used to determine the optimal decision vector. A case study is presented for the State of California, where a 4 kyr synthetic earthquake catalog is created and faults with slip ≥3 mm/yr are considered, resulting in >10^6 variables. The optimal magnitude distributions for each of the faults in the system span a rich diversity of shapes, ranging from characteristic to power-law distributions.
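In miniature, this is an assignment problem: place each synthetic earthquake on a fault so that the accumulated slip best matches the target slip rates, subject to the implicit constraint that the fault is long enough to contain the rupture. The toy below brute-forces a two-fault, three-event version (the study itself uses a general mixed-integer solver on more than a million binary variables); all numbers are invented.

```python
from itertools import product

# Toy version of the binary-assignment idea (illustrative only). Each
# synthetic event goes on exactly one fault; we minimize total slip-rate
# misfit subject to the rupture-length feasibility constraint.
events = [  # (magnitude, rupture length km, slip-rate contribution mm/yr)
    (7.0, 40.0, 2.0),
    (6.5, 15.0, 1.0),
    (6.0, 8.0, 0.5),
]
faults = [  # (fault length km, target slip rate mm/yr)
    (50.0, 3.0),
    (20.0, 1.0),
]

best = None
for assign in product(range(len(faults)), repeat=len(events)):
    # feasibility: an event's rupture must fit on its assigned fault
    if any(events[e][1] > faults[f][0] for e, f in enumerate(assign)):
        continue
    slip = [0.0] * len(faults)
    for e, f in enumerate(assign):
        slip[f] += events[e][2]
    misfit = sum(abs(s - tgt) for s, (_, tgt) in zip(slip, faults))
    if best is None or misfit < best[0]:
        best = (misfit, assign)

misfit, assign = best
print(assign, misfit)
```

Brute force is exponential in the number of events, which is exactly why the full-scale California problem requires a real integer-programming solver.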

  5. Multiple Landslide-Hazard Scenarios Modeled for the Oakland-Berkeley Area, Northern California

    Science.gov (United States)

    Pike, Richard J.; Graymer, Russell W.

    2008-01-01

    With the exception of Los Angeles, perhaps no urban area in the United States is more at risk from landsliding, triggered by either precipitation or earthquake, than the San Francisco Bay region of northern California. By January each year, seasonal winter storms usually bring moisture levels of San Francisco Bay region hillsides to the point of saturation, after which additional heavy rainfall may induce landslides of various types and levels of severity. In addition, movement at any time along one of several active faults in the area may generate an earthquake large enough to trigger landslides. The danger to life and property rises each year as local populations continue to expand and more hillsides are graded for development of residential housing and its supporting infrastructure. The chapters in the text consist of: *Introduction by Russell W. Graymer *Chapter 1 Rainfall Thresholds for Landslide Activity, San Francisco Bay Region, Northern California by Raymond C. Wilson *Chapter 2 Susceptibility to Deep-Seated Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Richard J. Pike and Steven Sobieszczyk *Chapter 3 Susceptibility to Shallow Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Kevin M. Schmidt and Steven Sobieszczyk *Chapter 4 Landslide Hazard Modeled for the Cities of Oakland, Piedmont, and Berkeley, Northern California, from a M=7.1 Scenario Earthquake on the Hayward Fault Zone by Scott B. Miles and David K. Keefer *Chapter 5 Synthesis of Landslide-Hazard Scenarios Modeled for the Oakland-Berkeley Area, Northern California by Richard J. Pike The plates consist of: *Plate 1 Susceptibility to Deep-Seated Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Richard J. Pike, Russell W. Graymer, Sebastian Roberts, Naomi B. Kalman, and Steven Sobieszczyk *Plate 2 Susceptibility to Shallow Landsliding Modeled for the Oakland-Berkeley Area, Northern California by Kevin M. 
Schmidt and Steven

  6. Hotspots, Lifelines, and the SAFRR HayWired Earthquake Sequence

    Science.gov (United States)

    Ratliff, J. L.; Porter, K.

    2014-12-01

Though California has experienced many large earthquakes (San Francisco, 1906; Loma Prieta, 1989; Northridge, 1994), the San Francisco Bay Area has not had a damaging earthquake for 25 years. Earthquake risk and surging reliance on smartphones and the Internet to handle everyday tasks raise the question: is an increasingly technology-reliant Bay Area prepared for potential infrastructure impacts caused by a major earthquake? How will a major earthquake on the Hayward Fault affect lifelines (roads, power, water, communication, etc.)? The U.S. Geological Survey Science Application for Risk Reduction (SAFRR) program's HayWired disaster scenario, a hypothetical two-year earthquake sequence triggered by a M7.05 mainshock on the Hayward Fault, addresses these and other questions. We explore four geographic aspects of lifeline damage from earthquakes: (1) geographic lifeline concentrations, (2) areas where lifelines pass through high shaking or potential ground-failure zones, (3) areas with diminished lifeline service demand due to severe building damage, and (4) areas with increased lifeline service demand due to displaced residents and businesses. Potential mainshock lifeline vulnerability and spatial demand changes will be discerned by superimposing earthquake shaking, liquefaction probability, and landslide probability damage thresholds with lifeline concentrations and with large-capacity shelters. Intersecting high hazard levels and lifeline clusters represent potential lifeline susceptibility hotspots. We will also analyze possible temporal vulnerability and demand changes using an aftershock shaking threshold. The results of this analysis will inform regional lifeline resilience initiatives and response and recovery planning, as well as reveal potential redundancies and weaknesses for Bay Area lifelines. Identified spatial and temporal hotspots can provide stakeholders with a reference for possible systemic vulnerability resulting from an earthquake sequence.

  7. Earthquake Early Warning: Real-time Testing of an On-site Method Using Waveform Data from the Southern California Seismic Network

    Science.gov (United States)

    Solanki, K.; Hauksson, E.; Kanamori, H.; Wu, Y.; Heaton, T.; Boese, M.

    2007-12-01

We have implemented an on-site early warning algorithm using the infrastructure of the Caltech/USGS Southern California Seismic Network (SCSN). We are evaluating the real-time performance of the software system and the algorithm for rapid assessment of earthquakes. In addition, we are interested in understanding what parts of the SCSN need to be improved to make early warning practical. Our EEW processing system is composed of many independent programs that process waveforms in real-time. The code was generated using a software framework. The Pd (maximum displacement amplitude of the P wave during the first 3 sec) and Tau-c (a period parameter during the first 3 sec) values determined during the EEW processing are being forwarded to the California Integrated Seismic Network (CISN) web page for independent evaluation of the results. The on-site algorithm measures the amplitude of the P-wave (Pd) and the frequency content of the P-wave during the first three seconds (Tau-c). The Pd and Tau-c values make it possible to discriminate between a variety of events such as large distant events, nearby small events, and potentially damaging nearby events. The Pd can be used to infer the expected maximum ground shaking. The method relies on data from a single station, although it becomes more reliable if readings from several stations are associated. To eliminate false triggers from stations with a high background noise level, we have created a per-station Pd threshold configuration for the Pd/Tau-c algorithm. To determine appropriate values, we calculate Pd thresholds for stations based on the information from the EEW logs. We have operated our EEW test system for about a year and recorded numerous earthquakes in the magnitude range from M3 to M5. Two recent examples are a M4.5 earthquake near Chatsworth and a M4.7 earthquake near Elsinore.
In both cases, the Pd and Tau-c parameters were determined successfully within 10 to 20 sec of the arrival of the
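Both parameters can be computed directly from the first seconds of a displacement record. The sketch below uses Kanamori-style definitions (Pd as peak |displacement| in the first 3 s; Tau-c = 2π·sqrt(∫u² dt / ∫u̇² dt) over the same window) on a synthetic sinusoid; the signal amplitude, frequency, and sample rate are invented for illustration:

```python
import math

# Pd and Tau-c from the first 3 s of a synthetic displacement record.
# For a pure tone at frequency f, Tau-c should come out near 1/f.
dt = 0.01                      # sample interval, s
n = int(3.0 / dt)              # 3-second window
f = 1.0                        # Hz (so expected Tau-c ~ 1 s)
u = [0.02 * math.sin(2 * math.pi * f * i * dt) for i in range(n)]

pd = max(abs(x) for x in u)    # peak displacement in the window

# central-difference velocity, then the two integrals
udot = [(u[i + 1] - u[i - 1]) / (2 * dt) for i in range(1, n - 1)]
num = sum(x * x for x in u[1:n - 1]) * dt
den = sum(x * x for x in udot) * dt
tau_c = 2 * math.pi * math.sqrt(num / den)
print(round(pd, 3), round(tau_c, 2))
```

Intuitively, a nearby damaging event produces both a large Pd and a long Tau-c, while a small nearby event gives a large Pd with short Tau-c, which is the discrimination the abstract describes.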

  8. FIELD SURVEY REPORT OF TSUNAMI EFFECTS CAUSED BY THE AUGUST 2012 OFFSHORE EL SALVADOR EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    Francisco Gavidia-Medina

    2015-10-01

    Full Text Available This report describes the field survey of the western zone of El Salvador conducted by an international group of scientists and engineers following the earthquake and tsunami of 27 August 2012 (04:37 UTC; 26 August, 10:37 pm local time). The earthquake generated a tsunami with a maximum height of ~6 m, causing inundation of up to 300 m inland along a 40-km section of coastline in eastern El Salvador. (Note: Presentation from the 6th International Tsunami Symposium of Tsunami Society International in Costa Rica in September 2014, based on the field survey report of the tsunami effects caused by the August 2012 earthquake, compiled by Jose C. Borrero of the University of California Tsunami Research Center. Contributors to that report and field survey participants included Hermann M. Fritz of the Georgia Institute of Technology; Francisco Gavidia-Medina, Jeniffer Larreynaga-Murcia, Rodolfo Torres-Cornejo, Manuel Diaz-Flores, and Fabio Alvarad of the Ministerio de Medio Ambiente y Recursos Naturales de El Salvador (MARN); Norwin Acosta of the Instituto Nicaragüense de Estudios Territoriales (INOTER); Julie Leonard of the Office of Foreign Disaster Assistance (USAID/OFDA); Nic Arcos of the International Tsunami Information Center (ITIC); and Diego Arcas of the Pacific Marine Environmental Laboratory (NOAA/PMEL). The figures of this paper are from the report compiled by Jose C. Borrero and are numbered out of sequence from the compiled joint report. The quality of figures 2.2, 2.3, and 2.4 is rather poor, and the reader is referred to the original report, as shown in the references.)

  9. Crowd-Sourced Global Earthquake Early Warning

    Science.gov (United States)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
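The GNSS-plus-accelerometer fusion mentioned above can be sketched with a textbook one-dimensional Kalman filter: the accelerometer drives the prediction step, and each noisy GNSS position fix corrects it. All noise levels and the constant-acceleration "ground motion" below are illustrative stand-ins, not calibrated to any real consumer device:

```python
import random

# Minimal 1-D Kalman filter fusing noisy "GNSS" position fixes with
# "accelerometer" input (toy numbers). State: [position, velocity].
random.seed(0)
dt, steps = 0.1, 300
true_accel = 0.05          # m/s^2, constant for the toy example

x, v = 0.0, 0.0            # state estimate
P = [[1.0, 0.0], [0.0, 1.0]]
q, r = 1e-4, 1.0           # process noise, GNSS variance (1 m std dev)

tp = tv = 0.0              # true position/velocity
for _ in range(steps):
    tv += true_accel * dt
    tp += tv * dt
    a_meas = true_accel + random.gauss(0, 0.01)   # noisy accelerometer
    z = tp + random.gauss(0, 1.0)                 # noisy GNSS position fix

    # predict: accelerometer as control input
    x, v = x + v * dt + 0.5 * a_meas * dt * dt, v + a_meas * dt
    P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
          P[0][1] + dt * P[1][1]],
         [P[1][0] + dt * P[1][1], P[1][1] + q]]
    # update with the GNSS position measurement
    k0 = P[0][0] / (P[0][0] + r)
    k1 = P[1][0] / (P[0][0] + r)
    innov = z - x
    x += k0 * innov
    v += k1 * innov
    P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
         [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]

err = abs(x - tp)
print(round(tp, 2), round(x, 2))
```

Even with 1-m GNSS noise, the fused estimate tracks the true displacement far more tightly than the raw fixes, which is the effect that lowers the detection threshold from ~1 m toward the few-centimeter level cited in the abstract.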

  10. Observations of an ionospheric perturbation arising from the Coalinga earthquake of May 2, 1983

    International Nuclear Information System (INIS)

    Wolcott, J.H.; Simons, D.J.; Lee, D.D.; Nelson, R.A.

    1984-01-01

An ionospheric perturbation that was produced by the Coalinga earthquake of May 2, 1983, was detected by a network of high-frequency radio links in northern California. The ionospheric refraction regions of all five HF propagation paths, at distances between 160 and 285 km (horizontal range) from the epicenter, were affected by a ground-motion-induced acoustic pulse that propagated to ionospheric heights. The acoustic pulse was produced by the earthquake-induced seismic waves rather than the vertical ground motion above the epicenter. These observations appear to be the first ionospheric disturbances to be reported this close to an earthquake epicenter.

  11. Determination of Focal Depths of Earthquakes in the Mid-Oceanic Ridges from Amplitude Spectra of Surface Waves

    Science.gov (United States)

    1969-06-01

    Foreshock, mainshock, and aftershock of the Parkfield, California earthquake of June 28, 1966. b. The Denver earthquake of August 9, 1967. Let us look into the results of these tests in more detail. (1) Test on the mainshock, foreshock, and aftershock of the Parkfield earthquake of June 28, 1966. According to McEvilly et al. (1967), the origin times and locations of these events were the following: Foreshock: June 28, 1966, 04:08:56.2 GMT; 35° 57.6

  12. Understanding Earthquakes

    Science.gov (United States)

    Davis, Amanda; Gray, Ron

    2018-01-01

    December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

  13. Earthquake warning system for Japan Railways’ bullet train; implications for disaster prevention in California

    Science.gov (United States)

    Nakamura, Y.; Tucker, B. E.

    1988-01-01

In Japan, the level of public awareness of the dangers of earthquakes is high. The 1923 Kanto earthquake killed about 120,000 people out of a total Japanese population of about 50 million; an equivalent disaster in the U.S. would involve 600,000 deaths.

  14. Irregular recurrence of large earthquakes along the San Andreas fault: evidence from trees.

    Science.gov (United States)

    Jacoby, G C; Sheppard, P R; Sieh, K E

    1988-07-08

    Old trees growing along the San Andreas fault near Wrightwood, California, record in their annual ring-width patterns the effects of a major earthquake in the fall or winter of 1812 to 1813. Paleoseismic data and historical information indicate that this event was the "San Juan Capistrano" earthquake of 8 December 1812, with a magnitude of 7.5. The discovery that at least 12 kilometers of the Mojave segment of the San Andreas fault ruptured in 1812, only 44 years before the great January 1857 rupture, demonstrates that intervals between large earthquakes on this part of the fault are highly variable. This variability increases the uncertainty of forecasting destructive earthquakes on the basis of past behavior and accentuates the need for a more fundamental knowledge of San Andreas fault dynamics.

  15. Earthquake forecasting test for Kanto district to reduce vulnerability of urban mega earthquake disasters

    Science.gov (United States)

    Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

    2012-12-01

Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project on earthquake predictability research. The final goal of this project is to search for the intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, CSEP-Japan. This testing center provides open access to researchers contributing earthquake forecast models applied to Japan. More than 100 earthquake forecast models have now been submitted to the prospective experiment. The models are separated into 4 testing classes (1 day, 3 months, 1 year and 3 years) and 3 testing regions covering an area of Japan including sea area, the Japanese mainland, and the Kanto district. We evaluate the performance of the models in the official suite of tests defined by CSEP. Approximately 300 rounds of experiments have been implemented. These results provide new knowledge concerning statistical forecasting models. We started a study for constructing a 3-dimensional earthquake forecasting model for the Kanto district in Japan based on CSEP experiments under the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters. Because seismicity in the area extends from shallow depths down to 80 km due to the subducting Philippine Sea and Pacific plates, we need to study the effect of the depth distribution. We will develop models for forecasting based on the results of 2-D modeling. We defined the 3-D forecasting area in the Kanto region with test classes of 1 day, 3 months, 1 year and 3 years, and magnitudes from 4.0 to 9.0, as in CSEP-Japan. In the first step of the study, we will install the RI10K model (Nanjo, 2011) and the HIST-ETAS models (Ogata, 2011) to determine whether those models perform as well as in the 3-month 2-D CSEP-Japan experiments in the Kanto region before the 2011 Tohoku event (Yokoi et al., in preparation).
We use CSEP
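Among the CSEP consistency tests referenced above, the simplest is the number (N) test, which compares the observed event count in a testing round against the forecast's expected Poisson rate. A minimal sketch with invented numbers:

```python
import math

# Poisson N-test sketch, one of the standard CSEP consistency checks:
# given a forecast expecting `forecast_rate` events in a testing window,
# how probable is an observed count at least (or at most) this extreme?
def poisson_cdf(k, lam):
    """P(N <= k) for N ~ Poisson(lam)."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i)
               for i in range(k + 1))

forecast_rate = 10.0   # expected events in the window (illustrative)
observed = 18          # observed events (illustrative)

p_too_many = 1.0 - poisson_cdf(observed - 1, forecast_rate)  # P(N >= obs)
p_too_few = poisson_cdf(observed, forecast_rate)             # P(N <= obs)
rejected = min(p_too_many, p_too_few) < 0.025  # two-sided test at ~5%
print(round(p_too_many, 4), rejected)
```

Here 18 observed events against a forecast rate of 10 is improbable enough that the forecast fails the two-sided test; the full CSEP suite adds spatial, magnitude, and likelihood tests on the same principle.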

  16. Real-Time Earthquake Monitoring with Spatio-Temporal Fields

    Science.gov (United States)

    Whittier, J. C.; Nittel, S.; Subasinghe, I.

    2017-10-01

With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope need to write tedious application code to integrate potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open source data stream engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real-time.
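The windowed query described here (per-stream maximum displacement over the latest window, related across spatial neighbors) can be sketched in plain Python. This stand-in ignores Spark entirely, and the station IDs, neighbor graph, threshold, and window length are all hypothetical:

```python
from collections import deque

# Sketch of the field-stream query: per-sensor max displacement over the
# latest W samples, then a spatial check that several neighboring sensors
# exceed a threshold together. (Plain-Python stand-in for the Spark/DSE
# version; station names and numbers are hypothetical.)
W, THRESH, MIN_STATIONS = 5, 1.0, 2

class SensorWindow:
    def __init__(self):
        self.buf = deque(maxlen=W)   # sliding window of displacements
    def push(self, disp):
        self.buf.append(disp)
        return max(self.buf)         # windowed max, the query's aggregate

sensors = {"P494": SensorWindow(), "P495": SensorWindow(), "P496": SensorWindow()}
neighbors = {"P494": ["P495"], "P495": ["P494", "P496"], "P496": ["P495"]}

# one time step of (hypothetical) displacements in meters
sample = {"P494": 1.4, "P495": 1.2, "P496": 0.1}
window_max = {sid: sensors[sid].push(v) for sid, v in sample.items()}

events = []
for sid, m in window_max.items():
    if m < THRESH:
        continue
    group = [sid] + [n for n in neighbors[sid] if window_max[n] >= THRESH]
    if len(group) >= MIN_STATIONS:
        events.append(sorted(group))
print(events)
```

The neighbor check is what turns noisy per-station exceedances into a spatially coherent event with an extent, mirroring the abstract's "relate spatially neighboring streams" step.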

  17. A Test Case for the Source Inversion Validation: The 2014 ML 5.5 Orkney, South Africa Earthquake

    Science.gov (United States)

    Ellsworth, W. L.; Ogasawara, H.; Boettcher, M. S.

    2017-12-01

The ML 5.5 earthquake of August 5, 2014 occurred on a near-vertical strike-slip fault below abandoned and active gold mines near Orkney, South Africa. A dense network of surface and in-mine seismometers recorded the earthquake and its aftershock sequence. In-situ stress measurements and rock samples through the damage zone and rupture surface are anticipated to be available from the "Drilling into Seismogenic Zones of M2.0-M5.5 Earthquakes in South African gold mines" project (DSeis), which is currently progressing toward the rupture zone (Science, doi: 10.1126/science.aan6905). As of 24 July, 95% of the drilled core has been recovered from a 427-m section of the first hole from 2.9 km depth, with minimal core discing and borehole breakouts. A second hole is planned to intersect the fault at greater depth. Absolute differential stress will be measured along the holes, and frictional characteristics of the recovered core will be determined in the lab. Surface seismic reflection data and exploration drilling from the surface down to the mining horizon at 3 km depth are also available to calibrate the velocity structure above the mining horizon and image reflective geological boundaries and major faults below the mining horizon. The remarkable quality and range of geophysical data available for the Orkney earthquake make this event an ideal test case for the Source Inversion Validation community using actual seismic data to determine the spatial and temporal evolution of earthquake rupture. We invite anyone with an interest in kinematic modeling to develop a rupture model for the Orkney earthquake. Seismic recordings of the earthquake and information on the faulting geometry can be found in Moyer et al. (2017, doi: 10.1785/0220160218). A workshop supported by the Southern California Earthquake Center will be held in the spring of 2018 to compare kinematic models. Those interested in participating in the modeling exercise and the workshop should contact the authors for additional

  18. Earthquake correlations and networks: A comparative study

    International Nuclear Information System (INIS)

    Krishna Mohan, T. R.; Revathi, P. G.

    2011-01-01

    We quantify the correlation between earthquakes and use the same to extract causally connected earthquake pairs. Our correlation metric is a variation on the one introduced by Baiesi and Paczuski [M. Baiesi and M. Paczuski, Phys. Rev. E 69, 066106 (2004)]. A network of earthquakes is then constructed from the time-ordered catalog and with links between the more correlated ones. A list of recurrences to each of the earthquakes is identified employing correlation thresholds to demarcate the most meaningful ones in each cluster. Data pertaining to three different seismic regions (viz., California, Japan, and the Himalayas) are comparatively analyzed using such a network model. The distribution of recurrence lengths and recurrence times are two of the key features analyzed to draw conclusions about the universal aspects of such a network model. We find that the unimodal feature of recurrence length distribution, which helps to associate typical rupture lengths with different magnitude earthquakes, is robust across the different seismic regions. The out-degree of the networks shows a hub structure rooted on the large magnitude earthquakes. In-degree distribution is seen to be dependent on the density of events in the neighborhood. Power laws, with two regimes having different exponents, are obtained with recurrence time distribution. The first regime confirms the Omori law for aftershocks while the second regime, with a faster falloff for the larger recurrence times, establishes that pure spatial recurrences also follow a power-law distribution. The crossover to the second power-law regime can be taken to be signaling the end of the aftershock regime in an objective fashion.
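The Baiesi-Paczuski-style metric assigns each ordered event pair a number of the form n_ij = c · t_ij · r_ij^df · 10^(-b·m_i), elapsed time times distance raised to a fractal dimension times a magnitude weight; the smaller n_ij, the more strongly event j is correlated with the earlier event i. A minimal sketch with an invented three-event catalog and illustrative b and df (the constant c is dropped, since it does not affect rankings):

```python
import math

# Sketch of a Baiesi-Paczuski-style correlation metric for earthquake pairs:
# n_ij = t_ij * r_ij**df * 10**(-b * m_i), i preceding j.
# Smaller n_ij means stronger correlation. Parameters are illustrative.
b, df = 1.0, 1.6

catalog = [  # (time days, x km, y km, magnitude) -- invented events
    (0.0, 0.0, 0.0, 6.0),
    (1.0, 5.0, 0.0, 3.0),
    (2.0, 100.0, 80.0, 3.0),
]

def metric(ei, ej):
    ti, xi, yi, mi = ei
    tj, xj, yj, _ = ej
    t = tj - ti                      # elapsed time
    r = math.hypot(xj - xi, yj - yi)  # epicentral distance
    return t * r**df * 10 ** (-b * mi)

# link each event to its most correlated (smallest-n) predecessor
parents = {}
for j in range(1, len(catalog)):
    n_vals = {i: metric(catalog[i], catalog[j]) for i in range(j)}
    parents[j] = min(n_vals, key=n_vals.get)
print(parents)
```

Note how the magnitude weight lets the large first event claim even the distant third event as a recurrence: that preferential attachment is the hub structure, rooted on large-magnitude earthquakes, that the abstract reports in the out-degree distribution.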

  19. Clustering and periodic recurrence of microearthquakes on the San Andreas fault at Parkfield, California.

    Science.gov (United States)

    Nadeau, R M; Foxall, W; McEvilly, T V

    1995-01-27

    The San Andreas fault at Parkfield, California, apparently late in an interval between repeating magnitude 6 earthquakes, is yielding to tectonic loading partly by seismic slip concentrated in a relatively sparse distribution of small clusters (<20-meter radius) of microearthquakes. Within these clusters, which account for 63% of the earthquakes in a 1987-92 study interval, virtually identical small earthquakes occurred with a regularity that can be described by the statistical model used previously in forecasting large characteristic earthquakes. Sympathetic occurrence of microearthquakes in nearby clusters was observed within a range of about 200 meters at communication speeds of 10 to 100 centimeters per second. The rate of earthquake occurrence, particularly at depth, increased significantly during the study period, but the fraction of earthquakes that were cluster members decreased.

  20. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Strong Ground Motion

    Science.gov (United States)

    Borcherdt, Roger D.

    1994-01-01

    Strong ground motion generated by the Loma Prieta, Calif., earthquake (MS~7.1) of October 17, 1989, resulted in at least 63 deaths, more than 3,757 injuries, and damage estimated to exceed $5.9 billion. Strong ground motion severely damaged critical lifelines (freeway overpasses, bridges, and pipelines), caused severe damage to poorly constructed buildings, and induced a significant number of ground failures associated with liquefaction and landsliding. It also caused a significant proportion of the damage and loss of life at distances as far as 100 km from the epicenter. Consequently, understanding the characteristics of the strong ground motion associated with the earthquake is fundamental to understanding the earthquake's devastating impact on society. The papers assembled in this chapter address this problem. Damage to vulnerable structures from the earthquake varied substantially with the distance from the causative fault and the type of underlying geologic deposits. Most of the damage and loss of life occurred in areas underlain by 'soft soil'. Quantifying these effects is important for understanding the tragic concentrations of damage in such areas as Santa Cruz and the Marina and Embarcadero Districts of San Francisco, and the failures of the San Francisco-Oakland Bay Bridge and the Interstate Highway 880 overpass. Most importantly, understanding these effects is a necessary prerequisite for improving mitigation measures for larger earthquakes likely to occur much closer to densely urbanized areas in the San Francisco Bay region. The earthquake generated an especially important data set for understanding variations in the severity of strong ground motion. Instrumental strong-motion recordings were obtained at 131 sites located from about 6 to 175 km from the rupture zone. This set of recordings, the largest yet collected for an event of this size, was obtained from sites on various geologic deposits, including a unique set on 'soft soil' deposits

  1. Measuring the effectiveness of earthquake forecasting in insurance strategies

    Science.gov (United States)

    Mignan, A.; Muir-Wood, R.

    2009-04-01

    Given the difficulty of judging whether the skill of a particular earthquake-forecasting methodology is offset by the inevitable false alarms and missed predictions, it is important to find a means of weighing the successes and failures in a common currency. Rather than judge the relative costs and benefits of predictions subjectively, we develop a simple method to determine whether the use of earthquake forecasts can increase the profitability of active financial risk management strategies employed in standard insurance procedures. Three types of risk management transactions are employed: (1) insurance underwriting, (2) reinsurance purchasing, and (3) investment in CAT bonds. In each case, premiums are collected based on modelled technical risk costs, and losses are modelled for the portfolio in force at the time of the earthquake. A set of predetermined actions follows from the announcement of any change in earthquake hazard, so that, for each earthquake forecaster, the financial performance of an active risk management strategy can be compared with the equivalent passive strategy in which no notice is taken of earthquake forecasts. Overall performance can be tracked through time to determine which strategy gives the best long-term financial performance. This will be determined by whether the skill in forecasting the location and timing of a significant earthquake (where loss is avoided) is outweighed by false predictions (when no premium is collected). This methodology is to be tested in California, where catastrophe modeling is reasonably mature and where a number of researchers issue earthquake forecasts.

  2. Locating Very-Low-Frequency Earthquakes in the San Andreas Fault.

    Science.gov (United States)

    Peña-Castro, A. F.; Harrington, R. M.; Cochran, E. S.

    2016-12-01

    The portion of a tectonic fault where rheological properties transition from brittle to ductile hosts a variety of seismic signals suggesting a range of slip velocities. In subduction zones, the two dominantly observed signals are very-low-frequency earthquakes (VLFEs) and low-frequency earthquakes (LFEs) or tectonic tremor. Tremor and LFEs are also commonly observed on transform faults; VLFEs, however, have been reported dominantly in subduction zone environments. Here we show some of the first known observations of VLFEs occurring on a plate-boundary transform fault, the Cholame-Parkfield segment of the San Andreas Fault (SAF) in California. We detect VLFEs using both permanent and temporary stations in 2010-2011 within approximately 70 km of Cholame, California. We search continuous waveforms filtered from 0.02 to 0.05 Hz and remove time windows containing teleseismic events and local earthquakes, as identified in the global Centroid Moment Tensor (CMT) and Northern California Seismic Network (NCSN) catalogs. We estimate VLFE locations by converting the signals into envelopes and cross-correlating them for phase picking, similar to procedures used for locating tectonic tremor. We first perform an epicentral location using a grid-search method and then estimate a hypocenter using Hypoinverse and a shear-wave velocity model when the epicenter is located close to the SAF trace. We account for the velocity contrast across the fault by using separate 1D velocity models for stations on each side. Estimated VLFE hypocentral depths are similar to tremor catalog depths (~15-30 km). Only a few VLFEs produced robust hypocentral locations, presumably because of the difficulty of picking accurate phase arrivals in such a low-frequency signal. However, for events for which no location could be obtained, the moveout of phase arrivals across the stations was similar in character, suggesting that the other observed VLFEs occurred in close proximity.
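The envelope cross-correlation step described above can be illustrated on synthetic data. This is a minimal sketch, not the study's processing code: the waveform, sampling rate, and smoothing choices are illustrative assumptions, and the analytic-signal envelope is built directly from the FFT.

```python
import numpy as np

def envelope(x):
    """Amplitude envelope via the analytic signal (FFT-based Hilbert transform)."""
    X = np.fft.fft(x)
    h = np.zeros(len(x))
    h[0] = 1.0
    if len(x) % 2 == 0:
        h[len(x) // 2] = 1.0
        h[1:len(x) // 2] = 2.0
    else:
        h[1:(len(x) + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

def relative_delay(e1, e2, dt):
    """Lag (s) of envelope e2 relative to e1, from the cross-correlation peak."""
    cc = np.correlate(e2 - e2.mean(), e1 - e1.mean(), mode="full")
    lag = np.argmax(cc) - (len(e1) - 1)
    return lag * dt

# Synthetic check: the same low-frequency transient, shifted by 2 s.
dt = 0.1
t = np.arange(0, 60, dt)
pulse = np.exp(-((t - 20) ** 2) / 8) * np.sin(2 * np.pi * 0.3 * t)
shifted = np.exp(-((t - 22) ** 2) / 8) * np.sin(2 * np.pi * 0.3 * (t - 2))
d = relative_delay(envelope(pulse), envelope(shifted), dt)
```

In the study's workflow, such envelope lags between station pairs would serve as the "phase picks" fed to the grid search; here `d` simply recovers the known 2 s shift.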

  3. A Case Study of Key Stakeholders' Perceptions of the Learning Center's Effectiveness for English Learners at a District in Central California

    Science.gov (United States)

    Nava, Norma Leticia

    2016-01-01

    This qualitative study explored stakeholders' (administrators, teachers, and parents) perspectives of English learners in the learning center, a response to intervention model, at a school district in Central California. Research existed concerning the yearly academic growth of students in a learning center, but there was a lack of knowledge about…

  4. Spatial organization of foreshocks as a tool to forecast large earthquakes.

    Science.gov (United States)

    Lippiello, E; Marzocchi, W; de Arcangelis, L; Godano, C

    2012-01-01

    An increase in the number of smaller magnitude events, retrospectively named foreshocks, is often observed before large earthquakes. We show that the linear density probability of earthquakes occurring before and after small or intermediate mainshocks displays a symmetrical behavior, indicating that the size of the area fractured during the mainshock is encoded in the foreshock spatial organization. This observation can be used to discriminate spatial clustering due to foreshocks from that induced by aftershocks, and is implemented in an alarm-based model to forecast m > 6 earthquakes. A retrospective study of the last 19 years of the Southern California catalog shows that the daily occurrence probability presents isolated peaks closely located in time and space to the epicenters of five of the six m > 6 earthquakes. We find daily probabilities as high as 25% (in cells of size 0.04° × 0.04°), with significant probability gains with respect to standard models.
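The gridding step behind such an alarm-based model can be sketched as binning one day's epicenters into 0.04° cells and flagging cells whose count exceeds a threshold. The grid origin, extent, and threshold below are hypothetical placeholders, not the paper's calibrated values.

```python
import numpy as np

def daily_alarm_cells(lons, lats, rate_threshold, cell=0.04,
                      lon0=-122.0, lat0=32.0, nx=250, ny=250):
    """Count one day's epicenters in cell x cell degree bins and flag bins
    whose count exceeds rate_threshold. Grid geometry is illustrative."""
    ix = ((np.asarray(lons) - lon0) / cell).astype(int)
    iy = ((np.asarray(lats) - lat0) / cell).astype(int)
    counts = np.zeros((nx, ny))
    for i, j in zip(ix, iy):
        if 0 <= i < nx and 0 <= j < ny:
            counts[i, j] += 1
    return counts > rate_threshold

# A tight cluster of five events in one cell trips the alarm there.
flags = daily_alarm_cells([-121.99] * 5, [32.01] * 5, rate_threshold=3)
```

A real implementation would convert counts to occurrence probabilities and compare them against a background model; this sketch only shows the spatial bookkeeping.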

  5. Remotely Triggered Earthquakes Recorded by EarthScope's Transportable Array and Regional Seismic Networks: A Case Study Of Four Large Earthquakes

    Science.gov (United States)

    Velasco, A. A.; Cerda, I.; Linville, L.; Kilb, D. L.; Pankow, K. L.

    2013-05-01

    Changes in the stress field required to trigger earthquakes have been classified in two basic ways: static and dynamic triggering. Static triggering occurs when an earthquake that releases accumulated strain along a fault transfers stress to a nearby fault. Dynamic triggering occurs when an earthquake is induced by the passing seismic waves of a large mainshock located at least two or more fault lengths away. We investigate details of dynamic triggering using data collected from EarthScope's USArray and regional seismic networks located in the United States. Triggered events are identified using an optimized automated detector based on the ratio of short-term to long-term average (Antelope software). Following the automated processing, the flagged waveforms are individually analyzed, in both the time and frequency domains, to determine whether the increased detection rates correspond to local earthquakes (i.e., potentially remotely triggered aftershocks). Here, we show results using this automated scheme applied to data from four large, but characteristically different, earthquakes: Chile (Mw 8.8, 2010), Tohoku-Oki (Mw 9.0, 2011), Baja California (Mw 7.2, 2010), and Wells, Nevada (Mw 6.0, 2008). For each of our four mainshocks, the number of detections within the 10-hour time windows spans a large range (1 to over 200), and statistically >20% of the waveforms show evidence of anomalous signals following the mainshock. The results will help provide a better understanding of the physical mechanisms involved in dynamic earthquake triggering and will help identify zones in the continental U.S. that may be more susceptible to it.
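The short-term/long-term average detector the abstract mentions can be sketched as the classic STA/LTA ratio. This is a generic illustration, not the Antelope implementation; the window lengths and trigger level are typical textbook choices rather than the study's settings.

```python
import numpy as np

def sta_lta(x, dt, sta_win=1.0, lta_win=30.0):
    """Short-term / long-term average ratio on |x|, LTA taken from the
    window immediately preceding the STA window."""
    ns, nl = int(sta_win / dt), int(lta_win / dt)
    a = np.abs(x)
    csum = np.concatenate(([0.0], np.cumsum(a)))
    ratio = np.zeros(len(x))
    for i in range(nl + ns, len(x)):
        sta = (csum[i] - csum[i - ns]) / ns
        lta = (csum[i - ns] - csum[i - ns - nl]) / nl
        if lta > 0:
            ratio[i] = sta / lta
    return ratio

# Noise with an embedded transient should push the ratio past a trigger level.
rng = np.random.default_rng(0)
dt = 0.01
x = rng.normal(0, 1, 6000)
x[4000:4200] += 20 * np.sin(np.linspace(0, 20 * np.pi, 200))
triggered = sta_lta(x, dt).max() > 5.0
```

In practice the trigger level and window lengths are tuned per network, and flagged windows are then reviewed manually, as the abstract describes.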

  6. The Development of Several Electromagnetic Monitoring Strategies and Algorithms for Validating Pre-Earthquake Electromagnetic Signals

    Science.gov (United States)

    Bleier, T. E.; Dunson, J. C.; Roth, S.; Mueller, S.; Lindholm, C.; Heraud, J. A.

    2012-12-01

    QuakeFinder, a private research group in California, reports on the development of a 100+ station network of three-axis induction magnetometers and air-conductivity sensors to collect and characterize pre-seismic electromagnetic (EM) signals. These ground EM signals are compared and correlated with daily infrared (IR) data collected by the GOES weather satellite's IR instrument, both for actual earthquakes and for boulder-stressing experiments. This presentation describes the efforts QuakeFinder has undertaken to automatically detect these pulse patterns using their historical data as a reference, and to develop other discriminative algorithms that can be used with the air-conductivity sensors and the GOES IR instruments. The overall big-picture results of the QuakeFinder experiment are presented. In 2007, QuakeFinder discovered the occurrence of strong unipolar pulses in their magnetometer coil data that increased in tempo dramatically prior to the M5.4 earthquake at Alum Rock, California. Suggestions that these pulses might have been lightning or power-line arcing did not fit the data actually recorded, as was reported in Bleier [2009]. A second earthquake then occurred near the same site on January 7, 2010, as was reported in Dunson [2011], and the pattern of pulse-count increases before the earthquake was similar to that of the 2007 event. There were fewer pulses, and their amplitudes were smaller, both consistent with the fact that the earthquake was smaller (M4.0 vs. M5.4) and farther away (7 km vs. 2 km). Similar effects were observed at the QuakeFinder Tacna, Peru, site before the May 5, 2010, M6.2 earthquake and a cluster of several M4-5 earthquakes.

  7. Chapter D. The Loma Prieta, California, Earthquake of October 17, 1989 - Aftershocks and Postseismic Effects

    Science.gov (United States)

    Reasenberg, Paul A.

    1997-01-01

    While the damaging effects of the earthquake represent a significant social setback and economic loss, the geophysical effects have produced a wealth of data that have provided important insights into the structure and mechanics of the San Andreas Fault system. Generally, the period after a large earthquake is vitally important to monitor. During this part of the seismic cycle, the primary fault and the surrounding faults, rock bodies, and crustal fluids rapidly readjust in response to the earthquake's sudden movement. Geophysical measurements made at this time can provide unique information about fundamental properties of the fault zone, including its state of stress and the geometry and frictional/rheological properties of the faults within it. Because postseismic readjustments are rapid compared with corresponding changes occurring in the preseismic period, the amount and rate of information that is available during the postseismic period is relatively high. From a geophysical viewpoint, the occurrence of the Loma Prieta earthquake in a section of the San Andreas fault zone that is surrounded by multiple and extensive geophysical monitoring networks has produced nothing less than a scientific bonanza. The reports assembled in this chapter collectively examine available geophysical observations made before and after the earthquake and model the earthquake's principal postseismic effects. The chapter covers four broad categories of postseismic effect: (1) aftershocks; (2) postseismic fault movements; (3) postseismic surface deformation; and (4) changes in electrical conductivity and crustal fluids.

  8. Aseismic blocks and destructive earthquakes in the Aegean

    Science.gov (United States)

    Stiros, Stathis

    2017-04-01

    Aseismic areas are identified not only in vast, geologically stable regions, but also within regions of active, intense, distributed deformation such as the Aegean. In the latter, "aseismic blocks" about 200 km across were recognized in the 1990s on the basis of the absence of instrumentally derived earthquake foci, in contrast to surrounding areas. This pattern was supported by the available historical seismicity data, as well as by geologic evidence. Interestingly, GPS evidence indicates that such blocks are among the areas characterized by small deformation rates relative to surrounding areas of higher deformation. Still, the largest and most destructive earthquake of the 1990s, the 1995 M6.6 event in northern Greece, occurred at the center of one of these "aseismic" zones, which was found unprotected against seismic hazard. This case was a repeat of the tsunami-associated 1956 Amorgos Island M7.4 earthquake, the largest 20th-century event in the Aegean back-arc region: the 1956 earthquake occurred at the center of a geologically distinct region (the Cyclades Massif in the central Aegean), until then assumed aseismic. Interestingly, after 1956 the overall idea of aseismic regions remained valid, though a "promontory" of earthquake-prone areas intruding into the aseismic central Aegean was assumed. Exploitation of archaeological excavation evidence and careful, combined analysis of historical and archaeological data and other palaeoseismic, mostly coastal, data indicate that destructive and major earthquakes have left their traces in previously assumed aseismic blocks. In the latter, earthquakes typically recur at relatively long intervals (>200-300 years), i.e., at much lower rates than in adjacent active areas. Interestingly, areas assumed aseismic in antiquity are among the most active in the last centuries, while areas hit by major earthquakes in the past are usually classified as areas of low seismic risk in official maps. Some reasons

  9. e-Science on Earthquake Disaster Mitigation by EUAsiaGrid

    Science.gov (United States)

    Yen, Eric; Lin, Simon; Chen, Hsin-Yen; Chao, Li; Huang, Bor-Shoh; Liang, Wen-Tzong

    2010-05-01

    Although earthquakes are not predictable at this moment, with the aid of accurate seismic-wave propagation analysis we can simulate the potential hazards at all distances from possible fault sources by understanding the source rupture process during large earthquakes. With the integration of a strong ground-motion sensor network, an earthquake data center, and seismic-wave propagation analysis over the gLite e-Science infrastructure, we can gain much better knowledge of the impact and vulnerability associated with potential earthquake hazards. This application also demonstrates the e-Science way to investigate unknown Earth structure. Regional integration of earthquake sensor networks can aid fast event reporting and accurate event data collection. Federation of earthquake data centers entails consolidation and sharing of seismology and geology knowledge. Capability building in seismic-wave propagation analysis implies the predictability of potential hazard impacts. With the gLite infrastructure and the EUAsiaGrid collaboration framework, earth scientists from Taiwan, Vietnam, the Philippines, and Thailand are working together to alleviate potential seismic threats by making use of Grid technologies and to support seismology research through e-Science. A cross-continental e-infrastructure, based on EGEE and EUAsiaGrid, has been established for seismic-wave forward simulation and risk estimation. Both the computing challenge of seismic-wave analysis among five European and Asian partners, and the data challenge of data-center federation, have been exercised and verified. A Seismogram-on-Demand service has also been developed for the automatic generation of a seismogram at any sensor point for a specific epicenter. To ease access to all the services based on users' workflows and retain maximal flexibility, a Seismology Science Gateway integrating data, computation, workflows, services, and user communities will be implemented based on typical use cases. In the future, extension of the

  10. Interaction of the San Jacinto and San Andreas fault zones, southern California: triggered earthquake migration and coupled recurrence intervals.

    Science.gov (United States)

    Sanders, C O

    1993-05-14

    Two lines of evidence suggest that large earthquakes that occur on either the San Jacinto fault zone (SJFZ) or the San Andreas fault zone (SAFZ) may be triggered by large earthquakes that occur on the other. First, the great 1857 Fort Tejon earthquake in the SAFZ seems to have triggered a progressive sequence of earthquakes in the SJFZ. These earthquakes occurred at times and locations that are consistent with triggering by a strain pulse that propagated southeastward at a rate of 1.7 kilometers per year along the SJFZ after the 1857 earthquake. Second, the similarity in average recurrence intervals in the SJFZ (about 150 years) and in the Mojave segment of the SAFZ (132 years) suggests that large earthquakes in the northern SJFZ may stimulate the relatively frequent major earthquakes on the Mojave segment. Analysis of historic earthquake occurrence in the SJFZ suggests little likelihood of extended quiescence between earthquake sequences.

  11. The 1987 Whittier Narrows, California, earthquake: A Metropolitan shock

    OpenAIRE

    Hauksson, Egill; Stein, Ross S.

    1989-01-01

    Just 3 hours after the Whittier Narrows earthquake struck, it became clear that a heretofore unseen geological structure was seismically active beneath metropolitan Los Angeles. Contrary to initial expectations of strike-slip or oblique-slip motion on the Whittier fault, whose north end abuts the aftershock zone, the focal mechanism of the mainshock showed pure thrust faulting on a deep gently inclined surface [Hauksson et al., 1988]. This collection of nine research reports spans the spectru...

  12. Coherency analysis of accelerograms recorded by the UPSAR array during the 2004 Parkfield earthquake

    DEFF Research Database (Denmark)

    Konakli, Katerina; Kiureghian, Armen Der; Dreger, Douglas

    2014-01-01

    Spatial variability of near-fault strong motions recorded by the US Geological Survey Parkfield Seismograph Array (UPSAR) during the 2004 Parkfield (California) earthquake is investigated. Behavior of the lagged coherency for two horizontal and the vertical components is analyzed by separately...

  13. Earthquake correlations and networks: A comparative study

    Science.gov (United States)

    Krishna Mohan, T. R.; Revathi, P. G.

    2011-04-01

    We quantify the correlation between earthquakes and use the same to extract causally connected earthquake pairs. Our correlation metric is a variation on the one introduced by Baiesi and Paczuski [M. Baiesi and M. Paczuski, Phys. Rev. E 69, 066106 (2004)]. A network of earthquakes is then constructed from the time-ordered catalog and with links between the more correlated ones. A list of recurrences to each of the earthquakes is identified employing correlation thresholds to demarcate the most meaningful ones in each cluster. Data pertaining to three different seismic regions (viz., California, Japan, and the Himalayas) are comparatively analyzed using such a network model. The distribution of recurrence lengths and recurrence times are two of the key features analyzed to draw conclusions about the universal aspects of such a network model. We find that the unimodal feature of recurrence length distribution, which helps to associate typical rupture lengths with different magnitude earthquakes, is robust across the different seismic regions. The out-degree of the networks shows a hub structure rooted on the large magnitude earthquakes. In-degree distribution is seen to be dependent on the density of events in the neighborhood. Power laws, with two regimes having different exponents, are obtained with recurrence time distribution. The first regime confirms the Omori law for aftershocks while the second regime, with a faster falloff for the larger recurrence times, establishes that pure spatial recurrences also follow a power-law distribution. The crossover to the second power-law regime can be taken to be signaling the end of the aftershock regime in an objective fashion.
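The Baiesi-Paczuski-style metric underlying this network construction can be sketched in a few lines. In its standard form, the expected number of events between an earlier event i and a later event j is n_ij ∝ t_ij · r_ij^df · 10^(−b·m_i), and a small n_ij (large correlation c_ij = 1/n_ij) links the pair. The fractal dimension `df`, b-value, and toy catalog below are illustrative assumptions, not the paper's fitted parameters.

```python
import numpy as np

def bp_metric(t, x, y, m, df=1.6, b=1.0):
    """Baiesi-Paczuski-style metric for all ordered event pairs.

    t: occurrence times (days); x, y: epicenter coordinates (km); m: magnitudes.
    Returns n[i, j] proportional to the expected number of events in the
    space-time window between i and a later event j; small n means the pair
    is strongly correlated.
    """
    t = np.asarray(t, float)
    n = np.full((len(t), len(t)), np.inf)
    for j in range(len(t)):
        for i in range(j):
            dt = t[j] - t[i]                        # elapsed time
            r = np.hypot(x[j] - x[i], y[j] - y[i])  # epicentral distance
            if dt > 0 and r > 0:
                n[i, j] = dt * r**df * 10.0**(-b * m[i])
    return n

# Toy catalog: a mainshock, two nearby aftershocks, and a distant event.
t = [0.0, 1.0, 1.1, 5.0]
x = [0.0, 0.5, 0.4, 50.0]
y = [0.0, 0.0, 0.1, 10.0]
m = [5.0, 3.0, 3.2, 4.0]
n = bp_metric(t, x, y, m)
parents = np.argmin(n[:, 1:], axis=0)  # most correlated predecessor of events 1..3
```

Linking each event to predecessors whose correlation exceeds a threshold, rather than only the single best parent, yields the network of recurrences analyzed in the paper.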

  14. Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information

    Science.gov (United States)

    Thompson, K. J.; Krantz, D. H.

    2010-12-01

    The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D & Tversky, A (1979). 
Prospect

  15. [Comment on “Should Memphis build for California's earthquakes?”] from S.E. Hough

    Science.gov (United States)

    Hough, Susan E.

    The recent article by Seth Stein, Joseph Tomasello, and Andrew Newman raised thought-provoking questions about one of the most vexing open issues in hazard assessment in the United States: the hazard posed by ostensibly infrequent, large, mid-continental earthquakes. Many of the technical issues raised by this article are addressed by A. D. Frankel in the accompanying comment. I concur with this, and will only address and/or elaborate on a few additional issues here: (1) Detailed paleoseismic investigations have shown that the New Madrid region experienced sequences of large earthquakes around 900 and 1450 A.D., in addition to the historic events in 1811-1812. With a repeat time on the order of 400-500 years, these cannot be considered infrequent events. Paleoseismic investigations also reveal evidence that the prehistoric “events” were also sequences of two to three large earthquakes with a similar overall distribution of liquefaction in the greater New Madrid region as produced by the 1811-1812 sequence [Tuttle et al., 2002]. And if, as evidence suggests, the zone produces characteristic earthquakes, one will not see a commensurate rate of moderate events, as would be the case if seismicity followed the Gutenberg-Richter distribution.

  16. Preliminary analysis of strong-motion recordings from the 28 September 2004 Parkfield, California earthquake

    Science.gov (United States)

    Shakal, A.; Graizer, V.; Huang, M.; Borcherdt, R.; Haddadi, H.; Lin, K.-W.; Stephens, C.; Roffers, P.

    2005-01-01

    The Parkfield 2004 earthquake yielded the most extensive set of strong-motion data in the near-source region of a magnitude 6 earthquake yet obtained. The recordings of acceleration and volumetric strain provide an unprecedented document of the near-source seismic radiation for a moderate earthquake. The spatial density of the measurements along the fault zone and in the linear arrays perpendicular to the fault is expected to provide an exceptional opportunity to develop improved models of the rupture process. The closely spaced measurements should help infer the temporal and spatial distribution of the rupture process at much higher resolution than previously possible. Preliminary analyses of the peak acceleration data presented herein show that the motions vary significantly along the rupture zone, from 0.13 g to more than 2.5 g, with a map of the values showing that the larger values are concentrated in three areas. Particle motions at the near-fault stations are consistent with bilateral rupture. Fault-normal pulses similar to those observed in recent strike-slip earthquakes are apparent at several of the stations. The attenuation of peak ground acceleration with distance is more rapid than that indicated by some standard relationships but adequately fits others. Evidence for directivity in the peak acceleration data is not strong. Several stations very near, or over, the rupturing fault recorded relatively low accelerations. These recordings may provide a quantitative basis to understand observations of low near-fault shaking damage that has been reported in other large strike-slip earthquakes.

  17. Research on groundwater radon as a fluid phase precursor to earthquakes

    International Nuclear Information System (INIS)

    Teng, T.; Sun, L.

    1986-01-01

    Groundwater radon monitoring work carried out in southern California by the University of Southern California since 1974 is summarized here. This effort began with a sampling network over a locked segment of the San Andreas fault from Tejon to Cajon and was later expanded to cover part of the southern Transverse Mountain ranges. Groundwater samples were brought back weekly to the laboratory for high-precision scintillation counting. The need for more frequent sampling with less labor prompted the development of an economical and field-worthy instrument known as the continuous radon monitor. About 10 have been installed in the network since early 1980. The groundwater radon content was found to show anomalous increases (mostly at a single station) before a number of moderate and nearby earthquakes. Our work is hampered by a lack of large earthquakes that may have a regional impact on radon anomalies and by the complexity of the underground hydrological regime. To circumvent this difficulty, we have chosen to monitor only deep artesian wells or hot spring wells

  18. Transient stresses at Parkfield, California, produced by the M 7.4 Landers earthquake of June 28, 1992: implications for the time-dependence of fault friction

    Directory of Open Access Journals (Sweden)

    J. B. Fletcher

    1994-06-01

    The M 7.4 Landers earthquake triggered widespread seismicity in the Western U.S. Because the transient dynamic stresses induced at regional distances by the Landers surface waves are much larger than the expected static stresses, the magnitude and the characteristics of the dynamic stresses may bear upon the earthquake triggering mechanism. The Landers earthquake was recorded on the UPSAR array, a group of 14 triaxial accelerometers located within a 1-square-km region 10 km southwest of the town of Parkfield, California, 412 km northwest of the Landers epicenter. We used a standard geodetic inversion procedure to determine the surface strain and stress tensors as functions of time from the observed dynamic displacements. Peak dynamic strains and stresses at the Earth's surface are about 7 microstrain and 0.035 MPa, respectively, and they have a flat amplitude spectrum between 2 s and 15 s period. These stresses agree well with stresses predicted from a simple rule of thumb based upon the ground velocity spectrum observed at a single station. Peak stresses ranged from about 0.035 MPa at the surface to about 0.12 MPa between 2 and 14 km depth, with the sharp increase of stress away from the surface resulting from the rapid increase of rigidity with depth and from the influence of surface wave mode shapes. Comparison of Landers-induced static and dynamic stresses at the hypocenter of the Big Bear aftershock provides a clear example that faults are stronger on time scales of tens of seconds than on time scales of hours or longer.
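The "standard geodetic inversion" step can be sketched as a least-squares fit of a uniform displacement-gradient field to the array displacements at one instant in time. This is a minimal illustration of that idea on synthetic data, not the UPSAR processing code; station geometry and the imposed strain field are invented for the example.

```python
import numpy as np

def horizontal_strain(xy, ux, uy):
    """Least-squares uniform displacement-gradient fit to array displacements.

    xy: (n, 2) station coordinates (m); ux, uy: horizontal displacements (m).
    Returns the symmetric horizontal strain components (exx, exy, eyy).
    """
    n = len(ux)
    A = np.hstack([np.ones((n, 1)), xy])               # design matrix [1, x, y]
    cx, _, _, _ = np.linalg.lstsq(A, ux, rcond=None)   # ux ≈ a + (du/dx)x + (du/dy)y
    cy, _, _, _ = np.linalg.lstsq(A, uy, rcond=None)   # uy ≈ b + (dv/dx)x + (dv/dy)y
    exx, dudy = cx[1], cx[2]
    dvdx, eyy = cy[1], cy[2]
    exy = 0.5 * (dudy + dvdx)                          # symmetric shear strain
    return exx, exy, eyy

# Synthetic check: a pure shear field ux = 1e-6 * y, uy = 1e-6 * x.
xy = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 30]], float)
ux = 1e-6 * xy[:, 1]
uy = 1e-6 * xy[:, 0]
exx, exy, eyy = horizontal_strain(xy, ux, uy)
```

Repeating this fit at every time sample of the recorded displacements yields strain as a function of time, which can then be converted to stress with an assumed elastic modulus.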

  19. ARMA models for earthquake ground motions. Seismic Safety Margins Research Program

    International Nuclear Information System (INIS)

    Chang, Mark K.; Kwiatkowski, Jan W.; Nau, Robert F.; Oliver, Robert M.; Pister, Karl S.

    1981-02-01

    This report contains an analysis of four major California earthquake records using a class of discrete linear time-domain processes commonly referred to as ARMA (Autoregressive/Moving-Average) models. It has been possible to analyze these different earthquakes, identify the order of the appropriate ARMA model(s), estimate parameters and test the residuals generated by these models. It has also been possible to show the connections, similarities and differences between the traditional continuous models (with parameter estimates based on spectral analyses) and the discrete models with parameters estimated by various maximum likelihood techniques applied to digitized acceleration data in the time domain. The methodology proposed in this report is suitable for simulating earthquake ground motions in the time domain and appears to be easily adapted to serve as inputs for nonlinear discrete time models of structural motions. (author)
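The time-domain fitting described here can be illustrated with a pure autoregressive fit, the AR special case of an ARMA model. The sketch below uses ordinary least squares on a synthetic series with known coefficients; it is a simplified stand-in for the report's maximum-likelihood ARMA estimation (a full ARMA fit would use, e.g., statsmodels' ARIMA).

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares AR(p) fit: x[t] ≈ sum_k phi[k] * x[t-1-k]."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    phi, _, _, _ = np.linalg.lstsq(X, x[p:], rcond=None)
    return phi

# Synthetic AR(2) series with a damped oscillatory response, loosely
# mimicking bandlimited ground acceleration; coefficients are illustrative.
rng = np.random.default_rng(1)
true_phi = np.array([1.2, -0.6])
x = np.zeros(5000)
for t in range(2, 5000):
    x[t] = true_phi[0] * x[t - 1] + true_phi[1] * x[t - 2] + rng.normal()
phi = fit_ar(x, 2)
```

Once fitted, the same recursion driven by fresh white noise simulates new ground-motion realizations in the time domain, which is the use the report proposes for structural-response inputs.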

  20. Living With Earthquakes in the Pacific Northwest: A Survivor's Guide, 2nd edition

    Science.gov (United States)

    Hutton, Kate

    In 1995, Robert S. Yeats found himself teaching a core curriculum class at Oregon State University for undergraduate nonscience majors, linking recent discoveries on the earthquake hazard in the Pacific Northwest to societal response to those hazards. The notes for that course evolved into the first edition of this book, published in 1998. In 2001, he published a similar book, Living With Earthquakes in California: A Survivor's Guide (Oregon State University Press). Recent earthquakes such as the 2001 Nisqually Mw6.8 event, new discoveries and techniques in paleoseismology, and changes in public policy decisions quickly outdated the first Pacific Northwest edition. This is especially true for the Cascadia Subduction Zone and crustal faults, where our knowledge expands with every scientific meeting.

  1. Investigating Earthquake-induced Landslides: A Historical Review

    Science.gov (United States)

    Keefer, D. K.; U.S. Geological Survey, Menlo Park, CA, USA

    Although earthquake-induced landslides have been described in documents for more than 3700 years, accounts from earthquakes before the late eighteenth century are incomplete concerning landslide numbers and vague concerning landslide characteristics. They are thus typically misleading concerning the true abundance of landslides and range of landslide characteristics. Beginning with studies of the 1783 Calabria, Italy earthquake, more complete and precise data concerning the occurrence of landslides in earthquakes have become available. The historical development of knowledge concerning landslides triggered by earthquakes can be divided into several periods. The first period, from 1783 until the first application of aerial photography, was characterized by ground-based studies of earthquake effects, typically carried out by formal scientific commissions. These formal studies typically identified a large, but not necessarily comprehensive, sampling of localities where landslides had occurred. In some, but not all cases, landslide characteristics were also described in enough detail that the general range of landslide characteristics could begin to be determined. More recently, some nineteenth to mid-twentieth century earthquakes have been studied using retrospective analyses, in which the landslide occurrences associated with the event are inferred years to decades later, using contemporary accounts, mapping from aerial photographs, statistical studies, and (or) geotechnical analyses. The first use of aerial photographs to map earthquake effects immediately after the event probably occurred in 1948. Since that time, the use of aerial photography has greatly facilitated the compilation of post-earthquake landslide inventories, although because of the limitations of aerial photography, ground-based field studies continue to be crucial in preparing accurate and comprehensive landslide maps. Beginning with a small California earthquake in 1957

  2. Earthquake strikes at China's energy centers. [Tangshan, July 1976]

    Energy Technology Data Exchange (ETDEWEB)

    Smil, V.

    1976-12-01

    The earthquake that struck Hopei province in China on July 28, 1976 must have caused damage that will have wide repercussions for a long time. It came at the beginning of a new Five-Year Plan and struck one of the country's key industrial centers, Tangshan, a city of one million people on the western edge of the 2275 km/sup 2/ Kailvan coalfield. In reviewing statistics on mining operations in that area, it is known that, after the stagnation of the early 1960s, the output had been growing by an average of more than one million tons each year since 1966. In 1971 a decision was made to double the designed capacity in five years, and a variety of technical innovations and organizational improvements has been undertaken in all Kailvan mines. Damage was reported heavy in Tientsin, China's third largest city, a major power generation center and the site of a new petrochemical complex. Takang, one of China's giant oil fields, producing currently about five percent of the country's crude oil, is located in the Tientsin municipality on the shores of Po Hai. Chinwangtao, some 120 km from the epicenter, is an important oil terminal for the shipments of Taching crude oil and the starting point of the final section of the Taching-Peking pipeline, which supplies the capital's huge Tungfanghung petrochemical complex. Damage is not known to this pipeline or to the extensive high-voltage grid in the area. (MCW)

  3. Sedimentary Signatures of Submarine Earthquakes: Deciphering the Extent of Sediment Remobilization from the 2011 Tohoku Earthquake and Tsunami and 2010 Haiti Earthquake

    Science.gov (United States)

    McHugh, C. M.; Seeber, L.; Moernaut, J.; Strasser, M.; Kanamatsu, T.; Ikehara, K.; Bopp, R.; Mustaque, S.; Usami, K.; Schwestermann, T.; Kioka, A.; Moore, L. M.

    2017-12-01

    The 2004 Sumatra-Andaman Mw9.3 and the 2011 Tohoku (Japan) Mw9.0 earthquakes and tsunamis were huge geological events with major societal consequences. Both were along subduction boundaries and ruptured portions of these boundaries that had been deemed incapable of such events. Submarine strike-slip earthquakes, such as the 2010 Mw7.0 in Haiti, are smaller but may be closer to population centers and can be similarly catastrophic. Both classes of earthquakes remobilize sediment and leave distinct signatures in the geologic record by a wide range of processes that depends on both environment and earthquake characteristics. Understanding them has the potential of greatly expanding the record of past earthquakes, which is critical for geohazard analysis. Recent events offer precious ground truth about the earthquakes and short-lived radioisotopes offer invaluable tools to identify sediments they remobilized. In the 2011 Mw9 Japan earthquake they document the spatial extent of remobilized sediment from water depths of 626 m in the forearc slope to trench depths of 8000 m. Subbottom profiles, multibeam bathymetry and 40 piston cores collected by the R/V Natsushima and R/V Sonne expeditions to the Japan Trench document multiple turbidites and high-density flows. Core tops enriched in excess 210Pb, 137Cs, and 134Cs reveal sediment deposited by the 2011 Tohoku earthquake and tsunami. The thickest deposits (2 m) were documented on a mid-slope terrace and trench (4000-8000 m). Sediment was deposited on some terraces (600-3000 m), but shed from the steep forearc slope (3000-4000 m). The 2010 Haiti mainshock ruptured along the southern flank of Canal du Sud and triggered multiple nearshore sediment failures, generated turbidity currents and stirred fine sediment into suspension throughout this basin. A tsunami was modeled to stem from both sediment failures and tectonics. 
Remobilized sediment was tracked with short-lived radioisotopes from the nearshore, slope, in fault basins including the

  4. Short-period strain (0.1-10^5 s): Near-source strain field for an earthquake (ML 3.2) near San Juan Bautista, California

    Science.gov (United States)

    Johnston, M. J. S.; Borcherdt, R. D.; Linde, A. T.

    1986-10-01

    Measurements of dilational earth strain in the frequency band 25 to 10^-5 Hz have been made on a deep borehole strainmeter installed near the San Andreas fault. These data are used to determine seismic radiation fields during nuclear explosions, teleseisms, local earthquakes, and ground noise during seismically quiet times. Strains of less than 10^-10 on these instruments can be clearly resolved at short periods (< 10 s) and are recorded with wide dynamic range digital recorders. This permits measurement of the static and dynamic strain variations in the near field of local earthquakes. Noise spectra for earth strain referenced to 1 (strain)^2/Hz show that strain resolution decreases at about 10 dB per decade of frequency from -150 dB at 10^-4 Hz to -223 dB at 10 Hz. Exact expressions are derived to relate the volumetric strain and displacement field for a homogeneous P wave in a general viscoelastic solid as observed on colocated dilatometers and seismometers. A rare near-field recording of strain and seismic velocity was obtained on May 26, 1984, from an earthquake (ML 3.2) at a hypocentral distance of 3.2 km near the San Andreas fault at San Juan Bautista, California. While the data indicate no precursory strain release at the 5 × 10^-11 strain level, a coseismic strain release of 1.86 nanostrain was observed. This change in strain is consistent with that calculated from a simple dislocation model of the event. Ground displacement spectra, determined from the downhole strain data and instrument-corrected surface seismic data, suggest that source parameters estimated from surface recordings may be contaminated by amplification effects in near-surface low-velocity materials.
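    The paper derives exact viscoelastic expressions relating volumetric strain to the displacement field; in the lossless elastic limiting case for a homogeneous plane P wave, volumetric strain reduces to minus the particle velocity divided by the P-wave speed. A minimal sketch with an assumed P-wave speed:

```python
# Elastic plane-P-wave limiting case (the general viscoelastic
# expressions in the paper reduce to this when attenuation vanishes):
#   volumetric strain = -(particle velocity) / Vp
# The Vp value is an assumed crustal number for illustration.
def volumetric_strain(particle_velocity_m_s, vp_m_s=5500.0):
    return -particle_velocity_m_s / vp_m_s

# A particle velocity of ~0.55 micrometers/s maps to roughly 1e-10
# strain, the resolution quoted in the abstract.
print(volumetric_strain(5.5e-7))
```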

  5. Accounting for orphaned aftershocks in the earthquake background rate

    Science.gov (United States)

    Van Der Elst, Nicholas

    2017-01-01

    Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.
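    The dependence on the assumed sequence duration noted above can be sketched with the modified Omori law alone (this is not the paper's ETAS extension; the rate parameters below are invented). For decay exponent p > 1 the tail integral has a closed form, so the fraction of a sequence's aftershocks that land after a declustering window T, and hence masquerade as background, is easy to compute:

```python
# Modified Omori law: rate(t) = K / (c + t)^p, t in days.
# For p > 1, the integral from time a to infinity is
#   K / (p - 1) * (c + a)^(1 - p).
def omori_tail(a, K=100.0, c=0.01, p=1.1):
    return K / (p - 1) * (c + a) ** (1 - p)

def orphaned_fraction(T, t0=1e-4, K=100.0, c=0.01, p=1.1):
    """Fraction of aftershocks (counted from t0 days after the main
    shock) that occur after the window T -- apparent 'background'
    events. Note K cancels in the ratio."""
    return omori_tail(T, K, c, p) / omori_tail(t0, K, c, p)

# With these illustrative parameters, roughly a third of the sequence
# falls outside a one-year window.
print(round(orphaned_fraction(365.0), 2))
```

The heavy tail is the point: even decades-long windows leave a non-negligible orphaned fraction when p is close to 1.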

  6. Rupture directivity and slip distribution of the M 4.3 foreshock to the 1992 Joshua Tree earthquake, Southern California

    Science.gov (United States)

    Mori, J.

    1996-01-01

    Details of the M 4.3 foreshock to the Joshua Tree earthquake were studied using P waves recorded on the Southern California Seismic Network and the Anza network. Deconvolution, using an M 2.4 event as an empirical Green's function, corrected for complicated path and site effects in the seismograms and produced simple far-field displacement pulses that were inverted for a slip distribution. Both possible fault planes, north-south and east-west, for the focal mechanism were tested by a least-squares inversion procedure with a range of rupture velocities. The results showed that the foreshock ruptured the north-south plane, similar to the mainshock. The foreshock initiated a few hundred meters south of the mainshock and ruptured to the north, toward the mainshock hypocenter. The mainshock (M 6.1) initiated near the northern edge of the foreshock rupture 2 hr later. The foreshock had a high stress drop (320 to 800 bars) and broke a small portion of the fault adjacent to the mainshock but was not able to immediately initiate the mainshock rupture.

  7. Ground water level, Water storage, Soil moisture, Precipitation Variability Using Multi Satellite Data during 2003-2016 Associated with California Drought

    Science.gov (United States)

    Li, J. W.; Singh, R. P.

    2017-12-01

    The agricultural market of California is a multi-billion-dollar industry; however, in recent years the state has faced severe drought. It is important to have a deeper understanding of how agriculture is affected by the amount of rainfall as well as the ground conditions in California. We have considered 5 regions (each 2 degrees by 2 degrees) covering the whole of California. Multi-satellite (MODIS Terra, GRACE, GLDAS) data accessed through the NASA Giovanni portal were used to study the 2003-2016 variability of ground water level and storage, soil moisture, root zone moisture level, precipitation, and normalized difference vegetation index (NDVI) in these 5 regions. Our detailed analysis of these parameters shows a strong correlation between NDVI and some of these parameters. NDVI represents greenness, showing strong drought conditions during the period 2011-2016 due to poor rainfall and recharge of ground water in the mid and southern parts of California. The effect of ground water level and underground storage on the frequency of earthquakes in the five regions will also be discussed. The mid and southern parts of California show an increasing frequency of small earthquakes during drought periods.

  8. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  9. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    Science.gov (United States)

    Applegate, D.

    2010-12-01

    of earthquake scientists and engineers. In addition to the national maps, the USGS produces more detailed urban seismic hazard maps that communities have used to prioritize retrofits and design critical infrastructure that can withstand large earthquakes. At a regional scale, the USGS and its partners in California have developed a time-dependent earthquake rupture forecast that is being used by the insurance sector, which can serve to distribute risk and foster mitigation if the right incentives are in place. What the USGS and partners are doing at the urban, regional, and national scales, the Global Earthquake Model project is seeking to do for the world. A significant challenge for engaging the public to prepare for earthquakes is making low-probability, high-consequence events real enough to merit personal action. Scenarios help by starting with the hazard posed by a specific earthquake and then exploring the fragility of the built environment, cascading failures, and the real-life consequences for the public. To generate such a complete picture takes multiple disciplines working together. Earthquake scenarios are being used both for emergency management exercises and much broader public preparedness efforts like the Great California ShakeOut, which engaged nearly 7 million people.

  10. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  11. Low Velocity Zones along the San Jacinto Fault, Southern California, inferred from Local Earthquakes

    Science.gov (United States)

    Li, Z.; Yang, H.; Peng, Z.; Ben-Zion, Y.; Vernon, F.

    2013-12-01

    Natural fault zones have regions of brittle damage leading to a low-velocity zone (LVZ) in the immediate vicinity of the main fault interface. The LVZ may amplify ground motion, modify rupture propagation, and impact derivation of earthquake properties. Here we image low-velocity fault zone structures along the San Jacinto Fault (SJF), southern California, using waveforms of local earthquakes recorded at several dense arrays across the SJFZ. We use generalized ray theory to compute synthetic travel times for the direct waves and for the FZ-reflected waves bouncing off the FZ boundaries. This method can effectively reduce the trade-off between FZ width and velocity reduction relative to the host rock. Our preliminary results from travel time modeling show the clear signature of LVZs along the SJF, including the segment in the Anza seismic gap. At the southern part near the trifurcation area, the LVZ of the Clark Valley branch (array JF) has a width of ~200 m with ~55% reduction in Vp and Vs, consistent with what has been suggested by previous studies. In comparison, we find that the velocity reduction relative to the host rock across the Anza seismic gap (array RA) is ~50% for both Vp and Vs, nearly as prominent as that on the southern branches. The width of the LVZ is ~230 m. In addition, the LVZ across the Anza gap appears to be located on the northeast side of the RA array, implying a potential preferred propagation direction of past ruptures.

  12. New streams and springs after the 2014 Mw6.0 South Napa earthquake.

    Science.gov (United States)

    Wang, Chi-Yuen; Manga, Michael

    2015-07-09

    Many streams and springs, which were dry or nearly dry before the 2014 Mw6.0 South Napa earthquake, started to flow after the earthquake. A United States Geological Survey stream gauge also registered a coseismic increase in discharge. Public interest was heightened by a state of extreme drought in California. Since the new flows were not contaminated by pre-existing surface water, their composition allowed unambiguous identification of their origin. Following the earthquake we repeatedly surveyed the new flows, collecting data to test hypotheses about their origin. We show that the new flows originated from groundwater in nearby mountains released by the earthquake. The estimated total amount of new water is ∼10^6 m^3, about 1/40 of the annual water use in the Napa-Sonoma area. Our model also makes a testable prediction of a post-seismic decrease of seismic velocity in the shallow crust of the affected region.

  13. Roles of Radon-222 and other natural radionuclides in earthquake prediction

    International Nuclear Information System (INIS)

    Smith, A.R.; Wollenberg, H.A.; Mosier, D.F.

    1980-01-01

    The concentration of 222Rn in subsurface waters is one of the natural parameters being investigated to help develop the capability to predict destructive earthquakes. Since 1966, scientists in several nations have sought to link radon variations with ongoing seismic activity, primarily through the dilatancy model for earthquake occurrences. Within the range of these studies, alpha-, beta-, and gamma-radiation detection techniques have been used in both discrete-sampling and continuous-monitoring programs. These measurement techniques are reviewed in terms of instrumentation adapted to seismic-monitoring purposes. A recent Lawrence Berkeley Laboratory study conducted in central California incorporated discrete sampling of wells in the aftershock area of the 1975 Oroville earthquake and continuous monitoring of water radon in a well on the San Andreas Fault. The results presented show short-term radon variations that may be associated with aftershocks and diurnal changes that may reflect earth tidal forces

  14. An application of earthquake prediction algorithm M8 in eastern ...

    Indian Academy of Sciences (India)

    2Institute of Earthquake Prediction Theory and Mathematical Geophysics, ... located about 70 km from a preceding M7.3 earthquake that occurred in ... local extremes of the seismic density distribution, and in the third approach, CI centers were distributed ...... Bird P 2003 An updated digital model of plate boundaries;.

  15. Potentially induced earthquakes during the early twentieth century in the Los Angeles Basin

    Science.gov (United States)

    Hough, Susan E.; Page, Morgan T.

    2016-01-01

    Recent studies have presented evidence that early to mid‐twentieth‐century earthquakes in Oklahoma and Texas were likely induced by fossil fuel production and/or injection of wastewater (Hough and Page, 2015; Frohlich et al., 2016). Considering seismicity from 1935 onward, Hauksson et al. (2015) concluded that there is no evidence for significant induced activity in the greater Los Angeles region between 1935 and the present. To explore a possible association between earthquakes prior to 1935 and oil and gas production, we first revisit the historical catalog and then review contemporary oil industry activities. Although early industry activities did not induce large numbers of earthquakes, we present evidence for an association between the initial oil boom in the greater Los Angeles area and earthquakes between 1915 and 1932, including the damaging 22 June 1920 Inglewood and 8 July 1929 Whittier earthquakes. We further consider whether the 1933 Mw 6.4 Long Beach earthquake might have been induced, and show some evidence that points to a causative relationship between the earthquake and activities in the Huntington Beach oil field. The hypothesis that the Long Beach earthquake was either induced or triggered by a foreshock cannot be ruled out. Our results suggest that significant earthquakes in southern California during the early twentieth century might have been associated with industry practices that are no longer employed (i.e., production without water reinjection), and do not necessarily imply a high likelihood of induced earthquakes at the present time.

  16. Rupture distribution of the 1977 western Argentina earthquake

    Science.gov (United States)

    Langer, C.J.; Hartzell, S.

    1996-01-01

    Teleseismic P and SH body waves are used in a finite-fault, waveform inversion for the rupture history of the 23 November 1977 western Argentina earthquake. This double event consists of a smaller foreshock (M0 = 5.3 × 10^26 dyn-cm) followed about 20 s later by a larger main shock (M0 = 1.5 × 10^27 dyn-cm). Our analysis indicates that these two events occurred on different fault segments: with the foreshock having a strike, dip, and average rake of 345°, 45°E, and 50°, and the main shock 10°, 45°E, and 80°, respectively. The foreshock initiated at a depth of 17 km and propagated updip and to the north. The main shock initiated at the southern end of the foreshock zone at a depth of 25 to 30 km, and propagated updip and unilaterally to the south. The north-south separation of the centroids of the moment release for the foreshock and main shock is about 60 km. The apparent triggering of the main shock by the foreshock is similar to other earthquakes that have involved the failure of multiple fault segments, such as the 1992 Landers, California, earthquake. Such occurrences argue against the use of individual, mapped, surface fault or fault-segment lengths in the determination of the size and frequency of future earthquakes.

  17. Comparison of injury epidemiology between the Wenchuan and Lushan earthquakes in Sichuan, China.

    Science.gov (United States)

    Hu, Yang; Zheng, Xi; Yuan, Yong; Pu, Qiang; Liu, Lunxu; Zhao, Yongfan

    2014-12-01

    We aimed to compare injury characteristics and the timing of admissions and surgeries in the Wenchuan earthquake in 2008 and the Lushan earthquake in 2013. We retrospectively compared the admission and operating times and injury profiles of patients admitted to our medical center during both earthquakes. We also explored the relationship between seismic intensity and injury type. The time from earthquake onset to the peak in patient admissions and surgeries differed between the 2 earthquakes. In the Wenchuan earthquake, injuries due to being struck by objects or being buried were more frequent than other types of injuries, and more patients suffered injuries of the extremities than thoracic injuries or brain trauma. In the Lushan earthquake, falls were the most common injury, and more patients suffered thoracic trauma or brain injuries. The types of injury seemed to vary with seismic intensity, whereas the anatomical location of the injury did not. Greater seismic intensity of an earthquake is associated with longer delay between the event and the peak in patient admissions and surgeries, higher frequencies of injuries due to being struck or buried, and lower frequencies of injuries due to falls and injuries to the chest and brain. These insights may prove useful for planning rescue interventions in trauma centers near the epicenter.

  18. Exploring Earthquakes in Real-Time

    Science.gov (United States)

    Bravo, T. K.; Kafka, A. L.; Coleman, B.; Taber, J. J.

    2013-12-01

    Earthquakes capture the attention of students and inspire them to explore the Earth. Adding the ability to view and explore recordings of significant and newsworthy earthquakes in real-time makes the subject even more compelling. To address this opportunity, the Incorporated Research Institutions for Seismology (IRIS), in collaboration with Moravian College, developed 'jAmaSeis', a cross-platform application that enables students to access real-time earthquake waveform data. Students can watch as the seismic waves are recorded on their computer, and can be among the first to analyze the data from an earthquake. jAmaSeis facilitates student-centered investigations of seismological concepts using either a low-cost educational seismograph or streamed data from other educational seismographs or from any seismic station that sends data to the IRIS Data Management System. After an earthquake, students can analyze the seismograms to determine characteristics of earthquakes such as time of occurrence, distance from the epicenter to the station, magnitude, and location. The software has been designed to provide graphical clues to guide students in the analysis and assist in their interpretations. Since jAmaSeis can simultaneously record up to three stations from anywhere on the planet, there are numerous opportunities for student driven investigations. For example, students can explore differences in the seismograms from different distances from an earthquake and compare waveforms from different azimuthal directions. Students can simultaneously monitor seismicity at a tectonic plate boundary and in the middle of the plate regardless of their school location. This can help students discover for themselves the ideas underlying seismic wave propagation, regional earthquake hazards, magnitude-frequency relationships, and the details of plate tectonics. The real-time nature of the data keeps the investigations dynamic, and offers students countless opportunities to explore.
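    The station-to-epicenter distance estimate mentioned above is commonly done in classrooms from the S-minus-P arrival lag; a hedged sketch of that calculation (this is a generic approximation, not jAmaSeis code, and the crustal velocities are assumed values):

```python
# Classroom approximation: with constant P and S speeds, the distance d
# satisfies d/vs - d/vp = (S-P lag), so
#   d = lag * (vp * vs) / (vp - vs).
# vp and vs defaults are assumed crustal values (km/s).
def epicentral_distance_km(sp_lag_s, vp=6.0, vs=3.5):
    return sp_lag_s * (vp * vs) / (vp - vs)

print(round(epicentral_distance_km(10.0), 1))  # 10 s lag -> ~84 km
```

Repeating this for three stations lets students triangulate the epicenter, as the abstract describes.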

  19. Possible deep fault slip preceding the 2004 Parkfield earthquake, inferred from detailed observations of tectonic tremor

    Science.gov (United States)

    Shelly, David R.

    2009-01-01

    Earthquake predictability depends, in part, on the degree to which sudden slip is preceded by slow aseismic slip. Recently, observations of deep tremor have enabled inferences of deep slow slip even when detection by other means is not possible, but these data are limited to certain areas and mostly the last decade. The region near Parkfield, California, provides a unique convergence of several years of high-quality tremor data bracketing a moderate earthquake, the 2004 magnitude 6.0 event. Here, I present detailed observations of tectonic tremor from mid-2001 through 2008 that indicate deep fault slip both before and after the Parkfield earthquake that cannot be detected with surface geodetic instruments. While there is no obvious short-term precursor, I find unidirectional tremor migration accompanied by elevated tremor rates in the 3 months prior to the earthquake, which suggests accelerated creep on the fault ∼16 km beneath the eventual earthquake hypocenter.

  20. Earthquake safety program at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Freeland, G.E.

    1985-01-01

    Within three minutes on the morning of January 24, 1980, an earthquake and three aftershocks, with Richter magnitudes of 5.8, 5.1, 4.0, and 4.2, respectively, struck the Livermore Valley. Two days later, a Richter magnitude 5.4 earthquake occurred, which had its epicenter about 4 miles northwest of the Lawrence Livermore National Laboratory (LLNL). Although no one at the Lab was seriously injured, these earthquakes caused considerable damage and disruption. Masonry and concrete structures cracked and broke, trailers shifted and fell off their pedestals, office ceilings and overhead lighting fell, and bookcases overturned. The Laboratory was suddenly immersed in a site-wide program of repairing earthquake-damaged facilities, and protecting our many employees and the surrounding community from future earthquakes. Over the past five years, LLNL has spent approximately $10 million on its earthquake restoration effort for repairs and upgrades. The discussion in this paper centers upon the earthquake damage that occurred, the clean-up and restoration efforts, the seismic review of LLNL facilities, our site-specific seismic design criteria, computer-floor upgrades, ceiling-system upgrades, unique building seismic upgrades, geologic and seismologic studies, and seismic instrumentation. 10 references

  1. Urban MEMS based seismic network for post-earthquakes rapid disaster assessment

    Science.gov (United States)

    D'Alessandro, Antonino; Luzio, Dario; D'Anna, Giuseppe

    2014-05-01

    Life losses following a disastrous earthquake depend mainly on building vulnerability, the intensity of shaking, and the timeliness of rescue operations. In recent decades, the increase in population and industrial density has significantly increased the exposure of urban areas to earthquakes. The potential impact of a strong earthquake on a town center can be reduced by timely and correct actions of the emergency management centers. A real-time urban seismic network can drastically reduce casualties immediately following a strong earthquake by promptly providing information about the distribution of the ground shaking level. Emergency management centers, with functions in the immediate post-earthquake period, could use this information to allocate and prioritize resources to minimize loss of human life. However, due to the high cost of seismological instrumentation, the realization of an urban seismic network, which may allow reducing the rate of fatalities, has not been achieved. Recent developments in MEMS (Micro Electro-Mechanical Systems) technology could today allow the realization of a high-density urban seismic network for post-earthquake rapid disaster assessment, suitable for earthquake effects mitigation. In the 1990s, MEMS accelerometers revolutionized the automotive airbag industry and are today widely used in laptops, game controllers, and mobile phones. Due to their great commercial success, the research into and development of MEMS accelerometers are actively pursued around the world. Nowadays, the sensitivity and dynamics of these sensors are sufficient to allow accurate recording of earthquakes of moderate to strong magnitude. Due to their low cost and small size, MEMS accelerometers may be employed for the realization of high-density seismic networks. The MEMS accelerometers could be installed inside sensitive places (high vulnerability and exposure), such as schools, hospitals, public buildings and places of

  2. A forecast experiment of earthquake activity in Japan under Collaboratory for the Study of Earthquake Predictability (CSEP)

    Science.gov (United States)

    Hirata, N.; Yokoi, S.; Nanjo, K. Z.; Tsuruoka, H.

    2012-04-01

    One major focus of the current Japanese earthquake prediction research program (2009-2013), now integrated with the research program for prediction of volcanic eruptions, is to move toward creating testable earthquake forecast models. For this purpose we started an experiment in forecasting earthquake activity in Japan under the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) through an international collaboration. We established the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan and to conduct verifiable prospective tests of their models' performance, and began the first earthquake forecast testing experiment in Japan within the CSEP framework. We use the earthquake catalogue maintained and provided by the Japan Meteorological Agency (JMA). The experiment consists of 12 categories: 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called "All Japan," "Mainland," and "Kanto." A total of 105 models were submitted and are currently under the CSEP official suite of tests for evaluating forecast performance. The experiment has completed 92 rounds for the 1-day class, 6 for the 3-month class, and 3 for the 1-year class. In the 1-day class, all models passed all of CSEP's evaluation tests in more than 90% of rounds. The results of the 3-month class also gave us new knowledge concerning statistical forecasting models: all models performed well in forecasting magnitudes, but when many earthquakes occurred at a single spot, the observed spatial distribution was hardly consistent with most models. We are now preparing a 3-D forecasting experiment with a depth range of 0 to 100 km in the Kanto region, and the testing center is improving the evaluation system for the 1-day class so that forecasting and testing can be completed within one day. The special issue of 1st part titled Earthquake Forecast
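The scoring behind CSEP's likelihood-based evaluation tests can be sketched as follows, assuming (as the standard tests do) independent Poisson rates per space-magnitude bin. The function name and example numbers are illustrative, not from the experiment described above:

```python
import math
import numpy as np

def poisson_joint_log_likelihood(rates, counts):
    """Joint log-likelihood of observed earthquake counts per bin under
    independent Poisson forecast rates (the quantity scored by CSEP's
    likelihood tests). Rates must be strictly positive."""
    lam = np.asarray(rates, dtype=float)
    n = np.asarray(counts)
    # log P(n | lam) = -lam + n*log(lam) - log(n!), summed over bins
    log_fact = np.array([math.lgamma(k + 1) for k in n])
    return float(np.sum(-lam + n * np.log(lam) - log_fact))
```

A forecast whose rates match the observed counts scores a higher joint log-likelihood than one that systematically over- or under-predicts.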

  3. Final Report Feasibility Study for the California Wave Energy Test Center (CalWavesm) - Volume #2 - Appendices #16-17

    Energy Technology Data Exchange (ETDEWEB)

    Dooher, Brendan [Pacific Gas and Electric Company, San Ramon, CA (United States). Applied Technical Services; Toman, William I. [California Polytechnic State Univ. (CalPoly), San Luis Obispo, CA (United States). Inst. of Advanced Technology and Public Policy; Davy, Doug M. [CH2M Hill Engineers, Inc., Sacramento, CA (United States); Blakslee, Samuel N. [California Polytechnic State Univ. (CalPoly), San Luis Obispo, CA (United States)

    2017-07-31

    The California Wave Energy Test Center (CalWave) Feasibility Study project was funded over multiple phases by the Department of Energy to perform an interdisciplinary feasibility assessment analyzing the engineering, permitting, and stakeholder requirements to establish an open-water, fully energetic, grid-connected wave energy test center off the coast of California, for the purposes of advancing U.S. wave energy research, development, and testing capabilities. Work under this grant included wave energy resource characterization; grid impact and interconnection requirements; port infrastructure and maritime industry capability/suitability to accommodate the industry at research, demonstration, and commercial scale; and macro- and micro-siting considerations. CalWave Phase I performed a macro-siting and down-selection process focusing on two potential test sites in California: Humboldt Bay and Vandenberg Air Force Base. This work resulted in the Vandenberg Air Force Base site being chosen as the most favorable site based on a peer-reviewed criteria matrix. CalWave Phase II focused on four siting location alternatives along the Vandenberg Air Force Base coastline and culminated with a final siting down-selection. Key outcomes from this work include completion of preliminary engineering and systems integration work, a robust turnkey cost estimate, shoreside and subsea hazards assessment, storm wave analysis, lessons-learned reports from several maritime disciplines, test center benchmarking against existing international test sites, analysis of existing applicable environmental literature, completion of a preliminary regulatory, permitting, and licensing roadmap, robust interaction and engagement with state and federal regulatory agency personnel and local stakeholders, and the population of a Draft Federal Energy Regulatory Commission (FERC) Preliminary Application Document (PAD). Analysis of existing offshore oil and gas infrastructure was also performed.

  4. Localization of b-values and maximum earthquakes; B chi to saidai jishin no chiikisei

    Energy Technology Data Exchange (ETDEWEB)

    Kurimoto, H

    1996-05-01

    There is a view that temporal and spatial gaps in earthquake activity contribute to earthquake occurrence probability. On the idea that, if so, this tendency should also appear in statistical parameters of earthquakes, earthquake activity in each ten-year period was investigated by comparing the spatial distribution of the b value (the slope of the frequency-magnitude relation) with the epicenters of earthquakes of M≥7.0. The region surveyed is the Japanese Islands and the surrounding ocean; the unit region was the circle of 100 km radius centered on each node of a grid spaced at 1° in latitude and longitude, with depths divided into shallower and deeper than 60 km. As a result, the following were found: most epicenters of earthquakes with M≥7.0 during the 100-year survey period lie where b≤0.75, although they sometimes lie where b≥0.75 in the area from the ocean near the Izu Peninsula to the ocean off western Hokkaido; and epicenters in the b≤0.75 range appear not to approach the center of the contours marking the local maximum of the b value. 7 refs., 2 figs.
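The b value mapped in studies of this kind is conventionally estimated with Aki's maximum-likelihood formula. A sketch, with illustrative parameters (the study's own gridding and catalogue are described above):

```python
import numpy as np

def b_value_aki(mags, m_c, dm=0.1):
    """Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter
    b value from magnitudes >= completeness threshold m_c; dm is the
    catalogue's magnitude binning (set dm=0 for continuous magnitudes)."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]
    # b = log10(e) / (mean(M) - (m_c - dm/2)), the usual half-bin correction
    return float(np.log10(np.e) / (m.mean() - (m_c - dm / 2.0)))
```

Applied to the events inside each 100-km circle, this yields one b value per grid node, which can then be contoured as in the study.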

  5. Ground-rupturing earthquakes on the northern Big Bend of the San Andreas Fault, California, 800 A.D. to Present

    Science.gov (United States)

    Scharer, Katherine M.; Weldon, Ray; Biasi, Glenn; Streig, Ashley; Fumal, Thomas E.

    2017-01-01

    Paleoseismic data on the timing of ground-rupturing earthquakes constrain the recurrence behavior of active faults and can provide insight on the rupture history of a fault if earthquakes dated at neighboring sites overlap in age and are considered correlative. This study presents the evidence and ages for 11 earthquakes that occurred along the Big Bend section of the southern San Andreas Fault at the Frazier Mountain paleoseismic site. The most recent earthquake to rupture the site was the Mw7.7–7.9 Fort Tejon earthquake of 1857. We use over 30 trench excavations to document the structural and sedimentological evolution of a small pull-apart basin that has been repeatedly faulted and folded by ground-rupturing earthquakes. A sedimentation rate of 0.4 cm/yr and abundant organic material for radiocarbon dating contribute to a record that is considered complete since 800 A.D. and includes 10 paleoearthquakes. Earthquakes have ruptured this location on average every ~100 years over the last 1200 years, but individual intervals range from ~22 to 186 years. The coefficient of variation of the length of time between earthquakes (0.7) indicates quasiperiodic behavior, similar to other sites along the southern San Andreas Fault. Comparison with the earthquake chronology at neighboring sites along the fault indicates that only one other 1857-size earthquake could have occurred since 1350 A.D., and since 800 A.D., the Big Bend and Mojave sections have ruptured together at most 50% of the time in Mw ≥ 7.3 earthquakes.
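Recurrence statistics of the kind quoted above (mean interval of ~100 years, coefficient of variation 0.7) come directly from the interevent times. A sketch of the computation, with made-up event dates rather than the study's dated chronology:

```python
import numpy as np

def recurrence_stats(event_years):
    """Mean recurrence interval and coefficient of variation (COV) from
    a list of earthquake dates (any consistent unit, e.g. years A.D.).
    COV well below 1 indicates quasiperiodic behavior; COV = 1 is the
    Poisson (random-in-time) case."""
    t = np.sort(np.asarray(event_years, dtype=float))
    intervals = np.diff(t)
    mean = float(intervals.mean())
    cov = float(intervals.std(ddof=1) / mean)
    return mean, cov
```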

  6. Using focal mechanism solutions to correlate earthquakes with faults in the Lake Tahoe-Truckee area, California and Nevada, and to help design LiDAR surveys for active-fault reconnaissance

    Science.gov (United States)

    Cronin, V. S.; Lindsay, R. D.

    2011-12-01

    Geomorphic analysis of hillshade images produced from aerial LiDAR data has been successful in identifying youthful fault traces. For example, the recently discovered Polaris fault just northwest of Lake Tahoe, California/Nevada, was recognized using LiDAR data that had been acquired by local government to assist land-use planning. Subsequent trenching by consultants under contract to the US Army Corps of Engineers has demonstrated Holocene displacement. The Polaris fault is inferred to be capable of generating a magnitude 6.4-6.9 earthquake, based on its apparent length and offset characteristics (Hunter and others, 2011, BSSA 101[3], 1162-1181). Dingler and others (2009, GSA Bull 121[7/8], 1089-1107) describe paleoseismic or geomorphic evidence for late Neogene displacement along other faults in the area, including the West Tahoe-Dollar Point, Stateline-North Tahoe, and Incline Village faults. We have used the seismo-lineament analysis method (SLAM; Cronin and others, 2008, Env Eng Geol 14[3], 199-219) to establish a tentative spatial correlation between each of the previously mentioned faults, as well as with segments of the Dog Valley fault system, and one or more earthquake(s). The ~18 earthquakes we have tentatively correlated with faults in the Tahoe-Truckee area occurred between 1966 and 2008, with magnitudes between 3 and ~6. Given the focal mechanism solution for a well-located shallow-focus earthquake, the nodal planes can be projected to Earth's surface as represented by a DEM, plus-or-minus the vertical and horizontal uncertainty in the focal location, to yield two seismo-lineament swaths. The trace of the fault that generated the earthquake is likely to be found within one of the two swaths [1] if the fault surface is emergent, and [2] if the fault surface is approximately planar in the vicinity of the focus. Seismo-lineaments from several of the earthquakes studied overlap in a manner that suggests they are associated with the same fault. 
The surface
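In the simplest flat-earth view, projecting a nodal plane to the surface reduces to trigonometry. This sketch (illustrative function names, no DEM, which the actual method uses) shows where the swath center and its depth-uncertainty widening come from:

```python
import math

def updip_surface_offset_km(depth_km, dip_deg):
    """Horizontal distance from the hypocenter to where a planar nodal
    plane of the given dip would reach the surface (flat-earth sketch;
    the real method projects onto a DEM)."""
    return depth_km / math.tan(math.radians(dip_deg))

def swath_half_width_km(dip_deg, depth_uncert_km):
    """Extra half-width of the surface swath contributed by +/- focal
    depth uncertainty, for the same planar geometry."""
    return depth_uncert_km / math.tan(math.radians(dip_deg))
```

A vertical plane (dip 90°) projects to a line through the epicenter; shallow dips push the swath far up-dip and widen it for the same depth uncertainty.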

  7. Empirical relations between instrumental and seismic parameters of some strong earthquakes of Colombia

    International Nuclear Information System (INIS)

    Marin Arias, Juan Pablo; Salcedo Hurtado, Elkin de Jesus; Castillo Gonzalez, Hardany

    2008-01-01

    In order to establish relationships between macroseismic and instrumental parameters, the macroseismic fields of 28 historical earthquakes that produced great effects in Colombian territory were studied. The parameters were integrated using the methodology of Kaussel and Ramirez (1992) for great Chilean earthquakes, and of Kanamori and Anderson (1975) and Wells and Coppersmith (1994) for worldwide earthquakes. Once the macroseismic and instrumental parameters were determined, a source model was established for each earthquake, completing the database of these parameters. For each earthquake, parameters related to the local and normal macroseismic epicenters were compiled: depth of the local and normal centers, horizontal extension of both centers, vertical extension of the normal center, source model, and rupture area. The empirical relations obtained from linear equations show behavior very similar to that found by other authors for other regions of the world and at the global level. The results of this work establish that a certain mutual incompatibility exists between the rupture area and the rupture length determined by macroseismic methods and the parameters found from instrumental data, such as seismic moment, Ms magnitude, and Mw magnitude.
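Empirical relations of this kind, e.g., magnitude versus rupture length in the form M = a + b·log10(L), reduce to a linear fit in log space. A sketch with synthetic data (the coefficients are illustrative, in the style of Wells and Coppersmith, 1994, not values from this study):

```python
import numpy as np

def fit_magnitude_length(mags, lengths_km):
    """Least-squares fit of M = a + b*log10(L), the standard functional
    form of empirical source-scaling relations. Returns (a, b)."""
    x = np.log10(np.asarray(lengths_km, dtype=float))
    b, a = np.polyfit(x, np.asarray(mags, dtype=float), 1)  # slope, intercept
    return float(a), float(b)
```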

  8. Sensitivity of Earthquake Loss Estimates to Source Modeling Assumptions and Uncertainty

    Science.gov (United States)

    Reasenberg, Paul A.; Shostak, Nan; Terwilliger, Sharon

    2006-01-01

    Introduction: This report explores how uncertainty in an earthquake source model may affect estimates of earthquake economic loss. Specifically, it focuses on the earthquake source model for the San Francisco Bay region (SFBR) created by the Working Group on California Earthquake Probabilities (WG02). The loss calculations are made using HAZUS-MH, a publicly available computer program developed by the Federal Emergency Management Agency (FEMA) for calculating future losses from earthquakes, floods, and hurricanes within the United States. The database built into HAZUS-MH includes a detailed building inventory, population data, and data on transportation corridors, bridges, utility lifelines, etc. Earthquake hazard in the loss calculations is based upon expected (median value) ground motion maps, called ShakeMaps, calculated for the scenario earthquake sources defined by WG02. The study considers the effect of relaxing certain assumptions in the WG02 model and explores the effect of hypothetical reductions in epistemic uncertainty in parts of the model. For example, it addresses questions such as: what would happen to the calculated loss distribution if the uncertainty in slip rate in the WG02 model were reduced (say, by obtaining additional geologic data)? What would happen if the geometry or amount of aseismic slip (creep) on the region's faults were better known? And what would be the effect on the calculated loss distribution if the time-dependent earthquake probability were better constrained, either by eliminating certain probability models or by better constraining the inherent randomness in earthquake recurrence? The study does not consider the effect of reducing uncertainty in the hazard introduced through models of attenuation and local site characteristics, although these may have a comparable or greater effect than does source-related uncertainty. Nor does it consider sources of uncertainty in the building inventory, building fragility curves, and other assumptions.
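One way to see how reduced slip-rate uncertainty propagates into a source model's inputs is a small Monte Carlo over the earthquake rate. This sketch makes the deliberately simple (non-WG02) assumption that slip accrues in characteristic events of fixed size; names and numbers are illustrative:

```python
import numpy as np

def recurrence_rate_samples(slip_rate_mmyr, slip_rate_sd, slip_per_event_m,
                            n=20000, seed=0):
    """Monte Carlo propagation of slip-rate uncertainty into the mean
    earthquake rate of a fault segment, under the illustrative
    assumption of characteristic events of fixed slip."""
    rng = np.random.default_rng(seed)
    sr = rng.normal(slip_rate_mmyr, slip_rate_sd, n)
    sr = sr[sr > 0] / 1000.0            # mm/yr -> m/yr, discard negatives
    return sr / slip_per_event_m        # events per year
```

Tightening the slip-rate standard deviation narrows the resulting distribution of rates, which is the kind of effect the study traces through to the loss distribution.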

  9. The smart cluster method. Adaptive earthquake cluster identification and analysis in strong seismic regions

    Science.gov (United States)

    Schaefer, Andreas M.; Daniell, James E.; Wenzel, Friedemann

    2017-07-01

    Earthquake clustering is an essential part of almost any statistical analysis of the spatial and temporal properties of seismic activity. The nature of earthquake clusters and the subsequent declustering of earthquake catalogues play a crucial role in determining the magnitude-dependent earthquake return period and its spatial variation for probabilistic seismic hazard assessment. This study introduces the Smart Cluster Method (SCM), a new methodology to identify earthquake clusters, which uses an adaptive point process for spatio-temporal cluster identification. It utilises the magnitude-dependent spatio-temporal earthquake density to adjust the search properties, then analyses the identified clusters to determine directional variation and adjusts its search space with respect to directional properties. In the case of rapid subsequent ruptures like the 1992 Landers sequence or the 2010-2011 Darfield-Christchurch sequence, a reclassification procedure is applied to disassemble subsequent ruptures using near-field searches, nearest-neighbour classification, and temporal splitting. The method is capable of identifying and classifying earthquake clusters in space and time. It has been tested and validated using earthquake data from California and New Zealand; more than 1500 clusters with Mmin = 2.0 have been found in the two regions since 1980. Utilising the knowledge of cluster classification, the method has been adjusted to provide an earthquake declustering algorithm, whose performance is comparable to established methodologies. The analysis of earthquake clustering statistics leads to various new and updated correlation functions, e.g. for the ratio between mainshock and strongest aftershock and for general aftershock activity metrics.
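A fixed-window declustering pass, the non-adaptive ancestor of methods like SCM, can be sketched in a few lines. The window sizes and the flat-earth distance below are deliberate simplifications, not SCM's adaptive machinery:

```python
import numpy as np

def decluster(times_days, lons, lats, mags, t_win_days=100.0, r_win_km=50.0):
    """Toy window declustering: every smaller event inside a fixed
    space-time window after a larger event is flagged as an aftershock.
    Returns a boolean mask, True = mainshock/background event."""
    n = len(mags)
    order = np.argsort(mags)[::-1]            # process biggest shocks first
    is_aftershock = np.zeros(n, dtype=bool)
    for i in order:
        if is_aftershock[i]:
            continue
        dt = times_days - times_days[i]
        # crude flat-earth distances in km
        dx = (lons - lons[i]) * 111.0 * np.cos(np.radians(lats[i]))
        dy = (lats - lats[i]) * 111.0
        near = (dt > 0) & (dt <= t_win_days) & (np.hypot(dx, dy) <= r_win_km)
        is_aftershock |= near & (mags < mags[i])
    return ~is_aftershock
```

Adaptive methods replace the fixed window with one scaled to magnitude and local event density, which is the core idea SCM builds on.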

  10. The 1964 Great Alaska Earthquake and tsunamis: a modern perspective and enduring legacies

    Science.gov (United States)

    Brocher, Thomas M.; Filson, John R.; Fuis, Gary S.; Haeussler, Peter J.; Holzer, Thomas L.; Plafker, George; Blair, J. Luke

    2014-01-01

    The magnitude 9.2 Great Alaska Earthquake that struck south-central Alaska at 5:36 p.m. on Friday, March 27, 1964, is the largest recorded earthquake in U.S. history and the second-largest earthquake recorded with modern instruments. The earthquake was felt throughout most of mainland Alaska, as far west as Dutch Harbor in the Aleutian Islands some 480 miles away, and at Seattle, Washington, more than 1,200 miles to the southeast of the fault rupture, where the Space Needle swayed perceptibly. The earthquake caused rivers, lakes, and other waterways to slosh as far away as the coasts of Texas and Louisiana. Water-level recorders in 47 states—the entire Nation except for Connecticut, Delaware, and Rhode Island—registered the earthquake. It was so large that it caused the entire Earth to ring like a bell: vibrations that were among the first of their kind ever recorded by modern instruments. The Great Alaska Earthquake spawned thousands of lesser aftershocks and hundreds of damaging landslides, submarine slumps, and other ground failures. Alaska’s largest city, Anchorage, located west of the fault rupture, sustained heavy property damage. Tsunamis produced by the earthquake resulted in deaths and damage as far away as Oregon and California. Altogether the earthquake and subsequent tsunamis caused 129 fatalities and an estimated $2.3 billion in property losses (in 2013 dollars). Most of the population of Alaska and its major transportation routes, ports, and infrastructure lie near the eastern segment of the Aleutian Trench that ruptured in the 1964 earthquake. Although the Great Alaska Earthquake was tragic because of the loss of life and property, it provided a wealth of data about subduction-zone earthquakes and the hazards they pose. The leap in scientific understanding that followed the 1964 earthquake has led to major breakthroughs in earth science research worldwide over the past half century. This fact sheet commemorates the Great Alaska Earthquake and

  11. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) remains, even after 25 years, one of the most painful events etched in Egyptians' memory, not because of the strength of the earthquake but because of the accompanying losses and damage (561 dead; 10,000 injured; and 3,000 families left homeless). Nowadays, the most frequent and important question is: what if this earthquake were repeated today? In this study, we simulate the ground motion shaking of an earthquake of the same size as that of 12 October 1992 and the consequent socioeconomic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the assessment clearly indicates that the losses and damage could double or triple in Cairo compared to the 1992 earthquake. The risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Sharq) lie at high seismic risk, while three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates show that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk: deteriorating buildings and infrastructure make the city particularly vulnerable to earthquakes. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb), and about 75% of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management.
Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  12. California Earthquake Clearinghouse Crisis Information-Sharing Strategy in Support of Situational Awareness, Understanding Interdependencies of Critical Infrastructure, Regional Resilience, Preparedness, Risk Assessment/mitigation, Decision-Making and Everyday Operational Needs

    Science.gov (United States)

    Rosinski, A.; Morentz, J.; Beilin, P.

    2017-12-01

    The principal function of the California Earthquake Clearinghouse is to provide State and Federal disaster response managers, and the scientific and engineering communities, with prompt information on ground failure, structural damage, and other consequences of significant seismic events such as earthquakes and tsunamis. The overarching problem highlighted in discussions with Clearinghouse partners is the confusion and frustration of many Operational Area representatives, and of some regional utilities throughout the state, over which software applications they should use and maintain to meet State, Federal, and local requirements, for what purposes, and how to deal with the limitations of these applications. This problem impedes meaningful progress on developing multi-application interoperability and the supporting cross-sector information-sharing procedures, as well as dialogue on the essential common operational information that entities need to share for different all-hazards missions and the related operational activities associated with continuity, security, and resilience. The XchangeCore-based system the Clearinghouse is evolving helps deal with this problem without compounding it by introducing yet another end-user application: there is no end-user interface with which one views XchangeCore; all viewing of data provided through XchangeCore occurs in existing, third-party operational applications. The Clearinghouse's efforts with XchangeCore are compatible with FEMA, which currently uses XchangeCore-provided data for regional and National Business Emergency Operations Center (the source of business information sharing during emergencies) response. Also important, and worth emphasizing, is that information sharing serves not just response but also preparedness, risk assessment/mitigation, decision-making, and everyday operational needs for situational awareness. In other words, the benefits of the Clearinghouse

  13. Evidence for Ancient Mesoamerican Earthquakes

    Science.gov (United States)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage that should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb; fractures in walls, floors, basal platforms, and tableros; toppling of columns; and deformation, settling, and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34% g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Maya Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period, centered in the vicinity of the Chixoy-Polochic and Motagua fault zones, could have produced the contemporaneous earthquake damage at the above sites. As a consequence, this earthquake may have accelerated the

  14. Assessing Threats and Conservation Status of Historical Centers of Oak Richness in California

    Directory of Open Access Journals (Sweden)

    Kelly Jane Easterday

    2016-12-01

    Full Text Available Oak trees are emblematic of California landscapes; they serve as keystone cultural and ecological species and as indicators of natural biological diversity. As historically undeveloped landscapes are increasingly converted to urban environments, endemic oak woodland extent is reduced, which underscores the importance of strategic placement and reintroduction of oaks and woodland landscape for the maintenance of biodiversity and the reduction of habitat fragmentation. This paper investigated the effects of urban development on oak species in California by first modeling historical patterns of richness for eight oak tree species using historical map and plot data from the California Vegetation Type Mapping (VTM) collection. We then examined spatial intersections between hot spots of historical oak richness and modern urban and conservation lands, and found that impacts from development and conservation vary by both species and richness. Our findings suggest that the impact of urban development on oaks has been small within the areas of highest oak richness, but that these areas are also poorly conserved. Third, we argue that current policy measures are inadequate to conserve oak woodlands, suggest regions where acquisition of conservation lands should be prioritized, and examine urban regions where historic centers of oak richness were lost as potential frontiers for oak reintroduction. We argue that urban planning could benefit from the adoption of historical data and the modern species distribution modeling techniques primarily used in the natural resources and conservation fields to better locate hot spots of species richness and understand where habitats and species have been lost historically, and could use this evidence as incentive to recover what was lost and preserve what still exists. This adoption of historical data and modern techniques would then serve as a paradigm shift in the way urban planners recognize, quantify, and use landscape

  15. California's restless giant: the Long Valley Caldera

    Science.gov (United States)

    Hill, David P.; Bailey, Roy A.; Hendley, James W.; Stauffer, Peter H.; Marcaida, Mae

    2014-01-01

    Scientists have monitored geologic unrest in the Long Valley, California, area since 1980. In that year, following a swarm of strong earthquakes, they discovered that the central part of the Long Valley Caldera had begun actively rising. Unrest in the area persists today. The U.S. Geological Survey (USGS) continues to provide the public and civil authorities with current information on the volcanic hazard at Long Valley and is prepared to give timely warnings of any impending eruption.

  16. REMOTE OPERATION OF THE WEST COAST AND ALASKA TSUNAMI WARNING CENTER

    Directory of Open Access Journals (Sweden)

    Alec H. Medbery

    2002-01-01

    The remote control of real-time derivation of earthquake locations and magnitudes, and of the issuance of tsunami and earthquake bulletins, was accomplished using off-the-shelf remote control software and hardware. Such remote operation of the West Coast/Alaska Tsunami Warning Center can decrease the time needed to respond to an earthquake by eliminating travel from the duty stander's home to the tsunami warning center.

  17. Engineering geological aspect of Gorkha Earthquake 2015, Nepal

    Science.gov (United States)

    Adhikari, Basanta Raj; Andermann, Christoff; Cook, Kristen

    2016-04-01

    Strong shaking by earthquakes causes massive landsliding with severe effects on infrastructure and human lives. The distribution of landslides and other hazards depends on the combination of earthquake and local characteristics that influence the dynamic response of hillslopes. The Himalayas are one of the most active mountain belts, with several kilometers of relief, and are very prone to catastrophic mass failure. Strong and shallow earthquakes are very common and cause widespread collapse of hillslopes, increasing the background landslide rate by several orders of magnitude. The Himalaya has experienced many small and large earthquakes in the past, e.g., the 1934 Bihar-Nepal earthquake (Ms 8.2), the great 1905 Kangra earthquake (Ms 7.8), and the 2015 Gorkha earthquake (Mw 7.8). The Gorkha earthquake occurred on and around the Main Himalayan Thrust at a hypocentral depth of 15 km (GEER 2015) and was followed by a Mw 7.3 aftershock near Kodari, together causing 8,700+ deaths and leaving hundreds of thousands homeless. Most of the 3,000 aftershocks located by the National Seismological Center (NSC) within the first 45 days following the Gorkha earthquake are concentrated in a narrow 40-km-wide band at midcrustal to shallow depth along the strike of the southern slope of the high Himalaya (Adhikari et al. 2015), and the ground shaking was substantially lower in the short-period range than would be expected for an earthquake of this magnitude (Moss et al. 2015). The effects of this earthquake are distinctive in the affected areas, showing topographic effects, liquefaction, and land subsidence. More than 5,000 landslides were triggered by this earthquake (Earthquake without Frontiers, 2015). Most of the landslides are shallow, occurred in weathered bedrock, and appear to have mobilized primarily as raveling failures, rock slides, and rock falls. The majority of landslides are limited to a zone that runs east-west, approximately parallel to the Lesser and Higher Himalaya. There are numerous cracks in

  18. Quantitative prediction of strong motion for a potential earthquake fault

    Directory of Open Access Journals (Sweden)

    Shamita Das

    2010-02-01

    This paper describes a new method for calculating strong motion records for a given seismic region on the basis of the laws of physics, using information on the tectonics and physical properties of the earthquake fault. Our method is based on an earthquake model, called a «barrier model», which is characterized by five source parameters: fault length, width, maximum slip, rupture velocity, and barrier interval. The first three parameters may be constrained from plate tectonics, and the fourth parameter is roughly a constant. The most important parameter controlling the earthquake strong motion is the last one, the «barrier interval». There are three methods to estimate the barrier interval for a given seismic region: (1) surface measurement of slip across fault breaks, (2) model fitting with observed near- and far-field seismograms, and (3) scaling law data for small earthquakes in the region. The barrier intervals were estimated for a dozen earthquakes and four seismic regions by the above three methods. Our preliminary results for California suggest that the barrier interval may be determined if the maximum slip is given. The relation between the barrier interval and maximum slip varies from one seismic region to another. For example, the interval appears to be unusually long for Kilauea, Hawaii, which may explain why only scattered evidence of strong ground shaking was observed in the epicentral area of the Island of Hawaii earthquake of November 29, 1975. The stress drop associated with an individual fault segment, estimated from the barrier interval and maximum slip, lies between 100 and 1000 bars. These values are about one order of magnitude greater than those estimated earlier by the use of crack models without barriers. Thus, the barrier model can resolve, at least partially, the well-known discrepancy between the stress drops measured in the laboratory and those estimated for earthquakes.
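The order of magnitude of a segment stress drop like those quoted above (100-1000 bars) can be illustrated with a circular-crack formula, taking the segment radius as half the barrier interval. The rigidity and crack factor below are textbook assumptions, not values from the paper:

```python
import math

def segment_stress_drop_bars(max_slip_m, barrier_interval_km,
                             rigidity_pa=3.0e10):
    """Crack-style stress-drop estimate for a single fault segment,
    delta_sigma = (7*pi/16) * mu * D / a, with segment radius a taken
    as half the barrier interval. Constants are illustrative."""
    a_m = barrier_interval_km * 1000.0 / 2.0
    delta_sigma_pa = (7.0 * math.pi / 16.0) * rigidity_pa * max_slip_m / a_m
    return delta_sigma_pa / 1.0e5       # 1 bar = 1e5 Pa
```

With, say, 2 m of maximum slip and a 10 km barrier interval, this lands within the 100-1000 bar range the paper reports.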

  19. Common floor system vertical earthquake-proof structure for reactor equipment

    International Nuclear Information System (INIS)

    Morishita, Masaki.

    1996-01-01

In an LMFBR type reactor, a reactor container, a recirculation pump, and a heat exchanger are disposed on a common floor. Vertical earthquake-proof devices, which can stretch only in the vertical direction and are formed by laminating large-sized belleville springs, are disposed on a concrete wall at the circumference of each reactor component. A common floor is placed on all of the vertical earthquake-proof devices to support the entire earthquake-proof structure simultaneously. If each reactor component is loaded on the common floor and the common floor as a whole is supported against earthquakes, then, since the movement of each component loaded on the common floor is identical, no relative displacement is exerted on the main pipelines which connect the components. In addition, since the entire earthquake-proof structure has a flat common floor and each reactor component is suspended to minimize the distance between the center of gravity and the support point, rocking vibration is less likely to be caused by horizontal earthquakes. (N.H.)

  20. COMPARING SEA LEVEL RESPONSE AT MONTEREY, CALIFORNIA FROM THE 1989 LOMA PRIETA EARTHQUAKE AND THE 1964 GREAT ALASKAN EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    L. C. Breaker

    2009-01-01

Full Text Available Two of the largest earthquakes to affect water levels in Monterey Bay in recent years were the Loma Prieta Earthquake (LPE) of 1989, with a moment magnitude of 6.9, and the Great Alaskan Earthquake (GAE) of 1964, with a moment magnitude of 9.2. In this study, we compare the sea level responses to these events, with a primary focus on their frequency content and how the bay itself affected it. Singular Spectrum Analysis (SSA) was employed to extract the primary frequencies associated with each event. It is not clear how or exactly where the tsunami associated with the LPE was generated, but it occurred inside the bay and most likely began to take on the characteristics of a seiche by the time it reached the tide gauge in Monterey Harbor. Results of the SSA decomposition revealed two primary periods of oscillation: 9-10 minutes and 31-32 minutes. The first oscillation is in agreement with the range of periods expected for the natural oscillations of Monterey Harbor, and the second is consistent with a bay-wide oscillation or seiche mode. SSA decomposition of the GAE record revealed several sequences of oscillations, all with a period of approximately 37 minutes, which corresponds to the predicted, and previously observed, transverse mode of oscillation for Monterey Bay. In this case, it appears that the tsunami produced quarter-wave resonance within the bay consistent with its seiche-like response. Overall, the sea level responses to the LPE and GAE differed greatly, not only because of the large difference in their magnitudes but also because the driving force in one case occurred inside the bay (LPE) and in the second, outside the bay (GAE). As a result, different modes of oscillation were excited.
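The SSA procedure used in this record reduces to three steps: embed the sea level series in a lagged trajectory matrix, decompose it with an SVD, and reconstruct individual oscillatory components by diagonal averaging. A generic textbook sketch, not the authors' implementation; the window length and component count below are illustrative:

```python
import numpy as np

def ssa_components(x, window, n_comp=2):
    """Basic Singular Spectrum Analysis: embed, decompose, reconstruct."""
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix built from lagged copies of the series
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    comps = []
    for j in range(n_comp):
        Xj = s[j] * np.outer(U[:, j], Vt[j])
        # Diagonal averaging maps the rank-1 matrix back to a time series
        rec = np.array([np.mean(Xj[::-1].diagonal(i - window + 1))
                        for i in range(n)])
        comps.append(rec)
    return comps
```

An oscillation such as the 37-minute seiche mode shows up as a pair of nearly equal singular values, so its reconstruction is the sum of the two leading components.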

  1. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
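The core idea of the NTW method, conditioning large-earthquake probability on the count of small earthquakes since the last large one, can be sketched with a Weibull distribution in natural time. The shape and scale values below are illustrative placeholders, not the fitted NTW parameters:

```python
import math

def ntw_conditional_prob(n_small, delta_n, n_bar, beta=1.5):
    """Conditional probability that the next large earthquake occurs
    within the next delta_n small events, given that n_small have
    occurred since the last large one (Weibull in natural time)."""
    def survival(n):
        # Weibull survival function in event counts, scale n_bar, shape beta
        return math.exp(-(n / n_bar) ** beta)
    return 1.0 - survival(n_small + delta_n) / survival(n_small)
```

With shape beta > 1 the hazard grows with the small-earthquake count, so the conditional probability rises as more small events accumulate, which is the qualitative behavior the forecast exploits.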

  2. Energy efficient data centers

    Energy Technology Data Exchange (ETDEWEB)

    Tschudi, William; Xu, Tengfang; Sartor, Dale; Koomey, Jon; Nordman, Bruce; Sezgen, Osman

    2004-03-30

Data center facilities, prevalent in many industries and institutions, are essential to California's economy. Energy intensive data centers are crucial to California's industries and many other institutions (such as universities) in the state, and they play an important role in the constantly evolving communications industry. To better understand the impact of the energy requirements and the energy efficiency improvement potential in these facilities, the California Energy Commission's PIER Industrial Program initiated this project with two primary focus areas: first, to characterize current data center electricity use; and second, to develop a research "roadmap" defining and prioritizing possible future public interest research and deployment efforts that would improve energy efficiency. Although there are many opinions concerning the energy intensity of data centers and the aggregate effect on California's electrical power systems, there is very little publicly available information. Through this project, actual energy consumption at its end use was measured in a number of data centers. This benchmark data was documented in case study reports, along with site-specific energy efficiency recommendations. Additionally, other data center energy benchmarks were obtained through synergistic projects, prior PG&E studies, and industry contacts. In total, energy benchmarks for sixteen data centers were obtained. For this project, a broad definition of "data center" was adopted which included internet hosting, corporate, institutional, governmental, educational and other miscellaneous data centers. Typically these facilities require specialized infrastructure to provide high quality power and cooling for IT equipment. All of these data center types were considered in the development of an estimate of the total power consumption in California. Finally, a research "roadmap" was developed.

  3. Retrospective Evaluation of the Long-Term CSEP-Italy Earthquake Forecasts

    Science.gov (United States)

    Werner, M. J.; Zechar, J. D.; Marzocchi, W.; Wiemer, S.

    2010-12-01

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We considered the twelve time-independent earthquake forecasts among this set and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. Here, we present the results of tests that measure the consistency of the forecasts with the past observations. Besides being an evaluation of the submitted time-independent forecasts, this exercise provided insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between the robustness of results and experiment duration.
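One of the standard CSEP consistency checks, the N-test, reduces to comparing the observed number of target earthquakes against the forecast total under a Poisson assumption. A minimal sketch (the actual testing-center software also supports simulation-based forecasts and other tests):

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for a Poisson random variable with mean lam."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i)
               for i in range(k + 1))

def n_test(n_forecast, n_observed):
    """CSEP-style N-test: Poisson consistency of the total event count.
    Returns (delta1, delta2): the probabilities of observing at least,
    and at most, n_observed events under the forecast rate."""
    delta1 = 1.0 - poisson_cdf(n_observed - 1, n_forecast)  # P(X >= n_obs)
    delta2 = poisson_cdf(n_observed, n_forecast)            # P(X <= n_obs)
    return delta1, delta2
```

A forecast is flagged as inconsistent when either quantile falls below the chosen significance level, e.g. 0.025 for a two-sided test.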

  4. Tomographic Rayleigh wave group velocities in the Central Valley, California, centered on the Sacramento/San Joaquin Delta

    Science.gov (United States)

    Fletcher, Jon B.; Erdem, Jemile; Seats, Kevin; Lawrence, Jesse

    2016-04-01

    If shaking from a local or regional earthquake in the San Francisco Bay region were to rupture levees in the Sacramento/San Joaquin Delta, then brackish water from San Francisco Bay would contaminate the water in the Delta: the source of freshwater for about half of California. As a prelude to a full shear-wave velocity model that can be used in computer simulations and further seismic hazard analysis, we report on the use of ambient noise tomography to build a fundamental mode, Rayleigh wave group velocity model for the region around the Sacramento/San Joaquin Delta in the western Central Valley, California. Recordings from the vertical component of about 31 stations were processed to compute the spatial distribution of Rayleigh wave group velocities. Complex coherency between pairs of stations was stacked over 8 months to more than a year. Dispersion curves were determined from 4 to about 18 s. We calculated average group velocities for each period and inverted for deviations from the average for a matrix of cells that covered the study area. Smoothing using the first difference is applied. Cells of the model were about 5.6 km in either dimension. Checkerboard tests of resolution, which are dependent on station density, suggest that the resolving ability of the array is reasonably good within the middle of the array with resolution between 0.2 and 0.4°. Overall, low velocities in the middle of each image reflect the deeper sedimentary syncline in the Central Valley. In detail, the model shows several centers of low velocity that may be associated with gross geologic features such as faulting along the western margin of the Central Valley, oil and gas reservoirs, and large crosscutting features like the Stockton arch. At shorter periods around 5.5 s, the model's western boundary between low and high velocities closely follows regional fault geometry and the edge of a residual isostatic gravity low. In the eastern part of the valley, the boundaries of the low
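The inversion step described above, removing the average group velocity and solving for cell-by-cell deviations with first-difference smoothing, amounts to a damped least-squares problem. A toy sketch with straight rays and two cells (the real processing uses many more cells, paths, and periods):

```python
import numpy as np

def invert_slowness(L, t, D, damping=1.0):
    """Damped least-squares inversion for cell slownesses.
    L: (n_paths, n_cells) ray-length matrix, t: inter-station travel
    times, D: first-difference smoothing operator between adjacent
    cells. Returns cell group velocities."""
    A = np.vstack([L, damping * D])
    b = np.concatenate([t, np.zeros(D.shape[0])])
    s, *_ = np.linalg.lstsq(A, b, rcond=None)
    return 1.0 / s
```

Raising the damping trades fit to the travel times for smoothness between neighboring cells, which is the same trade-off the checkerboard tests probe.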

  5. Tomographic Rayleigh-wave group velocities in the Central Valley, California centered on the Sacramento/San Joaquin Delta

    Science.gov (United States)

    Fletcher, Jon Peter B.; Erdem, Jemile; Seats, Kevin; Lawrence, Jesse

    2016-01-01

If shaking from a local or regional earthquake in the San Francisco Bay region were to rupture levees in the Sacramento/San Joaquin Delta, then brackish water from San Francisco Bay would contaminate the water in the Delta: the source of fresh water for about half of California. As a prelude to a full shear-wave velocity model that can be used in computer simulations and further seismic hazard analysis, we report on the use of ambient noise tomography to build a fundamental-mode, Rayleigh-wave group velocity model for the region around the Sacramento/San Joaquin Delta in the western Central Valley, California. Recordings from the vertical component of about 31 stations were processed to compute the spatial distribution of Rayleigh wave group velocities. Complex coherency between pairs of stations was stacked over 8 months to more than a year. Dispersion curves were determined from 4 to about 18 seconds. We calculated average group velocities for each period and inverted for deviations from the average for a matrix of cells that covered the study area. Smoothing using the first difference is applied. Cells of the model were about 5.6 km in either dimension. Checkerboard tests of resolution, which are dependent on station density, suggest that the resolving ability of the array is reasonably good within the middle of the array, with resolution between 0.2 and 0.4 degrees. Overall, low velocities in the middle of each image reflect the deeper sedimentary syncline in the Central Valley. In detail, the model shows several centers of low velocity that may be associated with gross geologic features such as faulting along the western margin of the Central Valley, oil and gas reservoirs, and large crosscutting features like the Stockton arch. At shorter periods around 5.5 s, the model's western boundary between low and high velocities closely follows regional fault geometry and the edge of a residual isostatic gravity low. In the eastern part of the valley, the boundaries

  6. Mexican Earthquakes and Tsunamis Catalog Reviewed

    Science.gov (United States)

    Ramirez-Herrera, M. T.; Castillo-Aja, R.

    2015-12-01

Today the availability of information on the internet makes online catalogs very easy to access by both scholars and the general public. The catalog in the "Significant Earthquake Database", managed by the National Center for Environmental Information (NCEI, formerly NCDC), NOAA, provides access through tabular and cartographic displays of the earthquake and tsunami data contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in the places where a disaster occurred, and on the description being preserved in documents and oral tradition. In the case of instrumental data, availability depends on the distribution and quality of seismic stations. Therefore, the availability of information for the first half of the 20th century can be improved by careful analysis of the available information and by searching for and resolving inconsistencies. This study shows the advances we made in upgrading and refining data for the earthquake and tsunami catalog of Mexico from 1500 CE until today, presented in the format of a table and map. Data analysis allowed us to identify the following sources of error in the location of epicenters in existing catalogs:
• incorrect coordinate entry;
• erroneous or mistaken place names;
• data too general to locate the epicenter, mainly for older earthquakes;
• inconsistency between earthquake and tsunami occurrence, e.g., an epicenter located too far inland reported as tsunamigenic.
The process of completing the catalogs directly depends on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and

  7. Borehole P- and S-wave velocity at thirteen stations in Southern California

    Science.gov (United States)

    Gibbs, James F.; Boore, David M.; Tinsley, John C.; Mueller, Charles S.

    2001-01-01

    The U.S. Geological Survey (USGS), as part of a program to acquire seismic velocity data at locations of strong-ground motion in earthquakes (e.g., Gibbs et al., 2000), has investigated thirteen additional sites in the Southern California region. Of the thirteen sites, twelve are in the vicinity of Whittier, California, and one is located in San Bernardino, California. Several deployments of temporary seismographs were made after the Whittier Narrows, California earthquake of 1 October 1987 (Mueller et al., 1988). A deployment, between 2 October and 9 November 1987, was the motivation for selection of six of the drill sites. Temporary portable seismographs at Hoover School (HOO), Lincoln School (LIN), Corps of Engineers Station (NAR), Olive Junior High School (OLV), Santa Anita Golf Course (SAG), and Southwestern Academy (SWA) recorded significant aftershock data. These portable sites, with the exception of Santa Anita Golf Course, were co-sited with strong-motion recorders. Stations at HOO, Lincoln School Whittier (WLB), Saint Paul High School (STP), Alisos Adult School (EXC), Cerritos College Gymnasium (CGM), Cerritos College Physical Science Building (CPS), and Cerritos College Police Building (CPB) were part of an array of digital strong-motion stations deployed from "bedrock" in Whittier to near the deepest part of the Los Angeles basin in Norwalk. Although development and siting of this new array (partially installed at the time of this writing) was generally motivated by the Whittier Narrows earthquake, these new sites (with the exception of HOO) were not part of any Whittier Narrows aftershock deployments. A similar new digital strong-motion site was installed at the San Bernardino Fire Station during the same time frame. Velocity data were obtained to depths of about 90 meters at two sites, 30 meters at seven sites, and 18 to 25 meters at four sites. 
Lithology data from the analysis of cuttings and samples were obtained from the two 90-meter deep holes and

  8. Martin Marietta Paducah Gaseous Diffusion Plant comprehensive earthquake emergency management program

    International Nuclear Information System (INIS)

    1990-01-01

Recognizing the value of a proactive, integrated approach to earthquake preparedness planning, Martin Marietta Energy Systems, Inc. initiated a contract in June 1989 with Murray State University, Murray, Kentucky, to develop a comprehensive earthquake management program for their Gaseous Diffusion Plant in Paducah, Kentucky. The overall purpose of the program is to mitigate the loss of life and property in the event of a major destructive earthquake. The program includes four distinct (yet integrated) components: an emergency management plan, with emphasis on the catastrophic earthquake; an Emergency Operations Center Duty Roster Manual; an Integrated Automated Emergency Management Information System (IAEMIS); and a series of five training program modules. The PLAN itself comprises four separate volumes: Volume I, Chapters 1-3; Volume II, Chapters 4-6; Volume III, Chapter 7; and Volume IV, 23 appendices. The EOC Manual (which includes 15 mutual aid agreements) is designated as Chapter 7 in the PLAN and is a "stand-alone" document numbered as Volume III. This document, Volume I, provides an introduction, a summary and recommendations, and the emergency operations center direction and control.

  9. Automated Determination of Magnitude and Source Length of Large Earthquakes

    Science.gov (United States)

    Wang, D.; Kawakatsu, H.; Zhuang, J.; Mori, J. J.; Maeda, T.; Tsuruoka, H.; Zhao, X.

    2017-12-01

Rapid determination of earthquake magnitude is important for estimating shaking damage and tsunami hazards. However, due to the complexity of the source process, accurately estimating the magnitude of great earthquakes within minutes of the origin time is still a challenge. Mw is an accurate measure for large earthquakes, but calculating Mw requires the whole wave train, including P, S, and surface phases, which takes tens of minutes to reach stations at teleseismic distances. To speed up the calculation, methods using the W phase and body waves have been developed for quickly estimating earthquake size. Besides these methods, which involve Green's functions and inversions, there are other approaches that use empirically derived relations to estimate earthquake magnitudes, usually for large earthquakes. Their simple implementation and straightforward calculation have made these approaches widely used at institutions such as the Pacific Tsunami Warning Center, the Japan Meteorological Agency, and the USGS. Here we developed an approach, originating from Hara [2007], that estimates magnitude by considering P-wave displacement and source duration. We instead introduced a back-projection technique [Wang et al., 2016] to estimate source duration using array data from a high-sensitivity seismograph network (Hi-net). The introduction of back-projection improves the method in two ways. First, the source duration can be accurately determined by the seismic array. Second, the results can be calculated more rapidly, and data from more distant stations are not required. We propose to develop an automated system for determining fast and reliable source information for large shallow seismic events based on real-time data from a dense regional array and global data, for earthquakes that occur at distances of roughly 30°-85° from the array center. This system can offer fast and robust estimates of the magnitudes and rupture extents of large earthquakes in 6 to 13 min (plus
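A Hara [2007] style estimate combines maximum P-wave displacement, source duration, and epicentral distance in a log-linear empirical relation. A sketch of the functional form only; the coefficients below are illustrative placeholders, not the published regression values:

```python
import math

def magnitude_from_p_and_duration(p_disp_m, duration_s, distance_deg,
                                  a=0.8, b=0.8, c=0.7, d=6.5):
    """Empirical magnitude from maximum P-wave displacement (m),
    source duration (s), and epicentral distance (degrees), in the
    log-linear form of Hara [2007]. Coefficients are hypothetical."""
    return (a * math.log10(p_disp_m)
            + b * math.log10(duration_s)
            + c * math.log10(distance_deg)
            + d)
```

In the approach described above, the duration term would come from the back-projection result rather than from a single-station measurement.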

  10. Earthquake early warning system using real-time signal processing

    Energy Technology Data Exchange (ETDEWEB)

    Leach, R.R. Jr.; Dowla, F.U.

    1996-02-01

An earthquake warning system has been developed to provide a time series profile from which vital parameters, such as the time until strong shaking begins, the intensity of the shaking, and the duration of the shaking, can be derived. Interaction of different types of ground motion and changes in the elastic properties of geological media throughout the propagation path result in a highly nonlinear function. We use neural networks to model these nonlinearities and develop learning techniques for the analysis of temporal precursors occurring in the emerging earthquake seismic signal. The warning system is designed to analyze the first arrival from the three components of an earthquake signal and instantaneously provide a profile of impending ground motion, in as little as 0.3 sec after first ground motion is felt at the sensors. For each new data sample, at a rate of 25 samples per second, the complete profile of the earthquake is updated. The profile consists of a magnitude-related estimate as well as an estimate of the envelope of the complete earthquake signal. The envelope provides estimates of damage parameters, such as time until peak ground acceleration (PGA) and duration. The neural network based system is trained using seismogram data from more than 400 earthquakes recorded in southern California. The system has been implemented in hardware using silicon accelerometers and a standard microprocessor. The proposed warning units can be used for site-specific applications, distributed networks, or to enhance existing distributed networks. By producing accurate and informative warnings, the system has the potential to significantly minimize the hazards of catastrophic ground motion. Detailed system design and performance issues, including error measurement in a simple warning scenario, are discussed.

  11. Direct and indirect evidence for earthquakes; an example from the Lake Tahoe Basin, California-Nevada

    Science.gov (United States)

    Maloney, J. M.; Noble, P. J.; Driscoll, N. W.; Kent, G.; Schmauder, G. C.

    2012-12-01

High-resolution seismic CHIRP data can image direct evidence of earthquakes (i.e., offset strata) beneath lakes and the ocean. Nevertheless, direct evidence often is not imaged due to conditions such as gas in the sediments or steep basement topography. In these cases, indirect evidence for earthquakes (i.e., debris flows) may provide insight into the paleoseismic record. The four sub-basins of the tectonically active Lake Tahoe Basin provide an ideal opportunity to image direct evidence for earthquake deformation and compare it to indirect earthquake proxies. We present results from high-resolution seismic CHIRP surveys in Emerald Bay, Fallen Leaf Lake, and Cascade Lake to constrain the recurrence interval on the West Tahoe-Dollar Point Fault (WTDPF), which was previously identified as potentially the most hazardous fault in the Lake Tahoe Basin. Recently collected CHIRP profiles beneath Fallen Leaf Lake image slide deposits that appear synchronous with slides in other sub-basins. The temporal correlation of slides between multiple basins suggests triggering by events on the WTDPF. If correct, we postulate a recurrence interval for the WTDPF of ~3-4 k.y., indicating that the WTDPF is near the end of its seismic cycle. In addition, CHIRP data beneath Cascade Lake image strands of the WTDPF that offset the lakefloor by as much as ~7 m. The Cascade Lake data combined with onshore LiDAR allowed us to map the geometry of the WTDPF continuously across the southern Lake Tahoe Basin and yielded an improved geohazard assessment.

  12. Reevaluation of the macroseismic effects of the 1887 Sonora, Mexico earthquake and its magnitude estimation

    Science.gov (United States)

    Suárez, Gerardo; Hough, Susan E.

    2008-01-01

The Sonora, Mexico, earthquake of 3 May 1887 occurred a few years before the start of the instrumental era in seismology. We revisit all available accounts of the earthquake and assign Modified Mercalli Intensities (MMI), interpreting and analyzing macroseismic information using the best available modern methods. We find that earlier intensity assignments for this important earthquake were unjustifiably high in many cases. High intensity values were assigned based on accounts of rock falls, soil failure, or changes in the water table, which are now known to be very poor indicators of shaking severity and intensity. Nonetheless, reliable accounts reveal that light damage (intensity VI) occurred at distances of up to ~200 km in both Mexico and the United States. The resulting set of 98 reevaluated intensity values is used to draw an isoseismal map of this event. Using the attenuation relation proposed by Bakun (2006b), we estimate an optimal moment magnitude of Mw 7.6. Assuming this magnitude is correct, a conclusion supported independently by documented rupture parameters assuming standard scaling relations, our results support the conclusion that northern Sonora as well as the Basin and Range province are characterized by lower attenuation of intensities than California. However, this appears to be at odds with recent results indicating that Lg attenuation in the Basin and Range province is comparable to that in California.
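Magnitude estimation from reevaluated intensities follows a Bakun-style grid search: predict MMI from a magnitude-distance attenuation relation and pick the magnitude minimizing the misfit to the observations. A sketch with placeholder coefficients, not the actual Bakun (2006) values:

```python
import math

def estimate_magnitude(obs, a=-1.7, b=1.8, c=2.4):
    """Grid-search the magnitude that best fits MMI observations.
    obs: list of (mmi, epicentral_distance_km) pairs. The attenuation
    relation MMI = a + b*M - c*log10(dist) uses hypothetical
    coefficients for illustration only."""
    def misfit(m):
        return sum((mmi - (a + b * m - c * math.log10(dist))) ** 2
                   for mmi, dist in obs)
    mags = [5.0 + 0.01 * i for i in range(401)]  # search M 5.0-9.0
    return min(mags, key=misfit)
```

In practice each intensity point also yields a distance-dependent weight, and the misfit curve itself provides the magnitude uncertainty.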

  13. Observing earthquakes triggered in the near field by dynamic deformations

    Science.gov (United States)

    Gomberg, J.; Bodin, P.; Reasenberg, P.A.

    2003-01-01

We examine the hypothesis that dynamic deformations associated with seismic waves trigger earthquakes in many tectonic environments. Our analysis focuses on seismicity at close range (within the aftershock zone), complementing published studies of long-range triggering. Our results suggest that dynamic triggering is not confined to remote distances or to geothermal and volcanic regions. Long unilaterally propagating ruptures may focus radiated dynamic deformations in the propagation direction. Therefore, we expect seismicity triggered dynamically by a directive rupture to occur asymmetrically, with a majority of triggered earthquakes in the direction of rupture propagation. Bilaterally propagating ruptures also may be directive, and we propose simple criteria for assessing their directivity. We compare the inferred rupture direction and observed seismicity rate change following 15 earthquakes (M 5.7 to M 8.1) that occurred in California and Idaho in the United States, the Gulf of Aqaba, Syria, Guatemala, China, New Guinea, Turkey, Japan, Mexico, and Antarctica. Nine of these mainshocks had clearly directive, unilateral ruptures. Of these nine, seven apparently induced an asymmetric increase in seismicity rate that correlates with the rupture direction. The two exceptions include an earthquake preceded by a comparable-magnitude event on a conjugate fault and another for which data limitations prohibited conclusive results. Similar (but weaker) correlations were found for the bilaterally rupturing earthquakes we studied. Although the static stress change also may trigger seismicity, it and the seismicity it triggers are expected to be similarly asymmetric only if the final slip is skewed toward the rupture terminus. For several of the directive earthquakes, we suggest that the seismicity rate change correlates better with the dynamic stress field than the static stress change.

  14. Hazus® estimated annualized earthquake losses for the United States

    Science.gov (United States)

    Jaiswal, Kishor; Bausch, Doug; Rozelle, Jesse; Holub, John; McGowan, Sean

    2017-01-01

Large earthquakes can cause social and economic disruption that is unprecedented for a given community, and full recovery from these impacts may not always be achievable. In the United States (U.S.), the 1994 M6.7 Northridge earthquake in California remains the third costliest disaster in U.S. history, and it was one of the most expensive disasters for the federal government. Internationally, earthquakes in the last decade alone have claimed tens of thousands of lives and caused hundreds of billions of dollars of economic impact throughout the globe (~90 billion U.S. dollars (USD) from the 2008 M7.9 Wenchuan, China earthquake; ~20 billion USD from the 2010 M8.8 Maule earthquake in Chile; ~220 billion USD from the 2011 M9.0 Tohoku, Japan earthquake; ~25 billion USD from the 2011 M6.3 Christchurch, New Zealand earthquake; and ~22 billion USD from the 2016 M7.0 Kumamoto, Japan earthquake). Recent earthquakes show a pattern of steadily increasing damages and losses that are primarily due to three key factors: (1) significant growth in earthquake-prone urban areas, (2) vulnerability of the older building stock, including poorly engineered non-ductile concrete buildings, and (3) an increased interdependency in terms of supply and demand for the businesses that operate among different parts of the world. In the United States, earthquake risk continues to grow with increased exposure of population and development, even though the earthquake hazard has remained relatively stable except for the regions of induced seismic activity. Understanding the seismic hazard requires studying earthquake characteristics and the locales in which they occur, while understanding the risk requires an assessment of the potential damage from earthquake shaking to the built environment and to the welfare of people, especially in high-risk areas. Estimating the varying degree of earthquake risk throughout the United States is critical for informed decision-making on mitigation policies, priorities, strategies, and funding levels in the
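The annualized-loss metric at the heart of this kind of Hazus reporting is, in essence, an expected value: each earthquake scenario's loss weighted by its annual rate of occurrence, often normalized by building exposure for comparison across regions. A simplified sketch (actual Hazus runs aggregate over census tracts, building types, and hazard curves):

```python
def annualized_loss(scenarios):
    """Annualized earthquake loss (USD/year) from a list of
    (annual_rate, loss_usd) scenario pairs for one region."""
    return sum(rate * loss for rate, loss in scenarios)

def ael_ratio(scenarios, exposure_usd):
    """Annualized loss ratio: expected annual loss per dollar of
    building exposure, useful for comparing regions of different size."""
    return annualized_loss(scenarios) / exposure_usd
```

For example, a 1-in-100-year scenario costing 1 billion USD and a 1-in-1000-year scenario costing 20 billion USD together contribute an annualized loss of 30 million USD.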

  15. Evaluation of Routine HIV Opt-Out Screening and Continuum of Care Services Following Entry into Eight Prison Reception Centers--California, 2012.

    Science.gov (United States)

    Lucas, Kimberley D; Eckert, Valorie; Behrends, Czarina N; Wheeler, Charlotte; MacGowan, Robin J; Mohle-Boetani, Janet C

    2016-02-26

    Early diagnosis of human immunodeficiency virus (HIV) infection and initiation of antiretroviral treatment (ART) improves health outcomes and prevents HIV transmission. Before 2010, HIV testing was available to inmates in the California state prison system upon request. In 2010, the California Correctional Health Care Services (CCHCS) integrated HIV opt-out screening into the health assessment for inmates entering California state prisons. Under this system, a medical care provider informs the inmate that an HIV test is routinely done, along with screening for sexually transmitted, communicable, and vaccine-preventable diseases, unless the inmate specifically declines the test. During 2012-2013, CCHCS, the California Department of Public Health, and CDC evaluated HIV screening, rates of new diagnoses, linkage to and retention in care, ART response, and post-release linkage to care among California prison inmates. All prison inmates are processed through one of eight specialized reception center facilities, where they undergo a comprehensive evaluation of their medical needs, mental health, and custody requirements for placement in one of 35 state prisons. Among 17,436 inmates who entered a reception center during April-September 2012, 77% were screened for HIV infection; 135 (1%) tested positive, including 10 (0.1%) with newly diagnosed infections. Among the 135 HIV-positive patient-inmates, 134 (99%) were linked to care within 90 days of diagnosis, including 122 (91%) who initiated ART. Among 83 who initiated ART and remained incarcerated through July 2013, 81 (98%) continued ART; 71 (88%) achieved viral suppression (care within 30 days of release were virally suppressed at that time. Only one of nine persons with a viral load test conducted between 91 days and 1 year post-release had viral suppression. Although high rates of viral suppression were achieved in prison, continuity of care in the community remains a challenge. An infrastructure for post

  16. Analysis of source spectra, attenuation, and site effects from central and eastern United States earthquakes

    International Nuclear Information System (INIS)

    Lindley, G.

    1998-02-01

    This report describes the results of three studies of source spectra, attenuation, and site effects of central and eastern United States earthquakes. In the first study, source parameter estimates taken from 27 previous studies were combined to test the assumption that earthquake stress drop is roughly constant, independent of earthquake size. In total, 200 estimates of stress drop and seismic moment from eastern North American earthquakes were combined. It was found that the estimated stress drop from the 27 studies increases approximately as the square root of the seismic moment, from about 3 bars at 10^20 dyne-cm to 690 bars at 10^25 dyne-cm. These results do not support the assumption of a constant stress drop when estimating ground motion parameters from eastern North American earthquakes. In the second study, broadband seismograms recorded by the United States National Seismograph Network and cooperating stations have been analysed to determine QLg as a function of frequency in five regions: the northeastern US, southeastern US, central US, northern Basin and Range, and California and western Nevada. In the third study, using spectral analysis, estimates have been made of the anelastic attenuation of four regional phases and of the source parameters of 27 earthquakes, including the mb 5.6 West Texas earthquake of 14 April 1995
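The reported endpoints imply a simple power-law fit. A minimal sketch (Python; the two endpoint values are taken from the abstract, everything else is illustrative) recovers the near-square-root exponent:

```python
import math

# Reported endpoints: ~3 bars at 1e20 dyne-cm, ~690 bars at 1e25 dyne-cm.
M0_LOW, STRESS_LOW = 1e20, 3.0       # dyne-cm, bars
M0_HIGH, STRESS_HIGH = 1e25, 690.0

def scaling_exponent(m0_a, s_a, m0_b, s_b):
    """Slope of log10(stress drop) vs. log10(seismic moment)."""
    return (math.log10(s_b) - math.log10(s_a)) / (math.log10(m0_b) - math.log10(m0_a))

def stress_drop(m0, exponent):
    """Power-law stress drop (bars) anchored at the low-moment endpoint."""
    return STRESS_LOW * (m0 / M0_LOW) ** exponent

b = scaling_exponent(M0_LOW, STRESS_LOW, M0_HIGH, STRESS_HIGH)
print(f"exponent ~ {b:.3f}")   # close to 0.5, i.e. square-root scaling
print(f"stress drop at 1e23 dyne-cm ~ {stress_drop(1e23, b):.0f} bars")
```

An exponent near 0.47 rather than 0 is exactly why the constant-stress-drop assumption fails for these data.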

  17. A preliminary assessment of earthquake ground shaking hazard at Yucca Mountain, Nevada and implications to the Las Vegas region

    International Nuclear Information System (INIS)

    Wong, I.G.; Green, R.K.; Sun, J.I.; Pezzopane, S.K.; Abrahamson, N.A.; Quittmeyer, R.C.

    1996-01-01

    As part of early design studies for the potential Yucca Mountain nuclear waste repository, the authors have performed a preliminary probabilistic seismic hazard analysis of ground shaking. A total of 88 Quaternary faults within 100 km of the site were considered in the hazard analysis. They were characterized in terms of their probability of being seismogenic, and their geometry, maximum earthquake magnitude, recurrence model, and slip rate. Individual faults were characterized by maximum earthquakes that ranged from moment magnitude (Mw) 5.1 to 7.6. Fault slip rates ranged from a very low 0.00001 mm/yr to as much as 4 mm/yr. An areal source zone representing background earthquakes up to Mw 6 1/4 was also included in the analysis. Recurrence for these background events was based on the 1904-1994 historical record, which contains events up to Mw 5.6. Based on this analysis, the peak horizontal rock accelerations are 0.16, 0.21, 0.28, and 0.50 g for return periods of 500, 1,000, 2,000, and 10,000 years, respectively. In general, the dominant contributor to the ground shaking hazard at Yucca Mountain is background earthquakes, because of the low slip rates of the Basin and Range faults. A significant effect on the probabilistic ground motions is due to the inclusion of a new attenuation relation developed specifically for earthquakes in extensional tectonic regimes. This relation gives significantly lower peak accelerations than five other predominantly California-based relations used in the analysis, possibly due to the lower stress drops of extensional earthquakes compared to California events. Because Las Vegas is located within the same tectonic regime as Yucca Mountain, the seismic sources and path and site factors affecting the seismic hazard at Yucca Mountain also have implications for Las Vegas. These implications are discussed in this paper.
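The quoted return periods can be restated as exceedance probabilities over an exposure time using the standard Poisson assumption of probabilistic seismic hazard analysis. A short sketch, with the peak accelerations taken from the abstract and a 50-year exposure chosen for illustration:

```python
import math

# Peak horizontal rock accelerations (g) vs. return period (years), from the abstract.
hazard = {500: 0.16, 1000: 0.21, 2000: 0.28, 10000: 0.50}

def poisson_exceedance(return_period_yr, exposure_yr):
    """Probability of at least one exceedance in `exposure_yr`, assuming a
    Poisson process with annual rate 1/return_period_yr."""
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

for T, pga in sorted(hazard.items()):
    p50 = poisson_exceedance(T, 50.0)
    print(f"PGA {pga:.2f} g: {p50:.1%} chance of exceedance in 50 years (T = {T} yr)")
```

The familiar "10% in 50 years" design level corresponds to roughly the 500-year return period in this formulation.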

  18. Simulation of scenario earthquake influenced field by using GIS

    Science.gov (United States)

    Zuo, Hui-Qiang; Xie, Li-Li; Borcherdt, R. D.

    1999-07-01

    The method for estimating site effects on ground motion proposed by Borcherdt (1994a, 1994b) is briefly introduced. This method, together with detailed geological and site-classification data for the San Francisco Bay area of California, United States, is applied to simulate the influence field of a scenario earthquake using GIS technology, and software for the simulation has been developed. The paper is a partial result of a cooperative research project between the China Seismological Bureau and the US Geological Survey.
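Borcherdt's method scales site amplification with near-surface shear-wave velocity. The sketch below uses a power-law form F = (v_ref/Vs30)^m; the reference velocity of 1050 m/s and the exponent 0.65 are placeholders, since the actual coefficients in Borcherdt (1994a, b) vary with period band and input ground-motion level:

```python
def site_amplification(vs30_m_s, v_ref=1050.0, m=0.65):
    """Borcherdt-style amplification factor F = (v_ref / Vs30)**m.
    v_ref and m are illustrative defaults, not the published coefficients."""
    return (v_ref / vs30_m_s) ** m

# Softer sites (lower Vs30) amplify ground motion more than firm-rock sites.
for vs30 in (1050.0, 560.0, 270.0, 150.0):
    print(f"Vs30 = {vs30:5.0f} m/s -> F = {site_amplification(vs30):.2f}")
```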

  19. Predicting the impact of tsunami in California under rising sea level

    Science.gov (United States)

    Dura, T.; Garner, A. J.; Weiss, R.; Kopp, R. E.; Horton, B.

    2017-12-01

    The flood hazard for the California coast depends not only on the magnitude, location, and rupture length of Alaska-Aleutian subduction zone earthquakes and their resultant tsunamis, but also on rising sea levels, which combine with tsunamis to produce overall flood levels. The magnitude of future sea-level rise remains uncertain even on the decadal scale, with future sea-level projections becoming even more uncertain at timeframes of a century or more. Earthquake statistics indicate that timeframes of ten thousand to one hundred thousand years are needed to capture rare, very large earthquakes. Because of the different timescales between reliable sea-level projections and earthquake distributions, simply combining the different probabilities in the context of a tsunami hazard assessment may be flawed. Here, we considered 15 earthquake magnitudes between Mw 8 and Mw 9.4 on the Alaska-Aleutian subduction zone, bounded by 171°W and 140°W. We employed 24 realizations at each magnitude with random epicenter locations and different fault length-to-width ratios, and simulated the tsunami evolution from these 360 earthquakes at each decade from the years 2000 to 2200. These simulations were then carried out for different sea-level-rise projections to analyze the future flood hazard for California. Looking at the flood levels at tide gauges, we found that the flood level simulated at, for example, the year 2100 (including the respective sea-level change) is different from the flood level calculated by adding the flood for the year 2000 to the sea-level change prediction for the year 2100. This holds for all sea-level rise scenarios, and the difference in flood levels ranges between 5% and 12% for the larger half of the given magnitude interval. Focusing on flood levels at the tide gauge in the Port of Los Angeles, the most probable flood level (including all earthquake magnitudes) in the year 2000 was 5 cm. 
Depending on the sea-level predictions, in the year 2050 the most probable
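The non-additivity of tsunami flooding and sea-level rise can be illustrated with a toy shoaling model (purely illustrative, not the model used in the study): because tsunami amplification depends nonlinearly on water depth, flood(sea level + rise) differs from flood(sea level) + rise.

```python
def flood_level(tsunami_amp, sea_level, shoaling_exp=0.25):
    """Toy flood model: tsunami amplitude grows as nearshore depth shrinks
    (a Green's-law-like dependence), so sea level and tsunami amplitude do
    not combine linearly. All numbers below are assumed for illustration."""
    depth = 10.0 - sea_level                    # nominal nearshore depth (m)
    amp = tsunami_amp * (10.0 / depth) ** shoaling_exp
    return sea_level + amp

slr = 1.0                                        # assumed 1 m of sea-level rise
combined = flood_level(2.0, slr)                 # tsunami riding on the raised sea level
additive = flood_level(2.0, 0.0) + slr           # baseline flood plus the rise
print(f"combined = {combined:.3f} m, additive = {additive:.3f} m")
```

Even this caricature yields combined > additive, the direction of the effect the study quantifies.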

  20. Cloud-based systems for monitoring earthquakes and other environmental quantities

    Science.gov (United States)

    Clayton, R. W.; Olson, M.; Liu, A.; Chandy, M.; Bunn, J.; Guy, R.

    2013-12-01

    There are many advantages to using a cloud-based system to record and analyze environmental quantities such as earthquakes, radiation, various gases, dust, and meteorological parameters. These advantages include robustness and dynamic scalability, as well as reduced costs. In this paper, we present our experiences over the last three years in developing a cloud-based earthquake monitoring system (the Community Seismic Network). This network consists of over 600 sensors (accelerometers) in the Southern California region that send data directly to the Google App Engine, where they are analyzed. The system is capable of handling many other types of sensor data and generating a situation-awareness analysis as a product. Other advantages of the cloud-based system are integration with other peer networks and the ability to deploy anywhere in the world without having to build additional computing infrastructure.
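Event detection in networks of low-cost accelerometers commonly uses a short-term/long-term average (STA/LTA) trigger. A minimal pure-Python sketch; the window lengths and threshold behavior are illustrative, not the Community Seismic Network's actual settings:

```python
def sta_lta(samples, sta_n=5, lta_n=50):
    """Short-term / long-term average ratio at the last sample.
    Returns 0.0 until enough samples have arrived for the long window."""
    if len(samples) < lta_n:
        return 0.0
    sta = sum(abs(s) for s in samples[-sta_n:]) / sta_n
    lta = sum(abs(s) for s in samples[-lta_n:]) / lta_n
    return sta / lta if lta > 0 else 0.0

# Quiet background followed by a sudden strong arrival trips the trigger.
trace = [0.01] * 60 + [0.5] * 5
print(f"trigger ratio = {sta_lta(trace):.1f}")   # well above a typical threshold of ~3
```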

  1. California Integrated Service Delivery Evaluation Report. Phase I

    Science.gov (United States)

    Moore, Richard W.; Rossy, Gerard; Roberts, William; Chapman, Kenneth; Sanchez, Urte; Hanley, Chris

    2010-01-01

    This study is a formative evaluation of the OneStop Career Center Integrated Service Delivery (ISD) Model within the California Workforce System. The study was sponsored by the California Workforce Investment Board. The study completed four in-depth case studies of California OneStops to describe how they implemented the ISD model which brings…

  2. University of Southern California

    Data.gov (United States)

    Federal Laboratory Consortium — The focus of the University of Southern California (USC) Children's Environmental Health Center is to develop a better understanding of how host susceptibility and...

  3. Caltech/USGS Southern California Seismic Network: Recent Developments

    Science.gov (United States)

    Bhadha, R.; Chen, S.; Crummey, J.; Hauksson, E.; Solanki, K.; Thomas, V. I.; Watkins, M.; Yip, R.; Yu, E.; Given, D.; Peats, R.; Schwarz, S.

    2010-12-01

    The SCSN is the modern digital ground motion seismic network in Southern California and performs the following tasks: 1) Operates remote seismic stations and the central data processing systems in Pasadena; 2) Generates and reports real-time products including location, magnitude, ShakeMap, and others; 3) Responds to FEMA, CalEMA, media, and public inquiries about earthquakes; 4) Manages the production, archival, and distribution of waveforms, phase picks, and other data at the SCEDC; 5) Contributes to development and maintenance of the ANSS Quake Monitoring System (AQMS) software to add new features and improve robustness; 6) Supports the deployment of AQMS to other ANSS member regional seismic networks. The public regularly accesses the CISN, SCSN, and SCEDC web pages for up-to-date quake info, and more than 230,000 users subscribe to the Electronic Notification System (ENS), which sends rapid notifications via email and cell phones. We distribute our products via Internet (EIDS), email, and paging, to USGS in Reston and Golden, FEMA, CalEMA, local governments, partner members, and other subscribers. We have developed CISN Display and provide ShakeCast for customers who require real-time earthquake information. The SCSN also exchanges waveform, phase pick, and amplitude data in real-time with several other partner networks, including Menlo Park, UCB, UNR, the Anza network, the Tsunami Warning Centers, IRIS, and the NEIC. We operate a number of 24/7 on-call rotations to provide quick response to verify seismic events as well as to address systems and telemetry issues. As part of our goals to improve quality, robustness, and coverage, some of our recent efforts include: 1) Converting the digital stations in the network to Q330 dataloggers; 2) Developing command and control capabilities such as automated mass re-centering; 3) Migration from serial to Ethernet communications; 4) Clustering of data acquisition servers for fail-over to improve data availability; 5) Use of

  4. How fault evolution changes strain partitioning and fault slip rates in Southern California: Results from geodynamic modeling

    Science.gov (United States)

    Ye, Jiyang; Liu, Mian

    2017-08-01

    In Southern California, the Pacific-North America relative plate motion is accommodated by the complex southern San Andreas Fault system, which includes many young faults. How these faults evolve, and how they affect strain partitioning and fault slip rates, are important questions for understanding the evolution of this plate boundary zone and assessing earthquake hazard in Southern California. Using a three-dimensional viscoelastoplastic finite element model, we have investigated how this plate boundary fault system has evolved to accommodate the relative plate motion in Southern California. Our results show that when the plate boundary faults are not optimally configured to accommodate the relative plate motion, strain is localized in places where new faults would initiate to improve the mechanical efficiency of the fault system. In particular, the Eastern California Shear Zone, the San Jacinto Fault, the Elsinore Fault, and the offshore dextral faults all developed in places of highly localized strain. These younger faults compensate for the reduced fault slip on the San Andreas Fault proper because of the Big Bend, a major restraining bend. The evolution of the fault system changes the apportionment of fault slip rates over time, which may explain some of the slip rate discrepancy between geological and geodetic measurements in Southern California. For the present fault configuration, our model predicts localized strain in the western Transverse Ranges and along the dextral faults across the Mojave Desert, where numerous damaging earthquakes occurred in recent years.

  5. Seismicity and crustal structure at the Mendocino triple junction, Northern California

    Energy Technology Data Exchange (ETDEWEB)

    Dicke, M.

    1998-12-01

    A high level of seismicity at the Mendocino triple junction in Northern California reflects the complex active tectonics associated with the junction of the Pacific, North America, and Gorda plates. To investigate seismicity patterns and crustal structure, 6193 earthquakes recorded by the Northern California Seismic Network (NCSN) are relocated using a one-dimensional crustal velocity model. A near-vertical truncation of the intense seismic activity offshore of Cape Mendocino follows the strike of the Mattole Canyon fault and is interpreted to define the Pacific plate boundary. Seismicity along this boundary displays a double seismogenic layer that is attributed to interplate activity with the North America plate and Gorda plate. The interpretation of the shallow seismogenic zone as the North America - Pacific plate boundary implies that the Mendocino triple junction is situated offshore at present. Seismicity patterns and focal mechanisms for events located within the subducting Gorda plate are consistent with internal deformation on NE-SW and NW-SE trending rupture planes in response to north-south compression. Seismic sections indicate that the top of the Gorda plate lies at a depth of about 18 km beneath Cape Mendocino and dips gently east- and southward. Earthquakes located in the Wadati-Benioff zone east of 236°E show a change to an extensional stress regime indicative of a slab pull force. This slab pull force and scattered seismicity within the contractional forearc region of the Cascadia subduction zone suggest that the subducting Gorda plate and the overriding North America plate are strongly coupled. The 1992 Cape Mendocino thrust earthquake is believed to have ruptured a blind thrust fault in the forearc region, suggesting that strain is accumulating that must ultimately be released in a potential M 8+ subduction earthquake.

  6. Prospective testing of Coulomb short-term earthquake forecasts

    Science.gov (United States)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily and allows daily updates of the models. However, a lot can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be computed automatically, without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of
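CSEP's likelihood-based evaluations compare forecast rates against observed counts under a Poisson assumption. A minimal sketch of the joint log-likelihood at the heart of such tests; the bin values below are invented for illustration:

```python
import math

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint Poisson log-likelihood of observed earthquake counts given a
    forecast of expected counts per space-magnitude bin (the statistic
    underlying CSEP-style likelihood tests)."""
    ll = 0.0
    for rate, n in zip(forecast_rates, observed_counts):
        # log of the Poisson pmf: -rate + n*log(rate) - log(n!)
        ll += -rate + n * math.log(rate) - math.lgamma(n + 1)
    return ll

# Toy forecast over four bins vs. hypothetical observed counts.
forecast = [0.5, 1.2, 0.1, 2.0]
observed = [1, 1, 0, 3]
print(f"log-likelihood = {poisson_log_likelihood(forecast, observed):.3f}")
```

A candidate model beats a reference model when its joint log-likelihood for the same observations is significantly higher.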

  7. Filling a gap: Public talks about earthquake preparation and the 'Big One'

    Science.gov (United States)

    Reinen, L. A.

    2013-12-01

    Residents of southern California are aware they live in a seismically active area, and earthquake drills have trained us to Duck-Cover-Hold On. While many of my acquaintances are familiar with what to do during an earthquake, few have made preparations for living with the aftermath of a large earthquake. The ShakeOut Scenario (Jones et al., USGS Open File Report 2008-1150) describes the physical, social, and economic consequences of a plausible M7.8 earthquake on the southernmost San Andreas Fault. While not detailing an actual event, the ShakeOut Scenario illustrates how individual and community preparation may mitigate the potential after-effects of a major earthquake in the region. To address the gap between earthquake drills and preparation in my community, for the past several years I have been giving public talks to promote understanding of: the science behind the earthquake predictions; why individual, as well as community, preparation is important; and ways in which individuals can prepare their home and work environments. The public presentations occur in an array of venues, including elementary school and college classes, a community forum linked with the annual ShakeOut Drill, and local businesses including the local microbrewery. While based on the same fundamental information, each presentation is modified for audience and setting. Assessment of the impact of these talks is primarily anecdotal and includes an increase in the number of venues requesting these talks, repeat invitations, and comments from audience members (sometimes months or years after a talk). I will present elements of these talks, the background information used, and examples of how they have effected change in the earthquake preparedness of audience members. Discussion and suggestions (particularly about effective means of conducting rigorous long-term assessment) are strongly encouraged.

  8. Triggered surface slips in the Coachella Valley area associated with the 1992 Joshua Tree and Landers, California, Earthquakes

    Science.gov (United States)

    Rymer, M.J.

    2000-01-01

    The Coachella Valley area was strongly shaken by the 1992 Joshua Tree (23 April) and Landers (28 June) earthquakes, and both events caused triggered slip on active faults within the area. Triggered slip associated with the Joshua Tree earthquake was on a newly recognized fault, the East Wide Canyon fault, near the southwestern edge of the Little San Bernardino Mountains. Slip associated with the Landers earthquake formed along the San Andreas fault in the southeastern Coachella Valley. Surface fractures formed along the East Wide Canyon fault in association with the Joshua Tree earthquake. The fractures extended discontinuously over a 1.5-km stretch of the fault, near its southern end. Sense of slip was consistently right-oblique, west side down, similar to the long-term style of faulting. Measured offset values were small, with right-lateral and vertical components of slip ranging from 1 to 6 mm and 1 to 4 mm, respectively. This is the first documented historic slip on the East Wide Canyon fault, which was first mapped only months before the Joshua Tree earthquake. Surface slip associated with the Joshua Tree earthquake most likely developed as triggered slip, given its 5 km distance from the Joshua Tree epicenter and aftershocks. As revealed in a trench investigation, slip formed in an area with only a thin sedimentary cover. A paleoseismic trench study in an area of 1992 surface slip revealed evidence of two and possibly three surface faulting events on the East Wide Canyon fault during the late Quaternary, probably latest Pleistocene (first event) and mid- to late Holocene (second two events). About two months after the Joshua Tree earthquake, the Landers earthquake then triggered slip on many faults, including the San Andreas fault in the southeastern Coachella Valley. Surface fractures associated with this event formed discontinuous breaks over a 54-km-long stretch of the fault, from the Indio Hills southeastward to Durmid Hill. Sense of slip was right

  9. Rapid Large Earthquake and Run-up Characterization in Quasi Real Time

    Science.gov (United States)

    Bravo, F. J.; Riquelme, S.; Koch, P.; Cararo, S.

    2017-12-01

    Several tests in quasi real time have been conducted by the rapid response group at the CSN (National Seismological Center) to characterize earthquakes in real time. These methods are known for their robustness and reliability in creating finite fault models (FFMs). The W-phase FFM inversion, the wavelet-domain FFM, and the body-wave FFM have been implemented in real time at the CSN; all of these algorithms run automatically, triggered by the W-phase point-source inversion. Dimensions (length and width) are predefined by adopting scaling laws for earthquakes in subduction zones. We tested this scheme on the last four major earthquakes that occurred in Chile: the 2010 Mw 8.8 Maule earthquake, the 2014 Mw 8.2 Iquique earthquake, the 2015 Mw 8.3 Illapel earthquake, and the Mw 7.6 Melinka earthquake. We obtain many solutions as time elapses, and for each one we calculate the run-up using an analytical formula. Our results are in agreement with FFMs already accepted by the scientific community, as well as with run-up observations in the field.
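The abstract notes that rupture length and width are predefined from scaling laws. A hedged sketch using a generic Wells-and-Coppersmith-style log-linear relation; the coefficients and aspect ratio below are illustrative, not the values used at CSN (subduction-specific relations differ):

```python
def rupture_dimensions(mw, aspect_ratio=2.0):
    """Rough rupture length/width (km) from moment magnitude via a generic
    log-linear scaling log10(L) = a + b*Mw. Coefficients are illustrative."""
    length_km = 10.0 ** (-2.44 + 0.59 * mw)
    width_km = length_km / aspect_ratio     # assumed fixed aspect ratio
    return length_km, width_km

for mw in (8.8, 8.2, 8.3, 7.6):             # the four Chilean events tested
    L, W = rupture_dimensions(mw)
    print(f"Mw {mw}: L ~ {L:.0f} km, W ~ {W:.0f} km")
```

Predefining dimensions this way lets the inversion start as soon as the point-source solution arrives, which is the point of the quasi-real-time scheme.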

  10. Mental Health of Survivors of the 2010 Haitian Earthquake Living in the United States

    Centers for Disease Control (CDC) Podcasts

    2010-04-16

    Thousands of survivors of the 2010 Haitian Earthquake are currently living in the United States. This podcast features a brief non-disease-specific interview with Dr. Marc Safran, CDC's longest serving psychiatrist, about a few of the mental health challenges such survivors may face. Created: 4/16/2010 by CDC. Center of Attribution: Mental and Behavioral Health Team, 2010 CDC Haiti Earthquake Mission, CDC Emergency Operations Center. Date Released: 5/6/2010.

  11. Modeling of earthquake ground motion in the frequency domain

    Science.gov (United States)

    Thrainsson, Hjortur

    In recent years, the utilization of time histories of earthquake ground motion has grown considerably in the design and analysis of civil structures. It is very unlikely, however, that recordings of earthquake ground motion will be available for all sites and conditions of interest. Hence, there is a need for efficient methods for the simulation and spatial interpolation of earthquake ground motion. In addition to providing estimates of the ground motion at a site using data from adjacent recording stations, spatially interpolated ground motions can also be used in design and analysis of long-span structures, such as bridges and pipelines, where differential movement is important. The objective of this research is to develop a methodology for rapid generation of horizontal earthquake ground motion at any site for a given region, based on readily available source, path and site characteristics, or (sparse) recordings. The research includes two main topics: (i) the simulation of earthquake ground motion at a given site, and (ii) the spatial interpolation of earthquake ground motion. In topic (i), models are developed to simulate acceleration time histories using the inverse discrete Fourier transform. The Fourier phase differences, defined as the difference in phase angle between adjacent frequency components, are simulated conditional on the Fourier amplitude. Uniformly processed recordings from recent California earthquakes are used to validate the simulation models, as well as to develop prediction formulas for the model parameters. The models developed in this research provide rapid simulation of earthquake ground motion over a wide range of magnitudes and distances, but they are not intended to replace more robust geophysical models. In topic (ii), a model is developed in which Fourier amplitudes and Fourier phase angles are interpolated separately. A simple dispersion relationship is included in the phase angle interpolation. 
The accuracy of the interpolation
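The phase-difference approach in topic (i) can be sketched as follows: draw random phase differences, accumulate them into phase angles, and inverse-transform a one-sided amplitude spectrum into a real time series. In the actual models the phase differences are conditioned on the Fourier amplitudes; here they are simply uniform, for illustration:

```python
import math, random

def simulate_accelerogram(amps, seed=0):
    """Toy version of phase-difference simulation: phases are a running sum
    of random phase differences, then the Hermitian-symmetric spectrum is
    inverse-DFT'd by direct summation (pure stdlib, small n only)."""
    rng = random.Random(seed)
    n = 2 * len(amps)
    phases, phi = [], 0.0
    for _ in amps:
        phi += rng.uniform(-math.pi, 0.0)   # assumed phase-difference distribution
        phases.append(phi)
    series = []
    for t in range(n):
        x = sum(2.0 * a * math.cos(2 * math.pi * k * t / n + p)
                for k, (a, p) in enumerate(zip(amps, phases), start=1))
        series.append(x / n)
    return series

acc = simulate_accelerogram([1.0, 0.8, 0.5, 0.2])
print([round(v, 3) for v in acc])
```

Conditioning the phase differences on amplitude is what lets the real models reproduce the time-domain envelope of recorded accelerograms; this sketch omits that step.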

  12. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  13. Economic impacts of the SAFRR tsunami scenario in California: Chapter H in The SAFRR (Science Application for Risk Reduction) Tsunami Scenario

    Science.gov (United States)

    Wein, Anne; Rose, Adam; Sue Wing, Ian; Wei, Dan

    2013-01-01

    This study evaluates the hypothetical economic impacts of the SAFRR (Science Application for Risk Reduction) tsunami scenario to the California economy. The SAFRR scenario simulates a tsunami generated by a hypothetical magnitude 9.1 earthquake that occurs offshore of the Alaska Peninsula (Kirby and others, 2013). Economic impacts are measured by the estimated reduction in California’s gross domestic product (GDP), the standard economic measure of the total value of goods and services produced. Economic impacts are derived from the physical damages from the tsunami as described by Porter and others (2013). The principal physical damages that result in disruption of the California economy are (1) about $100 million in damages to the twin Ports of Los Angeles (POLA) and Long Beach (POLB), (2) about $700 million in damages to marinas, and (3) about $2.5 billion in damages to buildings and contents (properties) in the tsunami inundation zone on the California coast. The study of economic impacts does not include the impacts from damages to roads, bridges, railroads, and agricultural production or fires in fuel storage facilities because these damages will be minimal with respect to the California economy. The economic impacts of damage to other California ports are not included in this study because detailed evaluation of the physical damage to these ports was not available in time for this report. The analysis of economic impacts is accomplished in several steps. First, estimates are made for the direct economic impacts that result in immediate business interruption losses in individual sectors of the economy due to physical damage to facilities or to disruption of the flow of production units (commodities necessary for production). Second, the total economic impacts (consisting of both direct and indirect effects) are measured by including the general equilibrium (essentially quantity and price multiplier effects) of lost production in other sectors by ripple
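The direct-versus-total distinction above can be caricatured with a single output multiplier. The sketch sums the damage figures quoted in the abstract and applies an illustrative multiplier of 1.5; the study itself uses a computable general equilibrium model, not a fixed multiplier:

```python
def total_impact(direct_loss, multiplier=1.5):
    """Direct loss scaled by a simple output multiplier to approximate
    direct-plus-indirect (ripple) effects. The 1.5 value is illustrative."""
    return direct_loss * multiplier

# Principal physical damages from the scenario (billions of USD, from the abstract).
damages = {"ports": 0.1, "marinas": 0.7, "buildings": 2.5}
direct = sum(damages.values())
print(f"direct damages ~ ${direct:.1f}B, direct+indirect ~ ${total_impact(direct):.2f}B")
```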

  14. Do I Really Sound Like That? Communicating Earthquake Science Following Significant Earthquakes at the NEIC

    Science.gov (United States)

    Hayes, G. P.; Earle, P. S.; Benz, H.; Wald, D. J.; Yeck, W. L.

    2017-12-01

    The U.S. Geological Survey's National Earthquake Information Center (NEIC) responds to about 160 magnitude 6.0 and larger earthquakes every year and is regularly inundated with information requests following earthquakes that cause significant impact. These requests often start within minutes after the shaking occurs and come from a wide user base including the general public, media, emergency managers, and government officials. Over the past several years, the NEIC's earthquake response has evolved its communications strategy to meet the changing needs of users and the evolving media landscape. The NEIC produces a cascade of products starting with basic hypocentral parameters and culminating with estimates of fatalities and economic loss. We speed the delivery of content by prepositioning and automatically generating products such as aftershock plots, regional tectonic summaries, maps of historical seismicity, and event summary posters. Our goal is to have information immediately available so we can quickly address the response needs of a particular event or sequence. This information is distributed to hundreds of thousands of users through social media, email alerts, programmatic data feeds, and webpages. Many of our products are included in event summary posters that can be downloaded and printed for local display. After significant earthquakes, keeping up with direct inquiries and interview requests from TV, radio, and print media is always challenging. The NEIC works with the USGS Office of Communications and the USGS Science Information Services to organize and respond to these requests. Written executive summary reports are produced and distributed to USGS personnel and collaborators throughout the country. These reports are updated during the response to keep our message consistent and information up to date. This presentation will focus on communications during NEIC's rapid earthquake response but will also touch on the broader USGS traditional and

  15. SEISMIC PICTURE OF A FAULT ZONE. WHAT CAN BE GAINED FROM THE ANALYSIS OF FINE PATTERNS OF SPATIAL DISTRIBUTION OF WEAK EARTHQUAKE CENTERS?

    Directory of Open Access Journals (Sweden)

    Gevorg G. Kocharyan

    2010-01-01

    Association of earthquake hypocenters with fault zones appears more pronounced in cases with more accurately determined positions of the earthquakes. For complex, branched structures of major fault zones, it is assumed that some of the earthquakes occur at feathering fractures of smaller scale. It is thus possible to develop a "seismological" criterion for definition of a zone of dynamic influence of faults, i.e. the zone containing the majority of earthquakes associated with the fault zone under consideration. In this publication, seismogenic structures of several fault zones located in the San Andreas fault system are reviewed. Based on the data from a very dense network of digital seismic stations installed in this region, and with application of modern data processing methods, differential coordinates of microearthquakes can be determined with errors of a few tens of meters. It is thus possible to precisely detect boundaries of the areas wherein active deformation processes occur and to reveal spatial patterns of seismic event localization. In our analyses, data from the most comprehensive seismic catalog were used; the catalogue includes information on events which occurred and were registered in Northern California between January 1984 and May 2003. In this publication, the seismic data processing results and regularities revealed during the analyses are compared with the data obtained from studies of fault structures, modeling, and numerical simulation results. Results of quantitative research on regularities of localization of seismic sources inside fault zones are presented. It is demonstrated by 3D models that seismic events are localized in the vicinity of an almost planar surface with a nearly constant angle of dip, the majority of events being concentrated at that conventional surface. Detection of typical scales of seismicity localization may prove critical for solving problems of technogenic impact on fault zones.
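The near-planar alignment of hypocenters described above can be quantified with a least-squares fit. A sketch (pure stdlib, synthetic data) that estimates apparent dip from hypocenters projected onto a cross-section perpendicular to fault strike:

```python
import math

def fit_dip(profile):
    """Least-squares line depth = slope*distance + intercept through
    (distance, depth) pairs; the slope gives the apparent dip angle."""
    n = len(profile)
    sx = sum(x for x, z in profile); sz = sum(z for x, z in profile)
    sxx = sum(x * x for x, z in profile); sxz = sum(x * z for x, z in profile)
    slope = (n * sxz - sx * sz) / (n * sxx - sx * sx)
    intercept = (sz - slope * sx) / n
    dip_deg = math.degrees(math.atan(abs(slope)))
    return slope, intercept, dip_deg

# Synthetic hypocenters on a plane dipping 60 degrees, with small alternating scatter.
hypos = [(x, math.tan(math.radians(60)) * x + 0.1 * ((-1) ** i))
         for i, x in enumerate(range(10))]
slope, intercept, dip = fit_dip(hypos)
print(f"apparent dip ~ {dip:.1f} degrees")
```

With real relocated catalogs the residuals about such a fitted surface give the localization width the paper discusses.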

  16. Proceedings of the 11th United States-Japan natural resources panel for earthquake research, Napa Valley, California, November 16–18, 2016

    Science.gov (United States)

    Detweiler, Shane; Pollitz, Fred

    2017-10-18

    The UJNR Panel on Earthquake Research promotes advanced research toward a more fundamental understanding of the earthquake process and hazard estimation. The Eleventh Joint meeting was extremely beneficial in furthering cooperation and deepening understanding of problems common to both Japan and the United States. The meeting included productive exchanges of information on approaches to systematic observation and modeling of earthquake processes. Regarding the earthquake and tsunami of March 2011 off the Pacific coast of Tohoku and the 2016 Kumamoto earthquake sequence, the Panel recognizes that further efforts are necessary to achieve our common goal of reducing earthquake risk through close collaboration and focused discussions at the 12th UJNR meeting.

  17. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the 1906 San Francisco and Valparaiso earthquakes has provided the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the 1908 Messina-Reggio Calabria earthquake, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of buildings still standing today that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods now in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Ms = 7.8 Central Chile earthquake of 1985. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. The study identified only three centennial three-story buildings that survived both earthquakes almost undamaged. Since accelerograms for the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. It is therefore recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  18. Quaternary Slip History for the Agua Blanca Fault, northern Baja California, Mexico

    Science.gov (United States)

    Gold, P. O.; Behr, W. M.; Rockwell, T. K.; Fletcher, J. M.

    2017-12-01

    The Agua Blanca Fault (ABF) is the primary structure accommodating San Andreas-related right-lateral slip across the Peninsular Ranges of northern Baja California. Activity on this fault influences offshore faults that parallel the Pacific coast from Ensenada to Los Angeles and poses a potential threat to communities in northern Mexico and southern California. We present a detailed Quaternary slip history for the ABF, including new quantitative constraints on geologic slip rates, slip per event, the timing of the most recent earthquake, and the earthquake recurrence interval. Cosmogenic 10Be exposure dating of clasts from offset fluvial geomorphic surfaces at two sites located along the western, and most active, section of the ABF yields preliminary slip rate estimates of 2-4 mm/yr and 3 mm/yr since 20 ka and 2 ka, respectively. Fault zone geomorphology preserved at the younger site provides evidence for right-lateral surface displacements measuring 2.5 m in the past two ruptures. Luminescence dating of an offset alluvial fan at a third site is in progress but is expected to yield a slip rate relevant to the past 10 kyr. Adjacent to this third site, we excavated two paleoseismic trenches across a sag pond formed by a right step in the fault. Preliminary radiocarbon dates indicate that the four surface ruptures identified in the trenches occurred in the past 6 kyr, although additional dating should clarify earthquake timing and the mid-Holocene to present earthquake recurrence interval, as well as the likely date of the most recent earthquake. Our new slip rate estimates are somewhat lower than, but comparable within error to, previous geologic estimates based on soil morphology and geodetic estimates from GPS, but the new record of surface ruptures exposed in the trenches is the most complete and comprehensively dated earthquake history yet determined for this fault. Together with new and existing mapping of tectonically generated geomorphology along the ABF, our constraints

  19. Estimating annualized earthquake losses for the conterminous United States

    Science.gov (United States)

    Jaiswal, Kishor S.; Bausch, Douglas; Chen, Rui; Bouabid, Jawhar; Seligson, Hope

    2015-01-01

    We make use of the most recent National Seismic Hazard Maps (the 2008 and 2014 cycles), updated census data on population, and economic exposure estimates of the general building stock to quantify annualized earthquake loss (AEL) for the conterminous United States. The AEL analyses were performed using the Federal Emergency Management Agency's (FEMA) Hazus software, which facilitated a systematic comparison of the influence of the 2014 National Seismic Hazard Maps in terms of annualized loss estimates in different parts of the country. The losses from an individual earthquake could easily exceed many tens of billions of dollars, while the long-term average of losses from all earthquakes within the conterminous U.S. has been estimated at a few billion dollars per year. This study estimated nationwide losses to be approximately $4.5 billion per year (in 2012 dollars), roughly 80% of which can be attributed to the states of California, Oregon, and Washington. We document the change in estimated AELs arising solely from the change in the assumed hazard map. The change from the 2008 map to the 2014 map results in a 10 to 20% reduction in AELs for the highly seismic states of the Western United States, whereas the reduction is even more significant for the Central and Eastern United States.
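
A minimal sketch of the annualized-loss idea described above: AEL is the sum, over scenario events, of each event's annual rate times its expected loss. The rates and loss figures below are invented for illustration only; the actual study derives them from Hazus hazard curves and building-inventory damage functions.

```python
# Hypothetical illustration of annualized earthquake loss (AEL):
#   AEL = sum over scenario events of (annual rate) * (expected loss).
# All numbers are invented placeholders, not values from the study.

scenarios = [
    # (annual occurrence rate, expected loss in $ billions)
    (0.01, 50.0),   # rare, large event
    (0.05, 10.0),   # moderate event
    (0.20, 1.0),    # frequent, small event
]

ael = sum(rate * loss for rate, loss in scenarios)
print(f"Annualized loss: ${ael:.2f} billion/yr")  # 0.5 + 0.5 + 0.2 = 1.2
```

The same accounting, run over a full inventory of hazard bins and exposure classes, is what allows comparisons such as the 2008-map versus 2014-map AELs reported above.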

  20. California Ocean Uses Atlas

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset is a result of the California Ocean Uses Atlas Project: a collaboration between NOAA's National Marine Protected Areas Center and Marine Conservation...

  2. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  3. GPS-seismograms reveal amplified shaking in California's San Joaquin Delta region

    Science.gov (United States)

    Johanson, I. A.

    2014-12-01

    On March 10, 2014, the Mw 6.8 Ferndale earthquake occurred off the coast of Northern California, near the Mendocino Triple Junction. Aftershocks suggest a northeast-striking fault plane for this strike-slip earthquake, oriented such that the California coast is roughly perpendicular to the rupture plane. Consequently, large-amplitude Love waves were observed at seismic stations and continuous GPS stations throughout Northern California. While GPS is less sensitive than broadband instruments, GPS station density in Northern California is much higher, potentially providing valuable detail. A total of 269 GPS stations with high-rate (1 sps) data available were used to generate GPS seismograms. These include stations from the Bay Area Regional Deformation (BARD) network, the Plate Boundary Observatory (PBO, operated by UNAVCO), and the USGS, Menlo Park. The Track software package was used to generate relative displacements between pairs of stations, determined using Delaunay triangulation. This network-based approach allows higher precision than absolute positioning because common noise sources, in particular atmospheric noise, cancel out. A simple least-squares network adjustment with a stable-centroid constraint is performed to transform the mesh of relative motions into absolute motions at individual GPS stations. This approach to generating GPS seismograms is validated by the good agreement between time series recorded at 16 BARD stations that are co-located with broadband seismometers of the Berkeley Digital Seismic Network (BDSN). While the distribution of peak dynamic displacements is dominated at long periods by the radiation pattern, at shorter periods other patterns become visible. In particular, stations in the San Joaquin Delta (SJD) region show higher peak dynamic displacements than those in surrounding areas, as well as longer-duration shaking. SJD stations also have higher dynamic displacements on the radial component than surrounding
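
The network-adjustment step described in the abstract can be sketched in a few lines: given observed relative displacements d_ij = x_j - x_i between station pairs, solve for absolute motions x in least squares, with a centroid constraint (sum of x equal to zero) to pin down the free translation. This toy example (four hypothetical stations, invented displacements) is not the Track/BARD processing chain, only the linear-algebra idea behind it.

```python
import numpy as np

# Toy least-squares network adjustment with a stable-centroid constraint.
# Observations: relative displacements d_ij = x_j - x_i between pairs.
n = 4                                        # hypothetical stations 0..3
pairs = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
x_true = np.array([0.0, 2.0, -1.0, -1.0])    # invented displacements (mm)
d = np.array([x_true[j] - x_true[i] for i, j in pairs])

# One row per pair (-1 at i, +1 at j), plus a final row enforcing sum(x)=0.
A = np.zeros((len(pairs) + 1, n))
for row, (i, j) in enumerate(pairs):
    A[row, i], A[row, j] = -1.0, 1.0
A[-1, :] = 1.0                               # centroid constraint
b = np.append(d, 0.0)

x_est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(x_est, 6))                    # recovers x_true (its centroid is 0)
```

Without the constraint row, A is rank-deficient: any common offset added to all stations leaves the pairwise differences unchanged, which is exactly the ambiguity the stable-centroid constraint removes.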

  4. Seismological investigation of earthquakes in the New Madrid Seismic Zone

    International Nuclear Information System (INIS)

    Herrmann, R.B.; Nguyen, B.

    1993-08-01

    Earthquake activity in the New Madrid Seismic Zone has been monitored by regional seismic networks since 1975. During this time period, over 3,700 earthquakes have been located within the region bounded by latitudes 35°–39°N and longitudes 87°–92°W. Most of these earthquakes occur within a 1.5° x 2° zone centered on the Missouri Bootheel. Source parameters of larger earthquakes in the zone and in eastern North America are determined using surface-wave spectral amplitudes and broadband waveforms for the purpose of determining the focal mechanism, source depth, and seismic moment. Waveform modeling of broadband data is shown to be a powerful tool for defining these source parameters when used in combination with regional seismic network data and, in addition, for verifying the correctness of previously published focal-mechanism solutions.

  5. Three-dimensional crustal structure for the Mendocino Triple Junction region from local earthquake travel times

    Energy Technology Data Exchange (ETDEWEB)

    Verdonck, D.; Zandt, G. [Lawrence Livermore National Lab., CA (United States)

    1994-12-10

    The large-scale, three-dimensional geometry of the Mendocino Triple Junction at Cape Mendocino, California, was investigated by inverting nearly 19,000 P-wave arrival times from over 1400 local earthquakes to estimate the three-dimensional velocity structure and hypocentral parameters. A velocity grid 175 km (N-S) by 125 km (E-W) centered near Garberville, California, was constructed with 25 km horizontal and 5 km vertical node spacing. The model was well resolved near Cape Mendocino, where the earthquakes and stations are concentrated. At about 40.6°N latitude a high velocity gradient between 6.5 and 7.5 km/s dips gently to the south and east from about 15 km depth near the coast. Relocated hypocenters concentrate below this high gradient, which the authors interpret as the oceanic crust of the subducted Gorda Plate. Therefore the depth to the top of the Gorda Plate near Cape Mendocino is interpreted to be ~15 km. The Gorda Plate appears intact, dipping ~8° eastward due to subduction and flexing downward 6°–12° to the south. Both hypocenters and velocity structure suggest that the southern edge of the plate intersects the coastline at 40.3°N latitude and maintains a linear trend 15° south of east to at least 123°W longitude. The top of a large low-velocity region at 20-30 km depth extends about 50 km N-S and 75 km E-W (roughly between Garberville and Covelo) and is located above and south of the southern edge of the Gorda Plate. The authors interpret this low-velocity area to be locally thickened crust (8-10 km) due either to local compressional forces associated with north-south compression caused by the northward impingement of the rigid Pacific Plate or to underthrusting of the base of the accretionary subduction complex at the southern terminus of the Cascadia Subduction Zone. 66 refs., 11 figs., 3 tabs.

  6. Rapid Source Characterization of the 2011 Mw 9.0 off the Pacific coast of Tohoku Earthquake

    Science.gov (United States)

    Hayes, Gavin P.

    2011-01-01

    On March 11th, 2011, a moment magnitude 9.0 earthquake struck off the coast of northeast Honshu, Japan, generating what may well turn out to be the most costly natural disaster ever. In the hours following the event, the U.S. Geological Survey National Earthquake Information Center led a rapid response to characterize the earthquake in terms of its location, size, faulting source, shaking and slip distributions, and population exposure, in order to place the disaster in a framework necessary for timely humanitarian response. As part of this effort, fast finite-fault inversions using globally distributed body- and surface-wave data were used to estimate the slip distribution of the earthquake rupture. Models generated within 7 hours of the earthquake origin time indicated that the event ruptured a fault up to 300 km long, roughly centered on the earthquake hypocenter, and involved peak slips of 20 m or more. Updates since this preliminary solution improve the details of this inversion solution and thus our understanding of the rupture process. However, significant observations such as the up-dip nature of rupture propagation and the along-strike length of faulting did not significantly change, demonstrating the usefulness of rapid source characterization for understanding the first order characteristics of major earthquakes.

  7. A local earthquake coda magnitude and its relation to duration, moment M0, and local Richter magnitude ML

    Science.gov (United States)

    Suteau, A. M.; Whitcomb, J. H.

    1977-01-01

    A relationship was found between the seismic moment, M0, of shallow local earthquakes and the total duration of the signal, t, in seconds, measured from the earthquake's origin time, assuming, following Aki, that the end of the coda is composed of backscattered surface waves due to lateral heterogeneity in the shallow crust. Using the linear relationship between the logarithm of M0 and the local Richter magnitude ML, a relationship between ML and t was found. This relationship was used to calculate a coda magnitude MC, which was compared to ML for Southern California earthquakes that occurred during the period from 1972 to 1975.
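
Because log(M0) is linear in log(t) and also linear in ML, the composition gives a duration magnitude of the form MC = a + b*log10(t). The sketch below illustrates this functional form; the coefficients a and b are placeholders resembling commonly used duration-magnitude scales, not the values actually fitted by Suteau and Whitcomb.

```python
import math

# Duration (coda) magnitude of the form MC = a + b*log10(t), where t is
# the total signal duration in seconds. The coefficients are invented
# placeholders, not the paper's fitted values.

def coda_magnitude(duration_s, a=-0.87, b=2.0):
    """Coda magnitude from total signal duration t (seconds)."""
    return a + b * math.log10(duration_s)

for t in (10, 60, 300):
    print(t, round(coda_magnitude(t), 2))
```

Longer coda durations map monotonically to larger magnitudes, which is why duration scales of this kind remain useful when amplitude readings clip on local instruments.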

  8. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful, but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
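
The binned-rate format described above lends itself to a Poisson likelihood score: each model forecasts an expected earthquake rate per bin, and the log-likelihood of the observed bin counts under those rates measures consistency. The sketch below illustrates that scoring idea with invented rates and counts; it is not the full RELM test suite, which also includes pairwise comparisons and significance testing against simulated catalogs.

```python
import math

# Poisson log-likelihood of observed bin counts n_i given forecast
# rates lambda_i:  L = sum_i [ n_i*ln(lambda_i) - lambda_i - ln(n_i!) ].
# Rates and counts are invented for illustration.

def poisson_log_likelihood(rates, counts):
    return sum(n * math.log(lam) - lam - math.lgamma(n + 1)
               for lam, n in zip(rates, counts))

model_a = [0.5, 1.2, 0.1, 2.0]   # forecast rates per space-magnitude bin
model_b = [1.0, 1.0, 1.0, 1.0]   # a flat competing forecast
observed = [0, 1, 0, 3]          # earthquakes that actually occurred

la = poisson_log_likelihood(model_a, observed)
lb = poisson_log_likelihood(model_b, observed)
print(f"model A: {la:.3f}  model B: {lb:.3f}")
# the higher (less negative) log-likelihood is the better-fitting model
```

Because the tests are fully prospective, the counts come from earthquakes that occur after the forecasts are fixed, so neither model can be tuned to the test data.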

  9. Statistical short-term earthquake prediction.

    Science.gov (United States)

    Kagan, Y Y; Knopoff, L

    1987-06-19

    A statistical procedure, derived from a theoretical model of fracture growth, is used to identify a foreshock sequence while it is in progress. As a predictor, the procedure reduces the average uncertainty in the rate of occurrence of a future strong earthquake by a factor of more than 1000 when compared with the Poisson rate of occurrence. About one-third of all main shocks with local magnitude greater than or equal to 4.0 in central California can be predicted in this way, starting from a 7-year database with a lower magnitude cutoff of 1.5. The time scale of such predictions is on the order of a few hours to a few days for foreshocks in the magnitude range from 2.0 to 5.0.

  10. Coulomb stress interactions among M≥5.9 earthquakes in the Gorda deformation zone and on the Mendocino Fracture Zone, Cascadia megathrust, and northern San Andreas fault

    Science.gov (United States)

    Rollins, John C.; Stein, Ross S.

    2010-01-01

    The Gorda deformation zone, a 50,000 km2 area of diffuse shear and rotation offshore northernmost California, has been the site of 20 M ≥ 5.9 earthquakes on four different fault orientations since 1976, including four M ≥ 7 shocks. This is the highest rate of large earthquakes in the contiguous United States. We calculate that the source faults of six recent M ≥ 5.9 earthquakes had experienced ≥0.6 bar Coulomb stress increases imparted by earthquakes that struck less than 9 months beforehand. Control tests indicate that ≥0.6 bar Coulomb stress interactions between M ≥ 5.9 earthquakes separated by less than 9 months are unlikely to occur by chance. Coulomb stress changes imparted by the 1980 Mw = 7.3 Trinidad earthquake are consistent with the locations of M ≥ 5.9 earthquakes in the Gorda zone until at least 1995, as well as earthquakes on the Mendocino Fault Zone in 1994 and 2000. Coulomb stress changes imparted by the 1980 earthquake are also consistent with its distinct elbow-shaped aftershock pattern. From these observations, we derive generalized static stress interactions among right-lateral, left-lateral, and thrust faults near triple junctions.
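
The quantity tracked in studies like this is the Coulomb failure stress change, dCFS = d_tau + mu' * d_sigma_n, where d_tau is the shear-stress change resolved in the slip direction, d_sigma_n the normal-stress change (positive for unclamping), and mu' an effective friction coefficient. The sketch below only evaluates that expression with invented numbers; computing d_tau and d_sigma_n for a real source fault requires an elastic dislocation model.

```python
# Coulomb failure stress change on a receiver fault:
#   dCFS = d_tau + mu' * d_sigma_n
# d_tau: shear stress change in the slip direction (bar)
# d_sigma_n: normal stress change, positive = unclamping (bar)
# mu_eff: effective friction coefficient (dimensionless, assumed value)

def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    return d_tau + mu_eff * d_sigma_n

# e.g. 0.5 bar of shear loading plus 0.25 bar of unclamping:
dcfs = coulomb_stress_change(0.5, 0.25)
print(f"dCFS = {dcfs:.2f} bar")
```

A positive dCFS brings the receiver fault closer to failure, which is the sense in which the ≥0.6 bar increases reported above are interpreted as triggering interactions.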

  11. Tokai earthquakes and Hamaoka Nuclear Power Station

    International Nuclear Information System (INIS)

    Komura, Hiroo

    1981-01-01

    The Kanto district and Shizuoka Prefecture are designated as "observation-strengthening districts," where the probability of earthquake occurrence is high. The Hamaoka Nuclear Power Station of Chubu Electric Power Co., Inc., lies at the center of this district. Nuclear power stations are vulnerable to earthquakes, and if earthquakes damage nuclear power plants, very serious accidents may occur. Chubu Electric Power Co. underestimates the likelihood and scale of earthquakes and the estimated damage, and has maintained that the bedrock of the power station site is strong and that there is no risk of accidents. However, the actual situation is quite different. The description of earthquakes and the bedrock in the application for the installation of the No. 3 plant was totally rewritten after two years of safety examination, and the Ministry of International Trade and Industry approved the application less than two weeks thereafter. This paper geologically evaluates the bedrock and points out many doubtful points in the application. In addition, there are eight active faults near the power station site. The aseismic design of the Hamaoka Nuclear Power Station assumes accelerations up to 400 gal, but this may not be enough. The Hamaoka Nuclear Power Station is intentionally neglected in the damage estimates for Shizuoka Prefecture. (Kako, I.)

  12. Correlation of pre-earthquake electromagnetic signals with laboratory and field rock experiments

    Directory of Open Access Journals (Sweden)

    T. Bleier

    2010-09-01

    Full Text Available — Analysis of the 2007 M5.4 Alum Rock earthquake near San José, California, showed that magnetic pulsations were present in large numbers and with significant amplitudes during the 2-week period leading up to the event. These pulsations were 1–30 s in duration, had unusual polarities (many with only positive or only negative polarities versus both polarities), and differed from other pulsations observed over 2 years of data in that the pulse sequence was sustained over the 2 weeks prior to the quake and then disappeared shortly after it. A search for the underlying physical process that might explain these pulses was undertaken, and one theory (Freund, 2002) demonstrated that charge carriers are released when various types of rocks are stressed in a laboratory environment. It was also significant that the observed charge-carrier generation was transient and resulted in pulsating current patterns. In an attempt to determine whether this phenomenon occurs outside the laboratory, the authors scaled up the physics experiment from a relatively small rock sample in a dry laboratory setting to a large 7-metric-tonne boulder of Yosemite granite. This boulder was located in a natural, humid, above-ground setting at Bass Lake, California. The boulder was instrumented with two Zonge Engineering Model ANT4 induction-type magnetometers, two Trifield air ion counters, a surface charge detector, a geophone, a Bruker Model EM27 Fourier transform infrared (FTIR) spectrometer with a Stirling-cycle cooler, and various temperature sensors. The boulder was stressed over about 8 h using expanding concrete (Bustar™) until it fractured into three major pieces. The recorded data showed surface charge buildup, magnetic pulsations, impulsive air-conductivity changes, and acoustical cues starting about 5 h before the boulder actually broke. These magnetic and air-conductivity pulse signatures resembled both the laboratory

  13. Volcanic unrest and hazard communication in Long Valley Volcanic Region, California

    Science.gov (United States)

    Hill, David P.; Mangan, Margaret T.; McNutt, Stephen R.

    2017-01-01

    The onset of volcanic unrest in Long Valley Caldera, California, in 1980 and the subsequent fluctuations in unrest levels through May 2016 illustrate: (1) the evolving relations between scientists monitoring the unrest and studying the underlying tectonic/magmatic processes and their implications for geologic hazards, and (2) the challenges in communicating the significance of the hazards to the public and civil authorities in a mountain resort setting. Circumstances special to this case include (1) the sensitivity of an isolated resort area to media hype of potential high-impact volcanic and earthquake hazards and its impact on potential recreational visitors and the local economy, (2) a small permanent population (~8000), which facilitates face-to-face communication between scientists monitoring the hazard, civil authorities, and the public, and (3) the relatively frequent turnover of people in positions of civil authority, which requires a continuing education effort on the nature of caldera unrest and related hazards. Because of delays associated with communication protocols between the State and Federal governments during the onset of unrest, local civil authorities and the public first learned that the U.S. Geological Survey was about to release a notice of potential volcanic hazards associated with earthquake activity and 25 cm of uplift of the resurgent dome in the center of the caldera through an article in the Los Angeles Times published in May 1982. The immediate reaction was outrage and denial. Gradual acceptance that the hazard was real required over a decade of frequent meetings between scientists and civil authorities, together with public presentations, underscored by frequently felt earthquakes and the onset of magmatic CO2 emissions in 1990 following an 11-month-long earthquake swarm beneath Mammoth Mountain on the southwest rim of the caldera. Four fatalities, one on 24 May 1998 and three on 6 April 2006, underscored the hazard posed by the CO2

  14. Spatial distribution of earthquake hypocenters in the Crimea—Black Sea region

    Science.gov (United States)

    Burmin, V. Yu; Shumlianska, L. O.

    2018-03-01

    Some aspects of the seismicity of the Crimea–Black Sea region are considered on the basis of catalogued data on earthquakes that occurred between 1970 and 2012. The complete list of Crimean earthquakes for this period contains about 2140 events with magnitudes ranging from -1.5 to 5.5. Bulletins contain compressional- and shear-wave arrival times for nearly 2000 earthquakes. A new approach to defining the coordinates of all of these events was applied to re-establish the hypocenters of the catalogued earthquakes. The results indicate that the bulk of the earthquake foci in the region are located in the crust; however, some 2.5% of the foci lie at depths ranging from 50 to 250 km. The new distribution of foci shows their concentration in the form of two inclined branches whose center is located under the Yalta–Alushta seismic focal zone. The overall depth distribution of foci corresponds to the relief of the lithosphere.

  15. Selected Images of the Effects of the October 15, 2006, Kiholo Bay-Mahukona, Hawai'i, Earthquakes and Recovery Efforts

    Science.gov (United States)

    Takahashi, Taeko Jane; Ikeda, Nancy A.; Okubo, Paul G.; Sako, Maurice K.; Dow, David C.; Priester, Anna M.; Steiner, Nolan A.

    2011-01-01

    Early on the morning of October 15, 2006, two moderate earthquakes—the largest in decades—struck the Island of Hawai‘i. The first of these, which occurred at 7:07 a.m., HST (1707 UTC), was a magnitude (M) 6.7 earthquake, centered beneath Kīholo Bay on the northwestern coast of the island (19.878°N, 155.935°W), at a depth of 39 km. The second earthquake, which struck 6 minutes, 24 seconds later, at 7:14 a.m., HST (1714 UTC), was located 28 km to the north-northwest of Kīholo Bay (20.129°N, 155.983°W), centered at a depth of 19 km. This M6.0 earthquake has since been referred to as the Māhukona earthquake. Losses from the combined effects of these earthquakes are estimated to be $200 million—the most costly events, by far, in Hawai‘i’s earthquake history.

  16. CISN ShakeAlert Earthquake Early Warning System Monitoring Tools

    Science.gov (United States)

    Henson, I. H.; Allen, R. M.; Neuhauser, D. S.

    2015-12-01

    CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, from which a small portion produce earthquake alerts. We have developed interactive web browser system-monitoring tools that display near real time state-of-health and performance information. This includes station availability, trigger statistics, communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted events and rejected events is also available. The Google Web Toolkit and Map API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.

  17. Strong Motion Instrumentation of Seismically-Strengthened Port Structures in California by CSMIP

    Science.gov (United States)

    Huang, M.J.; Shakal, A.F.

    2009-01-01

    The California Strong Motion Instrumentation Program (CSMIP) has instrumented five port structures. Instrumentation of two more port structures is underway, and another is in planning. Two of the port structures have been seismically strengthened. The primary goals of the strong-motion instrumentation are to obtain strong earthquake shaking data for verifying seismic analysis procedures and strengthening schemes, and for post-earthquake evaluations of port structures. The wharves instrumented by CSMIP were recommended by the Strong Motion Instrumentation Advisory Committee, a committee of the California Seismic Safety Commission. Extensive instrumentation of a wharf is difficult and would be impossible without the cooperation of the owners and the involvement of the design engineers. The instrumentation plan for a wharf is developed through study of the retrofit plans of the wharf, and the strong-motion sensors are installed at locations where specific instrumentation objectives can be achieved and access is possible. Some sensor locations have to be planned during design; otherwise, installation is not possible after construction. This paper summarizes the two seismically strengthened wharves and discusses the instrumentation schemes and objectives. © 2009 ASCE.

  18. USGS response to an urban earthquake, Northridge '94

    Science.gov (United States)

    Updike, Randall G.; Brown, William M.; Johnson, Margo L.; Omdahl, Eleanor M.; Powers, Philip S.; Rhea, Susan; Tarr, Arthur C.

    1996-01-01

    The urban centers of our Nation provide our people with seemingly unlimited employment, social, and cultural opportunities as a result of the complex interactions of a diverse population embedded in a highly engineered environment. Catastrophic events in one or more of the natural earth systems that underlie or envelop the urban environment can have radical effects on the integrity and survivability of that environment. Earthquakes have for centuries been the source of cataclysmic events in cities throughout the world. Unlike many other earth processes, the effects of major earthquakes transcend all political, social, and geomorphic boundaries and can have a decided impact on cities tens to hundreds of kilometers from the epicenter. In modern cities, where buildings, transportation corridors, and lifelines are complexly interrelated, the life, economic, and social vulnerabilities in the face of a major earthquake can be particularly acute.

  19. Satellite Infrared Radiation Measurements Prior to the Major Earthquakes

    Science.gov (United States)

    Ouzounov, Dimitar; Pulintes, S.; Bryant, N.; Taylor, Patrick; Freund, F.

    2005-01-01

    This work describes our search for a relationship between tectonic stresses and increases in mid-infrared (IR) flux as part of a possible ensemble of electromagnetic (EM) phenomena that may be related to earthquake activity. We present and discuss observed variations in thermal transients and radiation fields prior to the earthquakes of January 22, 2003, Colima (M6.7), Mexico; September 28, 2004, near Parkfield (M6.0) in California; and December 26, 2004, Northern Sumatra (M8.5). Previous analysis of earthquake events has indicated the presence of an IR anomaly, where temperatures increased or did not return to their usual nighttime values. Our procedures analyze nighttime satellite data that record the general condition of the ground after sunset. We have found from MODIS instrument data that five days before the Colima earthquake the IR land-surface nighttime temperature rose up to +4 degrees C in a 100 km radius around the epicenter. The IR transient field recorded by MODIS in the vicinity of Parkfield, also in a cloud-free environment, was around +1 degree C and is significantly smaller than the IR anomaly around the Colima epicenter. Ground surface temperatures near the Parkfield epicenter four days prior to the earthquake show a steady increase. However, on the night preceding the quake, a significant drop in relative humidity was indicated, a process similar to that registered prior to the Colima event. Recent analyses of continuous outgoing long-wavelength Earth radiation (OLR) indicate significant and anomalous variability prior to some earthquakes. The cause of these anomalies is not well understood but could be the result of triggering by an interaction between the lithosphere, hydrosphere, and atmosphere related to changes in the near-surface electrical field and/or gas composition prior to the earthquake. The OLR anomaly usually covers large areas surrounding the main epicenter. We have found strong anomalous signals (two sigma) in the epicentral area on Dec 21

  20. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    Science.gov (United States)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies currently distribute Twitter earthquake alerts, including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average (STA/LTA) algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of
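The STA/LTA detection described in this abstract can be sketched as follows. This is an illustrative reconstruction only: the window lengths, threshold, and synthetic tweet counts are assumptions, not the USGS operational parameters.

```python
import numpy as np

def sta_lta_detect(counts, sta_win=6, lta_win=60, threshold=4.0):
    """Return bin indices where the short-term average of per-bin tweet
    counts exceeds `threshold` times the preceding long-term average."""
    counts = np.asarray(counts, dtype=float)
    triggers = []
    for i in range(lta_win, len(counts) + 1):
        sta = counts[i - sta_win:i].mean()   # recent activity
        lta = counts[i - lta_win:i].mean()   # background chatter
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i - 1)           # index of the triggering bin
    return triggers

# Synthetic series: Poisson background with a burst after a felt event.
rng = np.random.default_rng(0)
series = rng.poisson(2, 300)   # ~2 "earthquake" tweets per time bin
series[200:220] += 50          # burst of felt reports starting at bin 200
hits = sta_lta_detect(series)
print(hits[0])                 # first trigger lands shortly after bin 200
```

As the abstract notes, raising `threshold` trades missed events for fewer false triggers; a sustained non-seismic burst (e.g., a shuttle landing) would still trigger this detector.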

  1. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    Science.gov (United States)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and

  2. Lithospheric flexure under the Hawaiian volcanic load: Internal stresses and a broken plate revealed by earthquakes

    Science.gov (United States)

    Klein, Fred W.

    2016-01-01

    Several lines of earthquake evidence indicate that the lithospheric plate is broken under the load of the island of Hawai`i, where the geometry of the lithosphere is circular with a central depression. The plate bends concave downward surrounding a stress-free hole, rather than bending concave upward as with past assumptions. Earthquake focal mechanisms show that the center of load stress and the weak hole is between the summits of Mauna Loa and Mauna Kea where the load is greatest. The earthquake gap at 21 km depth coincides with the predicted neutral plane of flexure where horizontal stress changes sign. Focal mechanism P axes below the neutral plane display a striking radial pattern pointing to the stress center. Earthquakes above the neutral plane in the north part of the island have opposite stress patterns; T axes tend to be radial. The M6.2 Honomu and M6.7 Kiholo main shocks (both at 39 km depth) are below the neutral plane and show radial compression, and the M6.0 Kiholo aftershock above the neutral plane has tangential compression. Earthquakes deeper than 20 km define a donut of seismicity around the stress center where flexural bending is a maximum. The hole is interpreted as the soft center where the lithospheric plate is broken. Kilauea's deep conduit is seismically active because it is in the ring of maximum bending. A simplified two-dimensional stress model for a bending slab with a load at one end yields stress orientations that agree with earthquake stress axes and radial P axes below the neutral plane. A previous inversion of deep Hawaiian focal mechanisms found a circular solution around the stress center that agrees with the model. For horizontal faults, the shear stress within the bending slab matches the slip in the deep Kilauea seismic zone and enhances outward slip of active flanks.
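The sign change at the neutral plane invoked above follows from thin-plate flexure theory. As a hedged sketch (a simplified 1-D geometry, with symbols chosen here rather than taken from the paper), the fiber stress in a bending plate of deflection $w(x)$ and thickness $h$ is

```latex
\sigma_{xx}(x,z) = -\,\frac{E\,z}{1-\nu^{2}}\,\frac{d^{2}w}{dx^{2}},
\qquad -\tfrac{h}{2} \le z \le \tfrac{h}{2},
```

where $E$ is Young's modulus, $\nu$ is Poisson's ratio, and $z$ is measured from the mid-plane. The stress vanishes and changes sign at $z = 0$, the neutral plane, consistent with the reported earthquake gap at 21 km depth and the opposite P- and T-axis patterns above and below it.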

  3. Finite-Source Inversion for the 2004 Parkfield Earthquake using 3D Velocity Model Green's Functions

    Science.gov (United States)

    Kim, A.; Dreger, D.; Larsen, S.

    2008-12-01

    .25 Hz but that the velocity model is fast at stations located very close to the fault. In this near-fault zone the model also underpredicts the amplitudes. This implies the need to include an additional low velocity zone in the fault zone to fit the data. For the finite fault modeling we use the same stations as in our previous study (Kim and Dreger 2008), and compare the results to investigate the effect of 3D Green's functions on kinematic source inversions. References: Brocher, T. M. (2005), Empirical relations between elastic wavespeeds and density in the Earth's crust, Bull. Seism. Soc. Am., 95, No. 6, 2081-2092. Eberhart-Phillips, D., and A. J. Michael (1993), Three-dimensional velocity structure and seismicity in the Parkfield region, central California, J. Geophys. Res., 98, 15,737-15,758. Kim, A., and D. S. Dreger (2008), Rupture process of the 2004 Parkfield earthquake from near-fault seismic waveform and geodetic records, J. Geophys. Res., 113, B07308. Thurber, C., H. Zhang, F. Waldhauser, J. Hardebeck, A. Michaels, and D. Eberhart-Phillips (2006), Three-dimensional compressional wavespeed model, earthquake relocations, and focal mechanisms for the Parkfield, California, region, Bull. Seism. Soc. Am., 96, S38-S49. Larsen, S., and C. A. Schultz (1995), ELAS3D: 2D/3D elastic finite-difference wave propagation code, Technical Report No. UCRL-MA-121792, 19pp. Liu, P., and R. J. Archuleta (2004), A new nonlinear finite fault inversion with three-dimensional Green's functions: Application to the 1989 Loma Prieta, California, earthquake, J. Geophys. Res., 109, B02318.

  4. The Temblor mobile seismic risk app, v2: Rapid and seamless earthquake information to inspire individuals to recognize and reduce their risk

    Science.gov (United States)

    Stein, R. S.; Sevilgen, V.; Sevilgen, S.; Kim, A.; Jacobson, D. S.; Lotto, G. C.; Ely, G.; Bhattacharjee, G.; O'Sullivan, J.

    2017-12-01

    Temblor quantifies and personalizes earthquake risk and offers solutions by connecting users with qualified retrofit and insurance providers. Temblor's daily blog on current earthquakes, seismic swarms, eruptions, floods, and landslides makes the science accessible to the public. Temblor is available on iPhone, Android, and mobile web app platforms (http://temblor.net). The app presents both scenario (worst case) and probabilistic (most likely) financial losses for homes and commercial buildings, and estimates the impact of seismic retrofit and insurance on the losses and safety. Temblor's map interface has clickable earthquakes (with source parameters and links) and active faults (name, type, and slip rate) around the world, and layers for liquefaction, landslides, tsunami inundation, and flood zones in the U.S. The app draws from the 2014 USGS National Seismic Hazard Model and the 2014 USGS Building Seismic Safety Council ShakeMap scenario database. The Global Earthquake Activity Rate (GEAR) model is used worldwide, with active faults displayed in 75 countries. The Temblor real-time global catalog is merged from global and national catalogs, with aftershocks discriminated from mainshocks. Earthquake notifications are issued to Temblor users within 30 seconds of their occurrence, with approximate locations and magnitudes that are rapidly refined in the ensuing minutes. Launched in 2015, Temblor has 650,000 unique users, including 250,000 in the U.S. and 110,000 in Chile, as well as 52,000 Facebook followers. All data shown in Temblor are gathered from authoritative or published sources and synthesized to be intuitive and actionable for the public. 
Principal data sources include USGS, FEMA, EMSC, GEM Foundation, NOAA, GNS Science (New Zealand), INGV (Italy), PHIVOLCS (Philippines), GSJ (Japan), Taiwan Earthquake Model, EOS Singapore (Southeast Asia), MTA (Turkey), PB2003 (plate boundaries), CICESE (Baja California), California Geological Survey, and 20 other state

  5. JGR special issue on Deep Earthquakes

    Science.gov (United States)

    The editor and associate editors of the Journal of Geophysical Research—Solid Earth and Planets invite the submission of manuscripts for a special issue on the topic “Deep- and Intermediate-Focus Earthquakes, Phase Transitions, and the Mechanics of Deep Subduction.”Manuscripts should be submitted to JGR Editor Gerald Schubert (Department of Earth and Space Sciences, University of California, Los Angeles, Los Angeles, CA 90024) before July 1, 1986, in accordance with the usual rules for manuscript submission. Submitted papers will undergo the normal JGR review procedure. For more information, contact either Schubert or the special guest associate editor, Cliff Frohlich (Institute for Geophysics, University of Texas at Austin, 4920 North IH-35, Austin, TX 78751; telephone: 512-451-6223).

  6. Solar eruptions - soil radon - earthquakes

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time, a new natural phenomenon has been established: a contrasting increase in the soil radon level under the influence of solar flares. Such an increase is one of the geochemical indicators of earthquakes. Most researchers consider this a phenomenon of exclusively terrestrial processes. Investigations of the link between earthquakes and solar activity carried out during the last decade in different countries are based on the analysis of the statistical data ΣΕ (t) and W (t). It has been established that the overall seismicity of the Earth and of its separate regions depends on the 11-year cycle of solar activity. The experimental data provided in the paper serve as a first step toward revealing cause-and-effect solar-terrestrial bonds in the series solar eruption - soil radon - earthquake; further collection of experimental data is needed. For the first time, the elementary lattice of the Hartmann network, contoured by the biolocation method, has been objectified through the radon constituent of terrestrial radiation. Variations of radon concentration in the nodes of the Hartmann network were found to determine the dynamics of solar-terrestrial relationships. Of the three types of rapidly running processes conditioned by solar-terrestrial bonds, earthquakes belong to the rapidly running destructive processes, which occur most intensely at the junctures of tectonic massifs and along transform and deep faults. The basic factors provoking earthquakes are both magnetic-structural effects and long-term (over 5 months) bombardment of the lithosphere's surface by highly energetic particles of corpuscular solar flows, as confirmed by photometry. As a result of the solar flares that occurred from 29 October to 4 November 2003, a sharply contrasting increase in soil radon, an earthquake indicator, was established on the territory of Yerevan City. A month and a half later, earthquakes occurred in San Francisco, Iran, and Turkey

  7. Evolution in the lineament patterns associated to strong earthquakes revealed by satellite observations

    Science.gov (United States)

    Soto-Pinto, C. A.; Arellano-Baeza, A. A.; Ouzounov, D. P.

    2011-12-01

    We study the temporal evolution of stress patterns in the crust by using high-resolution (10-300 m) satellite images from the MODIS and ASTER satellite sensors, and we are able to detect changes in the density and orientation of lineaments preceding earthquake events. A lineament is generally defined as a straight or somewhat curved feature in the landscape, visible in a satellite image as an aligned sequence of pixels whose intensity contrasts with the background. The system of lineaments extracted from satellite images is not identical to the geological lineaments; nevertheless, it generally reflects the structure of the faults and fractures in the Earth's crust. Our analysis has shown that the system of lineaments is very dynamic: a significant number of lineaments appeared approximately one month before an earthquake, while one month after the earthquake the lineament configuration returned to its initial state. These features were not observed in test areas that were free of seismic activity during the same period (null hypothesis). We have designed a computational prototype capable of detecting lineament evolution and of utilizing both ASTER and MODIS L1/L2 satellite data. We will demonstrate the first successful test results for several Mw > 5 earthquakes in Chile, Peru, China, and California (USA).

  8. Blind identification of the Millikan Library from earthquake data considering soil–structure interaction

    Science.gov (United States)

    Ghahari, S. F.; Abazarsa, F.; Avci, O.; Çelebi, Mehmet; Taciroglu, E.

    2016-01-01

    The Robert A. Millikan Library is a reinforced concrete building with a basement level and nine stories above the ground. Located on the campus of the California Institute of Technology (Caltech) in Pasadena, California, it is among the most densely instrumented buildings in the U.S. From the early days after its construction, it has been the subject of many investigations, especially regarding soil–structure interaction effects. It is well accepted that the structure interacts significantly with the surrounding soil, which implies that the true foundation input motions cannot be directly recorded during earthquakes because of inertial effects. Owing to this limitation, input–output modal identification methods are not applicable to this soil–structure system. On the other hand, conventional output-only methods typically assume the unknown input signals to be stationary white noise, which is not the case for earthquake excitations. Recently developed blind identification (i.e., output-only) methods have made it possible to extract such information from the response signals recorded during earthquake excitations alone. In the present study, we employ such a blind identification method to extract the modal properties of the Millikan Library, and we present some modes that have not been identified in the forced-vibration studies conducted to date. Then, to quantify the contribution of soil–structure interaction effects, we first create a detailed Finite Element (FE) model using available information about the superstructure, and subsequently update the soil–foundation system's dynamic stiffnesses at each mode so that the modal properties of the entire soil–structure system agree well with those obtained via output-only modal identification.

  9. A preliminary assessment of earthquake ground shaking hazard at Yucca Mountain, Nevada and implications to the Las Vegas region

    Energy Technology Data Exchange (ETDEWEB)

    Wong, I.G.; Green, R.K.; Sun, J.I. [Woodward-Clyde Federal Services, Oakland, CA (United States); Pezzopane, S.K. [Geological Survey, Denver, CO (United States); Abrahamson, N.A. [Abrahamson (Norm A.), Piedmont, CA (United States); Quittmeyer, R.C. [Woodward-Clyde Federal Services, Las Vegas, NV (United States)

    1996-12-31

    As part of early design studies for the potential Yucca Mountain nuclear waste repository, the authors have performed a preliminary probabilistic seismic hazard analysis of ground shaking. A total of 88 Quaternary faults within 100 km of the site were considered in the hazard analysis. They were characterized in terms of their probability of being seismogenic, and their geometry, maximum earthquake magnitude, recurrence model, and slip rate. Individual faults were characterized by maximum earthquakes that ranged from moment magnitude (M{sub w}) 5.1 to 7.6. Fault slip rates ranged from a very low 0.00001 mm/yr to as much as 4 mm/yr. An areal source zone representing background earthquakes up to M{sub w} 6 1/4 was also included in the analysis. Recurrence for these background events was based on the 1904-1994 historical record, which contains events up to M{sub w} 5.6. Based on this analysis, the peak horizontal rock accelerations are 0.16, 0.21, 0.28, and 0.50 g for return periods of 500, 1,000, 2,000, and 10,000 years, respectively. In general, the dominant contributors to the ground shaking hazard at Yucca Mountain are background earthquakes, because of the low slip rates of the Basin and Range faults. A significant effect on the probabilistic ground motions is due to the inclusion of a new attenuation relation developed specifically for earthquakes in extensional tectonic regimes. This relation gives significantly lower peak accelerations than the five other, predominantly California-based relations used in the analysis, possibly due to the lower stress drops of extensional earthquakes compared to California events. Because Las Vegas is located within the same tectonic regime as Yucca Mountain, the seismic sources and the path and site factors affecting the seismic hazard at Yucca Mountain also have implications for Las Vegas. These implications are discussed in this paper.
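The return periods quoted in this abstract map directly to exceedance probabilities under the Poisson assumption standard in PSHA. The short sketch below is illustrative and not part of the original analysis; the 100-year exposure window is an assumed example.

```python
import math

def exceedance_probability(return_period_yr, exposure_yr):
    """Poisson probability that ground motion with the given mean
    return period is exceeded at least once during the exposure time."""
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

# The 10,000-yr motion (0.50 g above) is very unlikely over a 100-yr
# window, while the 500-yr motion (0.16 g) is fairly likely:
print(round(exceedance_probability(10_000, 100), 4))   # ~0.01
print(round(exceedance_probability(500, 100), 4))      # ~0.18
```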

  10. GPS Imaging of Time-Variable Earthquake Hazard: The Hilton Creek Fault, Long Valley California

    Science.gov (United States)

    Hammond, W. C.; Blewitt, G.

    2016-12-01

    The Hilton Creek Fault, in Long Valley, California, is a down-to-the-east normal fault that bounds the eastern edge of the Sierra Nevada/Great Valley microplate and lies half inside and half outside the magmatically active caldera. Despite dense GPS network coverage, the rapid and time-variable surface deformation attributable to sporadic magmatic inflation beneath the resurgent dome makes it difficult to use traditional geodetic methods to estimate the slip rate of the fault. While geologic studies identify cumulative offset, constrain the timing of past earthquakes, and constrain a Quaternary slip rate to within 1-5 mm/yr, it is not currently possible to use geologic data to evaluate how the potential for slip correlates with transient caldera inflation. To estimate the time-variable seismic hazard of the fault, we estimate its instantaneous slip rate from GPS data using a new set of algorithms for robust estimation of velocity and strain-rate fields and fault slip rates. From the GPS time series, we use the robust MIDAS algorithm to obtain velocity time series that are highly insensitive to the effects of seasonality, outliers, and steps in the data. We then use robust imaging of the velocity field to estimate a gridded, time-variable velocity field, and we estimate the fault slip rate at each epoch using a new technique that forms ad hoc block representations honoring fault geometries, network complexity, and connectivity without requiring labor-intensive drawing of block boundaries. The results are compared to other slip-rate estimates that have implications for hazard over different time scales. Time-invariant long-term seismic hazard is proportional to the long-term slip rate accessible from geologic data. Contemporary time-invariant hazard, however, may differ from the long-term rate, and is estimated from the geodetic velocity field that has been corrected for the effects of magmatic inflation in the caldera using a published model of a dipping ellipsoidal
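The core MIDAS idea referenced above, taking the median of slopes between data pairs separated by one year so that seasonal cycles cancel, can be sketched minimally as follows. This is a simplification for illustration only; the published MIDAS estimator adds a trimming pass and handles data gaps.

```python
import numpy as np

def midas_like_velocity(t, x, samples_per_year=365):
    """Median of one-year-apart pairwise slopes: annual terms cancel
    between paired samples, and the median resists outliers and steps."""
    slopes = [(x[i + samples_per_year] - x[i]) /
              (t[i + samples_per_year] - t[i])
              for i in range(len(t) - samples_per_year)]
    return float(np.median(slopes))

# Synthetic daily positions: 5 mm/yr trend + 3 mm annual cycle + noise.
rng = np.random.default_rng(1)
t = np.arange(0.0, 4.0, 1.0 / 365.25)
x = 5.0 * t + 3.0 * np.sin(2.0 * np.pi * t) + rng.normal(0.0, 1.0, t.size)
print(round(midas_like_velocity(t, x), 1))  # recovers ~5 mm/yr despite the seasonal term
```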

  11. CSEP-Japan: The Japanese node of the collaboratory for the study of earthquake predictability

    Science.gov (United States)

    Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

    2011-12-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project in earthquake predictability research. Its final goal is to probe the intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, CSEP-Japan. The testing center is open to researchers contributing earthquake forecast models applied to Japan. A total of 91 earthquake forecast models were submitted to the prospective experiment that started on 1 November 2009. The models are separated into 4 testing classes (1 day, 3 months, 1 year, and 3 years) and 3 testing regions: an area of Japan including offshore regions, the Japanese mainland, and the Kanto district. We evaluate the performance of the models in the official suite of tests defined by CSEP. The experiments of the 1-day, 3-month, 1-year, and 3-year forecasting classes have been implemented for 92 rounds, 4 rounds, 1 round, and 0 rounds (now in progress), respectively. The results of the 3-month class gave us new knowledge concerning statistical forecasting models. All models showed good performance in magnitude forecasting. On the other hand, the observed spatial distribution is hardly consistent with most models in some cases where many earthquakes occurred at the same spot. Throughout the experiment, it has become clear that some properties of the CSEP evaluation tests, such as the L-test, show strong correlation with the N-test. We are now developing our own (cyber-)infrastructure to support the forecast experiment as follows. (1) Japanese seismicity has changed since the 2011 Tohoku earthquake. The 3rd call for forecasting models was announced in order to promote model improvement for forecasting earthquakes after this event. Thus, we provide the Japanese seismicity catalog maintained by JMA for modelers to study how seismicity
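The N-test mentioned above compares the observed number of events with a forecast's expected count. A hedged sketch under the standard Poisson assumption follows; the event counts, expected rate, and significance level are illustrative, not values from the CSEP-Japan experiment.

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam); returns 0 for k < 0."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i)
               for i in range(k + 1))

def n_test(n_obs, expected, alpha=0.05):
    """Number test: the forecast is consistent unless the observed
    count falls in either extreme tail of the Poisson distribution."""
    delta1 = 1.0 - poisson_cdf(n_obs - 1, expected)  # P(X >= n_obs)
    delta2 = poisson_cdf(n_obs, expected)            # P(X <= n_obs)
    return delta1 > alpha / 2 and delta2 > alpha / 2

print(n_test(12, 10.0))  # True: 12 observed vs. 10 forecast is consistent
print(n_test(30, 10.0))  # False: far more events than forecast
```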

  12. Approach to seismic hazard analysis for dam safety in the Sierra Nevada and Modoc Plateau of California

    International Nuclear Information System (INIS)

    Savage, W.U.; McLaren, M.K.; Edwards, W.D.; Page, W.D.

    1991-01-01

    Pacific Gas and Electric Company's hydroelectric generating system involves about 150 dams located in the Sierra Nevada and Modoc Plateau region of central and northern California. The utility's strategy for earthquake hazard assessment is described. The approach includes: integrating regional tectonics, seismic geology, historical seismicity, microseismicity, and crustal structure to form a comprehensive regional understanding of the neotectonic setting; performing local studies to acquire data as needed to reduce uncertainties in the geologic and seismic parameters of fault characteristics near specific dam sites; applying and extending recently developed geologic, seismologic, and earthquake engineering technologies to the current regional and site-specific information to evaluate fault characteristics, estimate maximum earthquakes, and characterize ground motion; and encouraging multiple independent reviews of earthquake hazard studies by conducting peer reviews, making field sites available to regulating agencies, and publishing results, methods, and data in the open literature. 46 refs., 8 tabs

  13. The Great California ShakeOut: Science-Based Preparedness Advocacy

    Science.gov (United States)

    Benthien, M. L.

    2009-12-01

    The Great Southern California ShakeOut in November 2008 was the largest earthquake drill in U.S. history, involving over 5 million southern Californians through a broad-based outreach program, media partnerships, and public advocacy by hundreds of partners. The basis of the drill was a comprehensive scenario for a magnitude 7.8 earthquake on the southern San Andreas fault, which would cause broad devastation. In early 2009 the decision was made to hold the drill statewide on the third Thursday of October each year (October 15 in 2009). Results of the 2008 and 2009 drills will be shared in this session. In addition, the prospects for early warning systems will be described; such systems will one day provide the needed seconds before strong shaking arrives in which critical systems can be shut down and people can do what they have been practicing in the ShakeOut drills: drop, cover, and hold on. A key aspect of the ShakeOut is the integration of a comprehensive earthquake scenario (incorporating earth science, engineering, policy, economics, public health, and other disciplines) with the lessons learned from decades of social science research about why people get prepared. The result is a “teachable moment” on par with having an actual earthquake (which is often followed by increased interest in getting ready for earthquakes). ShakeOut creates the sense of urgency that is needed for people, organizations, and communities to get prepared, to practice what to do to be safe, and to learn what plans need to be improved.

  14. Characterizing the recent behavior and earthquake potential of the blind western San Cayetano and Ventura fault systems

    Science.gov (United States)

    McAuliffe, L. J.; Dolan, J. F.; Hubbard, J.; Shaw, J. H.

    2011-12-01

    The recent occurrence of several destructive thrust fault earthquakes highlights the risks posed by such events to major urban centers around the world. In order to determine the earthquake potential of such faults in the western Transverse Ranges of southern California, we are studying the activity and paleoearthquake history of the blind Ventura and western San Cayetano faults through a multidisciplinary analysis of strata that have been folded above the fault tiplines. These two thrust faults form the middle section of a >200-km-long, east-west belt of large, interconnected reverse faults that extends across southern California. Although each of these faults represents a major seismic source in its own right, we are exploring the possibility of even larger-magnitude, multi-segment ruptures that may link these faults to other major faults to the east and west in the Transverse Ranges system. The proximity of this large reverse-fault system to several major population centers, including the metropolitan Los Angeles region, and the potential for tsunami generation during offshore ruptures of the western parts of the system, emphasizes the importance of understanding the behavior of these faults for seismic hazard assessment. During the summer of 2010 we used a mini-vibrator source to acquire four, one- to three-km-long, high-resolution seismic reflection profiles. The profiles were collected along the locus of active folding above the blind, western San Cayetano and Ventura faults - specifically, across prominent fold scarps that have developed in response to recent slip on the underlying thrust ramps. These high-resolution data overlap with the uppermost parts of petroleum-industry seismic reflection data, and provide a near-continuous image of recent folding from several km depth to within 50-100 m of the surface. Our initial efforts to document the earthquake history and slip-rate of this large, multi-fault reverse fault system focus on a site above the blind

  15. Short- and Long-Term Earthquake Forecasts Based on Statistical Models

    Science.gov (United States)

    Console, Rodolfo; Taroni, Matteo; Murru, Maura; Falcone, Giuseppe; Marzocchi, Warner

    2017-04-01

    Epidemic-type aftershock sequence (ETAS) models have been used experimentally to forecast the space-time earthquake occurrence rate during the sequence that followed the 2009 L'Aquila earthquake and for the 2012 Emilia earthquake sequence. These forecasts represented the first two pioneering attempts to check the feasibility of providing operational earthquake forecasting (OEF) in Italy. After the 2009 L'Aquila earthquake, the Italian Department of Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) for the development of the first official OEF in Italy, which was implemented for testing purposes by the newly established "Centro di Pericolosità Sismica" (CPS, the Seismic Hazard Center) at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). According to the ICEF guidelines, the system is open, transparent, reproducible, and testable. The scientific information delivered by OEF-Italy is shaped in different formats according to the interested stakeholders, such as scientists, national and regional authorities, and the general public. Communication with the public is certainly the most challenging issue, and careful pilot tests are necessary to check the effectiveness of the communication strategy before opening the information to the public. With regard to long-term, time-dependent earthquake forecasting, the application of a newly developed simulation algorithm to the Calabria region reproduced typical features of the time, space, and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.
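The ETAS model named above superposes Omori-law aftershock sequences on a background rate. A minimal temporal sketch follows; the parameter values and event list are illustrative assumptions, not those fitted for the L'Aquila or Emilia sequences.

```python
import math

def etas_rate(t, history, mu=0.2, K=0.05, c=0.01, alpha=1.8, p=1.1, m0=3.0):
    """Conditional intensity of a simplified temporal ETAS model:
    background rate mu plus an Omori-law aftershock contribution from
    each past event, scaled exponentially by the parent's magnitude.

    history : list of (t_i, m_i) pairs; times in days, m0 = cutoff magnitude.
    """
    rate = mu
    for t_i, m_i in history:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate

# An M6.3 mainshock at t=0 dominates the rate one day later; the total
# rate then decays roughly as 1/t toward the background level mu.
events = [(0.0, 6.3), (0.5, 4.1)]
r1, r10 = etas_rate(1.0, events), etas_rate(10.0, events)
print(r1 > r10 > 0.2)  # aftershock rate decays toward the background
```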

  16. A Crowdsourcing-based Taiwan Scientific Earthquake Reporting System

    Science.gov (United States)

    Liang, W. T.; Lee, J. C.; Lee, C. F.

    2017-12-01

    To collect field observations of earthquake-induced ground damage immediately, such as surface fault rupture, landslides, rock falls, liquefaction, and landslide-dammed lakes, we are developing an earthquake damage reporting system that relies particularly on school teachers as volunteers who have taken a series of training courses organized by this project. This Taiwan Scientific Earthquake Reporting (TSER) system is based on the Ushahidi mapping platform, which has been widely used for crowdsourcing for different purposes. Participants may add an app-like icon for mobile devices from this website at https://ies-tser.iis.sinica.edu.tw. Right after a potentially damaging earthquake occurs in the Taiwan area, trained volunteers will be notified/dispatched to the source area to carry out field surveys and to describe the ground damage through this system. If the internet is available, they may also upload relevant images from the field right away. The collected information will be shared with the public after a quick screening by the on-duty scientists. To prepare for the next strong earthquake, we set up a specific project on TSER for sharing spectacular/remarkable geologic features wherever possible. This helps volunteers get used to the system and share any teachable material on the platform. This experimental, science-oriented crowdsourcing system was launched early this year. Together with a DYFI-like intensity reporting system, the Taiwan Quake-Catcher Network, and some online games and teaching materials, citizen seismology has improved greatly in Taiwan over the last decade. All these products are now either operated or promoted at the Taiwan Earthquake Research Center (TEC). With these newly developed platforms and materials, we aim not only to raise earthquake awareness and preparedness, but also to encourage public participation in earthquake science in Taiwan.

  17. Development of a State-Wide 3-D Seismic Tomography Velocity Model for California

    Science.gov (United States)

    Thurber, C. H.; Lin, G.; Zhang, H.; Hauksson, E.; Shearer, P.; Waldhauser, F.; Hardebeck, J.; Brocher, T.

    2007-12-01

    We report on progress towards the development of a state-wide tomographic model of the P-wave velocity for the crust and uppermost mantle of California. The dataset combines first arrival times from earthquakes and quarry blasts recorded on regional network stations and travel times of first arrivals from explosions and airguns recorded on profile receivers and network stations. The principal active-source datasets are Geysers-San Pablo Bay, Imperial Valley, Livermore, W. Mojave, Gilroy-Coyote Lake, Shasta region, Great Valley, Morro Bay, Mono Craters-Long Valley, PACE, S. Sierras, LARSE 1 and 2, Loma Prieta, BASIX, San Francisco Peninsula and Parkfield. Our beta-version model is coarse (uniform 30 km horizontal and variable vertical gridding) but is able to image the principal features in previous separate regional models for northern and southern California, such as the high-velocity subducting Gorda Plate, upper to middle crustal velocity highs beneath the Sierra Nevada and much of the Coast Ranges, the deep low-velocity basins of the Great Valley, Ventura, and Los Angeles, and a high-velocity body in the lower crust underlying the Great Valley. The new state-wide model has improved areal coverage compared to the previous models, and extends to greater depth due to the data at large epicentral distances. We plan a series of steps to improve the model. We are enlarging and calibrating the active-source dataset as we obtain additional picks from investigators and perform quality control analyses on the existing and new picks. We will also be adding data from more quarry blasts, mainly in northern California, following an identification and calibration procedure similar to Lin et al. (2006). Composite event construction (Lin et al., in press) will be carried out for northern California for use in conventional tomography.
A major contribution of the state-wide model is the identification of earthquakes yielding arrival times at both the Northern California Seismic

  18. Modeling of periodic great earthquakes on the San Andreas fault: Effects of nonlinear crustal rheology

    Science.gov (United States)

    Reches, Ze'ev; Schubert, Gerald; Anderson, Charles

    1994-01-01

    We analyze the cycle of great earthquakes along the San Andreas fault with a finite element numerical model of deformation in a crust with a nonlinear viscoelastic rheology. The viscous component of deformation has an effective viscosity that depends exponentially on the inverse absolute temperature and nonlinearly on the shear stress; the elastic deformation is linear. Crustal thickness and temperature are constrained by seismic and heat flow data for California. The models are for antiplane strain in a 25-km-thick crustal layer having a very long, vertical strike-slip fault; the crustal block extends 250 km to either side of the fault. During the earthquake cycle, which lasts 160 years, a constant plate velocity v_p/2 = 17.5 mm/yr is applied to the base of the crust and to the vertical end of the crustal block 250 km away from the fault. The upper half of the fault is locked during the interseismic period, while its lower half slips at the constant plate velocity. The locked part of the fault is moved abruptly 2.8 m every 160 years to simulate great earthquakes. The results are sensitive to crustal rheology. Models with quartzite-like rheology display profound transient stages in the velocity, displacement, and stress fields. The predicted transient zone extends about 3-4 times the crustal thickness on each side of the fault, significantly wider than the zone of deformation in elastic models. Models with diabase-like rheology behave similarly to elastic models and exhibit no transient stages. The model predictions are compared with geodetic observations of fault-parallel velocities in northern and central California and local rates of shear strain along the San Andreas fault. The observations are best fit by models which are 10-100 times less viscous than a quartzite-like rheology.
Since the lower crust in California is composed of intermediate to mafic rocks, the present result suggests that the in situ viscosity of the crustal rock is orders of magnitude
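The rheology described above, with effective viscosity exponential in inverse absolute temperature and nonlinear in shear stress, is a standard power-law creep formulation. A minimal sketch follows; the flow-law constants (A, n, Q) are illustrative quartzite-like values, not the parameters of the paper's finite element model:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def effective_viscosity(stress_pa, temp_k, A=1e-20, n=3.0, Q=2.2e5):
    """Effective viscosity (Pa s) for a power-law creep rheology:
    eta = stress / (2 * strain_rate), with
    strain_rate = A * stress^n * exp(-Q / (R * T)).

    A (Pa^-n s^-1), n, and Q (J/mol) are illustrative placeholders.
    """
    strain_rate = A * stress_pa ** n * math.exp(-Q / (R * temp_k))
    return stress_pa / (2.0 * strain_rate)
```

Because n = 3, the effective viscosity falls as stress^(1-n) = stress^-2, so higher stress and higher temperature both weaken the crust; this stress- and temperature-dependence is what produces the transient post-seismic stages seen in the quartzite-like models.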

  19. The International Platform on Earthquake Early Warning Systems (IP-EEWS)

    Science.gov (United States)

    Torres, Jair; Fanchiotti, Margherita

    2017-04-01

    The Sendai Framework for Disaster Risk Reduction 2015-2030 recognizes the need to "substantially increase the availability of and access to multi-hazard early warning systems and disaster risk information and assessments to the people by 2030" as one of its global targets (target "g"). While considerable progress has been made in recent decades, early warning systems (EWSs) continue to be less developed for geo-hazards, and significant challenges remain in advancing the development of EWSs for specific hazards, particularly for the fastest-onset hazards such as earthquakes. An earthquake early warning system (EEWS) disseminates timely information about potentially catastrophic earthquake hazards to the public, emergency managers, and the private sector, providing enough time to implement automated emergency measures. At the same time, these systems help to considerably reduce the CO2 emissions produced by the catastrophic impacts and subsequent effects of earthquakes, such as those generated by fires, collapses, and pollution (among others), as well as those produced in the recovery and reconstruction processes. In recent years, EEWSs have been developed independently in a few countries: EEWSs are operational in Japan and Mexico, while systems in California (USA), Turkey, Italy, Canada, South Korea and China (including Taiwan) are in the development stages or under restricted application. Many other countries in the Indian Subcontinent, Southeast Asia, Central Asia, the Middle East, Eastern Africa and Southeast Africa, as well as Central America, South America and the Caribbean, are located in some of the most seismically active regions in the world, or present moderate seismicity but high vulnerability, and would benefit strongly from the development of EEWSs.
Given that, in many instances, the development of an EEWS still requires further testing, increased density coverage in seismic observation stations, regional coordination, and further scientific

  20. A reevaluation of the Pallett Creek earthquake chronology based on new AMS radiocarbon dates, San Andreas fault, California

    Science.gov (United States)

    Scharer, K.M.; Biasi, G.P.; Weldon, R.J.

    2011-01-01

    The Pallett Creek paleoseismic record occupies a keystone position in most attempts to develop rupture histories for the southern San Andreas fault. Previous estimates of earthquake ages at Pallett Creek were determined by decay counting radiocarbon methods. That method requires large samples which can lead to unaccounted sources of uncertainty in radiocarbon ages because of the heterogeneous composition of organic layers. In contrast, accelerator mass spectrometry (AMS) radiocarbon dates may be obtained from small samples that have known carbon sources and also allow for a more complete sampling of the section. We present 65 new AMS radiocarbon dates that span nine ground-rupturing earthquakes at Pallett Creek. Overall, the AMS dates are similar to and reveal no dramatic bias in the conventional dates. For many layers, however, individual charcoal samples were younger than the conventional dates, leading to earthquake ages that are overall slightly younger than previously reported. New earthquake ages are determined by Bayesian refinement of the layer ages based on stratigraphic ordering and sedimentological constraints. The new chronology is more regular than previously published records in large part due to new samples constraining the age of event R. The closed interval from event C to 1857 has a mean recurrence of 135 years (σ = 83.2 years) and a quasiperiodic coefficient of variation (COV) of 0.61. We show that the new dates and resultant earthquake chronology have a stronger effect on COV than the specific membership of this long series and dating precision improvements from sedimentation rates. Copyright 2011 by the American Geophysical Union.
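The recurrence statistics quoted above (mean interval and coefficient of variation) follow directly from an event chronology. A minimal sketch, using hypothetical event dates rather than the actual Pallett Creek event ages:

```python
import statistics

def recurrence_stats(event_years):
    """Mean recurrence interval and coefficient of variation (COV)
    for a chronologically sorted list of earthquake dates (years)."""
    intervals = [b - a for a, b in zip(event_years, event_years[1:])]
    mean = statistics.mean(intervals)
    # sample standard deviation, as commonly used for short paleoseismic series
    cov = statistics.stdev(intervals) / mean
    return mean, cov

# hypothetical event dates, NOT the Pallett Creek chronology
mean, cov = recurrence_stats([1050, 1220, 1330, 1500, 1680, 1857])
```

A COV near 0 indicates periodic recurrence, near 1 indicates Poisson-like behavior, and above 1 indicates clustering; the paper's value of 0.61 is in the quasiperiodic range.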

  1. A strong-motion hot spot of the 2016 Meinong, Taiwan, earthquake (Mw = 6.4)

    Directory of Open Access Journals (Sweden)

    Hiroo Kanamori

    2017-01-01

    Full Text Available Despite a moderate magnitude, Mw = 6.4, the 5 February 2016 Meinong, Taiwan, earthquake caused significant damage in Tainan City and the surrounding areas. Several seismograms display an impulsive S-wave velocity pulse with an amplitude of about 1 m/s, similar to the large S-wave pulses recorded for past larger damaging earthquakes, such as the 1995 Kobe, Japan, earthquake (Mw = 6.9) and the 1994 Northridge, California, earthquake (Mw = 6.7). The observed PGV in the Tainan area is about 10 times larger than the median PGV of Mw = 6.4 crustal earthquakes in Taiwan. We investigate the cause of the localized strong ground motions. The peak-to-peak ground-motion displacement at the basin sites near Tainan is about 35 times larger than that at a mountain site with a similar epicentral distance. At some frequency bands (0.9 - 1.1 Hz), the amplitude ratio is as large as 200. Using the focal mechanism of this earthquake, typical "soft" and "hard" crustal structures, and directivity inferred from the observed waveforms and the slip distribution, we show that the combined effect yields an amplitude ratio of 17 to 34. The larger amplitude ratios at higher frequency bands are probably due to the effects of complex 3-D basin structures. The result indicates that even in a moderate event, if these effects work together toward amplifying ground motions, extremely large ground motions such as those observed in Tainan can occur. Such occurrences should be taken into consideration in hazard mitigation measures in places with frequent moderate earthquakes.

  2. Earthquake Drill using the Earthquake Early Warning System at an Elementary School

    Science.gov (United States)

    Oki, Satoko; Yazaki, Yoshiaki; Koketsu, Kazuki

    2010-05-01

    Japan frequently suffers from many kinds of disasters such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. On average, we have lost about 120 people a year to natural hazards in this decade. Above all, earthquakes are noteworthy, since they may kill thousands of people in a moment, as in Kobe in 1995. People know that we may have "a big one" some day as long as we live on this land, and they know what to do: retrofit houses, fasten heavy furniture to walls, add latches to kitchen cabinets, and prepare emergency packs. Yet most of them do not take action, and the result is the loss of many lives. Only the victims learn something from an earthquake, and the lessons have never become the common lore of the nation. One of the most essential ways to reduce the damage is to educate the general public so that they can make sound decisions on what to do at the moment an earthquake hits. This requires knowledge of the background of the ongoing phenomenon. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore issued a public call to choose several model areas in which to bring scientific education to local elementary schools. This presentation reports on the year-and-a-half course that we taught at the model elementary school in the Tokyo Metropolitan Area. The tectonic setting of this area is very complicated; the Pacific and Philippine Sea plates subduct beneath the North America and Eurasia plates. The subduction of the Philippine Sea plate causes mega-thrust earthquakes such as the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to occur with a probability of 70% in 30 years.
This is of immediate concern for the devastating loss of life and property because the Tokyo urban region now has a population of 42 million and is the center of approximately 40 % of the nation's activities, which may cause great global

  3. Earthquake Swarm in Armutlu Peninsula, Eastern Marmara Region, Turkey

    Science.gov (United States)

    Yavuz, Evrim; Çaka, Deniz; Tunç, Berna; Serkan Irmak, T.; Woith, Heiko; Cesca, Simone; Lühr, Birger-Gottfried; Barış, Şerif

    2015-04-01

    The most active fault system in Turkey is the North Anatolian Fault Zone, which caused two large earthquakes in 1999. These two earthquakes affected the eastern Marmara region destructively. An unbroken part of the North Anatolian Fault Zone crosses the north of the Armutlu Peninsula in an east-west direction. This branch is also located quite close to Istanbul, a megacity in terms of its population and its economic and social importance. A new cluster of microseismic activity occurred in the direct vicinity southeast of the Yalova Termal area. Activity started on August 2, 2014 with a series of micro events; on August 3, 2014 an event of local magnitude 4.1 occurred, and more than 1000 events followed until August 31, 2014. We therefore tentatively call this a swarm-like activity. Investigation of the micro-earthquake activity of the Armutlu Peninsula has thus become important for understanding the relationship between the occurrence of micro-earthquakes and the tectonic structure of the region. For these reasons, the Armutlu Network (ARNET), installed at the end of 2005 and currently equipped with 27 active seismic stations operated by the Kocaeli University Earth and Space Sciences Research Center (ESSRC) and the Helmholtz-Zentrum Potsdam Deutsches GeoForschungsZentrum (GFZ), is a very dense network able to record even micro-earthquakes in this region. For the 30-day period of August 02-31, 2014, the Kandilli Observatory and Earthquake Research Institute (KOERI) announced 120 local earthquakes with magnitudes ranging between 0.7 and 4.1, whereas ARNET provided more than 1000 earthquakes for analysis in the same period. In this study, earthquakes in the swarm area and neighboring regions determined by ARNET were investigated. The focal mechanism of the August 03, 2014 22:22:42 (GMT) earthquake of local magnitude (Ml) 4.0 was obtained by moment tensor solution. The solution indicates normal faulting with a dextral component. The obtained focal mechanism solution is

  4. Audio-based, unsupervised machine learning reveals cyclic changes in earthquake mechanisms in the Geysers geothermal field, California

    Science.gov (United States)

    Holtzman, B. K.; Paté, A.; Paisley, J.; Waldhauser, F.; Repetto, D.; Boschi, L.

    2017-12-01

    The earthquake process reflects complex interactions of stress, fracture and frictional properties. New machine learning methods reveal patterns in time-dependent spectral properties of seismic signals and enable identification of changes in faulting processes. Our methods are based closely on those developed for music information retrieval and voice recognition, using the spectrogram instead of the waveform directly. Unsupervised learning involves identification of patterns based on differences among signals without any additional information provided to the algorithm. Clustering of 46,000 earthquakes of $0.3
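The unsupervised clustering step described above can be illustrated with a minimal k-means on per-event spectral feature vectors. The study's actual pipeline (spectrogram-based features adapted from music information retrieval) is considerably richer; the data, feature choice, and initialization below are purely illustrative:

```python
import math

def kmeans(features, k=2, iters=20):
    """Minimal k-means for clustering per-event spectral feature vectors
    (e.g., band-averaged spectrogram energies). Unsupervised: no labels
    are provided. Deterministic init: centers start at evenly spaced
    input points (assumes k >= 2)."""
    step = (len(features) - 1) // (k - 1)
    centers = [list(features[i * step]) for i in range(k)]
    labels = [0] * len(features)
    for _ in range(iters):
        # assign each event to its nearest center (Euclidean distance)
        labels = [min(range(k), key=lambda j: math.dist(f, centers[j]))
                  for f in features]
        # move each center to the mean of its assigned events
        for j in range(k):
            members = [f for f, lab in zip(features, labels) if lab == j]
            if members:
                centers[j] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# two synthetic "families" of 2-band spectral shapes
events = [[1.0, 0.1], [0.9, 0.2], [1.1, 0.15],   # low-frequency rich
          [0.1, 1.0], [0.2, 0.9], [0.15, 1.1]]   # high-frequency rich
labels = kmeans(events, k=2)
```

The point of the unsupervised setting is that the two families emerge from the feature geometry alone, which is how temporal changes in faulting processes can be detected without prior event labels.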

  5. Framework for Developing Economic Competitiveness Measures for the California Sustainable Freight Action Plan.

    Science.gov (United States)

    2017-07-04

    The METRANS Transportation Center has been providing technical assistance to the California Governor's Office of Business and Economic Development (GO-Biz) and the California Air Resources Board (CARB) in support of implementing the California Sust...

  6. EXPOSURES AND HEALTH OF FARM WORKER CHILDREN IN CALIFORNIA

    Science.gov (United States)

    The EPA STAR Program Center of Excellence in Children's Environmental Health and Disease Prevention Research at the University of California at Berkeley is currently conducting exposure and health studies for children of farm workers in the Salinas Valley of California. The Exp...

  7. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing the earthquake culture. Iran was considered as a research case study and fifteen large earthquake disasters in Iran were investigated and analyzed over more than a century-time period. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  8. Analysis of spectral shapes from California and central United States ground motion

    International Nuclear Information System (INIS)

    1994-01-01

    The objective of this study is to analyze the spectral shapes from earthquake records with magnitudes and distances comparable to those that dominate seismic hazard at Oak Ridge, in order to provide guidance for the selection of site-specific design-spectrum shapes for use in Oak Ridge. The authors rely heavily on California records because the number of relevant records from the central and eastern United States (CEUS) is not large enough for drawing statistically significant conclusions. They focus on the 0.5 to 10-Hz frequency range for two reasons: (1) this is the frequency range of most engineering interest, and (2) they avoid the effect of well-known differences in the high-frequency energy content between California and CEUS ground motions

  9. Expanding the Delivery of Rapid Earthquake Information and Warnings for Response and Recovery

    Science.gov (United States)

    Blanpied, M. L.; McBride, S.; Hardebeck, J.; Michael, A. J.; van der Elst, N.

    2017-12-01

    Scientific organizations like the United States Geological Survey (USGS) release information to support effective responses during an earthquake crisis. Information is delivered to the White House, the National Command Center, and the Departments of Defense, Homeland Security (including FEMA), Transportation, Energy, and the Interior. Other crucial stakeholders include state officials and decision makers, emergency responders, numerous public and private infrastructure management centers (e.g., highways, railroads and pipelines), the media, and the public. To meet the diverse information requirements of these users, rapid earthquake notifications are delivered by e-mail and text message, and a suite of earthquake information resources such as ShakeMaps, Did You Feel It?, PAGER impact estimates, and underlying data are delivered via the web. The ShakeAlert earthquake early warning system being developed for the U.S. West Coast will identify and characterize an earthquake a few seconds after it begins, estimate the likely intensity of ground shaking, and deliver brief but critically important warnings to people and infrastructure in harm's way. The USGS is also developing a capability to deliver Operational Earthquake Forecasts (OEF), which provide estimates of potential seismic behavior after large earthquakes and during evolving aftershock sequences. Similar work is underway in New Zealand, Japan, and Italy. Social science research conducted during such sequences indicates that aftershock forecasts are valued for a variety of reasons, from informing critical response and recovery decisions to psychologically preparing for more earthquakes. New tools will allow users to customize map-based, spatiotemporal forecasts to their specific needs. Hazard curves and other advanced information will also be available. For such authoritative information to be understood and used during the pressures of an earthquake
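One common building block of aftershock forecasts such as the OEF products described above is the modified Omori law for the decaying aftershock rate. A hedged sketch with placeholder parameters follows; operational systems combine this with magnitude-frequency statistics and update the parameters as a sequence evolves:

```python
def expected_aftershocks(t1, t2, K=100.0, c=0.05, p=1.1):
    """Expected number of aftershocks between t1 and t2 days after a
    mainshock, integrating the modified Omori law rate K / (t + c)^p
    (closed form assumes p != 1). Parameter values are illustrative."""
    integral = lambda t: (t + c) ** (1.0 - p) / (1.0 - p)
    return K * (integral(t2) - integral(t1))

# expected events in the first week vs. the following week
week1 = expected_aftershocks(0.0, 7.0)
week2 = expected_aftershocks(7.0, 14.0)
```

The steep drop from the first week to the second is the essential content of an evolving aftershock forecast: most of the hazard is concentrated immediately after the mainshock, but a non-negligible tail persists.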

  10. Retrospective evaluation of the five-year and ten-year CSEP-Italy earthquake forecasts

    Directory of Open Access Journals (Sweden)

    Stefan Wiemer

    2010-11-01

    Full Text Available On August 1, 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of this CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented 18 five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. Here we consider the twelve time-independent earthquake forecasts among this set and evaluate them with respect to past seismicity data from two Italian earthquake catalogs. We present the results of the tests that measure the consistency of the forecasts with past observations. As well as being an evaluation of the time-independent forecasts submitted, this exercise provides insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between robustness of results and experiment duration. We conclude with suggestions for the design of future earthquake predictability experiments.
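A core consistency check in CSEP experiments is the number (N) test, which compares the observed earthquake count against the forecast rate under a Poisson assumption. A minimal sketch follows; exact conventions and rejection thresholds vary between CSEP testing centers:

```python
import math

def n_test(forecast_rate, observed_count):
    """CSEP-style number (N) test quantile scores under a Poisson
    assumption: delta1 = P(N >= n_obs), delta2 = P(N <= n_obs).
    A forecast is commonly judged inconsistent when either score
    falls below roughly 0.025 (conventions vary)."""
    cdf = lambda n: sum(forecast_rate ** k * math.exp(-forecast_rate)
                        / math.factorial(k) for k in range(n + 1))
    delta1 = 1.0 - (cdf(observed_count - 1) if observed_count > 0 else 0.0)
    delta2 = cdf(observed_count)
    return delta1, delta2

# a forecast of 10 events: observing 9 is consistent, observing 25 is not
d1, d2 = n_test(10.0, 9)
```

Spatial (S) and likelihood (L) tests extend the same idea to the distribution of events across forecast cells, which is where the twelve time-independent models differ most.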

  11. Heterogeneous rupture in the great Cascadia earthquake of 1700 inferred from coastal subsidence estimates

    Science.gov (United States)

    Wang, Pei-Ling; Engelhart, Simon E.; Wang, Kelin; Hawkes, Andrea D.; Horton, Benjamin P.; Nelson, Alan R.; Witter, Robert C.

    2013-01-01

    Past earthquake rupture models used to explain paleoseismic estimates of coastal subsidence during the great A.D. 1700 Cascadia earthquake have assumed a uniform slip distribution along the megathrust. Here we infer heterogeneous slip for the Cascadia margin in A.D. 1700 that is analogous to slip distributions during instrumentally recorded great subduction earthquakes worldwide. The assumption of uniform distribution in previous rupture models was due partly to the large uncertainties of then available paleoseismic data used to constrain the models. In this work, we use more precise estimates of subsidence in 1700 from detailed tidal microfossil studies. We develop a 3-D elastic dislocation model that allows the slip to vary both along strike and in the dip direction. Despite uncertainties in the updip and downdip slip extensions, the more precise subsidence estimates are best explained by a model with along-strike slip heterogeneity, with multiple patches of high-moment release separated by areas of low-moment release. For example, in A.D. 1700, there was very little slip near Alsea Bay, Oregon (~44.4°N), an area that coincides with a segment boundary previously suggested on the basis of gravity anomalies. A probable subducting seamount in this area may be responsible for impeding rupture during great earthquakes. Our results highlight the need for more precise, high-quality estimates of subsidence or uplift during prehistoric earthquakes from the coasts of southern British Columbia, northern Washington (north of 47°N), southernmost Oregon, and northern California (south of 43°N), where slip distributions of prehistoric earthquakes are poorly constrained.

  12. Electromagnetic Signals and Earthquakes 2.0: Increasing Signals and Reducing Noise

    Science.gov (United States)

    Dunson, J. C.; Bleier, T.; Heraud, J. A.; Muller, S.; Lindholm, C.; Christman, L.; King, R.; Lemon, J.

    2013-12-01

    QuakeFinder has an international network of 150+ magnetometers and air conductivity instruments located in California, Peru, Chile, Taiwan, and Greece. Since 2000, QuakeFinder has been collecting electromagnetic data and applying simple algorithms to identify and characterize electromagnetic signals that occur in the few weeks prior to earthquakes greater than M4.5. In this presentation, we show refinements to several aspects of our signal identification techniques that enhance detection of pre-earthquake patterns. Our magnetometers have been improved to capture longer pulses, and we are now using second-generation algorithms that have been refined to detect the proper shape of the earthquake-generated pulses and to allow individual site adjustments. Independent lightning strike data have also been included to mask out lightning based on amplitude and distance from a given instrument site. Direction-of-arrival (azimuth) algorithms have been added to identify patterns of pulse clustering that occur prior to nearby earthquakes. Likewise, positive and negative air ion concentration detection has been improved by building better enclosures, using stainless screens to eliminate insects and some dirt sources, conformally coating PC boards to reduce moisture contamination, and filtering out contaminated data segments based on relative humidity measurements at each site. Infrared data from the western GOES satellite have been time-filtered, cloud-filtered, and compared to 3-year averages of each pixel's output (by seasonal month) to arrive at a relevant comparison baseline for each night's temperature/cooling slope. All these efforts have helped improve the detection of multiple, nearly simultaneous electromagnetic signals due to earthquake preparation processes, while reducing false positive indications due to environmental noise sources.

  13. Spatiotemporal distribution of Oklahoma earthquakes: Exploring relationships using a nearest-neighbor approach

    Science.gov (United States)

    Vasylkivska, Veronika S.; Huerta, Nicolas J.

    2017-07-01

    Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. Detailed study of the Oklahoma earthquake catalog's inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is a stark contrast to California (also known for induced seismicity) where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.
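A widely used form of the nearest-neighbor proximity in earthquake cluster analysis (after Zaliapin and co-workers) combines inter-event time, distance, and the earlier event's magnitude. The sketch below is a simplified version with illustrative parameter values and is not necessarily the exact formulation used in this study:

```python
import math

def nearest_neighbor_distance(child, parents, b=1.0, df=1.6):
    """Space-time-magnitude nearest-neighbor proximity (Zaliapin-style):
    eta = dt * r^df * 10^(-b * m_parent), minimized over all earlier
    events. Events are (time_days, x_km, y_km, magnitude) tuples;
    b (Gutenberg-Richter slope) and df (fractal dimension) are
    illustrative. Returns math.inf if no valid parent exists."""
    t_c, x_c, y_c, _ = child
    best = math.inf
    for t_p, x_p, y_p, m_p in parents:
        if t_p >= t_c:
            continue  # a parent must precede its child
        dt = t_c - t_p
        r = math.hypot(x_c - x_p, y_c - y_p)
        # floor the distance to avoid a zero proximity for co-located events
        eta = dt * max(r, 1e-3) ** df * 10 ** (-b * m_p)
        best = min(best, eta)
    return best

# an aftershock close in space and time to a parent has a far smaller
# proximity than a distant, unrelated event
near = nearest_neighbor_distance((1.0, 0.5, 0.0, 2.0), [(0.0, 0.0, 0.0, 4.0)])
far = nearest_neighbor_distance((1.0, 200.0, 0.0, 2.0), [(0.0, 0.0, 0.0, 4.0)])
```

Thresholding this proximity (or its bimodal distribution) is what separates clustered from background events; the abstract's finding is that in Oklahoma the spatial term dominates the separation, whereas in California both the time and distance terms matter.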

  14. Romanian earthquakes analysis using BURAR seismic array

    International Nuclear Information System (INIS)

    Borleanu, Felix; Rogozea, Maria; Nica, Daniela; Popescu, Emilia; Popa, Mihaela; Radulian, Mircea

    2008-01-01

    The Bucovina seismic array (BURAR) is a medium-aperture array installed in 2002 in the northern part of Romania (47.6148° N, 25.2168° E, 1150 m altitude) as a result of cooperation between the Air Force Technical Applications Center, USA and the National Institute for Earth Physics, Romania. The array consists of ten elements located in boreholes and distributed over a 5 × 5 km² area; nine with short-period vertical sensors and one with a broadband three-component sensor. Since the new station began operating, earthquake surveillance of Romania's territory has improved significantly. Data recorded by BURAR during the 01.01.2005 - 12.31.2005 time interval are first processed and analyzed in order to establish the array's detection capability for local earthquakes occurring in different Romanian seismic zones. Subsequently, a spectral ratio technique is applied to determine calibration relationships for magnitude, using only the information gathered by the BURAR station. The spectral ratios are computed relative to a reference event considered representative for each seismic zone. This method has the advantage of eliminating path effects. The new calibration procedure is tested for the case of Vrancea intermediate-depth earthquakes and proved to be very efficient in constraining the size of these earthquakes. (authors)
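The spectral-ratio idea (path effects cancel when two events in the same source zone are recorded at the same station) can be sketched as follows. The unit slope between magnitude and log10 spectral amplitude is a simplifying assumption for illustration, not the calibration relationship derived in the paper:

```python
import math

def magnitude_from_spectral_ratio(amp_event, amp_reference, mag_reference):
    """Estimate event magnitude from the log10 spectral amplitude ratio
    to a reference event in the same seismic zone recorded at the same
    station. Assumes a unit slope between magnitude and log amplitude
    (a simplification); path effects cancel in the ratio because both
    events share the source-station path."""
    return mag_reference + math.log10(amp_event / amp_reference)

# an event with 10x the reference spectral amplitude scores one unit higher
m = magnitude_from_spectral_ratio(10.0, 1.0, 4.0)
```

In practice the ratio is formed band by band across the spectrum and the slope is calibrated per zone, which is what the BURAR procedure establishes for each Romanian seismic region.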

  15. Determination of sound types and source levels of airborne vocalizations by California sea lions, Zalophus californianus, in rehabilitation at the Marine Mammal Center in Sausalito, California

    Science.gov (United States)

    Schwalm, Afton Leigh

    California sea lions (Zalophus californianus) are a highly popular and easily recognized marine mammal in zoos, aquariums, and circuses, and are often seen by ocean visitors. They are highly vocal and gregarious on land. Surprisingly, little research has been performed on the vocalization types, source levels, acoustic properties, and functions of airborne sounds used by California sea lions. This research on airborne vocalizations will advance the understanding of this aspect of California sea lion communication, as well as examine the relationship between health condition and acoustic behavior. Using a Phillips™ digital recorder with attached microphone and a calibrated RadioShack™ sound pressure level meter, acoustical data were recorded opportunistically from California sea lions during rehabilitation at The Marine Mammal Center in Sausalito, CA. Vocalizations were analyzed using frequency, time, and amplitude variables with Raven Pro: Interactive Sound Analysis Software Version 1.4 (The Cornell Lab of Ornithology, Ithaca, NY). Five frequency, three time, and four amplitude variables were analyzed for each vocalization. Differences in frequency, time, and amplitude variables were not significant by sex. The older California sea lion group produced vocalizations that were significantly lower in four frequency variables, significantly longer in two time variables, significantly higher in calibrated maximum and minimum amplitude variables, and significantly lower in frequency at maximum and minimum amplitude compared with pups. Six call types were identified: bark, goat, growl/grumble, bark/grumble, bark/growl, and grumble/moan. The growl/grumble call was higher in dominant beginning, ending, and minimum frequency, as well as in the frequency at maximum amplitude, compared with the bark, goat, and bark/grumble calls in the first versus last vocalization sample.
The goat call was significantly higher in first harmonic interval than any other call type

  16. Utility of temporary aftershock warning system in the immediate aftermath of large damaging earthquakes

    International Nuclear Information System (INIS)

    Harben, P.E.; Jarpe, S.P.; Hunter, S.; Johnston, C.A.

    1993-01-01

An aftershock warning system (AWS) is a real-time warning system that is deployed immediately after a large damaging earthquake in the epicentral region of the main shock. The primary purpose of such a system is to warn rescue teams and workers within damaged structures of imminent destructive shaking. The authors examined the utility of such a system (1) by evaluating historical data and (2) by developing and testing a prototype system during the 1992 Landers, California, aftershock sequence. Analyzing historical data is important in determining when and where damaging aftershocks are likely to occur and the probable usefulness of an AWS in a particular region. As part of this study, they analyzed the spatial and temporal distribution of large (magnitude >5.0) aftershocks from earthquakes with magnitudes >6.0 that took place between 1942 and 1991 in California and Nevada. They found that one-quarter of these large aftershocks occurred from 2 days to 2 months after the main event, nearly one-half occurred within the first two days after the main event, and more than one-half occurred within 20 km of the main shock's epicenter. They also reviewed a case study of the 1985 Mexico City earthquake, which showed that an AWS could have given Mexico City a warning of ∼60 sec before the magnitude 7.6 aftershock that occurred 36 hr after the main event. They deployed a four-station prototype AWS near Landers after a magnitude 7.4 earthquake occurred on June 28, 1992. The aftershock data, collected from July 3-10, showed that the aftershocks in the vicinity of the four stations varied in magnitude from 3.0 to 4.4. Using a two-station detection criterion to minimize false alarms, this AWS reliably discriminated between smaller and larger aftershocks within 3 sec of the origin time of the events. This prototype could have provided 6 sec of warning to Palm Springs and 20 sec of warning to San Bernardino for aftershocks occurring in the main-shock epicentral region.
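The quoted warning times follow from simple travel-time arithmetic: a site gains roughly the S-wave travel time from the source minus the system's detection latency. A minimal sketch, assuming a 3.5 km/s S-wave speed and the 3-sec detection time reported above (the function name and default values are illustrative, not from the report):

```python
def warning_time_s(epicentral_distance_km, s_wave_speed_km_s=3.5, detection_latency_s=3.0):
    """Seconds of warning before S-wave arrival at a site.

    Assumes the alert is issued detection_latency_s after origin time and
    propagates effectively instantaneously; negative values mean no warning.
    """
    return epicentral_distance_km / s_wave_speed_km_s - detection_latency_s

# With these assumed parameters, a site ~80 km from the aftershock zone
# receives roughly 20 s of warning, comparable to the San Bernardino figure.
```

The key design trade-off is that shrinking the detection latency (here via a two-station criterion) directly shrinks the blind zone around the aftershock.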

  17. Federal Labs and Research Centers Benefiting California: 2017 Impact Report for State Leaders.

    Energy Technology Data Exchange (ETDEWEB)

    Koning, Patricia Brady [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-12-01

    Sandia National Laboratories is the largest of the Department of Energy national laboratories with more than 13,000 staff spread across its two main campuses in New Mexico and California. For more than 60 years, the Sandia National Laboratories campus in Livermore, California has delivered cutting-edge science and technology solutions to resolve the nation’s most challenging and complex problems. As a multidisciplinary laboratory, Sandia draws from virtually every science and engineering discipline to address challenges in energy, homeland security, cybersecurity, climate, and biosecurity. Today, collaboration is vital to ensuring that the Lab stays at the forefront of science and technology innovation. Partnerships with industry, state, and local governments, and California universities help drive innovation and economic growth in the region. Sandia contributed to California’s regional and statewide economy with more than $145 million in contracts to California companies, $92 million of which goes to California small businesses. In addition, Sandia engages the community directly by running robust STEM education programs for local schools and administering community giving programs. Meanwhile, investments like the Livermore Valley Open Campus (LVOC), an innovation hub supported by LLNL and Sandia, help catalyze the local economy.

  18. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is (1) the study of earthquakes; (2) the study of the origin, propagation, and energy of seismic phenomena; (3) the prediction of these phenomena; and (4) the investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  19. Can diligent and extensive mapping of faults provide reliable estimates of the expected maximum earthquakes at these faults? No. (Invited)

    Science.gov (United States)

    Bird, P.

    2010-12-01

    The hope expressed in the title question above can be contradicted in 5 ways, listed below. To summarize, an earthquake rupture can be larger than anticipated either because the fault system has not been fully mapped, or because the rupture is not limited to the pre-existing fault network. 1. Geologic mapping of faults is always incomplete due to four limitations: (a) Map-scale limitation: Faults below a certain (scale-dependent) apparent offset are omitted; (b) Field-time limitation: The most obvious fault(s) get(s) the most attention; (c) Outcrop limitation: You can't map what you can't see; and (d) Lithologic-contrast limitation: Intra-formation faults can be tough to map, so they are often assumed to be minor and omitted. If mapping is incomplete, fault traces may be longer and/or better-connected than we realize. 2. Fault trace “lengths” are unreliable guides to maximum magnitude. Fault networks have multiply-branching, quasi-fractal shapes, so fault “length” may be meaningless. Naming conventions for main strands are unclear, and rarely reviewed. Gaps due to Quaternary alluvial cover may not reflect deeper seismogenic structure. Mapped kinks and other “segment boundary asperities” may be only shallow structures. Also, some recent earthquakes have jumped and linked “separate” faults (Landers, California 1992; Denali, Alaska, 2002) [Wesnousky, 2006; Black, 2008]. 3. Distributed faulting (“eventually occurring everywhere”) is predicted by several simple theories: (a) Viscoelastic stress redistribution in plate/microplate interiors concentrates deviatoric stress upward until they fail by faulting; (b) Unstable triple-junctions (e.g., between 3 strike-slip faults) in 2-D plate theory require new faults to form; and (c) Faults which appear to end (on a geologic map) imply distributed permanent deformation. This means that all fault networks evolve and that even a perfect fault map would be incomplete for future ruptures. 4. A recent attempt

  20. Directional topographic site response at Tarzana observed in aftershocks of the 1994 Northridge, California, earthquake: Implications for mainshock motions

    Science.gov (United States)

    Spudich, P.; Hellweg, M.; Lee, W.H.K.

    1996-01-01

The Northridge earthquake caused 1.78 g acceleration in the east-west direction at a site in Tarzana, California, located about 6 km south of the mainshock epicenter. The accelerograph was located atop a hill about 15-m high, 500-m long, and 130-m wide, striking about N78°E. During the aftershock sequence, a temporary array of 21 three-component geophones was deployed in six radial lines centered on the accelerograph, with an average sensor spacing of 35 m. Station COO was located about 2 m from the accelerograph. We inverted aftershock spectra to obtain average relative site response at each station as a function of direction of ground motion. We identified a 3.2-Hz resonance that is a transverse oscillation of the hill (a directional topographic effect). The top/base amplification ratio at 3.2 Hz is about 4.5 for horizontal ground motions oriented approximately perpendicular to the long axis of the hill and about 2 for motions parallel to the hill. This resonance is seen most strongly within 50 m of COO. Other resonant frequencies were also observed. A strong lateral variation in attenuation, probably associated with a fault, caused substantially lower motion at frequencies above 6 Hz at the east end of the hill. There may be some additional scattered waves associated with the fault zone and seen at both the base and top of the hill, causing particle motions (not spectral ratios) at the top of the hill to be rotated about 20° away from the direction transverse to the hill. The resonant frequency, but not the amplitude, of our observed topographic resonance agrees well with theory, even for such a low hill. Comparisons of our observations with theoretical results indicate that the 3D shape of the hill and its internal structure are important factors affecting its response. The strong transverse resonance of the hill does not account for the large east-west mainshock motions. Assuming linear soil response, mainshock east-west motions at the Tarzana accelerograph

  1. Discussing epigenetics in Southern California

    Science.gov (United States)

    2012-01-01

    With the goal of discussing how epigenetic control and chromatin remodeling contribute to the various processes that lead to cellular plasticity and disease, this symposium marks the collaboration between the Institut National de la Santé et de la Recherche Médicale (INSERM) in France and the University of California, Irvine (UCI). Organized by Paolo Sassone-Corsi (UCI) and held at the Beckman Center of the National Academy of Sciences at the UCI campus December 15–16, 2011, this was the first of a series of international conferences on epigenetics dedicated to the scientific community in Southern California. The meeting also served as the official kick off for the newly formed Center for Epigenetics and Metabolism at the School of Medicine, UCI (http://cem.igb.uci.edu). PMID:22414797

  2. Earthquakes resistance of the CEA/Cadarache facilities

    International Nuclear Information System (INIS)

    2001-01-01

The Cadarache Center hosts three types of nuclear installations: experimental reactors, fuel-cycle research laboratories, and facilities for radioactive waste processing and waste encapsulation or solidification. The evolution of standards in the seismic-risk domain led to a new assessment of the earthquake resistance of these installations. This report takes stock of the situation at the end of the year 2000. (A.L.B.)

  3. Impending ionospheric anomaly preceding the Iquique Mw8.2 earthquake in Chile on 2014 April 1

    Science.gov (United States)

    Guo, Jinyun; Li, Wang; Yu, Hongjuan; Liu, Zhimin; Zhao, Chunmei; Kong, Qiaoli

    2015-12-01

To investigate the coupling relationship between great earthquakes and the ionosphere, GPS-derived total electron contents (TECs) from the Center for Orbit Determination in Europe and foF2 data from the Space Weather Prediction Center were used to analyse the impending ionospheric anomalies before the Iquique Mw8.2 earthquake in Chile on 2014 April 1. After eliminating the effects of solar and geomagnetic activities on the ionosphere with a sliding interquartile range using a 27-day window, the TEC analysis shows that negative anomalies occurred on the 15th day prior to the earthquake and positive anomalies appeared on the 5th day before the earthquake. The foF2 analysis for the ionosonde stations Jicamarca, Concepcion, and Ramey shows that the foF2 increased by 40, 50, and 45 per cent, respectively, on the 5th day before the earthquake. The anomalous TEC distribution indicates a widespread TEC decrement over the epicentre lasting 6 hr on the 15th day before the earthquake. On the 5th day before the earthquake, the TEC over the epicentre increased with an amplitude of 15 TECu, and the duration exceeded 6 hr. The anomalies occurred on the side away from the equator. All TEC anomalies in these days were within the bounds of the equatorial anomaly zone, which should be the focal area for monitoring ionospheric anomalies before strong earthquakes. The relationship between ionospheric anomalies and geomagnetic activity was examined by cross wavelet analysis, which implied that the foF2 was not affected by magnetic activity on the 15th and 5th days prior to the earthquake, but the TECs were partially affected by anomalous magnetic activity during some periods of the 5th day prior to the earthquake.
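The sliding interquartile-range screening described above can be sketched as follows; the trailing 27-day window matches the abstract, but the 1.5×IQR bounds and the function name are assumptions for illustration, not the authors' exact procedure:

```python
import numpy as np

def sliding_iqr_anomalies(tec, window=27, k=1.5):
    """Flag daily TEC values outside interquartile bounds of a trailing window.

    For each day, Q1/Q3 are computed over the preceding `window` samples;
    values above Q3 + k*IQR are flagged +1 (positive anomaly), values below
    Q1 - k*IQR are flagged -1 (negative anomaly), all others 0.
    """
    tec = np.asarray(tec, dtype=float)
    flags = np.zeros(len(tec), dtype=int)
    for i in range(window, len(tec)):
        q1, q3 = np.percentile(tec[i - window:i], [25, 75])
        iqr = q3 - q1
        if tec[i] > q3 + k * iqr:
            flags[i] = 1
        elif tec[i] < q1 - k * iqr:
            flags[i] = -1
    return flags
```

Because the window trails the current day, the test is causal: an anomaly on a given day is judged only against the preceding background, which is what makes the method usable for precursor monitoring.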

  4. Refining Southern California Geotherms Using Seismologic, Geologic, and Petrologic Constraints

    Science.gov (United States)

    Thatcher, W. R.; Chapman, D. S.; Allam, A. A.; Williams, C. F.

    2017-12-01

    Lithospheric deformation in tectonically active regions depends on the 3D distribution of rheology, which is in turn critically controlled by temperature. Under the auspices of the Southern California Earthquake Center (SCEC) we are developing a 3D Community Thermal Model (CTM) to constrain rheology and so better understand deformation processes within this complex but densely monitored and relatively well-understood region. The San Andreas transform system has sliced southern California into distinct blocks, each with characteristic lithologies, seismic velocities and thermal structures. Guided by the geometry of these blocks we use more than 250 surface heat-flow measurements to define 13 geographically distinct heat flow regions (HFRs). Model geotherms within each HFR are constrained by averages and variances of surface heat flow q0 and the 1D depth distribution of thermal conductivity (k) and radiogenic heat production (A), which are strongly dependent on rock type. Crustal lithologies are not always well known and we turn to seismic imaging for help. We interrogate the SCEC Community Velocity Model (CVM) to determine averages and variances of Vp, Vs and Vp/Vs versus depth within each HFR. We bound (A, k) versus depth by relying on empirical relations between seismic wave speed and rock type and laboratory and modeling methods relating (A, k) to rock type. Many 1D conductive geotherms for each HFR are allowed by the variances in surface heat flow and subsurface (A, k). An additional constraint on the lithosphere temperature field is provided by comparing lithosphere-asthenosphere boundary (LAB) depths identified seismologically with those defined thermally as the depth of onset of partial melting. Receiver function studies in Southern California indicate LAB depths that range from 40 km to 90 km. Shallow LAB depths are correlated with high surface heat flow and deep LAB with low heat flow. The much-restricted families of geotherms that intersect peridotite
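A 1D conductive geotherm of the kind fit within each HFR can be sketched with the standard steady-state solution for a layer with uniform conductivity and heat production; the parameter values below are illustrative placeholders, not CTM values (the CTM varies A and k with depth and rock type):

```python
def conductive_geotherm(z_m, t0_c=15.0, q0=80e-3, k=2.5, a=1.0e-6):
    """Temperature (deg C) at depth z_m for a uniform-property layer.

    T(z) = T0 + (q0/k) z - (A/(2k)) z^2, with surface temperature T0 (deg C),
    surface heat flow q0 (W/m^2), thermal conductivity k (W/m/K), and
    radiogenic heat production A (W/m^3).
    """
    return t0_c + (q0 / k) * z_m - (a / (2.0 * k)) * z_m ** 2

# e.g. conductive_geotherm(10e3) gives the temperature at 10 km depth
# for these assumed (q0, k, A) values.
```

The quadratic term shows why the surface heat-flow and heat-production bounds matter: two HFRs with the same q0 but different crustal A predict very different mid-crustal temperatures, and hence different rheologies and LAB depths.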

  5. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  6. Slip deficit on the San Andreas fault at Parkfield, California, as revealed by inversion of geodetic data.

    Science.gov (United States)

    Segall, P; Harris, R

    1986-09-26

    A network of geodetic lines spanning the San Andreas fault near the rupture zone of the 1966 Parkfield, California, earthquake (magnitude M = 6) has been repeatedly surveyed since 1959. In the study reported here the average rates of line-length change since 1966 were inverted to determine the distribution of interseismic slip rate on the fault. These results indicate that the Parkfield rupture surface has not slipped significantly since 1966. Comparison of the geodetically determined seismic moment of the 1966 earthquake with the interseismic slip-deficit rate suggests that the strain released by the latest shock will most likely be restored between 1984 and 1989, although this may not occur until 1995. These results lend independent support to the earlier forecast of an M = 6 earthquake near Parkfield within 5 years of 1988.

  7. Environmental Survey preliminary report, Stanford Linear Accelerator Center, Stanford, California

    Energy Technology Data Exchange (ETDEWEB)

    1988-07-01

    This report presents the preliminary findings from the first phase of the Survey of the US Department of Energy (DOE) Stanford Linear Accelerator Center (SLAC) at Stanford, California, conducted February 29 through March 4, 1988. The Survey is being conducted by an interdisciplinary team of environmental specialists, led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. Individual team components are being supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with the SLAC. The Survey covers all environmental media and all areas of environmental regulation and is being performed in accordance with the DOE Environmental Survey Manual. This phase of the Survey involves the review of existing site environmental data, observations of the operations at the SLAC, and interviews with site personnel. The Survey team is developing a Sampling and Analysis Plan to assist in further assessing certain of the environmental problems identified during its on-site activities. The Sampling and Analysis Plan will be executed by a DOE National Laboratory or a support contractor. When completed, the results will be incorporated into the Environmental Survey Interim Report for the SLAC facility. The Interim Report will reflect the final determinations of the SLAC Survey. 95 refs., 25 figs., 25 tabs.

  8. Environmental Survey preliminary report, Stanford Linear Accelerator Center, Stanford, California

    International Nuclear Information System (INIS)

    1988-07-01

    This report presents the preliminary findings from the first phase of the Survey of the US Department of Energy (DOE) Stanford Linear Accelerator Center (SLAC) at Stanford, California, conducted February 29 through March 4, 1988. The Survey is being conducted by an interdisciplinary team of environmental specialists, led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. Individual team components are being supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with the SLAC. The Survey covers all environmental media and all areas of environmental regulation and is being performed in accordance with the DOE Environmental Survey Manual. This phase of the Survey involves the review of existing site environmental data, observations of the operations at the SLAC, and interviews with site personnel. The Survey team is developing a Sampling and Analysis Plan to assist in further assessing certain of the environmental problems identified during its on-site activities. The Sampling and Analysis Plan will be executed by a DOE National Laboratory or a support contractor. When completed, the results will be incorporated into the Environmental Survey Interim Report for the SLAC facility. The Interim Report will reflect the final determinations of the SLAC Survey. 95 refs., 25 figs., 25 tabs

  9. The 2008 Wells, Nevada earthquake sequence: Source constraints using calibrated multiple event relocation and InSAR

    Science.gov (United States)

    Nealy, Jennifer; Benz, Harley M.; Hayes, Gavin; Berman, Eric; Barnhart, William

    2017-01-01

The 2008 Wells, NV earthquake represents the largest domestic event in the conterminous U.S. outside of California since the October 1983 Borah Peak earthquake in southern Idaho. We present an improved catalog, complete to magnitude 1.6, of the foreshock-aftershock sequence, supplementing the current U.S. Geological Survey (USGS) Preliminary Determination of Epicenters (PDE) catalog with 1,928 well-located events. To create this catalog, both subspace and kurtosis detectors were used to obtain an initial set of earthquakes and associated locations. The latter were then calibrated through the hypocentroidal decomposition method and relocated using the BayesLoc relocation technique. We additionally perform a finite fault slip analysis of the mainshock using InSAR observations. By combining the relocated sequence with the finite fault analysis, we show that the aftershocks occur primarily updip and along the southwestern edge of the zone of maximum slip. The aftershock locations illuminate areas of post-mainshock strain increase; aftershock depths, ranging from 5 to 16 km, are consistent with InSAR imaging, which shows that the Wells earthquake was a buried source with no observable near-surface offset.

  10. Seismic Imaging of the West Napa Fault in Napa, California

    Science.gov (United States)

    Goldman, M.; Catchings, R.; Chan, J. H.; Sickler, R. R.; Nevitt, J. M.; Criley, C.

    2017-12-01

In October 2016, we acquired high-resolution P- and S-wave seismic data along a 120-m-long, SW-NE-trending profile in Napa, California. Our seismic survey was designed to image a strand of the West Napa Fault Zone (WNFZ), which ruptured during the 24 August 2014 Mw 6.0 South Napa earthquake. We separately acquired P- and S-wave data at every station using multiple hammer hits, which were edited and stacked into individual shot gathers in the lab. Each shot was co-located with and recorded by 118 P-wave (40-Hz) geophones and 180 S-wave (4.5-Hz) geophones, each spaced at 1 m. We developed P- and S-wave tomographic velocity models, as well as Poisson's ratio and Vp/Vs models. We observed a well-defined zone of elevated Vp/Vs ratios below about 10 m depth, centered beneath the observed surface rupture. P-wave reflection images show that the fault forms a flower structure in the upper few tens of meters. This method has been shown to delineate fault structures even in areas of rough terrain.
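The Poisson's ratio model mentioned above is derivable pointwise from the Vp and Vs tomograms; a minimal sketch using the standard elastic relation (the velocity values in the comment are placeholders, not values from this survey):

```python
def poissons_ratio(vp, vs):
    """Poisson's ratio from P- and S-wave speeds (any consistent units).

    nu = (Vp^2 - 2 Vs^2) / (2 (Vp^2 - Vs^2)), valid for Vp/Vs > sqrt(2).
    """
    r2 = (vp / vs) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

# For Vp/Vs = sqrt(3) ~ 1.73 (a typical crustal value) this gives 0.25;
# elevated Vp/Vs, as in a fluid-rich fault zone, pushes nu toward 0.5.
```

This is why a zone of elevated Vp/Vs beneath the rupture trace maps directly into a zone of elevated Poisson's ratio in the companion model.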

  11. A new Bayesian Inference-based Phase Associator for Earthquake Early Warning

    Science.gov (United States)

    Meier, Men-Andrin; Heaton, Thomas; Clinton, John; Wiemer, Stefan

    2013-04-01

State-of-the-art network-based Earthquake Early Warning (EEW) systems can provide warnings for large magnitude 7+ earthquakes. Although regions in the direct vicinity of the epicenter will not receive warnings prior to damaging shaking, real-time event characterization is available before the destructive S-wave arrival across much of the strongly affected region. In contrast, for the more frequent medium-size events, such as the devastating 1994 Mw6.7 Northridge, California, earthquake, providing timely warning to the smaller damage zone is more difficult. For such events the "blind zone" of current systems (e.g. the CISN ShakeAlert system in California) is similar in size to the area over which severe damage occurs. We propose a faster and more robust Bayesian inference-based event associator that, in contrast to the current standard associators (e.g. Earthworm Binder), is tailored to EEW and exploits information other than phase arrival times alone. In particular, the associator potentially allows for reliable automated event association with as few as two observations, which, compared to the ShakeAlert system, would speed up the real-time characterizations by about ten seconds and thus reduce the blind zone area by up to 80%. We compile an extensive data set of regional and teleseismic earthquake and noise waveforms spanning a wide range of earthquake magnitudes and tectonic regimes. We pass these waveforms through a causal real-time filterbank with passband filters between 0.1 and 50 Hz and, updating every second from the event detection, extract the maximum amplitudes in each frequency band. Using this dataset, we define distributions of amplitude maxima in each passband as a function of epicentral distance and magnitude. For the real-time data, we pass incoming broadband and strong motion waveforms through the same filterbank and extract an evolving set of maximum amplitudes in each passband. We use the maximum amplitude distributions to check
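The causal filterbank with evolving per-band maxima described above can be sketched as follows; the band edges, filter order, and sampling rate are assumptions for illustration, since the abstract specifies only the 0.1-50 Hz range:

```python
import numpy as np
from scipy.signal import butter, sosfilt

def filterbank_maxima(waveform, fs=200.0, edges=(0.1, 0.5, 2.0, 8.0, 32.0, 50.0)):
    """Causal band-pass filterbank with running absolute maxima.

    Each row of the output is the evolving maximum absolute amplitude in
    one passband, so values can only grow as more of the waveform arrives,
    mimicking the per-update amplitude features used for association.
    """
    waveform = np.asarray(waveform, dtype=float)
    maxima = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        filtered = sosfilt(sos, waveform)  # causal: no zero-phase filtering
        maxima.append(np.maximum.accumulate(np.abs(filtered)))
    return np.vstack(maxima)  # shape: (n_bands, n_samples)
```

The causality constraint is the essential point: `sosfilt` (unlike zero-phase `sosfiltfilt`) uses only past samples, so the feature vector at any instant is one a real-time system could actually have computed.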

  12. A 100-year average recurrence interval for the San Andreas fault at Wrightwood, California.

    Science.gov (United States)

    Fumal, T E; Schwartz, D P; Pezzopane, S K; Weldon, R J

    1993-01-08

    Evidence for five large earthquakes during the past five centuries along the San Andreas fault zone 70 kilometers northeast of Los Angeles, California, indicates that the average recurrence interval and the temporal variability are significantly smaller than previously thought. Rapid sedimentation during the past 5000 years in a 150-meter-wide structural depression has produced a greater than 21-meter-thick sequence of debris flow and stream deposits interbedded with more than 50 datable peat layers. Fault scarps, colluvial wedges, fissure infills, upward termination of ruptures, and tilted and folded deposits above listric faults provide evidence for large earthquakes that occurred in A.D. 1857, 1812, and about 1700, 1610, and 1470.
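The roughly 100-year average follows directly from the five event dates given in the abstract:

```python
# Earthquake dates inferred at Wrightwood (A.D.); the two oldest are approximate.
dates = [1470, 1610, 1700, 1812, 1857]

# Successive inter-event times and their mean.
intervals = [later - earlier for earlier, later in zip(dates, dates[1:])]
mean_interval = sum(intervals) / len(intervals)
# intervals: [140, 90, 112, 45] years; mean ~97 yr, i.e. roughly a century
```

The spread of the individual intervals (45 to 140 yr) is what the abstract means by temporal variability being smaller than previously thought: the intervals cluster around the mean rather than ranging over several centuries.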

  13. How Can Museum Exhibits Enhance Earthquake and Tsunami Hazard Resiliency?

    Science.gov (United States)

    Olds, S. E.

    2015-12-01

    Creating a natural disaster-ready community requires interoperating scientific, technical, and social systems. In addition to the technical elements that need to be in place, communities and individuals need to be prepared to react when a natural hazard event occurs. Natural hazard awareness and preparedness training and education often takes place through informal learning at science centers and formal k-12 education programs as well as through awareness raising via strategically placed informational tsunami warning signs and placards. Museums and science centers are influential in raising science literacy within a community, however can science centers enhance earthquake and tsunami resiliency by providing hazard science content and preparedness exhibits? Museum docents and informal educators are uniquely situated within the community. They are transmitters and translators of science information to broad audiences. Through interaction with the public, docents are well positioned to be informants of the knowledge beliefs, and feelings of science center visitors. They themselves are life-long learners, both constantly learning from the museum content around them and sharing this content with visitors. They are also members of a community where they live. In-depth interviews with museum informal educators and docents were conducted at a science center in coastal Pacific Northwest. This region has a potential to be struck by a great 9+ Mw earthquake and subsequent tsunami. During the interviews, docents described how they applied learning from natural hazard exhibits at a science visitor center to their daily lives. During the individual interviews, the museum docents described their awareness (knowledge, attitudes, and behaviors) of natural hazards where they live and work, the feelings evoked as they learned about their hazard vulnerability, the extent to which they applied this learning and awareness to their lives, such as creating an evacuation plan, whether

  14. Spine surgery in Nepal: the 2015 earthquake.

    Science.gov (United States)

    Sutterlin, Chester E

    2015-12-01

At noon on Saturday, 25 April 2015, a 7.8 magnitude earthquake struck Nepal. It was centered in the Himalaya northwest of Kathmandu, the capital, a city of over 1 million people. The violent tremors were felt as far away as New Delhi, India, 1,000 km from the epicenter, but the worst of the destructive force was experienced in the heavily populated Kathmandu valley and in the remote mountainous villages of the Himalaya. Ancient temples crumbled; poorly constructed buildings collapsed; men, women, and children were trapped and injured, sometimes fatally. Avalanches killed mountain climbers, Sherpa guides, and porters at Everest base camp (EBC). The death toll to date exceeds 8,600, with as many as 20,000 injured. Spinal Health International (SHI), a nonprofit volunteer organization, has been active in Nepal in past years and responded to requests by Nepali spine surgeons for assistance with traumatic spine injury victims following the earthquake. SHI volunteers were present during the 2nd major earthquake, of magnitude 7.3, on 12 May 2015. Past and current experiences in Nepal will be presented.

  15. Seismic risk analysis for General Electric Plutonium Facility, Pleasanton, California. Final report, part II

    International Nuclear Information System (INIS)

    1980-01-01

This report is the second of a two-part study addressing the seismic risk or hazard of the special nuclear materials (SNM) facility of the General Electric Vallecitos Nuclear Center at Pleasanton, California. The Part I companion to this report, dated July 31, 1978, presented the seismic hazard at the site resulting from exposure to earthquakes on the Calaveras, Hayward, and San Andreas faults and, additionally, from smaller unassociated earthquakes that could not be attributed to these specific faults. However, while this study was in progress, certain additional geologic information became available that could be interpreted in terms of the existence of a nearby fault. Although substantial geologic investigations were subsequently undertaken, the existence of this postulated fault, called the Verona Fault, remained very controversial. The purpose of the Part II study was to assume the existence of such a capable fault and, under this assumption, to examine the loads that the fault could impose on the SNM facility. This report first reviews the geologic setting with a focus on specifying sufficient geologic parameters to characterize the postulated fault. The report next presents the methodology used to calculate the vibratory ground motion hazard. Because of the complexity of the fault geometry, a slightly different methodology is used here than in the Part I report. This section ends with the results of the calculation applied to the SNM facility. Finally, the report presents the methodology and results of the rupture hazard calculation.

  16. Earthquake prediction in Japan and natural time analysis of seismicity

    Science.gov (United States)

    Uyeda, S.; Varotsos, P.

    2011-12-01

Where 'seismic electric signals' (SES) data are available, as in Greece, natural time analysis of the seismicity after the initiation of the SES allows determination of the time window of the impending mainshock through the evolution of the value of κ1 itself. This was found to work also for the 1989 M7.1 Loma Prieta earthquake. If SES data are not available, we rely solely on the evolution of the fluctuations of κ1, obtained by computing κ1 values in a natural time window of fixed length sliding through the earthquake catalog. The fluctuations of the order parameter, in terms of variability (i.e., standard deviation divided by average), were found to increase dramatically when approaching the 11 March M9 super-giant earthquake. Such an increase was also found for the M7.1 Kobe earthquake in 1995, the M8.0 Tokachi-oki earthquake in 2003, and the Landers and Hector Mine earthquakes in Southern California. It is worth mentioning that this increase is obtained straightforwardly from ordinary earthquake catalogs without any adjustable parameters.
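As a hedged sketch of the κ1 computation that natural time analysis is built on, using the standard definition κ1 = ⟨χ²⟩ − ⟨χ⟩² with energy-normalized weights (the function name and the use of unit energies in the test are illustrative; real catalogs weight by seismic moment or released energy):

```python
import numpy as np

def kappa1(energies):
    """Natural-time order parameter kappa_1 = <chi^2> - <chi>^2.

    The k-th of N events is assigned natural time chi_k = k/N, and the
    averages are weighted by normalized event energies p_k = E_k / sum(E).
    """
    e = np.asarray(energies, dtype=float)
    n = len(e)
    chi = np.arange(1, n + 1) / n
    p = e / e.sum()
    return np.sum(p * chi ** 2) - np.sum(p * chi) ** 2
```

For a catalog with uniform event energies, κ1 approaches 1/12 ≈ 0.083 (the variance of a uniform distribution on (0, 1]); deviations of κ1 from this value, and the variability of κ1 in a sliding window, are the quantities tracked before mainshocks.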

  17. The characteristics of building damage from historical large earthquakes in Kyoto

    Science.gov (United States)

    Nishiyama, Akihito

    2016-04-01

Kyoto city, located in the northern part of the Kyoto basin in Japan, has a history of more than 1,200 years since it was initially constructed. The city has been a populated area with many buildings and the center of politics, economy, and culture in Japan for nearly 1,000 years, and some of these buildings are now inscribed as world cultural heritage. Kyoto has experienced six damaging large earthquakes during the historical period: in 976, 1185, 1449, 1596, 1662, and 1830. Among these, the last three earthquakes, which caused severe damage in Kyoto, occurred during the period in which the urban area had expanded. These earthquakes are considered to be inland earthquakes that occurred around the Kyoto basin. The damage distribution in Kyoto from historical large earthquakes is strongly controlled by ground conditions and the earthquake resistance of buildings rather than by distance from the estimated source fault. Therefore, it is necessary to consider not only the strength of ground shaking but also the condition of buildings, such as the years elapsed since construction or last repair, in order to estimate seismic intensity distributions from historical earthquakes in Kyoto more accurately and reliably. The resulting seismic intensity maps would be helpful for reducing and mitigating disaster from future large earthquakes.

  18. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    Science.gov (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and with approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering incurred building losses in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification of the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, the purchase of larger re-insurance covers and the development of a claim-processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses would not be indemnified but would instead be calculated directly on the basis of indexed ground-motion levels and damage. The immediate improvement of a parametric insurance model over the existing one would be the elimination of the claim processing
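As a rough illustration of how a parametric scheme differs from indemnity insurance, the sketch below pays a fixed fraction of the sum insured once a measured ground-motion index crosses preset thresholds, with no claim adjustment involved. The thresholds and payout fractions are hypothetical, not TCIP parameters.

```python
def parametric_payout(intensity, schedule=((6.0, 0.25), (7.0, 0.50), (8.0, 1.00)),
                      sum_insured=1.0):
    """Hypothetical parametric trigger: pay a fixed fraction of the sum insured
    once the measured ground-motion index reaches each threshold in `schedule`
    (pairs of (threshold, payout fraction), in ascending order)."""
    fraction = 0.0
    for threshold, frac in schedule:
        if intensity >= threshold:
            fraction = frac
    return sum_insured * fraction
```

Because the payout depends only on an observed index, settlement can be nearly immediate, which is the claim-processing advantage the abstract points to.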

  19. Seismogeodesy for rapid earthquake and tsunami characterization

    Science.gov (United States)

    Bock, Y.

    2016-12-01

    Rapid estimation of earthquake magnitude and fault mechanism is critical for earthquake and tsunami warning systems. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. These methods are well developed for ocean basin-wide warnings but are not timely enough to protect vulnerable populations and infrastructure from the effects of local tsunamis, where waves may arrive within 15-30 minutes of earthquake onset time. Direct measurements of displacements by GPS networks at subduction zones allow for rapid magnitude and slip estimation in the near-source region, unaffected by the instrumental limitations and magnitude saturation experienced by local seismic networks. However, GPS displacements by themselves are too noisy for strict earthquake early warning (P-wave detection). Optimally combining high-rate GPS and seismic data (in particular, accelerometers that do not clip), referred to as seismogeodesy, provides a broadband instrument that does not clip in the near field, is impervious to magnitude saturation, and provides accurate static and dynamic displacements and velocities in real time. Here we describe a NASA-funded effort to integrate GPS and seismogeodetic observations as part of NOAA's Tsunami Warning Centers in Alaska and Hawaii. It consists of a series of plug-in modules that allow for a hierarchy of rapid seismogeodetic products, including automatic P-wave picking, hypocenter estimation, S-wave prediction, magnitude scaling relationships based on P-wave amplitude (Pd) and peak ground displacement (PGD), finite-source CMT solutions and fault slip models as input for tsunami warnings and models. For the NOAA/NASA project, the modules are being integrated into an existing USGS Earthworm environment, currently limited to traditional seismic data. We are focused on a network of

  20. The California Seafloor Mapping Program — Providing science and geospatial data for California's State Waters

    Science.gov (United States)

    Johnson, S. Y.; Cochrane, G. R.; Golden, N. E.; Dartnell, P.; Hartwell, S. R.; Cochran, S. A.; Watt, J. T.

    2017-12-01

    The California Seafloor Mapping Program (CSMP) is a collaborative effort to develop comprehensive bathymetric, geologic, and habitat maps and data for California's State Waters, which extend for 1,350 km from the shoreline to 5.6 km offshore. CSMP began in 2007 when the California Ocean Protection Council and NOAA allocated funding for high-resolution bathymetric mapping to support the California Marine Life Protection Act and update nautical charts. Collaboration and support from the USGS and other partners have led to the development and dissemination of one of the world's largest seafloor-mapping datasets. CSMP data collection includes: (1) high-resolution bathymetric and backscatter mapping using swath sonar sensors; (2) "ground-truth" imaging from a sled mounted with video and still cameras; (3) high-resolution seismic-reflection profiling at 1 km line spacing. Processed data are all publicly available. Additionally, 25 USGS map and data sets covering one-third of California's coast have been published. Each publication contains 9 to 12 PDF map sheets (1:24,000 scale), an explanatory pamphlet, and a catalog of digital geospatial data layers (about 15 to 25 per map area) with web services. Map sheets display bathymetry, backscatter, perspective views, habitats, ground-truth imagery, seismic profiles, sediment distribution and thickness, and onshore-offshore geology. The CSMP goal is to serve a large constituency, ranging from senior GIS analysts in large agencies, to local governments with limited resources, to non-governmental organizations, the private sector, and concerned citizens. CSMP data and publications provide essential science and data for ocean and coastal management, stimulate and enable research, and raise public education and awareness of coastal and ocean issues. 
    Specific applications include: delineation and designation of marine protected areas; characterization and modeling of benthic habitats and ecosystems; updating nautical charts; earthquake hazard

  1. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with the tremor rate increasing exponentially with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on the fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatra earthquake, the 2010 Maule earthquake in Chile and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (i.e., the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. This relationship is consistent with the well-known dependence of the b-value on stress. It suggests that the probability of a tiny rock failure expanding into a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
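The b-value trend reported above is conventionally measured with the Aki/Utsu maximum-likelihood estimator, b = log10(e)/(⟨M⟩ − Mc + ΔM/2). A minimal sketch; the completeness magnitude Mc and the catalog bin width ΔM are inputs the analyst must supply, and the default ΔM here is only a common convention:

```python
import numpy as np

def b_value(magnitudes, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value for events at or above completeness mc.
    dm is the catalog's magnitude binning; dm/2 corrects for discretization
    (pass dm=0 for continuous magnitudes)."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))
```

A lower b-value means a larger fraction of big events, which is the sense in which the abstract links high tidal stress to elevated earthquake potential.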

  2. Rapid Modeling of and Response to Large Earthquakes Using Real-Time GPS Networks (Invited)

    Science.gov (United States)

    Crowell, B. W.; Bock, Y.; Squibb, M. B.

    2010-12-01

    Real-time GPS networks have the advantage of capturing motions throughout the entire earthquake cycle (interseismic, seismic, coseismic, postseismic) and, because of this, are ideal for real-time monitoring of fault slip. Real-time GPS networks provide the perfect supplement to seismic networks, which operate with lower noise and higher sampling rates than GPS networks but only measure accelerations or velocities, putting them at a severe disadvantage for ascertaining the full extent of slip during a large earthquake in real time. Here we report on two examples of rapid modeling of recent large earthquakes near large regional real-time GPS networks. The first utilizes Japan’s GEONET, consisting of about 1200 stations, during the 2003 Mw 8.3 Tokachi-Oki earthquake about 100 km offshore Hokkaido Island; the second investigates the 2010 Mw 7.2 El Mayor-Cucapah earthquake recorded by more than 100 stations in the California Real Time Network. The principal components of strain were computed throughout the networks and utilized as a trigger to initiate earthquake modeling. Total displacement waveforms were then computed in a simulated real-time fashion using a real-time network adjustment algorithm that fixes a station far away from the rupture to obtain a stable reference frame. Initial peak ground displacement measurements can then be used to obtain an initial size through scaling relationships. Finally, a full coseismic model of the event can be run minutes after the event, given predefined fault geometries, allowing emergency first responders and researchers to pinpoint the regions of highest damage. Furthermore, we are also investigating using total displacement waveforms for real-time moment tensor inversions to look at spatiotemporal variations in slip.
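The scaling step above can be sketched by inverting an empirical relation of the form log10(PGD) = A + B·Mw + C·Mw·log10(R), where PGD is peak ground displacement and R is hypocentral distance. The coefficients below are illustrative assumptions of the kind fit by regression to GPS records of past large earthquakes, not necessarily those used by the authors:

```python
import numpy as np

# Illustrative coefficients for log10(PGD_cm) = A + B*Mw + C*Mw*log10(R_km);
# in practice A, B, C are regressed from GPS observations of past large events.
A, B, C = -4.434, 1.047, -0.138

def magnitude_from_pgd(pgd_cm, r_km):
    """Invert the scaling law for Mw given peak ground displacement (cm) and
    hypocentral distance (km); averaging over many stations stabilizes the estimate."""
    pgd = np.asarray(pgd_cm, dtype=float)
    r = np.asarray(r_km, dtype=float)
    mw = (np.log10(pgd) - A) / (B + C * np.log10(r))
    return float(np.mean(mw))
```

Because GPS displacement does not saturate, this estimate keeps growing with event size where seismic local-magnitude scales level off.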

  3. Episodic radon changes in subsurface soil gas along active faults and possible relation to earthquakes

    International Nuclear Information System (INIS)

    King, C.

    1980-01-01

    Subsurface soil gas along active faults in central California has been continuously monitored by the Track Etch method to test whether its radon-isotope content may show any premonitory changes useful for earthquake prediction. The monitoring network was installed in May 1975 and has since been gradually expanded to consist of more than 60 stations along a 380-km section of the San Andreas fault system between Santa Rosa and Cholame. This network has recorded several episodes, each lasting several weeks to several months, during which the radon concentration increased by a factor of approximately 2 above average along some long, but limited, fault segments (approximately 100 km). These episodes occurred in different seasons and do not appear to be systematically related to changes in meteorological conditions. However, they coincided reasonably well in time and space with larger local earthquakes above a threshold magnitude of about 4.0. These episodic radon changes may be caused by a changing outgassing rate in the fault zones in response to episodic strain changes, which incidentally caused the earthquakes.

  4. Triggered Seismicity in Utah from the November 3, 2002, Denali Fault Earthquake

    Science.gov (United States)

    Pankow, K. L.; Nava, S. J.; Pechmann, J. C.; Arabasz, W. J.

    2002-12-01

    Coincident with the arrival of the surface waves from the November 3, 2002, Mw 7.9 Denali Fault, Alaska, earthquake (DFE), the University of Utah Seismograph Stations (UUSS) regional seismic network detected a marked increase in seismicity along the Intermountain Seismic Belt (ISB) in central and north-central Utah. The number of earthquakes per day in Utah located automatically by the UUSS's Earthworm system in the week following the DFE was approximately double the long-term average during the preceding nine months. From these preliminary data, the increased seismicity appears to be characterized by small-magnitude events (M ≤ 3.2) concentrated in five distinct spatial clusters within the ISB between 38.75° and 42.0° N. The first of these earthquakes was an M 2.2 event located ~20 km east of Salt Lake City, Utah, which occurred during the arrival of the Love waves from the DFE. The increase in Utah earthquake activity at the time of the arrival of the surface waves from the DFE suggests that these surface waves triggered earthquakes in Utah at distances of more than 3,000 km from the source. We estimated the peak dynamic shear stress caused by these surface waves from measurements of their peak vector velocities at 43 recording sites: 37 strong-motion stations of the Advanced National Seismic System and six broadband stations. (The records from six other broadband instruments in the region of interest were clipped.) The estimated peak stresses ranged from 1.2 bars to 3.5 bars with a mean of 2.3 bars, and generally occurred during the arrival of Love waves of ~15 sec period. These peak dynamic shear stress estimates are comparable to those obtained from recordings of the 1992 Mw 7.3 Landers, California, earthquake in regions where the Landers earthquake triggered increased seismicity. We plan to present more complete analyses of UUSS seismic network data, further testing our hypothesis that the DFE remotely triggered seismicity in Utah. This hypothesis is
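Peak dynamic shear stress estimates of this kind follow from the standard plane-wave approximation σ ≈ Gv/c: shear modulus times peak particle velocity divided by phase velocity. A minimal sketch; the default G and c values are typical assumptions for crustal rock and ~15 s Love waves, not the authors' exact parameters:

```python
def peak_dynamic_stress_bars(pgv_m_s, shear_modulus_pa=3.3e10,
                             phase_velocity_m_s=3500.0):
    """Plane-wave estimate of peak dynamic shear stress: sigma = G * v / c.
    Input is peak ground (particle) velocity in m/s; result is in bars."""
    sigma_pa = shear_modulus_pa * pgv_m_s / phase_velocity_m_s
    return sigma_pa / 1.0e5  # 1 bar = 1e5 Pa
```

Under these assumed values, a peak velocity of a few cm/s maps to a stress of a couple of bars, the order of magnitude quoted in the abstract.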

  5. Seismo-Lineament Analysis Method (SLAM) Applied to the South Napa Earthquake

    Science.gov (United States)

    Worrell, V. E.; Cronin, V. S.

    2014-12-01

    We used the seismo-lineament analysis method (SLAM; http://bearspace.baylor.edu/Vince_Cronin/www/SLAM/) to "predict" the location of the fault that produced the M 6.0 South Napa earthquake of 24 August 2014, using hypocenter and focal mechanism data from NCEDC (http://www.ncedc.org/ncedc/catalog-search.html) and a digital elevation model from the USGS National Elevation Dataset (http://viewer.nationalmap.gov/viewer/). The ground-surface trace of the causative fault (i.e., the Browns Valley strand of the West Napa fault zone; Bryant, 2000, 1982) and virtually all of the ground-rupture sites reported by the USGS and California Geological Survey (http://www.eqclearinghouse.org/2014-08-24-south-napa/) were located within the north-striking seismo-lineament. We also used moment tensors published online by the USGS and GCMT (http://comcat.cr.usgs.gov/earthquakes/eventpage/nc72282711#scientific_moment-tensor) as inputs to SLAM and found that their northwest-striking seismo-lineaments correlated spatially with the causative fault. We concluded that SLAM could have been used as soon as these mechanism solutions were available to help direct the search for the trace of the causative fault and possible rupture-related damage. We then considered whether the seismogenic fault could have been identified using SLAM prior to the 24 August event, based on the focal mechanisms of smaller prior earthquakes reported by the NCEDC or ISC (http://www.isc.ac.uk). Seismo-lineaments from three M~3.5 events from 1990 and 2012, located in the Vallejo-Crockett area, correlate spatially with the Napa County Airport strand of the West Napa fault and extend along strike toward the Browns Valley strand (Bryant, 2000, 1982). Hence, we might have used focal mechanisms from smaller earthquakes to establish that the West Napa fault is likely seismogenic prior to the South Napa earthquake. Early recognition that a fault with a mapped ground-surface trace is seismogenic, based on smaller earthquakes

  6. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    Directory of Open Access Journals (Sweden)

    M. Rebuffat

    1995-06-01

    Full Text Available A series of multimedia monographs concerning the main seismic events that have affected the Italian territory are being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and promote public earthquake education. Producing the monographs, developed in ARC/INFO and working under UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: 1) its historical-cultural context (the first destructive seismic event after the unification of Italy); 2) its features (a volcanic earthquake); 3) the socioeconomic consequences for such an important seaside resort.

  7. OMG Earthquake! Can Twitter improve earthquake response?

    Science.gov (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose from the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information

  8. Detection and Mapping of the September 2017 Mexico Earthquakes Using DAS Fiber-Optic Infrastructure Arrays

    Science.gov (United States)

    Karrenbach, M. H.; Cole, S.; Williams, J. J.; Biondi, B. C.; McMurtry, T.; Martin, E. R.; Yuan, S.

    2017-12-01

    Fiber-optic distributed acoustic sensing (DAS) uses conventional telecom fibers for a wide variety of monitoring purposes. Fiber-optic arrays can be located along pipelines for leak detection, along borders and perimeters to detect and locate intruders, or along railways and roadways to monitor traffic and identify and manage incidents. DAS can also be used to monitor oil and gas reservoirs and to detect earthquakes. Because thousands of such arrays are deployed worldwide and are acquiring data continuously, they can be a valuable source of data for earthquake detection and location, and could potentially provide important information to earthquake early-warning systems. In this presentation, we show that DAS arrays in Mexico and the United States detected the M8.1 and M7.2 Mexico earthquakes in September 2017. At Stanford University, we have deployed a 2.4 km fiber-optic DAS array in a figure-eight pattern, with 600 channels spaced 4 meters apart. Data have been recorded continuously since September 2016. Over 800 earthquakes from across California have been detected and catalogued. Distant teleseismic events have also been recorded, including the two Mexican earthquakes. In Mexico, fiber-optic arrays attached to pipelines also detected these two events. Because of the length of these arrays and their proximity to the event locations, we can not only detect the earthquakes but also make location estimates, potentially in near real time. In this presentation, we review the data recorded for these two events at Stanford and in Mexico. We compare the waveforms recorded by the DAS arrays to those recorded by traditional earthquake sensor networks. Using the wide coverage provided by the pipeline arrays, we estimate the event locations. Such fiber-optic DAS networks can potentially play a role in earthquake early-warning systems, allowing actions to be taken to minimize the impact of an earthquake on critical infrastructure components. While many such fiber

  9. Seismic Evidence for Conjugate Slip and Block Rotation Within the San Andreas Fault System, Southern California

    Science.gov (United States)

    Nicholson, Craig; Seeber, Leonardo; Williams, Patrick; Sykes, Lynn R.

    1986-08-01

    The pattern of seismicity in southern California indicates that much of the activity is presently occurring on secondary structures, several of which are oriented nearly orthogonal to the strikes of the major through-going faults. Slip along these secondary transverse features is predominantly left-lateral and is consistent with the reactivation of conjugate faults by the current regional stress field. Near the intersection of the San Jacinto and San Andreas faults, however, these active left-lateral faults appear to define a set of small crustal blocks, which in conjunction with both normal and reverse faulting earthquakes, suggests contemporary clockwise rotation as a result of regional right-lateral shear. Other left-lateral faults representing additional rotating block systems are identified in adjacent areas from geologic and seismologic data. Many of these structures predate the modern San Andreas system and may control the pattern of strain accumulation in southern California. Geodetic and paleomagnetic evidence confirm that block rotation by strike-slip faulting is nearly ubiquitous, particularly in areas where shear is distributed, and that it accommodates both short-term elastic and long-term nonelastic strain. A rotating block model accounts for a number of structural styles characteristic of strike-slip deformation in California, including: variable slip rates and alternating transtensional and transpressional features observed along strike of major wrench faults; domains of evenly-spaced antithetic faults that terminate against major fault boundaries; continued development of bends in faults with large lateral displacements; anomalous focal mechanisms; and differential uplift in areas otherwise expected to experience extension and subsidence. Since block rotation requires a detachment surface at depth to permit rotational movement, low-angle structures like detachments, of either local or regional extent, may be involved in the contemporary strike

  10. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from between 1901 and 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  11. California Ocean Uses Atlas: Industrial sector

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset is a result of the California Ocean Uses Atlas Project: a collaboration between NOAA's National Marine Protected Areas Center and Marine Conservation...

  12. California Ocean Uses Atlas: Fishing sector

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset is a result of the California Ocean Uses Atlas Project: a collaboration between NOAA's National Marine Protected Areas Center and Marine Conservation...

  13. Earthquake Early Warning Beta Users: Java, Modeling, and Mobile Apps

    Science.gov (United States)

    Strauss, J. A.; Vinci, M.; Steele, W. P.; Allen, R. M.; Hellweg, M.

    2014-12-01

    Earthquake Early Warning (EEW) is a system that can provide a few to tens of seconds of warning prior to ground shaking at a user's location. The goal and purpose of such a system is to reduce, or minimize, the damage, costs, and casualties resulting from an earthquake. A demonstration earthquake early warning system (ShakeAlert) is undergoing testing in the United States by the UC Berkeley Seismological Laboratory, Caltech, ETH Zurich, University of Washington, the USGS, and beta users in California and the Pacific Northwest. The beta users receive earthquake information very rapidly in real time and are providing feedback on their experiences of performance and potential uses within their organizations. Beta user interactions allow the ShakeAlert team to discern which alert delivery options are most effective, what changes would make the UserDisplay more useful in a pre-disaster situation, and, most importantly, what actions users plan to take for various scenarios. Actions could include: personal safety approaches, such as drop, cover, and hold on; automated processes and procedures, such as opening elevator or fire station doors; or situational awareness. Users are beginning to determine which policy and technological changes may need to be enacted, along with the funding requirements to implement their automated controls. The use of models and mobile apps is beginning to augment the basic Java desktop applet. Modeling allows beta users to test their early warning responses against various scenarios without having to wait for a real event. Mobile apps are also changing the possible response landscape, providing other avenues for people to receive information. All of these combine to improve business continuity and resiliency.

  14. The State of Gerontological Social Work Education in California: Implications for Curricula Evaluation

    Science.gov (United States)

    Damron-Rodriguez, JoAnn; Goodman, Catherine; Ranney, Molly; Min, Jong Won; Takahashi, Nancy

    2013-01-01

    California has actively engaged in the Hartford Geriatric Social Work Initiative. Subsequently, the California Social Work Education Center Aging Initiative conducted a university survey of gerontology education in California graduate social work schools ("N"?=?17). In 2005, students taking aging courses were 12% in comparison to a…

  15. Seismic site characterization of an urban sedimentary basin, Livermore Valley, California: Site response, basin-edge-induced surface waves, and 3D simulations

    Science.gov (United States)

    Hartzell, Stephen; Leeds, Alena L.; Ramirez-Guzman, Leonardo; Allen, James P.; Schmitt, Robert G.

    2016-01-01

    Thirty‐two accelerometers were deployed in the Livermore Valley, California, for approximately one year to study sedimentary basin effects. Many local and near‐regional earthquakes were recorded, including the 24 August 2014 Mw 6.0 Napa, California, earthquake. The resulting ground‐motion data set is used to quantify the seismic response of the Livermore basin, a major structural depression in the California Coast Range Province bounded by active faults. Site response is calculated by two methods: the reference‐site spectral ratio method and a source‐site spectral inversion method. Longer‐period (≥1  s) amplification factors follow the same general pattern as Bouguer gravity anomaly contours. Site response spectra are inverted for shallow shear‐wave velocity profiles, which are consistent with independent information. Frequency–wavenumber analysis is used to analyze plane‐wave propagation across the Livermore Valley and to identify basin‐edge‐induced surface waves with back azimuths different from the source back azimuth. Finite‐element simulations in a 3D velocity model of the region illustrate the generation of basin‐edge‐induced surface waves and point out strips of elevated ground velocities along the margins of the basin.
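The reference-site spectral ratio method mentioned above divides the smoothed amplitude spectrum recorded at a basin site by that recorded at a nearby rock (reference) site for the same earthquake, so that source and path effects cancel and the ratio isolates the basin amplification. A minimal sketch; the moving-average smoothing width is an arbitrary assumption, not the authors' processing choice:

```python
import numpy as np

def spectral_ratio(basin_trace, rock_trace, dt, smooth=5):
    """Reference-site spectral ratio: |FFT(basin)| / |FFT(rock)|, lightly smoothed.
    Both traces must cover the same time window of the same earthquake,
    sampled at interval dt (seconds). Returns (frequencies, ratio)."""
    n = min(len(basin_trace), len(rock_trace))
    freqs = np.fft.rfftfreq(n, dt)
    amp_b = np.abs(np.fft.rfft(basin_trace[:n]))
    amp_r = np.abs(np.fft.rfft(rock_trace[:n]))
    kernel = np.ones(smooth) / smooth          # simple moving-average smoothing
    amp_b = np.convolve(amp_b, kernel, mode="same")
    amp_r = np.convolve(amp_r, kernel, mode="same")
    return freqs, amp_b / np.maximum(amp_r, 1e-20)
```

In practice the ratios from many events are averaged per station; peaks in the averaged ratio mark the basin's resonant frequencies.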

  16. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average (STA/LTA) algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
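The short-term-average over long-term-average trigger described above can be sketched as a ratio test on per-minute tweet counts. The window lengths and threshold below are illustrative assumptions, not the tuned values used by the USGS detector:

```python
import numpy as np

def sta_lta_triggers(counts, sta_win=1, lta_win=60, threshold=10.0):
    """Flag minutes where the short-term average tweet rate exceeds `threshold`
    times the long-term average of the preceding lta_win minutes."""
    counts = np.asarray(counts, dtype=float)
    triggers = []
    for i in range(lta_win, len(counts)):
        sta = counts[i - sta_win + 1:i + 1].mean()  # recent rate
        lta = counts[i - lta_win:i].mean()          # background rate
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers
```

A felt event like Morgan Hill, where the rate jumps from under 1 tweet per hour to ~150 per minute, produces a ratio far above any reasonable threshold, while slow diurnal variations do not.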

  17. The Great East-Japan Earthquake and devastating tsunami. An update and lessons from the past great earthquakes in Japan since 1923

    International Nuclear Information System (INIS)

    Ishigaki, Akemi; Higashi, Hikari; Sakamoto, Takako; Shibahara, Shigeki

    2013-01-01

    Japan has a long history of fighting against great earthquakes that cause structural damage/collapses, fires and/or tsunami. On March 11, 2011 at 14:46 (Friday), the Great East-Japan Earthquake (magnitude 9.0) attacked the Tohoku region (northeastern Japan), which includes Sendai City. The earthquake generated a devastating tsunami, leading to unprecedented disasters (∼18,500 victims) in coastal areas of Iwate, Miyagi and Fukushima prefectures, despite the fact that people living in the Tohoku region are well trained for tsunami-evacuation procedures, with the mindset of "Tsunami, ten-den-ko." This code means that each person should evacuate individually upon an earthquake. Sharing this rule, children and parents can escape separately from schools, houses or workplaces, without worrying about each other. The concept of ten-den-ko (individual evacuation) is helpful for people living in coastal areas of earthquake-prone zones around the world. It is also important to construct safe evacuation centers, because the March 11th tsunami killed people who had evacuated to evacuation sites. We summarize the current conditions of people living in the disaster-stricken areas, including the consequences of the Fukushima nuclear accident. We also describe the disaster responses as the publisher of the Tohoku Journal of Experimental Medicine (TJEM), located in Sendai, with online support from Tokyo. In 1923, the Great Kanto Earthquake (magnitude 7.9) evoked a massive fire that destroyed large areas of Tokyo (∼105,000 victims), including the print company for TJEM, but the Wistar Institute printed three TJEM issues in 1923 in Philadelphia. Mutual aid relationships should be established between distant cities to survive future disasters. (author)

  18. The Great East-Japan Earthquake and devastating tsunami: an update and lessons from the past Great Earthquakes in Japan since 1923.

    Science.gov (United States)

    Ishigaki, Akemi; Higashi, Hikari; Sakamoto, Takako; Shibahara, Shigeki

    2013-04-01

Japan has a long history of fighting against great earthquakes that cause structural damage/collapses, fires and/or tsunami. On March 11, 2011 at 14:46 (Friday), the Great East-Japan Earthquake (magnitude 9.0) struck the Tohoku region (northeastern Japan), which includes Sendai City. The earthquake generated a devastating tsunami, leading to unprecedented disasters (~18,500 victims) in coastal areas of Iwate, Miyagi and Fukushima prefectures, despite the fact that people living in the Tohoku region are well trained in tsunami-evacuation procedures, with the mindset of "Tsunami, ten-den-ko." This code means that each person should evacuate individually upon an earthquake. Sharing this rule, children and parents can escape separately from schools, houses or workplaces, without worrying about each other. The concept of ten-den-ko (individual evacuation) is helpful for people living in coastal areas of earthquake-prone zones around the world. It is also important to construct safe evacuation centers, because the March 11th tsunami killed people who had evacuated to evacuation sites. We summarize the current conditions of people living in the disaster-stricken areas, including the consequences of the Fukushima nuclear accident. We also describe the disaster responses as the publisher of the Tohoku Journal of Experimental Medicine (TJEM), located in Sendai, with online support from Tokyo. In 1923, the Great Kanto Earthquake (magnitude 7.9) evoked a massive fire that destroyed large areas of Tokyo (~105,000 victims), including the print company for TJEM, but the Wistar Institute printed three TJEM issues in 1923 in Philadelphia. Mutual aid relationships should be established between distant cities to survive future disasters.

  19. Hematologic evaluation of employees with leukopenia. Naval Weapons Center, China Lake, California.

    Science.gov (United States)

    Luiken, G A; Marsh, W L; Heath, V C; Long, H L; Weatherly, T L; Seal, G M

    1988-12-01

    Evaluation of 86 employees with a history of leukopenia at the Naval Weapons Center (NWC), China Lake, California, was done by exposure questionnaires, medical histories, physical examinations, peripheral blood smear, and bone marrow evaluations, including morphologic examination, stem cell culture, and cytogenetics. Forty-eight subjects were found to be leukopenic at the time of the study, and two subjects were found to have hairy cell leukemia. All subjects had positive exposure histories and were healthy at the time of the study. Review of peripheral smears identified the patients with marrow abnormalities. Bone marrow cultures revealed several patients with possible marrow suppression. Chromosome studies were not diagnostic. Five-year follow-up health questionnaires revealed no significant health problems; the two workers with hairy cell leukemia are alive and fully functional. Leukopenia in itself does not appear to be a risk factor for poor health, and it is unknown whether or not it may be a useful screening tool to identify workers at risk in toxic environments. Careful evaluation of blood cell counts and peripheral smears should be sufficient to identify people with potential marrow abnormalities.

  20. Ground water and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ts' ai, T H

    1977-11-01

    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water as well as changes in color, bubbling, gas emission, and noises and geysers are also often observed before earthquakes. Analysis of these features can help predict earthquakes. Other factors unrelated to earthquakes can cause some of these changes, too. As a first step it is necessary to find sites which are sensitive to changes in ground stress to be used as sensor points for predicting earthquakes. The necessary features are described. Recording of seismic waves of earthquake aftershocks is also an important part of earthquake predictions.

  1. Gas injection may have triggered earthquakes in the Cogdell oil field, Texas.

    Science.gov (United States)

    Gan, Wei; Frohlich, Cliff

    2013-11-19

    Between 1957 and 1982, water flooding was conducted to improve petroleum production in the Cogdell oil field north of Snyder, TX, and a contemporary analysis concluded this induced earthquakes that occurred between 1975 and 1982. The National Earthquake Information Center detected no further activity between 1983 and 2005, but between 2006 and 2011 reported 18 earthquakes having magnitudes 3 and greater. To investigate these earthquakes, we analyzed data recorded by six temporary seismograph stations deployed by the USArray program, and identified 93 well-recorded earthquakes occurring between March 2009 and December 2010. Relocation with a double-difference method shows that most earthquakes occurred within several northeast-southwest-trending linear clusters, with trends corresponding to nodal planes of regional focal mechanisms, possibly indicating the presence of previously unidentified faults. We have evaluated data concerning injection and extraction of oil, water, and gas in the Cogdell field. Water injection cannot explain the 2006-2011 earthquakes, especially as net volumes (injection minus extraction) are significantly less than in the 1957-1982 period. However, since 2004 significant volumes of gases including supercritical CO2 have been injected into the Cogdell field. The timing of gas injection suggests it may have contributed to triggering the recent seismic activity. If so, this represents an instance where gas injection has triggered earthquakes having magnitudes 3 and larger. Further modeling studies may help evaluate recent assertions suggesting significant risks accompany large-scale carbon capture and storage as a strategy for managing climate change.

  2. Associate Degree Nursing: Model Prerequisites Validation Study. California Community College Associate Degree Programs by The Center for Student Success, A Health Care Initiative Sponsored Project.

    Science.gov (United States)

    Phillips, Brad C.; Spurling, Steven; Armstrong, William A.

    California faces a severe nursing shortage, with the number of registered nurses far below what is required to avert a potential state health care crisis. The Associate Degree Nursing (ADN) Project is a joint project involving scholars, educational researchers, and analysts from the Center for Student Success (CSS) housed at City College of San…

  3. Constraining the Long-Term Average of Earthquake Recurrence Intervals From Paleo- and Historic Earthquakes by Assimilating Information From Instrumental Seismicity

    Science.gov (United States)

    Zoeller, G.

    2017-12-01

Paleo- and historic earthquakes are the most important source of information for the estimation of long-term recurrence intervals in fault zones, because sequences of paleoearthquakes cover more than one seismic cycle. On the other hand, these events are rare, dating uncertainties are enormous, and missing or misinterpreted events create additional difficulties. Given these shortcomings, estimates of long-term recurrence intervals are usually unstable unless additional information is included. In the present study, we assume that the time to the next major earthquake depends on the rate of small and intermediate events between the large ones, in terms of a "clock-change" model that leads to a Brownian Passage Time distribution for recurrence intervals. We take advantage of an earlier finding that the aperiodicity of this distribution can be related to the Gutenberg-Richter b-value, which is usually around one and can be estimated easily from instrumental seismicity in the region under consideration. This allows us to reduce the uncertainties in the estimation of the mean recurrence interval significantly, especially for short paleoearthquake sequences and high dating uncertainties. We present illustrative case studies from Southern California and compare the method with the commonly used approach of exponentially distributed recurrence times assuming a stationary Poisson process.
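The stabilization idea in this abstract can be sketched numerically. The fragment below is an illustration, not the author's code: the Brownian Passage Time density is evaluated over a grid of candidate mean recurrence intervals while the aperiodicity is held fixed at a value assumed to come from instrumental seismicity (the abstract's b-value argument); the paleoseismic interval values are hypothetical.

```python
import numpy as np

def bpt_pdf(t, mu, alpha):
    """Brownian Passage Time (inverse Gaussian) density with mean mu
    and aperiodicity (coefficient of variation) alpha."""
    return (np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3))
            * np.exp(-((t - mu) ** 2) / (2.0 * mu * alpha**2 * t)))

def estimate_mean_interval(intervals, alpha, mu_grid):
    """Grid-search maximum-likelihood estimate of the mean recurrence
    interval, holding the aperiodicity fixed rather than fitting it."""
    t = np.asarray(intervals, dtype=float)
    log_like = [np.sum(np.log(bpt_pdf(t, mu, alpha))) for mu in mu_grid]
    return float(mu_grid[int(np.argmax(log_like))])

# Hypothetical paleoearthquake inter-event times in years.
intervals = [210.0, 180.0, 260.0, 150.0, 230.0]
mu_hat = estimate_mean_interval(intervals, alpha=0.5,
                                mu_grid=np.arange(50.0, 500.0, 1.0))
```

Fixing alpha externally is what stabilizes the estimate for short sequences: with only a handful of dated intervals, leaving both parameters free would constrain them poorly.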

  4. Stress modulation of earthquakes: A study of long and short period stress perturbations and the crustal response

    Science.gov (United States)

    Johnson, Christopher W.

Decomposing fault mechanical processes advances our understanding of active fault systems and properties of the lithosphere, thereby increasing the effectiveness of seismic hazard assessment and preventative measures implemented in urban centers. Along plate boundaries earthquakes are inevitable as tectonic forces reshape the Earth's surface. Earthquakes, faulting, and surface displacements are related systems that require multidisciplinary approaches to characterize deformation in the lithosphere. Modern geodetic instrumentation can resolve displacements to millimeter precision and provide valuable insight into secular deformation in near real-time. The expansion of permanent seismic networks, as well as temporary deployments, allows unprecedented detection of microseismic events that image fault interfaces and fracture networks in the crust. The research presented in this dissertation sits at the intersection of seismology and geodesy, studying the Earth's response to transient deformation and exploring research questions on earthquake triggering, induced seismicity, and seasonal loading using seismic data, geodetic data, and modeling tools. The focus is to quantify stress changes in the crust, explore seismicity rate variations and migration patterns, and model crustal deformation in order to characterize the evolving state of stress on faults and the migration of fluids in the crust. The problems investigated all address one question: Why do earthquakes nucleate following a low-magnitude stress perturbation? Answers to this question are fundamental to understanding the time-dependent failure processes of the lithosphere. Dynamic triggering, in which stress transfers from one fault system to another, operates at both local and remote distances [Freed, 2005]. The passage of teleseismic surface waves from the largest earthquakes produces dynamic stress fields and provides a natural

  5. Seismomagnetic effects from the long-awaited 28 September 2004 M 6.0 parkfield earthquake

    Science.gov (United States)

    Johnston, M.J.S.; Sasai, Y.; Egbert, G.D.; Mueller, R.J.

    2006-01-01

Precise measurements of local magnetic fields have been obtained with a differentially connected array of seven synchronized proton magnetometers located along 60 km of the locked-to-creeping transition region of the San Andreas fault at Parkfield, California, since 1976. The M 6.0 Parkfield earthquake on 28 September 2004, occurred within this array and generated coseismic magnetic field changes of between 0.2 and 0.5 nT at five sites in the network. No preseismic magnetic field changes exceeding background noise levels are apparent in the magnetic data during the month, week, and days before the earthquake (or expected in light of the absence of measurable precursive deformation, seismicity, or pore pressure changes). Observations of electric and magnetic fields from 0.01 to 20 Hz are also made at one site near the end of the earthquake rupture and corrected for common-mode signals from the ionosphere/magnetosphere using a second site some 115 km to the northwest along the fault. These magnetic data show no indications of unusual noise before the earthquake in the ULF band (0.01-20 Hz) as suggested may have preceded the 1989 ML 7.1 Loma Prieta earthquake. Nor do we see electric field changes similar to those suggested to occur before earthquakes of this magnitude from data in Greece. Uniform and variable slip piezomagnetic models of the earthquake, derived from strain, displacement, and seismic data, generate magnetic field perturbations that are consistent with those observed by the magnetometer array. A higher rate of longer-term magnetic field change, consistent with increased loading in the region, is apparent since 1993. This accompanied an increased rate of secular shear strain observed on a two-color EDM network and a small network of borehole tensor strainmeters and increased seismicity dominated by three M 4.5-5 earthquakes roughly a year apart in 1992, 1993, and 1994. Models incorporating all of these data indicate increased slip at depth in the region.

  6. Earthquake Early Warning ShakeAlert System: Testing and certification platform

    Science.gov (United States)

    Cochran, Elizabeth S.; Kohler, Monica D.; Given, Douglas; Guiwits, Stephen; Andrews, Jennifer; Meier, Men-Andrin; Ahmad, Mohammad; Henson, Ivan; Hartog, Renate; Smith, Deborah

    2017-01-01

    Earthquake early warning systems provide warnings to end users of incoming moderate to strong ground shaking from earthquakes. An earthquake early warning system, ShakeAlert, is providing alerts to beta end users in the western United States, specifically California, Oregon, and Washington. An essential aspect of the earthquake early warning system is the development of a framework to test modifications to code to ensure functionality and assess performance. In 2016, a Testing and Certification Platform (TCP) was included in the development of the Production Prototype version of ShakeAlert. The purpose of the TCP is to evaluate the robustness of candidate code that is proposed for deployment on ShakeAlert Production Prototype servers. TCP consists of two main components: a real‐time in situ test that replicates the real‐time production system and an offline playback system to replay test suites. The real‐time tests of system performance assess code optimization and stability. The offline tests comprise a stress test of candidate code to assess if the code is production ready. The test suite includes over 120 events including local, regional, and teleseismic historic earthquakes, recentering and calibration events, and other anomalous and potentially problematic signals. Two assessments of alert performance are conducted. First, point‐source assessments are undertaken to compare magnitude, epicentral location, and origin time with the Advanced National Seismic System Comprehensive Catalog, as well as to evaluate alert latency. Second, we describe assessment of the quality of ground‐motion predictions at end‐user sites by comparing predicted shaking intensities to ShakeMaps for historic events and implement a threshold‐based approach that assesses how often end users initiate the appropriate action, based on their ground‐shaking threshold. TCP has been developed to be a convenient streamlined procedure for objectively testing algorithms, and it has
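The threshold-based assessment of end-user alerting described above can be illustrated with a short sketch. This is a hypothetical simplification, not the ShakeAlert TCP code: predicted and observed shaking intensities at one end-user site are compared against that user's action threshold and tallied into hits, false alarms, misses, and correct non-alerts.

```python
def alert_outcomes(predicted, observed, threshold):
    """Classify predicted/observed intensity pairs at one end-user site.
    An alert fires when predicted intensity reaches the user's threshold;
    the action was appropriate when observed shaking also reached it."""
    counts = {"hit": 0, "false_alarm": 0, "miss": 0, "correct_no_alert": 0}
    for pred, obs in zip(predicted, observed):
        alerted = pred >= threshold
        shaken = obs >= threshold
        if alerted and shaken:
            counts["hit"] += 1
        elif alerted:
            counts["false_alarm"] += 1
        elif shaken:
            counts["miss"] += 1
        else:
            counts["correct_no_alert"] += 1
    return counts

# Hypothetical shaking intensities for five test events at one site.
result = alert_outcomes(predicted=[5.1, 3.0, 4.6, 2.2, 6.0],
                        observed=[4.8, 4.9, 3.1, 2.0, 6.2],
                        threshold=4.5)
```

Sweeping the threshold over a suite of replayed historic events is what turns this per-site tally into the performance assessment the abstract describes.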

  7. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
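The dilution effect in this abstract can be sketched as follows. The rate constants are hypothetical and the formula is an illustration of the idea (aftershocks temporarily inflate the pool of non-foreshocks), not the paper's actual equation.

```python
def omori_rate(t_days, k=50.0, c=0.05, p=1.1):
    """Modified Omori law: aftershock rate (events/day) t_days after a
    mainshock, with hypothetical productivity and decay constants."""
    return k / (c + t_days) ** p

def foreshock_probability(p_generic, background_rate, t_since_mainshock):
    """Illustrative adjustment: the generic probability that a nearby event
    is a foreshock is diluted in proportion to how much aftershocks of a
    previous mainshock inflate the total seismicity rate."""
    total_rate = background_rate + omori_rate(t_since_mainshock)
    return p_generic * background_rate / total_rate

p_early = foreshock_probability(0.05, 1.0, t_since_mainshock=1.0)
p_late = foreshock_probability(0.05, 1.0, t_since_mainshock=365.0)
```

As the Omori term decays, the probability recovers toward its generic value, matching the time dependence described in the abstract.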

  8. Alaska earthquake source for the SAFRR tsunami scenario: Chapter B in The SAFRR (Science Application for Risk Reduction) Tsunami Scenario

    Science.gov (United States)

    Kirby, Stephen; Scholl, David; von Huene, Roland E.; Wells, Ray

    2013-01-01

Tsunami modeling has shown that tsunami sources located along the Alaska Peninsula segment of the Aleutian-Alaska subduction zone have the greatest impacts on southern California shorelines by raising the highest tsunami waves for a given source seismic moment. The most probable sector for a Mw ~ 9 source within this subduction segment is between Kodiak Island and the Shumagin Islands in what we call the Semidi subduction sector; these bounds represent the southwestern limit of the 1964 Mw 9.2 Alaska earthquake rupture and the northeastern edge of the Shumagin sector that recent Global Positioning System (GPS) observations indicate is currently creeping. Geological and geophysical features in the Semidi sector that are thought to be relevant to the potential for large magnitude, long-rupture-runout interplate thrust earthquakes are remarkably similar to those in northeastern Japan, where the destructive Mw 9.1 tsunamigenic earthquake of 11 March 2011 occurred. In this report we propose and justify the selection of a tsunami source seaward of the Alaska Peninsula for use in the Tsunami Scenario that is part of the U.S. Geological Survey (USGS) Science Application for Risk Reduction (SAFRR) Project. This tsunami source should have the potential to raise damaging tsunami waves on the California coast, especially at the ports of Los Angeles and Long Beach. Accordingly, we have summarized and abstracted slip distribution from the source literature on the 2011 event, the best characterized for any subduction earthquake, and applied this synoptic slip distribution to the similar megathrust geometry of the Semidi sector. The resulting slip model has an average slip of 18.6 m and a moment magnitude of Mw = 9.1. The 2011 Tohoku earthquake was not anticipated, despite Japan having the best seismic and geodetic networks in the world and the best historical record in the world over the past 1,500 years. What was lacking was adequate paleogeologic data on prehistoric earthquakes.

  9. GEM - The Global Earthquake Model

    Science.gov (United States)

    Smolka, A.

    2009-04-01

Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments on the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public-at-large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  10. Universal Earthquake-Occurrence Jumps, Correlations with Time, and Anomalous Diffusion

    International Nuclear Information System (INIS)

    Corral, Alvaro

    2006-01-01

Spatiotemporal properties of seismicity are investigated for a worldwide (WW) catalog and for southern California in the stationary case (SC), showing a nearly universal scaling behavior. Distributions of distances between consecutive earthquakes (jumps) are magnitude independent and show two power-law regimes, separated by jump values of about 200 km (WW) and 15 km (SC). Distributions of waiting times conditioned on the value of jumps show that the two variables are correlated in general, but turn out to be independent when only short or long jumps are considered. Finally, diffusion profiles are found to be independent of the magnitude, contrary to what the waiting-time distributions suggest.

  11. Efficient blind search for similar-waveform earthquakes in years of continuous seismic data

    Science.gov (United States)

    Yoon, C. E.; Bergen, K.; Rong, K.; Elezabi, H.; Bailis, P.; Levis, P.; Beroza, G. C.

    2017-12-01

    Cross-correlating an earthquake waveform template with continuous seismic data has proven to be a sensitive, discriminating detector of small events missing from earthquake catalogs, but a key limitation of this approach is that it requires advance knowledge of the earthquake signals we wish to detect. To overcome this limitation, we can perform a blind search for events with similar waveforms, comparing waveforms from all possible times within the continuous data (Brown et al., 2008). However, the runtime for naive blind search scales quadratically with the duration of continuous data, making it impractical to process years of continuous data. The Fingerprint And Similarity Thresholding (FAST) detection method (Yoon et al., 2015) enables a comprehensive blind search for similar-waveform earthquakes in a fast, scalable manner by adapting data-mining techniques originally developed for audio and image search within massive databases. FAST converts seismic waveforms into compact "fingerprints", which are efficiently organized and searched within a database. In this way, FAST avoids the unnecessary comparison of dissimilar waveforms. To date, the longest duration of continuous data used for event detection with FAST was 3 months at a single station near Guy-Greenbrier, Arkansas, which revealed microearthquakes closely correlated with stages of hydraulic fracturing (Yoon et al., 2017). In this presentation we introduce an optimized, parallel version of the FAST software with improvements to the fingerprinting algorithm and the ability to detect events using continuous data from a network of stations (Bergen et al., 2016). We demonstrate its ability to detect low-magnitude earthquakes within several years of continuous data at locations of interest in California.
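The fingerprint-and-bucket idea behind FAST can be sketched in a few lines. Both the fingerprint here (top-k spectral peaks per window) and the banding scheme are deliberate simplifications of the actual FAST algorithm, chosen only to show how hashing sidesteps the quadratic all-pairs waveform comparison: only windows that land in a shared bucket ever become candidate pairs.

```python
import numpy as np
from collections import defaultdict

def fingerprint(window, k=8):
    """Toy binary fingerprint: mark the k largest-magnitude rFFT bins."""
    spec = np.abs(np.fft.rfft(window))
    fp = np.zeros(spec.size, dtype=bool)
    fp[np.argsort(spec)[-k:]] = True
    return fp

def candidate_pairs(windows, bands=4):
    """LSH-style banding: split each fingerprint into bands and bucket
    windows by band content; windows sharing any bucket become candidate
    pairs, so dissimilar waveforms are never compared directly."""
    buckets = defaultdict(list)
    for i, w in enumerate(windows):
        for b, chunk in enumerate(np.array_split(fingerprint(w), bands)):
            buckets[(b, chunk.tobytes())].append(i)
    pairs = set()
    for members in buckets.values():
        for a in range(len(members)):
            for c in range(a + 1, len(members)):
                pairs.add((members[a], members[c]))
    return pairs

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 256)
event = np.sin(2 * np.pi * 12 * t) * np.exp(-4 * t)      # repeating "event"
windows = [event + 0.01 * rng.standard_normal(t.size),   # two noisy copies
           event + 0.01 * rng.standard_normal(t.size),
           rng.standard_normal(t.size),                   # unrelated noise
           rng.standard_normal(t.size)]
pairs = candidate_pairs(windows)
```

Candidate pairs found this way would still be verified by direct waveform comparison; the hashing only prunes the search space, which is what makes a multi-year blind search tractable.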

  12. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all of magnitude 7.0 or greater) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  13. Investigation on the Possible Relationship between Magnetic Pulsations and Earthquakes

    Science.gov (United States)

    Jusoh, M.; Liu, H.; Yumoto, K.; Uozumi, T.; Takla, E. M.; Yousif Suliman, M. E.; Kawano, H.; Yoshikawa, A.; Asillam, M.; Hashim, M.

    2012-12-01

The sun is the main source of energy to the solar system, and it plays a major role in affecting the ionosphere, the atmosphere and the earth's surface. The connection between the solar wind and ground magnetic pulsations has been demonstrated empirically by several researchers (H. J. Singer et al., 1977; E. W. Greenstadt, 1979; I. A. Ansari, 2006, to name a few). In our preliminary statistical analysis of the relationship between solar and seismic activities (Jusoh and Yumoto, 2011; Jusoh et al., 2012), we observed a high possibility of solar-terrestrial coupling: earthquakes show a strong tendency to occur during the lower phases of solar cycles, in significant relation to solar wind parameters (i.e., solar wind dynamic pressure, speed and input energy). However, a clear coupling mechanism has not yet been established. To connect solar impacts to seismicity, we investigate ground magnetic pulsations as a possible connecting agent. In our analysis, recorded ground magnetic pulsations in different ultra-low-frequency ranges, Pc3 (22-100 mHz), Pc4 (6.7-22 mHz) and Pc5 (1.7-6.7 mHz), are compared with the occurrence of local earthquake events over certain time periods. The analysis focuses on two major seismic regions: northern Japan (mid latitude) and northern Sumatera, Indonesia (low latitude). Solar wind parameters were obtained from the Goddard Space Flight Center, NASA, via the OMNIWeb Data Explorer and the Space Physics Data Facility. Earthquake events were extracted from the Advanced National Seismic System (ANSS) database. The localized Pc3-Pc5 magnetic pulsation data were extracted from Magnetic Data Acquisition System (MAGDAS)/Circum-pan Pacific Magnetometer Network (CPMN) stations located at Ashibetsu (Japan), for earthquakes monitored in northern Japan, and Langkawi (Malaysia), for earthquakes observed in northern Sumatera. These magnetometer arrays were established by the International Center for Space Weather Science and Education, Kyushu University, Japan. From the

  14. Baja California: literatura y frontera

    Directory of Open Access Journals (Sweden)

    Gabriel Trujillo Muñoz

    2014-06-01

Baja California is not only a region with migration problems and criminal violence arising from the drug war, nor merely a space of border conflicts in close proximity to the United States of America. Baja California is also a geographic space of culture and art, of creative writing and of the struggle to narrate the things and people that live here, in plain sight, as their house, their home, their center of creation. This text gives a cultural context for border literature in northern Mexico, a phenomenon worthy of attention on its own merits, for its books and its writers.

  15. Transport woes threaten California production

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

California oil producers face a loss of production this year because of constraints on pipeline and tanker transportation to Los Angeles area refineries. The potential bottleneck is occurring at a time when Outer Continental Shelf production is near capacity from Chevron Corp.'s Point Arguello project at the same time production is increasing from Exxon Corp.'s nearby Santa Ynez Unit (SYU) expansion. Both megaprojects must compete for pipeline space with onshore crude producers, notably in California's San Joaquin Valley (SJV). Recent developments limiting transportation options include: An indefinite shutdown of Four Corners Pipe Line Co.'s 50,000 b/d Line No. 1, damaged by the Jan. 17 earthquake; Loss of a tanker permit by Chevron and partners for offshore Point Arguello production; Permanent shutdown of Exxon's offshore storage and treatment (OST) facility, which since 1981 has used tankers to transport about 20,000 b/d of SYU production from the Santa Barbara Channel to Los Angeles. The OST, the first commercial floating production system in the US -- placed in the Santa Barbara Channel in 1981 after a decade of precedent-setting legal and political battles -- was shut down Apr. 4. The paper discusses these production concerns, available options, the OST shutdown, and the troubled history of the OST.

  16. Special issue: Terrestrial fluids, earthquakes and volcanoes: The Hiroshi Wakita volume I

    Science.gov (United States)

    Perez, Nemesio M.; King, Chi-Yu; Gurrieri, Sergio; McGee, Kenneth A.

    2006-01-01

    Terrestrial Fluids, Earthquakes and Volcanoes: The Hiroshi Wakita Volume I is a special publication to honor Professor Hiroshi Wakita for his scientific contributions. This volume consists of 17 original papers dealing with various aspects of the role of terrestrial fluids in earthquake and volcanic processes, which reflect Prof. Wakita’s wide scope of research interests.Professor Wakita co-founded the Laboratory for Earthquake Chemistry in 1978 and served as its director from 1988 until his retirement from the university in 1997. He has made the laboratory a leading world center for studying earthquakes and volcanic activities by means of geochemical and hydrological methods. Together with his research team and a number of foreign guest researchers that he attracted, he has made many significant contributions in the above-mentioned scientific fields of interest. This achievement is a testimony to not only his scientific talent, but also his enthusiasm, his open mindedness, and his drive in obtaining both human and financial support.

  17. Ionospheric earthquake precursors

    International Nuclear Information System (INIS)

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.

    1996-01-01

    Results of experimental study on ionospheric earthquake precursors, program development on processes in the earthquake focus and physical mechanisms of formation of various type precursors are considered. Composition of experimental cosmic system for earthquake precursors monitoring is determined. 36 refs., 5 figs

  18. Continuous borehole strain and pore pressure in the near field of the 28 September 2004 M 6.0 parkfield, California, earthquake: Implications for nucleation, fault response, earthquake prediction and tremor

    Science.gov (United States)

    Johnston, M.J.S.; Borcherdt, R.D.; Linde, A.T.; Gladwin, M.T.

    2006-01-01

Near-field observations of high-precision borehole strain and pore pressure show no indication of coherent accelerating strain or pore pressure during the weeks to seconds before the 28 September 2004 M 6.0 Parkfield earthquake. Minor changes in strain rate did occur at a few sites during the last 24 hr before the earthquake but these changes are neither significant nor have the form expected for strain during slip coalescence initiating fault failure. Seconds before the event, strain is stable at the 10^-11 level. Final prerupture nucleation slip in the hypocentral region is constrained to have a moment less than 2 × 10^12 N m (M 2.2) and a source size less than 30 m. Ground displacement data indicate similar constraints. Localized rupture nucleation and runaway precludes useful prediction of damaging earthquakes. Coseismic dynamic strains of about 10 microstrain peak-to-peak were superimposed on volumetric strain offsets of about 0.5 microstrain to the northwest of the epicenter and about 0.2 microstrain to the southeast of the epicenter, consistent with right lateral slip. Observed strain and Global Positioning System (GPS) offsets can be simply fit with 20 cm of slip between 4 and 10 km on a 20-km segment of the fault north of Gold Hill (M0 = 7 × 10^17 N m). Variable slip inversion models using GPS data and seismic data indicate similar moments. Observed postseismic strain is 60% to 300% of the coseismic strain, indicating incomplete release of accumulated strain. No measurable change in fault zone compliance preceding or following the earthquake is indicated by stable earth tidal response. No indications of strain change accompany nonvolcanic tremor events reported prior to and following the earthquake.

  19. Structural damages of L'Aquila (Italy) earthquake

    Directory of Open Access Journals (Sweden)

    H. Kaplan

    2010-03-01

    On 6 April 2009 an earthquake of magnitude 6.3 occurred in the city of L'Aquila, Italy. In the city center and surrounding villages many masonry and reinforced concrete (RC) buildings were heavily damaged or collapsed. After the earthquake, the inspection carried out in the region provided relevant results concerning the quality of the materials, the methods of construction, and the performance of the structures. The region was initially inhabited in the 13th century and has many historic structures. The main structural material is unreinforced masonry (URM) composed of rubble stone, brick, and hollow clay tile. Masonry units suffered the worst damage. Wood flooring systems and corrugated steel roofs are common in URM buildings. Unconfined gable walls and excessively thick walls lacking interconnection are among the most common deficiencies of poorly constructed masonry structures; such walls increased the earthquake loads. The quality of the materials and of the construction was not in accordance with the standards. Several modern, non-ductile concrete frame buildings also collapsed. Poor concrete quality and poor reinforcement detailing caused damage in reinforced concrete structures, and structural deficiencies such as non-ductile detailing and strong beams with weak columns were commonly observed. This paper presents the reasons why buildings were damaged in the 6 April 2009 L'Aquila, Italy, earthquake and offers some suggestions for preventing such disasters in the future.

  20. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    Science.gov (United States)

    Serata, S.

    2006-12-01

    The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. Development work over the past 25 years has established the Stressmeter as an automatic stress measurement system for studying the timing of forthcoming major earthquakes, in support of current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area over a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress condition of the mine, the direct cause of the earthquakes was identified, and the cause was successfully eliminated by controlling the stress condition of the mine. The Japanese government took interest in this development, and the Stressmeter was introduced to the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of maximum and minimum lateral stresses in excellent agreement with the theoretical value, G = 0.25. The conventional methods of overcoring, hydrofracturing, and deformation, which were introduced to compete with the Serata method, all failed, demonstrating their fundamental difficulties. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be the same as all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained by the major international stress measurement work in the Hot Dry Rock Projects conducted in the USA, England, and Germany are in good agreement with the Stressmeter results obtained in Japan.
Based on this broad agreement, a solid geomechanical

  1. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    International Nuclear Information System (INIS)

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19 September 2006. Six factors for the evaluation of earthquake vibration are considered on the basis of recent earthquakes: 1) evaluation of earthquake vibration by methods using fault models, 2) investigation and approval of active faults, 3) direct-hit earthquakes, 4) assumption of a short active fault as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) remaining risk. A guiding principle of the revision required a new method of evaluating earthquake vibration using fault models and evaluation of the probability of earthquake vibration. The remaining risk refers to the danger to facilities and people should an earthquake stronger than the design basis occur; accordingly, scatter has to be considered in the evaluation of earthquake vibration. The earthquake belt and strong vibration pulse of the 1995 Hyogo-Nanbu earthquake, the relation between the lengths of surface earthquake faults and hypocentral faults, and the distribution of seismic intensity of the 1993 off-Kushiro earthquake are presented. (S.Y.)

  2. CISN Display Progress to Date - Reliable Delivery of Real-Time Earthquake Information, and ShakeMap to Critical End Users

    Science.gov (United States)

    Rico, H.; Hauksson, E.; Thomas, E.; Friberg, P.; Frechette, K.; Given, D.

    2003-12-01

    The California Integrated Seismic Network (CISN) has collaborated to develop a next-generation earthquake notification system that is nearing its first operations-ready release. The CISN Display actively alerts users of seismic data and vital earthquake hazards information following a significant event. It will primarily replace the Caltech/USGS Broadcast of Earthquakes (CUBE) and Rapid Earthquake Data Integration (REDI) Display as the principal means of delivering geographical seismic data to emergency operations centers, utility companies, and media outlets. A subsequent goal is to provide automated access to the many Web products produced by regional seismic networks after an earthquake. Another aim is to create a highly configurable client, allowing user organizations to overlay infrastructure data critical to their roles as first responders or lifeline operators. The final goal is to integrate these requirements into a package offering several layers of reliability to ensure delivery of services. Central to the CISN Display's role as a gateway to Web-based earthquake products is its comprehensive XML-messaging schema. The message model uses many of the same attributes as the CUBE format, but extends the old standard by provisioning additional elements for products currently available and others yet to be considered. The client consumes these XML messages, sorts them through a resident Quake Data Merge filter, and posts updates that also include hyperlinks associated with specific event IDs on the display map. Earthquake products available for delivery to the CISN Display are ShakeMap, focal mechanisms, waveform data, felt reports, aftershock forecasts, and earthquake commentaries. By design the XML-message schema can evolve as products and information needs change, without breaking existing applications that rely on it. The latest version of the CISN Display can also automatically download ShakeMaps and display shaking intensity within the GIS system.
This

  3. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  4. Earthquake and volcano hazard notices: An economic evaluation of changes in risk perceptions

    Science.gov (United States)

    Bernknopf, R.L.; Brookshire, D.S.; Thayer, M.A.

    1990-01-01

    Earthquake and volcano hazard notices were issued for the Mammoth Lakes, California area by the U.S. Geological Survey under the authority granted by the Disaster Relief Act of 1974. The effects on investment, recreation visitation, and risk perceptions are explored. The hazard notices did not affect recreation visitation, although investment was affected. A perceived loss in the market value of homes was documented. Risk perceptions were altered for property owners. Communication of the probability of an event over time would enhance hazard notices as a policy instrument and would mitigate unnecessary market perturbations.

  5. Surface slip during large Owens Valley earthquakes

    KAUST Repository

    Haddon, E. K.; Amos, C. B.; Zielke, Olaf; Jayko, A. S.; Burgmann, R.

    2016-01-01

    The 1872 Owens Valley earthquake is the third largest known historical earthquake in California. Relatively sparse field data and a complex rupture trace, however, inhibited attempts to fully resolve the slip distribution and reconcile the total moment release. We present a new, comprehensive record of surface slip based on lidar and field investigation, documenting 162 new measurements of laterally and vertically displaced landforms for 1872 and prehistoric Owens Valley earthquakes. Our lidar analysis uses a newly developed analytical tool to measure fault slip based on cross-correlation of sublinear topographic features and to produce a uniquely shaped probability density function (PDF) for each measurement. Stacking PDFs along strike to form cumulative offset probability distribution plots (COPDs) highlights common values corresponding to single and multiple-event displacements. Lateral offsets for 1872 vary systematically from ∼1.0 to 6.0 m and average 3.3 ± 1.1 m (2σ). Vertical offsets are predominantly east-down between ∼0.1 and 2.4 m, with a mean of 0.8 ± 0.5 m. The average lateral-to-vertical ratio compiled at specific sites is ∼6:1. Summing displacements across subparallel, overlapping rupture traces implies a maximum of 7–11 m and net average of 4.4 ± 1.5 m, corresponding to a geologic Mw ∼7.5 for the 1872 event. We attribute progressively higher-offset lateral COPD peaks at 7.1 ± 2.0 m, 12.8 ± 1.5 m, and 16.6 ± 1.4 m to three earlier large surface ruptures. Evaluating cumulative displacements in context with previously dated landforms in Owens Valley suggests relatively modest rates of fault slip, averaging between ∼0.6 and 1.6 mm/yr (1σ) over the late Quaternary.

  6. Surface slip during large Owens Valley earthquakes

    Science.gov (United States)

    Haddon, E.K.; Amos, C.B.; Zielke, O.; Jayko, Angela S.; Burgmann, R.

    2016-01-01

    The 1872 Owens Valley earthquake is the third largest known historical earthquake in California. Relatively sparse field data and a complex rupture trace, however, inhibited attempts to fully resolve the slip distribution and reconcile the total moment release. We present a new, comprehensive record of surface slip based on lidar and field investigation, documenting 162 new measurements of laterally and vertically displaced landforms for 1872 and prehistoric Owens Valley earthquakes. Our lidar analysis uses a newly developed analytical tool to measure fault slip based on cross-correlation of sublinear topographic features and to produce a uniquely shaped probability density function (PDF) for each measurement. Stacking PDFs along strike to form cumulative offset probability distribution plots (COPDs) highlights common values corresponding to single and multiple-event displacements. Lateral offsets for 1872 vary systematically from ∼1.0 to 6.0 m and average 3.3 ± 1.1 m (2σ). Vertical offsets are predominantly east-down between ∼0.1 and 2.4 m, with a mean of 0.8 ± 0.5 m. The average lateral-to-vertical ratio compiled at specific sites is ∼6:1. Summing displacements across subparallel, overlapping rupture traces implies a maximum of 7–11 m and net average of 4.4 ± 1.5 m, corresponding to a geologic Mw ∼7.5 for the 1872 event. We attribute progressively higher-offset lateral COPD peaks at 7.1 ± 2.0 m, 12.8 ± 1.5 m, and 16.6 ± 1.4 m to three earlier large surface ruptures. Evaluating cumulative displacements in context with previously dated landforms in Owens Valley suggests relatively modest rates of fault slip, averaging between ∼0.6 and 1.6 mm/yr (1σ) over the late Quaternary.
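The PDF-stacking step described in this abstract can be sketched numerically. This is a simplified illustration, not the authors' tool: it substitutes Gaussian PDFs for the uniquely shaped per-measurement PDFs, and the offsets and uncertainties below are hypothetical values chosen to mimic a single-event and a two-event population:

```python
import numpy as np

def stack_offset_pdfs(offsets, sigmas, x=None):
    """Stack per-measurement Gaussian offset PDFs into a cumulative offset
    probability distribution (COPD); peaks flag common displacement values."""
    if x is None:
        x = np.linspace(0.0, 20.0, 2001)   # offset axis in metres
    copd = np.zeros_like(x)
    for mu, sd in zip(offsets, sigmas):
        # Add each measurement's normalized Gaussian PDF to the stack.
        copd += np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    return x, copd

# Hypothetical measurements clustered near a ~3.3 m single-event offset
# and a ~7 m cumulative two-event offset:
x, copd = stack_offset_pdfs([3.2, 3.4, 3.3, 7.0, 7.2],
                            [0.4, 0.5, 0.3, 0.8, 0.9])
print(x[np.argmax(copd)])   # highest COPD peak falls near 3.3 m
```

In the actual analysis each cross-correlation measurement contributes its own empirically shaped PDF, but the stacking and peak-picking logic is the same.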

  7. Surface slip during large Owens Valley earthquakes

    KAUST Repository

    Haddon, E. K.

    2016-01-10

    The 1872 Owens Valley earthquake is the third largest known historical earthquake in California. Relatively sparse field data and a complex rupture trace, however, inhibited attempts to fully resolve the slip distribution and reconcile the total moment release. We present a new, comprehensive record of surface slip based on lidar and field investigation, documenting 162 new measurements of laterally and vertically displaced landforms for 1872 and prehistoric Owens Valley earthquakes. Our lidar analysis uses a newly developed analytical tool to measure fault slip based on cross-correlation of sublinear topographic features and to produce a uniquely shaped probability density function (PDF) for each measurement. Stacking PDFs along strike to form cumulative offset probability distribution plots (COPDs) highlights common values corresponding to single and multiple-event displacements. Lateral offsets for 1872 vary systematically from ∼1.0 to 6.0 m and average 3.3 ± 1.1 m (2σ). Vertical offsets are predominantly east-down between ∼0.1 and 2.4 m, with a mean of 0.8 ± 0.5 m. The average lateral-to-vertical ratio compiled at specific sites is ∼6:1. Summing displacements across subparallel, overlapping rupture traces implies a maximum of 7–11 m and net average of 4.4 ± 1.5 m, corresponding to a geologic Mw ∼7.5 for the 1872 event. We attribute progressively higher-offset lateral COPD peaks at 7.1 ± 2.0 m, 12.8 ± 1.5 m, and 16.6 ± 1.4 m to three earlier large surface ruptures. Evaluating cumulative displacements in context with previously dated landforms in Owens Valley suggests relatively modest rates of fault slip, averaging between ∼0.6 and 1.6 mm/yr (1σ) over the late Quaternary.

  8. Effects of Strike-Slip Fault Segmentation on Earthquake Energy and Seismic Hazard

    Science.gov (United States)

    Madden, E. H.; Cooke, M. L.; Savage, H. M.; McBeck, J.

    2014-12-01

    Many major strike-slip faults are segmented along strike, including those along plate boundaries in California and Turkey. Failure of distinct fault segments at depth may be the source of multiple pulses of seismic radiation observed for single earthquakes. However, how and when segmentation affects fault behavior and energy release is the basis of many outstanding questions related to the physics of faulting and seismic hazard. These include the probability for a single earthquake to rupture multiple fault segments and the effects of segmentation on earthquake magnitude, radiated seismic energy, and ground motions. Using numerical models, we quantify components of the earthquake energy budget, including the tectonic work acting externally on the system, the energy of internal rock strain, the energy required to overcome fault strength and initiate slip, the energy required to overcome frictional resistance during slip, and the radiated seismic energy. We compare the energy budgets of systems of two en echelon fault segments with various spacing that include both releasing and restraining steps. First, we allow the fault segments to fail simultaneously and capture the effects of segmentation geometry on the earthquake energy budget and on the efficiency with which applied displacement is accommodated. Assuming that higher efficiency correlates with higher probability for a single, larger earthquake, this approach has utility for assessing the seismic hazard of segmented faults. Second, we nucleate slip along a weak portion of one fault segment and let the quasi-static rupture propagate across the system. Allowing fractures to form near faults in these models shows that damage develops within releasing steps and promotes slip along the second fault, while damage develops outside of restraining steps and can prohibit slip along the second fault. 
Work is consumed in both the propagation of and frictional slip along these new fractures, impacting the energy available

  9. Finite element models of earthquake cycles in mature strike-slip fault zones

    Science.gov (United States)

    Lynch, John Charles

    The research presented in this dissertation is on the subject of strike-slip earthquakes and the stresses that build and release in the Earth's crust during earthquake cycles. Numerical models of these cycles in a layered elastic/viscoelastic crust are produced using the finite element method. A fault that alternately sticks and slips poses a particularly challenging problem for numerical implementation, and a new contact element dubbed the "Velcro" element was developed to address this problem (Appendix A). Additionally, the finite element code used in this study was benchmarked against analytical solutions for some simplified problems (Chapter 2), and the resolving power was tested for the fault region of the models (Appendix B). With the modeling method thus developed, there are two main questions posed. First, in Chapter 3, the effect of a finite-width shear zone is considered. By defining a viscoelastic shear zone beneath a periodically slipping fault, it is found that shear stress concentrates at the edges of the shear zone and thus causes the stress tensor to rotate into non-Andersonian orientations. Several methods are used to examine the stress patterns, including the plunge angles of the principal stresses and a new method that plots the stress tensor in a manner analogous to seismic focal mechanism diagrams. In Chapter 4, a simple San Andreas-like model is constructed, consisting of two great earthquake producing faults separated by a freely-slipping shorter fault. The model inputs of lower crustal viscosity, fault separation distance, and relative breaking strengths are examined for their effect on fault communication. It is found that with a lower crustal viscosity of 10^18 Pa s (in the lower range of estimates for California), the two faults tend to synchronize their earthquake cycles, even in the cases where the faults have asymmetric breaking strengths. These models imply that postseismic stress transfer over hundreds of kilometers may play a
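The 10^18 Pa s lower-crustal viscosity cited above implies a short Maxwell relaxation time, which is why postseismic stress transfer between the faults is effective on earthquake-cycle timescales. A back-of-the-envelope check; the shear modulus of 30 GPa is an assumed typical crustal value, not a number from the abstract:

```python
# Maxwell relaxation time tau = eta / mu for a viscoelastic lower crust.
eta = 1e18     # viscosity, Pa s (value cited in the abstract)
mu = 30e9      # shear modulus, Pa (assumed typical value)

tau_seconds = eta / mu
tau_years = tau_seconds / (365.25 * 24 * 3600)
print(f"{tau_years:.1f} yr")   # relaxation time of order one year
```

A relaxation time of roughly a year is far shorter than inter-event times of centuries, so viscous flow can redistribute coseismic stress changes across the region well before the next earthquake.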

  10. A moment-tensor catalog for intermediate magnitude earthquakes in Mexico

    Science.gov (United States)

    Rodríguez Cardozo, Félix; Hjörleifsdóttir, Vala; Martínez-Peláez, Liliana; Franco, Sara; Iglesias Mendoza, Arturo

    2016-04-01

    is difficult. The selection of data windows and filter parameters is tedious without a tool that allows easy viewing of the data prior to the inversion. Therefore, we developed a graphical user interface (GUI), based on Python and the Python library ObsPy, that processes observed and synthetic seismograms in an iterative and interactive way prior to the inversion. The processing includes filtering, choosing and discarding traces, and manual adjustment of the time windows in which synthetic and observed seismograms will be compared. We calculate the Green's functions using the SPECFEM3D_GLOBE algorithm (Komatitsch et al., 2004), which employs a velocity model composed of a mantle model and a crustal model, S362ANI (Kustowski et al., 2008) and CRUST2.0 (Bassin et al., 2000), respectively. We invert the observed seismograms for the seismic moment tensor using a method developed for earthquakes in California (Liu et al., 2004) and implemented for earthquakes in Mexico (De la Vega, 2014). In this work, we introduce the GUI, the inversion method, and the results from the moment-tensor inversions obtained for intermediate-magnitude earthquakes (4.5
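At its core, moment-tensor inversion of the kind referenced in this record reduces to a linear least-squares problem d = G m, relating windowed seismogram samples d to the six independent moment-tensor components m through Green's-function kernels G. A schematic NumPy illustration with synthetic stand-ins for G and d (not real SPECFEM3D_GLOBE kernels):

```python
import numpy as np

# Build a synthetic linear system d = G @ m + noise and recover m by
# least squares, mirroring the structure of a moment-tensor inversion.
rng = np.random.default_rng(0)
n_samples, n_components = 500, 6
G = rng.standard_normal((n_samples, n_components))       # kernel matrix (stand-in)
m_true = np.array([1.0, -0.5, -0.5, 0.2, 0.1, 0.05])     # Mxx..Myz, scaled units
d = G @ m_true + 0.01 * rng.standard_normal(n_samples)   # noisy "observed" data

m_est, *_ = np.linalg.lstsq(G, d, rcond=None)            # least-squares solution
print(np.allclose(m_est, m_true, atol=0.01))             # True: components recovered
```

In practice G is built from SPECFEM-computed synthetics for six elementary moment tensors, and the data windows entering d are exactly what the GUI lets the analyst select and adjust.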

  11. Martin Marietta Paducah Gaseous Diffusion Plant comprehensive earthquake emergency management program

    International Nuclear Information System (INIS)

    1990-01-01

    Recognizing the value of a proactive, integrated approach to earthquake preparedness planning, Martin Marietta Energy Systems, Inc. initiated a contract in June 1989 with Murray State University, Murray, Kentucky, to develop a comprehensive earthquake management program for their Gaseous Diffusion Plant in Paducah, Kentucky (PGDP -- Subcontract No. 19P-JV649V). The overall purpose of the program is to mitigate the loss of life and property in the event of a major destructive earthquake. The program includes four distinct (yet integrated) components: (1) an emergency management plan with emphasis on the catastrophic earthquake; (2) an Emergency Operations Center Duty Roster Manual; (3) an Integrated Automated Emergency Management Information System (IAEMIS); and (4) a series of five training program modules. The PLAN itself comprises four separate volumes: Volume I -- Chapters 1--3; Volume II -- Chapters 4--6; Volume III -- Chapter 7; and Volume IV -- 23 Appendices. The EOC Manual (which includes 15 mutual aid agreements) is designated as Chapter 7 in the PLAN and is numbered in this document as Volume III

  12. The wireless networking system of Earthquake precursor mobile field observation

    Science.gov (United States)

    Wang, C.; Teng, Y.; Wang, X.; Fan, X.; Wang, X.

    2012-12-01

    A mobile field observation network can record and transmit large volumes of data reliably in real time, strengthening physical-signal observations in specific regions and periods and improving monitoring capacity and anomaly-tracking capability. Because current earthquake-precursor observation points are numerous and widely scattered, the networking technology is based on the McWILL broadband wireless access system. The communication system of earthquake-precursor mobile field observation transmits large volumes of data in real time and reliably from the measuring points to the monitoring center, through connections between the instruments, the broadband wireless access system, and the precursor mobile-observation management center system, thereby implementing remote instrument monitoring and data transmission. At present, the technology has been applied to fluxgate-magnetometer array geomagnetic observations at Tianzhu, Xichang, and Xinjiang, where it has enabled real-time monitoring of the working status of instruments deployed over a large area during the last two to three years of large-scale field operation. It can therefore provide refined local geomagnetic-field data and high-quality observational data for tracking and forecasting impending earthquakes. Although wireless networking is well suited to mobile field observation because the networking is simple and flexible, packet loss can occur when transmitting large volumes of observational data over a relatively weak, narrow-bandwidth wireless signal.
    For high-sampling-rate instruments, this project uses data compression, which effectively mitigates packet loss in data transmission; control commands, status data, and observational data are transmitted with different priorities and methods, which keeps the packet loss rate within
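The multi-priority transmission scheme mentioned at the end of this record can be illustrated with a simple priority queue; the three traffic classes and their ordering below are an assumption modeled on the description, not the project's actual protocol:

```python
import heapq

# Priority classes (assumed): control commands drain first, then status
# packets, then bulk observational data, so a congested narrow-bandwidth
# link sheds low-priority waveform data before losing commands.
COMMAND, STATUS, DATA = 0, 1, 2   # lower number = higher priority

queue = []
seq = 0

def enqueue(priority, payload):
    """Push a packet; the sequence number keeps FIFO order within a class."""
    global seq
    heapq.heappush(queue, (priority, seq, payload))
    seq += 1

enqueue(DATA, "waveform block 1")
enqueue(COMMAND, "restart magnetometer")
enqueue(STATUS, "battery 87%")
enqueue(DATA, "waveform block 2")

order = [heapq.heappop(queue)[2] for _ in range(len(queue))]
print(order)   # command first, then status, then data in arrival order
```

Under this scheme the time-critical command always leaves the queue ahead of bulk data, regardless of arrival order.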

  13. Rapid characterization of the 2015 Mw 7.8 Gorkha, Nepal, earthquake sequence and its seismotectonic context

    Science.gov (United States)

    Hayes, Gavin; Briggs, Richard; Barnhart, William D.; Yeck, William; McNamara, Daniel E.; Wald, David J.; Nealy, Jennifer; Benz, Harley M.; Gold, Ryan D.; Jaiswal, Kishor S.; Marano, Kristin; Earle, Paul S.; Hearne, Mike; Smoczyk, Gregory M.; Wald, Lisa A.; Samsonov, Sergey

    2015-01-01

    Earthquake response and related information products are important for placing recent seismic events into context and particularly for understanding the impact earthquakes can have on the regional community and its infrastructure. These tools are even more useful if they are available quickly, ahead of detailed information from the areas affected by such earthquakes. Here we provide an overview of the response activities and related information products generated and provided by the U.S. Geological Survey National Earthquake Information Center in association with the 2015 M 7.8 Gorkha, Nepal, earthquake. This group monitors global earthquakes 24 hrs/day and 7 days/week to provide rapid information on the location and size of recent events and to characterize the source properties, tectonic setting, and potential fatalities and economic losses associated with significant earthquakes. We present the timeline over which these products became available, discuss what they tell us about the seismotectonics of the Gorkha earthquake and its aftershocks, and examine how their information is used today, and might be used in the future, to help mitigate the impact of such natural disasters.

  14. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average/long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
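A minimal version of a short-term-average/long-term-average (STA/LTA) trigger applied to a tweet-frequency time series might look like the following; the window lengths, threshold, and synthetic counts are illustrative assumptions, not the USGS detector's actual parameters:

```python
import numpy as np

def sta_lta_detect(counts, sta_win=6, lta_win=60, threshold=5.0):
    """Return indices where the short-term average of the count series
    exceeds `threshold` times the long-term average (STA/LTA trigger)."""
    counts = np.asarray(counts, dtype=float)
    detections = []
    for i in range(lta_win, len(counts)):
        sta = counts[i - sta_win:i].mean()   # recent tweet rate
        lta = counts[i - lta_win:i].mean()   # background tweet rate
        if lta > 0 and sta / lta >= threshold:
            detections.append(i)
    return detections

# Hypothetical per-interval "earthquake" tweet counts: quiet background,
# then a burst shortly after a widely felt event.
background = [1, 0, 2, 1, 0, 1] * 12          # 72 quiet samples
burst = [1, 30, 45, 40, 25, 15]
dets = sta_lta_detect(background + burst)
print(dets)   # triggers only once the burst lifts the short-term average
```

The ratio form makes the trigger self-scaling: a city with chatty background tweet traffic needs a proportionally larger burst to fire, which is one reason the real detector can run globally with a single threshold.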

  15. Investigating on the Differences between Triggered and Background Seismicity in Italy and Southern California.

    Science.gov (United States)

    Stallone, A.; Marzocchi, W.

    2017-12-01

    Earthquake occurrence may be approximated by a multidimensional Poisson clustering process, where each point of the Poisson process is replaced by a cluster of points, the latter corresponding to the well-known aftershock sequence (triggered events). Earthquake clusters and their parents are assumed to occur according to a Poisson process at a constant temporal rate proportional to the tectonic strain rate, while events within a cluster are modeled as generations of dependent events reproduced by a branching process. Although the occurrence of such space-time clusters is a general feature across different tectonic settings, seismic sequences seem to have marked differences from region to region: one example, among many others, is that seismic sequences of moderate magnitude in the Italian Apennines seem to last longer than similar seismic sequences in California. In this work we investigate the existence of possible differences in the earthquake clustering process in these two areas. First, we separate the triggered and background components of seismicity in the Italian and Southern California seismic catalogs. Then we study the space-time domain of the triggered earthquakes with the aim of identifying possible variations in the triggering properties across the two regions. In the second part of the work we focus our attention on the characteristics of the background seismicity in both seismic catalogs. The assumption of time stationarity of the background seismicity (which includes both cluster parents and isolated events) is still under debate. Some authors suggest that the independent component of seismicity could undergo transient perturbations at various time scales due to different physical mechanisms, such as viscoelastic relaxation, the presence of fluids, or non-stationary plate motion, whose impact may depend on the tectonic setting. Here we test if the background seismicity in the two regions can be satisfactorily described by the time
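Separating triggered from background seismicity, as described in this record, is often approximated with a magnitude-dependent space-time window around each larger event (in the spirit of Gardner-Knopoff declustering, a simpler stand-in for the stochastic methods the abstract implies). The window laws and the toy catalog below are hypothetical:

```python
import math

def decluster(events):
    """events: list of (time_days, x_km, y_km, magnitude).
    Label an event 'triggered' (False) if it falls inside the
    magnitude-dependent space-time window of a larger earlier event."""
    background = [True] * len(events)
    for i, (t1, x1, y1, m1) in enumerate(events):
        t_win = 10.0 * 2 ** m1         # time window in days (assumed law)
        r_win = 5.0 * 2 ** (m1 / 2)    # distance window in km (assumed law)
        for j, (t2, x2, y2, m2) in enumerate(events):
            if j != i and m2 < m1 and 0 < t2 - t1 <= t_win:
                if math.hypot(x2 - x1, y2 - y1) <= r_win:
                    background[j] = False
    return background

catalog = [
    (0.0,    0.0,  0.0, 6.0),   # mainshock
    (0.5,    2.0,  1.0, 4.0),   # nearby early aftershock
    (3.0,   -1.0,  0.5, 3.5),   # nearby later aftershock
    (100.0, 80.0, 90.0, 4.5),   # distant, independent event
]
print(decluster(catalog))   # [True, False, False, True]
```

The two aftershocks fall inside the mainshock's window and are removed from the background; the distant event survives, so the background set retains only the (approximately Poissonian) independent seismicity whose stationarity the study then tests.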

  16. New Perspectives on Active Tectonics: Observing Fault Motion, Mapping Earthquake Strain Fields, and Visualizing Seismic Events in Multiple Dimensions Using Satellite Imagery and Geophysical Data Base

    Science.gov (United States)

    Crippen, R.; Blom, R.

    1994-01-01

    By rapidly alternating displays of SPOT satellite images acquired on 27 July 1991 and 25 July 1992 we are able to see spatial details of terrain movements along fault breaks associated with the 28 June 1992 Landers, California earthquake that are virtually undetectable by any other means.

  17. Coccidioidomycosis among Prison Inmates, California, USA, 2011

    Centers for Disease Control (CDC) Podcasts

    2015-02-26

    Dr. Charlotte Wheeler discusses Coccidioidomycosis among Prison Inmates in California.  Created: 2/26/2015 by National Center for Emerging and Zoonotic Infectious Diseases (NCEZID).   Date Released: 2/26/2015.

  18. California Ocean Uses Atlas: Non-Consumptive sector

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset is a result of the California Ocean Uses Atlas Project: a collaboration between NOAA's National Marine Protected Areas Center and Marine Conservation...

  19. Earthquakes, September-October 1986

    Science.gov (United States)

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  20. EARTHQUAKE-INDUCED DEFORMATION STRUCTURES AND RELATED TO EARTHQUAKE MAGNITUDES

    Directory of Open Access Journals (Sweden)

    Savaş TOPAL

    2003-02-01

    Full Text Available Earthquake-induced deformation structures, called seismites, may be helpful in classifying the paleoseismic history of a location and in estimating the magnitudes of potential future earthquakes. In this paper, seismites were investigated according to the types formed in deep and shallow lake sediments. Seismites are observed in the form of sand dikes, intruded and fractured gravels, and pillow structures in shallow lake sediments, and as pseudonodules, mushroom-like silts protruding into laminites, mixed layers, disturbed varved lamination, and loop bedding in deep lake sediments. Drawing on previous studies, these earthquake-induced deformation structures were ordered according to their formation and the associated earthquake magnitudes. In this ordering, the lowest earthquake magnitude is recorded by loop bedding and the highest by intruded and fractured gravels in lacustrine deposits.

  1. Preliminary maps of Quaternary deposits and liquefaction susceptibility, nine-county San Francisco Bay region, California: a digital database

    Science.gov (United States)

    Knudsen, Keith L.; Sowers, Janet M.; Witter, Robert C.; Wentworth, Carl M.; Helley, Edward J.; Nicholson, Robert S.; Wright, Heather M.; Brown, Katherine H.

    2000-01-01

    This report presents a preliminary map and database of Quaternary deposits and liquefaction susceptibility for the nine-county San Francisco Bay region, together with a digital compendium of ground effects associated with past earthquakes in the region. The report consists of (1) a spatial database of five data layers (Quaternary deposits, quadrangle index, and three ground effects layers) and two text layers (a labels and leaders layer for Quaternary deposits and for ground effects), (2) two small-scale colored maps (Quaternary deposits and liquefaction susceptibility), (3) a text describing the Quaternary map, liquefaction interpretation, and the ground effects compendium, and (4) the database description pamphlet. The nine counties surrounding San Francisco Bay straddle the San Andreas fault system, which exposes the region to serious earthquake hazard (Working Group on California Earthquake Probabilities, 1999). Much of the land adjacent to the Bay and the major rivers and streams is underlain by unconsolidated deposits that are particularly vulnerable to earthquake shaking and liquefaction of water-saturated granular sediment. This new map provides a modern and regionally consistent treatment of Quaternary surficial deposits that builds on the pioneering mapping of Helley and Lajoie (Helley and others, 1979) and such intervening work as Atwater (1982), Helley and others (1994), and Helley and Graymer (1997a and b). Like these earlier studies, the current mapping uses geomorphic expression, pedogenic soils, and inferred depositional environments to define and distinguish the map units. In contrast to the twelve map units of Helley and Lajoie, however, this new map uses a complex stratigraphy of some forty units, which permits a more realistic portrayal of the Quaternary depositional system. The two colored maps provide a regional summary of the new mapping at a scale of 1:275,000, a scale that is sufficient to show the general distribution and relationships of

  2. The Observation of Fault Finiteness and Rapid Velocity Variation in Pnl Waveforms for the Mw 6.5, San Simeon, California Earthquake

    Science.gov (United States)

    Konca, A. O.; Ji, C.; Helmberger, D. V.

    2004-12-01

    We observed the effect of fault finiteness in Pnl waveforms at regional distances (4° to 12°) for the Mw 6.5 San Simeon earthquake of 22 December 2003. We aimed to include more of the high frequencies (periods of 2 seconds and longer) than the studies that use regional data for focal solutions (periods of 5 to 8 seconds and longer). We calculated 1-D synthetic seismograms for the Pnl portion for both a point source and a finite-fault solution. Comparison of the point-source and finite-fault waveforms with data shows that the first several seconds of the point-source synthetics have considerably higher amplitude than the data, while the finite-fault synthetics do not have a similar problem. This can be explained by reversely polarized depth phases overlapping with the P waves from the later portion of the fault, causing smaller amplitudes in the beginning portion of the seismogram. This is clearly a finite-fault phenomenon and therefore cannot be explained by point-source calculations. Moreover, the point-source synthetics, which are calculated with a focal solution from a long-period regional inversion, overestimate the amplitude by three to four times relative to the data, while the finite-fault waveforms have amplitudes similar to the data. Hence, a moment estimate based only on the point-source solution of the regional data could be wrong by half a magnitude unit. We have also calculated the time shifts of synthetics relative to data that best fit the seismograms. Our results reveal that the paths from Central California to the south are faster than the paths to the east and north. The P-wave arrival at the TUC station in Arizona is 4 seconds earlier than predicted by the Southern California model, while most stations to the east are delayed by around 1 second. The observed higher uppermost-mantle velocities to the south are consistent with some recent tomographic models. Synthetics generated with these models significantly improve the fits and the

  3. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    Science.gov (United States)

    Madariaga, R.

    2013-05-01

    The 27 February 2010 Maule earthquake occurred in a well-identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data, and we review the information gathered so far. The event broke a region that was much longer along strike than the gap left by the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when it occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously been broken by a similar earthquake in 1751, although several events in the magnitude 8 range occurred in the area, principally the 1835 event already mentioned and, more recently, events on 1 December 1928 to the north and on 21 May 1960 (1 1/2 days before the great Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north is undergoing a very large increase in seismicity, with numerous clusters of seismicity that migrate along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730. This is the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions in M 8 range earthquakes, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but the region has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough stress from the 1730 rupture zone.
Geodetic data show that most of the region that broke in 1730 is currently almost fully locked from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  4. Triggered surface slips in the Salton Trough associated with the 1999 Hector Mine, California, earthquake

    Science.gov (United States)

    Rymer, M.J.; Boatwright, J.; Seekins, L.C.; Yule, J.D.; Liu, J.

    2002-01-01

    Surface fracturing occurred along the southern San Andreas, Superstition Hills, and Imperial faults in association with the 16 October 1999 (Mw 7.1) Hector Mine earthquake, making this at least the eighth time in the past 31 years that a regional earthquake has triggered slip along faults in the Salton Trough. Fractures associated with the event formed discontinuous breaks over a 39-km-long stretch of the San Andreas fault, from the Mecca Hills southeastward to Salt Creek and Durmid Hill, a distance from the epicenter of 107 to 139 km. The sense of slip was right lateral; only locally was there a minor (~1 mm) vertical component of slip. Dextral slip ranged from 1 to 13 mm. Maximum slip values, in 1999 as in earlier triggered-slip events, are most common in the central Mecca Hills. Field evidence indicates a transient opening as the Hector Mine seismic waves passed the southern San Andreas fault. Comparison of nearby strong-motion records indicates several periods of relative opening with the passage of the Hector Mine seismic waves; a similar process may have contributed to the field evidence of a transient opening. Slip on the Superstition Hills fault extended at least 9 km, at a distance from the Hector Mine epicenter of about 188 to 196 km. This length of slip is a minimum value, because we saw fresh surface breakage extending farther northwest than our measurement sites. The sense of slip was right lateral; locally there was a minor (~1 mm) vertical component of slip. Dextral slip ranged from 1 to 18 mm, with the largest amounts distributed (or skewed) away from the Hector Mine earthquake source. Slip triggered on the Superstition Hills fault commonly is skewed away from the earthquake source, most notably in 1968, 1979, and 1999. Surface slip on the Imperial fault and within the Imperial Valley extended about 22 km, representing a distance from the Hector Mine epicenter of about 204 to 226 km. The sense of slip was dominantly right lateral; the right-lateral component of slip

  5. Maps of Quaternary Deposits and Liquefaction Susceptibility in the Central San Francisco Bay Region, California

    Science.gov (United States)

    Witter, Robert C.; Knudsen, Keith L.; Sowers, Janet M.; Wentworth, Carl M.; Koehler, Richard D.; Randolph, Carolyn E.; Brooks, Suzanna K.; Gans, Kathleen D.

    2006-01-01

    This report presents a map and database of Quaternary deposits and liquefaction susceptibility for the urban core of the San Francisco Bay region. It supersedes the equivalent area of U.S. Geological Survey Open-File Report 00-444 (Knudsen and others, 2000), which covers the larger 9-county San Francisco Bay region. The report consists of (1) a spatial database, (2) two small-scale colored maps (Quaternary deposits and liquefaction susceptibility), (3) a text describing the Quaternary map and liquefaction interpretation (part 3), and (4) a text introducing the report and describing the database (part 1). All parts of the report are digital; part 1 describes the database and digital files and how to obtain them by downloading across the internet. The nine counties surrounding San Francisco Bay straddle the San Andreas fault system, which exposes the region to serious earthquake hazard (Working Group on California Earthquake Probabilities, 1999). Much of the land adjacent to the Bay and the major rivers and streams is underlain by unconsolidated deposits that are particularly vulnerable to earthquake shaking and liquefaction of water-saturated granular sediment. This new map provides a consistent detailed treatment of the central part of the 9-county region in which much of the mapping of Open-File Report 00-444 was either at smaller (less detailed) scale or represented only preliminary revision of earlier work. Like Open-File Report 00-444, the current mapping uses geomorphic expression, pedogenic soils, inferred depositional environments, and geologic age to define and distinguish the map units. Further scrutiny of the factors controlling liquefaction susceptibility has led to some changes relative to Open-File Report 00-444: particularly the reclassification of San Francisco Bay mud (Qhbm) to have only MODERATE susceptibility and the rating of artificial fills according to the Quaternary map units inferred to underlie them (other than dams - adf).
The two colored

  6. NASA Applied Sciences Disasters Program Support for the September 2017 Mexico Earthquakes

    Science.gov (United States)

    Glasscoe, M. T.; Kirschbaum, D.; Torres-Perez, J. L.; Yun, S. H.; Owen, S. E.; Hua, H.; Fielding, E. J.; Liang, C.; Bekaert, D. P.; Osmanoglu, B.; Amini, R.; Green, D. S.; Murray, J. J.; Stough, T.; Struve, J. C.; Seepersad, J.; Thompson, V.

    2017-12-01

    The 8 September M 8.1 Tehuantepec and 19 September M 7.1 Puebla earthquakes were among the largest earthquakes recorded in Mexico. These two events caused widespread damage, affecting several million people and causing numerous casualties. A team of event coordinators in the NASA Applied Sciences Program activated soon after these devastating earthquakes in order to support decision makers in Mexico, using NASA modeling and international remote sensing capabilities to generate decision support products to aid in response and recovery. The NASA Disasters Program promotes the use of Earth observations to improve the prediction of, preparation for, response to, and recovery from natural and technological disasters. For these two events, the Disasters Program worked with Mexico's space agency (Agencia Espacial Mexicana, AEM) and the National Center for Prevention of Disasters (Centro Nacional de Prevención de Desastres, CENAPRED) to generate products to support response, decision-making, and recovery. Products were also provided to academic partners, technical institutions, and field responders to support response. In addition, the Program partnered with the US Geological Survey (USGS), the Office of Foreign Disaster Assistance (OFDA), and other partners in order to provide information to federal and domestic agencies that were supporting event response. Leveraging the expertise of investigators at NASA Centers, products such as landslide susceptibility maps, precipitation models, and radar-based damage assessments and surface deformation maps were generated and used by AEM, CENAPRED, and others during the event. These were used by AEM in collaboration with other government agencies in Mexico to make appropriate decisions for mapping damage, rescue and recovery, and informing the population regarding areas prone to potential risk. We will provide an overview of the response activities and data products generated in support of the earthquake response, partnerships with

  7. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
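The energy-magnitude claim above, that a logarithmic transformation carries the Frechet distribution into the Gumbel distribution, can be checked numerically. This is a sketch under an assumed shape parameter, not the authors' computation: if X is Frechet with CDF exp(-x^(-alpha)), then log X has the Gumbel CDF exp(-exp(-alpha*y)).

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = 1.5  # assumed Frechet shape parameter, for illustration

# Frechet(alpha) samples by inverse-CDF: F(x) = exp(-x**(-alpha))
u = rng.uniform(size=100_000)
energy = (-np.log(u)) ** (-1.0 / alpha)

# "Magnitude" as log-energy; should follow a Gumbel law
magnitude = np.sort(np.log(energy))

# Compare the empirical CDF with the Gumbel CDF exp(-exp(-alpha*y))
empirical = np.arange(1, magnitude.size + 1) / magnitude.size
gumbel = np.exp(-np.exp(-alpha * magnitude))
max_gap = np.max(np.abs(empirical - gumbel))  # KS-style discrepancy
```

For 100,000 samples the discrepancy is at the level expected from sampling noise alone, consistent with the stated Frechet-to-Gumbel correspondence.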

  8. Watching the Creation of Southern California's Largest Reservoir

    Science.gov (United States)

    2001-01-01

    The new Diamond Valley Lake Reservoir near the city of Hemet in Riverside County is billed as the largest earthworks construction project in U.S. history. Construction began in 1995 and involved 31 million cubic meters of foundation excavation and 84 million cubic meters of embankment construction. This set of MISR images captures the most recent phase in the reservoir's activation. At the upper left is a natural-color view acquired by the instrument's vertical-viewing (nadir) camera on March 14, 2000 (Terra orbit 1273), shortly after the Metropolitan Water District began filling the reservoir with water from the Colorado River and Northern California. Water appears darker than the surrounding land. The image at the upper right was acquired nearly one year later on March 1, 2001 (Terra orbit 6399), and shows a clear increase in the reservoir's water content. When full, the lake will hold nearly a trillion liters of water. According to the Metropolitan Water District, the 7 kilometer x 3 kilometer reservoir nearly doubles Southern California's above-ground water storage capacity. In addition to routine water management, Diamond Valley Lake is designed to provide protection against drought and a six-month emergency supply in the event of earthquake damage to a major aqueduct. In the face of electrical power shortages, it is also expected to reduce dependence on the pumping of water from northern mountains during the high-demand summer months. An unexpected result of site excavation was the uncovering of mastodon and mammoth skeletons along with bones from extinct species not previously thought to have been indigenous to the area, such as the giant long-horned bison and North American lion. A museum and interpretive center is being built to protect these finds. The lower MISR image, from May 20, 2001 (Terra orbit 7564), is a false-color view combining data from the instrument's 26-degree forward view (displayed as blue) with data from the 26-degree backward view

  9. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    The geographic information system of the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of damage and the isoseismals of the earthquake are studied. By comparison with the standard earthquake intensity attenuation relationship, anomalous damage distributions of the earthquake are found, and their relationship with tectonics, site conditions, and basins is analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied. Finally, the implications of the anomalous damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction, and earthquake emergency response are discussed.

  10. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500 people.

  11. Wrightwood and the earthquake cycle: What a long recurrence record tells us about how faults work

    Science.gov (United States)

    Weldon, R.; Scharer, K.; Fumal, T.; Biasi, G.

    2004-01-01

    The concept of the earthquake cycle is so well established that one often hears statements in the popular media like, "the Big One is overdue" and "the longer it waits, the bigger it will be." Surprisingly, data to critically test the variability in recurrence intervals, rupture displacements, and relationships between the two are almost nonexistent. To generate a long series of earthquake intervals and offsets, we have conducted paleoseismic investigations across the San Andreas fault near the town of Wrightwood, California, excavating 45 trenches over 18 years, and can now provide some answers to basic questions about recurrence behavior of large earthquakes. To date, we have characterized at least 30 prehistoric earthquakes in a 6000-yr-long record, complete for the past 1500 yr and for the interval 3000-1500 B.C. For the past 1500 yr, the mean recurrence interval is 105 yr (31-165 yr for individual intervals) and the mean slip is 3.2 m (0.7-7 m per event). The series is slightly more ordered than random and has a notable cluster of events, during which strain was released at 3 times the long-term average rate. Slip associated with an earthquake is not well predicted by the interval preceding it, and only the largest two earthquakes appear to affect the time interval to the next earthquake. Generally, short intervals tend to coincide with large displacements and long intervals with small displacements. The most significant correlation we find is that earthquakes are more frequent following periods of net strain accumulation spanning multiple seismic cycles. The extent of paleoearthquake ruptures may be inferred by correlating event ages between different sites along the San Andreas fault. Wrightwood and other nearby sites experience rupture that could be attributed to overlap of relatively independent segments that each behave in a more regular manner. However, the data are equally consistent with a model in which the irregular behavior seen at Wrightwood
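The quoted statistics (a mean recurrence interval of 105 yr and a series "slightly more ordered than random") amount to a coefficient of variation below 1, the memoryless Poisson benchmark. The short computation below illustrates this test on a hypothetical interval series chosen only for demonstration; it is not the Wrightwood data:

```python
import numpy as np

# Hypothetical recurrence intervals in years (NOT the Wrightwood record),
# chosen so the mean matches the quoted ~105 yr
intervals = np.array([60, 130, 45, 110, 95, 160, 80, 120, 100, 150])

mean_interval = intervals.mean()
cov = intervals.std(ddof=1) / mean_interval
# cov < 1: more ordered (quasi-periodic) than a Poisson process
# cov == 1: consistent with memoryless (random) recurrence
# cov > 1: clustered recurrence
```

A long paleoseismic series like Wrightwood's is valuable precisely because the coefficient of variation, and any correlation between interval length and slip, can only be estimated reliably from many consecutive intervals.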

  12. Seismic response analysis of a 13-story steel moment-framed building in Alhambra, California

    Science.gov (United States)

    Rodgers, Janise E.; Sanli, Ahmet K.; Çelebi, Mehmet

    2004-01-01

    The seismic performance of steel moment-framed buildings has been of particular interest since brittle fractures were discovered at the beam-column connections of some frames following the M6.7 1994 Northridge earthquake. This report presents an investigation of the seismic behavior of an instrumented 13-story steel moment frame building located in the greater Los Angeles area of California. An extensive strong motion dataset, ambient vibration data, engineering drawings and earthquake damage reports are available for this building. The data are described and subsequently analyzed. The results of the analyses show that the building response is more complex than would be expected from its highly symmetrical geometry. The building's response is characterized by low damping in the fundamental mode, larger peak accelerations in the intermediate stories than at the roof, extended periods of vibration after the cessation of strong input shaking, beating in the response, and significant torsion during strong shaking at the top of the concrete piers which extend from the basement to the second floor. The analyses of the data and all damage detection methods employed except one method based on system identification indicate that the response of the structure was elastic in all recorded earthquakes. These findings are in general agreement with the results of intrusive inspections (meaning fireproofing and architectural finishes were removed) conducted on approximately 5 percent of the moment connections following the Northridge earthquake, which found no earthquake damage.

  13. Protecting your family from earthquakes: The seven steps to earthquake safety

    Science.gov (United States)

    Developed by American Red Cross, Asian Pacific Fund

    2007-01-01

    This book is provided here because of the importance of preparing for earthquakes before they happen. Experts say it is very likely there will be a damaging San Francisco Bay Area earthquake in the next 30 years and that it will strike without warning. It may be hard to find the supplies and services we need after this earthquake. For example, hospitals may have more patients than they can treat, and grocery stores may be closed for weeks. You will need to provide for your family until help arrives. To keep our loved ones and our community safe, we must prepare now. Some of us come from places where earthquakes are also common. However, the dangers of earthquakes in our homelands may be very different than in the Bay Area. For example, many people in Asian countries die in major earthquakes when buildings collapse or from big sea waves called tsunami. In the Bay Area, the main danger is from objects inside buildings falling on people. Take action now to make sure your family will be safe in an earthquake. The first step is to read this book carefully and follow its advice. By making your home safer, you help make our community safer. Preparing for earthquakes is important, and together we can make sure our families and community are ready. English version p. 3-13 Chinese version p. 14-24 Vietnamese version p. 25-36 Korean version p. 37-48

  14. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Science.gov (United States)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterward. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extended > 150 km NW from the hypocenter, longer than slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress
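The aftershock rule of thumb cited in the abstract (within roughly two rupture lengths of the main shock) can be encoded directly. The function name and the factor are illustrative; the ~150 km rupture length is taken from the abstract's statement that slip extended more than 150 km from the hypocenter.

```python
def is_traditional_aftershock(distance_km: float, rupture_length_km: float,
                              factor: float = 2.0) -> bool:
    """Rule of thumb: an event is likely an aftershock if it lies within
    about two rupture lengths of the main shock (timing criterion aside)."""
    return distance_km <= factor * rupture_length_km

RUPTURE_KM = 150.0  # M8.2 Tehuantepec slip extent from the geodetic model

print(is_traditional_aftershock(200.0, RUPTURE_KM))  # the two Mw 6.1 events -> True
print(is_traditional_aftershock(600.0, RUPTURE_KM))  # the Mw 7.1 Puebla event -> False
```

By this criterion the two Mw 6.1 events qualify as aftershocks while the Puebla earthquake does not, which is exactly why the abstract turns to Coulomb stress modeling to assess triggering at 600 km.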

  15. THE MISSING EARTHQUAKES OF HUMBOLDT COUNTY: RECONCILING RECURRENCE INTERVAL ESTIMATES, SOUTHERN CASCADIA SUBDUCTION ZONE

    Science.gov (United States)

    Patton, J. R.; Leroy, T. H.

    2009-12-01

    Earthquake and tsunami hazard for northwestern California and southern Oregon is predominantly based on estimates of recurrence for earthquakes on the Cascadia subduction zone and upper plate thrust faults, each with unique deformation and recurrence histories. Coastal northern California is uniquely located to enable us to distinguish these different sources of seismic hazard, as the accretionary prism extends on land in this region. This region experiences ground deformation from rupture of upper plate thrust faults like the Little Salmon fault. Most of this region is thought to be above the locked zone of the megathrust, so it is subject to vertical deformation during the earthquake cycle. Secondary evidence of earthquake history is found here in the form of marsh soils that coseismically subside and commonly are overlain by estuarine mud and, rarely, tsunami sand. It is not currently known what the source of the subsidence is for this region; it may be due to upper plate rupture, megathrust rupture, or a combination of the two. Given that many earlier investigations utilized bulk peat for 14C age determinations and that these early studies were largely reconnaissance work, they need to be reevaluated. Recurrence interval estimates are inconsistent when comparing terrestrial (~500 years) and marine (~220 years) data sets. This inconsistency may be due to 1) different sources of archival bias in marine and terrestrial data sets and/or 2) different sources of deformation. Factors controlling successful archiving of paleoseismic data are considered as this relates to geologic setting and how that might change through time. We compile, evaluate, and rank existing paleoseismic data in order to prioritize future paleoseismic investigations. 14C ages are recalibrated and quality assessments are made for each age determination. We then evaluate geologic setting and prioritize important research locations and goals based on these existing data. Terrestrial core

  16. Evidence for strong Holocene earthquake(s) in the Wabash Valley seismic zone

    International Nuclear Information System (INIS)

    Obermeier, S.

    1991-01-01

    Many small and slightly damaging earthquakes have taken place in the region of the lower Wabash River Valley of Indiana and Illinois during the 200 years of historic record. Seismologists have long suspected the Wabash Valley seismic zone to be capable of producing earthquakes much stronger than the largest of record (mb 5.8). The seismic zone contains the poorly defined Wabash Valley fault zone and also appears to contain other vaguely defined faults at depths from which the strongest earthquakes presently originate. Faults near the surface are generally covered with thick alluvium in lowlands and a veneer of loess in uplands, which make direct observations of faults difficult. Partly because of this difficulty, a search for paleoliquefaction features was begun in 1990. Conclusions of the study are as follows: (1) an earthquake much stronger than any historic earthquake struck the lower Wabash Valley between 1,500 and 7,500 years ago; (2) the epicentral region of the prehistoric strong earthquake was the Wabash Valley seismic zone; (3) apparent sites have been located where 1811-12 earthquake accelerations can be bracketed

  17. The ShakeOut scenario: A hypothetical Mw7.8 earthquake on the Southern San Andreas Fault

    Science.gov (United States)

    Porter, K.; Jones, L.; Cox, D.; Goltz, J.; Hudnut, K.; Mileti, D.; Perry, S.; Ponti, D.; Reichle, M.; Rose, A.Z.; Scawthorn, C.R.; Seligson, H.A.; Shoaf, K.I.; Treiman, J.; Wein, A.

    2011-01-01

    In 2008, an earthquake-planning scenario document was released by the U.S. Geological Survey (USGS) and California Geological Survey that hypothesizes the occurrence and effects of a Mw7.8 earthquake on the southern San Andreas Fault. It was created by more than 300 scientists and engineers. Fault offsets reach 13 m, with up to 8 m at lifeline crossings. Physics-based modeling was used to generate maps of shaking intensity, with peak ground velocities of 3 m/sec near the fault and exceeding 0.5 m/sec over 10,000 km². A custom HAZUS-MH analysis and 18 special studies were performed to characterize the effects of the earthquake on the built environment. The scenario posits 1,800 deaths and 53,000 injuries requiring emergency room care. Approximately 1,600 fires are ignited, resulting in the destruction of 200 million square feet of the building stock, the equivalent of 133,000 single-family homes. Fire contributes $87 billion in property and business interruption loss, out of the total $191 billion in economic loss, with most of the rest coming from shake-related building and content damage ($46 billion) and business interruption loss from water outages ($24 billion). Emergency response activities are depicted in detail, in an innovative grid showing activities versus time, a new format introduced in this study. © 2011, Earthquake Engineering Research Institute.
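    The abstract's loss figures can be tallied directly: the three itemized components account for most, but not all, of the $191 billion total. A small sketch using only the numbers quoted above:

```python
# Tallying the ShakeOut scenario loss components (billions of USD),
# using the figures quoted in the abstract.
losses = {
    "fire (property + business interruption)": 87,
    "shake-related building and content damage": 46,
    "business interruption from water outages": 24,
}
total = 191
itemized = sum(losses.values())
print(itemized)          # 157
print(total - itemized)  # 34 (remaining, unitemized losses)
```

    The $34 billion remainder covers loss categories not broken out in the abstract, consistent with its "most of the rest" phrasing.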

  18. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

    Science.gov (United States)

    Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

    2014-01-01

    On March 27, 1964, at 5:36 p.m., a magnitude 9.2 earthquake, the largest recorded earthquake in U.S. history, struck south-central Alaska (fig. 1). The Great Alaska Earthquake (also known as the Good Friday Earthquake) occurred at a pivotal time in the history of earth science, and helped lead to the acceptance of plate tectonic theory (Cox, 1973; Brocher and others, 2014). All large subduction zone earthquakes are understood through insights learned from the 1964 event, and observations and interpretations of the earthquake have influenced the design of infrastructure and seismic monitoring systems now in place. The earthquake caused extensive damage across the State, and triggered local tsunamis that devastated the Alaskan towns of Whittier, Valdez, and Seward. In Anchorage, the main cause of damage was ground shaking, which lasted approximately 4.5 minutes. Many buildings could not withstand this motion and were damaged or collapsed even though their foundations remained intact. More significantly, ground shaking triggered a number of landslides along coastal and drainage valley bluffs underlain by the Bootlegger Cove Formation, a composite of facies containing variably mixed gravel, sand, silt, and clay that was deposited over much of upper Cook Inlet during the late Pleistocene (Ulery and others, 1983). Cyclic (or strain) softening of the more sensitive clay facies caused overlying blocks of soil to slide sideways along surfaces dipping by only a few degrees. This guide is the document version of an interactive web map that was created as part of the commemoration events for the 50th anniversary of the 1964 Great Alaska Earthquake. It is accessible at the U.S. Geological Survey (USGS) Alaska Science Center website: http://alaska.usgs.gov/announcements/news/1964Earthquake/. The website features a map display with suggested tour stops in Anchorage, historical photographs taken shortly after the earthquake, repeat photography of selected sites, scanned documents

  19. Education for Earthquake Disaster Prevention in the Tokyo Metropolitan Area

    Science.gov (United States)

    Oki, S.; Tsuji, H.; Koketsu, K.; Yazaki, Y.

    2008-12-01

    Japan frequently suffers from all types of disasters, such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. In the first half of this year alone, we had three large earthquakes and heavy rainfall, which together killed more than 30 people. This is not unique to Japan: Asia is the most disaster-afflicted region in the world, accounting for about 90% of all those affected by disasters and more than 50% of the total fatalities and economic losses. One of the most essential ways to reduce the damage from natural disasters is to educate the general public so that they understand what is happening during such disasters. This enables individuals to make sound decisions about how to prevent or reduce the damage. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore solicited proposals to select several model areas for bringing scientific education to local elementary schools, and ERI, the Earthquake Research Institute, was chosen to develop education for earthquake disaster prevention in the Tokyo metropolitan area. The tectonic setting of this area is very complicated: the Pacific and Philippine Sea plates subduct beneath the North American and Eurasian plates. The subduction of the Philippine Sea plate causes megathrust earthquakes such as the 1703 Genroku earthquake (M 8.0) and the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to have a 70% probability of occurring within 30 years. This is of immediate concern because of the potential for devastating loss of life and property: the Tokyo urban region now has a population of 42 million and is the center of approximately 40% of the nation's activities, so such an earthquake could have great global economic repercussions. To better understand earthquakes in this region, the "Special Project for Earthquake Disaster Mitigation in Tokyo Metropolitan Area" has been conducted, mainly by ERI. It is a 4-year
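    For readers wanting to unpack the 70%-in-30-years figure: under a Poisson (memoryless) occurrence model, which is an assumption here since the abstract does not state the model the evaluation used, that probability implies an annual rate and mean recurrence time as follows.

```python
import math

# If P(at least one event in t years) = 1 - exp(-rate * t),
# then rate = -ln(1 - P) / t. The Poisson model is assumed for
# illustration; the abstract does not specify the hazard model.
p30, t = 0.70, 30.0
rate = -math.log(1 - p30) / t      # events per year
mean_interval = 1.0 / rate         # mean recurrence time, years
print(round(rate, 4))              # 0.0401
print(round(mean_interval, 1))     # 24.9
```

    An implied mean recurrence of roughly 25 years for M ≥ 7 events beneath the capital region conveys why the hazard figure drives such intensive mitigation work; an actual evaluation would use renewal models fit to the historical catalog rather than this memoryless sketch.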

  20. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    Science.gov (United States)

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.
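    The forecast logic described above is essentially a projection from the nearly constant repeat time. A minimal sketch of that projection, using only the 83 ± 9-year interval and the 1985 event quoted in the abstract (the one-sigma window is illustrative, not a statement of how the original forecast treated uncertainty):

```python
# Projecting the next expected event from a nearly constant repeat time,
# as quoted in the abstract: 83 +/- 9 years, last great event in 1985.
last_event, repeat, sigma = 1985, 83, 9
expected = last_event + repeat
window = (expected - sigma, expected + sigma)
print(expected)  # 2068
print(window)    # (2059, 2077)
```

    The abstract's key caveat still applies: repeat time is nearly constant while rupture length varies by a factor of about 3, which is inconsistent with time- and slip-predictable models, so a fixed-interval projection like this is a heuristic rather than a physical prediction.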