WorldWideScience

Sample records for california earthquake center

  1. Building the Southern California Earthquake Center

    Science.gov (United States)

    Jordan, T. H.; Henyey, T.; McRaney, J. K.

    2004-12-01

    Kei Aki was the founding director of the Southern California Earthquake Center (SCEC), a multi-institutional collaboration formed in 1991 as a Science and Technology Center (STC) under the National Science Foundation (NSF) and the U.S. Geological Survey (USGS). Aki and his colleagues articulated a system-level vision for the Center: investigations by disciplinary working groups would be woven together into a "Master Model" for Southern California. In this presentation, we will outline how the Master-Model concept has evolved and how SCEC's structure has adapted to meet the scientific challenges of system-level earthquake science. In its first decade, SCEC conducted two regional imaging experiments (LARSE I & II); published the "Phase-N" reports on (1) the Landers earthquake, (2) a new earthquake rupture forecast for Southern California, and (3) new models for seismic attenuation and site effects; developed two prototype "Community Models" (the Crustal Motion Map and Community Velocity Model); and, perhaps most important, sustained a long-term, multi-institutional, interdisciplinary collaboration. The latter fostered pioneering numerical simulations of earthquake ruptures, fault interactions, and wave propagation. These accomplishments provided the impetus for a successful proposal in 2000 to reestablish SCEC as a "stand-alone" center under NSF/USGS auspices. SCEC remains consistent with the founders' vision: it continues to advance seismic hazard analysis through a system-level synthesis based on community models and an ever-expanding array of information technology. SCEC now represents a fully articulated "collaboratory" for earthquake science, and many of its features are extensible to other active-fault systems and other system-level collaborations. We will discuss the implications of the SCEC experience for EarthScope, the USGS's program in seismic hazard analysis, NSF's nascent Cyberinfrastructure Initiative, and other large collaboratory programs.

  2. Southern California Earthquake Center (SCEC) Summer Internship Programs

    Science.gov (United States)

    Benthien, M. L.; Perry, S.; Jordan, T. H.

    2004-12-01

    For the eleventh consecutive year, the Southern California Earthquake Center (SCEC) coordinated undergraduate research experiences in summer 2004, allowing 35 students with a broad array of backgrounds and interests to work with the world's preeminent earthquake scientists and specialists. Students participate in interdisciplinary, system-level earthquake science and information technology research, as well as several group activities throughout the summer. Funding for student stipends and activities is made possible by the NSF Research Experiences for Undergraduates (REU) program. SCEC coordinates two intern programs: the SCEC Summer Undergraduate Research Experience (SCEC/SURE) and the SCEC Undergraduate Summer in Earthquake Information Technology (SCEC/USEIT). SCEC/SURE interns work one-on-one with SCEC scientists at their institutions on a variety of earthquake science research projects. The goals of the program are to expand student participation in the earth sciences and related disciplines, to encourage students to consider careers in research and education, and to increase diversity of students and researchers in the earth sciences. Thirteen students participated in this program in 2004. SCEC/USEIT is an NSF REU site that brings undergraduate students from across the country to the University of Southern California each summer. SCEC/USEIT interns interact in a team-oriented research environment and are mentored by some of the nation's most distinguished geoscience and computer science researchers. The goals of the program are to allow undergraduates to use advanced tools of information technology to solve problems in earthquake research; close the gap between computer science and geoscience; and engage non-geoscience majors in the application of earth science to the practical problems of reducing earthquake risk. SCEC/USEIT summer research goals are structured around a grand challenge problem in earthquake information technology. 
For the past three years the students have

  3. Northern California Earthquake Data Center: Data Sets and Data Services

    Science.gov (United States)

    Neuhauser, D. S.; Allen, R. M.; Zuzlewski, S.

    2015-12-01

    The Northern California Earthquake Data Center (NCEDC) provides a permanent archive and real-time data distribution services for a unique and comprehensive collection of seismological and geophysical data sets encompassing northern and central California. We provide access to over 85 terabytes of continuous and event-based time series data from broadband, short-period, strong motion, and strain sensors as well as continuous and campaign GPS data at both standard and high sample rates. The Northern California Seismic System (NCSS), operated by UC Berkeley and USGS Menlo Park, has recorded over 900,000 events from 1984 to the present, and the NCEDC serves catalog, parametric information, moment tensors and first motion mechanisms, and time series data for these events. We also serve event catalogs, parametric information, and event waveforms for DOE enhanced geothermal system monitoring in northern California and Nevada. The NCEDC provides several ways for users to access these data. The most recent development is web services, which provide interactive, command-line, or program-based workflow access to data. Web services use well-established server and client protocols and RESTful software architecture that allow users to easily submit queries and receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC supports all FDSN-defined web services as well as a number of IRIS-defined and NCEDC-defined services. We also continue to support older email-based and browser-based access to data. NCEDC data and web services can be found at http://www.ncedc.org and http://service.ncedc.org.
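
    The FDSN web services mentioned above follow a published URL convention. As a minimal sketch of how an event-catalog query against the NCEDC endpoint might be assembled (the path and parameter names come from the FDSN web-service specification; the dates and magnitude threshold are illustrative, and no request is actually sent):

```python
from urllib.parse import urlencode

# FDSN event web service endpoint at the NCEDC (path per the FDSN spec).
NCEDC_EVENT = "http://service.ncedc.org/fdsnws/event/1/query"

def event_query_url(starttime, endtime, minmagnitude, fmt="text"):
    """Build an FDSN-style event-catalog query URL (no network call)."""
    params = {
        "starttime": starttime,
        "endtime": endtime,
        "minmagnitude": minmagnitude,
        "format": fmt,
    }
    return NCEDC_EVENT + "?" + urlencode(params)

# Illustrative query: M >= 3.0 events on 2014-08-24 (the day of the
# South Napa earthquake).
url = event_query_url("2014-08-24", "2014-08-25", 3.0)
```

    Pasting such a URL into a browser, or fetching it from a script, returns the catalog rows directly, which is the "interactive, command-line, or program-based" access the abstract describes.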

  4. Southern California Earthquake Center (SCEC) Communication, Education and Outreach Program

    Science.gov (United States)

    Benthien, M. L.

    2003-12-01

    The SCEC Communication, Education, and Outreach Program (CEO) offers student research experiences, web-based education tools, classroom curricula, museum displays, public information brochures, online newsletters, and technical workshops and publications. This year, much progress has been made on the development of the Electronic Encyclopedia of Earthquakes (E3), a collaborative project with CUREE and IRIS. The E3 development system is now fully operational, and 165 entries are in the pipeline. When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth science, engineering, physics, and mathematics. To coordinate activities for the 10-year anniversary of the Northridge Earthquake in 2004 (and beyond), the "Earthquake Country Alliance" is being organized by SCEC CEO to present common messages, to share or promote existing resources, and to develop new activities and products jointly (such as a new version of Putting Down Roots in Earthquake Country). The group includes earthquake science and engineering researchers and practicing professionals, preparedness experts, response and recovery officials, news media representatives, and education specialists. A web portal, http://www.earthquakecountry.info, is being developed, with links to web pages and descriptions of other resources and services that the Alliance members provide. Another ongoing strength of SCEC is the Summer Intern program, which now has a year-round counterpart with students working on IT projects at USC. Since Fall 2002, over 32 students have participated in the program, including 7 students working with scientists throughout SCEC, 17 students involved in the USC "Earthquake Information Technology" intern program, and 7 students involved in CEO projects. 
These and other activities of the SCEC CEO program will be presented, along with lessons learned during program design and

  5. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    Science.gov (United States)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC currently provides the following Web Services: (1) station inventory and channel response information delivered in StationXML format, (2) channel response information delivered in RESP format, (3) time series availability delivered in text and XML formats, (4) single-channel and bulk data requests delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and RMS. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These
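
    The planned catalog service returns QuakeML. As a rough illustration of what consuming such a response involves, the sketch below parses a minimal, hand-written QuakeML 1.2 fragment with the Python standard library; real catalog responses carry many more elements, and the publicID values here are invented:

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written QuakeML 1.2 fragment of the kind an event
# service returns; the publicIDs and event values are invented.
QUAKEML = """\
<q:quakeml xmlns:q="http://quakeml.org/xmlns/quakeml/1.2"
           xmlns="http://quakeml.org/xmlns/bed/1.2">
  <eventParameters publicID="smi:local/catalog">
    <event publicID="smi:local/event/1">
      <origin publicID="smi:local/origin/1">
        <time><value>2011-10-20T12:34:56</value></time>
        <latitude><value>37.86</value></latitude>
        <longitude><value>-122.25</value></longitude>
        <depth><value>8000</value></depth>
      </origin>
      <magnitude publicID="smi:local/mag/1">
        <mag><value>4.0</value></mag>
      </magnitude>
    </event>
  </eventParameters>
</q:quakeml>
"""

# QuakeML 1.2 puts event elements in the "bed" namespace.
BED = "{http://quakeml.org/xmlns/bed/1.2}"

def parse_events(xml_text):
    """Extract (latitude, longitude, magnitude) tuples from a QuakeML string."""
    root = ET.fromstring(xml_text)
    events = []
    for ev in root.iter(BED + "event"):
        lat = float(ev.find(f"{BED}origin/{BED}latitude/{BED}value").text)
        lon = float(ev.find(f"{BED}origin/{BED}longitude/{BED}value").text)
        mag = float(ev.find(f"{BED}magnitude/{BED}mag/{BED}value").text)
        events.append((lat, lon, mag))
    return events
```

    The namespace handling is the main stumbling block in practice; dedicated clients such as ObsPy hide it, but the underlying structure is as above.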

  6. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    Science.gov (United States)

    Yu, E.; Bhaskaran, A.; Chen, S. L.; Andrews, J. R.; Thomas, V. I.; Hauksson, E.; Clayton, R. W.

    2016-12-01

    The Southern California Earthquake Data Center (SCEDC) archives continuous and triggered data from 9,429 data channels at 513 stations recorded by the Southern California Seismic Network. The SCEDC provides public access to these earthquake parametric and waveform data through web services, its website http://scedc.caltech.edu, and client applications such as STP. This poster will describe the most recent significant developments at the SCEDC. The SCEDC now provides web services to access its holdings. Event Parametric Data (FDSN Compliant): http://service.scedc.caltech.edu/fdsnws/event/1/ Station Metadata (FDSN Compliant): http://service.scedc.caltech.edu/fdsnws/station/1/ Waveforms (FDSN Compliant): http://service.scedc.caltech.edu/fdsnws/dataselect/1/ Event Windowed Waveforms, phases: http://service.scedc.caltech.edu/webstp/ In an effort to assist researchers accessing catalogs from multiple seismic networks, the SCEDC has entered its earthquake parametric catalog into the ANSS Common Catalog (ComCat); origin, phase, and magnitude information have been loaded. The SCEDC data holdings now include a double-difference catalog (Hauksson et al., 2011) spanning 1981 through 2015, available via STP, and a focal mechanism catalog (Yang et al., 2011). As part of a NASA/AIST project in collaboration with JPL and SIO, the SCEDC now archives and distributes real-time 1 Hz streams of GPS displacement solutions from the California Real Time Network. The SCEDC has implemented the Continuous Wave Buffer (CWB) to manage its waveform archive and allow users to access continuous data within seconds of real time; this software was developed at, and is currently in use by, the NEIC. The SCEDC has moved its website (http://scedc.caltech.edu) to the cloud: the Recent Earthquake Map and static web pages are now hosted by Amazon Web Services, which enables the website to serve a large number of users without competing for resources needed by SCSN/SCEDC mission-critical operations.
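
    The dataselect endpoint listed above accepts the FDSN-standard channel selection parameters. A hedged sketch of building such a waveform request URL (CI.PAS is a real SCSN station code, but the channel and time window are illustrative and should be checked against current holdings; no request is actually sent):

```python
from urllib.parse import urlencode

# FDSN dataselect endpoint advertised by the SCEDC.
SCEDC_DATASELECT = "http://service.scedc.caltech.edu/fdsnws/dataselect/1/query"

def waveform_url(net, sta, loc, cha, start, end):
    """Build an FDSN dataselect query URL for one channel and time window."""
    params = {"net": net, "sta": sta, "loc": loc, "cha": cha,
              "start": start, "end": end}
    return SCEDC_DATASELECT + "?" + urlencode(params)

# "--" is the FDSN convention for a blank location code; the station,
# channel, and window below are illustrative only.
url = waveform_url("CI", "PAS", "--", "BHZ",
                   "2016-01-01T00:00:00", "2016-01-01T00:10:00")
```

    The response to such a request is raw MiniSEED, which is then read with a SEED-aware library rather than parsed by hand.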

  7. Web Services and Data Enhancements at the Northern California Earthquake Data Center

    Science.gov (United States)

    Neuhauser, D. S.; Zuzlewski, S.; Lombard, P. N.; Allen, R. M.

    2013-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC offers the following web services that are compliant with the International Federation of Digital Seismograph Networks (FDSN) web services specifications: (1) fdsn-dataselect: time series data delivered in MiniSEED format, (2) fdsn-station: station and channel metadata and time series availability delivered in StationXML format, (3) fdsn-event: earthquake event information delivered in QuakeML format. In addition, the NCEDC offers the following IRIS-compatible web services: (1) sacpz: provide channel gains, poles, and zeros in SAC format, (2) resp: provide channel response information in RESP format, (3) dataless: provide station and channel metadata in Dataless SEED format. The NCEDC is also developing a web service to deliver time series from pre-assembled event waveform gathers. The NCEDC has waveform gathers for ~750,000 northern and central California events from 1984 to the present, many of which were created by the USGS NCSN prior to the establishment of the joint NCSS (Northern California Seismic System). We are currently adding waveforms to these older event gathers with time series from the UCB networks and other networks with waveforms archived at the NCEDC, and ensuring that the waveform for each channel in the event gathers has the highest
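
    The sacpz service described above delivers channel gains, poles, and zeros. A small sketch of how such a response is typically used: evaluating the amplitude of the transfer function H(s) = constant · Π(s − z) / Π(s − p) at s = i2πf. The single pole below is made up for illustration and does not describe any real channel:

```python
import math

def response(freq_hz, zeros, poles, constant):
    """Amplitude of H(s) = constant * prod(s - z) / prod(s - p), s = i*2*pi*f,
    the transfer function a SAC poles-and-zeros (sacpz) file describes."""
    s = 2j * math.pi * freq_hz
    num = 1.0 + 0j
    for z in zeros:
        num *= (s - z)
    den = 1.0 + 0j
    for p in poles:
        den *= (s - p)
    return abs(constant * num / den)

# Illustrative single-pole low-pass: unit gain at DC, rolling off above
# the corner frequency |p| / (2*pi) Hz.  Invented values, not a real channel.
amp_dc = response(0.0, zeros=[], poles=[-1.0 + 0j], constant=1.0)
```

    Dividing observed spectra by this response (with the real pole/zero values from the service) is the standard first step in removing the instrument from a recording.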

  8. Data Sets and Data Services at the Northern California Earthquake Data Center

    Science.gov (United States)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2014-12-01

    The Northern California Earthquake Data Center (NCEDC) houses a unique and comprehensive data archive and provides real-time services for a variety of seismological and geophysical data sets that encompass northern and central California. We have over 80 terabytes of continuous and event-based time series data from broadband, short-period, strong motion, and strain sensors as well as continuous and campaign GPS data at both standard and high sample rates in both raw and RINEX format. The Northern California Seismic System (NCSS), operated by UC Berkeley and USGS Menlo Park, has recorded over 890,000 events from 1984 to the present, and the NCEDC provides catalog, parametric information, moment tensors and first motion mechanisms, and time series data for these events. We also host and provide event catalogs, parametric information, and event waveforms for DOE enhanced geothermal system monitoring in northern California and Nevada. The NCEDC provides a variety of ways for users to access these data. The most recent development is web services, which provide interactive, command-line, or program-based workflow access to data. Web services use well-established server and client protocols and RESTful software architecture that allow users to easily submit queries and receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC supports all FDSN-defined web services as well as a number of IRIS-defined and NCEDC-defined services. We also continue to support older email-based and browser-based access to data. NCEDC data and web services can be found at http://www.ncedc.org and http://service.ncedc.org.

  9. Data and Visualizations in the Southern California Earthquake Center's Fault Information System

    Science.gov (United States)

    Perry, S.

    2003-12-01

    The Southern California Earthquake Center's Fault Information System (FIS) provides a single point of access to fault-related data and models from multiple databases and datasets. The FIS is built of computer code, metadata, and Web interfaces based on Web services technology, which enables queries and data interchange irrespective of computer software or platform. Currently we have working prototypes of programmatic and browser-based access. The first generation FIS may be searched and downloaded live, by automated processes, as well as interactively, by humans using a browser. Users get ASCII data in plain text or encoded in XML. Via the Earthquake Information Technology (EIT) Interns (Juve and others, this meeting), we are also testing the effectiveness of querying multiple databases using a fault database ontology. For more than a decade, the California Geological Survey (CGS), SCEC, and the U.S. Geological Survey (USGS) have put considerable, shared resources into compiling and assessing published fault data, then providing the data on the Web. Several databases now exist, with different formats, datasets, purposes, and users, in various stages of completion. When fault databases were first envisioned, the full power of today's internet was not yet recognized, and the databases became the Web equivalents of review papers, where one could read an overview summation of a fault, then copy and paste pertinent data. Today, numerous researchers also require rapid queries and downloads of data. Consequently, the first components of the FIS are MySQL databases that deliver numeric values from earlier, text-based databases. Another essential service provided by the FIS is visualizations of fault representations such as those in SCEC's Community Fault Model. The long-term goal is to provide a standardized, open-source, platform-independent visualization technique. 
Currently, the FIS makes available fault model viewing software for users with access to Matlab or Java3D

  10. Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California

    Science.gov (United States)

    Degroot, R. M.; Springer, K.; Brooks, C. J.; Schuman, L.; Dalton, D.; Benthien, M. L.

    2009-12-01

    In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at Fingerprints Youth Museum in Hemet. These projects, and involvement with the San Bernardino County Museum in Redlands beginning in 2007, led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for the development of the network was to broaden participation in The Great Southern California ShakeOut. By 2009 it had grown more comprehensive in scope, evolving into a statewide network. EPIcenters constitute a variety of free-choice learning institutions, representing museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year round. Network members at Kidspace Museum in Pasadena and San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and safety and security. 
This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this

  11. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    Science.gov (United States)

    Yu, E.; Bhaskaran, A.; Chen, S.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2010-12-01

    The SCEDC, in cooperation with QCN and CSN, is exploring ways to archive and distribute data from high-density, low-cost networks. As a starting point, the SCEDC will store a dataset from QCN and CSN and distribute it through a separate STP client. New archival methods: ● The SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing services such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client developed to access it in a manner similar to STP. XML formats: ● The SCEDC is now distributing earthquake parameter data through web services in QuakeML format. ● The SCEDC, in collaboration with the Northern California Earthquake Data Center (NCEDC) and USGS Golden, has reviewed and revised the StationXML format to produce version 2.0. The new version includes rules on extending the schema, use of named complex types, and greater consistency in naming conventions. Based on this work we plan to develop readers and writers of the StationXML format.

  12. Earthquake education in California

    Science.gov (United States)

    MacCabe, M. P.

    1980-01-01

    In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern. 

  13. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    Science.gov (United States)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  14. Earthquakes and faults in southern California (1970-2010)

    Science.gov (United States)

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.

    2012-01-01

    The map depicts both active and inactive faults and earthquakes of magnitude 1.5 to 7.3 in southern California (1970–2010). The bathymetry was generated from digital files from the California Department of Fish and Game, Marine Region, Coastal Bathymetry Project. Elevation data are from the U.S. Geological Survey National Elevation Database. The Landsat satellite image is derived from fourteen Landsat 5 Thematic Mapper scenes collected between 2009 and 2010. Fault data are reproduced with permission from 2006 California Geological Survey and U.S. Geological Survey data. The earthquake data are from the U.S. Geological Survey National Earthquake Information Center.

  15. The magnitude distribution of earthquakes near Southern California faults

    Science.gov (United States)

    Page, M.T.; Alderson, D.; Doyle, J.

    2011-01-01

    We investigate seismicity near faults in the Southern California Earthquake Center Community Fault Model. We search for anomalously large events that might be signs of a characteristic earthquake distribution. We find that seismicity near major fault zones in Southern California is well modeled by a Gutenberg-Richter distribution, with no evidence of characteristic earthquakes within the resolution limits of the modern instrumental catalog. However, the b value of the locally observed magnitude distribution is found to depend on distance to the nearest mapped fault segment, which suggests that earthquakes nucleating near major faults are likely to have larger magnitudes relative to earthquakes nucleating far from major faults. Copyright 2011 by the American Geophysical Union.
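
    The Gutenberg-Richter fit referred to above is commonly summarized by the b value. A minimal sketch of the standard Aki (1965) maximum-likelihood estimator, assuming a catalog that is complete at and above m_min (for binned magnitudes a bin-width correction would also be needed):

```python
import math

def b_value(magnitudes, m_min):
    """Aki (1965) maximum-likelihood b-value estimate for a catalog
    that is complete at and above magnitude m_min."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    # b = log10(e) / (mean magnitude - completeness magnitude)
    return math.log10(math.e) / (mean_m - m_min)

# Synthetic magnitudes chosen so the estimate comes out near b = 1.
b = b_value([3.0, 3.4343, 3.8686], 3.0)  # b is approximately 1.0
```

    Mapping this estimate as a function of distance to the nearest fault segment is, in outline, how a distance-dependent b value like the one reported above is obtained.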

  16. Prospective Tests of Southern California Earthquake Forecasts

    Science.gov (United States)

    Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.; Kagan, Y. Y.; Helmstetter, A.; Wiemer, S.; Field, N.

    2004-12-01

    We are testing earthquake forecast models prospectively using likelihood ratios. Several investigators have developed such models as part of the Southern California Earthquake Center's project called Regional Earthquake Likelihood Models (RELM). Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. Here we describe the testing procedure and present preliminary results. Forecasts are expressed as the yearly rate of earthquakes within pre-specified bins of longitude, latitude, magnitude, and focal mechanism parameters. We test models against each other in pairs, which requires that both forecasts in a pair be defined over the same set of bins. For this reason we specify a standard "menu" of bins and ground rules to guide forecasters in using common descriptions. One menu category includes five-year forecasts of magnitude 5.0 and larger. Contributors will be requested to submit forecasts in the form of a vector of yearly earthquake rates on a 0.1 degree grid at the beginning of the test. Focal mechanism forecasts, when available, are also archived and used in the tests. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.1 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. Tests are based on the log likelihood scores derived from the probability that future earthquakes would occur where they do if a given forecast were true [Kagan and Jackson, J. Geophys. Res.,100, 3,943-3,959, 1995]. 
For each pair of forecasts, we compute alpha, the probability that the first would be wrongly rejected in favor of the second, and beta, the probability
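
    The log-likelihood scores described above are typically computed bin by bin under a Poisson assumption on the forecast rates. A minimal sketch, with invented rates and observed counts; the actual RELM procedure adds simulation-based significance tests (the alpha and beta of the abstract) on top of this:

```python
import math

def log_likelihood(rates, counts):
    """Joint Poisson log-likelihood of observed earthquake counts per bin,
    given a forecast of expected yearly rates per bin."""
    return sum(-lam + n * math.log(lam) - math.lgamma(n + 1)
               for lam, n in zip(rates, counts))

# Two hypothetical forecasts over the same three bins, and the counts
# actually observed; all numbers are invented for illustration.
ll_a = log_likelihood([0.5, 1.0, 0.2], [1, 1, 0])
ll_b = log_likelihood([0.1, 2.0, 0.1], [1, 1, 0])
ratio = ll_a - ll_b  # positive means forecast A fits these data better
```

    This is why both forecasts in a pair must be defined over the same set of bins: the ratio is only meaningful when the two sums run over identical cells.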

  17. USGS California Water Science Center water programs in California

    Science.gov (United States)

    Shulters, Michael V.

    2005-01-01

    California is threatened by many natural hazards—fire, floods, landslides, earthquakes. The State is also threatened by longer-term problems, such as hydrologic effects of climate change, and human-induced problems, such as overuse of ground water and degradation of water quality. The threats and problems are intensified by increases in population, which has risen to nearly 36.8 million. For the USGS California Water Science Center, providing scientific information to help address hazards, threats, and hydrologic issues is a top priority. To meet the demands of a growing California, USGS scientific investigations are helping State and local governments improve emergency management, optimize resources, collect contaminant-source and -mobility information, and improve surface- and ground-water quality. USGS hydrologic studies and data collection throughout the State give water managers quantifiable and detailed scientific information that can be used to plan for development and to protect and more efficiently manage resources. The USGS, in cooperation with state, local, and tribal agencies, operates more than 500 instrument stations, which monitor streamflow, ground-water levels, and surface- and ground-water constituents to help protect water supplies and predict the threats of natural hazards. The following are some of the programs implemented by the USGS, in cooperation with other agencies, to obtain and analyze information needed to preserve California's environment and resources.

  18. THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People

    Science.gov (United States)

    Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.

    2008-12-01

    Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, the Southern California Earthquake Center, and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008 as the ShakeOut Scenario, a description of the impacts and consequences of a M7.8 earthquake on the Southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce the impact on their lives. Residents, organizations, schools, and businesses registered to participate in the drill through www.shakeout.org, where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a reality that is regularly discussed. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels from individuals and families, to corporations and governments. Actions taken as a result of the ShakeOut include the adoption of earthquake

  19. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region, in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
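    Forecast comparisons of the RELM type are commonly scored with a cell-by-cell Poisson log-likelihood. The sketch below is a minimal, hypothetical illustration of that scoring idea; the cell rates and observed counts are invented toy numbers, not actual RELM submissions:

```python
import math

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of the observed earthquake counts, assuming an
    independent Poisson distribution with the forecast rate in each cell."""
    total = 0.0
    for rate, n in zip(forecast_rates, observed_counts):
        # log P(N = n | rate) = -rate + n*log(rate) - log(n!)
        total += -rate + n * math.log(rate) - math.lgamma(n + 1)
    return total

# Toy example: two competing forecasts over four cells, with one observed
# earthquake in the third cell.
observed   = [0, 0, 1, 0]
forecast_a = [0.01, 0.02, 0.50, 0.05]      # concentrates rate where the event occurred
forecast_b = [0.145, 0.145, 0.145, 0.145]  # spreads the same total rate evenly

ll_a = poisson_log_likelihood(forecast_a, observed)
ll_b = poisson_log_likelihood(forecast_b, observed)
print(ll_a > ll_b)  # True: the concentrated forecast scores higher
```

    A forecast that concentrates probability in the cells where earthquakes actually occur earns a higher joint likelihood, which is one sense in which a submission can be called more "successful" than another.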

  20. Modified Mercalli intensities for some recent California earthquakes and historic San Francisco Bay Region earthquakes

    Science.gov (United States)

    Bakun, William H.

    1998-01-01

    Modified Mercalli Intensity (MMI) data for recent California earthquakes were used by Bakun and Wentworth (1997) to develop a strategy for bounding the location and moment magnitude M of earthquakes from MMI observations only. Bakun (Bull. Seismol. Soc. Amer., submitted) used the Bakun and Wentworth (1997) strategy to analyze 19th century and early 20th century San Francisco Bay Region earthquakes. The MMI data and site corrections used in these studies are listed in this Open-file Report. 

  1. The Development of an Earthquake Preparedness Plan for a Child Care Center in a Geologically Hazardous Region.

    Science.gov (United States)

    Wokurka, Linda

    The director of a child care center at a community college in California developed an earthquake preparedness plan for the center which met state and local requirements for earthquake preparedness at schools. The plan consisted of: (1) the identification and reduction of nonstructural hazards in classrooms, office, and staff rooms; (2) storage of…

  2. Nowcasting Earthquakes: A Comparison of Induced Earthquakes in Oklahoma and at the Geysers, California

    Science.gov (United States)

    Luginbuhl, Molly; Rundle, John B.; Hawkins, Angela; Turcotte, Donald L.

    2018-01-01

    Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitudes are selected, one large, say M_λ ≥ 4, and one small, say M_σ ≥ 2. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes, and the cumulative probability distribution of these interevent counts is obtained. The earthquake potential score (EPS) is defined by the number of small earthquakes that have occurred since the last large earthquake; the point where this number falls on the cumulative probability distribution of interevent counts defines the EPS. A major advantage of nowcasting is that it utilizes "natural time" (earthquake counts between events) rather than clock time. Thus, it is not necessary to decluster aftershocks, and the results are applicable even if the level of induced seismicity varies in time. The application of natural time to the accumulation of seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling: the increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the number of M_σ ≥ 2.75 earthquakes in Oklahoma to nowcast the number of M_λ ≥ 4.0 earthquakes in Oklahoma. The applicability of the scaling is illustrated during the rapid build-up of injection-induced seismicity between 2012 and 2016, and the subsequent reduction in seismicity associated with a reduction in fluid injection. The same method is applied to the geothermal-induced seismicity at the Geysers, California, for comparison.
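    The natural-time bookkeeping behind the EPS can be sketched in a few lines of Python. This is a hypothetical illustration of the counting scheme described above, with an invented toy catalog and arbitrary magnitude thresholds, not the authors' code:

```python
def interevent_counts(magnitudes, m_small, m_large):
    """Counts of small (M >= m_small) events between successive large
    (M >= m_large) events, for a catalog in time order."""
    counts, current, seen_large = [], 0, False
    for m in magnitudes:
        if m >= m_large:
            if seen_large:
                counts.append(current)
            current, seen_large = 0, True
        elif m >= m_small:
            current += 1
    return counts, current  # past interevent counts, count since the last large event

def earthquake_potential_score(magnitudes, m_small, m_large):
    """EPS: where the current small-event count since the last large event
    falls on the empirical cumulative distribution of interevent counts."""
    counts, since_last = interevent_counts(magnitudes, m_small, m_large)
    if not counts:
        return None
    return sum(1 for c in counts if c <= since_last) / len(counts)

# Toy catalog of magnitudes in time order (illustrative values only)
catalog = [2.1, 4.2, 2.3, 2.5, 4.1, 2.2, 2.4, 2.6, 4.3, 2.1, 2.2, 2.0, 2.3]
print(earthquake_potential_score(catalog, m_small=2.0, m_large=4.0))  # prints 1.0
```

    In this toy catalog, four small events have occurred since the last M ≥ 4 event, which equals or exceeds every past interevent count (2 and 3), so the EPS sits at the top of the distribution. Because only event counts are used, no declustering or clock-time rate estimate is needed.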

  3. Post-Earthquake Traffic Capacity of Modern Bridges in California

    Science.gov (United States)

    2010-03-01

    Evaluation of the capacity of a bridge to carry self-weight and traffic loads after an earthquake is essential for a safe and timely re-opening of the bridge. In California, modern highway bridges designed using the Caltrans Seismic Design Criter...

  4. Comprehensive analysis of earthquake source spectra in southern California

    OpenAIRE

    Shearer, Peter M.; Prieto, Germán A.; Hauksson, Egill

    2006-01-01

    We compute and analyze P wave spectra from earthquakes in southern California between 1989 and 2001 using a method that isolates source-, receiver-, and path-dependent terms. We correct observed source spectra for attenuation using both fixed and spatially varying empirical Green's function methods. Estimated Brune-type stress drops for over 60,000 M_L = 1.5 to 3.1 earthquakes range from 0.2 to 20 MPa with no dependence on moment or local b value. Median computed stress drop increases with de...
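    Brune-type stress drops of the kind reported here are typically derived from the seismic moment and the corner frequency of the source spectrum. The sketch below assumes the commonly used Brune source-radius relation r = 2.34β/(2πfc) and the circular-crack stress drop Δσ = 7M0/(16r³); the moment, corner frequency, and shear-wave speed are illustrative values, not results from this study:

```python
import math

def brune_stress_drop(moment_nm, corner_freq_hz, beta_m_s=3500.0):
    """Brune-model stress drop (Pa) from seismic moment (N*m) and corner
    frequency (Hz); beta_m_s is the shear-wave speed near the source."""
    # Brune source radius: r = 2.34 * beta / (2 * pi * fc)
    radius_m = 2.34 * beta_m_s / (2.0 * math.pi * corner_freq_hz)
    # Circular-crack stress drop: delta_sigma = 7 * M0 / (16 * r^3)
    return 7.0 * moment_nm / (16.0 * radius_m ** 3)

# Illustrative M ~2 event: M0 = 1.12e12 N*m with a 10 Hz corner frequency
dsigma_pa = brune_stress_drop(1.12e12, 10.0)
print(dsigma_pa / 1e6)  # stress drop in MPa, roughly 0.22
```

    Plausible combinations of moment and corner frequency for small earthquakes give values from a few tenths of an MPa up to tens of MPa, consistent with the 0.2 to 20 MPa range quoted above.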

  5. Deterministic Earthquake Hazard Assessment by Public Agencies in California

    Science.gov (United States)

    Mualchin, L.

    2005-12-01

    Even in its short recorded history, California has experienced a number of damaging earthquakes that have resulted in new codes and other legislation for public safety. In particular, the 1971 San Fernando earthquake produced some of the most lasting results, such as the Hospital Safety Act, the Strong Motion Instrumentation Program, the Alquist-Priolo Special Studies Zone Act, and the California Department of Transportation (Caltrans) fault-based deterministic seismic hazard (DSH) map. The latter product provides values for earthquake ground motions based on Maximum Credible Earthquakes (MCEs), defined as the largest earthquakes that can reasonably be expected on faults in the current tectonic regime. For surface fault rupture displacement hazards, detailed study of the same faults applies. Originally, hospitals, dams, and other critical facilities used seismic design criteria based on deterministic seismic hazard analyses (DSHA). However, probabilistic methods grew and took hold, introducing earthquake design criteria based on time factors and quantifying "uncertainties" by procedures such as logic trees. These probabilistic seismic hazard analyses (PSHA) ignored the DSH approach, and some agencies were influenced to adopt only the PSHA method. However, deficiencies in the PSHA method are becoming recognized, and its use is now the focus of strong debate. Caltrans is in the process of producing the fourth edition of its DSH map. Caltrans prefers the DSH method because it believes it is more realistic than the probabilistic method for assessing earthquake hazards that may affect critical facilities, and is the best available method for ensuring public safety. Its time-invariant values help produce robust design criteria that are soundly based on physical evidence, and it is the method that offers the least opportunity for unwelcome surprises.

  6. California Earthquake Clearinghouse Activation for August 24, 2014, M6.0 South Napa Earthquake

    Science.gov (United States)

    Rosinski, A.; Parrish, J.; Mccrink, T. P.; Tremayne, H.; Ortiz, M.; Greene, M.; Berger, J.; Blair, J. L.; Johnson, M.; Miller, K.; Seigel, J.; Long, K.; Turner, F.

    2014-12-01

    The Clearinghouse's principal functions are to 1) coordinate field investigations of earth scientists, engineers, and other participating researchers; 2) facilitate sharing of observations through regular meetings and through the Clearinghouse website; and 3) notify disaster responders of crucial observations or results. Shortly after 3:20 a.m. on August 24, 2014, the Clearinghouse management committee organizations, the California Geological Survey (CGS), the Earthquake Engineering Research Institute (EERI), the United States Geological Survey (USGS), the California Office of Emergency Services (CalOES), and the California Seismic Safety Commission (CSSC), authorized activation of a virtual Clearinghouse and a physical Clearinghouse location. The California Geological Survey, which serves as the permanent lead coordination organization for the Clearinghouse, coordinated with the state for all resources required for Clearinghouse activation. The physical Clearinghouse location, including a mobile satellite communications truck, was opened at a Caltrans maintenance facility at 3161 Jefferson Street in Napa. This location remained active through August 26, 2014, during which time it drew the participation of over 100 experts from more than 40 different organizations, and over 1,730 remote visitors via the virtual Clearinghouse and online data compilation map. The Clearinghouse conducted three briefing calls each day with the State Operations Center (SOC) and Clearinghouse partners, and also conducted nightly briefings with field personnel, accessible to remote participants via WebEx. Data collected by field researchers were compiled into a map through the efforts of EERI and USGS volunteers in the Napa Clearinghouse. EERI personnel continued to update the compilation map for an extended period following deactivation of the Clearinghouse. In addition, EERI managed the Clearinghouse website. Two overflights were conducted, for

  7. Preparing a population for an earthquake like Chi-Chi: The Great Southern California ShakeOut

    Science.gov (United States)

    Jones, Lucile M.; ,

    2009-01-01

    The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. On November 13, 2008, over 5 million southern Californians pretended that a magnitude-7.8 earthquake had occurred and practiced actions that could reduce its impact on their lives. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The drill was based on a scenario of the impacts and consequences of such an earthquake on the Southern San Andreas Fault, developed by over 300 experts led by the U.S. Geological Survey in partnership with the California Geological Survey, the Southern California Earthquake Center, Earthquake Engineering Research Institute, lifeline operators, emergency services and many other organizations. The ShakeOut campaign was designed and implemented by earthquake scientists, emergency managers, sociologists, art designers and community participants. The means of communication were developed using results from sociological research on what encouraged people to take action. This was structured around four objectives: 1) consistent messages – people are more inclined to believe something when they hear the same thing from multiple sources; 2) visual reinforcement – people are more inclined to do something they see other people doing; 3) encourage “milling” or discussing contemplated action – people need to discuss an action with others they care about before committing to undertaking it; and 4) focus on concrete actions – people are more likely to prepare for a set of concrete consequences of a particular hazard than for an abstract concept of risk. The goals of the ShakeOut were established in Spring 2008 and were: 1) to register 5 million people to participate in the drill; 2) to change the culture of earthquake preparedness in southern California; and 3) to reduce earthquake losses in southern California. All of these

  8. Self-potential variations preceding earthquakes in central California

    International Nuclear Information System (INIS)

    Corwin, R.F.; Morrison, H.G.

    1977-01-01

    Two earthquakes in central California were preceded by anomalous variations in the horizontal electric field (self-potential) of the earth. The first variation was an anomaly of 90 mV amplitude across electrode dipoles of 630 and 640 m, which began 55 days before an earthquake of M=5, located 37 km NW of the dipoles. The second variation had an amplitude of 4 mV across a 300 m dipole, and began 110 hours before an event of M=2.4 located on the San Andreas fault, 2.5 km from the dipole. Streaming potentials generated by the flow of groundwater into a dilatant zone are proposed as a possible mechanism for the observed variations.

  9. Earthquake epicenters and fault intersections in central and southern California

    Science.gov (United States)

    Abdel-Gawad, M. (Principal Investigator); Silverstein, J.

    1972-01-01

    The author has identified the following significant results. ERTS-1 imagery provided evidence for the existence of short transverse fault segments lodged between faults of the San Andreas system in the Coast Ranges, California. They indicate that an early episode of transverse shear affected the Coast Ranges prior to the establishment of the present San Andreas fault. The fault has been offset by transverse faults of the Transverse Ranges. It appears feasible to identify from ERTS-1 imagery geomorphic criteria of recent fault movements. Plots of historic earthquakes in the Coast Ranges and western Transverse Ranges show clusters in areas where structures are complicated by the interaction of two active fault systems. A fault lineament apparently not previously mapped was identified in the Uinta Mountains, Utah. Part of the lineament shows evidence of recent faulting, which corresponds to a moderate earthquake cluster.

  10. Earthquakes

    Science.gov (United States)


  11. UCERF3: A new earthquake forecast for California's complex fault system

    Science.gov (United States)

    Field, Edward H.; ,

    2015-01-01

    With innovations, fresh data, and lessons learned from recent earthquakes, scientists have developed a new earthquake forecast model for California, a region under constant threat from potentially damaging events. The new model, referred to as the third Uniform California Earthquake Rupture Forecast, or "UCERF" (http://www.WGCEP.org/UCERF3), provides authoritative estimates of the magnitude, location, and likelihood of earthquake fault rupture throughout the state. Overall the results confirm previous findings, but with some significant changes because of model improvements. For example, compared to the previous forecast (Uniform California Earthquake Rupture Forecast 2), the likelihood of moderate-sized earthquakes (magnitude 6.5 to 7.5) is lower, whereas that of larger events is higher. This is because of the inclusion of multifault ruptures, where earthquakes are no longer confined to separate, individual faults, but can occasionally rupture multiple faults simultaneously. The public-safety implications of this and other model improvements depend on several factors, including site location and type of structure (for example, family dwelling compared to a long-span bridge). Building codes, earthquake insurance products, emergency plans, and other risk-mitigation efforts will be updated accordingly. This model also serves as a reminder that damaging earthquakes are inevitable for California. Fortunately, there are many simple steps residents can take to protect lives and property.

  12. Great East Japan earthquake, JR East mitigation successes, and lessons for California high-speed rail.

    Science.gov (United States)

    2015-04-01

    California and Japan both experience frequent seismic activity, which is often damaging to infrastructure. Seismologists have developed systems for detecting and analyzing earthquakes in real-time. JR East has developed systems to mitigate the da...

  13. Contrasted fossil spreading centers off Baja California

    Science.gov (United States)

    Dyment, J.; Michaud, F.; Royer, J. Y.; Bourgois, J.; Sichler, B.; Bandy, W.; Mortera, C.; Sosson, M.; Pontoise, B.; Calmus, T.

    2003-04-01

    In April 2002, R/V L'Atalante collected swath-bathymetry, surface and deep-tow magnetic, gravity, and seismic data in order to investigate the existence, characteristics, and age of the Guadalupe and Magdalena fossil spreading centers postulated off Baja California (eastern Pacific Ocean). The new data confirm the existence of these extinct spreading centers and better define the location and orientation of the Magdalena Ridge segments. The two fossil ridges exhibit very different characters. The Guadalupe fossil axis displays a deep N-S axial valley with a 2D geometry, and regular abyssal hills and magnetic anomalies on its flanks. According to surface and deep-tow magnetics, seafloor spreading stopped at 12 Ma (anomaly 5A). Conversely, the Magdalena fossil spreading system exhibits a complex bathymetric structure, with a series of ridge segments and conjugate fan-shaped abyssal hills, troughs and volcanic highs, and spreading discontinuities with various orientations. The surface and deep-tow magnetics indicate an age younger than or equal to 12 Ma, 5A being the youngest unambiguously identified magnetic anomaly. The morphological and structural difference between the two fossil spreading centers is striking. We interpret the fan-shaped abyssal hills and the varied structural directions of the Magdalena spreading system as the result of a continuous clockwise change in spreading direction of about 18°/Ma, for a total of 45° between anomalies 5B and 5A. Spreading finally ceased when the seafloor spreading direction became parallel to the margin. We believe that a new strike-slip plate boundary then initiated along the western margin of Baja California. The Guadalupe ridge gradually slowed down, with a minor 10° reorientation prior to extinction at chron 5A. This observation suggests that a Magdalena plate and a Guadalupe plate started to behave independently at about 14.5 Ma, with the Shirley FZ (27.6°N) acting as a plate boundary. Whether there

  14. Tilt Precursors before Earthquakes on the San Andreas Fault, California.

    Science.gov (United States)

    Johnston, M J; Mortensen, C E

    1974-12-13

    An array of 14 biaxial shallow-borehole tiltmeters (at 10^-7 radian sensitivity) has been installed along 85 kilometers of the San Andreas fault during the past year. Earthquake-related changes in tilt have been simultaneously observed on up to four independent instruments. At earthquake distances greater than 10 earthquake source dimensions, there are few clear indications of tilt change. For the four instruments with the longest records (>10 months), 26 earthquakes have occurred since July 1973 with at least one instrument closer than 10 source dimensions, and 8 earthquakes with more than one instrument within that distance. Precursors in tilt direction have been observed before more than 10 earthquakes or groups of earthquakes, and no similar effect has yet been seen without the occurrence of an earthquake.

  15. The USGS National Earthquake Information Center's Response to the Wenchuan, China Earthquake

    Science.gov (United States)

    Earle, P. S.; Wald, D. J.; Benz, H.; Sipkin, S.; Dewey, J.; Allen, T.; Jaiswal, K.; Buland, R.; Choy, G.; Hayes, G.; Hutko, A.

    2008-12-01

    Immediately after detecting the May 12th, 2008 Mw 7.9 Wenchuan Earthquake, the USGS National Earthquake Information Center (NEIC) began a coordinated effort to understand and communicate the earthquake's seismological characteristics, tectonic context, and humanitarian impact. NEIC's initial estimates of magnitude and location were distributed within 30 minutes of the quake by e-mail and text message to 70,000 users via the Earthquake Notification System. The release of these basic parameters automatically triggered the generation of more sophisticated derivative products that were used by relief and government agencies to plan their humanitarian response to the disaster. Body-wave and centroid moment tensors identified the earthquake's mechanism. Predictive ShakeMaps provided the first estimates of the geographic extent and amplitude of shaking. The initial automated population exposure estimate generated and distributed by the Prompt Assessment of Global Earthquakes for Response (PAGER) system stated that 1.2 million people were exposed to severe-to-extreme shaking (Modified Mercalli Intensity VIII or greater), indicating a large-scale disaster had occurred. NEIC's modeling of the mainshock and aftershocks was continuously refined and expanded. The length and orientation of the fault were determined from aftershocks, finite-fault models, and back-projection source imaging. Firsthand accounts of shaking intensity were collected and mapped by the "Did You Feel It" system. These results were used to refine our ShakeMaps and PAGER exposure estimates providing a more accurate assessment of the extent and enormity of the disaster. The products were organized and distributed in an event-specific summary poster and via the USGS Earthquake Program web pages where they were viewed by millions and reproduced by major media outlets (over 1/2 billion hits were served that month). 
Rather than just a point showing magnitude and epicenter, several of the media's schematic maps

  16. Earthquake potential in California-Nevada implied by correlation of strain rate and seismicity

    Science.gov (United States)

    Zeng, Yuehua; Petersen, Mark D.; Shen, Zheng-Kang

    2018-01-01

    Rock mechanics studies and dynamic earthquake simulations show that patterns of seismicity evolve with time through (1) an accumulation phase, (2) a localization phase, and (3) a rupture phase. We observe a similar pattern of changes in seismicity during the past century across California and Nevada. To quantify these changes, we correlate GPS strain rates with seismicity. Earthquakes of M > 6.5 are collocated with regions of highest strain rates. By contrast, smaller-magnitude earthquakes of M ≥ 4 show clear spatiotemporal changes. From 1933 to the late 1980s, earthquakes of M ≥ 4 were more diffuse and broadly distributed across both high and low strain rate regions (accumulation phase). From the late 1980s to 2016, earthquakes were more concentrated within the high strain rate areas focused on the major fault strands (localization phase). In the same time period, the rate of M > 6.5 events also increased significantly in the high strain rate areas. The strong correlation between current strain rate and the later period of seismicity indicates that seismicity is closely related to the strain rate. The spatial patterns suggest that before the late 1980s, the strain rate field was also broadly distributed because of the stress shadows from previous large earthquakes. As the deformation field evolved out of the shadow in the late 1980s, strain refocused on the major fault systems, and we are entering a period of increased risk for large earthquakes in California.
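    The strain-rate/seismicity correlation described above can be illustrated with a toy calculation: rank grid cells by geodetic strain rate and ask what fraction of earthquakes fall in the highest-strain cells. All grids, rates, and event locations below are invented for illustration, not data from this study:

```python
def fraction_in_high_strain(cell_strain_rates, event_cell_ids, top_fraction=0.25):
    """Fraction of earthquakes located in the top `top_fraction` of grid
    cells ranked by strain rate."""
    ranked = sorted(range(len(cell_strain_rates)),
                    key=lambda i: cell_strain_rates[i], reverse=True)
    n_top = max(1, int(len(ranked) * top_fraction))
    top_cells = set(ranked[:n_top])
    hits = sum(1 for cell in event_cell_ids if cell in top_cells)
    return hits / len(event_cell_ids)

# Toy grid of 8 cells with strain rates (arbitrary units) and 6 events,
# each identified by the index of the cell it occurred in.
strain_rates = [5.0, 1.0, 0.5, 9.0, 0.2, 0.1, 2.0, 0.3]
event_cells  = [3, 3, 0, 3, 6, 1]
print(fraction_in_high_strain(strain_rates, event_cells))  # 4 of 6 events -> 0.666...
```

    A localization phase of the kind described above would show this fraction rising over time, as seismicity concentrates onto the highest-strain-rate fault strands.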

  17. The Pacific Tsunami Warning Center's Response to the Tohoku Earthquake and Tsunami

    Science.gov (United States)

    Weinstein, S. A.; Becker, N. C.; Shiro, B.; Koyanagi, K. K.; Sardina, V.; Walsh, D.; Wang, D.; McCreery, C. S.; Fryer, G. J.; Cessaro, R. K.; Hirshorn, B. F.; Hsu, V.

    2011-12-01

    The largest Pacific basin earthquake in 47 years, and the largest magnitude earthquake since the 2004 Sumatra earthquake, struck off the east coast of the Tohoku region of Honshu, Japan, at 5:46 UTC on 11 March 2011. The Tohoku earthquake (Mw 9.0) generated a massive tsunami with runups of up to 40 m along the Tohoku coast. The tsunami waves crossed the Pacific Ocean, causing significant damage as far away as Hawaii, California, and Chile, thereby becoming the largest, most destructive tsunami in the Pacific Basin since 1960. Triggers on the seismic stations at Erimo, Hokkaido (ERM) and Matsushiro, Honshu (MAJO), alerted Pacific Tsunami Warning Center (PTWC) scientists 90 seconds after the earthquake began. Four minutes after its origin, and about one minute after the earthquake's rupture ended, PTWC issued an observatory message reporting a preliminary magnitude of 7.5. Eight minutes after origin time, the Japan Meteorological Agency (JMA) issued its first international tsunami message in its capacity as the Northwest Pacific Tsunami Advisory Center. In accordance with international tsunami warning system protocols, PTWC then followed with its first international tsunami warning message using JMA's earthquake parameters, including an Mw of 7.8. Additional Mwp, mantle-wave, and W-phase magnitude estimates based on the analysis of later-arriving seismic data at PTWC revealed that the earthquake magnitude reached at least 8.8, and that a destructive tsunami would likely be crossing the Pacific Ocean. The earthquake damaged the nearest coastal sea-level station, located 90 km from the epicenter in Ofunato, Japan. The NOAA DART sensor situated 600 km off the coast of Sendai, Japan, at a depth of 5.6 km recorded a tsunami wave amplitude of nearly two meters, making it by far the largest tsunami wave ever recorded by a DART sensor.
Thirty minutes later, a coastal sea-level station at Hanasaki, Japan, 600 km from the epicenter, recorded a tsunami wave amplitude of

  18. Keeping the History in Historical Seismology: The 1872 Owens Valley, California Earthquake

    International Nuclear Information System (INIS)

    Hough, Susan E.

    2008-01-01

    The importance of historical earthquakes is being increasingly recognized. Careful investigations of key pre-instrumental earthquakes can provide critical information and insights not only for seismic hazard assessment but also for earthquake science. In recent years, with the explosive growth in computational sophistication in the Earth sciences, researchers have developed increasingly sophisticated methods to analyze macroseismic data quantitatively. These methodological developments can be extremely useful for fully exploiting the temporally and spatially rich information source that seismic intensities often represent. For example, the exhaustive and painstaking investigations done by Ambraseys and his colleagues of early Himalayan earthquakes provide information that can be used to map out site response in the Ganges basin. In any investigation of macroseismic data, however, one must stay mindful that intensity values are not data but rather interpretations. The results of any subsequent analysis, regardless of the degree of sophistication of the methodology, will be only as reliable as the interpretations of available accounts, and only as complete as the research done to ferret out, and in many cases translate, these accounts. When intensities are assigned without an appreciation of historical setting and context, seemingly careful subsequent analysis can yield grossly inaccurate results. As a case study, I report here on the results of a recent investigation of the 1872 Owens Valley, California earthquake. Careful consideration of macroseismic observations reveals that this event was probably larger than the great San Francisco earthquake of 1906, and possibly the largest historical earthquake in California. The results suggest that some large earthquakes in California will generate significantly larger ground motions than San Andreas fault events of comparable magnitude.

  19. Triggered surface slips in southern California associated with the 2010 El Mayor-Cucapah, Baja California, Mexico, earthquake

    Science.gov (United States)

    Rymer, Michael J.; Treiman, Jerome A.; Kendrick, Katherine J.; Lienkaemper, James J.; Weldon, Ray J.; Bilham, Roger; Wei, Meng; Fielding, Eric J.; Hernandez, Janis L.; Olson, Brian P.E.; Irvine, Pamela J.; Knepprath, Nichole; Sickler, Robert R.; Tong, Xiaopeng; Siem, Martin E.

    2011-01-01

    The April 4, 2010 (Mw7.2), El Mayor-Cucapah, Baja California, Mexico, earthquake is the strongest earthquake to shake the Salton Trough area since the 1992 (Mw7.3) Landers earthquake. Similar to the Landers event, ground-surface fracturing occurred on multiple faults in the trough. However, the 2010 event triggered surface slip on more faults in the central Salton Trough than previous earthquakes, including multiple faults in the Yuha Desert area, the southwestern section of the Salton Trough. In the central Salton Trough, surface fracturing occurred along the southern San Andreas, Coyote Creek, Superstition Hills, Wienert, Kalin, and Imperial Faults and along the Brawley Fault Zone, all of which are known to have slipped in historical time, in primary (tectonic) slip, triggered slip, or both. Surface slip in association with the El Mayor-Cucapah earthquake is at least the eighth instance in the past 42 years in which a local or regional earthquake has triggered slip along faults in the central Salton Trough. In the southwestern part of the Salton Trough, surface fractures (triggered slip) occurred in a broad area of the Yuha Desert. This is the first time that triggered slip has been observed in the southwestern Salton Trough.

  20. A 30-year history of earthquake crisis communication in California and lessons for the future

    Science.gov (United States)

    Jones, L.

    2015-12-01

    The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985, addressing the probability (approximately 5%) that a M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the 30 years since, publication of aftershock advisories has become routine, and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting, with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. The time to create and release a statement became shorter with experience after the first public advisory (for the 1988 Lake Elsman earthquake), which was released 18 hours after the triggering event, but the process was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (the National Earthquake Prediction Evaluation Council) so that statements can be sent to the public automatically. This talk will review the advisories, the variations in wording, and the public response, and compare these with social science research about successful crisis communication to create recommendations for future advisories.

  1. Offshore Earthquake 2012 in the California Borderlands: Possible Extension of Seismically Active Area Offshore

    Science.gov (United States)

    Blackburn, A.; Kelly, T. B.; Coakley, B.; Pockalny, R. A.

    2017-12-01

    The California Borderlands is a tectonically active region with abundant seismic activity. In 2012, an earthquake epicenter was located on the eastern Pacific Plate west of the Patton Escarpment (31.08N, 119.61W). The earthquake had a magnitude of 6.3 and a normal-faulting focal mechanism. Seismic activity was previously thought not to extend past the Patton Escarpment, so this offshore earthquake calls the extent of seismically active structures beyond the escarpment into question. On a recent Marine Geology and Geophysics Chief Scientists Training Cruise, early-career scientists worked together to develop projects that could be completed aboard the RV Sikuliaq. A survey of the earthquake area was completed, collecting gravity, multibeam, and sub-bottom profiler data. The survey was designed to identify seafloor morphology or internal structure that could have localized the unexpected offshore seismic activity. Possible mechanisms for the earthquake include reactivation of an extinct structure linked to the fault system within the California Borderlands, or reactivation of a structure in the oceanic lithosphere to accommodate motion. Establishing the likely mechanism of the 2012 earthquake can constrain the potential for other seismic activity offshore of the Borderlands.

  2. Liquefaction-induced lateral spreading in Oceano, California, during the 2003 San Simeon Earthquake

    Science.gov (United States)

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.; Di Alessandro, Carola; Boatwright, John; Tinsley, John C.; Sell, Russell W.; Rosenberg, Lewis I.

    2004-01-01

    The December 22, 2003, San Simeon, California (M6.5) earthquake caused damage to houses, road surfaces, and underground utilities in Oceano, California. The community of Oceano is approximately 50 miles (80 km) from the earthquake epicenter. Damage at this distance from a M6.5 earthquake is unusual. To understand the causes of this damage, the U.S. Geological Survey conducted extensive subsurface exploration and monitoring of aftershocks in the months after the earthquake. The investigation included 37 seismic cone penetration tests, 5 soil borings, and aftershock monitoring from January 28 to March 7, 2004. The USGS investigation identified two earthquake hazards in Oceano that explain the San Simeon earthquake damage: site amplification and liquefaction. Site amplification is a phenomenon observed in many earthquakes where the strength of shaking increases abnormally in areas where the seismic-wave velocity of shallow geologic layers is low. As a result, earthquake shaking is felt more strongly than in surrounding areas without similar geologic conditions. Site amplification in Oceano is indicated by the physical properties of the geologic layers beneath Oceano and was confirmed by monitoring aftershocks. Liquefaction, which is also commonly observed during earthquakes, is a phenomenon where saturated sands lose their strength during an earthquake and become fluid-like and mobile. As a result, the ground may undergo large permanent displacements that can damage underground utilities and well-built surface structures. The type of displacement of major concern associated with liquefaction is lateral spreading, because it involves displacement of large blocks of ground down gentle slopes or towards stream channels. The USGS investigation indicates that the shallow geologic units beneath Oceano are very susceptible to liquefaction. They include young sand dunes and clean sandy artificial fill that was used to bury and convert marshes into developable lots.
Most of

  3. Hydrothermal response to a volcano-tectonic earthquake swarm, Lassen, California

    Science.gov (United States)

    Ingebritsen, Steven E.; Shelly, David R.; Hsieh, Paul A.; Clor, Laura; Seward, P.H.; Evans, William C.

    2015-01-01

    The increasing capability of seismic, geodetic, and hydrothermal observation networks allows recognition of volcanic unrest that could previously have gone undetected, creating an imperative to diagnose and interpret unrest episodes. A November 2014 earthquake swarm near Lassen Volcanic National Park, California, which included the largest earthquake in the area in more than 60 years, was accompanied by a rarely observed outburst of hydrothermal fluids. Although the earthquake swarm likely reflects upward migration of endogenous H2O-CO2 fluids in the source region, there is no evidence that such fluids emerged at the surface. Instead, shaking from the modest-sized (moment magnitude 3.85) but proximal earthquake caused near-vent permeability increases that triggered increased outflow of hydrothermal fluids already present and equilibrated in a local hydrothermal aquifer. Long-term, multiparametric monitoring at Lassen and other well-instrumented volcanoes enhances interpretation of unrest and can provide a basis for detailed physical modeling.

  4. Responses of a tall building in Los Angeles, California as inferred from local and distant earthquakes

    Science.gov (United States)

    Çelebi, Mehmet; Ulusoy, Hasan; Nakata, Nori

    2016-01-01

    The increasing inventory of tall buildings in the United States and elsewhere may be subjected to motions generated by near and far seismic sources that cause long-period effects. Multiple sets of records exhibiting such effects were retrieved from tall buildings in Tokyo and Osaka, approximately 350 km and 770 km from the epicenter of the 2011 Tohoku earthquake. In California, very few tall buildings have been instrumented. An instrumented 52-story building in downtown Los Angeles recorded seven local and distant earthquakes. Spectral and system identification methods identify significant low frequencies of interest (~0.17 Hz, 0.56 Hz, and 1.05 Hz). These frequencies compare well with those computed from transfer functions; however, small variations are observed among the significant low frequencies for each of the seven earthquakes. The torsional and translational frequencies are very close and are coupled. A beating effect is observed in the data from at least two of the seven earthquakes.

  5. Earthquake Swarm Along the San Andreas Fault near Palmdale, Southern California, 1976 to 1977.

    Science.gov (United States)

    McNally, K C; Kanamori, H; Pechmann, J C; Fuis, G

    1978-09-01

    Between November 1976 and November 1977, a swarm of small earthquakes occurred along the San Andreas fault near Palmdale, California. This swarm was the first observed along this section of the San Andreas since cataloging of instrumental data began in 1932. The activity followed partial subsidence of the 35-centimeter vertical crustal uplift known as the Palmdale bulge along this "locked" section of the San Andreas, which last broke in the great (surface-wave magnitude 8 1/4+) 1857 Fort Tejon earthquake. The swarm events exhibit characteristics previously observed in some foreshock sequences, such as tight clustering of hypocenters and time-dependent rotations of stress axes inferred from focal mechanisms. However, because of our present lack of understanding of the processes that precede earthquake faulting, the implications of the swarm for future large earthquakes on the San Andreas fault are unknown.

  6. Injuries and Traumatic Psychological Exposures Associated with the South Napa Earthquake - California, 2014.

    Science.gov (United States)

    Attfield, Kathleen R; Dobson, Christine B; Henn, Jennifer B; Acosta, Meileen; Smorodinsky, Svetlana; Wilken, Jason A; Barreau, Tracy; Schreiber, Merritt; Windham, Gayle C; Materna, Barbara L; Roisman, Rachel

    2015-09-11

    On August 24, 2014, at 3:20 a.m., a magnitude 6.0 earthquake struck California, with its epicenter in Napa County (1). The earthquake was the largest to affect the San Francisco Bay area in 25 years and caused significant damage in Napa and Solano counties, including widespread power outages, five residential fires, and damage to roadways, waterlines, and 1,600 buildings (2). Two deaths resulted (2). On August 25, Napa County Public Health asked the California Department of Public Health (CDPH) for assistance in assessing postdisaster health effects, including earthquake-related injuries and effects on mental health. On September 23, Solano County Public Health requested similar assistance. A household-level Community Assessment for Public Health Emergency Response (CASPER) was conducted for these counties in two cities (Napa, 3 weeks after the earthquake, and Vallejo, 6 weeks after the earthquake). Among households reporting injuries, a substantial proportion (48% in Napa and 37% in western Vallejo) reported that the injuries occurred during the cleanup period, suggesting that increased messaging on safety precautions after a disaster might be needed. One fifth of respondents overall (27% in Napa and 9% in western Vallejo) reported one or more traumatic psychological exposures in their households. These findings were used by Napa County Mental Health to guide immediate-term mental health resource allocations and to conduct public training sessions and education campaigns to support persons with mental health risks following the earthquake. In addition, to promote community resilience and future earthquake preparedness, Napa County Public Health subsequently conducted community events on the earthquake anniversary and provided outreach workers with psychological first aid training.

  7. On the reported ionospheric precursor of the 1999 Hector Mine, California earthquake

    Science.gov (United States)

    Thomas, Jeremy N.; Love, Jeffrey J.; Komjathy, Attila; Verkhoglyadova, Olga P.; Butala, Mark; Rivera, Nicholas

    2012-01-01

    Using Global Positioning System (GPS) data from sites near the 16 Oct. 1999 Hector Mine, California earthquake, Pulinets et al. (2007) identified anomalous changes in the ionospheric total electron content (TEC) starting one week prior to the earthquake. Pulinets (2007) suggested that precursory phenomena of this type could be useful for predicting earthquakes. On the other hand, and in a separate analysis, Afraimovich et al. (2004) concluded that TEC variations near the epicenter were controlled by solar and geomagnetic activity that were unrelated to the earthquake. In an investigation of these very different results, we examine TEC time series of long duration from GPS stations near and far from the epicenter of the Hector Mine earthquake, and long before and long after the earthquake. While we can reproduce the essential time series results of Pulinets et al., we find that the signal they identify as anomalous is not actually anomalous. Instead, it is just part of normal global-scale TEC variation. We conclude that the TEC anomaly reported by Pulinets et al. is unrelated to the Hector Mine earthquake.

  9. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    Science.gov (United States)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground-motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, seismic hazard is assessed for the Maximum Credible Earthquake (MCE) magnitude on each of the known seismogenic faults within and near the state. The occurrence of the MCE is assumed qualitatively, based on late Quaternary and younger faults presumed to be seismogenic, without specifying when or within what time interval the MCE may occur. The MCE is the largest, or upper-bound, potential earthquake in moment magnitude on a fault, and it supersedes and automatically encompasses all other possible earthquakes on that fault. That moment magnitude is used to estimate ground motions by applying it to empirical attenuation relationships, and to calculate ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), known since 2002 as the California Geological Survey (CGS), using the best fault information and ground-motion attenuation relationships available at the time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak-acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in siting large, important projects, for example dams and nuclear power plants, continued to challenge the maps. The second-edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground-motion attenuation relationships from the latest published
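    The deterministic procedure described above reduces to a simple computation: for each fault, take its MCE magnitude and the closest site-to-fault distance, evaluate an empirical attenuation relationship, and keep the largest resulting ground motion. A minimal sketch in Python, using a generic attenuation form with hypothetical coefficients and an invented fault inventory, not any published CDMG/CGS relation:

    ```python
    import math

    def pga_attenuation(m, r_km, a=-1.5, b=0.5, c=1.0, h=6.0):
        """Generic attenuation form ln(PGA[g]) = a + b*M - c*ln(R + h).
        The coefficients are hypothetical placeholders; a real DSHA map
        would use published empirical relations."""
        return math.exp(a + b * m - c * math.log(r_km + h))

    # Hypothetical fault inventory: (name, MCE moment magnitude, closest distance in km).
    faults = [("Fault A", 7.8, 12.0), ("Fault B", 6.9, 4.0), ("Fault C", 7.2, 30.0)]

    # Deterministic hazard at the site: the largest motion over all MCE scenarios.
    pga_by_fault = {name: pga_attenuation(m, r) for name, m, r in faults}
    controlling = max(pga_by_fault, key=pga_by_fault.get)
    print(controlling, round(pga_by_fault[controlling], 3))
    ```

    Note that with distance in the relation, a nearby moderate-magnitude fault, rather than the largest-magnitude fault, can control the deterministic hazard.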

  10. The Magnitude Distribution of Earthquakes Near Southern California Faults

    Science.gov (United States)

    2011-12-16

    Lindh, 1985; Jackson and Kagan, 2006]. We do not consider time dependence in this study, but focus instead on the magnitude distribution for this fault... Bakun, W. H., and A. G. Lindh (1985), The Parkfield, California, earthquake prediction experiment, Science, 229(4714), 619-624, doi:10.1126

  11. [Engineering aspects of seismic behavior of health-care facilities: lessons from California earthquakes].

    Science.gov (United States)

    Rutenberg, A

    1995-03-15

    The construction of health-care facilities is similar to that of other buildings. Yet the need to function immediately after an earthquake, the helplessness of the many patients and the high and continuous occupancy of these buildings, require that special attention be paid to their seismic performance. Here the lessons from the California experience are invaluable. In this paper the behavior of California hospitals during destructive earthquakes is briefly described. Adequate structural design and execution, and securing of nonstructural elements are required to ensure both safety of occupants, and practically uninterrupted functioning of equipment, mechanical and electrical services and other vital systems. Criteria for post-earthquake functioning are listed. In view of the hazards to Israeli hospitals, in particular those located along the Jordan Valley and the Arava, a program for the seismic evaluation of medical facilities should be initiated. This evaluation should consider the hazards from nonstructural elements, the safety of equipment and systems, and their ability to function after a severe earthquake. It should not merely concentrate on safety-related structural behavior.

  12. Liquefaction and other ground failures in Imperial County, California, from the April 4, 2010, El Mayor-Cucapah earthquake

    Science.gov (United States)

    McCrink, Timothy P.; Pridmore, Cynthia L.; Tinsley, John C.; Sickler, Robert R.; Brandenberg, Scott J.; Stewart, Jonathan P.

    2011-01-01

    The Colorado River Delta region of southern Imperial Valley, California, and Mexicali Valley, Baja California, is a tectonically dynamic area characterized by numerous active faults and frequent large seismic events. Significant earthquakes that have been accompanied by surface fault rupture and/or soil liquefaction occurred in this region in 1892 (M7.1), 1915 (M6.3; M7.1), 1930 (M5.7), 1940 (M6.9), 1950 (M5.4), 1957 (M5.2), 1968 (M6.5), 1979 (M6.4), 1980 (M6.1), 1981 (M5.8), and 1987 (M6.2; M6.8). Following this trend, the M7.2 El Mayor-Cucapah earthquake of April 4, 2010, ruptured approximately 120 kilometers along several known faults in Baja California. Liquefaction caused by the M7.2 El Mayor-Cucapah earthquake was widespread throughout the southern Imperial Valley but concentrated in the southwest corner of the valley, southwest of the city centers of Calexico and El Centro, where ground motions were highest. Although there are few strong-motion recordings in the westernmost part of the area, the recordings that do exist indicate that ground motions were on the order of 0.3 to 0.6 g where the majority of liquefaction occurrences were found. More distant liquefaction occurrences, at Fites Road southwest of Brawley and along the Rosita Canal northwest of Holtville, were triggered where ground motions were about 0.2 g. Damage to roads was associated mainly with liquefaction of sandy river deposits beneath bridge approach fills, and in some cases liquefaction within the fills. Liquefaction damage to canal and drain levees was not always accompanied by vented sand, but the nature of the damage leads the authors to infer that liquefaction was involved in the majority of observed cases. Liquefaction-related damage to several public facilities - the Calexico Waste Water Treatment Plant, the Fig Lagoon levee system, and Sunbeam Lake Dam in particular - appears to be extensive. The cost to repair these facilities to prevent future liquefaction damage will likely be prohibitive. As

  13. Maximum Magnitude and Probabilities of Induced Earthquakes in California Geothermal Fields: Applications for a Science-Based Decision Framework

    Science.gov (United States)

    Weiser, Deborah Anne

    Induced seismicity is occurring at increasing rates around the country. Brodsky and Lajoie (2013) and others have recognized anthropogenic earthquakes at a few geothermal fields in California. I use three techniques to assess whether there are induced earthquakes in California geothermal fields; three sites show clear induced seismicity: Brawley, The Geysers, and the Salton Sea. Moderate to strong evidence is found at Casa Diablo, Coso, East Mesa, and Susanville. Little to no evidence is found for Heber and Wendel. I develop a set of tools to reduce or cope with the risk imposed by these earthquakes, and to address uncertainties through simulations. I test whether an earthquake catalog may be bounded by an upper magnitude limit. I address whether the earthquake record during pumping time is consistent with the past earthquake record, or whether injection can explain all or some of the earthquakes. I also present ways to assess the probability of future earthquake occurrence based on past records. I summarize current legislation for eight states where induced earthquakes are of concern. Unlike the hazard from tectonic earthquakes, the hazard from induced earthquakes can potentially be modified. I discuss direct and indirect mitigation practices. I present a framework with scientific and communication techniques for assessing uncertainty, ultimately allowing more informed decisions to be made.
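    A common way to assess "the probability of future earthquake occurrence based on past records," as the abstract describes, combines a Gutenberg-Richter magnitude-frequency fit with a Poisson occurrence model. The sketch below assumes that standard approach rather than the author's specific method, and the catalog values are invented for illustration:

    ```python
    import math

    def gutenberg_richter_rate(mags, years, m_target, m_min):
        """Annual rate of events >= m_target from a catalog, using the
        Gutenberg-Richter relation with b estimated by Aki's (1965)
        maximum likelihood: b = log10(e) / (mean(M) - Mmin)."""
        mags = [m for m in mags if m >= m_min]
        b = math.log10(math.e) / (sum(mags) / len(mags) - m_min)
        n_min = len(mags) / years  # annual rate of M >= m_min
        return n_min * 10 ** (-b * (m_target - m_min))

    def prob_at_least_one(annual_rate, t_years):
        """Poisson probability of at least one event in t_years."""
        return 1.0 - math.exp(-annual_rate * t_years)

    # Invented catalog: magnitudes observed over 9 years at a hypothetical field.
    catalog = [2.1, 2.3, 2.0, 2.8, 2.2, 3.1, 2.5, 2.0, 2.4, 2.9, 2.6, 2.2]
    rate_m4 = gutenberg_richter_rate(catalog, years=9.0, m_target=4.0, m_min=2.0)
    p_5yr = prob_at_least_one(rate_m4, 5.0)
    print(round(rate_m4, 4), round(p_5yr, 3))  # a few percent chance of M>=4 in 5 yr
    ```

    Testing whether a catalog "may be bounded by an upper magnitude limit" would replace the unbounded exponential tail here with a truncated or tapered distribution.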

  14. National Earthquake Information Center Seismic Event Detections on Multiple Scales

    Science.gov (United States)

    Patton, J.; Yeck, W. L.; Benz, H.; Earle, P. S.; Soto-Cordero, L.; Johnson, C. E.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (NEIC) monitors seismicity on local, regional, and global scales using automatic picks from more than 2,000 near-real-time seismic stations. This presents unique challenges in automated event detection due to the high variability in data quality, network geometries and density, and distance-dependent variability in observed seismic signals. To lower the overall detection threshold while minimizing false detection rates, NEIC has begun to test the incorporation of new detection and picking algorithms, including multiband (Lomax et al., 2012) and kurtosis (Baillard et al., 2014) pickers, and a new Bayesian associator (Glass 3.0). The Glass 3.0 associator allows for simultaneous processing of variably scaled detection grids, each with a unique set of nucleation criteria (e.g., nucleation threshold, minimum associated picks, nucleation phases) to meet specific monitoring goals. We test the efficacy of these new tools on event detection in networks of various scales and geometries, compare our results with previous catalogs, and discuss lessons learned. For example, we find that on local and regional scales, rapid nucleation of small events may require event nucleation with both P and higher-amplitude secondary phases (e.g., S or Lg). We provide examples of the implementation of a scale-independent associator for an induced seismicity sequence (local scale), a large aftershock sequence (regional scale), and for monitoring global seismicity. Baillard, C., Crawford, W. C., Ballu, V., Hibert, C., & Mangeney, A. (2014). An automatic kurtosis-based P- and S-phase picker designed for local seismic networks. Bulletin of the Seismological Society of America, 104(1), 394-409. Lomax, A., Satriano, C., & Vassallo, M. (2012). Automatic picker developments and optimization: FilterPicker - a robust, broadband picker for real-time seismic monitoring and earthquake early-warning, Seism. Res. Lett., 83, 531-540, doi: 10
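    Kurtosis pickers like the one cited above exploit the fact that the kurtosis of a seismogram window jumps when an impulsive phase arrival enters the window. A much-simplified sketch of that idea (not the Baillard et al. algorithm), applied to a synthetic noise-plus-arrival trace:

    ```python
    import numpy as np

    def kurtosis_pick(x, win):
        """Sliding-window kurtosis characteristic function; the pick is
        taken where the kurtosis rises most steeply, which happens when
        an impulsive arrival first enters the window."""
        k = np.empty(len(x) - win)
        for i in range(win, len(x)):
            w = x[i - win:i]
            s = w.std()
            k[i - win] = ((w - w.mean()) ** 4).mean() / s ** 4 if s > 0 else 0.0
        return win + int(np.argmax(np.diff(k)))  # steepest increase ~ onset

    # Synthetic trace: Gaussian noise with an impulsive, decaying arrival at sample 500.
    rng = np.random.default_rng(0)
    trace = rng.normal(0.0, 1.0, 1000)
    n = np.arange(500)
    trace[500:] += 10.0 * np.exp(-0.01 * n) * np.sin(0.3 * n)
    pick = kurtosis_pick(trace, win=100)
    print(pick)  # should land near the onset at sample 500
    ```

    Production pickers refine this with multiband filtering, onset-time corrections, and P/S discrimination, none of which is attempted here.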

  15. Acceleration and volumetric strain generated by the Parkfield 2004 earthquake on the GEOS strong-motion array near Parkfield, California

    Science.gov (United States)

    Borcherdt, Rodger D.; Johnston, Malcolm J.S.; Dietel, Christopher; Glassmoyer, Gary; Myren, Doug; Stephens, Christopher

    2004-01-01

    An integrated array of 11 General Earthquake Observation System (GEOS) stations installed near Parkfield, CA, provided on-scale, broadband, wide-dynamic-range measurements of acceleration and volumetric strain for the Parkfield earthquake (M6.0) of September 28, 2004. Three-component measurements of acceleration were obtained at each of the stations. Measurements of collocated acceleration and volumetric strain were obtained at four of the stations. Measurements of velocity at most sites were on scale only for the initial P-wave arrival. When considered in the context of the extensive set of strong-motion recordings obtained on more than 40 analog stations by the California Strong-Motion Instrumentation Program (Shakal et al., 2004, http://www.quake.ca.gov/cisn-edc) and those on the dense array of Spudich et al. (1988), these recordings provide an unprecedented document of the nature of the near-source strong motion generated by a M6.0 earthquake. The data set reported herein provides the most extensive set of near-field, broadband, wide-dynamic-range measurements of acceleration and volumetric strain for an earthquake as large as M6 of which the authors are aware. As a result, considerable interest has been expressed in these data. This report is intended to describe the data and facilitate its use to resolve a number of scientific and engineering questions concerning earthquake rupture processes and resultant near-field motions and strains. This report provides a description of the array, its scientific objectives, and the strong-motion recordings obtained of the main shock. The report provides copies of the uncorrected and corrected data, along with copies of the inferred velocities, displacements, and pseudo-velocity response spectra.
    Digital versions of these recordings are accessible with information available through the internet at several locations: the National Strong-Motion Program web site (http://agram.wr.usgs.gov/), the COSMOS Virtual Data Center Web site

  16. Broadband records of earthquakes in deep gold mines and a comparison with results from SAFOD, California

    Science.gov (United States)

    McGarr, Arthur F.; Boettcher, M.; Fletcher, Jon Peter B.; Sell, Russell; Johnston, Malcolm J.; Durrheim, R.; Spottiswoode, S.; Milev, A.

    2009-01-01

    For one week during September 2007, we deployed a temporary network of field recorders and accelerometers at four sites within two deep, seismically active mines. The ground-motion data, recorded at 200 samples/sec, are well suited to determining source and ground-motion parameters for the mining-induced earthquakes within and adjacent to our network. Four earthquakes with magnitudes close to 2 were recorded with high signal-to-noise ratios at all four sites. Analysis of seismic moments and peak velocities, in conjunction with the results of laboratory stick-slip friction experiments, was used to estimate source processes that are key to understanding source physics and to assessing underground seismic hazard. The maximum displacements on the rupture surfaces can be estimated from the parameter Rv, where v is the peak ground velocity at a given recording site and R is the hypocentral distance. For each earthquake, the maximum slip and seismic moment can be combined with results from laboratory friction experiments to estimate the maximum slip rate within the rupture zone. Analysis of the four M2 earthquakes recorded during our deployment and one of special interest recorded by the in-mine seismic network in 2004 revealed maximum slips ranging from 4 to 27 mm and maximum slip rates from 1.1 to 6.3 m/sec. Applying the same analyses to an M2.1 earthquake within a cluster of repeating earthquakes near the San Andreas Fault Observatory at Depth site, California, yielded similar results for maximum slip and slip rate: 14 mm and 4.0 m/sec.

  17. Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhoku Earthquake.

    Science.gov (United States)

    Hunter, Jennifer C; Crawley, Adam W; Petrie, Michael; Yang, Jane E; Aragón, Tomás J

    2012-07-16

    Background: On Friday, March 11, 2011, a magnitude 9.0 earthquake triggered a tsunami off the eastern coast of Japan, resulting in thousands of lives lost and billions of dollars in damage around the Pacific Rim. The tsunami first reached the California coast on Friday, March 11th, causing more than $70 million in damage and at least one death. While the tsunami's impact on California pales in comparison to the destruction caused in Japan and other areas of the Pacific, the event tested emergency responders' ability to rapidly communicate and coordinate a response to a potential threat. Methods: To evaluate the local public health system emergency response to the tsunami threat in California, we surveyed all local public health, emergency medical services (EMS), and emergency management agencies in coastal or floodplain counties about several domains related to the tsunami threat in California, including: (1) the extent to which their community was affected by the tsunami, (2) when and how they received notification of the event, (3) which public health response activities were carried out to address the tsunami threat in their community, and (4) which organizations contributed to the response. Public health activities were characterized using the Centers for Disease Control and Prevention (CDC) Public Health Emergency Preparedness (PHEP) Capabilities framework. Findings: The tsunami's impact on coastal communities in California ranged widely, both in terms of the economic consequences and the response activities. Based on estimates from the National Oceanic and Atmospheric Administration (NOAA), ten jurisdictions in California reported tsunami-related damage, ranging from $15,000 to $35 million. Respondents first became aware of the tsunami threat in California between 10:00 pm Pacific Standard Time (PST) on Thursday, March 10th, and 2:00 pm PST on Friday, March 11th, a range of 16 hours, with notification occurring through both formal and informal channels. In

  18. The 1989 Ms 7.1 Loma Prieta, California, Magnetic Earthquake Precursor Revisited

    Science.gov (United States)

    Thomas, J. N.; Love, J. J.; Johnston, M. J.

    2007-12-01

    Repeatable prediction of individual large earthquakes on the basis of quantitative geophysical data has proven to be frustratingly difficult and fraught with controversy. Still, some claims of success have been published, and among these are reports of identifiable precursory changes in magnetic-field activity as measured by ground-based magnetometers. By far the most prominent of such claims is that of Fraser-Smith et al. (GRL, 17, 1465-1468, 1990), who identified changes in ultra-low-frequency (ULF, 0.01-10 Hz) magnetic noise prior to the 18 October 1989 Ms 7.1 Loma Prieta, California, earthquake. The Fraser-Smith et al. result has been frequently cited in the literature, and it has been a major motivational influence for new research programs involving large arrays of ground-based instruments and even some satellite-based systems. We re-examine the data of the reported precursor, comparing them against independent data collected by magnetometers located in Japan and in the United States at the time of the Loma Prieta earthquake. From our analysis we infer that the key components of the precursory signal identified by Fraser-Smith et al. can be explained by minor corruption of the data in the form of a gain enhancement and a time-stamp misassignment, possibly due to digital processing errors or inadvertent post-acquisitional treatment. We conclude that the reported magnetic anomaly is not related to the Loma Prieta earthquake.

  19. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  20. Comment on "Revisiting the 1872 owens valley, California, earthquake" by Susan E. Hough and Kate Hutton

    Science.gov (United States)

    Bakun, W.H.

    2009-01-01

    Bakun (2009) argues that the conclusions of Hough and Hutton (2008) are wrong because the study failed to take into account the Sierra Nevada attenuation model of Bakun (2006). In particular, Bakun (2009) argues that propagation effects can explain the relatively high intensities generated by the 1872 Owens Valley earthquake. Using an intensity attenuation model that attempts to account for attenuation through the Sierra Nevada, Bakun (2006) infers the magnitude estimate (Mw 7.4–7.5) that is currently accepted by the National Earthquake Information Center (NEIC).

  1. GPS Time Series Analysis of Southern California Associated with the 2010 M7.2 El Mayor/Cucapah Earthquake

    Science.gov (United States)

    Granat, Robert; Donnellan, Andrea

    2011-01-01

    The magnitude 7.2 El Mayor/Cucapah earthquake that occurred in Mexico on April 4, 2010, was well instrumented with continuous GPS stations in California. Large offsets were observed at the GPS stations as a result of deformation from the earthquake, providing information about the coseismic fault slip as well as fault slip from large aftershocks. Information can also be obtained from the position time series at each station.
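    Coseismic offsets of the kind described above are commonly estimated from a GPS position time series by least-squares fitting of a secular linear trend plus a Heaviside step at the event epoch. A sketch with a synthetic east-component daily series (all station parameters and numbers are invented for illustration):

    ```python
    import numpy as np

    days = np.arange(0.0, 200.0)      # days relative to start of series
    t_eq = 100.0                      # earthquake day within the series
    truth_offset_mm = 25.0            # synthetic coseismic offset
    rng = np.random.default_rng(1)
    # Synthetic east position: 0.05 mm/day trend + step at t_eq + 2 mm noise.
    east_mm = (0.05 * days + truth_offset_mm * (days >= t_eq)
               + rng.normal(0.0, 2.0, days.size))

    # Design matrix: intercept, secular rate, coseismic step.
    G = np.column_stack([np.ones_like(days), days, (days >= t_eq).astype(float)])
    coef, *_ = np.linalg.lstsq(G, east_mm, rcond=None)
    intercept, rate, offset = coef
    print(round(float(offset), 1))  # recovered offset, close to 25 mm
    ```

    Real GPS analyses add terms for seasonal signals, postseismic decay, and equipment steps, but the step-function fit is the core of the offset estimate.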

  2. Comparative Study of Earthquake Clustering in Relation to Hydraulic Activities at Geothermal Fields in California

    Science.gov (United States)

    Martínez-Garzón, P.; Zaliapin, I. V.; Ben-Zion, Y.; Kwiatek, G.; Bohnhoff, M.

    2017-12-01

    We investigate earthquake clustering properties from three geothermal reservoirs to clarify how earthquake patterns respond to hydraulic activities. We process ≈ 9 years of seismicity from four datasets corresponding to the Geysers (both the entire field and a local subset), Coso and Salton Sea geothermal fields, California. For each, the completeness magnitude, b-value and fractal dimension are calculated and used to identify seismicity clusters using the nearest-neighbor approach of Zaliapin and Ben-Zion [2013a, 2013b]. Estimates of the temporal evolution of different clustering properties in relation to hydraulic parameters point to different responses of earthquake dynamics to hydraulic operations in each case study. The clustering at the Geysers at local scale and at the Salton Sea are the most and least affected by hydraulic activities, respectively. The response of the earthquake clustering from the different datasets to hydraulic activities may reflect the regional seismo-tectonic complexity as well as the scale of the geothermal operations (e.g., number of active wells and superposition of injection and production activities). Two clustering properties respond significantly to hydraulic changes across all datasets: the background rates and the proportion of clusters consisting of a single event. Background rates are larger at the Geysers and Coso during high injection-production periods, while the opposite holds for the Salton Sea; this possibly reflects the different physical mechanisms controlling seismicity at each geothermal field. Additionally, a lower proportion of singles is found during time periods with higher injection-production rates. This may reflect decreasing effective stress in areas subjected to higher pore pressure and larger earthquake triggering by stress transfer.
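
    The nearest-neighbor cluster identification used in the study above can be sketched as follows. The b-value, fractal dimension, toy catalog, and the small distance floor are illustrative assumptions for this sketch, not values from the paper:

```python
import numpy as np

def nearest_neighbor_proximity(times, x, y, mags, b=1.0, d_f=1.6):
    """Nearest-neighbor proximity (after Zaliapin and Ben-Zion, 2013):
    for each event j, find the earlier event i minimizing
        eta_ij = t_ij * r_ij**d_f * 10**(-b * m_i),
    where t_ij is the interevent time, r_ij the epicentral distance,
    b the Gutenberg-Richter b-value and d_f the fractal dimension.
    Returns parent index and log10(eta) per event (first event: -1, NaN)."""
    n = len(times)
    parents = np.full(n, -1)
    log_eta = np.full(n, np.nan)
    for j in range(1, n):
        t = times[j] - times[:j]                  # interevent times (catalog sorted)
        r = np.hypot(x[j] - x[:j], y[j] - y[:j])  # epicentral distances
        eta = t * np.maximum(r, 1e-3) ** d_f * 10.0 ** (-b * mags[:j])
        parents[j] = int(np.argmin(eta))
        log_eta[j] = np.log10(eta[parents[j]])
    return parents, log_eta

# Toy catalog (time in days, x/y in km): a background event, an M4 "mainshock"
# with two tight aftershocks, then an isolated background event.
times = np.array([0.0, 10.0, 10.01, 10.02, 50.0])
x = np.array([0.0, 5.0, 5.001, 5.002, 20.0])
y = np.zeros(5)
mags = np.array([2.0, 4.0, 2.0, 2.0, 2.0])
parents, log_eta = nearest_neighbor_proximity(times, x, y, mags)
# Events 2 and 3 attach to the M4 event with very small eta (clustered);
# event 4 has a much larger eta (background-like).
```

    Thresholding the resulting log-proximities then separates clustered events (small eta, linked to a parent) from background seismicity (large eta).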

  3. Earthquakes.

    Science.gov (United States)

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  4. Spatial-temporal variation of low-frequency earthquake bursts near Parkfield, California

    Science.gov (United States)

    Wu, Chunquan; Guyer, Robert; Shelly, David R.; Trugman, D.; Frank, William; Gomberg, Joan S.; Johnson, P.

    2015-01-01

    Tectonic tremor (TT) and low-frequency earthquakes (LFEs) have been found in the deeper crust of various tectonic environments globally in the last decade. The spatial-temporal behaviour of LFEs provides insight into deep fault zone processes. In this study, we examine recurrence times from a 12-yr catalogue of 88 LFE families with ∼730 000 LFEs in the vicinity of the Parkfield section of the San Andreas Fault (SAF) in central California. We apply an automatic burst detection algorithm to the LFE recurrence times to identify the clustering behaviour of LFEs (LFE bursts) in each family. We find that the burst behaviours in the northern and southern LFE groups differ. Generally, the northern group has longer burst duration but fewer LFEs per burst, while the southern group has shorter burst duration but more LFEs per burst. The southern group LFE bursts are generally more correlated than the northern group, suggesting more coherent deep fault slip and relatively simpler deep fault structure beneath the locked section of the SAF. We also find that the 2004 Parkfield earthquake clearly increased the number of LFEs per burst and the average burst duration for both the northern and southern groups, with a relatively larger effect on the northern group. This could be due to the weakness of the northern part of the fault, or to the northwesterly rupture direction of the Parkfield earthquake.
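
    The study's specific burst detection algorithm is not reproduced here; a minimal sketch of the idea, grouping LFEs whose recurrence times fall below an assumed gap threshold, might look like:

```python
import numpy as np

def detect_bursts(event_times, max_gap):
    """Group events into bursts: consecutive events whose recurrence time
    (inter-event gap) is at most max_gap belong to the same burst.
    Returns a list of (start_index, end_index_inclusive) pairs."""
    bursts = []
    start = 0
    for i in range(1, len(event_times)):
        if event_times[i] - event_times[i - 1] > max_gap:
            bursts.append((start, i - 1))
            start = i
    bursts.append((start, len(event_times) - 1))
    return bursts

# Hypothetical LFE occurrence times (days) for one family
t = np.array([0.0, 0.1, 0.3, 5.0, 5.2, 20.0])
bursts = detect_bursts(t, max_gap=1.0)
# → [(0, 2), (3, 4), (5, 5)]: two multi-event bursts and one single
```

    Per-burst statistics such as burst duration (`t[end] - t[start]`) and LFE count per burst then follow directly from the index pairs.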

  5. Slip on the San Andreas fault at Parkfield, California, over two earthquake cycles, and the implications for seismic hazard

    Science.gov (United States)

    Murray, J.; Langbein, J.

    2006-01-01

    Parkfield, California, which experienced M 6.0 earthquakes in 1934, 1966, and 2004, is one of the few locales for which geodetic observations span multiple earthquake cycles. We undertake a comprehensive study of deformation over the most recent earthquake cycle and explore the results in the context of geodetic data collected prior to the 1966 event. Through joint inversion of the variety of Parkfield geodetic measurements (trilateration, two-color laser, and Global Positioning System), including previously unpublished two-color data, we estimate the spatial distribution of slip and slip rate along the San Andreas using a fault geometry based on precisely relocated seismicity. Although the three most recent Parkfield earthquakes appear complementary in their along-strike distributions of slip, they do not produce uniform strain release along strike over multiple seismic cycles. Since the 1934 earthquake, more than 1 m of slip deficit has accumulated on portions of the fault that slipped in the 1966 and 2004 earthquakes, and an average of 2 m of slip deficit exists on the 33 km of the fault southeast of Gold Hill to be released in a future, perhaps larger, earthquake. It appears that the fault is capable of partially releasing stored strain in moderate earthquakes, maintaining a disequilibrium through multiple earthquake cycles. This complicates the application of simple earthquake recurrence models that assume only the strain accumulated since the most recent event is relevant to the size or timing of an upcoming earthquake. Our findings further emphasize that accumulated slip deficit is not sufficient for earthquake nucleation.
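
    The slip-budget bookkeeping behind a statement like "more than 1 m of slip deficit has accumulated" is simple arithmetic: deficit = long-term loading rate × elapsed time − slip released in earthquakes. A sketch with purely illustrative numbers, not the study's values:

```python
def slip_deficit_m(slip_rate_mm_yr, years, slip_released_m):
    """Slip deficit (m) = long-term loading minus slip released seismically."""
    return slip_rate_mm_yr * 1e-3 * years - slip_released_m

# e.g. a hypothetical 35 mm/yr of deep loading over 70 yr, less 1.5 m of
# coseismic + postseismic slip, leaves ~0.95 m stored
deficit = slip_deficit_m(35, 70, 1.5)
```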

  6. Location of Moderate-Sized Earthquakes Recorded by the NARS-Baja Array in the Gulf of California Region Between 2002 and 2006

    Science.gov (United States)

    Castro, Raul R.; Perez-Vertti, Arturo; Mendez, Ignacio; Mendoza, Antonio; Inzunza, Luis

    2011-08-01

    We relocated the hypocentral coordinates of small to moderate-sized earthquakes reported by the National Earthquake Information Center (NEIC) between April 2002 and August 2006 in the Gulf of California region and recorded by the broadband stations of the network of autonomously recording seismographs (NARS-Baja array). The NARS-Baja array consists of 19 stations installed in the Baja California peninsula, Sonora and Sinaloa, Mexico. The events reported by the preliminary determinations of epicenters (PDE) catalog within the period of interest have moment magnitudes (Mw) ranging between 1.1 and 6.7. We estimated the hypocentral location of these events using P and S wave arrivals recorded by the regional broadband stations of the NARS-Baja and RESBAN (Red Sismológica de Banda Ancha) arrays, using a standard location procedure with the HYPOCENTER code (Lienert and Havskov, Seism. Res. Lett. 66:26-36, 1995) as a preliminary step. To refine the initial hypocenters, we used the shrinking-box source-specific station term method of Lin and Shearer (J. Geophys. Res. 110, B04304, 2005). We found that most of the seismicity is distributed in the NW-SE direction along the axis of the Gulf of California, following a linear trend that, from north to south, steps southward near the main basins (Wagner, Delfin, Guaymas, Carmen, Farallon, Pescadero and Alarcon) and spreading centers. We compared the epicentral locations reported in the PDE catalog with the locations obtained using regional arrival times, and found that earthquakes with magnitudes in the range mb 3.2-5.0 differ on average by as much as 43 km. For the Mw range between 5 and 6.7 the discrepancy is smaller, about 25 km on average. The relocated epicenters correlate well with the main bathymetric features of the Gulf.
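
    Catalog-to-catalog location discrepancies such as the 43 km and 25 km averages above reduce to great-circle distances between epicenter pairs. A sketch with hypothetical coordinates (not actual catalog entries):

```python
import math

def epicentral_distance_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two epicenters, in km."""
    R = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Hypothetical PDE vs. regional relocation for one Gulf of California event
d = epicentral_distance_km(27.50, -111.50, 27.80, -111.20)  # ≈ 45 km
```

    Averaging such distances over all common events, per magnitude bin, yields the kind of discrepancy statistics quoted in the abstract.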

  7. Caltech/USGS Southern California Seismic Network (SCSN): Infrastructure upgrade to support Earthquake Early Warning (EEW)

    Science.gov (United States)

    Bhadha, R. J.; Hauksson, E.; Boese, M.; Felizardo, C.; Thomas, V. I.; Yu, E.; Given, D. D.; Heaton, T. H.; Hudnut, K. W.

    2013-12-01

    The SCSN is the modern digital ground motion seismic network in Southern California and performs the following tasks: 1) Operates remote seismic stations and the central data processing systems in Pasadena; 2) Generates and reports real-time products including location, magnitude, ShakeMap, aftershock probabilities and others; 3) Responds to FEMA, CalOES, media, and public inquiries about earthquakes; 4) Manages the production, archival, and distribution of waveforms, phase picks, and other data at the SCEDC; 5) Contributes to development and implementation of the demonstration EEW system called CISN ShakeAlert. Initially, the ShakeAlert project was funded through the US Geological Survey (USGS), and in early 2012 the Gordon and Betty Moore Foundation provided three years of new funding for EEW research and development for the US west coast. Recently, we have also received Urban Areas Security Initiative (UASI) funding to enhance the EEW capabilities for the local UASI region by making our system faster, more reliable, and more redundant than the existing system. The additional and upgraded stations will be capable of decreasing latency and ensuring data delivery by using more reliable and redundant telemetry pathways. Overall, this will enhance the reliability of the earthquake early warnings by providing denser station coverage and more resilient data centers than before.
    * Seismic datalogger upgrade: replaces existing dataloggers with modern equipment capable of sending one-second uncompressed packets and utilizing redundant Ethernet telemetry.
    * GPS upgrade: replaces the existing GPS receivers and antennas, especially at "zipper array" sites near the major faults, with receivers that perform on-board precise point positioning to calculate position and velocity in real time and stream continuous data for use in EEW calculations.
    * New co-located seismic/GPS stations: increases station density and reduces early warning delays that are incurred by travel

  8. The Salton Seismic Imaging Project: Investigating Earthquake Hazards in the Salton Trough, Southern California

    Science.gov (United States)

    Fuis, G. S.; Goldman, M.; Sickler, R. R.; Catchings, R. D.; Rymer, M. J.; Rose, E. J.; Murphy, J. M.; Butcher, L. A.; Cotton, J. A.; Criley, C. J.; Croker, D. S.; Emmons, I.; Ferguson, A. J.; Gardner, M. A.; Jensen, E. G.; McClearn, R.; Loughran, C. L.; Slayday-Criley, C. J.; Svitek, J. F.; Hole, J. A.; Stock, J. M.; Skinner, S. M.; Driscoll, N. W.; Harding, A. J.; Babcock, J. M.; Kent, G.; Kell, A. M.; Harder, S. H.

    2011-12-01

    The Salton Seismic Imaging Project (SSIP) is a collaborative effort between academia and the U.S. Geological Survey to provide detailed, subsurface 3-D images of the Salton Trough of southern California and northern Mexico. From both active- and passive-source seismic data that were acquired both onshore and offshore (Salton Sea), the resulting images will provide insights into earthquake hazards, rift processes, and rift-transform interaction at the southern end of the San Andreas Fault system. The southernmost San Andreas Fault (SAF) is considered to be at high risk of producing a large damaging earthquake, yet the structure of this and other regional faults and that of adjacent sedimentary basins is not currently well understood. Seismic data were acquired from 2 to 18 March 2011. One hundred and twenty-six borehole explosions (10-1400 kg yield) were detonated along seven profiles in the Salton Trough region, extending from the area of Palm Springs, California, to the southwestern tip of Arizona. Airguns (1500 and 3500 cc) were fired along two profiles in the Salton Sea and at points in a 2-D array in the southern Salton Sea. Approximately 2800 seismometers were deployed at over 4200 locations throughout the Salton Trough region, and 48 ocean-bottom seismometers were deployed at 78 locations beneath the Salton Sea. Many of the onshore explosions were energetic enough to be recorded and located by the Southern California Seismograph Network. The geometry of the SAF has important implications for energy radiation in the next major rupture. Prior potential field, seismicity, and InSAR data indicate that the SAF may dip moderately to the northeast from the Salton Sea to Cajon Pass in the Transverse Ranges. Much of SSIP was designed to test models of this geometry.

  9. Monitoring reservoir response to earthquakes and fluid extraction, Salton Sea geothermal field, California

    Science.gov (United States)

    Taira, Taka’aki; Nayak, Avinash; Brenguier, Florent; Manga, Michael

    2018-01-01

    Continuous monitoring of in situ reservoir responses to stress transients provides insights into the evolution of geothermal reservoirs. By exploiting the stress dependence of seismic velocity changes, we investigate the temporal evolution of the reservoir stress state of the Salton Sea geothermal field (SSGF), California. We find that the SSGF experienced a number of sudden velocity reductions (~0.035 to 0.25%) that are most likely caused by openings of fractures due to dynamic stress transients (as small as 0.08 MPa and up to 0.45 MPa) from local and regional earthquakes. Depths of velocity changes are estimated to be about 0.5 to 1.5 km, similar to the depths of the injection and production wells. We derive an empirical in situ stress sensitivity of seismic velocity changes by relating velocity changes to dynamic stresses. We also observe systematic velocity reductions (0.04 to 0.05%) during earthquake swarms in mid-November 2009 and late-December 2010. On the basis of volumetric static and dynamic stress changes, the expected velocity reductions from the largest earthquakes with magnitude ranging from 3 to 4 in these swarms are less than 0.02%, which suggests that these earthquakes are likely not responsible for the velocity changes observed during the swarms. Instead, we argue that velocity reductions may have been induced by poroelastic opening of fractures due to aseismic deformation. We also observe a long-term velocity increase (~0.04%/year) that is most likely due to poroelastic contraction caused by the geothermal production. Our observations demonstrate that seismic interferometry provides insights into in situ reservoir response to stress changes. PMID:29326977

  10. Transient stress-coupling between the 1992 Landers and 1999 Hector Mine, California, earthquakes

    Science.gov (United States)

    Masterlark, Timothy; Wang, H.F.

    2002-01-01

    A three-dimensional finite-element model (FEM) of the Mojave block region in southern California is constructed to investigate transient stress-coupling between the 1992 Landers and 1999 Hector Mine earthquakes. The FEM simulates a poroelastic upper-crust layer coupled to a viscoelastic lower-crust layer, which is decoupled from the upper mantle. FEM predictions of the transient mechanical behavior of the crust are constrained by global positioning system (GPS) data, interferometric synthetic aperture radar (InSAR) images, fluid-pressure data from water wells, and the dislocation source of the 1999 Hector Mine earthquake. Two time-dependent parameters, hydraulic diffusivity of the upper crust and viscosity of the lower crust, are calibrated to 10⁻² m²·sec⁻¹ and 5 × 10¹⁸ Pa·sec, respectively. The hydraulic diffusivity is relatively insensitive to heterogeneous fault-zone permeability specifications and fluid-flow boundary conditions along the elastic free-surface at the top of the problem domain. The calibrated FEM is used to predict the evolution of Coulomb stress during the interval separating the 1992 Landers and 1999 Hector Mine earthquakes. The predicted change in Coulomb stress near the hypocenter of the Hector Mine earthquake increases from 0.02 to 0.05 MPa during the 7-yr interval separating the two events. This increase is primarily attributed to the recovery of decreased excess fluid pressure from the 1992 Landers coseismic (undrained) strain field. Coulomb stress predictions are insensitive to small variations of fault-plane dip and hypocentral depth estimations of the Hector Mine rupture.
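
    Coulomb stress change on a receiver fault is conventionally ΔCFS = Δτ + μ′Δσn, with Δτ the shear stress change in the slip direction, Δσn the normal stress change (positive = unclamping) and μ′ an effective friction coefficient. A sketch with illustrative values of the order reported above:

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Change in Coulomb failure stress on a receiver fault (MPa):
        dCFS = d_tau + mu' * d_sigma_n
    d_shear:  shear stress change resolved in the slip direction
              (positive promotes slip)
    d_normal: normal stress change (positive = unclamping)
    mu_eff:   effective friction coefficient (folds in pore-pressure effects)"""
    return d_shear + mu_eff * d_normal

# Illustrative values only, of the order quoted for the Hector Mine hypocenter
d_cfs = coulomb_stress_change(0.03, 0.05)  # ≈ 0.05 MPa
```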

  12. Geodetic constraints on frictional properties and earthquake hazard in the Imperial Valley, Southern California

    Science.gov (United States)

    Lindsey, Eric O.; Fialko, Yuri

    2016-02-01

    We analyze a suite of geodetic observations across the Imperial Fault in southern California that span all parts of the earthquake cycle. Coseismic and postseismic surface slips due to the 1979 M 6.6 Imperial Valley earthquake were recorded with trilateration and alignment surveys by Harsh (1982) and Crook et al. (1982), and interseismic deformation is measured using a combination of multiple interferometric synthetic aperture radar (InSAR)-viewing geometries and continuous and survey-mode GPS. In particular, we combine more than 100 survey-mode GPS velocities with InSAR data from Envisat descending tracks 84 and 356 and ascending tracks 77 and 306 (149 total acquisitions), processed using a persistent scatterers method. The result is a dense map of interseismic velocities across the Imperial Fault and surrounding areas that allows us to evaluate the rate of interseismic loading and along-strike variations in surface creep. We compare available geodetic data to models of the earthquake cycle with rate- and state-dependent friction and find that a complete record of the earthquake cycle is required to constrain key fault properties including the rate-dependence parameter (a - b) as a function of depth, the extent of shallow creep, and the recurrence interval of large events. We find that the data are inconsistent with a high (>30 mm/yr) slip rate on the Imperial Fault and investigate the possibility that an extension of the San Jacinto-Superstition Hills Fault system through the town of El Centro may accommodate a significant portion of the slip previously attributed to the Imperial Fault. Models including this additional fault are in better agreement with the available observations, suggesting that the long-term slip rate of the Imperial Fault is lower than previously suggested and that there may be a significant unmapped hazard in the western Imperial Valley.

  13. Rates and patterns of surface deformation from laser scanning following the South Napa earthquake, California

    Science.gov (United States)

    DeLong, Stephen B.; Lienkaemper, James J.; Pickering, Alexandra J; Avdievitch, Nikita N.

    2015-01-01

    The A.D. 2014 M6.0 South Napa earthquake, despite its moderate magnitude, caused significant damage to the Napa Valley in northern California (USA). Surface rupture occurred along several mapped and unmapped faults. Field observations following the earthquake indicated that the magnitude of postseismic surface slip was likely to approach or exceed the maximum coseismic surface slip and as such presented ongoing hazard to infrastructure. Using a laser scanner, we monitored postseismic deformation in three dimensions through time along 0.5 km of the main surface rupture. A key component of this study is the demonstration of proper alignment of repeat surveys using point cloud–based methods that minimize error imposed by both local survey errors and global navigation satellite system georeferencing errors. Using solid modeling of natural and cultural features, we quantify dextral postseismic displacement at several hundred points near the main fault trace. We also quantify total dextral displacement of initially straight cultural features. Total dextral displacement from both coseismic displacement and the first 2.5 d of postseismic displacement ranges from 0.22 to 0.29 m. This range increased to 0.33–0.42 m at 59 d post-earthquake. Furthermore, we estimate up to 0.15 m of vertical deformation during the first 2.5 d post-earthquake, which then increased by ∼0.02 m at 59 d post-earthquake. This vertical deformation is not expressed as a distinct step or scarp at the fault trace but rather as a broad up-to-the-west zone of increasing elevation change spanning the fault trace over several tens of meters, challenging common notions about fault scarp development in strike-slip systems. Integrating these analyses provides three-dimensional mapping of surface deformation and identifies spatial variability in slip along the main fault trace that we attribute to distributed slip via subtle block rotation. These results indicate the benefits of laser scanner surveys along

  14. Sonoma Ecology Center Northern California Arundo Distribution Data

    Data.gov (United States)

    California Department of Resources — The Arundo Distribution layer is a compilation of Arundo donax observations in northern and central California, obtained from numerous sources, including Arundo...

  15. An overview of the National Earthquake Information Center acquisition software system, Edge/Continuous Waveform Buffer

    Science.gov (United States)

    Patton, John M.; Ketchum, David C.; Guy, Michelle R.

    2015-11-02

    This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.
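
    The Continuous Waveform Buffer itself is a large production system; as a toy illustration of the underlying idea (a bounded per-channel buffer of recent data packets serving queries), one might write:

```python
from collections import deque

class WaveformBuffer:
    """Toy continuous-waveform buffer: keeps the most recent max_packets
    one-second data packets per channel, dropping the oldest first.
    (The real CWB adds archiving, distribution and playback on top.)"""

    def __init__(self, max_packets):
        self.max_packets = max_packets
        self.channels = {}

    def append(self, channel, packet):
        """Acquire a packet for a channel, evicting the oldest if full."""
        buf = self.channels.setdefault(channel, deque(maxlen=self.max_packets))
        buf.append(packet)

    def query(self, channel, n):
        """Return up to the n most recent packets for a channel."""
        return list(self.channels.get(channel, deque()))[-n:]

wb = WaveformBuffer(max_packets=3)
for i in range(5):
    wb.append("CI.PAS.BHZ", f"packet-{i}")   # hypothetical channel name
recent = wb.query("CI.PAS.BHZ", 2)           # → ['packet-3', 'packet-4']
```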

  16. G-larmS: An Infrastructure for Geodetic Earthquake Early Warning, applied to Northern California

    Science.gov (United States)

    Johanson, I. A.; Grapenthin, R.; Allen, R. M.

    2014-12-01

    Integrating geodetic data into seismic earthquake early warning (EEW) is critical for accurately resolving magnitude and finite fault dimensions in the very largest earthquakes (M>7). We have developed G-larmS, the Geodetic alarm System, as part of our efforts to incorporate geodetic data into EEW for Northern California. G-larmS is an extensible geodetic EEW infrastructure that analyzes positioning time series from real-time GPS processors, such as TrackRT or RTNET. It is currently running in an operational mode at the Berkeley Seismological Laboratory (BSL), where we use TrackRT to produce high sample rate displacement time series for 62 GPS stations in the greater San Francisco Bay Area with 3-4 second latency. We employ a fully triangulated network scheme, which provides resiliency against an outage or telemetry loss at any individual station, for a total of 165 basestation-rover pairs. G-larmS is tightly integrated into seismic alarm systems (CISN ShakeAlert, ElarmS), as it uses their P-wave detection alarms to trigger its own processing and sends warning messages back to the ShakeAlert decision module. Once triggered, G-larmS estimates the static offset at each station pair and inputs these into an inversion for fault slip, which is updated once per second. The software architecture and clear interface definitions of this Python implementation enable straightforward extensibility and exchange of specific algorithms that operate in the individual modules. For example, multiple modeling instances can be called in parallel, each applying a different strategy to infer fault and magnitude information (e.g., pre-defined fault planes, full grid search, least squares inversion, etc.). This design enables, for example, quick tests, expansion and algorithm comparisons. Here, we present the setup and report results of the first months of operation in Northern California. This includes analysis of system latencies, noise, and G-larmS' response to actual events. We
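
    The offset-then-invert chain described above can be sketched as below. The Green's-function matrix, station count and noise levels here are made-up placeholders for illustration, not G-larmS' actual configuration:

```python
import numpy as np

def static_offset(series, n_pre, n_post):
    """Offset = mean of the last n_post samples minus mean of the first n_pre."""
    return series[-n_post:].mean() - series[:n_pre].mean()

G = np.array([[0.8, 0.1],   # hypothetical Green's functions mapping slip on
              [0.3, 0.9],   # 2 fault patches to static offsets at 3 stations
              [0.2, 0.4]])
true_slip = np.array([0.4, 0.1])   # metres, for the synthetic test
offsets = G @ true_slip            # noise-free static offsets

# Synthetic 1 Hz displacement series: pre-event noise, then a stepped level
rng = np.random.default_rng(0)
series = [np.concatenate([rng.normal(0.0, 0.002, 30),
                          rng.normal(o, 0.002, 30)]) for o in offsets]

d = np.array([static_offset(s, 30, 30) for s in series])  # per-station offsets
slip, *_ = np.linalg.lstsq(G, d, rcond=None)              # least-squares slip
```

    In a real-time setting the offset estimates, and hence the slip inversion, are simply re-run each second as new samples arrive.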

  17. Uniform California earthquake rupture forecast, version 3 (UCERF3): the time-independent model

    Science.gov (United States)

    Field, Edward H.; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David D.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin R.; Page, Morgan T.; Parsons, Thomas; Powers, Peter M.; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua; ,

    2013-01-01

    In this report we present the time-independent component of the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3), which provides authoritative estimates of the magnitude, location, and time-averaged frequency of potentially damaging earthquakes in California. The primary achievements have been to relax fault segmentation assumptions and to include multifault ruptures, both limitations of the previous model (UCERF2). The rates of all earthquakes are solved for simultaneously, and from a broader range of data, using a system-level "grand inversion" that is both conceptually simple and extensible. The inverse problem is large and underdetermined, so a range of models is sampled using an efficient simulated annealing algorithm. The approach is more derivative than prescriptive (for example, magnitude-frequency distributions are no longer assumed), so new analysis tools were developed for exploring solutions. Epistemic uncertainties were also accounted for using 1,440 alternative logic tree branches, necessitating access to supercomputers. The most influential uncertainties include alternative deformation models (fault slip rates), a new smoothed seismicity algorithm, alternative values for the total rate of M≥5 events, and different scaling relationships, virtually all of which are new. As a notable first, three deformation models are based on kinematically consistent inversions of geodetic and geologic data, also providing slip-rate constraints on faults previously excluded because of lack of geologic data. The grand inversion constitutes a system-level framework for testing hypotheses and balancing the influence of different experts. For example, we demonstrate serious challenges with the Gutenberg-Richter hypothesis for individual faults. UCERF3 is still an approximation of the system, however, and the range of models is limited (for example, constrained to stay close to UCERF2). Nevertheless, UCERF3 removes the apparent UCERF2 overprediction of

  18. EFFECTS OF THE 1983 COALINGA, CALIFORNIA, EARTHQUAKE ON CREEP ALONG THE SAN ANDREAS FAULT.

    Science.gov (United States)

    Mavko, Gerald M.; Schulz, Sandra; Brown, Beth D.

    1985-01-01

    The ML ≈ 6.5 earthquake that occurred near Coalinga, California, on May 2, 1983, induced changes in near-surface fault slip along the San Andreas fault. Coseismic steps were observed by creepmeters along a 200-km section of the San Andreas. Some of the larger aftershocks induced additional steps, both right-lateral and left-lateral, and in general the sequence disrupted observed creep at several sites from preseismic long-term patterns. Static dislocation models can approximately explain the magnitudes and distribution of the larger coseismic steps on May 2. The smaller, more distant steps appear to be the abrupt release of accumulated slip, triggered by the coseismic strain changes but independent of the strain change amplitudes.

  19. Comparison of four moderate-size earthquakes in southern California using seismology and InSAR

    Science.gov (United States)

    Mellors, R.J.; Magistrale, H.; Earle, P.; Cogbill, A.H.

    2004-01-01

    Source parameters determined from interferometric synthetic aperture radar (InSAR) measurements and from seismic data are compared from four moderate-size (less than M 6) earthquakes in southern California. The goal is to verify approximate detection capabilities of InSAR, assess differences in the results, and test how the two results can be reconciled. First, we calculated the expected surface deformation from all earthquakes greater than magnitude 4 in areas with available InSAR data (347 events). A search for deformation from the events in the interferograms yielded four possible events with magnitudes less than 6. The search for deformation was based on a visual inspection as well as cross-correlation in two dimensions between the measured signal and the expected signal. A grid-search algorithm was then used to estimate focal mechanism and depth from the InSAR data. The results were compared with locations and focal mechanisms from published catalogs. An independent relocation using seismic data was also performed. The seismic locations fell within the area of the expected rupture zone for the three events that show clear surface deformation. Therefore, the technique shows the capability to resolve locations with high accuracy and is applicable worldwide. The depths determined by InSAR agree with well-constrained seismic locations determined in a 3D velocity model. Depth control for well-imaged shallow events using InSAR data is good, and better than the seismic constraints in some cases. A major difficulty for InSAR analysis is the poor temporal coverage of InSAR data, which may make it impossible to distinguish deformation due to different earthquakes at the same location.
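
    The cross-correlation step used to search interferograms for deformation can be illustrated with a zero-mean normalized correlation between an observed patch and a modeled signal. This is a synthetic example, not the study's data or exact detector:

```python
import numpy as np

def normalized_cc(obs, model):
    """Zero-mean normalized cross-correlation between an observed
    interferogram patch and a modeled deformation pattern (same shape).
    Returns a value in [-1, 1]; near 1 indicates a matching signal."""
    o = obs - obs.mean()
    m = model - model.mean()
    return float((o * m).sum() / np.sqrt((o ** 2).sum() * (m ** 2).sum()))

# Idealized deformation lobe plus phase noise as a stand-in for real data
y, x = np.mgrid[-32:32, -32:32]
model = np.exp(-(x ** 2 + y ** 2) / 200.0)
obs = model + np.random.default_rng(1).normal(0, 0.1, model.shape)
cc = normalized_cc(obs, model)   # high when the expected signal is present
```

    Sliding the model over the interferogram (or over a grid of candidate source parameters) and keeping the peak correlation is the essence of the detection and grid-search steps described above.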

  20. Health information technology adoption in California community health centers.

    Science.gov (United States)

    Kim, Katherine K; Rudin, Robert S; Wilson, Machelle D

    2015-12-01

    National and state initiatives to spur adoption of electronic health records (EHRs) and health information exchange (HIE) among providers in rural and underserved communities have been in place for 15 years. Our goal was to systematically assess the impact of these initiatives by quantifying the level of adoption and the key factors associated with adoption among community health centers in California. Cross-sectional statewide survey. We conducted a telephone survey of all California primary care community health centers (CHCs) from August to September 2013. Multiple logistic regressions were fit to test for associations between various practice characteristics and adoption of EHRs, meaningful use-certified EHRs, and HIE. For the multivariable model, we included those variables that were significant at the P = .10 level in the univariate tests. We received responses from 194 CHCs (73.5% response rate). Adoption of any EHR (80.3%) and of meaningful use-certified EHRs (94.6% of those with an EHR) was very high. Adoption of HIE is substantial (48.7%) and took place within a few years (mean = 2.61 years; SD = 2.01). More than half (54.7%) of CHCs are able to receive data into the EHR, indicating some level of interoperability. Patient engagement capacity is moderate, with 21.6% offering a personal health record (PHR) and 55.2% offering electronic visit summaries. Rural location and belonging to a multi-site clinic organization are both associated with adoption of EHRs, HIE, and electronic visit summaries, with odds ratios ranging from 0.63 to 3.28 (all P values < .05). The adoption of health information technology (IT) in rural areas may be the result of both federal and state investments. As CHCs lack access to capital for investments, continued support of technology infrastructure may be needed for them to further leverage health IT to improve healthcare.
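
    The odds ratios reported by such surveys are exp(β) of the fitted logistic-regression coefficients. An illustrative conversion follows; the coefficient values here are back-computed from the reported OR range for illustration, not taken from the paper:

```python
import math

def odds_ratio(beta):
    """Convert a logistic-regression coefficient to an odds ratio."""
    return math.exp(beta)

# A coefficient near 1.19 corresponds to OR ≈ 3.3 (odds increased);
# a coefficient near -0.46 corresponds to OR ≈ 0.63 (odds reduced).
or_high = odds_ratio(1.19)
or_low = odds_ratio(-0.46)
```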

  1. A Further Review of the California State University's Contra Costa Center. Commission Report 89-9.

    Science.gov (United States)

    California State Postsecondary Education Commission, Sacramento.

    A follow-up report on the California State University's Contra Costa Center, a proposed permanent off-campus center, is presented. The California Postsecondary Education Commission approved the original proposal in 1987, contingent on finding solutions to concerns about transportation access and services to disadvantaged students. The university…

  2. The Napa (California, US) earthquake of 24 August 2014 (10.24 UT) Magnitude = 6.0

    International Nuclear Information System (INIS)

    Scotti, Oona

    2014-01-01

This publication briefly presents the characteristics of an earthquake which occurred in California in August 2014, indicates some data recorded by local seismic stations, and gives a brief overview of human and economic damage. It analyses the geological setting of the earthquake, recalls previous events, and outlines the local seismic risk. After noting that there were no consequences for the closest nuclear power station (300 km away), it presents the lessons this event offers regarding surface rupture along a fault, in order to better assess the risk of surface faulting.

  3. Survey of strong motion earthquake effects on thermal power plants in California with emphasis on piping systems. Volume 2, Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Stevenson, J.D. [Stevenson and Associates, Cleveland, OH (United States)

    1995-11-01

Volume 2 of the "Survey of Strong Motion Earthquake Effects on Thermal Power Plants in California with Emphasis on Piping Systems" contains appendices that detail the design and seismic response of several power plants subjected to strong-motion earthquakes. The plants considered include the Ormond Beach, Long Beach and Seal Beach, Burbank, El Centro, Glendale, Humboldt Bay, Kern Valley, Pasadena, and Valley power plants. Included are a typical power plant piping specification and photographs of typical piping and support installations for the plants surveyed. Detailed piping support spacing data are also included.

  4. Local Public Health System Response to the Tsunami Threat in Coastal California following the Tōhoku Earthquake

    OpenAIRE

    Hunter, Jennifer C.; Crawley, Adam W.; Petrie, Michael; Yang, Jane E.; Aragón, Tomás J.

    2012-01-01

    Background On Friday March 11, 2011 a 9.0 magnitude earthquake triggered a tsunami off the eastern coast of Japan, resulting in thousands of lives lost and billions of dollars in damage around the Pacific Rim. The tsunami first reached the California coast on Friday, March 11th, causing more than $70 million in damage and at least one death. While the tsunami’s impact on California pales in comparison to the destruction caused in Japan and other areas of the Pacific, the event tested emergenc...

  5. Preliminary Results on Earthquake Recurrence Intervals, Rupture Segmentation, and Potential Earthquake Moment Magnitudes along the Tahoe-Sierra Frontal Fault Zone, Lake Tahoe, California

    Science.gov (United States)

    Howle, J.; Bawden, G. W.; Schweickert, R. A.; Hunter, L. E.; Rose, R.

    2012-12-01

Utilizing high-resolution bare-earth LiDAR topography, field observations, and earlier results of Howle et al. (2012), we estimate latest Pleistocene/Holocene earthquake-recurrence intervals, propose scenarios for earthquake-rupture segmentation, and estimate potential earthquake moment magnitudes for the Tahoe-Sierra frontal fault zone (TSFFZ), west of Lake Tahoe, California. We have developed a new technique to estimate the vertical separation for the most recent and the previous ground-rupturing earthquakes at five sites along the Echo Peak and Mt. Tallac segments of the TSFFZ. At these sites are fault scarps with two bevels separated by an inflection point (compound fault scarps), indicating that the cumulative vertical separation (VS) across the scarp resulted from two events. This technique, modified from the modeling methods of Howle et al. (2012), uses the far-field plunge of the best-fit footwall vector and the fault-scarp morphology from high-resolution LiDAR profiles to estimate the per-event VS. From these data, we conclude that the adjacent and overlapping Echo Peak and Mt. Tallac segments have ruptured coseismically twice during the Holocene. The right-stepping, en echelon range-front segments of the TSFFZ show progressively greater VS rates and shorter earthquake-recurrence intervals from southeast to northwest. Our preliminary estimates suggest latest Pleistocene/Holocene earthquake-recurrence intervals of 4.8 ± 0.9 × 10³ years for a coseismic rupture of the Echo Peak and Mt. Tallac segments, located at the southeastern end of the TSFFZ. For the Rubicon Peak segment, northwest of the Echo Peak and Mt. Tallac segments, our preliminary estimate of the maximum earthquake-recurrence interval is 2.8 ± 1.0 × 10³ years, based on data from two sites. The correspondence between high VS rates and short recurrence intervals suggests that earthquake sequences along the TSFFZ may initiate in the northwest part of the zone and then occur to the southeast with a lower
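Estimating "potential earthquake moment magnitudes" from per-event slip, as this abstract describes, typically goes through the standard seismic-moment relation Mw = (2/3)·log10(M0) − 6.07 with M0 = μ·A·D. A minimal sketch (the shear modulus and rupture dimensions below are illustrative assumptions, not values from this study):

```python
import math

MU = 3.0e10  # shear modulus in Pa, a typical crustal value (assumption)

def moment_magnitude(length_m, width_m, slip_m, mu=MU):
    """Moment magnitude from seismic moment M0 = mu * A * D (N*m),
    using the Hanks & Kanamori (1979) Mw scale."""
    m0 = mu * length_m * width_m * slip_m
    return (2.0 / 3.0) * math.log10(m0) - 6.07

# Hypothetical rupture: 30 km long, 15 km wide, 2 m average slip.
print(round(moment_magnitude(30e3, 15e3, 2.0), 2))  # 6.88
```

Larger per-event vertical separations on the LiDAR-profiled scarps thus translate, through slip and assumed rupture area, into larger potential magnitudes.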

  6. A spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3‐ETAS): Toward an operational earthquake forecast

    Science.gov (United States)

Field, Edward; Milner, Kevin R.; Hardebeck, Jeanne L.; Page, Morgan T.; van der Elst, Nicholas; Jordan, Thomas H.; Michael, Andrew J.; Shaw, Bruce E.; Werner, Maximilian J.

    2017-01-01

    We, the ongoing Working Group on California Earthquake Probabilities, present a spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3), with the goal being to represent aftershocks, induced seismicity, and otherwise triggered events as a potential basis for operational earthquake forecasting (OEF). Specifically, we add an epidemic‐type aftershock sequence (ETAS) component to the previously published time‐independent and long‐term time‐dependent forecasts. This combined model, referred to as UCERF3‐ETAS, collectively represents a relaxation of segmentation assumptions, the inclusion of multifault ruptures, an elastic‐rebound model for fault‐based ruptures, and a state‐of‐the‐art spatiotemporal clustering component. It also represents an attempt to merge fault‐based forecasts with statistical seismology models, such that information on fault proximity, activity rate, and time since last event are considered in OEF. We describe several unanticipated challenges that were encountered, including a need for elastic rebound and characteristic magnitude–frequency distributions (MFDs) on faults, both of which are required to get realistic triggering behavior. UCERF3‐ETAS produces synthetic catalogs of M≥2.5 events, conditioned on any prior M≥2.5 events that are input to the model. We evaluate results with respect to both long‐term (1000 year) simulations as well as for 10‐year time periods following a variety of hypothetical scenario mainshocks. Although the results are very plausible, they are not always consistent with the simple notion that triggering probabilities should be greater if a mainshock is located near a fault. Important factors include whether the MFD near faults includes a significant characteristic earthquake component, as well as whether large triggered events can nucleate from within the rupture zone of the mainshock. Because UCERF3‐ETAS has many sources of uncertainty, as
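The ETAS component added in UCERF3-ETAS rests on a conditional intensity: a background rate plus an Omori-Utsu aftershock contribution from every prior event, scaled by that event's magnitude. A minimal sketch of such a rate function (all parameter values below are illustrative, not the UCERF3-ETAS calibration):

```python
def etas_rate(t, events, mu=0.1, K=0.02, alpha=1.0, c=0.01, p=1.1, m_ref=2.5):
    """Conditional intensity of a simple ETAS model at time t (days):
    background rate mu plus Omori-Utsu decay K*10^(alpha*(m-m_ref))/(t-t_i+c)^p
    summed over all prior events (t_i, m_i)."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * 10.0 ** (alpha * (m_i - m_ref)) / (t - t_i + c) ** p
    return rate

# A hypothetical M6.0 mainshock at t=0 dominates the rate, which then decays:
catalog = [(0.0, 6.0)]
print(etas_rate(1.0, catalog) > etas_rate(10.0, catalog) > etas_rate(100.0, catalog))  # True
```

UCERF3-ETAS goes far beyond this sketch (fault proximity, elastic rebound, characteristic MFDs), but the triggering kernel above is the statistical-seismology core being merged with the fault-based forecast.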

  7. 77 FR 53225 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2012-08-31

    ... hazard and risk forecasts; status of the project intended to deliver an updated Uniform California... Earthquake Information Center on the campus of the Colorado School of Mines, 1711 Illinois Avenue, in Golden...

  8. Environmental Survey preliminary report, Stanford Linear Accelerator Center, Stanford, California

    Energy Technology Data Exchange (ETDEWEB)

    1988-07-01

    This report presents the preliminary findings from the first phase of the Survey of the US Department of Energy (DOE) Stanford Linear Accelerator Center (SLAC) at Stanford, California, conducted February 29 through March 4, 1988. The Survey is being conducted by an interdisciplinary team of environmental specialists, led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. Individual team components are being supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with the SLAC. The Survey covers all environmental media and all areas of environmental regulation and is being performed in accordance with the DOE Environmental Survey Manual. This phase of the Survey involves the review of existing site environmental data, observations of the operations at the SLAC, and interviews with site personnel. The Survey team is developing a Sampling and Analysis Plan to assist in further assessing certain of the environmental problems identified during its on-site activities. The Sampling and Analysis Plan will be executed by a DOE National Laboratory or a support contractor. When completed, the results will be incorporated into the Environmental Survey Interim Report for the SLAC facility. The Interim Report will reflect the final determinations of the SLAC Survey. 95 refs., 25 figs., 25 tabs.

  9. Environmental Survey preliminary report, Stanford Linear Accelerator Center, Stanford, California

    International Nuclear Information System (INIS)

    1988-07-01

    This report presents the preliminary findings from the first phase of the Survey of the US Department of Energy (DOE) Stanford Linear Accelerator Center (SLAC) at Stanford, California, conducted February 29 through March 4, 1988. The Survey is being conducted by an interdisciplinary team of environmental specialists, led and managed by the Office of Environment, Safety and Health's Office of Environmental Audit. Individual team components are being supplied by a private contractor. The objective of the Survey is to identify environmental problems and areas of environmental risk associated with the SLAC. The Survey covers all environmental media and all areas of environmental regulation and is being performed in accordance with the DOE Environmental Survey Manual. This phase of the Survey involves the review of existing site environmental data, observations of the operations at the SLAC, and interviews with site personnel. The Survey team is developing a Sampling and Analysis Plan to assist in further assessing certain of the environmental problems identified during its on-site activities. The Sampling and Analysis Plan will be executed by a DOE National Laboratory or a support contractor. When completed, the results will be incorporated into the Environmental Survey Interim Report for the SLAC facility. The Interim Report will reflect the final determinations of the SLAC Survey. 95 refs., 25 figs., 25 tabs

  10. Evaluation of Real-Time Performance of the Virtual Seismologist Earthquake Early Warning Algorithm in Switzerland and California

    Science.gov (United States)

    Behr, Y.; Cua, G. B.; Clinton, J. F.; Heaton, T. H.

    2012-12-01

    The Virtual Seismologist (VS) method is a Bayesian approach to regional network-based earthquake early warning (EEW) originally formulated by Cua and Heaton (2007). Implementation of VS into real-time EEW codes has been an on-going effort of the Swiss Seismological Service at ETH Zürich since 2006, with support from ETH Zürich, various European projects, and the United States Geological Survey (USGS). VS is one of three EEW algorithms - the other two being ElarmS (Allen and Kanamori, 2003) and On-Site (Wu and Kanamori, 2005; Boese et al., 2008) algorithms - that form the basis of the California Integrated Seismic Network (CISN) ShakeAlert system, a USGS-funded prototype end-to-end EEW system that could potentially be implemented in California. In Europe, VS is currently operating as a real-time test system in Switzerland. As part of the on-going EU project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction), VS will be installed and tested at other European networks. VS has been running in real-time on stations of the Southern California Seismic Network (SCSN) since July 2008, and on stations of the Berkeley Digital Seismic Network (BDSN) and the USGS Menlo Park strong motion network in northern California since February 2009. In Switzerland, VS has been running in real-time on stations monitored by the Swiss Seismological Service (including stations from Austria, France, Germany, and Italy) since 2010. We present summaries of the real-time performance of VS in Switzerland and California over the past two and three years respectively. The empirical relationships used by VS to estimate magnitudes and ground motion, originally derived from southern California data, are demonstrated to perform well in northern California and Switzerland. Implementation in real-time and off-line testing in Europe will potentially be extended to southern Italy, western Greece, Istanbul, Romania, and Iceland. Integration of the VS algorithm into both the CISN Advanced

  11. Remotely triggered microearthquakes and tremor in central California following the 2010 Mw 8.8 Chile earthquake

    Science.gov (United States)

    Peng, Zhigang; Hill, David P.; Shelly, David R.; Aiken, Chastity

    2010-01-01

    We examine remotely triggered microearthquakes and tectonic tremor in central California following the 2010 Mw 8.8 Chile earthquake. Several microearthquakes near the Coso Geothermal Field were apparently triggered, with the largest earthquake (Ml 3.5) occurring during the large-amplitude Love surface waves. The Chile mainshock also triggered numerous tremor bursts near the Parkfield-Cholame section of the San Andreas Fault (SAF). The locally triggered tremor bursts are partially masked at lower frequencies by the regionally triggered earthquake signals from Coso, but can be identified by applying high-pass or matched filters. Both triggered tremor along the SAF and the Ml 3.5 earthquake in Coso are consistent with frictional failure at different depths on critically-stressed faults under the Coulomb failure criteria. The triggered tremor, however, appears to be more phase-correlated with the surface waves than the triggered earthquakes, likely reflecting differences in constitutive properties between the brittle, seismogenic crust and the underlying lower crust.
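The matched filters mentioned above identify tremor bursts by sliding a known waveform template along continuous data and flagging windows with high normalized correlation. A self-contained sketch on synthetic data (the burst waveform and noise levels are invented for illustration):

```python
import numpy as np

def matched_filter(data, template):
    """Normalized cross-correlation (Pearson) of a waveform template
    against continuous data; values near 1 flag repeats of the template."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        cc[i] = np.sum(t * (w - w.mean())) / (w.std() + 1e-12)
    return cc

# Bury a tremor-like burst in background noise, then recover its onset.
rng = np.random.default_rng(0)
burst = np.sin(np.linspace(0, 20 * np.pi, 200)) * np.hanning(200)
data = rng.normal(0, 0.05, 2000)
data[700:900] += burst
cc = matched_filter(data, burst)
print(int(np.argmax(cc)))  # peak at sample 700, the burst onset
```

In practice the triggered-tremor templates come from previously catalogued bursts, and detections are declared above a threshold (e.g., several times the median absolute deviation of cc) rather than at the single global maximum.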

  12. 88 hours: the U.S. Geological Survey National Earthquake Information Center response to the March 11, 2011 Mw 9.0 Tohoku earthquake

    Science.gov (United States)

    Wald, David J.; Hayes, Gavin P.; Benz, Harley M.; Earle, Paul S.; Briggs, Richard W.

    2011-01-01

    The M 9.0 11 March 2011 Tohoku, Japan, earthquake and associated tsunami near the east coast of the island of Honshu caused tens of thousands of deaths and potentially over one trillion dollars in damage, resulting in one of the worst natural disasters ever recorded. The U.S. Geological Survey National Earthquake Information Center (USGS NEIC), through its responsibility to respond to all significant global earthquakes as part of the National Earthquake Hazards Reduction Program, quickly produced and distributed a suite of earthquake information products to inform emergency responders, the public, the media, and the academic community of the earthquake's potential impact and to provide scientific background for the interpretation of the event's tectonic context and potential for future hazard. Here we present a timeline of the NEIC response to this devastating earthquake in the context of rapidly evolving information emanating from the global earthquake-response community. The timeline includes both internal and publicly distributed products, the relative timing of which highlights the inherent tradeoffs between the requirement to provide timely alerts and the necessity for accurate, authoritative information. The timeline also documents the iterative and evolutionary nature of the standard products produced by the NEIC and includes a behind-the-scenes look at the decisions, data, and analysis tools that drive our rapid product distribution.

  13. Designing and Implementing a Retrospective Earthquake Detection Framework at the U.S. Geological Survey National Earthquake Information Center

    Science.gov (United States)

    Patton, J.; Yeck, W.; Benz, H.

    2017-12-01

The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking, and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data, as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival time picking, social-media-based event detections, and automatic association of different seismic detection data into seismic earthquake events. In addition, this framework enables retrospective detection processing such as automated S-wave arrival time picking given a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. In addition, this same infrastructure provides an improved and convenient structure to support access to automatic detection data for both research and algorithmic development.
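The automatic arrival-time picking referred to above is classically done with an STA/LTA detector: the ratio of a short-term average to a long-term average of signal energy, with a pick declared when the ratio crosses a threshold. A minimal sketch (window lengths, threshold, and the synthetic trace are illustrative, not NEIC's production configuration):

```python
import numpy as np

def sta_lta(trace, nsta, nlta):
    """Classic STA/LTA: ratio of the short-term average (nsta samples ahead)
    to the long-term average (nlta samples behind) of the squared signal."""
    sq = trace.astype(float) ** 2
    csum = np.cumsum(np.concatenate(([0.0], sq)))
    ratio = np.zeros(len(trace))
    for i in range(nlta, len(trace) - nsta):
        sta = (csum[i + nsta] - csum[i]) / nsta
        lta = (csum[i] - csum[i - nlta]) / nlta
        ratio[i] = sta / (lta + 1e-12)
    return ratio

# Synthetic trace: unit-variance noise, then a 10x-amplitude arrival at sample 500.
rng = np.random.default_rng(1)
trace = rng.normal(0, 1.0, 1000)
trace[500:] *= 10
ratio = sta_lta(trace, nsta=20, nlta=200)
pick = int(np.argmax(ratio > 5.0))  # first threshold crossing
print(400 < pick <= 520)  # True: the pick lands at the arrival
```

Production pickers (multi-band, subspace, beamforming) refine this idea, but the trigger-on-ratio structure is the same.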

  14. Development of Earthquake Early Warning System in Southern California Using Real Time GPS and Seismic Data

    Science.gov (United States)

    Squibb, M. B.; Bock, Y.; Crowell, B. W.; Jamason, P.; Fang, P.; Yu, E.; Clayton, R. W.; Kedar, S.; Webb, F.; Bar-Sever, Y.; Miller, K. J.

    2009-12-01

    We discuss the fusion of low-latency (1 s) high-rate (1 Hz or greater) CGPS total displacement waveforms and traditional seismic data, in order to extend the frequency range and timeliness of surface displacement data already available at lower frequencies from space borne InSAR and (typically daily) CGPS coordinate time series. The goal of our NASA AIST project is to develop components of early warning systems for mitigation of geological hazards (direct seismic damage, tsunamis, landslides, volcanoes). The advantage of the GPS data is that it is a direct measurement of ground displacement. With seismic data, this type of measure has to be obtained by deconvolution of the instrument response and integration of the broadband (velocity) measurements, or a double integration of the strong motion (acceleration) measurements. Due to the bandwidth and the dynamic range limits of seismometers the accuracy of absolute displacements so derived is poor. This problem is not present in the high-sample rate GPS data. We have developed a multi-rate Kalman filter that can combine in real time the complementary GPS and seismic data for use in an earthquake early warning (EEW) system, which results in an improved determination of total displacement waveforms by taking advantage of the strong points of each data type. While the seismic measurement provides a powerful constraint on the much noisier GPS measurements, unlike the seismometer, the GPS receiver never clips. We have identified about 25 “co-located” real-time GPS and broadband seismic stations (STS-1, STS-2, and CMG-3T instruments) in southern California. We are currently addressing issues related to data formats and metadata exchange, which will allow us to efficiently combine the two data types in the multi-rate Kalman filter. We describe the elements of the EEW system for southern California, discuss issues of detection and characterization of signals, and consider minimization of false alarms. We show an example of
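The multi-rate Kalman filter described above can be sketched in one dimension: accelerometer samples drive the prediction step at high rate, while sparser GPS displacement observations correct accumulated drift. This is a simplified illustration of the data-fusion idea, not the authors' implementation; all rates, noise levels, and the synthetic step displacement are assumptions:

```python
import numpy as np

def fuse(accel, gps, dt=0.01, gps_every=100, q=1e-4, r_gps=0.01):
    """1-D multirate Kalman filter: state [displacement, velocity],
    acceleration as process input at 100 Hz, GPS displacement updates at 1 Hz."""
    x = np.zeros(2)
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt ** 2, dt])
    Q = q * np.eye(2)
    H = np.array([[1.0, 0.0]])
    out = []
    for k, a in enumerate(accel):
        x = F @ x + B * a                 # predict with accelerometer input
        P = F @ P @ F.T + Q
        if k % gps_every == 0:            # a GPS displacement arrives
            y = gps[k // gps_every] - H @ x
            S = H @ P @ H.T + r_gps
            K = (P @ H.T) / S
            x = x + (K * y).ravel()
            P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

# Synthetic coseismic step of 0.5 m at t = 5 s; noisy accelerometer, offset GPS.
rng = np.random.default_rng(3)
t = np.arange(0, 10, 0.01)
true_disp = np.where(t > 5, 0.5, 0.0)
accel = np.gradient(np.gradient(true_disp, 0.01), 0.01) + rng.normal(0, 0.1, t.size)
gps = true_disp[::100] + 0.005
fused = fuse(accel, gps)
print(abs(fused[-1] - 0.5) < 0.1)  # True: the static offset is recovered
```

The sketch shows the complementary strengths the abstract describes: integration of acceleration alone drifts, while GPS alone is noisy and sparse; the filter recovers the permanent (static) displacement that a clipped or drifting seismometer cannot.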

  15. Ring-Shaped Seismicity Structures in Southern California: Possible Preparation for Large Earthquake in the Los Angeles Basin

    Science.gov (United States)

    Kopnichev, Yu. F.; Sokolova, I. N.

    2017-12-01

Some characteristics of seismicity in Southern California are studied. It is found that ring-shaped seismicity structures with threshold magnitudes Mth of 4.1, 4.1, and 3.8 formed prior to three large (Mw > 7.0) earthquakes in 1992, 1999, and 2010, respectively. The sizes of these structures are several times smaller than for intracontinental strike-slip events with similar magnitudes. Two ring-shaped structures are identified in areas east of the city of Los Angeles, where relatively large earthquakes have not occurred for at least 150 years. The magnitudes of large events which can occur in the areas of these structures are estimated on the basis of the previously obtained correlation dependence of ring sizes on magnitudes of strike-slip earthquakes. Large events with magnitudes of Mw = 6.9 ± 0.2 and Mw = 8.6 ± 0.2 can occur in the area to the east of the city of Los Angeles and in the rupture zone of the 1857 great Fort Tejon earthquake, respectively. We believe that ring-structure formation, similarly to the other regions, is connected with deep-seated fluid migration.

  16. Lessons Learned from Creating the Public Earthquake Resource Center at CERI

    Science.gov (United States)

    Patterson, G. L.; Michelle, D.; Johnston, A.

    2004-12-01

    The Center for Earthquake Research and Information (CERI) at the University of Memphis opened the Public Earthquake Resource Center (PERC) in May 2004. The PERC is an interactive display area that was designed to increase awareness of seismology, Earth Science, earthquake hazards, and earthquake engineering among the general public and K-12 teachers and students. Funding for the PERC is provided by the US Geological Survey, The NSF-funded Mid America Earthquake Center, and the University of Memphis, with input from the Incorporated Research Institutions for Seismology. Additional space at the facility houses local offices of the US Geological Survey. PERC exhibits are housed in a remodeled residential structure at CERI that was donated by the University of Memphis and the State of Tennessee. Exhibits were designed and built by CERI and US Geological Survey staff and faculty with the help of experienced museum display subcontractors. The 600 square foot display area interactively introduces the basic concepts of seismology, real-time seismic information, seismic network operations, paleoseismology, building response, and historical earthquakes. Display components include three 22" flat screen monitors, a touch sensitive monitor, 3 helicorder elements, oscilloscope, AS-1 seismometer, life-sized liquefaction trench, liquefaction shake table, and building response shake table. All displays include custom graphics, text, and handouts. The PERC website at www.ceri.memphis.edu/perc also provides useful information such as tour scheduling, ask a geologist, links to other institutions, and will soon include a virtual tour of the facility. Special consideration was given to address State science standards for teaching and learning in the design of the displays and handouts. We feel this consideration is pivotal to the success of any grass roots Earth Science education and outreach program and represents a valuable lesson that has been learned at CERI over the last several

  17. Potential Effects of a Scenario Earthquake on the Economy of Southern California: Labor Market Exposure and Sensitivity Analysis to a Magnitude 7.8 Earthquake

    Science.gov (United States)

    Sherrouse, Benson C.; Hester, David J.; Wein, Anne M.

    2008-01-01

    The Multi-Hazards Demonstration Project (MHDP) is a collaboration between the U.S. Geological Survey (USGS) and various partners from the public and private sectors and academia, meant to improve Southern California's resiliency to natural hazards (Jones and others, 2007). In support of the MHDP objectives, the ShakeOut Scenario was developed. It describes a magnitude 7.8 (M7.8) earthquake along the southernmost 300 kilometers (200 miles) of the San Andreas Fault, identified by geoscientists as a plausible event that will cause moderate to strong shaking over much of the eight-county (Imperial, Kern, Los Angeles, Orange, Riverside, San Bernardino, San Diego, and Ventura) Southern California region. This report contains an exposure and sensitivity analysis of economic Super Sectors in terms of labor and employment statistics. Exposure is measured as the absolute counts of labor market variables anticipated to experience each level of Instrumental Intensity (a proxy measure of damage). Sensitivity is the percentage of the exposure of each Super Sector to each Instrumental Intensity level. The analysis concerns the direct effect of the scenario earthquake on economic sectors and provides a baseline for the indirect and interactive analysis of an input-output model of the regional economy. The analysis is inspired by the Bureau of Labor Statistics (BLS) report that analyzed the labor market losses (exposure) of a M6.9 earthquake on the Hayward fault by overlaying geocoded labor market data on Instrumental Intensity values. The method used here is influenced by the ZIP-code-level data provided by the California Employment Development Department (CA EDD), which requires the assignment of Instrumental Intensities to ZIP codes. The ZIP-code-level labor market data includes the number of business establishments, employees, and quarterly payroll categorized by the North American Industry Classification System. According to the analysis results, nearly 225,000 business
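The exposure and sensitivity measures defined in this abstract amount to aggregating ZIP-code-level labor counts by assigned Instrumental Intensity. A toy sketch of that overlay (the ZIP codes, intensity assignments, and employment counts below are invented for illustration):

```python
def exposure_by_intensity(zip_intensity, zip_employment):
    """Exposure: sum of a labor-market count over ZIP codes sharing an
    Instrumental Intensity level. Sensitivity: each level's share of the total."""
    totals = {}
    for z, mmi in zip_intensity.items():
        totals[mmi] = totals.get(mmi, 0) + zip_employment.get(z, 0)
    grand = sum(totals.values())
    sensitivity = {mmi: n / grand for mmi, n in totals.items()}
    return totals, sensitivity

# Hypothetical three-ZIP example:
intensity = {"90001": "VIII", "90002": "VIII", "92101": "VI"}
employment = {"90001": 1200, "90002": 800, "92101": 500}
tot, sens = exposure_by_intensity(intensity, employment)
print(tot["VIII"], round(sens["VIII"], 2))  # 2000 0.8
```

The report performs this aggregation per economic Super Sector and per labor variable (establishments, employees, payroll), yielding the baseline for the subsequent input-output analysis.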

  18. Stress triggering of earthquakes: evidence for the 1994 M = 6.7 Northridge, California, shock

    Directory of Open Access Journals (Sweden)

    G. C. P. King

    1994-06-01

A model of stress transfer implies that earthquakes in 1933 and 1952 increased the Coulomb stress at the site of the 1971 San Fernando earthquake. The 1971 earthquake in turn raised stress and produced aftershocks at the site of the 1987 Whittier Narrows and 1994 Northridge ruptures. The Northridge main shock raised stress in areas where its aftershocks and surface faulting occurred. Together, M ≥ 6 earthquakes near Los Angeles since 1933 have stressed parts of the Oak Ridge, Sierra Madre, Santa Monica Mountains, Elysian Park, and Newport-Inglewood faults by > 1 bar. While too small to cause earthquakes, these stress changes can trigger events if the crust is already near failure, or advance future earthquake occurrence if it is not.
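The stress changes discussed here are Coulomb failure stress changes, ΔCFF = Δτ + μ′Δσn, where Δτ is the shear stress change resolved in the slip direction, Δσn the normal stress change (unclamping positive), and μ′ the effective friction coefficient. A minimal sketch (the input values are illustrative, not resolved stresses from the paper):

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Change in Coulomb failure stress on a receiver fault, in bars:
    dCFF = d_tau + mu' * d_sigma_n, with unclamping counted positive."""
    return d_shear + mu_eff * d_normal

# A 0.8-bar shear stress increase plus 0.5 bar of unclamping:
print(coulomb_stress_change(0.8, 0.5))  # 1.0 bar, i.e. loaded toward failure
```

Positive ΔCFF of order 1 bar, as cited for the Los Angeles faults, is small compared with earthquake stress drops but can advance a fault already close to failure.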

  19. Unusual downhole and surface free-field records near the Carquinez Strait bridges during the 24 August 2014 Mw6.0 South Napa, California earthquake

    Science.gov (United States)

    Çelebi, Mehmet; Ghahari, S. Farid; Taciroglu, Ertugrul

    2015-01-01

This paper reports the results of Part A of a study of the recorded strong-motion accelerations at the well-instrumented network of the two side-by-side parallel bridges over the Carquinez Strait during the 24 August 2014 (Mw 6.0) South Napa, Calif. earthquake that occurred at 03:20:44 PDT with epicentral coordinates 38.22N, 122.31W (http://earthquake.usgs.gov/earthquakes/eqarchives/poster/2014/20140824.php, last accessed on October 17, 2014). Both bridges and two boreholes were instrumented by the California Strong Motion Instrumentation Program (CSMIP) of the California Geological Survey (CGS) (Shakal et al., 2014). A comprehensive comparison of several ground motion prediction equations as they relate to recorded ground motions of the earthquake is provided by Baltay and Boatwright (2015).

  20. Current progress in using multiple electromagnetic indicators to determine location, time, and magnitude of earthquakes in California and Peru (Invited)

    Science.gov (United States)

    Bleier, T. E.; Dunson, C.; Roth, S.; Heraud, J.; Freund, F. T.; Dahlgren, R.; Bryant, N.; Bambery, R.; Lira, A.

    2010-12-01

Since ultra-low frequency (ULF) magnetic anomalies were discovered prior to the 1989 Loma Prieta, Ca. M7.0 earthquake, QuakeFinder, a small R&D group based in Palo Alto, California, has systematically monitored ULF magnetic signals with a network of 3-axis induction magnetometers in California since 2000. These raw magnetometer data were collected at 20-50 samples per second, with no preprocessing, in an attempt to collect an accurate time history of electromagnetic waveforms prior to, during, and after large earthquakes within 30 km of these sensors. Finally, in October 2007, the QuakeFinder team observed a series of strange magnetic pulsations at the Alum Rock, California site, 14 days prior to a M5.4 earthquake. The magnetic signals observed were relatively short, random pulsations, not continuous waveform signals like Pc1 or Pc3 micropulsations. The magnetic pulses have characteristic unipolar shapes and 0.5 s to 30 s durations, much longer than lightning signals. In May of 2010, very similar pulses were observed at Tacna, Peru, 13 days prior to a M6.2 earthquake, using a QuakeFinder station jointly operated under collaboration with the Catholic University in Lima, Peru (PUCP). More examples of these pulsations were sought, and a historical review of older California magnetic data discovered fewer but similar pulsations at the Hollister, Ca. site operated by UC Berkeley (e.g., the San Juan Bautista M5.1 earthquake on August 12, 1998). Further analysis of the direction of arrival of the magnetic pulses showed an interesting "azimuth clustering" observed in both the Alum Rock, Ca. and Tacna, Peru data. The complete time series of the Alum Rock data allowed the team to analyze subsequent changes observed in magnetometer "filter banks" (0.001 Hz to 10 Hz filter bands, similar to those used by Fraser-Smith in 1989), but this time using time-adjusted limits based on time of day, time of year, Kp, and site background noise. These site-customized limits

  1. Earthquake-by-earthquake fold growth above the Puente Hills blind thrust fault, Los Angeles, California: Implications for fold kinematics and seismic hazard

    Science.gov (United States)

    Leon, Lorraine A.; Christofferson, Shari A.; Dolan, James F.; Shaw, John H.; Pratt, Thomas L.

    2007-03-01

Boreholes and high-resolution seismic reflection data collected across the forelimb growth triangle above the central segment of the Puente Hills thrust fault (PHT) beneath Los Angeles, California, provide a detailed record of incremental fold growth during large earthquakes on this major blind thrust fault. These data document fold growth within a discrete kink band that narrows upward from ~460 m at the base of the Quaternary section (200-250 m depth); most (>82% at 250 m depth) folding and uplift occur within discrete kink bands, thereby enabling us to develop a paleoseismic history of the underlying blind thrust fault. The borehole data reveal that the youngest part of the growth triangle in the uppermost 20 m comprises three stratigraphically discrete growth intervals marked by southward-thickening sedimentary strata that are separated by intervals in which sediments do not change thickness across the site. We interpret the intervals of growth as occurring after the formation of now-buried paleofold scarps during three large PHT earthquakes in the past 8 kyr. The intervening intervals of no growth record periods of structural quiescence and deposition at the regional, near-horizontal stream gradient at the study site. Minimum uplift in each of the scarp-forming events, which occurred at 0.2-2.2 ka (event Y), 3.0-6.3 ka (event X), and 6.6-8.1 ka (event W), ranged from ~1.1 to ~1.6 m, indicating minimum thrust displacements of ≥2.5 to 4.5 m. Such large displacements are consistent with the occurrence of large-magnitude earthquakes (Mw > 7). Cumulative minimum uplift in the past three events was 3.3 to 4.7 m, suggesting cumulative thrust displacement of ≥7 to 10.5 m. These values yield a minimum Holocene slip rate for the PHT of ≥0.9 to 1.6 mm/yr. The borehole and seismic reflection data demonstrate that dip within the kink band is acquired incrementally, such that older strata that have been deformed by more earthquakes dip more steeply than younger strata
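The step from per-event uplift (~1.1-1.6 m) to thrust displacement (≥2.5-4.5 m) in this abstract is a geometric conversion: for dip-slip on a planar ramp, slip = uplift / sin(dip). A minimal sketch (the 25° dip below is an illustrative assumption, not the fault geometry derived in the paper):

```python
import math

def dip_slip(uplift_m, dip_deg):
    """Convert vertical uplift above a blind thrust to slip on the fault
    plane, assuming pure dip-slip on a planar ramp: slip = uplift / sin(dip)."""
    return uplift_m / math.sin(math.radians(dip_deg))

# Assuming a 25-degree-dipping thrust ramp:
print(round(dip_slip(1.1, 25), 1), round(dip_slip(1.6, 25), 1))  # 2.6 3.8
```

Shallower dips yield larger slip for the same uplift, which is why uplift-derived displacements are quoted as minima.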

  2. COMPARING SEA LEVEL RESPONSE AT MONTEREY, CALIFORNIA FROM THE 1989 LOMA PRIETA EARTHQUAKE AND THE 1964 GREAT ALASKAN EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    L. C. Breaker

    2009-01-01

    Full Text Available Two of the largest earthquakes to affect water levels in Monterey Bay in recent years were the Loma Prieta Earthquake (LPE) of 1989, with a moment magnitude of 6.9, and the Great Alaskan Earthquake (GAE) of 1964, with a moment magnitude of 9.2. In this study, we compare the sea level responses to these events, with a primary focus on their frequency content and how the bay itself shaped that response. Singular Spectrum Analysis (SSA) was employed to extract the primary frequencies associated with each event. It is not clear how or exactly where the tsunami associated with the LPE was generated, but it occurred inside the bay and most likely began to take on the characteristics of a seiche by the time it reached the tide gauge in Monterey Harbor. Results of the SSA decomposition revealed two primary periods of oscillation, 9-10 minutes and 31-32 minutes. The first is in agreement with the range of periods expected for the natural oscillations of Monterey Harbor, and the second is consistent with a bay-wide oscillation or seiche mode. SSA decomposition of the GAE revealed several sequences of oscillations, all with a period of approximately 37 minutes, which corresponds to the predicted, and previously observed, transverse mode of oscillation for Monterey Bay. In this case, it appears that this tsunami produced quarter-wave resonance within the bay consistent with its seiche-like response. Overall, the sea level responses to the LPE and GAE differed greatly, not only because of the large difference in their magnitudes but also because the driving force in one case occurred inside the bay (LPE) and in the other outside the bay (GAE). As a result, different modes of oscillation were excited.
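    Basic SSA of the kind described can be sketched as a rank-1 reconstruction of a synthetic "seiche" record (the window length, sampling interval, and noise level below are illustrative assumptions, not values from the study):

```python
import numpy as np

def ssa_leading_component(x, window):
    """Leading-eigentriple SSA reconstruction of series x (minimal sketch)."""
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: lagged copies of the series as columns.
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    rank1 = s[0] * np.outer(u[:, 0], vt[0])
    # Diagonal averaging (Hankelization) back to a time series.
    rec = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        rec[j:j + window] += rank1[:, j]
        counts[j:j + window] += 1
    return rec / counts

# Synthetic tide-gauge record: a 32-minute oscillation plus noise, 1-min sampling.
t = np.arange(600)  # minutes
x = np.sin(2 * np.pi * t / 32) + 0.3 * np.random.default_rng(0).normal(size=t.size)
rec = ssa_leading_component(x, window=120)
# Dominant period from the reconstructed component's spectrum.
freqs = np.fft.rfftfreq(rec.size, d=1.0)  # cycles per minute
period = 1.0 / freqs[np.abs(np.fft.rfft(rec - rec.mean())).argmax()]
print(round(period, 1))  # close to the 32-minute input period
```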

  3. Earthquake warning system for Japan Railways’ bullet train; implications for disaster prevention in California

    Science.gov (United States)

    Nakamura, Y.; Tucker, B. E.

    1988-01-01

    In Japan, the level of public awareness of the dangers of earthquakes is high. The 1923 Kanto earthquake killed about 120,000 people out of a total Japanese population of about 50 million; an equivalent disaster in the U.S. would involve 600,000 deaths.

  4. Earthquake Hazard and Segmented Fault Evolution, Hat Creek Fault, Northern California

    Science.gov (United States)

    Blakeslee, M. W.; Kattenhorn, S. A.

    2010-12-01

    Precise insight into surface rupture and the evolution and mechanical interaction of segmented normal fault systems is critical for assessing the potential seismic hazard. The Hat Creek fault is a ~35 km long, NNW trending segmented normal fault system located on the western boundary of the Modoc Plateau and within the extending backarc basin of the Cascadia subduction zone in northern California. The Hat Creek fault has a prominent surface rupture showing evidence of multiple events in the past 15 ka, although there have been no historic earthquakes. In response to interactions with volcanic activity, the fault system has progressively migrated several km westward, causing older scarps to become seemingly inactive, and producing three distinct, semi-parallel scarps with different ages. The oldest scarp, designated the “Rim”, is the farthest west and has up to 352 m of throw. The relatively younger “Pali” scarp has up to 174 m of throw. The young “Active” scarp has a maximum throw of 65 m in the 24±6 ka Hat Creek basalt, with 20 m of throw in ~15 ka glacial gravels (i.e., a Holocene slip rate of ~1.3 mm/yr). Changes in the geometry and kinematics of the separate scarps during the faulting history imply the orientation of the stress field has rotated clockwise, now inducing oblique right-lateral motion. Previous studies suggested that the Active scarp consists of 7 left-stepping segments with a cumulative length of 23.5 km. We advocate that the Active scarp is actually composed of 8 or 9 segments and extends 4 km longer than previous estimates. This addition to the active portion of the fault is based on detailed mapping of a young surface rupture in the northern portion of the fault system. This ~30 m high young scarp offsets lavas that erupted from Cinder Butte, a low shield volcano, but has a similar geometry and properties as the Active scarp in the Hat Creek basalt. At its northern end, the Active scarp terminates at Cinder Butte. Our mapping

  5. GPS Imaging of Time-Variable Earthquake Hazard: The Hilton Creek Fault, Long Valley California

    Science.gov (United States)

    Hammond, W. C.; Blewitt, G.

    2016-12-01

    The Hilton Creek Fault, in Long Valley, California, is a down-to-the-east normal fault that bounds the eastern edge of the Sierra Nevada/Great Valley microplate and lies half inside and half outside the magmatically active caldera. Despite dense GPS network coverage, the rapid and time-variable surface deformation attributable to sporadic magmatic inflation beneath the resurgent dome makes it difficult to use traditional geodetic methods to estimate the slip rate of the fault. While geologic studies identify cumulative offset, constrain the timing of past earthquakes, and bound the Quaternary slip rate to within 1-5 mm/yr, it is not currently possible to use geologic data to evaluate how the potential for slip correlates with transient caldera inflation. To estimate the time-variable seismic hazard of the fault we estimate its instantaneous slip rate from GPS data using a new set of algorithms for robust estimation of velocity and strain rate fields and fault slip rates. From the GPS time series, we use the robust MIDAS algorithm to obtain time series of velocity that are highly insensitive to the effects of seasonality, outliers and steps in the data. We then use robust imaging of the velocity field to estimate a gridded time-variable velocity field. Finally, we estimate the fault slip rate at each time using a new technique that forms ad-hoc block representations that honor fault geometries, network complexity, and connectivity but do not require labor-intensive drawing of block boundaries. The results are compared to other slip rate estimates that have implications for hazard over different time scales. Time-invariant, long-term seismic hazard is proportional to the long-term slip rate accessible from geologic data. 
Contemporary time-invariant hazard, however, may differ from the long-term rate, and is estimated from the geodetic velocity field that has been corrected for the effects of magmatic inflation in the caldera using a published model of a dipping ellipsoidal
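    The MIDAS-style robust velocity estimation mentioned above (the median of slopes from data pairs separated by roughly one year, which cancels seasonal terms) can be sketched as follows; this is a simplified illustration of the idea rather than the published algorithm, and all data here are synthetic:

```python
import numpy as np

def midas_like_rate(t_yr, y_mm, pair_sep=1.0, tol=0.05):
    """Median slope over data pairs ~pair_sep years apart: a simplified
    sketch of the MIDAS idea (the published algorithm also trims outliers
    with a scaled-MAD criterion and handles step discontinuities)."""
    slopes = []
    for i, ti in enumerate(t_yr):
        # Sample closest to one pair-separation ahead of ti.
        j = int(np.argmin(np.abs(t_yr - (ti + pair_sep))))
        if abs(t_yr[j] - ti - pair_sep) < tol:
            slopes.append((y_mm[j] - y_mm[i]) / (t_yr[j] - t_yr[i]))
    return float(np.median(slopes))

# Synthetic daily GPS position series: 5 mm/yr trend + annual cycle + noise
# + one gross outlier; one-year pairing cancels the seasonal term.
rng = np.random.default_rng(1)
t = np.arange(0, 4, 1 / 365.25)  # 4 years, daily sampling
y = 5.0 * t + 2.0 * np.sin(2 * np.pi * t) + rng.normal(0.0, 1.0, t.size)
y[500] += 50.0  # a gross outlier the median shrugs off
rate = midas_like_rate(t, y)
print(round(rate, 1))  # close to the 5 mm/yr input trend
```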

  6. Comparison Of Seismic Performance Of Erciş Cultural Center Building With Observed And Calculated By Turkish Earthquake Code-2007

    Directory of Open Access Journals (Sweden)

    Recep Ali Dedecan

    2013-08-01

    Full Text Available The goal of this paper is to review the validity of the seismic assessment procedure given in the Turkish Earthquake Code by comparing its results with observations on real structures in Eastern Turkey, where the 2011 Van earthquake occurred. The three-story cultural center building at Erciş was selected as a typical, suitable test structure. To compare the results of the three different analysis techniques for an identical earthquake, the ground motion used in the analyses was characterized by equivalent elastic earthquake spectra developed from the time history available nearest the construction site. The damage predictions obtained with the Turkish Earthquake Code procedures indicate differing levels of damage, but it is concluded that nonlinear time-history analysis produced the best estimate of the damage observed at the site.

  7. Seismotectonics of the 2010 El Mayor Cucapah - Indiviso Earthquake and its Relation to Seismic Hazard in Southern California

    Science.gov (United States)

    Gonzalez-Garcia, J. J.; Gonzalez Ortega, A.; Bock, Y.; Fialko, Y.; Fielding, E. J.; Fletcher, J. M.; Galetzka, J. E.; Hudnut, K. W.; Munguia, L.; Nelson, S. M.; Rockwell, T. K.; Sandwell, D. T.; Stock, J.

    2010-12-01

    The April 4th, 2010 Mw 7.2 earthquake was the largest earthquake in over 100 years of known historical seismicity in the Salton Trough region. It was a relatively benign earthquake, with only two deaths related to its occurrence. It produced, however, profound agricultural and ecological changes in the southern section of the Mexicali Valley, where a new fault, the Indiviso fault, is shown to have ruptured by analysis of ALOS PALSAR and Landsat imagery. The Indiviso fault connects the ridge-transform and continental transform tectonic regimes with a straight linkage, as revealed by this earthquake, but the event also simultaneously involved oblique normal faulting and cross-faulting. The earthquake was complex, with at least three distinct slip pulses. It originated as a normal rupture along the ~18 km long, N-S striking El Mayor-Hardy fault along the east margin of the Sierra El Mayor. After 10 seconds, two large bursts of energy were released, one to the NW and one to the SE, producing a total moment release equivalent to Mw 7.25. The NW ruptures reactivated portions of the Pescadores, Borrego and Paso Superior faults with minor slip along the Laguna Salada and several other faults. This section had a dominant right-lateral strike-slip sense of motion with the NE side down. To the SE of the epicenter, disruption occurred along the dominantly strike-slip Indiviso fault, with a SW-side-down component of dip slip. The epicentral aftershock area, including its main aftershock to the NW, is >120 km in length; the surficial faulting occurs along ~110 km with 6-9 km of splaying to the N-NE at the NW end and to the S-SW at the SE end. The El Mayor Cucapah - Indiviso event follows nine M>6.5 earthquakes along the San Andreas fault system in the past 80 years between the head of the Gulf of California and the Transverse Ranges. 
Long, straight fault segments capable of larger earthquakes, and that have not ruptured historically, include portions of the San Jacinto

  8. Effects of November 8, 1980 earthquake on Humboldt Bay Power Plant and Eureka, California area. Reconnaissance report 13 Nov-14 Nov 80

    International Nuclear Information System (INIS)

    Herring, K.S.; Rooney, V.; Chokshi, N.C.

    1981-06-01

    On November 8, 1980, an earthquake of a reported surface wave magnitude of 7.0 occurred off the coast of California, west of Eureka and the Humboldt Bay Power Plant. Three NRC staff members visited the site the following week to survey any damage associated with the earthquake, with the objective of using the collected data to assist the NRR staff in ongoing seismic evaluations of older operating nuclear power plant facilities. This report contains their observations. They concluded that the effects of the earthquake on Humboldt Bay Power Plant Unit 3 were minimal and did not endanger the health and safety of the public. They recommended that improvements be made to seismic recording equipment and that generic preparation for future post-earthquake reconnaissance trips be made before the actual occurrence of earthquakes.

  9. 3-D P- and S-wave velocity structure and low-frequency earthquake locations in the Parkfield, California region

    Science.gov (United States)

    Zeng, Xiangfang; Thurber, Clifford H.; Shelly, David R.; Harrington, Rebecca M.; Cochran, Elizabeth S.; Bennington, Ninfa L.; Peterson, Dana; Guo, Bin; McClement, Kara

    2016-01-01

    To refine the 3-D seismic velocity model in the greater Parkfield, California region, a new data set including regular earthquakes, shots, quarry blasts and low-frequency earthquakes (LFEs) was assembled. Hundreds of traces for each LFE family at two temporary arrays were stacked with a time–frequency domain phase-weighted stacking method to improve the signal-to-noise ratio. We extend our model resolution to lower crustal depths with the LFE data. Our result images not only previously identified features but also low velocity zones (LVZs) in the area around the LFEs and in the lower crust beneath the southern Rinconada Fault. The former LVZ is consistent with high fluid pressure that can account for several aspects of LFE behaviour. The latter LVZ is consistent with a high conductivity zone in magnetotelluric studies. A new Vs model was developed with S picks that were obtained with a new autopicker. At shallow depth, the low Vs areas underlie the strongest shaking areas in the 2004 Parkfield earthquake. We relocate LFE families and analyse the location uncertainties with the NonLinLoc and tomoDD codes. The two methods yield similar results.
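    The phase-weighted stacking used to enhance the LFE traces can be illustrated with a toy time-domain variant: the linear stack is scaled by the inter-trace phase coherence, so incoherent noise is suppressed. The study cited above uses a time-frequency domain version; all waveforms below are synthetic:

```python
import numpy as np

def phase_weighted_stack(traces, nu=2):
    """Linear stack scaled by inter-trace phase coherence (time-domain
    sketch of phase-weighted stacking)."""
    traces = np.asarray(traces, dtype=float)
    n = traces.shape[1]
    # Analytic signal via FFT (same construction scipy.signal.hilbert uses).
    spec = np.fft.fft(traces, axis=1)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    analytic = np.fft.ifft(spec * h, axis=1)
    phasors = analytic / (np.abs(analytic) + 1e-12)
    coherence = np.abs(phasors.mean(axis=0)) ** nu  # ~1 where phases align
    return traces.mean(axis=0) * coherence

def snr(x):
    """Peak amplitude in the signal window over noise RMS outside it."""
    noise = np.concatenate([x[:150], x[350:]])
    return np.abs(x[200:300]).max() / noise.std()

# A weak repeating waveform buried in 200 noisy traces, as when stacking
# many detections from one LFE family.
rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 500)
signal = np.sin(2 * np.pi * 8 * t) * np.exp(-((t - 0.5) / 0.08) ** 2)
traces = [0.3 * signal + rng.normal(0.0, 1.0, t.size) for _ in range(200)]
linear = np.mean(traces, axis=0)
pws = phase_weighted_stack(traces)
```

Comparing `snr(linear)` with `snr(pws)` shows the coherence weighting sharply improving the stacked signal-to-noise ratio.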

  10. Kinematics of the 2015 San Ramon, California earthquake swarm: Implications for fault zone structure and driving mechanisms

    Science.gov (United States)

    Xue, Lian; Bürgmann, Roland; Shelly, David R.; Johnson, Christopher W.; Taira, Taka'aki

    2018-05-01

    Earthquake swarms represent a sudden increase in seismicity that may indicate a heterogeneous fault zone, the involvement of crustal fluids and/or slow fault slip. Swarms sometimes precede major earthquake ruptures. An earthquake swarm occurred in October 2015 near San Ramon, California in an extensional right step-over region between the northern Calaveras Fault and the Concord-Mt. Diablo fault zone, which has hosted ten major swarms since 1970. The 2015 San Ramon swarm is examined here from 11 October through 18 November using template matching analysis. The relocated seismicity catalog contains ∼4000 events with magnitudes of −0.2 and larger. The swarm illuminated three sub-parallel, southwest striking and northwest dipping fault segments of km-scale dimension and thickness of up to 200 m. The segments contain coexisting populations of different focal mechanisms, suggesting a complex fault zone structure with several sets of en échelon fault orientations. The migration of events along the three planar structures indicates complex fluid and faulting interaction processes. We searched for correlations between seismic activity and tidal stresses and found some suggestive features, but nothing that we can be confident is statistically significant.
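    Template-matching detection of the kind used to build such swarm catalogs can be sketched as a sliding normalized cross-correlation on a single synthetic channel (real catalogs stack correlation functions across a network of stations before thresholding; the threshold and waveforms here are illustrative):

```python
import numpy as np

def template_detect(data, template, threshold=0.8):
    """Sliding normalized cross-correlation detector (minimal sketch)."""
    nt = len(template)
    tpl = (template - template.mean()) / template.std()
    cc = np.empty(len(data) - nt + 1)
    for i in range(cc.size):
        win = data[i:i + nt]
        w = (win - win.mean()) / (win.std() + 1e-12)
        cc[i] = np.dot(tpl, w) / nt  # correlation coefficient in [-1, 1]
    return np.where(cc >= threshold)[0], cc

# Continuous synthetic record with two buried repeats of a small template.
rng = np.random.default_rng(2)
template = np.sin(2 * np.pi * np.arange(100) / 20) * np.hanning(100)
data = 0.1 * rng.normal(size=5000)
for onset in (1200, 3700):
    data[onset:onset + 100] += template
hits, cc = template_detect(data, template)
print(len(hits))  # detections cluster at the two true onsets
```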

  11. A prototype operational earthquake loss model for California based on UCERF3-ETAS – A first look at valuation

    Science.gov (United States)

    Field, Edward; Porter, Keith; Milner, Kevin

    2017-01-01

    We present a prototype operational loss model based on UCERF3-ETAS, which is the third Uniform California Earthquake Rupture Forecast with an Epidemic Type Aftershock Sequence (ETAS) component. As such, UCERF3-ETAS represents the first earthquake forecast to relax fault segmentation assumptions and to include multi-fault ruptures, elastic-rebound, and spatiotemporal clustering, all of which seem important for generating realistic and useful aftershock statistics. UCERF3-ETAS is nevertheless an approximation of the system, however, so usefulness will vary and potential value needs to be ascertained in the context of each application. We examine this question with respect to statewide loss estimates, exemplifying how risk can be elevated by orders of magnitude due to triggered events following various scenario earthquakes. Two important considerations are the probability gains, relative to loss likelihoods in the absence of main shocks, and the rapid decay of gains with time. Significant uncertainties and model limitations remain, so we hope this paper will inspire similar analyses with respect to other risk metrics to help ascertain whether operationalization of UCERF3-ETAS would be worth the considerable resources required.

  12. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Lifelines

    Science.gov (United States)

    Schiff, Anshel J.

    1998-01-01

    To the general public who had their televisions tuned to watch the World Series, the 1989 Loma Prieta earthquake was a lifelines earthquake. It was the images seen around the world of the collapsed Cypress Street viaduct, with the frantic and heroic efforts to pull survivors from the structure that was billowing smoke; the collapsed section of the San Francisco-Oakland Bay Bridge and subsequent home video of a car plunging off the open span; and the spectacular fire in the Marina District of San Francisco fed by a broken gasline. To many of the residents of the San Francisco Bay region, the relation of lifelines to the earthquake was characterized by sitting in the dark because of power outage, the inability to make telephone calls because of network congestion, and the slow and snarled traffic. Had the public been aware of the actions of the engineers and tradespeople working for the utilities and other lifeline organizations on the emergency response and restoration of lifelines, the lifeline characteristics of this earthquake would have been even more significant. Unobserved by the public were the warlike devastation in several electrical-power substations, the 13 miles of gas-distribution lines that had to be replaced in several communities, and the more than 1,200 leaks and breaks in water mains and service connections that had to be excavated and repaired. Like the 1971 San Fernando, Calif., earthquake, which was a seminal event for activity to improve the earthquake performance of lifelines, the 1989 Loma Prieta earthquake demonstrated that the tasks of preparing lifelines in 'earthquake country' were incomplete-indeed, new lessons had to be learned.

  13. School Site Preparedness for the Safety of California's Children K-12. Official Report of the Northridge Earthquake Task Force on Education.

    Science.gov (United States)

    California State Legislature, Sacramento. Senate Select Committee on the Northridge Earthquake.

    This report asserts that disaster preparedness at all school sites must become a major and immediate priority. Should a disaster equaling the magnitude of the Northridge earthquake occur, the current varying levels of site preparedness may not adequately protect California's children. The report describes why the state's children are not safe and…

  14. The Redwood Coast Tsunami Work Group: a unique organization promoting earthquake and tsunami resilience on California's North Coast

    Science.gov (United States)

    Dengler, L.; Henderson, C.; Larkin, D.; Nicolini, T.; Ozaki, V.

    2012-12-01

    The Northern California counties of Del Norte, Humboldt, and Mendocino account for over 30% of California's coastline, and the region is one of the most seismically active areas of the contiguous 48 states. The region is at risk from earthquakes located on- and offshore and from tsunamis generated locally by faults associated with the Cascadia subduction zone (CSZ) and from distant sources elsewhere in the Pacific. In 1995 the California Geological Survey (CGS) published a scenario for a CSZ earthquake that included both strong ground shaking effects and a tsunami. As a result of the scenario, the Redwood Coast Tsunami Work Group (RCTWG), an organization of government agencies, tribes, service groups, academia and the private sector, was formed to coordinate and promote earthquake and tsunami hazard awareness and mitigation in the three-county region. The RCTWG and its member agencies' projects include education/outreach products and programs, tsunami hazard mapping, and signage and siren planning. Since 2008, RCTWG has worked with the California Emergency Management Agency (Cal EMA) in conducting tsunami warning communications tests on the North Coast. In 2007, RCTWG members helped develop and carry out the first tsunami training exercise at FEMA's Emergency Management Institute in Emmitsburg, MD. The RCTWG has facilitated numerous multi-agency, multi-discipline coordinated exercises, and RCTWG county tsunami response plans have been a model for other regions of the state and country. Eight North Coast communities have been recognized as TsunamiReady by the National Weather Service, including the first national park, the first state park, and the only tribe in California to be so recognized. Over 500 tsunami hazard zone signs have been posted in the RCTWG region since 2008. 
Eight assessment surveys from 1993 to 2010 have tracked preparedness actions and personal awareness of earthquake and tsunami hazards in the county and additional surveys have tracked public awareness and tourist

  15. The Loma Prieta, California, Earthquake of October 17, 1989: Societal Response

    Science.gov (United States)

    Coordinated by Mileti, Dennis S.

    1993-01-01

    Professional Paper 1553 describes how people and organizations responded to the earthquake and how the earthquake impacted people and society. The investigations evaluate the tools available to the research community to measure the nature, extent, and causes of damage and losses. They describe human behavior during and immediately after the earthquake and how citizens participated in emergency response. They review the challenges confronted by police and fire departments and disruptions to transbay transportation systems. And they survey the challenges of post-earthquake recovery. Some significant findings were: * Loma Prieta provided the first test of ATC-20, the red, yellow, and green tagging of buildings. Its successful application has led to widespread use in other disasters, including the September 11, 2001, New York City terrorist incident. * Most people responded calmly and without panic to the earthquake and acted to get themselves to a safe location. * Actions by people to help alleviate emergency conditions were proportional to the level of need at the community level. * Some solutions caused problems of their own. The police perimeter around the Cypress Viaduct isolated businesses from their customers, leading to a loss of business, and the evacuation of employees from those businesses hindered the movement of supplies to the disaster scene. * Emergency transbay ferry service was established 6 days after the earthquake, but required constant revision of service contracts and schedules. * The Loma Prieta earthquake produced minimal disruption to the regional economy. The total economic disruption resulted in maximum losses to the Gross Regional Product of $725 million in 1 month and $2.9 billion in 2 months, but 80% of the loss was recovered during the first 6 months of 1990. Approximately 7,100 workers were laid off.

  16. Napa Earthquake impact on water systems

    Science.gov (United States)

    Wang, J.

    2014-12-01

    The South Napa earthquake occurred in Napa, California, on August 24, 2014, at 3 a.m. local time, with a magnitude of 6.0. It was the largest earthquake in the San Francisco Bay Area since the 1989 Loma Prieta earthquake. Economic losses topped $1 billion: winemakers faced a long cleanup, damage to tourism was still being estimated, and around 15,000 cases of cabernet spilled into the garden at the Hess Collection. Earthquakes can raise water-pollution risks and could trigger a water crisis; because California has suffered water shortages in recent years, understanding how to prevent earthquake-related pollution of groundwater and surface water is valuable. This research gives a clear view of the drinking-water system in California and pollution of river systems, as well as an estimate of earthquake impacts on water supply. The Sacramento-San Joaquin River Delta (close to Napa) is the center of the state's water distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and saltwater intrusion has reduced freshwater outflows. Strong shaking from a nearby earthquake can cause liquefaction of saturated, loose, sandy soils and could potentially damage major Delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: it could potentially damage the freshwater supply system.

  17. California Geriatric Education Center Logic Model: An Evaluation and Communication Tool

    Science.gov (United States)

    Price, Rachel M.; Alkema, Gretchen E.; Frank, Janet C.

    2009-01-01

    A logic model is a communications tool that graphically represents a program's resources, activities, priority target audiences for change, and the anticipated outcomes. This article describes the logic model development process undertaken by the California Geriatric Education Center in spring 2008. The CGEC is one of 48 Geriatric Education…

  18. Fluid-faulting interactions: Fracture-mesh and fault-valve behavior in the February 2014 Mammoth Mountain, California, earthquake swarm

    Science.gov (United States)

    Shelly, David R.; Taira, Taka’aki; Prejean, Stephanie; Hill, David P.; Dreger, Douglas S.

    2015-01-01

    Faulting and fluid transport in the subsurface are highly coupled processes, which may manifest seismically as earthquake swarms. A swarm in February 2014 beneath densely monitored Mammoth Mountain, California, provides an opportunity to witness these interactions in high resolution. Toward this goal, we employ massive waveform-correlation-based event detection and relative relocation, which quadruples the swarm catalog to more than 6000 earthquakes and produces high-precision locations even for very small events. The swarm's main seismic zone forms a distributed fracture mesh, with individual faults activated in short earthquake bursts. The largest event of the sequence, M 3.1, apparently acted as a fault valve and was followed by a distinct wave of earthquakes propagating ~1 km westward from the updip edge of rupture, 1–2 h later. Late in the swarm, multiple small, shallower subsidiary faults activated with pronounced hypocenter migration, suggesting that a broader fluid pressure pulse propagated through the subsurface.

  19. Deformation from the 1989 Loma Prieta earthquake near the southwest margin of the Santa Clara Valley, California

    Science.gov (United States)

    Schmidt, Kevin M.; Ellen, Stephen D.; Peterson, David M.

    2014-01-01

    Damage to pavement and near-surface utility pipes, caused by the 17 October 1989, Loma Prieta earthquake, provides evidence for ground deformation in a 663 km2 area near the southwest margin of the Santa Clara Valley, California (USA). A total of 1427 damage sites, collected from more than 30 sources, are concentrated in four zones, three of which lie near previously mapped faults. In one of these zones, the channel lining of Los Gatos Creek, a 2-km-long concrete strip trending perpendicular to regional geologic structure, was broken by thrusts that were concentrated in two belts, each several tens of meters wide, separated by more than 300 m of relatively undeformed concrete.

  20. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Strong Ground Motion

    Science.gov (United States)

    Borcherdt, Roger D.

    1994-01-01

    Strong ground motion generated by the Loma Prieta, Calif., earthquake (MS~7.1) of October 17, 1989, resulted in at least 63 deaths, more than 3,757 injuries, and damage estimated to exceed $5.9 billion. Strong ground motion severely damaged critical lifelines (freeway overpasses, bridges, and pipelines), caused severe damage to poorly constructed buildings, and induced a significant number of ground failures associated with liquefaction and landsliding. It also caused a significant proportion of the damage and loss of life at distances as far as 100 km from the epicenter. Consequently, understanding the characteristics of the strong ground motion associated with the earthquake is fundamental to understanding the earthquake's devastating impact on society. The papers assembled in this chapter address this problem. Damage to vulnerable structures from the earthquake varied substantially with the distance from the causative fault and the type of underlying geologic deposits. Most of the damage and loss of life occurred in areas underlain by 'soft soil'. Quantifying these effects is important for understanding the tragic concentrations of damage in such areas as Santa Cruz and the Marina and Embarcadero Districts of San Francisco, and the failures of the San Francisco-Oakland Bay Bridge and the Interstate Highway 880 overpass. Most importantly, understanding these effects is a necessary prerequisite for improving mitigation measures for larger earthquakes likely to occur much closer to densely urbanized areas in the San Francisco Bay region. The earthquake generated an especially important data set for understanding variations in the severity of strong ground motion. Instrumental strong-motion recordings were obtained at 131 sites located from about 6 to 175 km from the rupture zone. This set of recordings, the largest yet collected for an event of this size, was obtained from sites on various geologic deposits, including a unique set on 'soft soil' deposits

  1. The 2015 Fillmore earthquake swarm and possible crustal deformation mechanisms near the bottom of the eastern Ventura Basin, California

    Science.gov (United States)

    Hauksson, Egill; Andrews, Jennifer; Plesch, Andreas; Shaw, John H.; Shelly, David R.

    2016-01-01

    The 2015 Fillmore swarm occurred about 6 km west of the city of Fillmore in Ventura County, California, and was located beneath the eastern part of the actively subsiding Ventura basin at depths from 11.8 to 13.8 km, similar to two previous swarms in the area. Template‐matching event detection showed that it started on 5 July 2015 at 2:21 UTC with an M∼1.0 earthquake. The swarm exhibited unusual episodic spatial and temporal migrations and unusual diversity in the nodal planes of the focal mechanisms as compared to the simple hypocenter‐defined plane. It was also noteworthy because it consisted of >1400 events of M≥0.0, with M 2.8 being the largest event. We suggest that fluids released by metamorphic dehydration processes, migration of fluids along a detachment zone, and cascading asperity failures caused this prolific earthquake swarm; other mechanisms (such as simple mainshock–aftershock stress triggering or a regional aseismic creep event) are less likely. Dilatant strengthening may be the mechanism that causes the temporal decay of the swarm, as the pore‐pressure drop increased the effective normal stress and counteracted the instability driving the swarm.

  2. Site response, shallow shear-wave velocity, and damage in Los Gatos, California, from the 1989 Loma Prieta earthquake

    Science.gov (United States)

    Hartzell, S.; Carver, D.; Williams, R.A.

    2001-01-01

    Aftershock records of the 1989 Loma Prieta earthquake are used to calculate site response in the frequency band of 0.5-10 Hz at 24 locations in Los Gatos, California, on the edge of the Santa Clara Valley. Two different methods are used: spectral ratios relative to a reference site on rock and a source/site spectral inversion method. These two methods complement each other and give consistent results. Site amplification factors are compared with surficial geology, thickness of alluvium, shallow shear-wave velocity measurements, and ground deformation and structural damage resulting from the Loma Prieta earthquake. Higher values of site amplification are seen on Quaternary alluvium compared with older Miocene and Cretaceous units of Monterey and Franciscan Formation. However, other more detailed correlations with surficial geology are not evident. A complex pattern of alluvial sediment thickness, caused by crosscutting thrust faults, is interpreted as contributing to the variability in site response and the presence of spectral resonance peaks between 2 and 7 Hz at some sites. Within the range of our field measurements, there is a correlation between lower average shear-wave velocity of the top 30 m and 50% higher values of site amplification. An area of residential homes thrown from their foundations correlates with high site response. This damage may also have been aggravated by local ground deformation. Severe damage to commercial buildings in the business district, however, is attributed to poor masonry construction.
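    The reference-site spectral-ratio method can be sketched as follows; the resonance frequency, smoothing width, and records below are synthetic stand-ins, not values from the study:

```python
import numpy as np

def spectral_ratio(site, reference, dt, smooth=5):
    """Soil-site / rock-reference amplitude spectral ratio (minimal sketch;
    the boxcar smoothing width is an illustrative choice)."""
    freqs = np.fft.rfftfreq(len(site), d=dt)
    amp_site = np.abs(np.fft.rfft(site))
    amp_ref = np.abs(np.fft.rfft(reference))
    kernel = np.ones(smooth) / smooth  # boxcar spectral smoothing
    amp_site = np.convolve(amp_site, kernel, mode="same")
    amp_ref = np.convolve(amp_ref, kernel, mode="same")
    return freqs, amp_site / amp_ref

# Synthetic aftershock pair: the "soil" record is the "rock" record passed
# through a resonance that amplifies ~3 Hz, as a sediment site might.
rng = np.random.default_rng(3)
dt, n = 0.01, 4096
rock = rng.normal(size=n)
f = np.fft.rfftfreq(n, d=dt)
resonance = 1.0 + 3.0 * np.exp(-((f - 3.0) / 0.5) ** 2)
soil = np.fft.irfft(np.fft.rfft(rock) * resonance, n)
freqs, ratio = spectral_ratio(soil, rock, dt)
band = (freqs > 0.5) & (freqs < 10.0)
peak_f = float(freqs[band][np.argmax(ratio[band])])
print(round(peak_f, 1))  # the recovered resonance, near 3 Hz
```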

  3. Direct and indirect evidence for earthquakes; an example from the Lake Tahoe Basin, California-Nevada

    Science.gov (United States)

    Maloney, J. M.; Noble, P. J.; Driscoll, N. W.; Kent, G.; Schmauder, G. C.

    2012-12-01

    High-resolution seismic CHIRP data can image direct evidence of earthquakes (i.e., offset strata) beneath lakes and the ocean. Nevertheless, direct evidence often is not imaged due to conditions such as gas in the sediments, or steep basement topography. In these cases, indirect evidence for earthquakes (i.e., debris flows) may provide insight into the paleoseismic record. The four sub-basins of the tectonically active Lake Tahoe Basin provide an ideal opportunity to image direct evidence for earthquake deformation and compare it to indirect earthquake proxies. We present results from high-resolution seismic CHIRP surveys in Emerald Bay, Fallen Leaf Lake, and Cascade Lake to constrain the recurrence interval on the West Tahoe-Dollar Point fault (WTDPF), which was previously identified as potentially the most hazardous fault in the Lake Tahoe Basin. Recently collected CHIRP profiles beneath Fallen Leaf Lake image slide deposits that appear synchronous with slides in other sub-basins. The temporal correlation of slides between multiple basins suggests triggering by events on the WTDPF. If correct, we postulate a recurrence interval for the WTDPF of ~3-4 k.y., indicating that the WTDPF is near the end of its seismic recurrence cycle. In addition, CHIRP data beneath Cascade Lake image strands of the WTDPF that offset the lakefloor as much as ~7 m. The Cascade Lake data combined with onshore LiDAR allowed us to map the geometry of the WTDPF continuously across the southern Lake Tahoe Basin and yielded an improved geohazard assessment.

  4. Displacement Patterns of Cemetery Monuments in Ferndale, CA, During the MW 6.5 Offshore Northern California Earthquake of January 10, 2010

    Science.gov (United States)

    French, K. S.; Cashman, S. M.; Structural Geology Class Spring 2010

    2010-12-01

    Displaced and toppled monuments in a cemetery are an effective means of assessing local ground motion during an earthquake. The MW 6.5 Offshore Northern California earthquake of January 10, 2010, was felt throughout northwestern California and caused moderate damage in coastal communities between Petrolia and Eureka. The earthquake was generated by left-lateral strike slip on a NE-trending fault within the subducting Gorda plate. Peak horizontal ground accelerations of -0.440g (E) and 0.279g (N) and vertical ground acceleration of -0.122g (up) were recorded in Ferndale, CA, on the North American plate 37 km east-southeast of the epicenter. We measured displaced and toppled monuments in the Ferndale cemetery as a means of assessing ground motion during the January 10, 2010 Offshore Northern California earthquake. The cemetery occupies a hillside that slopes gently to the northwest, and a dormant landslide underlies the cemetery. Approximately 30% of the monuments were displaced during the earthquake. Effects included toppled columns and urns; headstones, columns and large tomb covers that slid and rotated relative to monument bases; tilted retaining walls and headstones; and liquefaction-related settling (or, less commonly, uplift) of monuments. We measured translation and rotation of 79 monuments displaced from their bases during the earthquake. Toppled monuments do not display a preferred orientation. Seven of the 18 toppled monuments fell to the southeast, but toppling occurred in all directions. For monuments that were displaced but not toppled, 1-10 cm of northwestward translation and 3-8° of clockwise rotation were most common; however, virtually all directions of translation and both clockwise and counterclockwise rotations were recorded. Damage was not evenly distributed geographically. In general, damage was concentrated in the northern, topographically lower, part of the cemetery. Counterclockwise rotation of monuments occurred mainly along the

  5. Preliminary analysis of strong-motion recordings from the 28 September 2004 Parkfield, California earthquake

    Science.gov (United States)

    Shakal, A.; Graizer, V.; Huang, M.; Borcherdt, R.; Haddadi, H.; Lin, K.-W.; Stephens, C.; Roffers, P.

    2005-01-01

    The Parkfield 2004 earthquake yielded the most extensive set of strong-motion data in the near-source region of a magnitude 6 earthquake yet obtained. The recordings of acceleration and volumetric strain provide an unprecedented document of the near-source seismic radiation for a moderate earthquake. The spatial density of the measurements along the fault zone and in the linear arrays perpendicular to the fault is expected to provide an exceptional opportunity to develop improved models of the rupture process. The closely spaced measurements should help infer the temporal and spatial distribution of the rupture process at much higher resolution than previously possible. Preliminary analyses of the peak acceleration data presented herein show that the motions vary significantly along the rupture zone, from 0.13 g to more than 2.5 g, with a map of the values showing that the larger values are concentrated in three areas. Particle motions at the near-fault stations are consistent with bilateral rupture. Fault-normal pulses similar to those observed in recent strike-slip earthquakes are apparent at several of the stations. The attenuation of peak ground acceleration with distance is more rapid than that indicated by some standard relationships but adequately fits others. Evidence for directivity in the peak acceleration data is not strong. Several stations very near, or over, the rupturing fault recorded relatively low accelerations. These recordings may provide a quantitative basis to understand observations of low near-fault shaking damage that has been reported in other large strike-slip earthquakes.
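
    The attenuation comparison above amounts to fitting the decay of peak ground acceleration with distance. A minimal sketch on hypothetical (distance, PGA) pairs, not the actual Parkfield recordings:

```python
import numpy as np

# Hypothetical (station distance [km], PGA [g]) pairs, for illustration only
dist = np.array([1.0, 3.0, 5.0, 10.0, 20.0, 40.0])
pga = np.array([1.8, 0.9, 0.6, 0.35, 0.2, 0.11])

# Fit a simple attenuation form log10(PGA) = a + b*log10(R);
# a negative slope b quantifies how fast shaking decays with distance.
b, a = np.polyfit(np.log10(dist), np.log10(pga), 1)
```

    Comparing the fitted slope against the slopes implied by published attenuation relationships is one way to express "more rapid than some standard relationships."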

  6. Vegetation studies, National Training Center, Fort Irwin, California

    Energy Technology Data Exchange (ETDEWEB)

    Brandt, C.A.; Rickard, W.H.; Cadoret, N.A.

    1997-09-01

    During the spring of 1992, the Pacific Northwest National Laboratory (PNNL) conducted surveys of the Avawatz and Granite mountains springs for the National Training Center (NTC) to evaluate the occurrence of sensitive plant species in these areas. PNNL also conducted a survey of the eastern outwash of the Paradise Range for the occurrence of Lane Mountain milk vetch (Astragalus jaegerianus). In spring of 1993, PNNL conducted an additional study of Lane Mountain milk vetch on the NTC to determine habitat characteristics for this plant and to develop a method for predicting its potential occurrence, based on simple habitat attributes. The results of these studies are itemized.

  7. The Dense GPS Array in Southern California: A New Tool for Seismic Hazard Assessment

    Science.gov (United States)

    Donnellan, A.; Hurst, K.; Scheid, J.; Watkins, M.; Webb, F.

    1995-01-01

    The Jet Propulsion Laboratory (JPL) and other institutions under the umbrella of the Southern California Earthquake Center (SCEC) are implementing a continuously operating dense GPS array in greater Los Angeles to measure movement along fault lines.

  8. Long‐term time‐dependent probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3)

    Science.gov (United States)

    Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua

    2015-01-01

    The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for unsegmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too long to produce M 6.7 sized events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region, and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the
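
    The contrast above between elastic-rebound (renewal) probabilities and a time-independent Poisson model can be sketched as follows. WGCEP renewal calculations use a Brownian Passage Time distribution; a lognormal recurrence model stands in here so the sketch needs only the standard library, and all parameter values (150-year mean, 0.5 aperiodicity, 100-year open interval) are illustrative, not UCERF3 values:

```python
import math

def lognorm_cdf(t, mean, cov):
    """CDF of a lognormal recurrence-time distribution, parameterized by
    its mean and coefficient of variation (aperiodicity)."""
    s2 = math.log(1.0 + cov * cov)
    mu = math.log(mean) - s2 / 2.0
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / math.sqrt(2.0 * s2)))

def renewal_prob(elapsed, window, mean, cov):
    """P(event in the next `window` years | `elapsed` years since last event)."""
    f0 = lognorm_cdf(elapsed, mean, cov)
    f1 = lognorm_cdf(elapsed + window, mean, cov)
    return (f1 - f0) / (1.0 - f0)

def poisson_prob(window, mean):
    """Time-independent probability for the same mean recurrence."""
    return 1.0 - math.exp(-window / mean)

p_renewal = renewal_prob(elapsed=100.0, window=30.0, mean=150.0, cov=0.5)
p_poisson = poisson_prob(window=30.0, mean=150.0)
```

    With a long open interval relative to the mean recurrence, the renewal probability exceeds the Poisson probability; the ratio of the two is the "implied gain" mentioned in the abstract.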

  9. Slip rate on the San Diego trough fault zone, inner California Borderland, and the 1986 Oceanside earthquake swarm revisited

    Science.gov (United States)

    Ryan, Holly F.; Conrad, James E.; Paull, C.K.; McGann, Mary

    2012-01-01

    The San Diego trough fault zone (SDTFZ) is part of a 90-km-wide zone of faults within the inner California Borderland that accommodates motion between the Pacific and North American plates. Along with most faults offshore southern California, the slip rate and paleoseismic history of the SDTFZ are unknown. We present new seismic reflection data that show that the fault zone steps across a 5-km-wide stepover to continue for an additional 60 km north of its previously mapped extent. The 1986 Oceanside earthquake swarm is located within the 20-km-long restraining stepover. Farther north, at the latitude of Santa Catalina Island, the SDTFZ bends 20° to the west and may be linked via a complex zone of folds with the San Pedro basin fault zone (SPBFZ). In a cooperative program between the U.S. Geological Survey (USGS) and the Monterey Bay Aquarium Research Institute (MBARI), we measure and date the coseismic offset of a submarine channel that intersects the fault zone near the SDTFZ-SPBFZ junction. We estimate a horizontal slip rate of about 1.5 ± 0.3 mm/yr over the past 12,270 yr.
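
    The slip rate quoted above is the dated channel offset divided by its age. A sketch with a hypothetical offset value chosen only to be consistent with the reported 1.5 ± 0.3 mm/yr over 12,270 yr (the paper's measured offset and its uncertainty are not given in this abstract):

```python
offset_m = 18.4    # hypothetical channel offset [m], not from the paper
age_yr = 12270.0   # reported age of the offset channel [yr]

# Average horizontal slip rate in mm/yr
slip_rate_mm_yr = offset_m * 1000.0 / age_yr
```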

  10. Changes in state of stress on the southern san andreas fault resulting from the california earthquake sequence of april to june 1992.

    Science.gov (United States)

    Jaumé, S C; Sykes, L R

    1992-11-20

    The April to June 1992 Landers earthquake sequence in southern California modified the state of stress along nearby segments of the San Andreas fault, causing a 50-kilometer segment of the fault to move significantly closer to failure where it passes through a compressional bend near San Gorgonio Pass. The decrease in compressive normal stress may also have reduced fluid pressures along that fault segment. As pressures are reequilibrated by diffusion, that fault segment should move closer to failure with time. That fault segment and another to the southeast probably have not ruptured in a great earthquake in about 300 years.

  11. Postseismic relaxation following the 1994 Mw6.7 Northridge earthquake, southern California

    Science.gov (United States)

    Savage, J.C.; Svarc, J.L.

    2010-01-01

    We have reexamined the postearthquake deformation of a 65 km long linear array of 11 geodetic monuments extending north–south across the rupture (reverse slip on a blind thrust dipping 40°S–20°W) associated with the 1994 Mw6.7 Northridge earthquake. That array was surveyed frequently in the interval from 4 to 2650 days after the earthquake. The velocity of each of the monuments over the interval 100–2650 days postearthquake appears to be constant. Moreover, the profile of those velocities along the length of the array is very similar to a preearthquake velocity profile for a nearby, similarly oriented array. We take this to indicate that significant postseismic relaxation is evident only in the first 100 days after the earthquake and that the subsequent linear trend is typical of the interseismic interval. The postseismic relaxation (postseismic displacement less displacement that would have occurred at the preseismic velocity) is found to be almost wholly parallel (N70°W) to the nearby (40 km) San Andreas Fault with only negligible relaxation in the direction of coseismic slip (N20°E) on the Northridge rupture. We suggest that the N70°W relaxation is caused by aseismic, right-lateral slip at depth on the San Andreas Fault, excess slip presumably triggered by the Northridge rupture. Finally, using the Dieterich (1994) stress-seismicity relation, we show that return to the preseismic deformation rate within 100 days following the earthquake could be consistent with the cumulative number of M > 2.5 earthquakes observed following the main shock.

  12. Loss estimates for a Puente Hills blind-thrust earthquake in Los Angeles, California

    Science.gov (United States)

    Field, E.H.; Seligson, H.A.; Gupta, N.; Gupta, V.; Jordan, T.H.; Campbell, K.W.

    2005-01-01

    Based on OpenSHA and HAZUS-MH, we present loss estimates for an earthquake rupture on the recently identified Puente Hills blind-thrust fault beneath Los Angeles. Given a range of possible magnitudes and ground motion models, and presuming a full fault rupture, we estimate the total economic loss to be between $82 and $252 billion. This range is not only considerably higher than a previous estimate of $69 billion, but also implies the event would be the costliest disaster in U.S. history. The analysis has also provided the following predictions: 3,000-18,000 fatalities, 142,000-735,000 displaced households, 42,000-211,000 in need of short-term public shelter, and 30,000-99,000 tons of debris generated. Finally, we show that the choice of ground motion model can be more influential than the earthquake magnitude, and that reducing this epistemic uncertainty (e.g., via model improvement and/or rejection) could reduce the uncertainty of the loss estimates by up to a factor of two. We note that a full Puente Hills fault rupture is a rare event (once every ~3,000 years), and that other seismic sources pose significant risk as well. © 2005, Earthquake Engineering Research Institute.

  13. Does Geothermal Energy Production Cause Earthquakes in the Geysers Region of Northern California?

    Science.gov (United States)

    Grove, K.; Bailey, C.; Sotto, M.; Yu, M.; Cohen, M.

    2003-12-01

    The Geysers region is located in Sonoma County, several hours north of San Francisco. At this location, hot magma beneath the surface heats ground water and creates steam that is used to make electricity. Since 1997, 8 billion gallons of treated wastewater have been injected into the ground, where the water becomes hot and increases the amount of thermal energy that can be produced. Frequent micro-earthquakes (up to magnitude 4.5) occur in the region and seem to be related to the geothermal energy production. The region is mostly uninhabited, except for several small towns such as Anderson Springs, where people have been extremely concerned about potential damage to their property. The energy companies are planning to double the amount of wastewater injected into the ground and to increase their energy production. Geothermal energy is important because it is better for the environment than burning coal, oil, or gas. Air and water pollution, which have negative impacts on living things, are reduced compared to power plants that generate electricity by burning fossil fuels. We have studied the frequency and magnitude of earthquakes that have occurred in the region since the early 1970s and that are occurring today. We used software to analyze the earthquakes and to look for patterns related to water injection and energy production. We are interested in exploring ways that energy production can be continued without having negative impacts on the people in the region.

  14. Situated Preparedness: The Negotiation of a Future Catastrophic Earthquake in a California University

    Science.gov (United States)

    Baker, Natalie Danielle

    2013-01-01

    This dissertation examines disaster preparedness as engaged at a large university in southern California using inductive research and grounded theory data collection and analysis methods. The thesis consists of three parts, all addressing the problem of disaster preparedness as enacted in this at-risk context. I use in-depth interviews, archival…

  15. Great Western Savings Center - Beverly Hills, California (USA)

    Directory of Open Access Journals (Sweden)

    William L. Pereira Asociados, Arquitectos

    1974-09-01

    This building, which has an original elliptic plan and is enclosed by curtain walls, occupies a strategic site in Beverly Hills and serves as the savings center for the densely populated Los Angeles area. The building consists of: four basement levels for parking; a ground floor with entrances, halls, and the savings bank; a mezzanine and first floor with a coffee shop, four dining halls, and an auditorium; seven storeys of offices; and a tenth floor reserved for the financial department and the management section. The top level houses the machine rooms of the six elevators. The structure is of reinforced concrete with high-tensile steel, with bronze-colored glass enclosures. This is the highest building in Beverly Hills.

  16. Along-strike variations in fault frictional properties along the San Andreas Fault near Cholame, California from joint earthquake and low-frequency earthquake relocations

    Science.gov (United States)

    Harrington, Rebecca M.; Cochran, Elizabeth S.; Griffiths, Emily M.; Zeng, Xiangfang; Thurber, Clifford H.

    2016-01-01

    Recent observations of low‐frequency earthquakes (LFEs) and tectonic tremor along the Parkfield–Cholame segment of the San Andreas fault suggest slow‐slip earthquakes occur in a transition zone between the shallow fault, which accommodates slip by a combination of aseismic creep and earthquakes, and the deep fault, which accommodates slip by stable sliding (>35 km depth). However, the spatial relationship between shallow earthquakes and LFEs remains unclear. Here, we present precise relocations of 34 earthquakes and 34 LFEs recorded during a temporary deployment of 13 broadband seismic stations from May 2010 to July 2011. We use the temporary array waveform data, along with data from permanent seismic stations and a new high‐resolution 3D velocity model, to illuminate the fine‐scale details of the seismicity distribution near Cholame and the relation to the distribution of LFEs. The depth of the boundary between earthquakes and LFE hypocenters changes along strike and roughly follows the 350°C isotherm, suggesting frictional behavior may be, in part, thermally controlled. We observe no overlap in the depth of earthquakes and LFEs, with an ~5 km separation between the deepest earthquakes and shallowest LFEs. In addition, clustering in the relocated seismicity near the 2004 Mw 6.0 Parkfield earthquake hypocenter and near the northern boundary of the 1857 Mw 7.8 Fort Tejon rupture may highlight areas of frictional heterogeneities on the fault where earthquakes tend to nucleate.

  17. Interaction of the san jacinto and san andreas fault zones, southern california: triggered earthquake migration and coupled recurrence intervals.

    Science.gov (United States)

    Sanders, C O

    1993-05-14

    Two lines of evidence suggest that large earthquakes that occur on either the San Jacinto fault zone (SJFZ) or the San Andreas fault zone (SAFZ) may be triggered by large earthquakes that occur on the other. First, the great 1857 Fort Tejon earthquake in the SAFZ seems to have triggered a progressive sequence of earthquakes in the SJFZ. These earthquakes occurred at times and locations that are consistent with triggering by a strain pulse that propagated southeastward at a rate of 1.7 kilometers per year along the SJFZ after the 1857 earthquake. Second, the similarity in average recurrence intervals in the SJFZ (about 150 years) and in the Mojave segment of the SAFZ (132 years) suggests that large earthquakes in the northern SJFZ may stimulate the relatively frequent major earthquakes on the Mojave segment. Analysis of historic earthquake occurrence in the SJFZ suggests little likelihood of extended quiescence between earthquake sequences.
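
    The propagation argument above implies that each triggered San Jacinto event should occur at a time set by its along-strike distance from the 1857 rupture. A sketch of that timing relation, with hypothetical distances rather than the paper's actual event locations:

```python
RATE_KM_PER_YR = 1.7   # southeastward propagation rate from the abstract
T0 = 1857.0            # year of the Fort Tejon earthquake

def predicted_year(distance_km):
    """Predicted occurrence year for a triggered SJFZ event located
    `distance_km` southeast along strike from the 1857 rupture."""
    return T0 + distance_km / RATE_KM_PER_YR

# Hypothetical along-strike distances [km], for illustration only
years = [predicted_year(d) for d in (17.0, 85.0, 170.0)]
```

    Plotting observed event dates against along-strike distance and checking for a line with slope ~1.7 km/yr is the essence of the first line of evidence.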

  18. Survey of strong motion earthquake effects on thermal power plants in California with emphasis on piping systems. Volume 1, Main report

    International Nuclear Information System (INIS)

    Stevenson, J.D.

    1995-11-01

    Since 1982, a major effort has been expended to evaluate the susceptibility of nuclear power plant equipment to failure and significant damage during seismic events. This was done by making use of data on the performance of electrical and mechanical equipment in conventional power plants and other similar industrial facilities during strong motion earthquakes. This report is intended as an extension of the seismic experience data collection effort and a compilation of experience data specific to power plant piping and supports, designed and constructed to US power piping code requirements, which have experienced strong motion earthquakes. Eight damaging California earthquakes (Richter magnitude 5.5 to 7.7) and their effects on 8 power generating facilities in California were reviewed. All of these facilities were visited and evaluated. Seven fossil-fueled (dual use natural gas and oil) and one nuclear fueled plant, comprising a total of 36 individual boiler or reactor units, were investigated. Peak horizontal ground accelerations that either had been recorded on site at these facilities or were considered applicable to these power plants on the basis of nearby recordings ranged between 0.20g and 0.51g, with strong motion durations which varied from 3.5 to 15 seconds. Most US nuclear power plants are designed for a safe shutdown earthquake peak ground acceleration equal to 0.20g or less, with strong motion durations which vary from 10 to 15 seconds.

  19. Lack of continuity of the San Andreas Fault in southern California: Three-dimensional fault models and earthquake scenarios

    Science.gov (United States)

    Carena, Sara; Suppe, John; Kao, Honn

    2004-04-01

    The 1200-km-long San Andreas Fault loses its apparent continuity in southern California near San Gorgonio Pass [, 1957], which raises significant questions given the dominant role of this fault in active California tectonics. What is the fundamental three-dimensional (3-D) geometry and kinematic behavior of the San Andreas fault system in this complex region? Is a throughgoing, if complex, San Andreas rupture from the Mojave Desert to the Coachella Valley possible? We have explored the issue of 3-D continuity by mapping over 60 faults in this region to depths of 15-20 km from hypocenter locations and focal mechanisms. We were able to constrain the 3-D geometry of the San Andreas fault zone (SAF) near San Gorgonio Pass from the 3-D geometry of the fault network surrounding it. The most likely configuration is for the San Andreas Fault to merge into the shallow-dipping San Gorgonio Pass thrust northwest of Indio. We concluded that there is no direct continuity at present but rather a network of faults, and the only kind of rupture possible for the SAF in this region is a complex rupture, involving both strike-slip and reverse faulting. GPS measurements also suggest that despite the fact that large motions must have occurred in the past based on offset geologic markers, only minor motion is occurring today in this area. Applying our findings about the fault geometry, we explored several simple earthquake scenarios to determine the most favorable conditions for a throughgoing rupture of the San Andreas fault system from the Mojave Desert to the Coachella Valley.

  20. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California (ver. 2.0, January 2018)

    Science.gov (United States)

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-06-30

    As part of the U.S. Geological Survey’s (USGS) multi-hazards project in the Long Valley Caldera-Mono Lake area, the California Geological Survey (CGS) developed several earthquake scenarios and evaluated potential seismic hazards, including ground shaking, surface fault rupture, liquefaction, and landslide hazards associated with these earthquake scenarios. The results of these analyses can be useful in estimating the extent of potential damage and economic losses because of potential earthquakes and also for preparing emergency response plans.

    The Long Valley Caldera-Mono Lake area has numerous active faults. Five of these faults or fault zones are considered capable of producing magnitude ≥6.7 earthquakes according to the Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2) developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP) and the USGS National Seismic Hazard Mapping Program. These five faults are the Fish Slough, Hartley Springs, Hilton Creek, Mono Lake, and Round Valley Faults. CGS developed earthquake scenarios for these five faults in the study area and for the White Mountains Fault Zone to the east of the study area.

    In this report, an earthquake scenario is intended to depict the potential consequences of significant earthquakes. A scenario earthquake is not necessarily the largest or most damaging earthquake possible on a recognized fault. Rather it is both large enough and likely enough that emergency planners should consider it in regional emergency response plans. In particular, the ground motion predicted for a given scenario earthquake does not represent a full probabilistic hazard assessment, and thus it does not provide the basis for hazard zoning and earthquake-resistant building design.

    Earthquake scenarios presented here are based on fault geometry and activity data developed by the WGCEP, and are consistent with the 2008 Update of the United States National Seismic Hazard Maps (NSHM). Alternatives

  1. Non-double-couple earthquake mechanisms at the Geysers Geothermal Area, California

    Science.gov (United States)

    Ross, Alwyn; Foulger, G. R.; Julian, Bruce R.

    Inverting P- and S-wave polarities and P:SH amplitude ratios using linear programming methods suggests that about 20% of earthquakes at The Geysers geothermal area have significantly non-double-couple focal mechanisms, with explosive volumetric components as large as 33% of the seismic moment. This conclusion contrasts with those of earlier studies, which interpreted data in terms of double couples. The non-double-couple mechanisms are consistent with combined shear and tensile faulting, possibly caused by industrial water injection. Implosive mechanisms, which might be expected because of rapid steam withdrawal, have not been found. Significant compensated-linear-vector-dipole (CLVD) components in some mechanisms may indicate rapid fluid flow accompanying crack opening.
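
    The volumetric and CLVD percentages quoted above come from decomposing the moment tensor into isotropic, CLVD, and double-couple parts. A minimal sketch using one common (Vavryčuk-style) eigenvalue convention; the paper's exact decomposition convention may differ:

```python
import numpy as np

def decompose(M):
    """Split a symmetric 3x3 moment tensor into isotropic (ISO), CLVD,
    and double-couple (DC) fractions that sum to one."""
    iso = np.trace(M) / 3.0
    dev = M - iso * np.eye(3)
    e = np.linalg.eigvalsh(dev)          # deviatoric eigenvalues, ascending
    e_abs_max = max(abs(e[0]), abs(e[2]))
    # epsilon = 0 for a pure DC source, ±0.5 for a pure CLVD source
    eps = -e[np.argmin(np.abs(e))] / e_abs_max if e_abs_max > 0 else 0.0
    m_total = abs(iso) + e_abs_max
    f_iso = abs(iso) / m_total
    f_clvd = (1.0 - f_iso) * 2.0 * abs(eps)
    f_dc = (1.0 - f_iso) * (1.0 - 2.0 * abs(eps))
    return f_iso, f_clvd, f_dc
```

    A pure strike-slip double couple yields (0, 0, 1), while a pure explosion yields (1, 0, 0); mechanisms like those reported at The Geysers fall in between, with nonzero ISO and CLVD fractions.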

  2. The academic health center in complex humanitarian emergencies: lessons learned from the 2010 Haiti earthquake.

    Science.gov (United States)

    Babcock, Christine; Theodosis, Christian; Bills, Corey; Kim, Jimin; Kinet, Melodie; Turner, Madeleine; Millis, Michael; Olopade, Olufunmilayo; Olopade, Christopher

    2012-11-01

    On January 12, 2010, a 7.0-magnitude earthquake struck Haiti. The event disrupted infrastructure and was marked by extreme morbidity and mortality. The global response to the disaster was rapid and immense, comprising multiple actors, including academic health centers (AHCs), that provided assistance in the field and from home. The authors retrospectively examine the multidisciplinary approach that the University of Chicago Medicine (UCM) applied to postearthquake Haiti, which included the application of institutional structure and strategy, systematic deployment of teams tailored to evolving needs, and the actual response and recovery. The university mobilized significant human and material resources for deployment within 48 hours and sustained the effort for over four months. In partnership with international and local nongovernmental organizations as well as other AHCs, the UCM operated one of the largest and most efficient acute field hospitals in the country. The UCM's efforts in postearthquake Haiti provide insight into the role AHCs can play, including their strengths and limitations, in complex disasters. AHCs can provide necessary intellectual and material resources as well as technical expertise, but the cost and speed required for responding to an emergency, and ongoing domestic responsibilities, may limit the response of a large university and hospital system. The authors describe the strong institutional backing, the detailed predeployment planning and logistical support UCM provided, the engagement of faculty and staff who had previous experience in complex humanitarian emergencies, and the help of volunteers fluent in the local language which, together, made UCM's mission in postearthquake Haiti successful.

  3. Analysis of Injection-Induced Micro-Earthquakes in a Geothermal Steam Reservoir, The Geysers Geothermal Field, California

    Energy Technology Data Exchange (ETDEWEB)

    Rutqvist, Jonny; Oldenburg, C.M.

    2008-05-15

    In this study we analyze relative contributions to the cause and mechanism of injection-induced micro-earthquakes (MEQs) at The Geysers geothermal field, California. We estimated the potential for inducing seismicity by coupled thermal-hydrological-mechanical analysis of the geothermal steam production and cold water injection to calculate changes in stress (in time and space) and investigated if those changes could induce a rock mechanical failure and associated MEQs. An important aspect of the analysis is the concept of a rock mass that is critically stressed for shear failure. This means that shear stress in the region is near the rock-mass frictional strength, and therefore very small perturbations of the stress field can trigger an MEQ. Our analysis shows that the most important cause for injection-induced MEQs at The Geysers is cooling and associated thermal-elastic shrinkage of the rock around the injected fluid that changes the stress state in such a way that mechanical failure and seismicity can be induced. Specifically, the cooling shrinkage results in unloading and associated loss of shear strength in critically shear-stressed fractures, which are then reactivated. Thus, our analysis shows that cooling-induced shear slip along fractures is the dominant mechanism of injection-induced MEQs at The Geysers.
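
    The cooling-shrinkage mechanism above can be put in first-order numbers with the standard thermo-elastic relation for laterally constrained rock, Δσ = EαΔT/(1 − ν). All property values below are assumed round numbers for illustration, not values from the study:

```python
E = 30e9      # Young's modulus [Pa], assumed for reservoir rock
alpha = 8e-6  # linear thermal expansion coefficient [1/K], assumed
nu = 0.25     # Poisson's ratio, assumed
dT = -50.0    # temperature change [K]; negative = cooling, assumed

# Thermo-elastic stress change; negative means reduced compression,
# i.e., unloading of critically stressed fractures.
d_sigma = E * alpha * dT / (1.0 - nu)
```

    Even modest cooling produces stress changes of several MPa, far larger than the small perturbations needed to trigger slip on critically stressed fractures, which is the core of the argument in the abstract.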

  4. Earthquake strikes at China's energy centers. [Tangshan, July 1976]

    Energy Technology Data Exchange (ETDEWEB)

    Smil, V.

    1976-12-01

    The earthquake that struck Hopei province in China on July 28, 1976 must have caused damage that will have wide repercussions for a long time. It came at the beginning of a new Five-Year Plan and struck one of the country's key industrial centers, Tangshan, a city of one million people on the western edge of the 2275 km² Kailvan coalfield. A review of statistics on mining operations in that area shows that, after the stagnation of the early 1960s, output had been growing by an average of more than one million tons each year since 1966. In 1971 a decision was made to double the designed capacity in five years, and a variety of technical innovations and organizational improvements has been undertaken in all Kailvan mines. Damage was reported heavy in Tientsin, China's third largest city, a major power generation center and the site of a new petrochemical complex. Takang, one of China's giant oil fields, currently producing about five percent of the country's crude oil, is located in the Tientsin municipality on the shores of Po Hai. Chinwangtao, some 120 km from the epicenter, is an important oil terminal for shipments of Taching crude oil and the starting point of the final section of the Taching-Peking pipeline, which supplies the capital's huge Tungfanghung petrochemical complex. Damage to this pipeline or to the extensive high-voltage grid in the area is not known. (MCW)

  5. The Relationship between Knowledge and Attitude of Managers with Preparedness of Healthcare Centers in Rey Health Network against Earthquake Risk - 2013

    Directory of Open Access Journals (Sweden)

    Mohammad Asadzadeh

    2014-06-01

    Conclusions: Managers' knowledge was rather low, and preparedness among the centers was correspondingly low. Given this low knowledge and inadequate preparedness, more theoretical and practical training sessions and drills on earthquake preparedness should be held for managers.

  6. Hippotherapy: Remuneration issues impair the offering of this therapeutic strategy at Southern California rehabilitation centers.

    Science.gov (United States)

    Pham, Christine; Bitonte, Robert

    2016-04-06

    Hippotherapy is the use of equine movement in physical, occupational, or speech therapy in order to obtain functional improvements in patients. Studies show improvement in motor function and sensory processing for patients with a variety of neuromuscular disabilities, developmental disorders, or skeletal impairments as a result of using hippotherapy. The primary objective of this study is to identify the pervasiveness of hippotherapy in Southern California, and any factors that impair its utilization. One hundred and fifty-two rehabilitation centers in the Southern California counties of Los Angeles, San Diego, Orange, Riverside, San Bernardino, San Luis Obispo, Santa Barbara, Ventura, and Kern were identified and surveyed to ascertain if hippotherapy is utilized, and if not, why not. Through a review of the forty facilities that responded to our inquiry, our study indicates that the majority of rehabilitation centers are familiar with hippotherapy; however, only seven reported that hippotherapy is indeed available as an option in therapy at their centers. It is concluded that hippotherapy, used in a broad-based array of physical and sensory disorders, is limited in its ability to be utilized, primarily due to remuneration issues.

  7. Environmental consequences of postulated plutonium releases from General Electric Company Vallecitos Nuclear Center, Vallecitos, California, as a result of severe natural phenomena

    International Nuclear Information System (INIS)

    Jamison, J.D.; Watson, E.C.

    1980-11-01

    Potential environmental consequences in terms of radiation dose to people are presented for postulated plutonium releases caused by severe natural phenomena at the General Electric Company Vallecitos Nuclear Center, Vallecitos, California. The severe natural phenomena considered are earthquakes, tornadoes, and high straight-line winds. Maximum plutonium deposition values are given for significant locations around the site. All important potential exposure pathways are examined. The most likely 50-year committed dose equivalents are given for the maximum-exposed individual and the population within a 50-mile radius of the plant. The maximum plutonium deposition values likely to occur offsite are also given. The most likely calculated 50-year collective committed dose equivalents are all much lower than the collective dose equivalent expected from 50 years of exposure to natural background radiation and medical x-rays. The most likely maximum residual plutonium contamination estimated to be deposited offsite following the earthquakes and the 180-mph and 230-mph tornadoes is above the Environmental Protection Agency's (EPA) proposed guideline for plutonium in the general environment of 0.2 μCi/m². The deposition values following the 135-mph tornado are below the EPA proposed guidelines.

  8. Salmonella in California wildlife species: prevalence in rehabilitation centers and characterization of isolates.

    Science.gov (United States)

    Smith, Woutrina A; Mazet, Jonna A K; Hirsh, Dwight C

    2002-09-01

    Fecal samples from 212 selected marine mammals, marine birds, and raptors were cultured for Salmonella spp. on arrival at rehabilitation centers in California from May 1999 through July 2000. Salmonella spp. were cultured from nine (4%) animals, and seven serotypes were isolated: Johannesberg, Montevideo, Newport, Ohio, Saint Paul, Enteritidis Group D, and 4,5,12:1 Monophasic. One western gull (Larus occidentalis) had two serotypes. Antibiotic susceptibilities and chromosomal fingerprints were evaluated for Salmonella isolates. Some isolates were resistant to gentamicin, amoxicillin-clavulanic acid, and ampicillin. Chromosomal fingerprints with XbaI and XhoI restriction enzymes differed between serotypes but not between individuals carrying the same serotype of Salmonella.

  9. Impact of emergency medical services stroke routing protocols on Primary Stroke Center certification in California.

    Science.gov (United States)

    Schuberg, Sam; Song, Sarah; Saver, Jeffrey L; Mack, William J; Cen, Steven Y; Sanossian, Nerses

    2013-12-01

    Organized stroke systems of care include Primary Stroke Center (PSC) certification and preferential emergency medical services (EMS) routing of suspected patients with stroke to designated PSCs. Stroke EMS routing is not nationally governed; in California, routing is determined by county. EMS routing policies might provide an incentive for PSC accreditation. We evaluated the relationship between independent adoption of EMS routing protocols and PSC designation acquisition in California. Dates of PSC certification were obtained through The Joint Commission's website and confirmatory calls to stroke coordinators. The starting date of county EMS PSC routing policies was obtained from county EMS agencies. We provide descriptive analysis of the number of hospitals achieving PSC designation relative to implementation of EMS routing policies for all counties with PSCs. By June 2012, there were 131 California PSCs in 27 counties, and 22 of 58 counties had implemented EMS routing policies. The greatest number of PSCs was in Los Angeles (30) followed by San Diego (11), Orange (9), and Santa Clara (9) counties. Achievement of PSC designation occurred more frequently immediately before and after EMS routing: 51 PSCs (39%) within 1 year; 85 PSCs (65%) within 2 years. The yearly rate of eligible hospital conversion to PSC designation accelerated concurrent with EMS diversion policy adoption from 3.8% before to 16.2% during and decelerated afterward to 7.6%. Implementation of EMS routing policies may be an important factor driving PSC certification. National adoption of stroke routing policies may lead to more PSCs, positively impacting patient care.

  10. Presentation of the National Center for Research in Vocational Education [Berkeley, California] at the AVA Annual Conference.

    Science.gov (United States)

    National Center for Research in Vocational Education, Berkeley, CA.

    This collection contains the following conference presentations about the National Center for Research in Vocational Education at the University of California at Berkeley: "Visions and Principles" (Charles Benson); "How the Center Sees Its Role" (Gordon Swanson); "The Research Agenda" (Sue Berryman); "The Service…

  11. Seismic Response and Evaluation of SDOF Self-Centering Friction Damping Braces Subjected to Several Earthquake Ground Motions

    Directory of Open Access Journals (Sweden)

    Jong Wan Hu

    2015-01-01

    This paper mainly deals with seismic response and performance for self-centering friction damping braces (SFDBs) subjected to several maximum- or design-leveled earthquake ground motions. The self-centering friction damping brace members consist of core recentering components fabricated with superelastic shape memory alloy wires and energy dissipation devices achieved through a shear friction mechanism. As compared to the conventional brace members for use in the steel concentrically braced frame structure, these self-centering friction damping brace members make the best use of their representative characteristics to minimize residual deformations and to withstand earthquake loads without member replacement. The configuration and response mechanism of self-centering friction damping brace systems are firstly described in this study, and then parametric investigations are conducted through nonlinear time-history analyses performed on numerical single degree-of-freedom spring models. After observing analysis results, adequate design methodologies that optimally account for recentering capability and energy dissipation according to their comparative parameters are intended to be suggested in order to take advantage of energy capacity and to minimize residual deformation simultaneously.

  12. Final Report Feasibility Study for the California Wave Energy Test Center (CalWave℠)

    Energy Technology Data Exchange (ETDEWEB)

    Blakeslee, Samuel Norman [California Polytechnic State Univ. (CalPoly), San Luis Obispo, CA (United States). Inst. for Advanced Technology and Public Policy; Toman, William I. [Protean Wave Energy Ltd., Los Osos, CA (United States); Williams, Richard B. [Leidos Maritime Solutions, Reston, VA (United States); Davy, Douglas M. [CH2M, Sacramento, CA (United States); West, Anna [Kearns and West, Inc., San Francisco, CA (United States); Connet, Randy M. [Omega Power Engineers, LLC, Anaheim, CA (United States); Thompson, Janet [Kearns and West, Inc., San Francisco, CA (United States); Dolan, Dale [California Polytechnic State Univ. (CalPoly), San Luis Obispo, CA (United States); Baltimore, Craig [California Polytechnic State Univ. (CalPoly), San Luis Obispo, CA (United States); Jacobson, Paul [Electric Power Research Inst. (EPRI), Knoxville, TN (United States); Hagerman, George [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Potter, Chris [California Natural Resources Agency, Sacramento, CA (United States); Dooher, Brendan [Pacific Gas and Electric Company, San Francisco, CA (United States); Wendt, Dean [California Polytechnic State Univ. (CalPoly), San Luis Obispo, CA (United States); Sheppard, Colin [Humboldt State Univ., Arcata, CA (United States); Harris, Andrew [Humboldt State Univ., Arcata, CA (United States); Lawson, W. Graham [Power Delivery Consultants, Inc., Albany, NY (United States)

    2017-07-31

    The California Wave Energy Test Center (CalWave) Feasibility Study project was funded over multiple phases by the Department of Energy to perform an interdisciplinary feasibility assessment to analyze the engineering, permitting, and stakeholder requirements to establish an open water, fully energetic, grid connected, wave energy test center off the coast of California for the purposes of advancing U.S. wave energy research, development, and testing capabilities. Work under this grant included wave energy resource characterization, grid impact and interconnection requirements, port infrastructure and maritime industry capability/suitability to accommodate the industry at research, demonstration and commercial scale, and macro and micro siting considerations. CalWave Phase I performed a macro-siting and down-selection process focusing on two potential test sites in California: Humboldt Bay and Vandenberg Air Force Base. This work resulted in the Vandenberg Air Force Base site being chosen as the most favorable site based on a peer reviewed criteria matrix. CalWave Phase II focused on four siting location alternatives along the Vandenberg Air Force Base coastline and culminated with a final siting down-selection. Key outcomes from this work include completion of preliminary engineering and systems integration work, a robust turnkey cost estimate, shoreside and subsea hazards assessment, storm wave analysis, lessons learned reports from several maritime disciplines, test center benchmarking as compared to existing international test sites, analysis of existing applicable environmental literature, the completion of a preliminary regulatory, permitting and licensing roadmap, robust interaction and engagement with state and federal regulatory agency personnel and local stakeholders, and the population of a Draft Federal Energy Regulatory Commission (FERC) Preliminary Application Document (PAD). 
Analysis of existing offshore oil and gas infrastructure was also performed.

  13. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. 
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  14. Assessing Threats and Conservation Status of Historical Centers of Oak Richness in California

    Directory of Open Access Journals (Sweden)

    Kelly Jane Easterday

    2016-12-01

    Oak trees are emblematic of California landscapes; they serve as keystone cultural and ecological species and as indicators of natural biological diversity. As historically undeveloped landscapes are increasingly converted to urban environments, endemic oak woodland extent is reduced, which underscores the importance of strategic placement and reintroduction of oaks and woodland landscape for the maintenance of biodiversity and reduction of habitat fragmentation. This paper investigated the effects of human urban development on oak species in California by first modeling historical patterns of richness for eight oak tree species using historical map and plot data from the California Vegetation Type Mapping (VTM) collection. We then examined spatial intersections between hot spots of historical oak richness and modern urban and conservation lands and found that impacts from development and conservation vary by both species and richness. Our findings suggest that the impact of urban development on oaks has been small within the areas of highest oak richness but that areas of highest oak richness are also poorly conserved. Third, we argue that current policy measures are inadequate to conserve oak woodlands and suggest regions to prioritize acquisition of conservation lands as well as examine urban regions where historic centers of oak richness were lost as potential frontiers for oak reintroduction. We argue that urban planning could benefit from the adoption of historical data and modern species distribution modelling techniques primarily used in natural resources and conservation fields to better locate hot spots of species richness, understand where habitats and species have been lost historically and use this evidence as incentive to recover what was lost and preserve what still exists. This adoption of historical data and modern techniques would then serve as a paradigm shift in the way urban planners recognize, quantify, and use landscape

  15. AIDA – Seismic data acquisition, processing, storage and distribution at the National Earthquake Center, INGV

    Directory of Open Access Journals (Sweden)

    Salvatore Mazza

    2012-10-01

    On May 4, 2012, a new system, known as the AIDA (Advanced Information and Data Acquisition) system for seismology, became operational as the primary tool to monitor, analyze, store and distribute seismograms from the Italian National Seismic Network. Only 16 days later, on May 20, 2012, northern Italy was struck by a Ml 5.9 earthquake that caused seven casualties. This was followed by numerous small to moderate earthquakes, with some over Ml 5. Then, on May 29, 2012, a Ml 5.8 earthquake resulted in 17 more victims and left about 14,000 people homeless. This sequence produced more than 2,100 events over 40 days, and it was still active at the end of June 2012, with minor earthquakes at a rate of about 20 events per day. The new AIDA data management system was designed and implemented, among other things, to exploit the recent huge upgrade of the Italian Seismic Network (in terms of the number and quality of stations) and to overcome the limitations of the previous system.

  16. Tidal triggering of low frequency earthquakes near Parkfield, California: Implications for fault mechanics within the brittle-ductile transition

    Science.gov (United States)

    Thomas, A.M.; Burgmann, R.; Shelly, David R.; Beeler, Nicholas M.; Rudolph, M.L.

    2012-01-01

    Studies of nonvolcanic tremor (NVT) have established the significant impact of small stress perturbations on NVT generation. Here we analyze the influence of the solid earth and ocean tides on a catalog of ∼550,000 low frequency earthquakes (LFEs) distributed along a 150 km section of the San Andreas Fault centered at Parkfield. LFE families are identified in the NVT data on the basis of waveform similarity and are thought to represent small, effectively co-located earthquakes occurring on brittle asperities on an otherwise aseismic fault at depths of 16 to 30 km. We calculate the sensitivity of each of these 88 LFE families to the tidally induced right-lateral shear stress (RLSS), fault-normal stress (FNS), and their time derivatives and use the hypocentral locations of each family to map the spatial variability of this sensitivity. LFE occurrence is most strongly modulated by fluctuations in shear stress, with the majority of families demonstrating a correlation with RLSS at the 99% confidence level or above. Producing the observed LFE rate modulation in response to shear stress perturbations requires low effective stress in the LFE source region. There are substantial lateral and vertical variations in tidal shear stress sensitivity, which we interpret to reflect spatial variation in source region properties, such as friction and pore fluid pressure. Additionally, we find that highly episodic, shallow LFE families are generally less correlated with tidal stresses than their deeper, continuously active counterparts. The majority of families have weaker or insignificant correlation with positive (tensile) FNS. Two groups of families demonstrate a stronger correlation with fault-normal tension to the north and with compression to the south of Parkfield. The families that correlate with fault-normal clamping coincide with a releasing right bend in the surface fault trace and the LFE locations, suggesting that the San Andreas remains localized and contiguous down
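    The core of the tidal-sensitivity analysis described above can be sketched in miniature: given the tidally induced shear stress sampled at each event time, count the excess of events occurring at positive (slip-encouraging) stress and test it against random timing. The stress values and the simple one-sided binomial significance test below are illustrative assumptions, not the study's full method.

```python
import math

def excess_at_positive_stress(stresses):
    """Percent excess of events at positive stress: 100 * (N+ - N-) / N.
    Zero stress counts toward neither side."""
    npos = sum(1 for s in stresses if s > 0)
    nneg = sum(1 for s in stresses if s < 0)
    return 100.0 * (npos - nneg) / len(stresses)

def binomial_p_value(npos, n, p=0.5):
    """One-sided P(X >= npos) for X ~ Binomial(n, p): the chance of seeing
    at least this many positive-stress events if event timing were random."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(npos, n + 1))

# Hypothetical tidal shear stress (kPa) sampled at ten event times:
stresses = [0.4, 0.2, -0.1, 0.3, 0.5, 0.1, -0.2, 0.6, 0.2, 0.3]
print(excess_at_positive_stress(stresses))  # → 60.0
p = binomial_p_value(8, 10)                 # 8 of 10 at positive stress
```

    In the study itself, correlation is evaluated per LFE family against a full tidal stress time series, which is what allows the 99% confidence statements quoted above.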

  17. Trauma Center-Based Surveillance of Nontraffic Pedestrian Injury among California Children

    Directory of Open Access Journals (Sweden)

    John Sherck, MD

    2012-05-01

    Introduction: Every year in the United States, thousands of young children are injured by passenger vehicles in driveways or parking areas. Little is known about risk factors, and incidence rates are difficult to estimate because ascertainment using police collision reports or media sources is incomplete. This study used surveillance at trauma centers to identify incidents and parent interviews to obtain detailed information on incidents, vehicles, and children. Methods: Eight California trauma centers conducted surveillance of nontraffic pedestrian collision injury to children aged 14 years or younger from January 2005 to July 2007. Three of these centers conducted follow-up interviews with family members. Results: Ninety-four injured children were identified. Nine children (10%) suffered fatal injury. Seventy children (74%) were 4 years old or younger. Family members of 21 victims from this study (23%) completed an interview. Of these 21 interviewed victims, 17 (81%) were male and 13 (62%) were 1 or 2 years old. In 13 cases (62%), the child was backed over, and the driver was the mother or father in 11 cases (52%). Fifteen cases (71%) involved a sport utility vehicle, pickup truck, or van. Most collisions occurred in a residential driveway. Conclusion: Trauma center surveillance can be used for case ascertainment and for collecting information on circumstances of nontraffic pedestrian injuries. Adoption of a specific external cause-of-injury code would allow passive surveillance of these injuries. Research is needed to understand the contributions of family, vehicular, and environmental characteristics and injury risk to inform prevention efforts.

  18. Reply to “Comment on “Should Memphis build for California's earthquakes?” From A.D. Frankel”

    Science.gov (United States)

    Stein, Seth; Tomasello, Joseph; Newman, Andrew

    Carl Sagan observed that “extraordinary claims require extraordinary evidence.” In our view, A.D. Frankel's arguments (see accompanying Comment piece) do not reach the level required to demonstrate the counter-intuitive propositions that the earthquake hazard in the New Madrid Seismic Zone (NMSZ) is comparable to that in coastal California, and that buildings should be built to similar standards. This interchange is the latest in an ongoing debate beginning with Newman et al.'s [1999a] recommendation, based on analysis of Global Positioning System and earthquake data, that Frankel et al.'s [1996] estimate of California-level seismic hazard for the NMSZ should be reduced. Most points at issue, except for those related to the costs and benefits of the proposed new International Building Code 2000, have already been argued at length by both sides in the literature [e.g., Schweig et al., 1999; Newman et al., 1999b, 2001; Cramer, 2001]. Hence, rather than rehash these points, we will try here to provide readers not enmeshed in this morass with an overview of the primary differences between our view and that of Frankel.

  19. Experimental Study on a Self-Centering Earthquake-Resistant Masonry Pier with a Structural Concrete Column

    Directory of Open Access Journals (Sweden)

    Lijun Niu

    2017-01-01

    This paper proposes a slotting construction strategy to avoid shear behavior of multistory masonry buildings. The aspect ratio of masonry piers increases via slotting between spandrels and piers, so that the limit state of piers under an earthquake may be altered from shear to rocking. Rocking piers with a structural concrete column (SCC) form a self-centering earthquake-resistant system. The in-plane lateral rocking behavior of masonry piers subjected to an axial force is predicted, and an experimental study is conducted on two full-scale masonry piers with an SCC, which consist of a slotting pier and an original pier. Meanwhile, a comparison of the rocking modes of masonry piers with an SCC and without an SCC was conducted in the paper. Experimental verification indicates that the slotting strategy achieves a change of failure modes from shear to rocking, and this resistant system with an SCC incorporates the self-centering and high energy dissipation properties. For the slotting pier, a lateral story drift ratio of 2.5% and a high displacement ductility of approximately 9.7 are obtained in the test, although the lateral strength decreased by 22.3% after slotting. The predicted lateral strength of the rocking pier with an SCC has a margin of error of 5.3%.

  20. Holocene slip rates along the San Andreas Fault System in the San Gorgonio Pass and implications for large earthquakes in southern California

    Science.gov (United States)

    Heermance, Richard V.; Yule, Doug

    2017-06-01

    The San Gorgonio Pass (SGP) in southern California contains a 40 km long region of structural complexity where the San Andreas Fault (SAF) bifurcates into a series of oblique-slip faults with unknown slip history. We combine new 10Be exposure ages (Qt4: 8600 (+2100, -2200) and Qt3: 5700 (+1400, -1900) years B.P.) and a radiocarbon age (1260 ± 60 years B.P.) from late Holocene terraces with scarp displacement of these surfaces to document a Holocene slip rate of 5.7 (+2.7, -1.5) mm/yr combined across two faults. Our preferred slip rate is 37-49% of the average slip rates along the SAF outside the SGP (i.e., Coachella Valley and San Bernardino sections) and implies that strain is transferred off the SAF in this area. Earthquakes here most likely occur in very large, throughgoing SAF events at a lower recurrence than elsewhere on the SAF, so that only approximately one third of SAF ruptures penetrate or originate in the pass. Plain Language Summary: How large are earthquakes on the southern San Andreas Fault? The answer to this question depends on whether or not the earthquake is contained only along individual fault sections, such as the Coachella Valley section north of Palm Springs, or the rupture crosses multiple sections including the area through the San Gorgonio Pass. We have determined the age and offset of faulted stream deposits within the San Gorgonio Pass to document slip rates of these faults over the last 10,000 years. Our results indicate a long-term slip rate of 6 mm/yr, which is almost 1/2 of the rates east and west of this area. These new rates, combined with faulted geomorphic surfaces, imply that large magnitude earthquakes must occasionally rupture a 300 km length of the San Andreas Fault from the Salton Sea to the Mojave Desert.
Although many (~65%) earthquakes along the southern San Andreas Fault likely do not rupture through the pass, our new results suggest that large (>Mw 7.5) earthquakes are possible on the southern San Andreas Fault and likely
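    A slip rate with asymmetric bounds like 5.7 (+2.7, -1.5) mm/yr comes from propagating the uncertainties of a dated, offset surface (rate = offset / age). The Monte Carlo sketch below uses hypothetical offset values (chosen to give roughly the quoted rate for the Qt4 surface) and assumes Gaussian errors, which may differ from the authors' treatment.

```python
import random

def slip_rate_percentiles(offset_m, offset_err_m, age_yr, age_err_yr,
                          n=100_000, seed=1):
    """Monte Carlo slip rate (mm/yr) from a dated, offset surface.
    Gaussian errors on offset and age are an assumption of this sketch."""
    rng = random.Random(seed)
    rates = []
    for _ in range(n):
        off = rng.gauss(offset_m, offset_err_m)
        age = rng.gauss(age_yr, age_err_yr)
        if age > 0:                       # guard against nonphysical draws
            rates.append(off / age * 1000.0)  # m/yr -> mm/yr
    rates.sort()
    return (rates[int(0.025 * len(rates))],   # 2.5th percentile
            rates[len(rates) // 2],           # median
            rates[int(0.975 * len(rates))])   # 97.5th percentile

# Hypothetical ~49 m of offset on the ~8600-yr Qt4 surface (assumed numbers):
lo, med, hi = slip_rate_percentiles(49.0, 5.0, 8600.0, 1100.0)
# med is roughly 5.7 mm/yr, with asymmetric bounds from the age skew
```

    The asymmetry of the reported bounds falls out naturally because dividing by an uncertain age skews the rate distribution.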

  1. Analysis of the burns profile and the admission rate of severely burned adult patient to the National Burn Center of Chile after the 2010 earthquake.

    Science.gov (United States)

    Albornoz, Claudia; Villegas, Jorge; Sylvester, Marilu; Peña, Veronica; Bravo, Iside

    2011-06-01

    Chile is located in the Ring of Fire, in South America. An 8.8-magnitude earthquake affected 80% of the population on February 27th, 2010. This study was conducted to assess any change in the burns profile caused by the earthquake. This was an ecologic study. We compared the 4 months following the earthquake date in 2009 and 2010 on age, TBSA, deep TBSA, agent, specific mortality rate, and rate of admissions to the National Burn Center of Chile. A Mann-Whitney test and a Poisson regression were performed. Age, agent, TBSA, and deep TBSA percentages did not show any difference. The mortality rate was lower in 2010 (0.52 versus 1.22 per 1,000,000 inhabitants), but no meaningful difference was found (Poisson regression, p = 0.06). The admission rate was lower in 2010 (4.6 versus 5.6 per 1,000,000 inhabitants), but no difference was found (p = 0.26). There were no admissions directly related to the earthquake. As we do not have incidence registries in Chile, we propose to use the rate of admission to the National Burn Reference Center as an incidence estimator. There was no significant difference in the burn profile, probably because of the time of the earthquake (3 a.m.). We conclude the earthquake did not affect the way the Chilean people get burned. Copyright © 2011 Elsevier Ltd and ISBI. All rights reserved.
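    The kind of rate comparison the authors describe can be sketched with a conditional (binomial) test for two Poisson rates: given the total count, the split between the two periods is binomial. The counts below are hypothetical, back-calculated from the quoted rates of 5.6 and 4.6 per million assuming a population of about 17 million; they are not the study's data.

```python
import math

def poisson_rate_test(k1, t1, k2, t2):
    """Two-sided conditional test comparing Poisson rates k1/t1 vs k2/t2.
    Conditional on k1 + k2 total events, k1 ~ Binomial(k1 + k2, t1/(t1 + t2))."""
    n, p = k1 + k2, t1 / (t1 + t2)

    def pmf(k):
        return math.comb(n, k) * p**k * (1 - p)**(n - k)

    observed = pmf(k1)
    # Two-sided p-value: total probability of outcomes no likelier than observed.
    return min(1.0, sum(pmf(k) for k in range(n + 1)
                        if pmf(k) <= observed + 1e-12))

# Hypothetical admission counts over equal 4-month windows, ~17 M population:
p_value = poisson_rate_test(95, 17.0, 78, 17.0)
print(p_value > 0.05)  # → True: the rate drop is not significant at 5%
```

    This mirrors the abstract's finding: a visibly lower post-earthquake admission rate that nevertheless fails to reach statistical significance.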

  2. Dust Climatology of the NASA Dryden Flight Research Center (DFRC) in Lancaster, California, USA

    Directory of Open Access Journals (Sweden)

    Ashok K. Pokharel

    2017-02-01

    A 15-year (1997–2011) climatology of dust events at the NASA DFRC in Lancaster, California, USA, was performed to evaluate how the extratropical systems were associated with dust storms over this region. For this study, we collected meteorological data for Edwards Air Force Base (EAFB) in Lancaster, California, which is very close to NASA DFRC, from wunderground.com, National Centers for Environmental Prediction (NCEP)/North American Regional Reanalysis (NARR), NCEP/Hydro-meteorological Prediction Center/National Weather Service (NWS), and Unisys analyses. We find that the dust events were associated with the development of a deep convective boundary layer, turbulence kinetic energy (TKE) ≥3 J/kg, a deep unstable lapse-rate layer, a wind speed above the frictional threshold wind speed necessary to ablate dust from the surface (≥7.3 m/s), the presence of a cold trough above the deep planetary boundary layer (PBL), a strong cyclonic jet, an influx of vertical sensible heat from the surrounding area, and a low volumetric soil moisture fraction (<0.3). The annual mean number of dust events, their mean duration, and the unit duration per number of events for each visibility range, binned as <11.2 km, <8 km, <4.8 km, <1.6 km, and <1 km, were calculated. The visibility range values were positively correlated with the annual mean number of dust events, the duration of dust events, and the ratio of duration of dust events. The percentage of dust events by season shows that most of the dust events occurred in autumn (44.7%), followed by spring (38.3%), and equally in summer and winter, with these seasons each accounting for 8.5% of events. This study also shows that summer had the highest percentage (10%) of the lowest visibility condition (<1 km), followed by autumn (2%). Neither of the other two seasons, winter and spring, experienced such a low visibility condition during the entire dust events over 15 years. Winter had the highest visibility
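    Three of the quantitative thresholds the abstract lists (friction-threshold wind speed, TKE, soil moisture) can be combined into a simple screening flag. The sketch below is an illustrative filter only: the synoptic criteria (cold trough, cyclonic jet, sensible heat influx) are omitted, and it is not the study's detection method.

```python
def dust_event_possible(wind_ms, tke_j_kg, soil_moisture_frac):
    """Screen one observation against three thresholds quoted in the abstract.
    (Synoptic-scale conditions are deliberately omitted in this sketch.)"""
    return (wind_ms >= 7.3                 # above frictional threshold wind speed
            and tke_j_kg >= 3.0            # deep convective boundary-layer turbulence
            and soil_moisture_frac < 0.3)  # dry surface

print(dust_event_possible(9.1, 4.2, 0.12))  # → True: all three criteria met
print(dust_event_possible(9.1, 4.2, 0.35))  # → False: soil too moist to ablate
```

    A real classifier would also require the reanalysis-derived synoptic conditions before flagging a dust event.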

  3. The response of academic medical centers to the 2010 Haiti earthquake: the Mount Sinai School of Medicine experience.

    Science.gov (United States)

    Ripp, Jonathan A; Bork, Jacqueline; Koncicki, Holly; Asgary, Ramin

    2012-01-01

    On January 12, 2010, Haiti was struck by a magnitude 7.0 earthquake that left the country in a state of devastation. In the aftermath, there was an enormous relief effort in which academic medical centers (AMCs) played an important role. We offer a retrospective on the AMC response through the Mount Sinai School of Medicine (MSSM) experience. Over the course of the year that followed the earthquake, MSSM conducted five service trips in conjunction with two well-established groups that have provided service to the Haitian people for over 15 years. MSSM volunteer personnel included nurses, resident and attending physicians, and specialty fellows who provided expertise in critical care, emergency medicine, wound care, infectious diseases, and chronic disease management of adults and children. Challenges faced included stressful and potentially hazardous working conditions, provision of care with limited resources, and cultural and language barriers. The success of the MSSM response was due largely to the strength of its human resources and the relationships forged with effective relief organizations. These service missions fulfilled the institution's commitment to social responsibility and provided a valuable training opportunity in advocacy. For other AMCs seeking to respond in future emergencies, we suggest early identification of a partner with field experience, recruitment of administrative and faculty support across the institution, significant pre-departure orientation, and utilization of volunteers to fundraise and advocate. Through this process, AMCs can play an important role in disaster response.

  4. Audio-based, unsupervised machine learning reveals cyclic changes in earthquake mechanisms in the Geysers geothermal field, California

    Science.gov (United States)

    Holtzman, B. K.; Paté, A.; Paisley, J.; Waldhauser, F.; Repetto, D.; Boschi, L.

    2017-12-01

    The earthquake process reflects complex interactions of stress, fracture and frictional properties. New machine learning methods reveal patterns in time-dependent spectral properties of seismic signals and enable identification of changes in faulting processes. Our methods are based closely on those developed for music information retrieval and voice recognition, using the spectrogram instead of the waveform directly. Unsupervised learning involves identification of patterns based on differences among signals without any additional information provided to the algorithm. Clustering of 46,000 earthquakes of $0.3

  5. California Earthquake Clearinghouse Crisis Information-Sharing Strategy in Support of Situational Awareness, Understanding Interdependencies of Critical Infrastructure, Regional Resilience, Preparedness, Risk Assessment/mitigation, Decision-Making and Everyday Operational Needs

    Science.gov (United States)

    Rosinski, A.; Morentz, J.; Beilin, P.

    2017-12-01

    The principal function of the California Earthquake Clearinghouse is to provide State and Federal disaster-response managers, and the scientific and engineering communities, with prompt information on ground failure, structural damage, and other consequences of significant seismic events such as earthquakes and tsunamis. The overarching problem highlighted in discussions with Clearinghouse partners is the confusion and frustration of many Operational Area representatives, and some regional utilities throughout the state, over which software applications they should be using and maintaining to meet State, Federal, and Local requirements, for what purposes, and how to deal with the limitations of these applications. This problem is getting in the way of meaningful progress on developing multi-application interoperability, the necessary supporting cross-sector information-sharing procedures, and dialogue on the essential common operational information that entities need to share for different all-hazards missions and related operational activities associated with continuity, security, and resilience. The XchangeCore-based system the Clearinghouse is evolving helps address this problem without compounding it by introducing yet another end-user application: there is no end-user interface with which one views XchangeCore; all viewing of data provided through XchangeCore occurs in and on existing, third-party operational applications. The Clearinghouse efforts with XchangeCore are compatible with FEMA, which is currently using XchangeCore-provided data for regional and National Business Emergency Operations Center (the source of business information sharing during emergencies) response. Also important, and worth emphasizing, is that information-sharing is not just for response, but for preparedness, risk assessment/mitigation decision-making, and everyday operational needs for situational awareness. In other words, the benefits of the Clearinghouse

  6. Health Hazard Evaluation Report HETA 91-395-2244, Veterans Administration Medical Center, Los Angeles, California

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, J.E.; Miller, A.

    1992-08-01

    In response to a request from an employee of the Veterans Administration Medical Center (SIC-8062), Los Angeles, California, an investigation was undertaken of exposures to chemicals in the laboratory department, excessive heat and humidity in the kitchen area of the dietetics department, and carbon monoxide (630080) exposures inside the building. In three of five personal breathing zone samples taken in the histopathology laboratory, formaldehyde (50000) was detected at concentrations up to 0.17 part per million (ppm), and it was also present in all four of the area air samples at concentrations up to 1.1 ppm. The predominant symptoms associated with work in the laboratory included occasional headaches and nose/throat irritation. Mild episodes of dermal irritation and rash were also reported. All carbon monoxide levels were less than 5 ppm. In the kitchens, relative humidity levels were below the recommended range. Temperatures were above the range recommended for a medium level of work. The authors conclude that a potential carcinogenic risk existed for workers in laboratories that use formaldehyde. The authors recommend specific measures to lower the risk of formaldehyde exposure in the laboratory.

  7. The ShakeOut Earthquake Scenario - A Story That Southern Californians Are Writing

    Science.gov (United States)

    Perry, Suzanne; Cox, Dale; Jones, Lucile; Bernknopf, Richard; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne

    2008-01-01

    The question is not if but when southern California will be hit by a major earthquake - one so damaging that it will permanently change lives and livelihoods in the region. How severe the changes will be depends on the actions that individuals, schools, businesses, organizations, communities, and governments take to get ready. To help prepare for this event, scientists of the U.S. Geological Survey (USGS) have changed the way that earthquake scenarios are done, uniting a multidisciplinary team that spans an unprecedented number of specialties. The team includes the California Geological Survey, Southern California Earthquake Center, and nearly 200 other partners in government, academia, emergency response, and industry, working to understand the long-term impacts of an enormous earthquake on the complicated social and economic interactions that sustain southern California society. This project, the ShakeOut Scenario, has applied the best current scientific understanding to identify what can be done now to avoid an earthquake catastrophe. More information on the science behind this project will be available in The ShakeOut Scenario (USGS Open-File Report 2008-1150; http://pubs.usgs.gov/of/2008/1150/). The 'what if?' earthquake modeled in the ShakeOut Scenario is a magnitude 7.8 on the southern San Andreas Fault. Geologists selected the details of this hypothetical earthquake by considering the amount of stored strain on that part of the fault with the greatest risk of imminent rupture. From this, seismologists and computer scientists modeled the ground shaking that would occur in this earthquake. Engineers and other professionals used the shaking to produce a realistic picture of this earthquake's damage to buildings, roads, pipelines, and other infrastructure. From these damages, social scientists projected casualties, emergency response, and the impact of the scenario earthquake on southern California's economy and society. The earthquake, its damages, and

  8. Transient stresses at Parkfield, California, produced by the M 7.4 Landers earthquake of June 28, 1992: implications for the time-dependence of fault friction

    Directory of Open Access Journals (Sweden)

    J. B. Fletcher

    1994-06-01

    The M 7.4 Landers earthquake triggered widespread seismicity in the Western U.S. Because the transient dynamic stresses induced at regional distances by the Landers surface waves are much larger than the expected static stresses, the magnitude and characteristics of the dynamic stresses may bear upon the earthquake triggering mechanism. The Landers earthquake was recorded on the UPSAR array, a group of 14 triaxial accelerometers located within a 1-square-km region 10 km southwest of the town of Parkfield, California, 412 km northwest of the Landers epicenter. We used a standard geodetic inversion procedure to determine the surface strain and stress tensors as functions of time from the observed dynamic displacements. Peak dynamic strains and stresses at the Earth's surface are about 7 microstrain and 0.035 MPa, respectively, and they have a flat amplitude spectrum between 2 s and 15 s period. These stresses agree well with stresses predicted from a simple rule of thumb based upon the ground velocity spectrum observed at a single station. Peak stresses ranged from about 0.035 MPa at the surface to about 0.12 MPa between 2 and 14 km depth, with the sharp increase of stress away from the surface resulting from the rapid increase of rigidity with depth and from the influence of surface wave mode shapes. Comparison of Landers-induced static and dynamic stresses at the hypocenter of the Big Bear aftershock provides a clear example that faults are stronger on time scales of tens of seconds than on time scales of hours or longer.
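
The "simple rule of thumb" mentioned above is commonly taken to be σ ≈ Gv/c: peak dynamic stress is roughly the shear modulus times peak particle velocity divided by the wave phase velocity. A minimal sketch, with material constants that are illustrative assumptions rather than values from the study:

```python
# Rule-of-thumb estimate of peak dynamic stress from peak ground velocity:
# sigma ~ G * v / c. The constants below are representative assumptions
# (soft-rock shear modulus ~1 GPa, surface-wave phase velocity ~1 km/s),
# not parameters taken from the UPSAR analysis.

def dynamic_stress(peak_velocity_m_s, shear_modulus_pa, phase_velocity_m_s):
    """Approximate peak dynamic stress (Pa) carried by a passing wave."""
    return shear_modulus_pa * peak_velocity_m_s / phase_velocity_m_s

stress_pa = dynamic_stress(0.035, 1e9, 1.0e3)   # assumed 3.5 cm/s peak velocity
print(f"peak dynamic stress ~ {stress_pa / 1e6:.3f} MPa")  # prints 0.035 MPa
```

With these assumed constants the estimate happens to match the 0.035 MPa surface value quoted above; in practice G and c must come from a local velocity model.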

  9. Incorporating fault mechanics into inversions of aftershock data for the regional remote stress, with application to the 1992 Landers, California earthquake

    Science.gov (United States)

    Maerten, Frantz; Madden, Elizabeth H.; Pollard, David D.; Maerten, Laurent

    2016-04-01

    We present a new stress inversion algorithm that accounts for the physics relating the remote stress, slip along complex faults, and aftershock focal mechanisms, in a linear-elastic, heterogeneous, isotropic whole- or half-space. For each new remote stress, the solution of the simulation is obtained by the superposition of three pre-calculated solutions, leading to a constant time evaluation. Consequently, the full three-dimensional boundary element method model need not be recomputed and is independent of the structural complexity of the underlying model. Using a synthetic model, we evaluate several different measures of fit, or cost functions, between aftershocks and model results. Cost functions that account for aftershock slip direction provide good constraint on the remote stress, while functions that evaluate only nodal plane orientations do not. Inversion results are stable for values of friction ≤ 0.5 on mainshock faults. We demonstrate the technique by recovering the remote stress regime at the time of the 1992 M 7.3 Landers, California earthquake from its aftershocks and find that the algorithm performs well relative to methods that invert earthquakes occurring prior to the Landers mainshock. In the mechanical inversion, incorporating fault structures is necessary, but small differences in fault geometries do not impact these inversion results. Each inversion provides a complete solution for an earthquake as output, including fault slip and the stress and deformation fields around the fault(s). This allows for many additional datasets to be used as input, including fault surface slip, GPS data, InSAR data, and/or secondary fracture orientations.
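
The constant-time evaluation described above follows from linearity: in a linear-elastic model, slip is a linear function of the remote stress, so the response to any trial remote stress is a weighted sum of three precomputed responses to unit stress components. A minimal sketch of that idea, using random arrays as stand-ins for the precomputed boundary element solutions (all names and values here are hypothetical):

```python
import numpy as np

# Precomputed slip response of each fault element to three unit remote
# stress states (e.g. unit sigma_xx, sigma_yy, sigma_xy in plane stress).
# Random stand-ins replace the expensive BEM solutions.
rng = np.random.default_rng(0)
n_elements = 100
unit_responses = rng.standard_normal((3, n_elements))

def slip_for_remote_stress(sxx, syy, sxy):
    """Constant-time evaluation: a linear combination of stored solutions."""
    return sxx * unit_responses[0] + syy * unit_responses[1] + sxy * unit_responses[2]

# Linearity check: doubling the remote stress doubles the predicted slip,
# with no re-solution of the underlying model.
s1 = slip_for_remote_stress(1.0, -0.5, 0.3)
s2 = slip_for_remote_stress(2.0, -1.0, 0.6)
print(np.allclose(s2, 2 * s1))  # True
```

An inversion can therefore scan many candidate remote stresses cheaply, scoring each against the aftershock focal mechanisms.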

  10. Short-period strain (0.1-10^5 s): Near-source strain field for an earthquake (ML 3.2) near San Juan Bautista, California

    Science.gov (United States)

    Johnston, M. J. S.; Borcherdt, R. D.; Linde, A. T.

    1986-10-01

    Measurements of dilational earth strain in the frequency band 25 to 10^-5 Hz have been made on a deep borehole strainmeter installed near the San Andreas fault. These data are used to determine seismic radiation fields during nuclear explosions, teleseisms, local earthquakes, and ground noise during seismically quiet times. Strains of less than 10^-10 on these instruments can be clearly resolved at short periods (<10 s) and are recorded with wide-dynamic-range digital recorders. This permits measurement of the static and dynamic strain variations in the near field of local earthquakes. Noise spectra for earth strain, referenced to 1 (strain)^2/Hz, show that strain resolution decreases at about 10 dB per decade of frequency, from -150 dB at 10^-4 Hz to -223 dB at 10 Hz. Exact expressions are derived to relate the volumetric strain and displacement field for a homogeneous P wave in a general viscoelastic solid as observed on colocated dilatometers and seismometers. A rare near-field recording of strain and seismic velocity was obtained on May 26, 1984, from an earthquake (ML 3.2) at a hypocentral distance of 3.2 km near the San Andreas fault at San Juan Bautista, California. While the data indicate no precursory strain release at the 5 × 10^-11 strain level, a coseismic strain release of 1.86 nanostrain was observed. This change in strain is consistent with that calculated from a simple dislocation model of the event. Ground displacement spectra, determined from the downhole strain data and instrument-corrected surface seismic data, suggest that source parameters estimated from surface recordings may be contaminated by amplification effects in near-surface low-velocity materials.

  11. A reevaluation of the Pallett Creek earthquake chronology based on new AMS radiocarbon dates, San Andreas fault, California

    Science.gov (United States)

    Scharer, K.M.; Biasi, G.P.; Weldon, R.J.

    2011-01-01

    The Pallett Creek paleoseismic record occupies a keystone position in most attempts to develop rupture histories for the southern San Andreas fault. Previous estimates of earthquake ages at Pallett Creek were determined by decay-counting radiocarbon methods. That method requires large samples, which can lead to unaccounted sources of uncertainty in radiocarbon ages because of the heterogeneous composition of organic layers. In contrast, accelerator mass spectrometry (AMS) radiocarbon dates may be obtained from small samples that have known carbon sources and also allow for a more complete sampling of the section. We present 65 new AMS radiocarbon dates that span nine ground-rupturing earthquakes at Pallett Creek. Overall, the AMS dates are similar to and reveal no dramatic bias in the conventional dates. For many layers, however, individual charcoal samples were younger than the conventional dates, leading to earthquake ages that are overall slightly younger than previously reported. New earthquake ages are determined by Bayesian refinement of the layer ages based on stratigraphic ordering and sedimentological constraints. The new chronology is more regular than previously published records, in large part due to new samples constraining the age of event R. The closed interval from event C to 1857 has a mean recurrence of 135 years (σ = 83.2 years) and a quasiperiodic coefficient of variation (COV) of 0.61. We show that the new dates and resultant earthquake chronology have a stronger effect on COV than the specific membership of this long series and dating precision improvements from sedimentation rates. Copyright 2011 by the American Geophysical Union.
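
The recurrence statistics quoted above (mean recurrence, and the coefficient of variation COV = standard deviation of inter-event times divided by their mean) are straightforward to compute from any earthquake chronology. A minimal sketch with invented event dates, not the Pallett Creek data:

```python
import statistics

# COV = stdev(intervals) / mean(intervals); COV << 1 indicates quasiperiodic
# recurrence, while COV ~ 1 is consistent with a Poisson process.
event_years = [1100, 1210, 1360, 1480, 1550, 1700, 1857]   # hypothetical dates
intervals = [b - a for a, b in zip(event_years, event_years[1:])]

mean_recurrence = statistics.mean(intervals)
cov = statistics.stdev(intervals) / mean_recurrence
print(f"mean recurrence {mean_recurrence:.0f} yr, COV {cov:.2f}")
```

Bayesian refinement as described in the abstract narrows the layer-age distributions first; the statistics above are then computed over samples drawn from the refined event ages.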

  12. Where was the 1898 Mare Island Earthquake? Insights from the 2014 South Napa Earthquake

    Science.gov (United States)

    Hough, S. E.

    2014-12-01

    The 2014 South Napa earthquake provides an opportunity to reconsider the Mare Island earthquake of 31 March 1898, which caused severe damage to buildings at a Navy yard on the island. Revisiting archival accounts of the 1898 earthquake, I estimate a lower intensity magnitude, 5.8, than the value in the current Uniform California Earthquake Rupture Forecast (UCERF) catalog (6.4). However, I note that intensity magnitude can differ from Mw by upwards of half a unit depending on stress drop, which for a historical earthquake is unknowable. In the aftermath of the 2014 earthquake, there has been speculation that the apparently severe effects on Mare Island in 1898 were due to the vulnerability of local structures. No surface rupture has ever been identified from the 1898 event, which is commonly associated with the Hayward-Rodgers Creek fault system, some 10 km west of Mare Island (e.g., Parsons et al., 2003). Reconsideration of detailed archival accounts of the 1898 earthquake, together with a comparison of the intensity distributions for the two earthquakes, points to genuinely severe, likely near-field ground motions on Mare Island. The 2014 earthquake did cause significant damage to older brick buildings on Mare Island, but the level of damage does not match the severity of documented damage in 1898. The high-intensity fields of the two earthquakes are, moreover, spatially shifted, with the centroid of the 2014 distribution near the town of Napa and that of the 1898 distribution near Mare Island, east of the Hayward-Rodgers Creek system. I conclude that the 1898 Mare Island earthquake was centered on or near Mare Island, possibly involving rupture of one or both strands of the Franklin fault, a low-slip-rate fault subparallel to the Rodgers Creek fault to the west and the West Napa fault to the east. I estimate Mw 5.8 assuming an average stress drop; the data are also consistent with Mw 6.4 if the stress drop was a factor of ≈3 lower than average for California earthquakes. I

  13. Radioactive waste shipments to Hanford Retrievable Storage from the General Electric Vallecitos Nuclear Center, Pleasanton, California

    International Nuclear Information System (INIS)

    Vejvoda, E.J.; Pottmeyer, J.A.; DeLorenzo, D.S.; Weyns-Rollosson, M.I.; Duncan, D.R.

    1993-10-01

    During the next two decades the transuranic (TRU) wastes now stored in the burial trenches and storage facilities at the Hanford Site are to be retrieved, processed at the Waste Receiving and Processing Facility, and shipped to the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico for final disposal. Approximately 3.8% of the TRU waste to be retrieved for shipment to WIPP was generated at the General Electric (GE) Vallecitos Nuclear Center (VNC) in Pleasanton, California and shipped to the Hanford Site for storage. The purpose of this report is to characterize these radioactive solid wastes using process knowledge, existing records, and oral history interviews. The waste was generated almost exclusively from the activities of the Plutonium Fuels Development Laboratory and the Plutonium Analytical Laboratory. Section 2.0 provides further details of the VNC physical plant, facility operations, facility history, and current status. The solid radioactive wastes were associated with two US Atomic Energy Commission/US Department of Energy reactor programs: the Fast Ceramic Reactor (FCR) program and the Fast Flux Test Reactor (FFTR) program. These programs involved the fabrication and testing of fuel assemblies that utilized plutonium in an oxide form. The types and estimated quantities of waste resulting from these programs are discussed in detail in Section 3.0. A detailed discussion of the packaging and handling procedures used for the VNC radioactive wastes shipped to the Hanford Site is provided in Section 4.0. Section 5.0 provides an in-depth look at this waste, including the following: weight and volume of the waste, container types and numbers, physical description of the waste, radiological components, hazardous constituents, and current storage/disposal locations.

  14. Radioactive waste shipments to Hanford Retrievable Storage from the General Electric Vallecitos Nuclear Center, Pleasanton, California

    Energy Technology Data Exchange (ETDEWEB)

    Vejvoda, E.J.; Pottmeyer, J.A.; DeLorenzo, D.S.; Weyns-Rollosson, M.I. [Los Alamos Technical Associates, Inc., NM (United States); Duncan, D.R. [Westinghouse Hanford Co., Richland, WA (United States)

    1993-10-01

    During the next two decades the transuranic (TRU) wastes now stored in the burial trenches and storage facilities at the Hanford Site are to be retrieved, processed at the Waste Receiving and Processing Facility, and shipped to the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico for final disposal. Approximately 3.8% of the TRU waste to be retrieved for shipment to WIPP was generated at the General Electric (GE) Vallecitos Nuclear Center (VNC) in Pleasanton, California and shipped to the Hanford Site for storage. The purpose of this report is to characterize these radioactive solid wastes using process knowledge, existing records, and oral history interviews. The waste was generated almost exclusively from the activities of the Plutonium Fuels Development Laboratory and the Plutonium Analytical Laboratory. Section 2.0 provides further details of the VNC physical plant, facility operations, facility history, and current status. The solid radioactive wastes were associated with two US Atomic Energy Commission/US Department of Energy reactor programs: the Fast Ceramic Reactor (FCR) program and the Fast Flux Test Reactor (FFTR) program. These programs involved the fabrication and testing of fuel assemblies that utilized plutonium in an oxide form. The types and estimated quantities of waste resulting from these programs are discussed in detail in Section 3.0. A detailed discussion of the packaging and handling procedures used for the VNC radioactive wastes shipped to the Hanford Site is provided in Section 4.0. Section 5.0 provides an in-depth look at this waste, including the following: weight and volume of the waste, container types and numbers, physical description of the waste, radiological components, hazardous constituents, and current storage/disposal locations.

  15. Soil-gas Radon Emanation in Active Hydrothermal Areas at Lassen Volcanic Center, Northern California

    Science.gov (United States)

    Chan, T.; Ararso, I.; Yanez, F.; Swamy, V.; Brandon, J.; Bartelt, E.; Cuff, K. E.

    2004-12-01

    Located along the Southern Cascade Range in Northern California, the Lassen Volcanic Center is one of the youngest major Cascade volcanoes. Aside from Mount Saint Helens, Lassen is the only Cascade volcano to erupt in the 20th century. In an effort to assess outgassing in and around Lassen, and to provide information that will contribute to a better understanding of its hydrothermal system, we have conducted detailed soil-gas radon emanation surveys in several active hydrothermal areas, which possess bubbling mud pots, steaming fumaroles, and boiling hot springs. Dozens of measurements have been made in each of these areas, which are then used to create maps that indicate areas of high outgassing. These maps are then employed to assess the degree to which volcanic and other gases are currently being emitted at Lassen, as well as to investigate patterns associated with these emissions. The mean of measurements made in a specific survey area is considered to represent the average radon flux in that area. Individual values exceeding the mean plus one standard deviation are considered to represent anomalously high emanation, while values less than the mean minus one standard deviation represent anomalously low emanation. Based on preliminary analysis of data collected so far, significant outgassing occurs along well-defined, northwest-southeast trending elongate zones in several areas. Values obtained in these zones are as much as three times background radon flux. These zones are believed to contain fractures that act as pathways for migrating gases. The results of studies conducted thus far indicate that further emanation surveys will generate very useful information.
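
The anomaly criterion described above (readings more than one standard deviation above or below the survey-area mean) can be sketched directly. The readings below are invented radon flux values, not Lassen survey data:

```python
import statistics

# Flag anomalously high/low radon flux readings relative to the survey mean,
# following the mean +/- one standard deviation criterion described above.
readings = [12.0, 14.5, 13.2, 11.8, 40.1, 12.9, 13.7, 3.2, 14.1, 12.4]

mean = statistics.mean(readings)
sd = statistics.pstdev(readings)                # population standard deviation
high = [r for r in readings if r > mean + sd]   # anomalously high emanation
low = [r for r in readings if r < mean - sd]    # anomalously low emanation
print("high anomalies:", high)  # [40.1]
print("low anomalies:", low)    # [3.2]
```

Mapping the flagged stations then outlines the elongate high-outgassing zones the survey describes.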

  16. Spatial variations in fault friction related to lithology from rupture and afterslip of the 2014 South Napa, California, earthquake

    Science.gov (United States)

    Floyd, Michael; Walters, Richard; Elliot, John; Funning, Gareth J.; Svarc, Jerry L.; Murray, Jessica R.; Hooper, Andy; Larsen, Yngvar; Marinkovic, Petar; Bürgmann, Roland; Johanson, Ingrid; Wright, Tim

    2016-01-01

    Following earthquakes, faults are often observed to continue slipping aseismically. It has been proposed that this afterslip occurs on parts of the fault with rate-strengthening friction that are stressed by the mainshock, but our understanding has been limited by a lack of immediate, high-resolution observations. Here we show that the behavior of afterslip following the 2014 South Napa earthquake varied over distances of only a few kilometers. This variability cannot be explained by coseismic stress changes alone. We present daily positions from continuous and survey GPS sites that we re-measured within 12 hours of the mainshock, and surface displacements from the new Sentinel-1 radar mission. This unique geodetic data set constrains the distribution and evolution of coseismic and postseismic fault slip with exceptional resolution in space and time. We suggest that the observed heterogeneity in behavior is caused by lithological controls on the frictional properties of the fault plane.

  17. Experimental study of structural response to earthquakes

    International Nuclear Information System (INIS)

    Clough, R.W.; Bertero, V.V.; Bouwkamp, J.G.; Popov, E.P.

    1975-01-01

    The objectives, methods, and some of the principal results obtained from experimental studies of the behavior of structures subjected to earthquakes are described. Although such investigations are being conducted in many laboratories throughout the world, the information presented deals specifically with projects being carried out at the Earthquake Engineering Research Center (EERC) of the University of California, Berkeley. A primary purpose of these investigations is to obtain detailed information on the inelastic response mechanisms in typical structural systems so that the experimentally observed performance can be compared with computer-generated analytical predictions. Only by such comparisons can the mathematical models used in dynamic nonlinear analyses be verified and improved. Two experimental procedures for investigating earthquake structural response are discussed: the earthquake simulator facility, which subjects the base of the test structure to acceleration histories similar to those recorded in actual earthquakes, and systems of hydraulic rams, which impose specified displacement histories on the test components, equivalent to motions developed in structures subjected to actual earthquakes. The general concept and performance of the 20-ft-square EERC earthquake simulator is described, and the testing of a two-story concrete frame building is outlined. Correlation of the experimental results with analytical predictions demonstrates that satisfactory agreement can be obtained only if the mathematical model incorporates a stiffness-deterioration mechanism that simulates the cracking and other damage suffered by the structure.

  18. Earthquake clustering inferred from Pliocene Gilbert-type fan deltas in the Loreto basin, Baja California Sur, Mexico

    Science.gov (United States)

    Dorsey, Rebecca J.; Umhoefer, Paul J.; Falk, Peter D.

    1997-08-01

    A stacked sequence of Pliocene Gilbert-type fan deltas in the Loreto basin was shed from the footwall of the dextral-normal Loreto fault and deposited at the margin of a marine basin during rapid fault-controlled subsidence. Fan-delta parasequences coarsen upward from marine siltstone and sandstone at the base, through sandy bottomsets and gravelly foresets, to gravelly nonmarine topsets. Each topset unit is capped by a thin shell bed that records marine flooding of the delta plain. Several mechanisms may have produced repetitive vertical stacking of Gilbert deltas: (1) autocyclic delta-lobe switching; (2) eustatic sea-level fluctuations; (3) climatically controlled fluctuations in sediment input; and (4) episodic subsidence produced by temporal clustering of earthquakes. We favor hypothesis 4 for several reasons, but hypotheses 2 and 3 cannot be rejected at this time. Earthquake clustering can readily produce episodic subsidence at spatial and temporal scales consistent with stratigraphic trends observed in the Loreto basin. This model is supported by comparison with paleoseismological studies that document clustering on active faults over a wide range of time scales. Earthquake clustering is a new concept in basin analysis that may be helpful for understanding repetitive stratigraphy in tectonically active basins.

  19. On the resolution of shallow mantle viscosity structure using post-earthquake relaxation data: Application to the 1999 Hector Mine, California, earthquake

    Science.gov (United States)

    Pollitz, Fred F.; Thatcher, Wayne R.

    2010-01-01

    Most models of lower crust/mantle viscosity inferred from postearthquake relaxation assume one or two uniform-viscosity layers. A few existing models possess apparently significant radially variable viscosity structure in the shallow mantle (e.g., the upper 200 km), but the resolution of such variations is not clear. We use a geophysical inverse procedure to address the resolving power of inferred shallow mantle viscosity structure using postearthquake relaxation data. We apply this methodology to 9 years of GPS-constrained crustal motions after the 16 October 1999 M = 7.1 Hector Mine earthquake. After application of a differencing method to isolate the postearthquake signal from the “background” crustal velocity field, we find that surface velocities diminish from ∼20 mm/yr in the first few months to ≲2 mm/yr after 2 years. Viscoelastic relaxation of the mantle, with a time-dependent effective viscosity prescribed by a Burgers body, provides a good explanation for the postseismic crustal deformation, capturing both the spatial and temporal pattern. In the context of the Burgers body model (which involves a transient viscosity and steady state viscosity), a resolution analysis based on the singular value decomposition reveals that at most, two constraints on depth-dependent steady state mantle viscosity are provided by the present data set. Uppermost mantle viscosity (depth ≲ 60 km) is moderately resolved, but deeper viscosity structure is poorly resolved. The simplest model that explains the data better than that of uniform steady state mantle viscosity involves a linear gradient in logarithmic viscosity with depth, with a small increase from the Moho to 220 km depth. However, the viscosity increase is not statistically significant. This suggests that the depth-dependent steady state viscosity is not resolvably different from uniformity in the uppermost mantle.
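
The resolution argument above rests on the singular value spectrum of the data sensitivity matrix: the number of singular values above a noise-related cutoff bounds how many independent combinations of model parameters (here, viscosities in depth layers) the data can constrain. A sketch with a synthetic sensitivity matrix (all numbers hypothetical), in which the deep-layer columns are nearly collinear and therefore unresolvable:

```python
import numpy as np

# Synthetic "sensitivity" matrix G: 50 data, 6 viscosity layers. The first
# two columns are independent; the remaining columns are near-linear
# combinations of them, mimicking poorly resolved deep structure.
rng = np.random.default_rng(1)
n_data, n_layers = 50, 6
base = rng.standard_normal((n_data, 2))
cols = [base[:, 0], base[:, 1]]
for _ in range(n_layers - 2):
    cols.append(base @ rng.standard_normal(2) + 1e-3 * rng.standard_normal(n_data))
G = np.column_stack(cols)

# Count singular values above a cutoff (here 1% of the largest); this is
# the number of parameter combinations the data meaningfully constrain.
s = np.linalg.svd(G, compute_uv=False)
cutoff = 1e-2 * s[0]
n_resolved = int(np.sum(s > cutoff))
print(f"resolvable parameter combinations: {n_resolved} of {n_layers}")
```

The abstract's conclusion, that at most two constraints on depth-dependent steady-state viscosity are available, corresponds to exactly this kind of count.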

  20. Natural magnetic disturbance fields, not precursors, preceding the Loma Prieta earthquake

    Science.gov (United States)

    Campbell, Wallace H.

    2009-05-01

    Available records of the magnetic indices Dst and ap together with standard observatory recordings of 1-min field levels were examined for the period preceding the earthquake of October 1989, centered near Loma Prieta, California. The magnetic records showed that the Fraser-Smith et al. (1990) report claiming the existence of a 100-s (ultralow frequency) geomagnetic field precursor signal at Corralitos, California, foretelling a nearby earthquake is not valid. My study shows that the Stanford ULF signal was not local but rather widespread throughout the western United States and, therefore, expected to be due to a coincidental geomagnetic solar-terrestrial disturbance field.

  1. Understanding Earthquakes

    Science.gov (United States)

    Davis, Amanda; Gray, Ron

    2018-01-01

    December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

  2. Earthquake Early Warning: Real-time Testing of an On-site Method Using Waveform Data from the Southern California Seismic Network

    Science.gov (United States)

    Solanki, K.; Hauksson, E.; Kanamori, H.; Wu, Y.; Heaton, T.; Boese, M.

    2007-12-01

    We have implemented an on-site early warning algorithm using the infrastructure of the Caltech/USGS Southern California Seismic Network (SCSN). We are evaluating the real-time performance of the software system and the algorithm for rapid assessment of earthquakes. In addition, we are interested in understanding what parts of the SCSN need to be improved to make early warning practical. Our EEW processing system is composed of many independent programs that process waveforms in real time. The code was generated using a software framework. The Pd (maximum displacement amplitude of the P wave during the first 3 sec) and Tau-c (a period parameter during the first 3 sec) values determined during the EEW processing are being forwarded to the California Integrated Seismic Network (CISN) web page for independent evaluation of the results. The on-site algorithm measures the amplitude of the P wave (Pd) and the frequency content of the P wave during the first three seconds (Tau-c). The Pd and Tau-c values make it possible to discriminate between a variety of events, such as large distant events, nearby small events, and potentially damaging nearby events. Pd can be used to infer the expected maximum ground shaking. The method relies on data from a single station, although it becomes more reliable if readings from several stations are associated. To eliminate false triggers from stations with high background noise levels, we have created a per-station Pd threshold configuration for the Pd/Tau-c algorithm. Appropriate Pd thresholds are calculated for each station based on information from the EEW logs. We have operated our EEW test system for about a year and recorded numerous earthquakes in the magnitude range from M3 to M5. Two recent examples are a M4.5 earthquake near Chatsworth and a M4.7 earthquake near Elsinore. 
In both cases, the Pd and Tau-c parameters were determined successfully within 10 to 20 sec of the arrival of the
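    The two single-station parameters have standard closed-form definitions: Pd is the peak absolute displacement in the first 3 s after the P arrival, and tau-c is 2π divided by the square root of the ratio of integrated squared velocity to integrated squared displacement over the same window. A minimal sketch on a synthetic waveform (not the SCSN implementation, which also detrends and filters):

```python
import numpy as np

def pd_tauc(u, dt, window=3.0):
    """Peak displacement Pd and period parameter tau-c from a displacement
    record u (meters) starting at the P arrival, sampled every dt seconds."""
    n = int(window / dt)
    u = np.asarray(u[:n], dtype=float)
    v = np.gradient(u, dt)                   # velocity by finite difference
    pd = float(np.max(np.abs(u)))            # peak displacement in window
    r = np.sum(v ** 2) / np.sum(u ** 2)      # velocity/displacement power ratio
    tau_c = 2.0 * np.pi / np.sqrt(r)         # dominant period estimate
    return pd, tau_c

# Synthetic 1 Hz, 0.1 mm displacement pulse: tau-c should recover ~1 s.
dt = 0.01
t = np.arange(0.0, 3.0, dt)
u = 1e-4 * np.sin(2.0 * np.pi * t)
pd, tau_c = pd_tauc(u, dt)
print(f"Pd = {pd:.2e} m, tau-c = {tau_c:.2f} s")
```

    A large Pd with a long tau-c suggests a nearby, potentially damaging event, which is the discrimination logic described in the abstract.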

  3. Solar-B E/PO Program at Chabot Space and Science Center, Oakland, California

    Science.gov (United States)

    Burress, B. S.

    2005-05-01

    Chabot Space and Science Center in Oakland, California, conducts the Education/Public Outreach program for the Lockheed-Martin Solar and Astrophysics Lab Solar-B Focal Plane Package project. Since opening its doors in August 2000, Chabot has carried out this program in activities and educational products in the public outreach, informal education, and formal education spheres. We propose a poster presentation that illustrates the spectrum of our Solar-B E/PO program. Solar-B, scheduled to launch in September 2006, is another step in an increasingly sophisticated investigation and understanding of our Sun, its behavior, and its effects on the Earth and our technological civilization. A mission of the Japan Aerospace Exploration Agency (JAXA), Solar-B is an international collaboration between Japan, the US/NASA, and the UK/PPARC. Solar-B's main optical telescope, extreme ultraviolet imaging spectrometer, and X-ray telescope will collect data on the Sun's magnetic dynamics from the photosphere through the corona at higher spatial and time resolution than current and previous solar satellite missions, furthering our understanding of the Sun's behavior and, ultimately, its effects on the Earth. Chabot's E/PO program for the Lockheed-Martin Solar-B Focal Plane Package is multi-faceted, including elements focused on technology/engineering, solar physics, and Sun-Earth Connection themes. In the Public Outreach arena, we conduct events surrounding NASA Sun-Earth Day themes and programs and other live and/or interactive events, facilitate live solar viewing, and present a series of exhibits focused on Solar-B and other space-based missions, the dynamic Sun, and light and optics. In the Informal Education sector we run a solar day camp for kids and produce educational products, including a poster on the Solar-B mission and CD-ROM multimedia packages. 
In Formal Education, we develop classroom curriculum guides and conduct workshops training teachers in their implementation

  4. Federal Labs and Research Centers Benefiting California: 2017 Impact Report for State Leaders.

    Energy Technology Data Exchange (ETDEWEB)

    Koning, Patricia Brady [Sandia National Lab. (SNL-CA), Livermore, CA (United States)]

    2017-12-01

    Sandia National Laboratories is the largest of the Department of Energy national laboratories, with more than 13,000 staff spread across its two main campuses in New Mexico and California. For more than 60 years, the Sandia National Laboratories campus in Livermore, California, has delivered cutting-edge science and technology solutions to resolve the nation’s most challenging and complex problems. As a multidisciplinary laboratory, Sandia draws from virtually every science and engineering discipline to address challenges in energy, homeland security, cybersecurity, climate, and biosecurity. Today, collaboration is vital to ensuring that the Lab stays at the forefront of science and technology innovation. Partnerships with industry, state and local governments, and California universities help drive innovation and economic growth in the region. Sandia contributed to California’s regional and statewide economy with more than $145 million in contracts to California companies, $92 million of which went to California small businesses. In addition, Sandia engages the community directly by running robust STEM education programs for local schools and administering community giving programs. Meanwhile, investments like the Livermore Valley Open Campus (LVOC), an innovation hub supported by LLNL and Sandia, help catalyze the local economy.

  5. Source Functions and Path Effects from Earthquakes in the Farallon Transform Fault Region, Gulf of California, Mexico that Occurred on October 2013

    Science.gov (United States)

    Castro, Raúl R.; Stock, Joann M.; Hauksson, Egill; Clayton, Robert W.

    2017-06-01

    We determined source spectral functions, Q, and site effects using regional records of body waves from the October 19, 2013 (Mw = 6.6) earthquake and eight aftershocks located 90 km east of Loreto, Baja California Sur, Mexico. We also analyzed records from a foreshock with magnitude 3.3 that occurred 47 days before the mainshock. The epicenters of this sequence are located in the south-central region of the Gulf of California (GoC), near and on the Farallon transform fault. This is one of the most active regions of the GoC, where most of the large earthquakes have strike-slip mechanisms. Based on the distribution of the aftershocks, the rupture propagated northwest with a rupture length of approximately 27 km. We calculated 3-component P- and S-wave spectra from ten events recorded by eleven stations of the Broadband Seismological Network of the GoC (RESBAN). These stations are located around the GoC and provide good azimuthal coverage (the average station gap is 39°). The spectral records were corrected for site effects, which were estimated by calculating average spectral ratios between horizontal and vertical components (HVSR method). The site-corrected spectra were then inverted to determine the source functions and to estimate the attenuation quality factor Q. The values of Q resulting from the spectral inversion can be approximated by the relations Q_P = (48.1 ± 1.1)f^(0.88 ± 0.04) and Q_S = (135.4 ± 1.1)f^(0.58 ± 0.03) and are consistent with previous estimates reported by Vidales-Basurto et al. (Bull Seism Soc Am 104:2027-2042, 2014) for the south-central GoC. The stress drop estimates, obtained using the ω² model, are below 1.7 MPa, with the highest stress drops determined for the mainshock and the aftershocks located in the ridge zone. We used the values of Q obtained to recalculate source and site effects with a different spectral inversion scheme. We found that sites with low S-wave amplification also tend to have low P-wave amplification.
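    The fitted power laws can be evaluated directly at any frequency of interest. A quick sketch using the central values quoted in the abstract (the published uncertainties are omitted here):

```python
# Frequency-dependent quality factors reported in the abstract
# (central values only): Q_P(f) = 48.1 f^0.88, Q_S(f) = 135.4 f^0.58.
def q_p(f):
    return 48.1 * f ** 0.88

def q_s(f):
    return 135.4 * f ** 0.58

for f in (1.0, 5.0, 10.0):
    print(f"f = {f:4.1f} Hz  Q_P = {q_p(f):6.1f}  Q_S = {q_s(f):6.1f}")
```

    At 1 Hz the relations return the multiplicative constants themselves; Q_P grows faster with frequency than Q_S because of its larger exponent.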

  6. Prevalences of zoonotic bacteria among seabirds in rehabilitation centers along the Pacific Coast of California and Washington, USA.

    Science.gov (United States)

    Steele, Christine M; Brown, Richard N; Botzler, Richard G

    2005-10-01

    Many seabirds are rehabilitated annually by wildlife rehabilitation centers along the Pacific Coast, USA. Although various strains of zoonotic bacteria have been isolated from seabirds, risks to rehabilitators at these centers have not been well documented. From November 2001 through January 2003, we determined the prevalence of detectable enteric fauna by isolation and characterization of Gram-negative bacteria from cloacal swabs taken from 26 common murres (Uria aalge), 49 gulls (Larus spp.), and 14 other seabirds treated by rehabilitators in California and Washington (USA). At least 25 bacterial species were identified, including multiple strains of Escherichia coli, as well as Enterobacter cloacae, Citrobacter freundii, and Klebsiella pneumoniae. Antibiotic resistance was found in 13 of 19 bacterial isolates tested, including E. coli, K. pneumoniae, Acinetobacter baumannii, and Pseudomonas aeruginosa. Potential transfer of these bacteria poses a risk to wildlife rehabilitators and to seabirds in these centers, as well as to free-ranging birds.

  7. Implications of the World Trade Center Health Program (WTCHP) for the public health response to the Great East Japan Earthquake

    International Nuclear Information System (INIS)

    Crane, Michael A.; Cho, Hyunje G.; Landrigan, Phillip J.

    2014-01-01

    The attacks on the World Trade Center (WTC) on September 11, 2001 resulted in a serious burden of physical and mental illness for the 50,000 rescue workers that responded to 9/11 as well as the 400,000 residents and workers in the surrounding areas of New York City. The Zadroga Act of 2010 established the WTC Health Program (WTCHP) to provide monitoring and treatment of WTC exposure-related conditions and health surveillance for the responder and survivor populations. Several reports have highlighted the applicability of insights gained from the WTCHP to the public health response to the Great East Japan Earthquake. Optimal exposure monitoring processes and attention to the welfare of vulnerable exposed sub-groups are critical aspects of the response to both incidents. The ongoing mental health care concerns of 9/11 patients accentuate the need for accessible and appropriately skilled mental health care in Fukushima. Active efforts to demonstrate transparency and to promote community involvement in the public health response will be highly important in establishing successful long-term monitoring and treatment programs for the exposed populations in Fukushima. (author)

  8. Implications of the World Trade Center Health Program (WTCHP) for the public health response to the Great East Japan Earthquake.

    Science.gov (United States)

    Crane, Michael A; Cho, Hyunje G; Landrigan, Phillip J

    2014-01-01

    The attacks on the World Trade Center (WTC) on September 11, 2001 resulted in a serious burden of physical and mental illness for the 50,000 rescue workers that responded to 9/11 as well as the 400,000 residents and workers in the surrounding areas of New York City. The Zadroga Act of 2010 established the WTC Health Program (WTCHP) to provide monitoring and treatment of WTC exposure-related conditions and health surveillance for the responder and survivor populations. Several reports have highlighted the applicability of insights gained from the WTCHP to the public health response to the Great East Japan Earthquake. Optimal exposure monitoring processes and attention to the welfare of vulnerable exposed sub-groups are critical aspects of the response to both incidents. The ongoing mental health care concerns of 9/11 patients accentuate the need for accessible and appropriately skilled mental health care in Fukushima. Active efforts to demonstrate transparency and to promote community involvement in the public health response will be highly important in establishing successful long-term monitoring and treatment programs for the exposed populations in Fukushima.

  9. Slip Rates, Recurrence Intervals and Earthquake Event Magnitudes for the southern Black Mountains Fault Zone, southern Death Valley, California

    Science.gov (United States)

    Fronterhouse Sohn, M.; Knott, J. R.; Bowman, D. D.

    2005-12-01

    The normal-oblique Black Mountains Fault zone (BMFZ) is part of the Death Valley fault system. Strong ground motion generated by earthquakes on the BMFZ poses a serious threat to the Las Vegas, NV area (pop. ~1,428,690), Death Valley National Park (max. pop. ~20,000), and Pahrump, NV (pop. 30,000). Fault scarps offset Holocene alluvial-fan deposits along most of the 80-km length of the BMFZ. However, slip rates, recurrence intervals, and event magnitudes for the BMFZ are poorly constrained due to a lack of age control. Also, Holocene scarp heights along the BMFZ range up to 6 m, suggesting that geomorphic sections have different earthquake histories. Along the southernmost section, the BMFZ steps basinward, preserving three post-late Pleistocene fault scarps. Surveys completed with a total station theodolite show scarp heights of 5.5, 5.0, and 2 meters offsetting the late Pleistocene, early to middle Holocene, and middle-late Holocene surfaces, respectively. Regression plots of vertical offset versus maximum scarp angle suggest event ages of <10 - 2 ka with a post-late Pleistocene slip rate of 0.1 mm/yr to 0.3 mm/yr and recurrence of <3300 years/event. Regression equations for the estimated geomorphically constrained rupture length of the southernmost section and surveyed event displacements provide estimated moment magnitudes (Mw) between 6.6 and 7.3 for the BMFZ.
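    The quoted slip-rate range follows from dividing the surveyed cumulative offset by bounding ages for the offset surface. A back-of-envelope check, where the age bounds are illustrative assumptions, not the authors' dating:

```python
# Cumulative vertical offset of the late Pleistocene surface (from the
# abstract) divided by assumed bounding surface ages gives a slip-rate
# range. The ages below are illustrative placeholders, not measured data.
offset_m = 5.5
age_bounds_yr = (20_000, 60_000)   # assumed late Pleistocene age bounds

rates = [offset_m / age * 1000 for age in age_bounds_yr]  # mm/yr
print(f"slip rate: {min(rates):.2f}-{max(rates):.2f} mm/yr")
```

    With these assumed ages the result brackets the 0.1-0.3 mm/yr range reported in the abstract.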

  10. Petrology and Geochemistry of Abandoned Spreading Center Lavas Off Baja California: Implications for Intraplate Magmatism in Eastern Pacific

    Science.gov (United States)

    Tian, L.; Castillo, P. R.; Lonsdale, P. F.

    2008-12-01

    Abundant volcanism at active spreading centers is caused by adiabatic decompression melting of the upper mantle, but the origin of volcanism at abandoned spreading centers is an enigma. Guadalupe Island and the Sara, Rosana, Rosa, and Nithya seamounts are volcanoes built on abandoned spreading centers between 26°N and 29°N in the eastern Pacific offshore Baja California. Lava samples from these volcanoes consist predominantly of mildly to moderately alkalic basalts and their differentiates. Relative to mid-ocean ridge basalts (MORB) from the East Pacific Rise (EPR), they have higher abundances of incompatible elements and higher highly/moderately incompatible trace element ratios (e.g., Ba/Zr ~1.3). These lavas have enriched chondrite-normalized rare earth element (REE) patterns, with light REE enrichment up to 300× chondrites. These trace element characteristics, combined with their moderately radiogenic Sr, Nd, and Pb isotopic compositions, indicate they originated from a geochemically enriched mantle source. In detail, the lavas have a moderate range of composition that overlaps with those of lavas from another spreading center (Davidson Seamount) and nearby seamounts (e.g., Pioneer, Rodriguez) offshore southern California and tholeiitic to alkalic seamounts near the EPR. Together, these intraplate lavas define a compositional continuum ranging from MORB-like to ocean island basalt (OIB)-like. In the case of abandoned spreading centers, the 87Sr/86Sr and 143Nd/144Nd compositions of Sara, Rosana, and Nithya seamount lavas greatly overlap with those of EPR seamount lavas, but those of Rosa seamount and Guadalupe Island lavas are within the HIMU field for OIB. Thus our results suggest that volcanism at abandoned spreading centers and intraplate volcanism in the eastern Pacific as a whole result from a complex interplay between mantle melting dynamics and lithospheric tectonic processes.

  11. Student Union: The Architecture and Social Design of Postwar Campus Community Centers in California

    Science.gov (United States)

    Robinson, Clare Montgomery

    2012-01-01

    This dissertation examines the architecture and social intent of Student Union buildings. The narrative reaches back to the first quarter of the twentieth century when students and college leaders in the Midwest and Northeast formed the Association of College Unions, but focuses on the postwar period in California when Student Unions became…

  12. Proceedings of the 11th United States-Japan natural resources panel for earthquake research, Napa Valley, California, November 16–18, 2016

    Science.gov (United States)

    Detweiler, Shane; Pollitz, Fred

    2017-10-18

    The UJNR Panel on Earthquake Research promotes advanced research toward a more fundamental understanding of the earthquake process and hazard estimation. The Eleventh Joint meeting was extremely beneficial in furthering cooperation and deepening understanding of problems common to both Japan and the United States. The meeting included productive exchanges of information on approaches to systematic observation and modeling of earthquake processes. Regarding the earthquake and tsunami of March 2011 off the Pacific coast of Tohoku and the 2016 Kumamoto earthquake sequence, the Panel recognizes that further efforts are necessary to achieve our common goal of reducing earthquake risk through close collaboration and focused discussions at the 12th UJNR meeting.

  13. A Case Study of Key Stakeholders' Perceptions of the Learning Center's Effectiveness for English Learners at a District in Central California

    Science.gov (United States)

    Nava, Norma Leticia

    2016-01-01

    This qualitative study explored stakeholders' (administrators, teachers, and parents) perspectives of English learners in the learning center, a response to intervention model, at a school district in Central California. Research existed concerning the yearly academic growth of students in a learning center, but there was a lack of knowledge about…

  14. Radiation Doses in Consecutive CT Examinations from Five University of California Medical Centers.

    Science.gov (United States)

    Smith-Bindman, Rebecca; Moghadassi, Michelle; Wilson, Nicole; Nelson, Thomas R; Boone, John M; Cagnon, Christopher H; Gould, Robert; Hall, David J; Krishnam, Mayil; Lamba, Ramit; McNitt-Gray, Michael; Seibert, Anthony; Miglioretti, Diana L

    2015-10-01

    To summarize data on computed tomographic (CT) radiation doses collected from consecutive CT examinations performed at 12 facilities that can contribute to the creation of reference levels. The study was approved by the institutional review boards of the collaborating institutions and was compliant with HIPAA. Radiation dose metrics were prospectively and electronically collected from 199 656 consecutive CT examinations in 83 181 adults and 3871 consecutive CT examinations in 2609 children at the five University of California medical centers during 2013. The median volume CT dose index (CTDIvol), dose-length product (DLP), and effective dose, along with the interquartile range (IQR), were calculated separately for adults and children and stratified according to anatomic region. Distributions for DLP and effective dose are reported for single-phase examinations, multiphase examinations, and all examinations. For adults, the median CTDIvol was 50 mGy (IQR, 37-62 mGy) for the head, 12 mGy (IQR, 7-17 mGy) for the chest, and 12 mGy (IQR, 8-17 mGy) for the abdomen. The median DLPs for single-phase, multiphase, and all examinations, respectively, were as follows: head, 880 mGy · cm (IQR, 640-1120 mGy · cm), 1550 mGy · cm (IQR, 1150-2130 mGy · cm), and 960 mGy · cm (IQR, 690-1300 mGy · cm); chest, 420 mGy · cm (IQR, 260-610 mGy · cm), 880 mGy · cm (IQR, 570-1430 mGy · cm), and 550 mGy · cm (IQR 320-830 mGy · cm); and abdomen, 580 mGy · cm (IQR, 360-860 mGy · cm), 1220 mGy · cm (IQR, 850-1790 mGy · cm), and 960 mGy · cm (IQR, 600-1460 mGy · cm). Median effective doses for single-phase, multiphase, and all examinations, respectively, were as follows: head, 2 mSv (IQR, 1-3 mSv), 4 mSv (IQR, 3-8 mSv), and 2 mSv (IQR, 2-3 mSv); chest, 9 mSv (IQR, 5-13 mSv), 18 mSv (IQR, 12-29 mSv), and 11 mSv (IQR, 6-18 mSv); and abdomen, 10 mSv (IQR, 6-16 mSv), 22 mSv (IQR, 15-32 mSv), and 17 mSv (IQR, 11-26 mSv). In general, values for children were approximately 50% those
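    Reference-level tables like those above summarize each dose metric by its median and interquartile range. A minimal sketch of that summary computation, using made-up CTDIvol values rather than the study's data:

```python
import statistics

def median_iqr(values):
    """Median and interquartile range (25th-75th percentile) of a dose
    metric; the summary statistics used in CT reference-level tables."""
    qs = statistics.quantiles(values, n=4, method="inclusive")
    return statistics.median(values), (qs[0], qs[2])

# Hypothetical CTDIvol values (mGy) for a handful of head CT exams.
doses = [44, 50, 62, 37, 55, 48, 60, 41]
med, (q1, q3) = median_iqr(doses)
print(f"median {med:.0f} mGy, IQR {q1:.0f}-{q3:.0f} mGy")
```

    Reporting the IQR alongside the median, as the study does, conveys the spread of facility practice without being skewed by rare very high-dose exams.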

  15. High-resolution seismic reflection/refraction imaging from Interstate 10 to Cherry Valley Boulevard, Cherry Valley, Riverside County, California: implications for water resources and earthquake hazards

    Science.gov (United States)

    Gandhok, G.; Catchings, R.D.; Goldman, M.R.; Horta, E.; Rymer, M.J.; Martin, P.; Christensen, A.

    1999-01-01

    This report is the second of two reports on seismic imaging investigations conducted by the U.S. Geological Survey (USGS) during the summers of 1997 and 1998 in the Cherry Valley area in California (Figure 1a). In the first report (Catchings et al., 1999), data and interpretations were presented for four seismic imaging profiles (CV-1, CV-2, CV-3, and CV-4) acquired during the summer of 1997. In this report, we present data and interpretations for three additional profiles (CV-5, CV-6, and CV-7) acquired during the summer of 1998 and the combined seismic images for all seven profiles. This report addresses both groundwater resources and earthquake hazards in the San Gorgonio Pass area because the shallow (upper few hundred meters) subsurface stratigraphy and structure affect both issues. The cities of Cherry Valley and Beaumont are located approximately 130 km (~80 miles) east of Los Angeles, California, along the southern alluvial fan of the San Bernardino Mountains (see Figure 1b). These cities are two of several small cities that are located within San Gorgonio Pass, a lower-lying area between the San Bernardino and the San Jacinto Mountains. Cherry Valley and Beaumont are desert cities with summer daytime temperatures often well above 100 °F. High water usage in the arid climate taxes the available groundwater supply in the region, increasing the need for efficient management of the groundwater resources. The USGS and the San Gorgonio Water District (SGWD) work cooperatively to evaluate the quantity and quality of the groundwater supply in the San Gorgonio Pass region. To better manage the water supplies within the District during wet and dry periods, the SGWD sought to develop a groundwater recharge program, whereby excess water would be stored in underground aquifers during wet periods (principally winter months) and retrieved during dry periods (principally summer months). 
The SGWD preferred a surface recharge approach because it could be less expensive than a

  16. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992), and fatality rates are likely to continue to rise with increasing population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that large fractions of the world's population still reside in informal, poorly constructed, non-engineered dwellings that have high susceptibility to collapse during earthquakes. Moreover, with increasing urbanization, half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use for other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the
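    The three-step casualty estimate described in the abstract can be sketched as exposure partitioned by building type, scaled by collapse probability and a fatality rate. All numbers below are made-up placeholders, not PAGER coefficients:

```python
# Illustrative sketch of the loss-aggregation logic: population exposed to
# a given shaking level is split among building types, and casualties
# scale with each type's collapse probability. Placeholder values only.
exposed_population = 100_000

# (fraction of population in each building type,
#  hypothetical collapse probability at the estimated shaking level)
building_stock = {
    "non_engineered_masonry": (0.50, 0.10),
    "reinforced_concrete":    (0.30, 0.02),
    "wood_frame":             (0.20, 0.005),
}
fatality_rate_given_collapse = 0.1   # assumed

casualties = sum(
    exposed_population * share * p_collapse * fatality_rate_given_collapse
    for share, p_collapse in building_stock.values()
)
print(f"estimated casualties: {casualties:.0f}")
```

    The structure makes clear why the inventory matters: the result is dominated by the share of population in collapse-prone construction, which is exactly what a global building inventory must pin down.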

  17. Long Return Periods for Earthquakes in San Gorgonio Pass and Implications for Large Ruptures of the San Andreas Fault in Southern California

    Science.gov (United States)

    Yule, J.; McBurnett, P.; Ramzan, S.

    2011-12-01

    The largest discontinuity in the surface trace of the San Andreas fault occurs in southern California at San Gorgonio Pass. Here, San Andreas motion moves through a 20 km-wide compressive stepover on the dextral-oblique-slip thrust system known as the San Gorgonio Pass fault zone. This thrust-dominated system is thought to rupture during very large San Andreas events that also involve strike-slip fault segments north and south of the Pass region. A wealth of paleoseismic data document that the San Andreas fault segments on either side of the Pass, in the San Bernardino/Mojave Desert and Coachella Valley regions, rupture on average every ~100 yrs and ~200 yrs, respectively. In contrast, we report here a notably longer return period for ruptures of the San Gorgonio Pass fault zone. For example, features exposed in trenches at the Cabezon site reveal that the most recent earthquake occurred 600-700 yrs ago (this and other ages reported here are constrained by C-14 calibrated ages from charcoal). The rupture at Cabezon broke a 10 m-wide zone of east-west striking thrusts and produced a >2 m-high scarp. Slip during this event is estimated to be >4.5 m. Evidence for a penultimate event was not uncovered but presumably lies beneath ~1000 yr-old strata at the base of the trenches. In Millard Canyon, 5 km to the west of Cabezon, the San Gorgonio Pass fault zone splits into two splays. The northern splay is expressed by 2.5 ± 0.7 m and 5.0 ± 0.7 m scarps in alluvial terraces constrained to be ~1300 and ~2500 yrs old, respectively. The scarp on the younger, low terrace postdates terrace abandonment ~1300 yrs ago and probably correlates with the 600-700 yr-old event at Cabezon, though we cannot rule out that a different event produced the northern Millard scarp. 
Trenches excavated in the low terrace reveal growth folding and secondary faulting and clear evidence for a penultimate event ~1350-1450 yrs ago, during alluvial deposition prior to the abandonment of the low terrace

  18. A precast concrete bridge bent designed to re-center after an earthquake : research report, October 2008.

    Science.gov (United States)

    2008-10-01

    In this study the post-earthquake residual displacements of reinforced concrete bridge bents were investigated. The system had mild steel that was intended to dissipate energy and an unbonded, post-tensioned tendon that was supposed to remain elastic...

  19. A precast concrete bridge bent designed to re-center after an earthquake : draft research report, August 2008.

    Science.gov (United States)

    2008-08-01

    In this study the post-earthquake residual displacements of reinforced concrete bridge bents were investigated. The system had mild steel that was intended to dissipate energy and an unbonded, post-tensioned tendon that was supposed to remain elastic...

  20. Novato Center Regulatory Permit Application by Novato Center Inc., Marin County, California, Public Notice 10138-33R.

    Science.gov (United States)

    1980-11-01

    no technical evaluation of the strength of the Novato Creek levees. However, it appears that because of their age, settlement, and probable method of...Shopping Center in San Rafael, can vary in size from 300,000 to over 1,000,000 square feet of retail area. The strength of regional centers is dependent...

  1. Final Report Feasibility Study for the California Wave Energy Test Center (CalWave) - Volume #2 - Appendices #16-17

    Energy Technology Data Exchange (ETDEWEB)

    Dooher, Brendan [Pacific Gas and Electric Company, San Ramon, CA (United States). Applied Technical Services]; Toman, William I. [California Polytechnic State Univ. (CalPoly), San Luis Obispo, CA (United States). Inst. of Advanced Technology and Public Policy]; Davy, Doug M. [CH2M Hill Engineers, Inc., Sacramento, CA (United States)]; Blakslee, Samuel N. [California Polytechnic State Univ. (CalPoly), San Luis Obispo, CA (United States)]

    2017-07-31

    The California Wave Energy Test Center (CalWave) Feasibility Study project was funded over multiple phases by the Department of Energy to perform an interdisciplinary feasibility assessment analyzing the engineering, permitting, and stakeholder requirements for establishing an open-water, fully energetic, grid-connected wave energy test center off the coast of California, for the purpose of advancing U.S. wave energy research, development, and testing capabilities. Work under this grant included wave energy resource characterization; grid impact and interconnection requirements; port infrastructure and maritime industry capability/suitability to accommodate the industry at research, demonstration, and commercial scale; and macro- and micro-siting considerations. CalWave Phase I performed a macro-siting and down-selection process focusing on two potential test sites in California: Humboldt Bay and Vandenberg Air Force Base. This work resulted in the Vandenberg Air Force Base site being chosen as the most favorable site based on a peer-reviewed criteria matrix. CalWave Phase II focused on four siting location alternatives along the Vandenberg Air Force Base coastline and culminated in a final siting down-selection. Key outcomes from this work include completion of preliminary engineering and systems integration work, a robust turnkey cost estimate, a shoreside and subsea hazards assessment, a storm wave analysis, lessons-learned reports from several maritime disciplines, benchmarking of the test center against existing international test sites, analysis of the existing applicable environmental literature, completion of a preliminary regulatory, permitting, and licensing roadmap, robust interaction and engagement with state and federal regulatory agency personnel and local stakeholders, and population of a draft Federal Energy Regulatory Commission (FERC) Preliminary Application Document (PAD). Analysis of existing offshore oil and gas infrastructure was also performed.

  2. The Pulse Azimuth effect as seen in induction coil magnetometers located in California and Peru 2007–2010, and its possible association with earthquakes

    Directory of Open Access Journals (Sweden)

    J. C. Dunson

    2011-07-01

    The QuakeFinder network of magnetometers has recorded geomagnetic field activity in California since 2000. Established to follow up on Stanford University's observations of ULF activity before and after the M = 7.1 Loma Prieta earthquake in 1989, the QuakeFinder network has over 50 sites, fifteen of which are high-resolution QF1005 and QF1007 systems. Pairs of high-resolution sites have also been installed in Peru and Taiwan.

    Increases in pulse activity preceding nearby seismic events are followed by decreases in activity afterwards in the three cases discussed here. In addition, longer-term data are shown, revealing a rich signal structure not previously reported in QuakeFinder data or by other authors studying pre-seismic ULF phenomena. These pulses occur as separate ensembles, with demonstrable repeatability and uniqueness across a number of properties such as waveform, angle of arrival, amplitude, and duration. Yet they appear to arrive with exponentially distributed inter-arrival times, which indicates a Poisson process rather than a periodic process.
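
    The Poisson argument above rests on the inter-arrival times being exponentially distributed. A stdlib-only sketch of one quick diagnostic, the coefficient of variation (CV) of the gaps, which is near 1 for a Poisson pulse train and near 0 for a periodic one; the pulse times below are synthetic stand-ins, not QuakeFinder data:

```python
import random
import statistics

def interarrival_cv(arrival_times):
    """Coefficient of variation of inter-arrival times.

    For a homogeneous Poisson process the waiting times are exponential,
    so the CV (std/mean) is close to 1; a periodic train gives CV near 0.
    """
    gaps = [t2 - t1 for t1, t2 in zip(arrival_times, arrival_times[1:])]
    return statistics.pstdev(gaps) / statistics.mean(gaps)

random.seed(42)
# Synthetic Poisson-like pulse train: exponential gaps with unit mean.
t, poisson_times = 0.0, []
for _ in range(5000):
    t += random.expovariate(1.0)
    poisson_times.append(t)

periodic_times = [i * 1.0 for i in range(5000)]  # strictly periodic train

print(round(interarrival_cv(poisson_times), 2))   # near 1.0
print(round(interarrival_cv(periodic_times), 2))  # 0.0
```

    A fuller treatment would use a Kolmogorov-Smirnov test against the exponential distribution, but the CV already separates the two hypotheses the abstract contrasts.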

    These pulses were observed using three-axis induction coil magnetometers buried 1–2 m below the surface of the Earth. Our sites use a Nyquist frequency of 16 Hz (25 Hz for the new QF1007 units), and they record these pulses at amplitudes from 0.1 to 20 nanotesla with durations of 0.1 to 12 s. They are predominantly unipolar pulses, which may imply charge migration, and they are stronger in the two horizontal (north-south and east-west) channels than in the vertical channel. Pulses have been seen to occur in bursts lasting many hours. The pulses have large amplitudes, and study of the three-axis data shows that the amplitude ratios of pulses taken from pairs of orthogonal coils are stable across the bursts, suggesting a common source.

    This paper presents three

  3. SEISMIC PICTURE OF A FAULT ZONE. WHAT CAN BE GAINED FROM THE ANALYSIS OF FINE PATTERNS OF SPATIAL DISTRIBUTION OF WEAK EARTHQUAKE CENTERS?

    Directory of Open Access Journals (Sweden)

    Gevorg G. Kocharyan

    2010-01-01

    Association of earthquake hypocenters with fault zones appears more pronounced when the positions of the earthquakes are determined more accurately. For complex, branched structures of major fault zones, it is assumed that some of the earthquakes occur on smaller-scale feathering fractures. It is thus possible to develop a «seismological» criterion for defining the zone of dynamic influence of a fault, i.e. the zone containing the majority of earthquakes associated with the fault zone under consideration. In this publication, seismogenic structures of several fault zones located in the San Andreas fault system are reviewed. Based on data from a very dense network of digital seismic stations installed in this region, and with application of modern data-processing methods, differential coordinates of microearthquakes can be determined with errors of a few tens of meters. It is thus possible to precisely detect the boundaries of areas in which active deformation occurs and to reveal spatial patterns of seismic event localization. In our analyses, data from the most comprehensive seismic catalog were used; the catalog covers events that occurred and were registered in northern California between January 1984 and May 2003. The seismic data-processing results and the regularities revealed by the analyses are compared with data from studies of fault structures and with modeling and numerical simulation results. Results of quantitative research into the localization of seismic sources inside fault zones are presented. 3D models demonstrate that seismic events are localized in the vicinity of an almost planar surface with a nearly constant dip angle, with the majority of events concentrated at that conventional surface. Detection of typical scales of seismicity localization may prove critical for solving problems of technogenic impact on fault zones.
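
    The localization of hypocenters about a nearly planar surface can be quantified with a simple least-squares plane fit; the RMS misfit is a proxy for the width of the seismically active zone. A stdlib-only sketch on synthetic hypocenters (not the relocated catalog used in the paper):

```python
import random

def plane_fit_rms(points):
    """Fit z = a*x + b*y + c to (x, y, z) hypocenters by least squares and
    return (a, b, c, rms), where rms is the vertical misfit, a proxy for
    the thickness of the zone in which seismicity is localized."""
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0] ** 2 for p in points); syy = sum(p[1] ** 2 for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sxz = sum(p[0] * p[2] for p in points); syz = sum(p[1] * p[2] for p in points)
    # Solve the 3x3 normal equations with Cramer's rule.
    m = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    r = [sxz, syz, sz]
    def det3(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
              - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
              + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det3(m)
    coeffs = []
    for j in range(3):
        mj = [row[:] for row in m]
        for i in range(3):
            mj[i][j] = r[i]
        coeffs.append(det3(mj) / d)
    a, b, c = coeffs
    rms = (sum((p[2] - (a * p[0] + b * p[1] + c)) ** 2 for p in points) / n) ** 0.5
    return a, b, c, rms

random.seed(5)
# Synthetic hypocenters scattered ~50 m about a gently dipping plane (km units).
pts = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(500)]
pts = [(x, y, 0.3 * x + 5.0 + random.gauss(0, 0.05)) for x, y in pts]
a, b, c, rms = plane_fit_rms(pts)
print(round(a, 2), round(rms, 3))  # slope ≈ 0.3, width ≈ 0.05 km
```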

  4. The Role of Dynamic Strain in the Near-Source Aftershock Distribution of the 2014, Mw 6.0, Napa (California) Earthquake

    Science.gov (United States)

    Emolo, A.; De Matteis, R.; Convertito, V.

    2015-12-01

    The 2014 Napa earthquake ruptured a right-lateral strike-slip fault. About 400 aftershocks occurred, mainly at near-source distances, in the two months after the earthquake. They mostly occurred between 8 and 11 km depth, within an area of about 10 km2 trending north-northwest from the mainshock hypocenter. However, the aftershock distribution was not able to constrain the mainshock fault plane. Since Parsons et al. (2014) have shown that Coulomb static stress change does not completely explain the near-source aftershock distribution, we explore whether dynamic strain transfer, enhanced by source directivity, contributed to triggering the aftershock sequence. Dynamic strain-transfer triggering attributes enhanced failure probabilities to increased shear stresses or strains, to permeability changes, and possibly to fault weakening. In this respect, we observe that a single inverse power law fits the decay of aftershock density as a function of distance from the fault plane, suggesting that dynamic stress/strain might have played a role in aftershock triggering. To test this hypothesis, we used peak ground velocities (PGVs) as a proxy for the peak dynamic strain/stress field, accounting for both fault finiteness and source directivity. We first use a point source to retrieve the best parameters of the directivity function from inversion of the PGVs. Next, the same PGVs are used to jointly infer the surface fault projection and the dominant horizontal rupture direction. Finally, we map the peak dynamic strain/stress, modified by source geometry and directivity, to resolve the relationship between the aftershock locations and the areas of large dynamic strain values. We conclude that dynamic strain/stress contributed to the Napa aftershock distribution. Our results may help better constrain the Napa causative fault and complement Coulomb static stress change in identifying areas more likely to be affected by aftershocks.

  5. Monitoring of fungal loads in seabird rehabilitation centers with comparisons to natural seabird environments in northern California.

    Science.gov (United States)

    Burco, Julia D; Massey, J Gregory; Byrne, Barbara A; Tell, Lisa; Clemons, Karl V; Ziccardi, Michael H

    2014-03-01

    Aspergillosis remains a major cause of mortality in captive and rehabilitated seabirds. To date, there has been poor documentation of fungal (particularly Aspergillus spp.) burdens in natural seabird loafing and roosting sites compared with fungal numbers in rehabilitation or captive settings and in the various microenvironments that seabirds are exposed to during the rehabilitation process. This study compares fungal burdens, particularly of Aspergillus spp., potentially encountered by seabirds in natural and rehabilitation environments. Differences among the various microenvironments in the rehabilitation facility were evaluated to determine the risk of infection when seabirds are experiencing high stress and poor immune function. Aspergillus spp. counts were quantified in three wildlife rehabilitation centers and five natural seabird loafing and roosting sites in northern California using a handheld impact air sampler and a water filtration system. Air and water samples from select aquatic bird rehabilitation centers showed increased numbers of conidia of Aspergillus spp. and Aspergillus fumigatus compared with natural seabird environments in northern California. Various microenvironments in the rehabilitation facility were identified as having higher numbers of conidia of Aspergillus spp. These results suggest that periodic monitoring of multiple local areas where the birds spend time in a rehabilitation facility should be done to identify "high risk" sites, where birds should spend minimal time, or sites that should be cleaned more frequently or have improved air flow to reduce exposure to fungal conidia. Overall, these results suggest that seabirds may be more likely to encounter Aspergillus spp. in various microenvironments in captivity than in their native habitats, which could increase their risk of developing disease when in a debilitated state.

  6. The Observation of Fault Finiteness and Rapid Velocity Variation in Pnl Waveforms for the Mw 6.5, San Simeon, California Earthquake

    Science.gov (United States)

    Konca, A. O.; Ji, C.; Helmberger, D. V.

    2004-12-01

    We observed the effect of fault finiteness in the Pnl waveforms recorded at regional distances (4° to 12°) for the Mw 6.5 San Simeon earthquake of 22 December 2003. We aimed to include more of the high frequencies (2 s and longer periods) than studies that use regional data for focal solutions (5 to 8 s and longer periods). We calculated 1-D synthetic seismograms for the Pnl portion for both a point source and a finite-fault solution. Comparison of the point-source and finite-fault waveforms with data shows that the first several seconds of the point-source synthetics have considerably higher amplitude than the data, while the finite-fault synthetics do not. This can be explained by reversely polarized depth phases overlapping with the P waves from the later portion of the fault, reducing the amplitude of the beginning portion of the seismogram. This is clearly a finite-fault phenomenon and therefore cannot be explained by point-source calculations. Moreover, the point-source synthetics, calculated with a focal solution from a long-period regional inversion, overestimate the amplitude by three to four times relative to the data, while the finite-fault waveforms have amplitudes similar to the data. Hence, a moment estimate based only on the point-source solution of the regional data could have been wrong by half a magnitude unit. We also calculated the time shifts of synthetics relative to data needed to fit the seismograms. Our results reveal that paths from Central California to the south are faster than paths to the east and north. The P-wave arrival at station TUC in Arizona is 4 seconds earlier than predicted by the Southern California model, while most stations to the east are delayed by around 1 second. The observed higher uppermost-mantle velocities to the south are consistent with some recent tomographic models. Synthetics generated with these models significantly improve the fits and the

  7. Using a modified time-reverse imaging technique to locate low-frequency earthquakes on the San Andreas Fault near Cholame, California

    Science.gov (United States)

    Horstmann, Tobias; Harrington, Rebecca M.; Cochran, Elizabeth S.

    2015-01-01

    We present a new method to locate low-frequency earthquakes (LFEs) within tectonic tremor episodes based on time-reverse imaging techniques. The modified time-reverse imaging technique presented here is the first method that locates individual LFEs within tremor episodes to within 5 km uncertainty without relying on high-amplitude P-wave arrivals, and it produces hypocentral locations similar to those of methods that stack hundreds of LFEs, without having to assume event co-location. In contrast to classic time-reverse imaging algorithms, we implement a modification that searches for phase coherence over a short time period rather than identifying the maximum amplitude of a superpositioned wavefield. The method is independent of amplitude and can help constrain event origin times. It uses individual LFE origin times but does not rely on a priori information such as LFE templates and families. We apply the method to locate 34 individual LFEs within tremor episodes that occurred between 2010 and 2011 on the San Andreas Fault near Cholame, California. Individual LFE location accuracies range from 2.6 to 5 km horizontally and 4.8 km vertically. Other methods able to locate individual LFEs with accuracy of less than 5 km have mainly used large-amplitude events for which a P-phase arrival can be identified; the method described here has the potential to locate a larger number of low-amplitude events using only the S-phase arrival. Location accuracy is controlled by the velocity model resolution and the wavelength of the dominant energy of the signal. Location results also depend on the number of stations used and are negligibly correlated with other factors such as the maximum gap in azimuthal coverage, source–station distance, and signal-to-noise ratio.
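
    The phase-coherence idea, as opposed to amplitude stacking, can be illustrated with a semblance-style measure on time-aligned traces. This is a hedged toy sketch of the general concept, not the authors' algorithm; the traces are synthetic sinusoids:

```python
import math

def phase_coherence(traces):
    """Semblance-style coherence of time-aligned traces in a short window:
    energy of the stack divided by the summed energy, normalized to [0, 1].
    Identical waveforms give 1; waveforms that cancel give 0, regardless of
    their absolute amplitudes."""
    n = len(traces)
    num = sum(sum(col) ** 2 for col in zip(*traces))
    den = n * sum(sum(x * x for x in tr) for tr in traces)
    return num / den

# Two toy cases: identical sinusoids (coherent) vs. sign-flipped (incoherent).
t = [i * 0.01 for i in range(200)]
coherent = [[math.sin(50 * ti) for ti in t] for _ in range(4)]
mixed = [[(-1) ** k * math.sin(50 * ti) for ti in t] for k in range(4)]
print(round(phase_coherence(coherent), 2))  # 1.0
print(round(phase_coherence(mixed), 2))     # 0.0
```

    In a time-reverse imaging context, a measure of this kind would be evaluated on back-propagated wavefields over a grid of trial source locations, with the maximum marking the event.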

  8. Homogeneous catalogs of earthquakes.

    Science.gov (United States)

    Knopoff, L; Gardner, J K

    1969-08-01

    The usual bias in earthquake catalogs against shocks of small magnitudes can be removed by testing the randomness of the magnitudes of successive shocks. The southern California catalog, 1933-1967, is found to be unbiased in the sense of the test at magnitude 4 or above; the cutoff is improved to M = 3 for the subcatalog 1953-1967.
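
    The randomness test on successive magnitudes can be illustrated with a toy order statistic. This sketch uses an assumed stand-in for the authors' actual test: the fraction of successive pairs in which the magnitude increases, which for an exchangeable (random-order) sequence sits near 1/2, minus a small correction for ties. The catalog below is synthetic:

```python
import random

def fraction_increasing(mags):
    """Fraction of successive pairs with m[i+1] > m[i].

    For a random-order magnitude sequence with few ties this fraction is
    close to 0.5; systematic bias in the ordering pushes it away from 0.5.
    """
    pairs = list(zip(mags, mags[1:]))
    return sum(b > a for a, b in pairs) / len(pairs)

random.seed(0)
# Synthetic unbiased catalog: magnitudes drawn independently, one decimal.
catalog = [round(random.uniform(4.0, 7.0), 1) for _ in range(4000)]
frac = fraction_increasing(catalog)
print(round(frac, 2))  # close to 0.5 (slightly below, because of ties)
```

    Applied below a completeness cutoff, missing small shocks after large ones would make the ordering non-random and move this statistic away from its exchangeable-sequence value.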

  10. Testing hypotheses of earthquake occurrence

    Science.gov (United States)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

    We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt would be made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter.
For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of
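
    The pairwise likelihood comparison described above can be sketched in a few lines: each forecast is a vector of expected counts per bin, and the observed counts are scored under a Poisson assumption per bin. The five-bin forecasts below are hypothetical, not RELM submissions:

```python
import math

def poisson_loglik(rates, counts):
    """Sum over bins of the Poisson log-likelihood log P(n_i | lambda_i)."""
    return sum(n * math.log(lam) - lam - math.lgamma(n + 1)
               for lam, n in zip(rates, counts))

# Hypothetical 5-bin forecasts (expected event counts) and observed counts.
forecast_a = [0.5, 1.0, 2.0, 0.2, 0.3]
forecast_b = [1.0, 1.0, 1.0, 1.0, 1.0]
observed   = [1, 1, 2, 0, 0]

la = poisson_loglik(forecast_a, observed)
lb = poisson_loglik(forecast_b, observed)
print(la > lb)  # True: forecast A explains the observed counts better
```

    A full test of the kind described in the abstract would simulate catalogs from each forecast to build the distribution of the log-likelihood difference and estimate the rejection probabilities.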

  11. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) changes appreciably after the time series is rearranged. This suggests that SOC theory should not be used to oppose efforts at earthquake prediction.
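
    The shuffle test used here compares a statistic of the catalog before and after random rearrangement: if the ordered series carries memory, the statistic changes. A toy illustration with lag-1 autocorrelation as the statistic; the magnitude series is a synthetic AR(1) sequence with built-in memory, not the Southern California catalog, and autocorrelation is an assumed stand-in for P_M(T):

```python
import random
import statistics

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a series; near 0 for a random ordering."""
    mu = statistics.mean(x)
    num = sum((a - mu) * (b - mu) for a, b in zip(x, x[1:]))
    den = sum((a - mu) ** 2 for a in x)
    return num / den

random.seed(3)
# Synthetic magnitude series with short-range memory (AR(1), phi = 0.8).
mags, m = [], 4.0
for _ in range(5000):
    m = 4.0 + 0.8 * (m - 4.0) + random.gauss(0, 0.3)
    mags.append(m)

shuffled = mags[:]
random.shuffle(shuffled)  # the rearrangement step of the SOC test

print(round(lag1_autocorr(mags), 2))      # near 0.8: ordered series has memory
print(round(lag1_autocorr(shuffled), 2))  # near 0.0 after rearrangement
```

    The same before/after comparison applied to a memoryless series would leave the statistic unchanged, which is the invariance SOC would predict.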

  12. Fault-Zone Maturity Defines Maximum Earthquake Magnitude

    Science.gov (United States)

    Bohnhoff, M.; Bulut, F.; Stierle, E.; Ben-Zion, Y.

    2014-12-01

    Estimating the maximum likely magnitude of future earthquakes on transform faults near large metropolitan areas has fundamental consequences for the expected hazard. Here we show that the maximum earthquake magnitudes on different sections of the North Anatolian Fault Zone (NAFZ) scale with the duration of fault zone activity, cumulative offset, and length of individual fault segments. The findings are based on a compiled catalogue of historical earthquakes in the region, drawing on the extensive literary sources that exist thanks to the long civilization record. We find that the largest earthquakes (M~8) are exclusively observed along the well-developed part of the fault zone in the east. In contrast, the western part is still in a juvenile or transitional stage, with historical earthquakes not exceeding M=7.4. This bounds the current seismic hazard for NW Turkey and its largest regional population and economic center, Istanbul. Our findings for the NAFZ are consistent with data from two other major transform faults, the San Andreas Fault in California and the Dead Sea Transform in the Middle East. The results indicate that maximum earthquake magnitudes generally scale with fault-zone evolution.

  13. Proposed Construction of the Permanent Off-Campus Center of California State University, Hayward, in Concord. A Report to the Governor and Legislature in Response to a Request for Capital Funds from the California State University for a Permanent Off-Campus Center in Contra Costa County. Report No. 87-47.

    Science.gov (United States)

    California State Postsecondary Education Commission, Sacramento.

    A proposal to move the Contra Costa Center of California State University, Hayward, from its present leased quarters in Pleasant Hill to a permanent facility in Concord is presented. The historical background on the proposal for the center is discussed. The proposal is also reviewed in light of the state's "Guidelines and Procedures for the…

  14. Ground-rupturing earthquakes on the northern Big Bend of the San Andreas Fault, California, 800 A.D. to Present

    Science.gov (United States)

    Scharer, Katherine M.; Weldon, Ray; Biasi, Glenn; Streig, Ashley; Fumal, Thomas E.

    2017-01-01

    Paleoseismic data on the timing of ground-rupturing earthquakes constrain the recurrence behavior of active faults and can provide insight on the rupture history of a fault if earthquakes dated at neighboring sites overlap in age and are considered correlative. This study presents the evidence and ages for 11 earthquakes that occurred along the Big Bend section of the southern San Andreas Fault at the Frazier Mountain paleoseismic site. The most recent earthquake to rupture the site was the Mw7.7–7.9 Fort Tejon earthquake of 1857. We use over 30 trench excavations to document the structural and sedimentological evolution of a small pull-apart basin that has been repeatedly faulted and folded by ground-rupturing earthquakes. A sedimentation rate of 0.4 cm/yr and abundant organic material for radiocarbon dating contribute to a record that is considered complete since 800 A.D. and includes 10 paleoearthquakes. Earthquakes have ruptured this location on average every ~100 years over the last 1200 years, but individual intervals range from ~22 to 186 years. The coefficient of variation of the length of time between earthquakes (0.7) indicates quasiperiodic behavior, similar to other sites along the southern San Andreas Fault. Comparison with the earthquake chronology at neighboring sites along the fault indicates that only one other 1857-size earthquake could have occurred since 1350 A.D., and since 800 A.D., the Big Bend and Mojave sections have ruptured together at most 50% of the time in Mw ≥ 7.3 earthquakes.
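
    The recurrence statistics quoted above (mean interval and coefficient of variation) follow directly from a list of rupture dates. A small sketch of the calculation; the event years below are hypothetical, loosely patterned on a quasiperiodic ~100-year record, and are not the actual Frazier Mountain dates:

```python
import statistics

def recurrence_stats(event_years):
    """Mean recurrence interval and coefficient of variation (COV = std/mean)
    for a dated earthquake sequence. COV near 0 is periodic, near 1 is
    Poisson-like; intermediate values indicate quasiperiodic behavior."""
    intervals = [b - a for a, b in zip(event_years, event_years[1:])]
    mean = statistics.mean(intervals)
    cov = statistics.pstdev(intervals) / mean
    return mean, cov

# Hypothetical rupture years for an 11-event record ending in 1857.
events = [850, 960, 1020, 1190, 1280, 1340, 1480, 1550, 1700, 1812, 1857]
mean, cov = recurrence_stats(events)
print(round(mean, 1), round(cov, 2))  # → 100.7 0.4
```

    With these synthetic dates the COV comes out lower than the 0.7 reported at Frazier Mountain; the statistic itself is what carries the quasiperiodic-versus-random interpretation.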

  15. Paleoseismologic evidence for large-magnitude (Mw 7.5-8.0) earthquakes on the Ventura blind thrust fault: Implications for multifault ruptures in the Transverse Ranges of southern California

    Science.gov (United States)

    McAuliffe, Lee J.; Dolan, James F.; Rhodes, Edward J.; Hubbard, Judith; Shaw, John H.; Pratt, Thomas L.

    2015-01-01

    Detailed analysis of continuously cored boreholes and cone penetrometer tests (CPTs), high-resolution seismic-reflection data, and luminescence and 14C dates from Holocene strata folded above the tip of the Ventura blind thrust fault constrain the ages and displacements of the two (or more) most recent earthquakes. These two earthquakes, which are identified by a prominent surface fold scarp and a stratigraphic sequence that thickens across an older buried fold scarp, occurred before the 235-yr-long historic era and after 805 ± 75 yr ago (most recent folding event[s]) and between 4065 and 4665 yr ago (previous folding event[s]). Minimum uplift in these two scarp-forming events was ∼6 m for the most recent earthquake(s) and ∼5.2 m for the previous event(s). Large uplifts such as these typically occur in large-magnitude earthquakes in the range of Mw7.5–8.0. Any such events along the Ventura fault would likely involve rupture of other Transverse Ranges faults to the east and west and/or rupture downward onto the deep, low-angle décollements that underlie these faults. The proximity of this large reverse-fault system to major population centers, including the greater Los Angeles region, and the potential for tsunami generation during ruptures extending offshore along the western parts of the system highlight the importance of understanding the complex behavior of these faults for probabilistic seismic hazard assessment.

  16. Three-dimensional seismic velocity models, high-precision earthquake locations and their implications for seismic, tectonic and magmatic settings in the Coso Geothermal Field, California

    Science.gov (United States)

    Zhang, Q.; Lin, G.

    2012-12-01

    The Coso Geothermal Field (CGF) lies east of the Sierra Nevada in a tectonically active area marked by hot springs, rhyolite domes at the surface, strike-slip and normal faulting, and frequent seismic activity. In this study, we present a comprehensive analysis of three-dimensional velocity structure, high-precision earthquake relocation, and in situ Vp/Vs estimates. We select 1,893 master events from among 177,000 events recorded between 1981 and 2011 by Southern California Seismic Network stations. High-resolution three-dimensional (3-D) Vp and Vp/Vs models in Coso are inverted from the master events, with 52,160 P- and 23,688 S-wave first arrivals, using the SIMUL2000 algorithm. The tomographic model reveals slightly high Vp and Vp/Vs in most regions of Coso near the surface compared to the layers at 6 and 12 km depth, consistent with the Coso area being filled with diorite and minor basalt. The feature of low Vp, low Vs, and low Vp/Vs between 6 and 12 km depth underneath the CGF can be related to porous, gas-filled rock or volatile-rich magma. The low-Vp, low-Vs, and low-Vp/Vs structure from the surface to 3 km depth beneath the Indian Wells Valley is consistent with the 2 km of sedimentary strata revealed by borehole data. The resulting new 3-D velocity model is used to improve absolute event location accuracy. We then apply waveform cross-correlation, similar-event cluster analysis, and differential-time relocation methods to improve relative event location accuracy, with horizontal and vertical location uncertainties of tens of meters. The relocated seismicity indicates that the brittle-ductile transition is as shallow as 5 km beneath the CGF.
We also estimate in situ near-source Vp/Vs ratio within each event cluster using differential times from cross-correlation to complement the Vp/Vs model from tomographic inversions, which will help to estimate the volume fraction of
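
    The in situ Vp/Vs estimate from differential times rests on a simple relation: for events within a compact cluster, the slope of S-wave differential times against P-wave differential times equals the local Vp/Vs ratio. A hedged sketch of that slope fit on synthetic differential times (the actual study uses cross-correlation measurements, which are not reproduced here):

```python
import random

def vpvs_from_differential_times(dtp, dts):
    """Least-squares slope of S vs. P differential times through the origin.

    For a compact event cluster, dts/dtp approximates the near-source
    Vp/Vs ratio (a Wadati-style argument applied to differential times).
    """
    num = sum(p * s for p, s in zip(dtp, dts))
    den = sum(p * p for p in dtp)
    return num / den

random.seed(11)
true_ratio = 1.68  # assumed value for the synthetic cluster
dtp = [random.uniform(-0.05, 0.05) for _ in range(300)]        # seconds
dts = [true_ratio * p + random.gauss(0, 0.002) for p in dtp]   # noisy S times
print(round(vpvs_from_differential_times(dtp, dts), 2))  # ≈ 1.68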

  17. Earthquake forecasting and its verification

    Directory of Open Access Journals (Sweden)

    J. R. Holliday

    2005-01-01

    No proven method is currently available for reliable short-term prediction of earthquakes (minutes to months). However, it is possible to make probabilistic hazard assessments of earthquake risk. In this paper we discuss a new approach to earthquake forecasting based on a pattern informatics (PI) method which quantifies temporal variations in seismicity. The output, which is based on an association of small earthquakes with future large earthquakes, is a map of areas in a seismogenic region ('hotspots') where earthquakes are forecast to occur in a future 10-year time span. This approach has been successfully applied to California, to Japan, and on a worldwide basis. Because a sharp decision threshold is used, these forecasts are binary: an earthquake is forecast either to occur or to not occur. The standard approach to the evaluation of a binary forecast is the relative (or receiver) operating characteristic (ROC) diagram, which is a more restrictive test and less subject to bias than maximum-likelihood tests. To test our PI method, we made two types of retrospective forecasts for California. The first uses the PI method and the second is a relative intensity (RI) forecast based on the hypothesis that future large earthquakes will occur where most smaller earthquakes have occurred in the recent past. While both retrospective forecasts are for the ten-year period 1 January 2000 to 31 December 2009, we performed an interim analysis 5 years into the forecast. The PI method outperforms the RI method under most circumstances.
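
    An ROC evaluation of a binary hotspot forecast reduces to counting hits, misses, false alarms, and correct negatives over spatial cells. A minimal sketch with hypothetical 10-cell forecast and observation maps:

```python
def roc_point(forecast, observed):
    """Hit rate and false-alarm rate for one binary forecast map.

    forecast/observed are parallel lists of 0/1 per spatial cell; the pair
    (false-alarm rate, hit rate) is one point on the ROC diagram.
    """
    hits = sum(f and o for f, o in zip(forecast, observed))
    misses = sum((not f) and o for f, o in zip(forecast, observed))
    false_alarms = sum(f and (not o) for f, o in zip(forecast, observed))
    correct_neg = sum((not f) and (not o) for f, o in zip(forecast, observed))
    hit_rate = hits / (hits + misses)
    far = false_alarms / (false_alarms + correct_neg)
    return hit_rate, far

# Hypothetical 10-cell hotspot map vs. where earthquakes actually occurred.
forecast = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
observed = [1, 0, 0, 0, 1, 0, 1, 1, 0, 0]
print(roc_point(forecast, observed))  # hit rate 0.75, false-alarm rate ≈ 0.17
```

    Sweeping the decision threshold of the underlying PI score traces out the full ROC curve from points like this one.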

  18. Lahar hazard zones for eruption-generated lahars in the Lassen Volcanic Center, California

    Science.gov (United States)

    Robinson, Joel E.; Clynne, Michael A.

    2012-01-01

    Lahar deposits are found in drainages that head on or near Lassen Peak in northern California, demonstrating that these valleys are susceptible to future lahars. In general, lahars are uncommon in the Lassen region. Lassen Peak's lack of large perennial snowfields and glaciers limits its potential for lahar development, with the winter snowpack being the largest source of water for lahar generation. The most extensive lahar deposits are related to the May 1915 eruption of Lassen Peak, and evidence for pre-1915 lahars is sparse and spatially limited. The May 1915 eruption of Lassen Peak was a small-volume eruption that generated a snow and hot-rock avalanche, a pyroclastic flow, and two large and four smaller lahars. The two large lahars were generated on May 19 and 22 and inundated sections of Lost and Hat Creeks. We use 80 years of snow depth measurements from Lassen Peak to calculate average and maximum liquid water depths, 2.02 meters (m) and 3.90 m respectively, for the month of May as estimates for the 1915 lahars. These depths are multiplied by the areal extents of the eruptive deposits to calculate a water volume range of 7.05-13.6×10^6 cubic meters (m^3). We assume the lahars were a 50/50 mix of water and sediment and double the water volumes to provide an estimate of the 1915 lahar volumes, 13.2-19.8×10^6 m^3. We use a representative volume of 15×10^6 m^3 in the software program LAHARZ to calculate cross-sectional and planimetric areas for the 1915 lahars. The resultant lahar inundation zone reasonably portrays both of the May 1915 lahars. We use this same technique to calculate the potential for future lahars in basins that head on or near Lassen Peak. LAHARZ assumes that the total lahar volume does not change after leaving the potential energy (H/L) cone (the height of the edifice, H, down to the approximate break in slope at its base, L); therefore, all water available to initiate a lahar is contained inside this cone. Because snow is the primary source of water for
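
    The volume arithmetic in this abstract (water depth × area, then doubling for a 50/50 water/sediment mix) can be written as a two-line helper. The areal extent below is a hypothetical value chosen so that the 2.02 m average May water depth reproduces the quoted ~7×10^6 m^3 water-volume lower bound:

```python
def lahar_volume(water_depth_m, area_m2, sediment_fraction=0.5):
    """Estimate lahar volume from water depth and source area.

    Total volume = water volume / (1 - sediment_fraction); with the
    abstract's 50/50 water/sediment assumption this doubles the water volume.
    """
    water = water_depth_m * area_m2
    return water / (1.0 - sediment_fraction)

# Hypothetical areal extent consistent with the 7.05e6 m^3 water lower bound.
area = 7.05e6 / 2.02
print(round(lahar_volume(2.02, area) / 1e6, 1))  # → 14.1 million m^3
```

    The result falls inside the 13.2-19.8×10^6 m^3 range quoted for the 1915 lahars, which is what the doubling rule is meant to produce.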

  19. Postseismic relaxation following the 1992 M7.3 Landers and 1999 M7.1 Hector Mine earthquakes, southern California

    Science.gov (United States)

    Savage, J.C.; Svarc, J.L.

    2009-01-01

    Postseismic relaxation (measured postseismic deformation less the deformation that would have occurred at the preseismic rate) has been monitored at the same 10 monuments over ~6 years following both the 1992 Landers and the 1999 Hector Mine earthquakes. For both earthquakes the displacement components of the observed relaxation are well described by A_i + B_i f1(t), where A_i and B_i are constants peculiar to each component at each monument, t is the time after the earthquake, and f1(t) is a temporal function common to all components and all monuments for that earthquake. The temporal function f1(t) can be approximated by bt + c log_e(1 + t/τ), where τ = 38.7 ± 15.2 days and 25.6 ± 7.7 days for the Landers and Hector Mine relaxations, respectively. Because the estimated values of τ do not differ significantly, the transient term log_e(1 + t/τ) in the temporal function may be the same for both earthquakes. The asymptotic (t → ∞) relaxation rates B_i·b are only a few mm/a and do not appear to differ significantly between the two earthquakes. The asymptotic deformation rates appear to be slightly greater than the preseismic deformation rates, but the preseismic rates were not measured directly. Thus, the deformations of the Landers array measured over the first 5.6 years following the Landers earthquake and over the first 6.4 years following the Hector Mine earthquake are generally consistent with a simple model in which the transient postearthquake relaxation depends upon time as log_e(1 + t/τ) with τ = 28 ± 5 days, and the asymptotic postseismic speeds of the monuments exceed the preseismic speeds by at most a few millimeters per annum.
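
    Fitting a temporal function of the form bt + c·log_e(1 + t/τ) is nonlinear only in τ, so a grid search over τ with closed-form least squares for (b, c) suffices. A stdlib-only sketch on a synthetic displacement series (τ = 30 days assumed; this is not the GPS data from the study):

```python
import math
import random

def fit_relaxation(t, d):
    """Fit d(t) ≈ b*t + c*ln(1 + t/tau): grid search over tau, with
    closed-form 2-parameter least squares for (b, c) at each candidate."""
    best = None
    for tau in [x * 0.5 for x in range(2, 200)]:  # tau from 1 to 99.5 days
        x2 = [math.log(1 + ti / tau) for ti in t]
        # Normal equations for the linear parameters b and c.
        s11 = sum(a * a for a in t)
        s12 = sum(a * g for a, g in zip(t, x2))
        s22 = sum(g * g for g in x2)
        y1 = sum(a * di for a, di in zip(t, d))
        y2 = sum(g * di for g, di in zip(x2, d))
        det = s11 * s22 - s12 * s12
        b = (y1 * s22 - y2 * s12) / det
        c = (y2 * s11 - y1 * s12) / det
        sse = sum((di - (b * ti + c * g)) ** 2
                  for ti, di, g in zip(t, d, x2))
        if best is None or sse < best[0]:
            best = (sse, b, c, tau)
    return best[1:]

random.seed(7)
# Synthetic daily displacements: b = 0.005, c = 2.0, tau = 30 days, noise.
t = [float(i) for i in range(1, 400)]
d = [0.005 * ti + 2.0 * math.log(1 + ti / 30.0) + random.gauss(0, 0.02)
     for ti in t]
b, c, tau = fit_relaxation(t, d)
print(round(tau, 1), round(c, 2))
```

    The recovered τ lands near the assumed 30 days, the same kind of estimate the study reports (with uncertainties) for the Landers and Hector Mine relaxations.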

  20. Positron--electron storage ring project: Stanford Linear Accelerator Center, Stanford, California. Final environmental statement

    International Nuclear Information System (INIS)

    1976-08-01

    A final environmental statement is given which was prepared in compliance with the National Environmental Policy Act to support the Energy Research and Development Administration project to design and construct the positron-electron colliding beam storage ring (PEP) facilities at the Stanford Linear Accelerator Center (SLAC). The PEP storage ring will be constructed underground adjacent to the existing two-mile-long SLAC particle accelerator to utilize its beam. The ring will be about 700 meters in diameter, buried at depths of 20 to 100 feet, and located at the eastern extremity of the SLAC site. Positron and electron beams will collide in the storage ring to provide higher energies and hence higher particle velocities than have heretofore been achieved. Some of the energy from the collisions is transformed back into matter and produces a variety of particles of immense interest to physicists. The environmental impacts during the estimated two-and-one-half-year construction period will consist of movement of an estimated 320,000 cubic yards of earth and the creation of some rubble, refuse, dust, and noise, which will be kept to a practical minimum through planned construction procedures. The terrain will be restored to very nearly its original condition. Normal operation of the storage ring facility will not produce significant adverse environmental effects different from operation of the existing facilities, apart from the addition of one water cooling tower. No overall increase in SLAC staff is anticipated for operation of the facility. Alternatives to the proposed project that were considered include termination, postponement, other locations, and construction of a conventional high-energy accelerator.

  1. Using focal mechanism solutions to correlate earthquakes with faults in the Lake Tahoe-Truckee area, California and Nevada, and to help design LiDAR surveys for active-fault reconnaissance

    Science.gov (United States)

    Cronin, V. S.; Lindsay, R. D.

    2011-12-01

    Geomorphic analysis of hillshade images produced from aerial LiDAR data has been successful in identifying youthful fault traces. For example, the recently discovered Polaris fault just northwest of Lake Tahoe, California/Nevada, was recognized using LiDAR data that had been acquired by local government to assist land-use planning. Subsequent trenching by consultants under contract to the US Army Corps of Engineers has demonstrated Holocene displacement. The Polaris fault is inferred to be capable of generating a magnitude 6.4-6.9 earthquake, based on its apparent length and offset characteristics (Hunter and others, 2011, BSSA 101[3], 1162-1181). Dingler and others (2009, GSA Bull 121[7/8], 1089-1107) describe paleoseismic or geomorphic evidence for late Neogene displacement along other faults in the area, including the West Tahoe-Dollar Point, Stateline-North Tahoe, and Incline Village faults. We have used the seismo-lineament analysis method (SLAM; Cronin and others, 2008, Env Eng Geol 14[3], 199-219) to establish a tentative spatial correlation between each of the previously mentioned faults, as well as with segments of the Dog Valley fault system, and one or more earthquake(s). The ~18 earthquakes we have tentatively correlated with faults in the Tahoe-Truckee area occurred between 1966 and 2008, with magnitudes between 3 and ~6. Given the focal mechanism solution for a well-located shallow-focus earthquake, the nodal planes can be projected to Earth's surface as represented by a DEM, plus-or-minus the vertical and horizontal uncertainty in the focal location, to yield two seismo-lineament swaths. The trace of the fault that generated the earthquake is likely to be found within one of the two swaths [1] if the fault surface is emergent, and [2] if the fault surface is approximately planar in the vicinity of the focus. Seismo-lineaments from several of the earthquakes studied overlap in a manner that suggests they are associated with the same fault. 
The surface
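    For the flat-ground, planar-fault approximation described above, the up-dip offset of a nodal plane's surface trace and the swath width implied by focal-location uncertainty reduce to simple trigonometry. A sketch under those assumptions (this is not the published SLAM code, and all numbers are hypothetical):

```python
import math

def seismolineament_offset(depth_km, dip_deg):
    # Horizontal distance from the epicenter to where a planar nodal
    # plane reaches the surface, measured up-dip (flat ground assumed).
    return depth_km / math.tan(math.radians(dip_deg))

def swath_halfwidth(depth_km, dip_deg, dz_km, dh_km):
    # Half-width of the surface swath obtained by propagating the
    # vertical (dz) and horizontal (dh) focal-location uncertainties.
    return dz_km / math.tan(math.radians(dip_deg)) + dh_km

# Hypothetical event: 8 km deep, nodal plane dipping 60 degrees,
# located to within +/-2 km vertically and +/-1 km horizontally.
offset = seismolineament_offset(8.0, 60.0)      # ~4.6 km up-dip
half_w = swath_halfwidth(8.0, 60.0, 2.0, 1.0)   # ~2.2 km half-width
```

    In practice SLAM projects each nodal plane onto a DEM rather than a flat surface, so the swath edges follow topography; the sketch only conveys why shallower dips and larger depth uncertainties produce wider swaths.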

  2. Geotechnical Extreme Events Reconnaissance Report on the Performance of Structures in Densely Urbanized Areas Affected by Surface Fault Rupture During the August 24, 2014 M6 South Napa Earthquake, California, USA.

    Science.gov (United States)

    Cohen-Waeber, J.; Lanzafame, R.; Bray, J.; Sitar, N.

    2014-12-01

    The August 24, 2014, Mw 6.0 South Napa earthquake is the largest seismic event to have occurred in the San Francisco Bay Region, California, USA, since the Mw 6.9 1989 Loma Prieta earthquake. The epicenter was at the south end of the Napa Valley, California, with rupture propagating principally northwest along parts of the active West Napa fault zone. Bounded by two major fault zones to the east and west (Calaveras and Rodgers Creek, respectively), the Napa Valley is filled with up to 170 m of alluvial deposits, is considered moderately to very highly susceptible to liquefaction, and has the potential for violent shaking. While damage due to strong ground shaking was significant, remarkably little damage due to liquefaction- or landslide-induced ground deformation was observed, possibly because of the recent drought in the region. Instead, the South Napa earthquake is the first to produce significant surface rupture in this area since the Mw 7.9 1906 San Andreas event, and the first in Northern California to rupture through a densely urbanized environment. Clear expressions of surface fault rupture extended approximately 12-15 km northward from the epicenter and approximately 1-2 km southeast, with a significant impact on infrastructure, including roads, lifelines, and residential structures. The National Science Foundation-funded Geotechnical Extreme Events Reconnaissance (GEER) Association presents here its observations on the performance of structures affected by surface fault rupture in a densely populated residential neighborhood located approximately 10 km north of the epicenter. Based on detailed mapping of 27 residential structures, a preliminary assessment of the quantitative descriptions of damage shows certain characteristic interactions between surface fault rupture and the overlying infrastructure: 48% of concrete slabs cracked up to 8 cm wide, 19% of structures shifted up to 11 cm off their foundations, and 44% of foundations cracked up to 3 cm

  3. National Training Center-Fort Irwin, California. Native American Consultation Meeting at Fort Mojave, Nevada, Held on 2-3 October 2003

    Science.gov (United States)

    2004-08-01

    National Training Center at Fort Irwin is located 37 miles northeast of Barstow, California, and is a U.S. Army installation (see Figure 1). The base is...MR. RAYE: George Raye of the Colorado River Indian Tribe. MR. BILL SMITH: Bill Smith, I’m a member of...construction, barring any unforeseen findings out there, we anticipate starting this construction in the spring of 󈧈, and probably about

  4. The 2014 Mw6.1 South Napa Earthquake: A unilateral rupture with shallow asperity and rapid afterslip

    Science.gov (United States)

    Wei, Shengji; Barbot, Sylvain; Graves, Robert; Lienkaemper, James J.; Wang, Teng; Hudnut, Kenneth W.; Fu, Yuning; Helmberger, Don

    2015-01-01

    The Mw6.1 South Napa earthquake occurred near Napa, California on August 24, 2014 (UTC), and was the largest inland earthquake in Northern California since the 1989 Mw6.9 Loma Prieta earthquake. The first report of the earthquake from the Northern California Earthquake Data Center (NCEDC) indicates a hypocentral depth of 11.0 km with longitude and latitude of (122.3105°W, 38.217°N). Surface rupture was documented by field observations and Lidar imaging (Brooks et al., 2014; Hudnut et al., 2014; Brocher et al., 2015), with about 12 km of continuous rupture starting near the epicenter and extending to the northwest. The southern part of the rupture is relatively straight, but the strike changes by about 15° at the northern end over a 6-km segment. The peak dextral offset was observed near the Buhman residence with right-lateral motion of 46 cm, near the location where the strike of the fault begins to rotate clockwise (Hudnut et al., 2014). The earthquake was well recorded by the strong motion network operated by the NCEDC, the California Geological Survey and the U.S. Geological Survey (USGS). There are about 12 sites within an epicentral distance of 15 km, with relatively good azimuthal coverage (Fig. 1). The largest peak-ground-velocity (PGV) of nearly 100 cm/s was observed on station 1765, which is the closest station to the rupture and lies about 3 km east of the northern segment (Fig. 1). The ground deformation associated with the earthquake was also well recorded by the high-resolution COSMO-SkyMed satellite and Sentinel-1A satellite, providing independent static observations.

  5. Seismology program; California Division of Mines and Geology

    Science.gov (United States)

    Sherburne, R. W.

    1981-01-01

    The year 1980 marked the centennial of the California Division of Mines and Geology (CDMG) and a decade of the Division's involvement in seismology. Factors which contributed to the formation of a Seismology Group within CDMG included increased concerns for environmental and earthquake safety, interest in earthquake prediction, the 1971 San Fernando earthquake and the 1973 publication by CDMG of an urban geology master plan for California. Reasons to be concerned about California's earthquake problem are demonstrated by the accompanying table and the figures. Recent seismicity in California, the Southern California uplift reflecting changes in crustal strain, and other possible earthquake precursors have heightened concern among scientific and governmental groups about the possible occurrence of a major damaging earthquake (M>7) in California

  6. Continuous borehole strain and pore pressure in the near field of the 28 September 2004 M 6.0 parkfield, California, earthquake: Implications for nucleation, fault response, earthquake prediction and tremor

    Science.gov (United States)

    Johnston, M.J.S.; Borcherdt, R.D.; Linde, A.T.; Gladwin, M.T.

    2006-01-01

    Near-field observations of high-precision borehole strain and pore pressure show no indication of coherent accelerating strain or pore pressure during the weeks to seconds before the 28 September 2004 M 6.0 Parkfield earthquake. Minor changes in strain rate did occur at a few sites during the last 24 hr before the earthquake but these changes are neither significant nor have the form expected for strain during slip coalescence initiating fault failure. Seconds before the event, strain is stable at the 10⁻¹¹ level. Final prerupture nucleation slip in the hypocentral region is constrained to have a moment less than 2 × 10¹² N m (M 2.2) and a source size less than 30 m. Ground displacement data indicate similar constraints. Localized rupture nucleation and runaway preclude useful prediction of damaging earthquakes. Coseismic dynamic strains of about 10 microstrain peak-to-peak were superimposed on volumetric strain offsets of about 0.5 microstrain to the northwest of the epicenter and about 0.2 microstrain to the southeast of the epicenter, consistent with right lateral slip. Observed strain and Global Positioning System (GPS) offsets can be simply fit with 20 cm of slip between 4 and 10 km on a 20-km segment of the fault north of Gold Hill (M0 = 7 × 10¹⁷ N m). Variable slip inversion models using GPS data and seismic data indicate similar moments. Observed postseismic strain is 60% to 300% of the coseismic strain, indicating incomplete release of accumulated strain. No measurable change in fault zone compliance preceding or following the earthquake is indicated by stable earth tidal response. No indications of strain change accompany nonvolcanic tremor events reported prior to and following the earthquake.
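    The moment-to-magnitude conversions quoted above follow from the standard moment-magnitude relation. A quick check using the Hanks-Kanamori formula (small differences from the abstract's rounded values are expected):

```python
import math

def moment_magnitude(m0):
    # Hanks & Kanamori (1979): Mw = (2/3) * (log10(M0) - 9.1), M0 in N*m.
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

mw_nucleation = moment_magnitude(2.0e12)   # ~2.1, vs. "M 2.2" in the abstract
mw_mainshock = moment_magnitude(7.0e17)    # ~5.8 for the geodetic moment
```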

  7. Rapid finite-fault inversions in Southern California using Cybershake Green's functions

    Science.gov (United States)

    Thio, H. K.; Polet, J.

    2017-12-01

    We have developed a system for rapid finite fault inversion for intermediate and large Southern California earthquakes using local, regional and teleseismic seismic waveforms as well as geodetic data. For modeling the local seismic data, we use 3D Green's functions from the Cybershake project, which were made available to us courtesy of the Southern California Earthquake Center (SCEC). The use of 3D Green's functions allows us to extend the inversion to higher frequency waveform data and smaller magnitude earthquakes, in addition to achieving improved solutions in general. The ultimate aim of this work is to develop the ability to provide high quality finite fault models within a few hours after any damaging earthquake in Southern California, so that they may be used as input to various post-earthquake assessment tools such as ShakeMap, as well as by the scientific community and other interested parties. Additionally, a systematic determination of finite fault models has value as a resource for scientific studies on detailed earthquake processes, such as rupture dynamics and scaling relations. We are using an established least-squares finite fault inversion method that has been applied extensively both on large as well as smaller regional earthquakes, in conjunction with the 3D Green's functions, where available, as well as 1D Green's functions for areas for which the Cybershake library has not yet been developed. We are carrying out validation and calibration of this system using significant earthquakes that have occurred in the region over the last two decades, spanning a range of locations and magnitudes (5.4 and higher).
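    The least-squares machinery underlying such finite-fault inversions can be illustrated on a toy problem in which synthetic Green's functions stand in for the CyberShake library (everything below is synthetic; this is not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inversion d = G m: m is slip on fault patches, G holds
# Green's functions (random here, standing in for CyberShake kernels).
n_data, n_patches = 40, 6
G = rng.normal(size=(n_data, n_patches))
m_true = np.array([0.0, 0.1, 0.5, 0.8, 0.3, 0.0])   # slip in meters
d = G @ m_true + 0.01 * rng.normal(size=n_data)     # noisy data

# Damped least squares: minimize |G m - d|^2 + lam^2 |m|^2.
lam = 0.1
A = np.vstack([G, lam * np.eye(n_patches)])
rhs = np.concatenate([d, np.zeros(n_patches)])
m_est, *_ = np.linalg.lstsq(A, rhs, rcond=None)
```

    Real inversions add physical constraints (positivity, smoothing, rupture-time parameterization), but the augmented-matrix damping shown here is the same basic regularization device.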

  8. Evaluation of Routine HIV Opt-Out Screening and Continuum of Care Services Following Entry into Eight Prison Reception Centers--California, 2012.

    Science.gov (United States)

    Lucas, Kimberley D; Eckert, Valorie; Behrends, Czarina N; Wheeler, Charlotte; MacGowan, Robin J; Mohle-Boetani, Janet C

    2016-02-26

    Early diagnosis of human immunodeficiency virus (HIV) infection and initiation of antiretroviral treatment (ART) improves health outcomes and prevents HIV transmission. Before 2010, HIV testing was available to inmates in the California state prison system upon request. In 2010, the California Correctional Health Care Services (CCHCS) integrated HIV opt-out screening into the health assessment for inmates entering California state prisons. Under this system, a medical care provider informs the inmate that an HIV test is routinely done, along with screening for sexually transmitted, communicable, and vaccine-preventable diseases, unless the inmate specifically declines the test. During 2012-2013, CCHCS, the California Department of Public Health, and CDC evaluated HIV screening, rates of new diagnoses, linkage to and retention in care, ART response, and post-release linkage to care among California prison inmates. All prison inmates are processed through one of eight specialized reception center facilities, where they undergo a comprehensive evaluation of their medical needs, mental health, and custody requirements for placement in one of 35 state prisons. Among 17,436 inmates who entered a reception center during April-September 2012, 77% were screened for HIV infection; 135 (1%) tested positive, including 10 (0.1%) with newly diagnosed infections. Among the 135 HIV-positive patient-inmates, 134 (99%) were linked to care within 90 days of diagnosis, including 122 (91%) who initiated ART. Among 83 who initiated ART and remained incarcerated through July 2013, 81 (98%) continued ART; 71 (88%) achieved viral suppression. After release from prison, continuity of care in the community remains a challenge. An infrastructure for post-release linkage to care is needed to help ensure sustained HIV disease control.
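    The care-cascade percentages quoted above can be checked for internal consistency with a few lines of arithmetic (counts are taken from the abstract; the screened total is inferred from the 77% figure):

```python
# Care-cascade counts quoted in the abstract, checked for consistency.
entered = 17436
screened = round(entered * 0.77)   # "77% were screened" -> 13,426 inmates
positive = 135                     # tested HIV-positive
linked = 134                       # linked to care within 90 days
on_art = 122                       # initiated ART

pct_positive = 100.0 * positive / screened   # ~1.0%, as quoted
pct_linked = 100.0 * linked / positive       # ~99%, as quoted
pct_art = 100.0 * on_art / linked            # ~91%, as quoted
```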

  9. S-wave triggering of tremor beneath the Parkfield, California, section of the San Andreas fault by the 2011 Tohoku, Japan earthquake: observations and theory

    Science.gov (United States)

    Hill, David P.; Peng, Zhigang; Shelly, David R.; Aiken, Chastity

    2013-01-01

    The dynamic stresses that are associated with the energetic seismic waves generated by the Mw 9.0 Tohoku earthquake off the northeast coast of Japan triggered bursts of tectonic tremor beneath the Parkfield section of the San Andreas fault (SAF) at an epicentral distance of ∼8200 km. The onset of tremor begins midway through the ∼100‐s‐period S‐wave arrival, with a minor burst coinciding with the SHSH arrival, as recorded on the nearby broadband seismic station PKD. A more pronounced burst coincides with the Love arrival, followed by a series of impulsive tremor bursts apparently modulated by the 20‐ to 30‐s‐period Rayleigh wave. The triggered tremor was located at depths between 20 and 30 km beneath the surface trace of the fault, with the burst coincident with the S wave centered beneath the fault 30 km northwest of Parkfield. Most of the subsequent activity, including the tremor coincident with the SHSH arrival, was concentrated beneath a stretch of the fault extending from 10 to 40 km southeast of Parkfield. The seismic waves from the Tohoku epicenter form a horizontal incidence angle of ∼14° with respect to the local strike of the SAF. Computed peak dynamic Coulomb stresses on the fault at tremor depths are in the 0.7–10 kPa range. The apparent modulation of tremor bursts by the small, strike‐parallel Rayleigh‐wave stresses (∼0.7 kPa) is likely enabled by pore pressure variations driven by the Rayleigh‐wave dilatational stress. These results are consistent with the strike‐parallel dynamic stresses (δτs) associated with the S, SHSH, and surface‐wave phases triggering small increments of dextral slip on the fault with a low friction (μ∼0.2). The vertical dynamic stresses δτd do not trigger tremor with vertical or oblique slip under this simple Coulomb failure model.
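    The Coulomb failure calculation referenced above has a simple form. A minimal sketch, assuming the common convention in which the normal-stress change is positive when it unclamps the fault (sign conventions vary in the literature, and the 1.0 kPa dilatational term below is a hypothetical illustration):

```python
def coulomb_stress_change(d_shear_kpa, d_normal_kpa, mu=0.2):
    # dCFS = d_tau + mu * d_sigma, with d_sigma positive for unclamping.
    # mu = 0.2 is the low apparent friction inferred in the abstract.
    return d_shear_kpa + mu * d_normal_kpa

# Rayleigh-wave example loosely based on the abstract's numbers:
dcfs = coulomb_stress_change(d_shear_kpa=0.7, d_normal_kpa=1.0)  # ~0.9 kPa
```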

  10. An information infrastructure for earthquake science

    Science.gov (United States)

    Jordan, T. H.; Scec/Itr Collaboration

    2003-04-01

    The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, IRIS, and the USGS, has received a large five-year grant from the NSF's ITR Program and its Geosciences Directorate to build a new information infrastructure for earthquake science. In many respects, the SCEC/ITR Project presents a microcosm of the IT efforts now being organized across the geoscience community, including the EarthScope initiative. The purpose of this presentation is to discuss the experience gained by the project thus far and lay out the challenges that lie ahead; our hope is to encourage cross-discipline collaboration in future IT advancements. Project goals have been formulated in terms of four "computational pathways" related to seismic hazard analysis (SHA). For example, Pathway 1 involves the construction of an open-source, object-oriented, and web-enabled framework for SHA computations that can incorporate a variety of earthquake forecast models, intensity-measure relationships, and site-response models, while Pathway 2 aims to utilize the predictive power of wavefield simulation in modeling time-dependent ground motion for scenario earthquakes and constructing intensity-measure relationships. The overall goal is to create a SCEC "community modeling environment" or collaboratory that will comprise the curated (on-line, documented, maintained) resources needed by researchers to develop and use these four computational pathways. Current activities include (1) the development and verification of the computational modules, (2) the standardization of data structures and interfaces needed for syntactic interoperability, (3) the development of knowledge representation and management tools, (4) the construction of SCEC computational and data grid testbeds, and (5) the creation of user interfaces for knowledge-acquisition, code execution, and visualization.
I will emphasize the increasing role of standardized

  11. Earthquakes, July-August 1992

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0≤M<8.0) during this reporting period: one occurred in Kyrgyzstan on August 19, and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake on June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

  12. The HayWired earthquake scenario—Earthquake hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of
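    Under a Poisson (time-independent) assumption, adopted here purely for illustration (WGCEP forecasts are actually time-dependent), the 33-percent/30-year figure maps to an annual rate as follows:

```python
import math

def poisson_rate_from_probability(p, years):
    # Annual rate lambda with P(>=1 event in `years`) = p for a
    # Poisson process: p = 1 - exp(-lambda * years).
    return -math.log(1.0 - p) / years

rate = poisson_rate_from_probability(0.33, 30.0)   # ~0.013 events/year
recurrence = 1.0 / rate                            # ~75-year equivalent
```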

  14. The Earthquake Preparedness Task Force Report. Revised.

    Science.gov (United States)

    Roybal-Allard, Lucille

    A report on Earthquake Preparedness presents California school districts with direction for complying with existing earthquake preparedness planning laws. It first contains two sets of recommendations. The first set requires state action and is presented to the Legislature for consideration. The second set consists of policy statements and…

  15. The SCEC/USGS dynamic earthquake rupture code verification exercise

    Science.gov (United States)

    Harris, R.A.; Barall, M.; Archuleta, R.; Dunham, E.; Aagaard, Brad T.; Ampuero, J.-P.; Bhat, H.; Cruz-Atienza, Victor M.; Dalguer, L.; Dawson, P.; Day, S.; Duan, B.; Ely, G.; Kaneko, Y.; Kase, Y.; Lapusta, N.; Liu, Yajing; Ma, S.; Oglesby, D.; Olsen, K.; Pitarka, A.; Song, S.; Templeton, E.

    2009-01-01

    Numerical simulations of earthquake rupture dynamics are now common, yet it has been difficult to test the validity of these simulations because there have been few field observations and no analytic solutions with which to compare the results. This paper describes the Southern California Earthquake Center/U.S. Geological Survey (SCEC/USGS) Dynamic Earthquake Rupture Code Verification Exercise, where codes that simulate spontaneous rupture dynamics in three dimensions are evaluated and the results produced by these codes are compared using Web-based tools. This is the first time that a broad and rigorous examination of numerous spontaneous rupture codes has been performed—a significant advance in this science. The automated process developed to attain this achievement provides for a future where testing of codes is easily accomplished. Scientists who use computer simulations to understand earthquakes utilize a range of techniques. Most of these assume that earthquakes are caused by slip at depth on faults in the Earth, but hereafter the strategies vary. Among the methods used in earthquake mechanics studies are kinematic approaches and dynamic approaches. The kinematic approach uses a computer code that prescribes the spatial and temporal evolution of slip on the causative fault (or faults). These types of simulations are very helpful, especially since they can be used in seismic data inversions to relate the ground motions recorded in the field to slip on the fault(s) at depth. However, these kinematic solutions generally provide no insight into the physics driving the fault slip or information about why the involved fault(s) slipped that much (or that little). In other words, these kinematic solutions may lack information about the physical dynamics of earthquake rupture that will be most helpful in forecasting future events. To help address this issue, some researchers use computer codes to numerically simulate earthquakes and construct dynamic, spontaneous

  16. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  17. California's Vulnerability to Volcanic Hazards: What's at Risk?

    Science.gov (United States)

    Mangan, M.; Wood, N. J.; Dinitz, L.

    2015-12-01

    California is a leader in comprehensive planning for devastating earthquakes, landslides, floods, and tsunamis. Far less attention, however, has focused on the potentially devastating impact of volcanic eruptions, despite the fact that they occur in the State about as frequently as the largest earthquakes on the San Andreas Fault Zone. At least 10 eruptions have occurred in the past 1,000 years—most recently in northern California (Lassen Peak 1914 to 1917)—and future volcanic eruptions are inevitable. The likelihood of renewed volcanism in California is about one in a few hundred to one in a few thousand annually. Eight young volcanoes, ranked as Moderate to Very High Threat [1] are dispersed throughout the State. Partially molten rock (magma) resides beneath at least seven of these—Medicine Lake Volcano, Mount Shasta, Lassen Volcanic Center, Clear Lake Volcanic Field, Long Valley Volcanic Region, Coso Volcanic Field, and Salton Buttes— causing earthquakes, toxic gas emissions, hydrothermal activity, and (or) ground deformation. Understanding the hazards and identifying what is at risk are the first steps in building community resilience to volcanic disasters. This study, prepared in collaboration with the State of California Governor's Office of Emergency Management and the California Geological Survey, provides a broad perspective on the State's exposure to volcano hazards by integrating mapped volcano hazard zones with geospatial data on at-risk populations, infrastructure, and resources. The study reveals that ~ 16 million acres fall within California's volcano hazard zones, along with ~ 190 thousand permanent and 22 million transitory populations. Additionally, far-field disruption to key water delivery systems, agriculture, utilities, and air traffic is likely. Further site- and sector-specific analyses will lead to improved hazard mitigation efforts and more effective disaster response and recovery. [1] "Volcanic Threat and Monitoring Capabilities

  18. Earthquake Preparedness 101: Planning Guidelines for Colleges and Universities.

    Science.gov (United States)

    California Governor's Office, Sacramento.

    This publication is a guide for California colleges and universities wishing to prepare for earthquakes. An introduction aimed at institutional leaders emphasizes that earthquake preparedness is required by law and argues that there is much that can be done to prepare for earthquakes. The second section, addressed to the disaster planner, offers…

  19. Moment-ratio imaging of seismic regions for earthquake prediction

    Science.gov (United States)

    Lomnitz, Cinna

    1993-10-01

    An algorithm for predicting large earthquakes is proposed. The reciprocal ratio (mri) of the residual seismic moment to the total moment release in a region is used for imaging seismic moment precursors. Peaks in mri predict recent major earthquakes, including the 1985 Michoacan, 1985 central Chile, and 1992 Eureka, California earthquakes.
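    One plausible reading of the mri statistic described above is the ratio of total to residual (not-yet-released) moment in a region; this is a sketch of the idea only, and Lomnitz's exact definition may differ:

```python
def moment_ratio_image(released, total):
    # mri = total / residual, with residual = total - released moment;
    # the statistic grows as a region's unreleased share of moment shrinks.
    return [total / (total - m) for m in released]

# Hypothetical cumulative released moment (arbitrary units):
mri = moment_ratio_image([0.0, 0.2, 0.5, 0.8], total=1.0)
# mri rises monotonically toward a peak as release nears the total.
```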

  20. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  1. Satellite Geodetic Constraints On Earthquake Processes: Implications of the 1999 Turkish Earthquakes for Fault Mechanics and Seismic Hazards on the San Andreas Fault

    Science.gov (United States)

    Reilinger, Robert

    2005-01-01

    Our principal activities during the initial phase of this project include: 1) Continued monitoring of postseismic deformation for the 1999 Izmit and Duzce, Turkey earthquakes from repeated GPS survey measurements and expansion of the Marmara Continuous GPS Network (MAGNET), 2) Establishing three North Anatolian fault crossing profiles (10 sites/profile) at locations that experienced major surface-fault earthquakes at different times in the past to examine strain accumulation as a function of time in the earthquake cycle (2004), 3) Repeat observations of selected sites in the fault-crossing profiles (2005), 4) Repeat surveys of the Marmara GPS network to continue to monitor postseismic deformation, 5) Refining block models for the Marmara Sea seismic gap area to better understand earthquake hazards in the Greater Istanbul area, 6) Continuing development of models for afterslip and distributed viscoelastic deformation for the earthquake cycle. We are keeping close contact with MIT colleagues (Brad Hager and Eric Hetland) who are developing models for S. California and for the earthquake cycle in general (Hetland, 2006). In addition, our Turkish partners at the Marmara Research Center have undertaken repeat, micro-gravity measurements at the MAGNET sites and have provided us estimates of gravity change during the period 2003 - 2005.

  2. Simulation of Ground-Water Flow in the Irwin Basin Aquifer System, Fort Irwin National Training Center, California

    Science.gov (United States)

    Densmore, Jill N.

    2003-01-01

    Ground-water pumping in the Irwin Basin at Fort Irwin National Training Center, California resulted in water-level declines of about 30 feet from 1941 to 1996. Since 1992, artificial recharge from wastewater-effluent infiltration and irrigation-return flow has stabilized water levels, but there is concern that future water demands associated with expansion of the base may cause a resumption of water-level declines. To address these concerns, a ground-water flow model of the Irwin Basin was developed to help better understand the aquifer system, assess the long-term availability and quality of ground water, and evaluate ground-water conditions owing to current pumping and to plan for future water needs at the base. Historical data show that ground-water-level declines in the Irwin Basin between 1941 and 1996, caused the formation of a pumping depression near the pumped wells, and that recharge from the wastewater-treatment facility and disposal area caused the formation of a recharge mound. There have been two periods of water-level recovery in the Irwin Basin since the development of ground water in this basin; these periods coincide with a period of decreased pumpage from the basin and a period of increased recharge of water imported from the Bicycle Basin beginning in 1967 and from the Langford Basin beginning in 1992. Since 1992, artificial recharge has exceeded pumpage in the Irwin Basin and has stabilized water-level declines. A two-layer ground-water flow model was developed to help better understand the aquifer system, assess the long-term availability and quality of ground water, and evaluate ground-water conditions owing to current pumping and to plan for future water needs at the base. Boundary conditions, hydraulic conductivity, altitude of the bottom of the layers, vertical conductance, storage coefficient, recharge, and discharge were determined using existing geohydrologic data. Rates and distribution of recharge and discharge were determined from

  3. Hydrogen isotope investigation of amphibole and biotite phenocrysts in silicic magmas erupted at Lassen Volcanic Center, California

    Science.gov (United States)

    Underwood, S.J.; Feeley, T.C.; Clynne, M.A.

    2012-01-01

    Hydrogen isotope ratio, water content and Fe3+/Fe2+ in coexisting amphibole and biotite phenocrysts in volcanic rocks can provide insight into shallow pre- and syn-eruptive magmatic processes such as vesiculation, and lava drainback with mixing into less devolatilized magma that erupts later in a volcanic sequence. We studied four ~ 35 ka and younger eruption sequences (i.e. Kings Creek, Lassen Peak, Chaos Crags, and 1915) at the Lassen Volcanic Center (LVC), California, where intrusion of crystal-rich silicic magma mushes by mafic magmas is inferred from the varying abundances of mafic magmatic inclusions (MMIs) in the silicic volcanic rocks. Types and relative proportions of reacted and unreacted hydrous phenocryst populations are evaluated with accompanying chemical and H isotope changes. Biotite phenocrysts were more susceptible to rehydration in older vesicular glassy volcanic rocks than coexisting amphibole phenocrysts. Biotite and magnesiohornblende phenocrysts toward the core of the Lassen Peak dome are extensively dehydroxylated and reacted from prolonged exposure to high temperature, low pressure, and higher fO2 conditions from post-emplacement cooling. In silicic volcanic rocks not affected by alteration, biotite phenocrysts are often relatively more dehydroxylated than are magnesiohornblende phenocrysts of similar size; this is likely due to the ca 10 times larger overall bulk H diffusion coefficient in biotite. A simplified model of dehydrogenation in hydrous phenocrysts above reaction closure temperature suggests that eruption and quench of magma ascended to the surface in a few hours is too short a time for substantial H loss from amphibole. In contrast, slowly ascended magma can have extremely dehydrogenated and possibly dehydrated biotite, relatively less dehydrogenated magnesiohornblende and reaction rims on both phases. 
Eruptive products containing the highest proportions of mottled dehydrogenated crystals could indicate that within a few days

  4. Nowcasting Earthquakes

    Science.gov (United States)

    Rundle, J. B.; Donnellan, A.; Grant Ludwig, L.; Turcotte, D. L.; Luginbuhl, M.; Gail, G.

    2016-12-01

    Nowcasting is a term originating from economics and finance. It refers to the process of determining the uncertain state of the economy or markets at the current time by indirect means. We apply this idea to seismically active regions, where the goal is to determine the current state of the fault system, and its current level of progress through the earthquake cycle. In our implementation of this idea, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. Our method does not involve any model other than the idea of an earthquake cycle. Rather, we define a specific region and a specific large earthquake magnitude of interest, ensuring that we have enough data to span at least 20 or more large earthquake cycles in the region. We then compute the earthquake potential score (EPS), defined as the cumulative probability distribution P(n < n(t)) for the number of small earthquakes n(t) in the region. From the count of small earthquakes since the last large earthquake, we determine the value of EPS = P(n < n(t)), which measures the level of progress through the earthquake cycle in the defined region at the current time.
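The counting procedure the abstract describes can be sketched as an empirical cumulative probability. This is a hypothetical illustration, not the authors' code, and the per-cycle counts below are invented rather than drawn from a real catalog:

```python
# Sketch of an earthquake potential score (EPS): the fraction of past
# large-earthquake cycles whose small-quake count fell below the count
# n(t) observed since the most recent large earthquake.

def earthquake_potential_score(past_cycle_counts, current_count):
    """Empirical CDF value P(n < n(t)) over past cycles."""
    below = sum(1 for n in past_cycle_counts if n < current_count)
    return below / len(past_cycle_counts)

# 20 invented past cycles of small-earthquake counts between large events
past = [120, 95, 140, 80, 200, 150, 110, 90, 170, 130,
        100, 160, 85, 145, 125, 105, 190, 75, 135, 115]

print(earthquake_potential_score(past, 150))  # 0.75 -> late in the cycle
```

A high EPS means the current small-earthquake count already exceeds most historical inter-event counts, i.e. the region is late in its cycle under this simple model.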

  5. Earthquake Facts

    Science.gov (United States)

    ... estimated 830,000 people. In 1976 another deadly earthquake struck in Tangshan, China, where more than 250,000 people were killed. Florida and North Dakota have the smallest number of earthquakes in the United States. The deepest earthquakes typically ...

  6. Fluid-faulting evolution in high definition: Connecting fault structure and frequency-magnitude variations during the 2014 Long Valley Caldera, California earthquake swarm

    Science.gov (United States)

    Shelly, David R.; Ellsworth, William L.; Hill, David P.

    2016-01-01

    An extended earthquake swarm occurred beneath southeastern Long Valley Caldera between May and November 2014, culminating in three magnitude 3.5 earthquakes and 1145 cataloged events on 26 September alone. The swarm produced the most prolific seismicity in the caldera since a major unrest episode in 1997-1998. To gain insight into the physics controlling swarm evolution, we used large-scale cross-correlation between waveforms of cataloged earthquakes and continuous data, producing precise locations for 8494 events, more than 2.5 times the routine catalog. We also estimated magnitudes for 18,634 events (~5.5 times the routine catalog), using a principal component fit to measure waveform amplitudes relative to cataloged events. This expanded and relocated catalog reveals multiple episodes of pronounced hypocenter expansion and migration on a collection of neighboring faults. Given the rapid migration and alignment of hypocenters on narrow faults, we infer that activity was initiated and sustained by an evolving fluid pressure transient with a low-viscosity fluid, likely composed primarily of water and CO2 exsolved from underlying magma. Although both updip and downdip migration were observed within the swarm, downdip activity ceased shortly after activation, while updip activity persisted for weeks at moderate levels. Strongly migrating, single-fault episodes within the larger swarm exhibited a higher proportion of larger earthquakes (lower Gutenberg-Richter b value), which may have been facilitated by fluid pressure confined in two dimensions within the fault zone. In contrast, the later swarm activity occurred on an increasingly diffuse collection of smaller faults, with a much higher b value.
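The b-value comparison in this abstract rests on the standard Gutenberg-Richter maximum-likelihood estimator. As a hedged sketch (not the authors' implementation), with invented magnitudes and an assumed completeness magnitude Mc:

```python
import math

# Aki/Utsu maximum-likelihood Gutenberg-Richter b-value, with a dm/2
# correction for magnitude binning. Lower b => relatively more large events.

def b_value(mags, mc, dm=0.1):
    """b-value for magnitudes >= mc (completeness magnitude)."""
    m = [x for x in mags if x >= mc]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (mc - dm / 2))

# Invented swarm magnitudes; a real analysis would use the relocated catalog
mags = [0.5, 0.7, 1.2, 0.6, 1.8, 0.9, 1.1, 0.8, 2.3, 1.4, 0.6, 1.0]
print(round(b_value(mags, mc=0.5), 2))  # ~0.69
```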

  7. Fault-Zone Maturity Defines Maximum Earthquake Magnitude: The case of the North Anatolian Fault Zone

    Science.gov (United States)

    Bohnhoff, Marco; Bulut, Fatih; Stierle, Eva; Martinez-Garzon, Patricia; Ben-Zion, Yehuda

    2015-04-01

    Estimating the maximum likely magnitude of future earthquakes on transform faults near large metropolitan areas has fundamental consequences for the expected hazard. Here we show that the maximum earthquakes on different sections of the North Anatolian Fault Zone (NAFZ) scale with the duration of fault zone activity, cumulative offset and length of individual fault segments. The findings are based on a compiled catalogue of historical earthquakes in the region, using the extensive literary sources that exist due to the long civilization record. We find that the largest earthquakes (M~8) are exclusively observed along the well-developed part of the fault zone in the east. In contrast, the western part is still in a juvenile or transitional stage with historical earthquakes not exceeding M=7.4. This limits the current seismic hazard to NW Turkey and its largest regional population and economical center Istanbul. Our findings for the NAFZ are consistent with data from the two other major transform faults, the San Andreas fault in California and the Dead Sea Transform in the Middle East. The results indicate that maximum earthquake magnitudes generally scale with fault-zone evolution.

  8. USC/School Performance Dashboard 2013. A Report from the Center on Educational Governance/University of Southern California

    Science.gov (United States)

    Center on Educational Governance, 2013

    2013-01-01

    The USC School Performance Dashboard, now in its seventh year, draws on California school data from 2003-2012 to rate charter schools on academic and financial measures of performance. It also provides an accompanying interactive site at www.uscrossier.org/ceg/. Unlike other school databases, this one assigns values--high, medium, low--to the…

  9. Detecting Significant Stress Drop Variations in Large Micro-Earthquake Datasets: A Comparison Between a Convergent Step-Over in the San Andreas Fault and the Ventura Thrust Fault System, Southern California

    Science.gov (United States)

    Goebel, T. H. W.; Hauksson, E.; Plesch, A.; Shaw, J. H.

    2017-06-01

    A key parameter in engineering seismology and earthquake physics is seismic stress drop, which describes the relative amount of high-frequency energy radiation at the source. To identify regions with potentially significant stress drop variations, we perform a comparative analysis of source parameters in the greater San Gorgonio Pass (SGP) and Ventura basin (VB) in southern California. The identification of physical stress drop variations is complicated by large data scatter as a result of attenuation, limited recording bandwidth and imprecise modeling assumptions. In light of the inherently high uncertainties in single stress drop measurements, we follow the strategy of stacking large numbers of source spectra thereby enhancing the resolution of our method. We analyze more than 6000 high-quality waveforms between 2000 and 2014, and compute seismic moments, corner frequencies and stress drops. Significant variations in stress drop estimates exist within the SGP area. Moreover, the SGP also exhibits systematically higher stress drops than VB and shows more scatter. We demonstrate that the higher scatter in SGP is not a generic artifact of our method but an expression of differences in underlying source processes. Our results suggest that higher differential stresses, which can be deduced from larger focal depth and more thrust faulting, may only be of secondary importance for stress drop variations. Instead, the general degree of stress field heterogeneity and strain localization may influence stress drops more strongly, so that more localized faulting and homogeneous stress fields favor lower stress drops. In addition, higher loading rates, for example, across the VB potentially result in stress drop reduction whereas slow loading rates on local fault segments within the SGP region result in anomalously high stress drop estimates. 
Our results show that crustal and fault properties systematically influence earthquake stress drops of small and large events and should
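The moment / corner-frequency / stress-drop relation underlying this kind of analysis is commonly taken from the Brune source model. The sketch below is illustrative only (not the authors' code), and the moment, corner frequency, and shear-wave speed are hypothetical values:

```python
# Brune-model stress drop: source radius r = 0.372 * beta / fc,
# stress drop = 7 * M0 / (16 * r^3).

def brune_stress_drop(m0_nm, fc_hz, beta_ms=3500.0):
    """Stress drop in Pa from seismic moment (N*m) and corner frequency (Hz)."""
    r = 0.372 * beta_ms / fc_hz          # source radius in meters
    return 7.0 * m0_nm / (16.0 * r**3)   # stress drop in Pa

# Roughly M ~ 3 earthquake: M0 ~ 3.5e13 N*m, fc ~ 5 Hz (assumed values)
print(brune_stress_drop(3.5e13, 5.0) / 1e6, "MPa")  # order of 1 MPa
```

In practice, as the abstract notes, single-event estimates like this scatter widely, which is why the authors stack large numbers of source spectra.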

  10. Earthquake Preparedness and Education: A Collective Impact Approach to Improving Awareness and Resiliency

    Science.gov (United States)

    Benthien, M. L.; Wood, M. M.; Ballmann, J. E.; DeGroot, R. M.

    2017-12-01

    The Southern California Earthquake Center (SCEC), headquartered at the University of Southern California, is a collaboration of more than 1000 scientists and students from 70+ institutions. SCEC's Communication, Education, and Outreach (CEO) program translates earthquake science into products and activities in order to increase scientific literacy, develop a diverse scientific workforce, and reduce earthquake risk to life and property. SCEC CEO staff coordinate these efforts through partnership collaborations it has established to engage subject matter experts, reduce duplication of effort, and achieve greater results. Several of SCEC's collaborative networks began within Southern California and have since grown statewide (Earthquake Country Alliance, a public-private-grassroots partnership), national ("EPIcenter" Network of museums, parks, libraries, etc.), and international (Great ShakeOut Earthquake Drills with millions of participants each year). These networks have benefitted greatly from partnerships with national (FEMA), state, and local emergency managers. Other activities leverage SCEC's networks in new ways and with national earth science organizations, such as the EarthConnections Program (with IRIS, NAGT, and many others), Quake Catcher Network (with IRIS) and the GeoHazards Messaging Collaboratory (with IRIS, UNAVCO, and USGS). Each of these partnerships share a commitment to service, collaborative development, and the application of research (including social science theory for motivating preparedness behaviors). SCEC CEO is developing new evaluative structures and adapting the Collective Impact framework to better understand what has worked well or what can be improved, according to the framework's five key elements: create a common agenda; share common indicators and measurement; engage diverse stakeholders to coordinate mutually reinforcing activities; initiate continuous communication; and provide "backbone" support. 
This presentation will provide

  11. Instrumental shaking thresholds for seismically induced landslides and preliminary report on landslides triggered by the October 17, 1989, Loma Prieta, California earthquake

    Science.gov (United States)

    Harp, E.L.

    1993-01-01

    The generation of seismically induced landslides depends on the characteristics of shaking as well as on the mechanical properties of geologic materials. A very important parameter in the study of seismically induced landslides is an intensity measure based on a strong-motion accelerogram: it is defined as the Arias intensity and is proportional to the duration of the shaking record as well as to its amplitude. Given a theoretical relationship between Arias intensity, magnitude, and distance, it is possible to predict how far from the seismic source landslides are likely to occur for a given magnitude earthquake. Field investigations have established that the threshold level of Arias intensity also depends on site effects, particularly the fracture characteristics of the outcrops present. -from Author
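The intensity measure described here has a simple closed form, Ia = (pi / (2g)) * integral of a(t)^2 dt over the record. A minimal sketch, using a synthetic decaying-sinusoid accelerogram rather than a real strong-motion trace:

```python
import math

# Arias intensity via trapezoidal integration of the squared accelerogram.

def arias_intensity(accel, dt, g=9.81):
    """Arias intensity (m/s) from acceleration samples (m/s^2) at spacing dt (s)."""
    a2 = [a * a for a in accel]
    integral = sum((a2[i] + a2[i + 1]) * dt / 2.0 for i in range(len(a2) - 1))
    return math.pi / (2.0 * g) * integral

# 10 s synthetic record sampled at 100 Hz: 2 m/s^2 peak, 2 Hz, decaying
dt = 0.01
accel = [2.0 * math.exp(-0.5 * i * dt) * math.sin(2 * math.pi * 2 * i * dt)
         for i in range(1001)]
print(arias_intensity(accel, dt))
```

Note that both a longer record and a larger amplitude raise Ia, matching the abstract's statement that Arias intensity is proportional to duration as well as amplitude.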

  12. Earthquake Forecasting System in Italy

    Science.gov (United States)

    Falcone, G.; Marzocchi, W.; Murru, M.; Taroni, M.; Faenza, L.

    2017-12-01

    In Italy, after the 2009 L'Aquila earthquake, a procedure was developed for gathering and disseminating authoritative information about the time dependence of seismic hazard to help communities prepare for a potentially destructive earthquake. The most striking time dependency of the earthquake occurrence process is the time clustering, which is particularly pronounced in time windows of days and weeks. The Operational Earthquake Forecasting (OEF) system that is developed at the Seismic Hazard Center (Centro di Pericolosità Sismica, CPS) of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) is the authoritative source of seismic hazard information for Italian Civil Protection. The philosophy of the system rests on a few basic concepts: transparency, reproducibility, and testability. In particular, the transparent, reproducible, and testable earthquake forecasting system developed at CPS is based on ensemble modeling and on a rigorous testing phase. Such phase is carried out according to the guidance proposed by the Collaboratory for the Study of Earthquake Predictability (CSEP, an international infrastructure aimed at evaluating quantitatively earthquake prediction and forecast models through purely prospective and reproducible experiments). In the OEF system, the two most popular short-term models were used: the Epidemic-Type Aftershock Sequences (ETAS) and the Short-Term Earthquake Probabilities (STEP). Here, we report the results from OEF's 24-hour earthquake forecasting during the main phases of the 2016-2017 sequence that occurred in the Central Apennines (Italy).

  13. Workshop targets development of geodetic transient detection methods: 2009 SCEC Annual Meeting: Workshop on transient anomalous strain detection; Palm Springs, California, 12-13 September 2009

    Science.gov (United States)

    Murray-Moraleda, Jessica R.; Lohman, Rowena

    2010-01-01

    The Southern California Earthquake Center (SCEC) is a community of researchers at institutions worldwide working to improve understanding of earthquakes and mitigate earthquake risk. One of SCEC's priority objectives is to “develop a geodetic network processing system that will detect anomalous strain transients.” Given the growing number of continuously recording geodetic networks consisting of hundreds of stations, an automated means for systematically searching data for transient signals, especially in near real time, is critical for network operations, hazard monitoring, and event response. The SCEC Transient Detection Test Exercise began in 2008 to foster an active community of researchers working on this problem, explore promising methods, and combine effective approaches in novel ways. A workshop was held in California to assess what has been learned thus far and discuss areas of focus as the project moves forward.

  14. Impact of the Northridge earthquake on the mental health of veterans: results from a panel study.

    Science.gov (United States)

    Dobalian, Aram; Stein, Judith A; Heslin, Kevin C; Riopelle, Deborah; Venkatesh, Brinda; Lanto, Andrew B; Simon, Barbara; Yano, Elizabeth M; Rubenstein, Lisa V

    2011-09-01

    The 1994 earthquake that struck Northridge, California, led to the closure of the Veterans Health Administration Medical Center at Sepulveda. This article examines the earthquake's impact on the mental health of an existing cohort of veterans who had previously used the Sepulveda Veterans Health Administration Medical Center. From 1 to 3 months after the disaster, trained interviewers made repeated attempts to contact participants by telephone to administer a repeated measures follow-up design survey based on a survey that had been done preearthquake. Postearthquake data were obtained on 1144 of 1800 (64%) male veterans for whom there were previous data. We tested a predictive latent variable path model of the relations between sociodemographic characteristics, predisaster physical and emotional health measures, and postdisaster emotional health and perceived earthquake impact. Perceived earthquake impact was predicted by predisaster emotional distress, functional limitations, and number of health conditions. Postdisaster emotional distress was predicted by preexisting emotional distress and earthquake impact. The regression coefficient from earthquake impact to postearthquake emotional distress was larger than that of the stability coefficient from preearthquake emotional distress. Postearthquake emotional distress also was affected indirectly by preearthquake emotional distress, health conditions, younger age, and lower socioeconomic status. The postdisaster emotional health of veterans who experienced greater earthquake impact would have likely benefited from postdisaster intervention, regardless of their predisaster emotional health. Younger veterans and veterans with generally poor physical and emotional health were more vulnerable to greater postearthquake emotional distress. Veterans of lower socioeconomic status were disproportionately likely to experience more effects of the disaster because they had more predisaster emotional distress, more functional

  15. Undead earthquakes

    Science.gov (United States)

    Musson, R. M. W.

    This short communication deals with the problem of fake earthquakes that keep returning into circulation. The particular events discussed are some very early earthquakes supposed to have occurred in the U.K., which all originate from a single enigmatic 18th century source.

  16. Recovery Act: Federspiel Controls (now Vigilent) and State of California Department of General Services Data Center Energy Efficient Cooling Control Demonstration. Final technical project report

    Energy Technology Data Exchange (ETDEWEB)

    Federspiel, Clifford; Evers, Myah

    2011-09-30

    Eight State of California data centers were equipped with an intelligent energy management system to evaluate the effectiveness, energy savings, dollar savings and benefits that arise when powerful artificial intelligence-based technology measures, monitors and actively controls cooling operations. Control software, wireless sensors and mesh networks were used at all sites. Most sites used variable frequency drives as well. The system dynamically adjusts temperature and airflow on the fly by analyzing real-time demands, thermal behavior and historical data collected on site. Taking into account the chaotic interrelationships of hundreds to thousands of variables in a data center, the system optimizes the temperature distribution across a facility while also intelligently balancing loads, outputs, and airflow. The overall project will provide a reduction in energy consumption of more than 2.3 million kWh each year, which translates to $240,000 saved and a reduction of 1.58 million pounds of carbon emissions. Across all sites, the cooling energy consumption was reduced by 41%. The average reduction in energy savings across all the sites that use VFDs is higher at 58%. Before this case study, all eight data centers ran the cooling fans at 100% capacity all of the time. Because of the new technology, cooling fans run at the optimum fan speed maintaining stable air equilibrium while also expending the least amount of electricity. With lower fan speeds, the life of the capital investment made on cooling equipment improves, and the cooling capacity of the data center increases. This case study depicts a rare technological feat: The same process and technology worked cost effectively in eight very different environments. The results show that savings were achieved in centers with diverse specifications for the sizes, ages and types of cooling equipment. The percentage of cooling energy reduction ranged from 19% to 78% while keeping temperatures substantially within the

  17. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  18. California sea lion and northern fur seal censuses conducted at Channel Islands, California by Alaska Fisheries Science Center from 1969-07-31 to 2015-08-08 (NCEI Accession 0145165)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Marine Mammal Laboratories' California Current Ecosystem Program (AFSC/NOAA) initiated and maintains census programs for California sea lions (Zalophus...

  19. Survival and natality rate observations of California sea lions at San Miguel Island, California conducted by Alaska Fisheries Science Center, National Marine Mammal Laboratory from 1987-09-20 to 2014-09-25 (NCEI Accession 0145167)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The dataset contains initial capture and marking data for California sea lion (Zalophus californianus) pups at San Miguel Island, California and subsequent...

  20. Identified EM Earthquake Precursors

    Science.gov (United States)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance making it a high quality insulator. Penetrative flow could not be corroborated as well, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. 
The antennae have mobility and observations were noted for

  1. California Environmental Vulnerability Assessment (CEVA) Score, San Joaquin Valley CA, 2013, UC Davis Center for Regional Change

    Data.gov (United States)

    U.S. Environmental Protection Agency — This data set is based on a three year study by the UC Davis Center for Regional Change, in affiliation with the Environmental Justice Project of the John Muir...

  2. Assessing the Impact of School-Based Health Centers on Academic Achievement and College Preparation Efforts: Using Propensity Score Matching to Assess School-Level Data in California.

    Science.gov (United States)

    Bersamin, Melina; Garbers, Samantha; Gaarde, Jenna; Santelli, John

    2016-08-01

    This study examines the association between school-based health center (SBHC) presence and school-wide measures of academic achievement and college preparation efforts. Publicly available educational and demographic data from 810 California public high schools were linked to a list of schools with an SBHC. Propensity score matching, a method to reduce bias inherent in nonrandomized control studies, was used to select comparison schools. Regression analyses, controlling for proportion of English-language learners, were conducted for each outcome including proportion of students participating in three College Board exams, graduation rates, and meeting university graduation requirements. Findings suggest that SBHC presence is positively associated with college preparation outcomes but not with academic achievement outcomes (graduation rates or meeting state graduation requirements). Future research must examine underlying mechanisms supporting this association, such as school connectedness. Additional research should explore the role that SBHC staff could have in supporting college preparation efforts. © The Author(s) 2016.
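The matching step described in this abstract can be illustrated with a minimal nearest-neighbor propensity-score match. This is a hypothetical sketch, not the study's code: the school names and propensity scores below are invented, and in the study the scores would come from a model of SBHC presence on school demographics:

```python
# Nearest-neighbor propensity-score matching (with replacement): each
# treated unit is paired with the control whose score is closest.

def nearest_neighbor_match(treated, controls):
    """Inputs are {unit_id: propensity_score} dicts; returns treated->control."""
    matches = {}
    for t_id, t_score in treated.items():
        matches[t_id] = min(controls, key=lambda c: abs(controls[c] - t_score))
    return matches

sbhc = {"school_A": 0.62, "school_B": 0.35}          # hypothetical SBHC schools
no_sbhc = {"school_X": 0.60, "school_Y": 0.33,       # hypothetical comparisons
           "school_Z": 0.80}
print(nearest_neighbor_match(sbhc, no_sbhc))
```

Outcome regressions are then run on the matched sample, which is how the study reduces the bias inherent in comparing self-selected SBHC schools to all other schools.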

  3. Using earthquake intensities to forecast earthquake occurrence times

    Directory of Open Access Journals (Sweden)

    J. R. Holliday

    2006-01-01

    It is well known that earthquakes do not occur randomly in space and time. Foreshocks, aftershocks, precursory activation, and quiescence are just some of the patterns recognized by seismologists. Using the Pattern Informatics technique along with relative intensity analysis, we create a scoring method based on time-dependent relative operating characteristic diagrams and show that the occurrences of large earthquakes in California correlate with time intervals where fluctuations in small earthquakes are suppressed relative to the long-term average. We estimate a probability of less than 1% that this coincidence is due to random clustering. Furthermore, we show that the methods used to obtain these results may be applicable to other parts of the world.

  4. California Ocean Uses Atlas

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset is a result of the California Ocean Uses Atlas Project: a collaboration between NOAA's National Marine Protected Areas Center and Marine Conservation...

  5. University of Southern California

    Data.gov (United States)

    Federal Laboratory Consortium — The focus of the University of Southern California (USC) Children's Environmental Health Center is to develop a better understanding of how host susceptibility and...

  6. Earthquake Preparedness 101: Guidelines for Colleges and Universities.

    Science.gov (United States)

    California Governor's Office, Los Angeles. Office of Emergency Services.

    This document presents guidelines on emergency response and business recovery for colleges and universities in the event of an earthquake. The guidelines, developed by California institutions and revised based on experience with the Northridge earthquake, are provided under the following headings: (1) "To the President or Chancellor";…

  7. Statistical tests of simple earthquake cycle models

    Science.gov (United States)

    DeVries, Phoebe M. R.; Evans, Eileen

    2016-01-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM ~ 4.6 × 10^20 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
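The statistical machinery here is the two-sample Kolmogorov-Smirnov statistic: the maximum gap between two empirical CDFs. A stdlib-only sketch with invented data (real work would compare model-predicted and observed slip-rate distributions, e.g. via scipy.stats.ks_2samp):

```python
# Two-sample Kolmogorov-Smirnov statistic: the largest absolute difference
# between the empirical CDFs of two samples.

def ks_statistic(x, y):
    """D = max_v |F_x(v) - F_y(v)| over the pooled sample values."""
    xs, ys = sorted(x), sorted(y)
    grid = sorted(set(xs + ys))

    def ecdf(sample, v):
        return sum(1 for t in sample if t <= v) / len(sample)

    return max(abs(ecdf(xs, v) - ecdf(ys, v)) for v in grid)

a = [1.0, 2.0, 3.0, 4.0, 5.0]  # e.g. predicted slip rates (invented)
b = [3.0, 4.0, 5.0, 6.0, 7.0]  # e.g. observed slip rates (invented)
print(ks_statistic(a, b))  # 0.4
```

A model class is rejected when D exceeds the critical value at the chosen significance level (α = 0.05 in the abstract).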

  8. Stereo Pair, Pasadena, California

    Science.gov (United States)

    2000-01-01

This stereoscopic image pair is a perspective view that shows the western part of the city of Pasadena, California, looking north toward the San Gabriel Mountains. Portions of the cities of Altadena and La Canada Flintridge are also shown. The cluster of large buildings left of center, at the base of the mountains, is the Jet Propulsion Laboratory. This image shows the power of combining data from different sources to create planning tools to study problems that affect large urban areas. In addition to the well-known earthquake hazards, Southern California is affected by a natural cycle of fire and mudflows. Data shown in this image can be used to predict both how wildfires spread over the terrain and how mudflows are channeled down the canyons. The image was created from three datasets: the Shuttle Radar Topography Mission (SRTM) supplied the elevation, U.S. Geological Survey digital aerial photography provided the image detail, and the Landsat Thematic Mapper provided the color. The United States Geological Survey's Earth Resources Observation Systems (EROS) Data Center, Sioux Falls, South Dakota, provided the Landsat data and the aerial photography. The image can be viewed in 3-D by viewing the left image with the right eye and the right image with the left eye (cross-eyed viewing), or by downloading and printing the image pair and viewing them with a stereoscope. The Shuttle Radar Topography Mission (SRTM), launched on February 11, 2000, used the same radar instrument that comprised the Spaceborne Imaging Radar-C/X-Band Synthetic Aperture Radar (SIR-C/X-SAR) that flew twice on the Space Shuttle Endeavour in 1994. The mission was designed to collect three-dimensional measurements of the Earth's surface. To collect the 3-D data, engineers added a 60-meter-long (200-foot) mast, an additional C-band imaging antenna, and improved tracking and navigation devices. The mission is a cooperative project between the National Aeronautics and Space Administration (NASA) and the National Imagery and Mapping Agency (NIMA) of the U.S. Department of Defense.

  9. The Road to Total Earthquake Safety

    Science.gov (United States)

    Frohlich, Cliff

Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.

  10. Crustal structure determined from ambient noise tomography near the magmatic centers of the Coso region, southeastern California

    Science.gov (United States)

    Yang, Yingjie; Ritzwoller, Michael H.; Jones, Craig H.

    2011-02-01

    We apply seismic ambient noise tomography to image and investigate the shallow shear velocity structure beneath the Coso geothermal field and surrounding areas. Data from a PASSCAL experiment operated within the Coso geothermal field between 1998 and 2000 and surrounding broadband stations from the Southern California Seismic Network are acquired and processed. Daily cross correlations of ambient noise between all pairs of stations that overlapped in time of deployment were calculated and then stacked over the duration of deployment. Phase velocities of Rayleigh waves between 3 and 10 s periods are measured from the resulting cross correlations. Depending on the period, between about 300 and 600 reliable phase velocity measurements are inverted for phase velocity maps from 3 to 10 s periods, which in turn are inverted for a 3-D shear velocity model beneath the region. The resulting 3-D model reveals features throughout the region that correlate with surface geology. Beneath the Coso geothermal area shear velocities are generally depressed, a prominent low-velocity anomaly is resolved clearly within the top 2 km, no significant anomaly is seen below about 14 km depth, and a weakly resolved anomaly is observed between 6 and 12 km depth. The anomaly in the top 2 km probably results from geothermal alteration in the shallow subsurface, no magmatic body is imaged beneath 14 km depth, but the shear velocity anomaly between 6 and 12 km may be attributable to partial melt. The thickness and amplitude of the magma body trade off in the inversion and are ill determined. Low velocities in the regions surrounding Coso at depths near 7 km underlie areas with Miocene to recent volcanism, suggesting that some magmatic processing of the crust could be focused near this depth.
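The core of the workflow described above, daily cross-correlation of noise between station pairs followed by stacking, can be sketched as follows. The station records, noise levels, and inter-station lag are synthetic stand-ins; a real workflow would read miniSEED data, remove instrument responses, and whiten spectra first:

```python
# Hedged sketch of ambient-noise cross-correlation and stacking.
# Two synthetic "stations" record a shared noise field, one delayed by
# lag_true samples; stacking daily correlations recovers that lag.
import numpy as np

rng = np.random.default_rng(1)
n = 86400            # one day of 1 Hz samples
lag_true = 30        # inter-station travel time, in samples
n_days = 10
max_lag = 100

stack = np.zeros(2 * max_lag + 1)
for _ in range(n_days):
    src = rng.standard_normal(n)                  # common noise field
    sta_a = src + 0.5 * rng.standard_normal(n)    # station A record
    sta_b = np.roll(src, lag_true) + 0.5 * rng.standard_normal(n)
    # frequency-domain circular cross-correlation, kept to +/- max_lag
    xc = np.fft.irfft(np.fft.rfft(sta_a) * np.conj(np.fft.rfft(sta_b)), n)
    stack += np.concatenate([xc[-max_lag:], xc[:max_lag + 1]])

lags = np.arange(-max_lag, max_lag + 1)
print("peak lag (samples):", lags[np.argmax(stack)])
```

The peak of the stacked correlation sits at the inter-station delay (its sign depends on the correlation convention); in the real study, dispersion of the emergent surface wave between 3 and 10 s periods is what gets measured.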

  11. Radiologist compliance with California CT dose reporting requirements: a single-center review of pediatric chest CT.

    Science.gov (United States)

    Zucker, Evan J; Larson, David B; Newman, Beverley; Barth, Richard A

    2015-04-01

Effective July 1, 2012, CT dose reporting became mandatory in California. We sought to assess radiologist compliance with this legislation and to determine areas for improvement. We retrospectively reviewed reports from all chest CT examinations performed at our institution from July 1, 2012, through June 30, 2013, for errors in documentation of volume CT dose index (CTDIvol), dose-length product (DLP), and phantom size. Reports were considered legally compliant if both CTDIvol and DLP were documented accurately, and institutionally compliant if phantom size was also documented accurately. Additionally, we tracked reports that did not document dose in our standard format (phantom size, CTDIvol for each series, and total DLP). Radiologists omitted CTDIvol, DLP, or both in nine of 664 examinations (1.4%) and inaccurately reported one or both of them in 56 of the remaining 655 examinations (8.5%). Radiologists omitted phantom size in 11 of 664 examinations (1.7%) and inaccurately documented it in 20 of the remaining 653 examinations (3.1%). Of 664 examinations, 599 (90.2%) met legal reporting requirements, and 583 (87.8%) met institutional requirements. In reporting dose, radiologists variably used less decimal precision than available, summed CTDIvol, included only series-level DLP, and specified dose information from the scout topogram or a non-chest series for combination examinations. Our institutional processes, which rely primarily on correct human performance, do not ensure accurate dose reporting and are prone to variation in dose reporting format. In view of this finding, we are exploring higher-reliability processes, including better-defined standards and automated dose reporting systems, to improve compliance.

  12. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  13. The Geoscience Diversity Enhancement Program (GDEP): Building an Earth System Science Centered Research, Education, and Outreach Effort in Urban Long Beach, California

    Science.gov (United States)

    Ambos, E. L.; Behl, R.; Francis, R. D.; Larson, D. O.; Ramirez, M.; Rodrigue, C.; Sample, J.; Wechsler, S.; Whitney, D.; Hazen, C.

    2002-12-01

The Geoscience Diversity Enhancement Program (GDEP) is an NSF-OEDG funded project at California State University, Long Beach (CSULB). Program goals include increasing awareness of geoscience careers, and the availability and accessibility of research experiences, among area high school and community college faculty and students from underrepresented groups. Begun in fall 2001, GDEP involves faculty leadership within three CSULB departments (geological sciences, geography, and anthropology), as well as five community colleges and one of the largest K-12 school districts in California, Long Beach Unified. In addition, linkages to CSULB's outreach and student orientation activities are strong, with the facilitation of staff in CSULB's Student Access to Science and Mathematics (SAS) Center. During the first year, program activities centered on three major objectives: (1) creating the CSULB leadership team, and developing a robust and sustainable decision-making process, coupled with extensive relationship-building with community college and high school partners, (2) creating an evaluation plan that reflects institutional and leadership goals, and comprehensively piloting evaluation instruments, and (3) designing and implementing a summer research experience, which was successfully inaugurated during summer 2002. We were very successful in achieving objective (1): each member of the leadership group took strong roles in the design and success of the program. Several meetings were held with each community college and high school faculty colleague to clarify and reaffirm program values and goals. Objective (2), led by project evaluator David Whitney, resulted in an array of evaluation instruments that were tested in introductory geology, geography, and archaeology courses at CSULB. These evaluation instruments were designed to measure attitudes and beliefs of a diverse cross-section of CSULB students. Preliminary analysis of survey results reveals significant

  14. Addressing the Invisible Achievement Gap: The Need to Improve Education Outcomes for California Students in Foster Care, with Considerations for Action. CenterView: Insight and Analysis on California's Education Policy

    Science.gov (United States)

    Center for the Future of Teaching and Learning at WestEd, 2014

    2014-01-01

    Students who are in foster care--compared to all other student groups in California--drop out of school at much higher rates and graduate at much lower rates, with only about 58 percent of 12th-grade students earning a high school diploma. These and other findings on California students in foster care are documented in "The Invisible…

  15. Seismogeodetic monitoring techniques for tsunami and earthquake early warning and rapid assessment of structural damage

    Science.gov (United States)

    Haase, J. S.; Bock, Y.; Saunders, J. K.; Goldberg, D.; Restrepo, J. I.

    2016-12-01

As part of an effort to promote the use of NASA-sponsored Earth science information for disaster risk reduction, real-time high-rate seismogeodetic data are being incorporated into early warning and structural monitoring systems. Seismogeodesy combines seismic acceleration and GPS displacement measurements using a tightly coupled Kalman filter to provide absolute estimates of seismic acceleration, velocity, and displacement. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. Real-time seismogeodetic observations at subduction zones allow for more robust and rapid magnitude and slip estimation that increases warning time in the near-source region. A NASA-funded effort to utilize GPS and seismogeodesy in NOAA's Tsunami Warning Centers in Alaska and Hawaii integrates new modules for picking, locating, and estimating magnitudes and moment tensors for earthquakes into the USGS Earthworm environment at the TWCs. In a related project, NASA supports the transition of this research to seismogeodetic tools for disaster preparedness, specifically by implementing GPS and low-cost MEMS accelerometers for structural monitoring in partnership with earthquake engineers. Real-time high-rate seismogeodetic structural monitoring has been implemented on two structures. The first is a parking garage at the Autonomous University of Baja California Faculty of Medicine in Mexicali, not far from the rupture of the 2010 Mw 7.2 El Mayor-Cucapah earthquake, enabled through a UC MEXUS collaboration. The second is the 8-story Geisel Library at the University of California, San Diego (UCSD). The system has also been installed for several proof-of-concept experiments at the UCSD Network for Earthquake Engineering Simulation (NEES) Large High Performance Outdoor Shake Table. 
We present MEMS-based seismogeodetic observations from the 10 June
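The seismogeodetic fusion described above can be illustrated with a toy one-dimensional filter: high-rate accelerometer readings are precise but drift when integrated, while lower-rate GPS displacements are noisy but absolute. This is a simplified, loosely coupled sketch with invented noise levels, not the tightly coupled filter of the operational system:

```python
# Toy 1-D Kalman filter fusing a biased 100 Hz accelerometer (used as the
# process input) with 1 Hz GPS displacement observations. All signal and
# noise parameters are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(2)
dt = 0.01                                   # 100 Hz accelerometer
t = np.arange(0, 10, dt)
true_acc = np.sin(2 * np.pi * 0.5 * t)      # synthetic ground motion
true_disp = np.cumsum(np.cumsum(true_acc) * dt) * dt

acc_meas = true_acc + 0.02 * rng.standard_normal(t.size) + 0.01  # noise + bias
gps_every = 100                             # one GPS sample per second
gps_noise = 0.005                           # 5 mm displacement noise

x = np.zeros(2)                             # state: [displacement, velocity]
P = np.eye(2)
F = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([0.5 * dt**2, dt])
Q = 1e-6 * np.eye(2)
H = np.array([[1.0, 0.0]])
R = np.array([[gps_noise**2]])

est = np.empty(t.size)
for k in range(t.size):
    x = F @ x + B * acc_meas[k]             # predict with accelerometer input
    P = F @ P @ F.T + Q
    if k % gps_every == 0:                  # GPS displacement update
        z = true_disp[k] + gps_noise * rng.standard_normal()
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + (K @ (np.array([z]) - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
    est[k] = x[0]

print("rms displacement error (m):", np.sqrt(np.mean((est - true_disp) ** 2)))
```

The GPS updates repeatedly absorb the drift that the accelerometer bias would otherwise build up through double integration, which is the essential benefit of combining the two sensor types.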

  16. California Bioregions

    Data.gov (United States)

    California Department of Resources — California regions developed by the Inter-agency Natural Areas Coordinating Committee (INACC) were digitized from a 1:1,200,000 California Department of Fish and...

  17. California Bioregions

    Data.gov (United States)

    California Natural Resource Agency — California regions developed by the Inter-agency Natural Areas Coordinating Committee (INACC) were digitized from a 1:1,200,000 California Department of Fish and...

  18. Hazard-evaluation and technical-assistance report HETA 90-122-l2073, technical assistance to San Francisco General Hospital and Medical Center, San Francisco, California

    Energy Technology Data Exchange (ETDEWEB)

    Moss, C.E.; Seitz, T.

    1990-10-01

    In response to a request from the Director of the Environmental Health and Safety Department of the San Francisco General Hospital and Medical Center, located in San Francisco, California, an evaluation was undertaken of possible hazardous working conditions at that site. Concern existed about exposures to hazards while operating the germicidal lamp at the facility. Germicidal lamps were used to disinfect the air in tuberculosis and aerosolized pentamidine clinics. The workers wore no protective eye wear. All rooms used a 30 watt germicidal lamp. Lower wattage bulbs in the smaller rooms would have reduced occupational ultraviolet (UV) exposure. Reflectance levels of UV radiation were quite high and varied. Worker exposure to germicidal lamp UV levels was dependent on many factors, some of the most important ones being the position of the bulb in the room, age of the bulb, obstruction of the UV radiation by objects near the bulb, and the height of the worker. While there are no consensus guidelines available on ventilation systems designed for areas where germicidal lamps are used, the provision of good room air distribution and mixing is recommended to prevent stagnant air conditions or short circuiting of supply air within the room. Bulb changers need to be aware of the need for protective clothing and gloves for protection from both the UV radiation levels as well as possible glass breakage.

  19. Human-Centered Design as an Approach for Place-Based Innovation in Public Health: A Case Study from Oakland, California.

    Science.gov (United States)

    Vechakul, Jessica; Shrimali, Bina Patel; Sandhu, Jaspal S

    2015-12-01

    This case study provides a high-level overview of the human-centered design (HCD) or "design thinking" process and its relevance to public health. The Best Babies Zone (BBZ) initiative is a multi-year project aimed at reducing inequities in infant mortality rates. In 2012, BBZ launched pilot programs in three US cities: Cincinnati, Ohio; New Orleans, Louisiana; and Oakland, California. The Alameda County Public Health Department (ACPHD), the lead for the Oakland BBZ site, identified HCD as a promising approach for addressing the social and economic conditions that are important drivers of health inequities. HCD is a process for creating innovative products, services, and strategies that prioritizes the needs of the intended population. ACPHD partnered with the Gobee Group (a social innovation design consultancy) to develop the Design Sprint. The Design Sprint was a 12-week pilot in which 14 professionals from nine organizations used the HCD process to develop concepts for stimulating a vibrant local economy in the Oakland Best Babies Zone. Thirty- to sixty-minute semi-structured interviews were conducted with all 14 individuals involved in the Design Sprint. With the exception of one interview, the interviews were audio-recorded, transcribed, and inductively coded to identify themes. Our experience suggests that HCD can: enhance community engagement; expedite the timeframe for challenge identification, program design, and implementation; and create innovative programs that address complex challenges.

  20. Drilling and thermal gradient measurements at US Marine Corps Air Ground Combat Center, Twentynine Palms, California. Final report, October 1, 1983-March 31, 1984

    Energy Technology Data Exchange (ETDEWEB)

    Trexler, D.T.; Flynn, T.; Ghusn, G. Jr.

    1984-01-01

Seven temperature gradient holes were drilled at the Marine Corps Air Ground Combat Center, Twentynine Palms, California, as part of a cooperative research and development program jointly funded by the Navy and the Department of Energy. The purpose of this program was to assess geothermal resources at selected Department of Defense installations. Drill site selection was based on geophysical anomalies delineated by combined gravity, ground magnetic, and aeromagnetic surveys. Temperature gradients ranged from 1.3°C/100 m (1°F/100 ft) in hole No. 1 to 15.3°C/100 m (8.3°F/100 ft) in temperature gradient hole No. 6. Large, positive geothermal gradients in temperature gradient holes 5 and 6, combined with respective bottom hole temperatures of 51.6°C (125°F) and 67°C (153°F), indicate that an extensive, moderate-temperature geothermal resource is located on the MCAGCC. The geothermal reservoir appears to be situated in old, unconsolidated alluvial material and is structurally bounded on the east by the Mesquite Lake fault and on the west by the Surprise Spring fault. If temperatures continue to increase with depth at the measured gradients, temperatures in excess of 80°C (178°F) can be expected at a depth of 2000 feet.
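The unit conversions and linear extrapolation implied by the gradients quoted above are easy to check. A small sketch follows; the reference depth and surface assumptions in the last example are hypothetical, since the abstract does not give hole depths:

```python
# Sanity checks on geothermal-gradient arithmetic. The conversion uses
# 1.8 degF per degC and 100 ft = 30.48 m; the extrapolation example uses
# an assumed (hypothetical) reference depth of 150 m for the 67 degC hole.
def c_per_100m_to_f_per_100ft(grad_c):
    """Convert a gradient in degC/100 m to degF/100 ft."""
    return grad_c * 1.8 * 0.3048

def temp_at_depth(t_ref_c, grad_c_per_100m, ref_depth_m, target_depth_m):
    """Linearly extrapolate temperature (degC) below a reference depth."""
    return t_ref_c + grad_c_per_100m * (target_depth_m - ref_depth_m) / 100.0

print(round(c_per_100m_to_f_per_100ft(15.3), 1))  # ~8.4; abstract quotes 8.3
print(round(c_per_100m_to_f_per_100ft(1.3), 1))   # ~0.7; abstract rounds to 1
# Hypothetical: 67 degC bottom hole at an assumed 150 m, extrapolated to
# 610 m (about 2000 ft)
print(round(temp_at_depth(67.0, 15.3, 150.0, 610.0), 1))
```

Whatever reference depth is assumed, a gradient of 15.3°C/100 m comfortably carries the temperature past 80°C well before 2000 ft, consistent with the report's conclusion.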

  1. The rationale for and implementation of learner-centered education: experiences at the Ostrow School of Dentistry of the University of Southern California.

    Science.gov (United States)

    Navazesh, Mahvash; Rich, Sandra K; Tiber, Arnold

    2014-02-01

    This report describes the design, implementation, and function of integrated, learner-centered education at the Ostrow School of Dentistry of the University of Southern California. The 190 required courses of the previous curriculum have been condensed to forty-four courses. Four courses, presented for each of eleven trimesters of the four-year D.D.S. program, are entitled Human Structure, Human Function, Human Behavior, and Human Clinical Dentistry. An integrated biomedical sciences curriculum is supported by small-group, facilitator-based, problem-based learning (PBL) and an electronic PBL case library. Modules, rotations, and preclinical and clinical sessions make up remaining instructional units of the curriculum. Selected assessment outcomes measuring student knowledge, behavior, and skill development are discussed. As an external measure, first-attempt pass rates on the National Board Dental Examination (NBDE) Part I show a range of 87-96 percent over a ten-year period (for Classes 2005-14). First-attempt pass rates on the NBDE Part II for Classes 2005-12 ranged from 74 percent to 93 percent. Perceived barriers and opportunities for better performance on the NBDE Part II are addressed. Additionally, an exit survey, administered over the past four years, indicates a high level of student satisfaction with "depth and breadth" of their education (82-93 percent) and that graduates feel well prepared to enter the practice of dentistry (94-97 percent).

  2. Promising cancer treatment modality: the University of California Davis/McClellan Nuclear Radiation Center neutron capture therapy program

    Science.gov (United States)

    Autry-Conwell, Susan A.; Boggan, James E.; Edwards, Benjamin F.; Hou, Yongjin; Vincente, Maria-Graca; Liu, Hungyuan; Richards, Wade J.

    2000-12-01

Neutron capture therapy (NCT) is a promising new binary therapeutic modality for the treatment of localized tumors. It is accomplished by injection and localization within the tumor of a neutron capture agent (NCA) that, alone, is non-toxic. When the tumor is then exposed to neutrons, a relatively non-toxic form of radiation, cytotoxic products are produced that directly or indirectly cause tumor cell death while preserving normal surrounding tissue that does not contain the NCA. The UC Davis NCT program is currently working to develop and test new compounds, or NCAs, in vitro and in vivo. Many groups worldwide are also working to develop the next generation of NCAs, but fewer than five facilities internationally are currently capable of treating clinical brain tumor patients by NCT, only two of them in the US: MIT and Brookhaven National Laboratory. In addition to compound development, the UC Davis NCT program is preparing the UC Davis McClellan Nuclear Radiation Center's 2 megawatt TRIGA reactor for NCT clinical trials, which would make it the only such facility on the West Coast.

  3. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the central challenge of applying spatial forecast verification to simulators: namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m > 6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
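The second mapping approach, smearing each simulated event's rate over the region with a power-law decay in epicentral distance and then scoring the map with an ROC curve, can be sketched as follows. The grid, event locations, and decay exponent are synthetic placeholders, not the paper's California catalog:

```python
# Hedged sketch: power-law smoothing of simulated epicenters into a rate map,
# then an ROC-style score against "observed" epicenters. All data synthetic.
import numpy as np

rng = np.random.default_rng(3)
ngrid = 50
xs, ys = np.meshgrid(np.arange(ngrid), np.arange(ngrid))

sim_events = rng.uniform(0, ngrid, size=(20, 2))   # simulated epicenters
q, d0 = 1.5, 1.0                                   # decay exponent, core dist.
rate = np.zeros((ngrid, ngrid))
for ex, ey in sim_events:
    r = np.hypot(xs - ex, ys - ey)
    rate += (r + d0) ** (-q)                       # power-law kernel
rate /= rate.sum()

# "Observed" events: most near simulated ones, plus one off-fault outlier
obs = np.vstack([sim_events[:5] + rng.normal(0, 1.0, (5, 2)),
                 [[ngrid - 2.0, ngrid - 2.0]]])
idx = np.clip(obs.astype(int), 0, ngrid - 1)
obs_mask = np.zeros((ngrid, ngrid), bool)
obs_mask[idx[:, 1], idx[:, 0]] = True

# ROC: alarm cells in decreasing-rate order, track the hit fraction
order = np.argsort(rate.ravel())[::-1]
hits = np.cumsum(obs_mask.ravel()[order]) / obs_mask.sum()
frac = np.arange(1, rate.size + 1) / rate.size     # fraction of area alarmed
auc = float(np.sum(0.5 * (hits[1:] + hits[:-1]) * np.diff(frac)))
print("ROC area under curve:", round(auc, 3))
```

An AUC near 1 means observed events fall in the highest-rate cells with little area alarmed; an AUC near 0.5 is no better than random, which is roughly how a forecast confined strictly to fault traces scores against off-fault epicenters.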

  4. Stress Drops for Potentially Induced Earthquake Sequences

    Science.gov (United States)

    Huang, Y.; Beroza, G. C.; Ellsworth, W. L.

    2015-12-01

Stress drop, the difference between shear stress acting across a fault before and after an earthquake, is a fundamental parameter of the earthquake source process and the generation of strong ground motions. Higher stress drops usually lead to more high-frequency ground motion. Hough [2014, 2015] observed low intensities in "Did You Feel It?" data for injection-induced earthquakes and interpreted them as a result of low stress drops. It is also possible that the low recorded intensities could be a result of propagation effects. Atkinson et al. [2015] show that the shallow depth of injection-induced earthquakes can lead to a lack of high-frequency ground motion as well. We apply the spectral ratio method of Imanishi and Ellsworth [2006] to analyze stress drops of injection-induced earthquakes, using smaller earthquakes with similar waveforms as empirical Green's functions (eGfs). The effects of both path and linear site response should cancel out in the spectral ratio analysis. We apply this technique to the Guy-Greenbrier earthquake sequence in central Arkansas. The earthquakes migrated along the Guy-Greenbrier Fault while nearby injection wells were operating in 2010-2011. Huang and Beroza [GRL, 2015] improved the magnitude of completeness to about -1 using template matching and found that the earthquakes deviated from Gutenberg-Richter statistics during the operation of nearby injection wells. We identify 49 clusters of highly similar events in the Huang and Beroza [2015] catalog and calculate stress drops using the source model described in Imanishi and Ellsworth [2006]. Our results suggest that stress drops of the Guy-Greenbrier sequence are similar to those of tectonic earthquakes at Parkfield, California (the attached figure). We will also present stress drop analysis of other suspected induced earthquake sequences using the same method.
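The eGf spectral-ratio idea can be sketched numerically: the ratio of two event spectra recorded over the same path cancels the shared path and site terms, leaving the corner frequencies and moment ratio. This sketch uses a generic Brune omega-square source, not the specific source model of Imanishi and Ellsworth [2006], and all moments, corner frequencies, and the wavespeed are assumed values:

```python
# Hedged sketch of an eGf spectral-ratio measurement with Brune spectra.
# Brune (1970) relations assumed: r = 0.372*beta/fc, stress drop = 7*M0/(16*r^3).
import numpy as np

def brune(f, m0, fc):
    """Omega-square displacement amplitude spectrum."""
    return m0 / (1.0 + (f / fc) ** 2)

f = np.logspace(-1, 2, 400)                  # 0.1-100 Hz
m0_big, fc_big = 1e15, 2.0                   # "mainshock" moment (N*m), fc (Hz)
m0_egf, fc_egf = 1e12, 20.0                  # empirical Green's function event
path = np.exp(-np.pi * f * 0.05)             # shared attenuation term

# Path term cancels in the ratio of the two recorded spectra
ratio = (brune(f, m0_big, fc_big) * path) / (brune(f, m0_egf, fc_egf) * path)

# Grid-search the mainshock corner frequency (moment ratio assumed known here;
# a real analysis fits it jointly)
def misfit(fc):
    pred = (m0_big / m0_egf) * (1 + (f / fc_egf) ** 2) / (1 + (f / fc) ** 2)
    return np.sum((np.log(ratio) - np.log(pred)) ** 2)

fcs = np.logspace(-0.5, 1.5, 200)
fc_est = fcs[np.argmin([misfit(fc) for fc in fcs])]

beta = 3500.0                                # assumed shear wavespeed, m/s
r = 0.372 * beta / fc_est                    # Brune source radius, m
stress_drop = 7.0 * m0_big / (16.0 * r ** 3) # Pa
print(f"fc = {fc_est:.2f} Hz, radius = {r:.0f} m, "
      f"stress drop = {stress_drop / 1e6:.2f} MPa")
```

Because the same path factor multiplies both spectra, any attenuation model drops out of the ratio, which is exactly why the eGf approach is attractive for shallow induced events where propagation effects are suspect.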

  5. Geodetic Imaging for Rapid Assessment of Earthquakes: Airborne Laser Scanning (ALS)

    Science.gov (United States)

Carter, W. E.; Shrestha, R. L.; Glennie, C. L.; Sartori, M.; Fernandez-Diaz, J.; National Center for Airborne Laser Mapping Operational Center

    2010-12-01

To the residents of an area struck by a strong earthquake, quantitative information on damage to the infrastructure, and its attendant impact on relief and recovery efforts, is urgent and of primary concern. To earth scientists a strong earthquake offers an opportunity to learn more about earthquake mechanisms and to compare their models with the real world, in hopes of one day being able to accurately predict the precise locations, magnitudes, and times of large (and potentially disastrous) earthquakes. Airborne laser scanning (also referred to as airborne LiDAR or Airborne Laser Swath Mapping) is particularly well suited for rapid assessment of earthquakes, both for immediately estimating the damage to infrastructure and for providing information for the scientific study of earthquakes. ALS observations collected at low altitude (500-1000 m) from a relatively slow (70-100 m/sec) aircraft can provide dense (5-15 points/m²) sets of surface features (buildings, vegetation, ground), extending over hundreds of square kilometers, with turnaround times of several hours to a few days. The actual response time to any given event depends on several factors, including bureaucratic issues such as approval of funds, export license formalities, and clearance to fly over the area to be mapped; operational factors such as the deployment of the aircraft and ground teams may also take a number of days for remote locations. Of course, the need for immediate mapping of earthquake damage generally is not as urgent in remote regions with less infrastructure and few inhabitants. During August 16-19, 2010, the National Center for Airborne Laser Mapping (NCALM) mapped the area affected by the magnitude 7.2 El Mayor-Cucapah Earthquake (Northern Baja California Earthquake), which occurred on April 4, 2010, and was felt throughout southern California, Arizona, Nevada, and Baja California North, Mexico. From initial ground observations the fault rupture appeared to extend 75 km

  6. Examining the Use of the Cloud for Seismic Data Centers

    Science.gov (United States)

    Yu, E.; Meisenhelter, S.; Clayton, R. W.

    2011-12-01

The Southern California Earthquake Data Center (SCEDC) archives seismic data and station sensor metadata related to earthquake activity in southern California. It currently archives nearly 8400 data streams continuously from over 420 stations in near real time, at a rate of 584 GB/month, to a repository approximately 18 TB in size. Triggered waveform data from an average of 12,000 earthquakes/year are also archived. Data are archived on mirrored disk arrays that are maintained and backed up locally. These data are served over the Internet to scientists and the general public in many countries. The data demand has a steady component, largely needed for ambient noise correlation studies, and an impulsive component that is driven by earthquake activity. Designing a reliable, cost-effective system architecture equipped to handle periods of relatively low steady demand punctuated by unpredictable sharp spikes in demand immediately following a felt earthquake remains a major challenge. To explore an alternative paradigm, we have put one month of the data in the "cloud" and have developed a user interface with Google App Engine. The purpose is to assess the modifications in data structures that are necessary to make efficient searches. To date we have determined that the database schema must be "denormalized" to take advantage of the dynamic computational capabilities, and that it is likely advantageous to preprocess the waveform data to remove overlaps, gaps, and other artifacts. The final purpose of this study is to compare the cost of the cloud to that of ground-based centers. The major motivations for this study are the security and dynamic load capabilities of the cloud. In the cloud, multiple copies of the data are held in distributed centers, thus eliminating the single point of failure associated with one center. The cloud can dynamically increase the level of computational resources during an earthquake, and the major tasks of managing a disk farm are eliminated. 
The

  7. Localized rejuvenation of a crystal mush recorded in zircon temporal and compositional variation at the Lassen Volcanic Center, northern California.

    Science.gov (United States)

    Klemetti, Erik W; Clynne, Michael A

    2014-01-01

Zircon ages and trace element compositions from recent silicic eruptions in the Lassen Volcanic Center (LVC) allow for an evaluation of the timing and conditions of rejuvenation (reheating and mobilization of crystals) within the LVC magmatic system. The LVC is the southernmost active Cascade volcano and, prior to the 1980 eruption of Mount St. Helens, was the site of the only eruption in the Cascade arc during the last century. The three most recent silicic eruptions from the LVC were very small to moderate-sized lava flows and domes of dacite (the 1915 and 27 ka eruptions of Lassen Peak) and rhyodacite (the 1.1 ka eruption of Chaos Crags). These eruptions produced mixed and mingled lavas that contain a diverse crystal cargo, including zircon. 238U-230Th model ages from interior and surface analyses of zircon reveal ages from ∼17 ka to secular equilibrium (>350 ka), with most zircon crystallizing during a period between ∼60-200 ka. These data support a model for localized rejuvenation of crystal mush beneath the LVC. This crystal mush evidently is the remnant of magmatism that ended ∼190 ka. Most zircon are thought to have been captured from "cold storage" in the crystal mush (670-725°C, Hf >10,000 ppm, Eu/Eu* 0.25-0.4) locally remobilized by intrusion of mafic magma. A smaller population of zircon (>730°C, Hf <10,000 ppm, Eu/Eu* >0.4) grew in, and was captured from, rejuvenation zones. These data suggest the dominant method of producing eruptible melt within the LVC is small-scale, local rejuvenation of the crystal mush accompanied by magma mixing and mingling. Based on zircon stability, the time required to heat, erupt, and then cool back to background conditions is relatively short, lasting a maximum of 10s to 1000s of years. Rejuvenation events in the LVC are ephemeral and permit eruption within an otherwise waning and cooling magmatic body.

  8. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

    Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger and more complex scenarios. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform, which provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM’s BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  9. Coping with earthquakes induced by fluid injection

    Science.gov (United States)

    McGarr, Arthur F.; Bekins, Barbara; Burkardt, Nina; Dewey, James W.; Earle, Paul S.; Ellsworth, William L.; Ge, Shemin; Hickman, Stephen H.; Holland, Austin F.; Majer, Ernest; Rubinstein, Justin L.; Sheehan, Anne

    2015-01-01

    Large areas of the United States long considered geologically stable, with little or no detected seismicity, have recently become seismically active. The increase in earthquake activity began in the mid-continent in 2001 (1) and has continued to rise. In 2014, the rate of occurrence of earthquakes with magnitudes (M) of 3 and greater in Oklahoma exceeded that in California (see the figure). This elevated activity includes larger earthquakes, several with M > 5, that have caused significant damage (2, 3). To a large extent, the increasing rate of earthquakes in the mid-continent is due to fluid-injection activities used in modern energy production (1, 4, 5). We explore potential avenues for mitigating the effects of induced seismicity. Although the United States is our focus here, Canada, China, the UK, and others confront similar problems associated with oil and gas production, whereas quakes induced by geothermal activities affect Switzerland, Germany, and others.

  10. Masonry infill performance during the Northridge earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Flanagan, R.D. [Lockheed Martin Energy Systems, Oak Ridge, TN (United States); Bennett, R.M.; Fischer, W.L. [Univ. of Tennesee, Knoxville, TN (United States); Adham, S.A. [Agbabian Associates, Pasadena, CA (United States)

    1996-03-08

    The response of masonry infills during the 1994 Northridge, California, earthquake is described in terms of three categories: (1) low-rise and mid-rise structures experiencing large near-field seismic excitations, (2) low-rise and mid-rise structures experiencing moderate far-field excitation, and (3) high-rise structures experiencing moderate far-field excitation. In general, the infills had a positive, beneficial effect on the performance of the buildings, even those experiencing large peak accelerations near the epicenter. Varying types of masonry infills, structural frames, design conditions, and construction deficiencies were observed, and their performance during the earthquake is described. A summary of observations of the performance of infills in other recent earthquakes is given. A comparison with the Northridge earthquake is made, and the expected response of infill structures in lower-seismicity regions of the central and eastern United States is discussed.

  11. Safety and survival in an earthquake

    Science.gov (United States)

    ,

    1969-01-01

    Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations.

  12. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful, but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
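The consistency test outlined above can be sketched in a few lines. This is an illustrative reconstruction, not the RELM code: it assumes independent Poisson rates in each space-magnitude bin, and the function names are ours:

```python
import math
import random

def poisson_log_likelihood(rates, counts):
    """Joint log-likelihood of observed bin counts, assuming an
    independent Poisson distribution in each forecast bin."""
    return sum(-lam + n * math.log(lam) - math.lgamma(n + 1)
               for lam, n in zip(rates, counts))

def poisson_sample(rng, lam):
    """Knuth's method; adequate for the small per-bin rates typical here."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def consistency_quantile(rates, counts, n_sim=5000, seed=0):
    """Fraction of catalogs simulated from the forecast itself whose
    likelihood is at or below the observed likelihood. A very small
    value flags the forecast as inconsistent with the data."""
    rng = random.Random(seed)
    obs_ll = poisson_log_likelihood(rates, counts)
    below = sum(
        poisson_log_likelihood(
            rates, [poisson_sample(rng, lam) for lam in rates]) <= obs_ll
        for _ in range(n_sim))
    return below / n_sim
```

The pairwise comparison of models follows the same pattern, simulating catalogs from one model and scoring them under both.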

  13. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) remains, even after 25 years, one of the most painful events dug into Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead; 10,000 injured; and 3,000 families who lost their homes). Nowadays, the most frequent and important question is "what if this earthquake were repeated today?" In this study, we simulate the ground-motion shaking of an earthquake of the same size (12 October 1992) and the consequent socioeconomic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the risk assessment clearly indicates that the losses and damage could be two to three times greater in Cairo today than in the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, whereas three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates show that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb). Moreover, about 75% of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  14. Darwin's earthquake.

    Science.gov (United States)

    Lee, Richard V

    2010-07-01

    Charles Darwin experienced a major earthquake in the Concepción-Valdivia region of Chile 175 years ago, in February 1835. His observations dramatically illustrated the geologic principles of James Hutton and Charles Lyell which maintained that the surface of the earth was subject to alterations by natural events, such as earthquakes, volcanoes, and the erosive action of wind and water, operating over very long periods of time. Changes in the land created new environments and fostered adaptations in life forms that could lead to the formation of new species. Without the demonstration of the accumulation of multiple crustal events over time in Chile, the biologic implications of the specific species of birds and tortoises found in the Galapagos Islands and the formulation of the concept of natural selection might have remained dormant.

  15. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  16. Parallel Earthquake Simulations on Large-Scale Multicore Supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-01-01

    Earthquakes are one of the most destructive natural hazards on our planet. Huge earthquakes striking offshore may cause devastating tsunamis, as evidenced by the 11 March 2011 Japan (moment magnitude Mw 9.0) and the 26 December 2004 Sumatra (Mw 9.1) earthquakes. Earthquake prediction (in terms of the precise time, place, and magnitude of a coming earthquake) is arguably unfeasible in the foreseeable future. To mitigate seismic hazards from future earthquakes in earthquake-prone areas, such as California and Japan, scientists have been using numerical simulations to study earthquake rupture propagation along faults and seismic wave propagation in the surrounding media on ever-advancing modern computers over the past several decades. In particular, ground motion simulations for past and future (possible) significant earthquakes have been performed to understand factors that affect ground shaking in populated areas, and to provide ground shaking characteristics and synthetic seismograms for emergency preparation and the design of earthquake-resistant structures. These simulation results can guide the development of more rational seismic provisions, leading to safer, more efficient, and economical structures in earthquake-prone regions.

  17. Contingent post-closure plan, hazardous waste management units at selected maintenance facilities, US Army National Training Center, Fort Irwin, California

    International Nuclear Information System (INIS)

    1992-01-01

    The National Training Center (NTC) at Fort Irwin, California, is a US Army training installation that provides tactical experience for battalion/task forces and squadrons in a mid- to high-intensity combat scenario. Through joint exercises with the US Air Force and other services, the NTC also provides a data source for improvements of training doctrines, organization, and equipment. To meet the training and operational needs of the NTC, several maintenance facilities provide general and direct support for mechanical devices, equipment, and vehicles. Maintenance products used at these facilities include fuels, petroleum-based oils, lubricating grease, various degreasing solvents, antifreeze (ethylene glycol), transmission fluid, brake fluid, and hydraulic oil. Used or spent petroleum-based products generated at the maintenance facilities are temporarily accumulated in underground storage tanks (USTs), collected by the NTC hazardous waste management contractor (HAZCO), and stored at the Petroleum, Oil, and Lubricant (POL) Storage Facility, Building 630, until shipped off site to be recovered, reused, and/or reclaimed. Spent degreasing solvents and other hazardous wastes are containerized and stored on-base for up to 90 days at the NTC's Hazardous Waste Storage Facility, Building 703. The US Environmental Protection Agency (EPA) performed an inspection and reviewed the hazardous waste management operations of the NTC. Inspections indicated that the NTC had violated one or more requirements of Subtitle C of the Resource Conservation and Recovery Act (RCRA) and, as a result of these violations, was issued a Notice of Noncompliance, Notice of Necessity for Conference, and Proposed Compliance Schedule (NON) dated October 13, 1989. The following post-closure plan is the compliance-based approach for the NTC to respond to the regulatory violations cited in the NON.

  18. Differential Energy Radiation from Two Earthquakes in Japan with Identical MW

    Science.gov (United States)

    Choy, G. L.; Boatwright, J.

    2007-12-01

    Teleseismic studies have found that, in general, the radiated energy ES of intraplate strike-slip earthquakes is elevated significantly for a given rupture size (as measured by moment) relative to the energies radiated by thrust earthquakes at subduction zones. We verify that this phenomenon can be observed in regional data (217 K-net stations), with a resulting ES of 1.3e15 J. The attenuation functions derived for these earthquakes are comparable to those derived from other studies of earthquakes in Japan, as well as central California, for f > 1.0 Hz. The ES of the strike-slip earthquake is more than three times larger than that of the thrust earthquake. Commensurate with the difference in energies, the macroseismic effects reported for the strike-slip earthquake were more severe and widespread than those reported for the thrust earthquake. However, from teleseismic data the energy of the strike-slip earthquake is a factor of 10 larger than that of the thrust earthquake. For the strike-slip earthquake, the acceleration spectra from teleseismic and regional analyses overlap smoothly. Good regional-teleseismic overlap has been seen in analyses of other strike-slip earthquakes, such as the Hector Mine event. In contrast, for the Kyushu earthquake, the teleseismic analysis appeared to underestimate energy in a band of frequencies around 1.0 Hz relative to the regional data. A similar difference between regional and teleseismic spectra has been observed for the Northridge and San Simeon, California, thrust events.

  19. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from 1901 to 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  20. Predictable earthquakes?

    Science.gov (United States)

    Martini, D.

    2002-12-01

    acceleration) and the global number of earthquakes for this period from the published literature, which give us a good picture of the dynamical geophysical phenomena. Methodology: Computing linear correlation coefficients gives us a chance to quantitatively characterize the relation among the data series, if we suppose a linear dependence in the first step. The correlation coefficients among the Earth's rotational acceleration, the Z-orbit acceleration (perpendicular to the ecliptic plane), and the global number of earthquakes were compared. The results clearly demonstrate the common features of both the Earth's rotation and the Earth's Z-acceleration around the Sun, and also between the Earth's rotational acceleration and the earthquake number. This might indicate a strong relation among these phenomena. The rather strong correlation (r = 0.75) and the 29-year period (Saturn's synodic period) were clearly shown in the computed cross-correlation function, which gives the dynamical characteristic of the correlation of the Earth's orbital (Z-direction) and rotational acceleration. This basic period (29 years) was also obvious in the earthquake-number data sets, with clear common features in time. Conclusion: The core, which is involved in the secular variation of the Earth's magnetic field, is the only sufficiently mobile part of the Earth with sufficient mass to modify the rotation, which probably affects the global time distribution of earthquakes. This might mean that the secular variation of earthquakes is inseparable from changes in the Earth's magnetic field, i.e., the interior processes of the Earth's core belong to the dynamical state of the solar system. Therefore, if the described idea is real, the global distribution of earthquakes in time is predictable.
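The lagged-correlation analysis described above can be illustrated generically. A minimal sketch, assuming evenly sampled yearly series; the synthetic series in the test are placeholders, not the author's data:

```python
import math

def pearson_r(a, b):
    """Linear (Pearson) correlation coefficient of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

def cross_correlation(x, y, max_lag):
    """r(tau) for lags 0..max_lag, sliding y backward relative to x.
    A peak at some tau reveals a shared periodicity with a time shift."""
    return {lag: pearson_r(x[:len(x) - lag], y[lag:])
            for lag in range(max_lag + 1)}
```

A peak of the cross-correlation at a lag repeating every ~29 samples (years) would be the signature of the Saturn-synodic period discussed in the abstract.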

  1. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model (GEM), launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question: how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open-source engine, OpenQuake.
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  2. Oceanographic data acquired in support of the June 1980 study of the upwelling center off Point Sur, California (NODC Accession 9000024)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The conductivity-temperature-depth (CTD) data were collected by the Naval Postgraduate School from Coastal Waters of California between 2-19 June 1980 using ship...

  3. Pre-earthquake burden of illness and postearthquake health and preparedness in veterans.

    Science.gov (United States)

    Der-Martirosian, Claudia; Riopelle, Deborah; Naranjo, Diana; Yano, Elizabeth M; Rubenstein, Lisa V; Dobalian, Aram

    2014-06-01

    During an earthquake, vulnerable populations, especially those with chronic conditions, are more susceptible to adverse, event-induced circumstances, such as limited access to food and water, extreme weather temperatures, and injury, that can exacerbate those conditions. These circumstances merit special attention when health care facilities and organizations prepare for and respond to disasters. This study explores the relationship between pre-earthquake burden of illness and postearthquake health-related and preparedness factors in the US. Data from a cohort of male veterans who were receiving care at the Sepulveda Veterans Affairs Medical Center (VAMC) in Los Angeles, California, USA during the 1994 Northridge earthquake were analyzed. Veterans with one or more chronic conditions were more likely than veterans without chronic conditions to report pain lasting two or more days, severe mental/emotional stress for more than two weeks, broken/lost medical equipment, difficulty refilling prescriptions, and being unable to get medical help following the quake. In terms of personal emergency preparedness, however, there was no association between burden of illness and having enough food or water for at least 24 hours after the earthquake. The relationship between health care providers, both individual providers and organizations like the US Department of Veterans Affairs (VA), and their vulnerable, chronically ill patients affords providers a unique opportunity to deliver critical assistance that could make this vulnerable population better prepared to meet their postdisaster health-related needs. This can be accomplished through education about preparedness and the provision of easier access to medical supplies. Disaster plans for those who are burdened with chronic conditions should meet their social needs in addition to their psychological and physical needs.

  4. Benefits of Earthquake Early Warning to Large Municipalities (Invited)

    Science.gov (United States)

    Featherstone, J.

    2013-12-01

    The City of Los Angeles has been involved in testing the Caltech ShakeAlert earthquake early warning (EQEW) system since February 2012. This system accesses a network of seismic monitors installed throughout California, analyzes and processes seismic information, and transmits a warning (audible and visual) when an earthquake occurs. In late 2011, the City of Los Angeles Emergency Management Department (EMD) was approached by Caltech regarding EQEW and immediately recognized the value of the system. Simultaneously, EMD was finalizing a report by a multidisciplinary team that visited Japan in December 2011, which spoke to the effectiveness of EQEW during the March 11, 2011 earthquake that struck that country. Information collected by the team confirmed that the EQEW systems proved to be very effective in alerting the population to the impending earthquake. The EQEW in Japan is also tied to mechanical safeguards, such as the stopping of high-speed trains. For a city the size and complexity of Los Angeles, the implementation of a reliable EQEW system will save lives, reduce losses, ensure effective and rapid emergency response, and greatly enhance the region's ability to recover from a damaging earthquake. The current ShakeAlert system is being tested at several governmental organizations and private businesses in the region. EMD, in cooperation with Caltech, identified several locations internal to the City where the system would have an immediate benefit. These include the staff offices within EMD, the Los Angeles Police Department's Real Time Analysis and Critical Response Division (24-hour crime center), and the Los Angeles Fire Department's Metropolitan Fire Communications (911 Dispatch). All three of these agencies routinely manage the collaboration and coordination of citywide emergency information and response during times of crisis. Having these three key public safety offices connected and included in the

  5. Effects of catastrophic events on transportation system management and operations : Northridge earthquake -- January 17, 1994

    Science.gov (United States)

    2002-04-22

    This report documents the actions taken by transportation agencies in response to the earthquake in Northridge, California on January 17, 1994, and is part of a larger effort to examine the impacts of catastrophic events on transportation system faci...

  6. The Landers earthquake; preliminary instrumental results

    Science.gov (United States)

    Jones, L.; Mori, J.; Hauksson, E.

    1992-01-01

    Early on the morning of June 28, 1992, millions of people in southern California were awakened by the largest earthquake to occur in the western United States in the past 40 years. At 4:58 a.m. PDT (local time), faulting associated with the magnitude 7.3 earthquake broke through to the earth's surface near the town of Landers, California. The surface rupture then propagated 70 km (45 mi) to the north and northwest along a band of faults passing through the middle of the Mojave Desert. Fortunately, the strongest shaking occurred in uninhabited regions of the Mojave Desert. Still, one child was killed in Yucca Valley, and about 400 people were injured in the surrounding area. The desert communities of Landers, Yucca Valley, and Joshua Tree in San Bernardino County suffered considerable damage to buildings and roads. Damage to water and power lines caused problems in many areas.

  7. Liquefaction Hazard Maps for Three Earthquake Scenarios for the Communities of San Jose, Campbell, Cupertino, Los Altos, Los Gatos, Milpitas, Mountain View, Palo Alto, Santa Clara, Saratoga, and Sunnyvale, Northern Santa Clara County, California

    Science.gov (United States)

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2008-01-01

    Maps showing the probability of surface manifestations of liquefaction in the northern Santa Clara Valley were prepared with liquefaction probability curves. The area includes the communities of San Jose, Campbell, Cupertino, Los Altos, Los Gatos, Milpitas, Mountain View, Palo Alto, Santa Clara, Saratoga, and Sunnyvale. The probability curves were based on complementary cumulative frequency distributions of the liquefaction potential index (LPI) for surficial geologic units in the study area. LPI values were computed with extensive cone penetration test soundings. Maps were developed for three earthquake scenarios: an M7.8 on the San Andreas Fault comparable to the 1906 event, an M6.7 on the Hayward Fault comparable to the 1868 event, and an M6.9 on the Calaveras Fault. Ground motions were estimated with the Boore and Atkinson (2008) attenuation relation. Liquefaction is predicted for all three events in young Holocene levee deposits along the major creeks. Liquefaction probabilities are highest for the M7.8 earthquake, ranging from 0.33 to 0.37 if a 1.5-m deep water table is assumed, and 0.10 to 0.14 if a 5-m deep water table is assumed. Liquefaction probabilities of the other surficial geologic units are less than 0.05. Probabilities for the scenario earthquakes are generally consistent with observations during historical earthquakes.
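The probability-curve construction can be illustrated with a toy sketch: an empirical complementary cumulative frequency of LPI over the CPT soundings in a unit. The LPI > 5 threshold for surface manifestation and the sample values below are assumptions for illustration, not values from the report:

```python
def surface_manifestation_probability(lpi_values, threshold=5.0):
    """Fraction of CPT soundings in a geologic unit whose liquefaction
    potential index (LPI) exceeds the threshold, i.e. the empirical
    complementary cumulative frequency evaluated at the threshold."""
    if not lpi_values:
        raise ValueError("need at least one sounding")
    return sum(1 for v in lpi_values if v > threshold) / len(lpi_values)

# Hypothetical LPI values for soundings in a young levee deposit,
# computed for one scenario and one assumed water-table depth.
lpi_levee = [0.0, 1.2, 3.5, 6.1, 7.8, 12.4, 2.0, 5.5, 9.3, 0.4]
p = surface_manifestation_probability(lpi_levee)  # 0.5 for these values
```

Repeating the calculation per scenario and per water-table assumption yields the kind of probability spread (e.g. 0.33-0.37 vs. 0.10-0.14) reported in the abstract.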

  8. Lessons learned from the 1994 Northridge Earthquake

    International Nuclear Information System (INIS)

    Eli, M.W.; Sommer, S.C.

    1995-01-01

    Southern California has a history of major earthquakes and also has one of the largest metropolitan areas in the United States. The 1994 Northridge Earthquake challenged the industrial facilities and lifeline infrastructure in the northern Los Angeles (LA) area. Lawrence Livermore National Laboratory (LLNL) sent a team of engineers to conduct an earthquake damage investigation in the Northridge area, on a project funded jointly by the United States Nuclear Regulatory Commission (USNRC) and the United States Department of Energy (USDOE). Many of the structures, systems, and components (SSCs) and lifelines that suffered damage are similar to those found in nuclear power plants and in USDOE facilities. Lessons learned from these experiences can have some applicability at commercial nuclear power plants.

  9. Earthquake experience suggests new approach to seismic criteria

    International Nuclear Information System (INIS)

    Knox, R.

    1983-01-01

    Progress in seismic qualification of nuclear power plants as reviewed at the 4th Pacific Basin Nuclear Conference in Vancouver, September 1983, is discussed. The lack of experience of earthquakes in existing nuclear plants can be compensated by the growing experience of actual earthquake effects in conventional power plants and similar installations. A survey of the effects on four power stations, with a total of twenty generating units, in the area strongly shaken by the San Fernando earthquake in California in 1971 is reported. The Canadian approach to seismic qualification, international criteria, Canadian/Korean experience, safety related equipment, the Tadotsu test facility and seismic tests are discussed. (U.K.)

  10. Electrical resistivity variations associated with earthquakes on the san andreas fault.

    Science.gov (United States)

    Mazzella, A; Morrison, H F

    1974-09-06

    A 24 percent precursory change in apparent electrical resistivity was observed before a magnitude 3.9 earthquake of strike-slip nature on the San Andreas fault in central California. The experimental configuration and numerical calculations suggest that the change is associated with a volume at depth rather than some near-surface phenomenon. The character and duration of the precursor period agree well with those of other earthquake studies and support a dilatant earthquake mechanism model.

  11. Adaptively smoothed seismicity earthquake forecasts for Italy

    Directory of Open Access Journals (Sweden)

    Yan Y. Kagan

    2010-11-01

    We present a model for estimation of the probabilities of future earthquakes of magnitudes m ≥ 4.95 in Italy. This model is a modified version of that proposed for California, USA, by Helmstetter et al. [2007] and Werner et al. [2010a], and it approximates seismicity using a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled. Magnitudes are independently and identically distributed according to a tapered Gutenberg-Richter magnitude distribution. We have estimated the spatial distribution of future seismicity by smoothing the locations of past earthquakes listed in two Italian catalogs: a short instrumental catalog, and a longer instrumental and historic catalog. The bandwidth of the adaptive spatial kernel is estimated by optimizing the predictive power of the kernel estimate of the spatial earthquake density in retrospective forecasts. When available and reliable, we used small earthquakes of m ≥ 2.95 to reveal active fault structures and probable future epicenters. By calibrating the model with these two catalogs of different durations to create two forecasts, we intend to quantify the loss (or gain) of predictability incurred when only a short, but recent, data record is available. Both forecasts were scaled to five and ten years, and have been submitted to the Italian prospective forecasting experiment of the global Collaboratory for the Study of Earthquake Predictability (CSEP). An earlier forecast from the model was submitted by Helmstetter et al. [2007] to the Regional Earthquake Likelihood Model (RELM) experiment in California, and with more than half of the five-year experimental period over, the forecast has performed better than the others.
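
The adaptive-bandwidth smoothing described in this abstract can be sketched in a few lines. This is an illustrative toy only, not the authors' exact estimator: it uses Gaussian kernels whose bandwidth is set to each event's distance to its k-th nearest neighbor, and all coordinates and parameter values below are invented.

```python
import numpy as np

def adaptive_kernel_density(epicenters, grid, k=2):
    """Smooth past epicenters into a relative spatial density using Gaussian
    kernels whose bandwidth adapts to local event density: each event's
    bandwidth is its distance to the k-th nearest neighboring event."""
    epicenters = np.asarray(epicenters, dtype=float)
    # pairwise distances between events
    d = np.linalg.norm(epicenters[:, None, :] - epicenters[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    bw = np.sort(d, axis=1)[:, k - 1]          # adaptive bandwidth per event
    bw = np.maximum(bw, 1e-6)
    # sum the Gaussian kernels at each evaluation point
    g = np.asarray(grid, dtype=float)
    r2 = np.sum((g[:, None, :] - epicenters[None, :, :]) ** 2, axis=-1)
    dens = np.sum(np.exp(-r2 / (2 * bw**2)) / (2 * np.pi * bw**2), axis=1)
    return dens / dens.sum()                    # normalize across grid points

# toy catalog: a tight cluster of three events plus one isolated event
events = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0)]
grid = [(0.0, 0.0), (5.0, 5.0), (10.0, 10.0)]
p = adaptive_kernel_density(events, grid)
```

The clustered events get narrow kernels (high resolution where data are dense), while the isolated event is smoothed broadly, which is the qualitative behavior the abstract's bandwidth optimization is tuning.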

  12. Earthquake fault superhighways

    Science.gov (United States)

    Robinson, D. P.; Das, S.; Searle, M. P.

    2010-10-01

    Motivated by the observation that the rare earthquakes which propagated for significant distances at supershear speeds occurred on very long straight segments of faults, we examine every known major active strike-slip fault system on land worldwide and identify those with long (> 100 km) straight portions capable not only of sustained supershear rupture speeds but having the potential to reach compressional wave speeds over significant distances, and call them "fault superhighways". The criteria used for identifying these are discussed. These superhighways include portions of the 1000 km long Red River fault in China and Vietnam passing through Hanoi, the 1050 km long San Andreas fault in California passing close to Los Angeles, Santa Barbara and San Francisco, the 1100 km long Chaman fault system in Pakistan north of Karachi, the 700 km long Sagaing fault connecting the first and second cities of Burma, Rangoon and Mandalay, the 1600 km Great Sumatra fault, and the 1000 km Dead Sea fault. Of the 11 faults so classified, nine are in Asia and two in North America, with seven located near areas of very dense populations. Based on the current population distribution within 50 km of each fault superhighway, we find that more than 60 million people today have increased seismic hazards due to them.

  13. Flood Management in California

    Directory of Open Access Journals (Sweden)

    Jay R. Lund

    2012-02-01

    California’s development and success have been shaped by its ability to manage floods. This management has varied over the history of California’s economic and political development and continues in various forms today. California will always have flood problems. A range of flood management options is available and has been used over time. These options can be contrasted with flood management elsewhere and with the types of options used to manage other hazards in California, such as earthquakes, wildfires, and droughts. In the future, flood management in California will require greater reliance on local funding and leadership, reflecting diminished federal and state funding, with more effective state and federal guidance. Effective flood management will also tend to integrate flood management with actions to achieve environmental and other water supply objectives, both to gain revenues from a broader range of beneficiaries and to make more efficient use of land and water in a state where both are often scarce.

  14. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    Science.gov (United States)

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential - the magnitude of earthquake the region is expected to experience. From this, we introduce a new type of time dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f type, problems can be constrained from scaling relations and finite extents. [Figure: record-breaking hazard map of southern California, 2012-08-06; "warm" colors indicate local acceleration (elevated hazard).]
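
The core scaling relations named in this abstract combine straightforwardly. The sketch below is not the authors' model; it only illustrates how a modified Omori decay and a Gutenberg-Richter magnitude distribution jointly give an expected aftershock count, with all constants (K, c, p, b, m_c) and catalog numbers invented.

```python
import numpy as np

def omori_rate(t, K=100.0, c=0.05, p=1.1):
    # modified Omori law: aftershock rate decays as K / (t + c)^p
    return K / (t + c) ** p

def gr_fraction_above(m, m_c=2.5, b=1.0):
    # Gutenberg-Richter: fraction of events with magnitude >= m
    return 10.0 ** (-b * (m - m_c))

def expected_aftershocks(m_min, t1, t2):
    """Expected number of aftershocks with magnitude >= m_min in [t1, t2],
    assuming magnitudes are independent of occurrence times (as in ETAS)."""
    t = np.linspace(t1, t2, 10001)
    r = omori_rate(t)
    # trapezoidal time integral of the Omori rate
    n_total = np.sum((r[:-1] + r[1:]) / 2.0) * (t[1] - t[0])
    return n_total * gr_fraction_above(m_min)

n_m35 = expected_aftershocks(3.5, 0.0, 30.0)   # days after a hypothetical mainshock
n_m45 = expected_aftershocks(4.5, 0.0, 30.0)
```

Because the dimensions decouple, raising the magnitude threshold by one unit scales the expected count by exactly 10^(-b), which is the kind of constraint the abstract exploits to pin down the tuning-parameter space.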

  15. Earthquake friction

    Science.gov (United States)

    Mulargia, Francesco; Bizzarri, Andrea

    2016-12-01

    Laboratory friction slip experiments on rocks provide firm evidence that the static friction coefficient μ has values ∼0.7. This would imply large amounts of heat produced by seismically active faults, but no heat flow anomaly is observed, and mineralogic evidence of frictional heating is virtually absent. This argues for lower μ values of ∼0.2, as also required by the observed orientation of faults with respect to the maximum compressive stress. We show that accounting for the thermal and mechanical energy balance of the system removes this inconsistency, implying a multi-stage strain-release process. The first stage consists of small, slow aseismic slip at high friction on pre-existing stress concentrators within the fault volume, angled to the main fault like Riedel cracks. This introduces a second stage dominated by frictional temperature increase, which induces local pressurization of pore fluids around the slip patches; this is in turn followed by a third stage in which thermal diffusion extends the frictionally heated zones, making them coalesce into a connected pressurized region oriented along the fault plane. The system then enters a state of equivalent low static friction in which it can undergo the fast elastic radiation slip prescribed by dislocation earthquake models.

  16. Framework for Developing Economic Competitiveness Measures for the California Sustainable Freight Action Plan.

    Science.gov (United States)

    2017-07-04

    The METRANS Transportation Center has been providing technical assistance to the California Governor's Office of Business and Economic Development (GO-Biz) and the California Air Resources Board (CARB) in support of implementing the California Sustainable Freight Action Plan.

  17. Earthquake Weather: Linking Seismicity to Changes in Barometric Pressure, Earth Tides, and Rainfall

    Science.gov (United States)

    West, J. D.; Garnero, E.; Shirzaei, M.

    2015-12-01

    It has been widely observed that earthquakes can be triggered by changes in stress induced by the passage of surface waves from remote earthquakes. These stress changes are typically on the order of a few kilopascals and occur over time spans of seconds. Less well investigated is the question of whether triggering of seismic activity can result from similar stress changes occurring over periods of hours or days due to changing barometric pressure, rainfall, and Earth tides. Past studies have shown a possible link between these stress sources and slow earthquakes in Taiwan (Hsu et al., 2015). Here, we investigate the relationship between seismicity and changing barometric pressure, Earth tides, and rainfall for four regions of the western United States where regional seismic networks provide high-quality seismic catalogs over relatively long time periods: Southern California, Northern California, the Pacific Northwest, and Utah. For each region we obtained seismic catalogs from January 2001 through September 2014 and acquired hourly data for barometric pressure and rainfall across the regions from the National Climatic Data Center. The vertical stress time series due to Earth tides was computed for the location of each weather station in the study areas. We summed the stresses from these three sources and looked for variations in seismicity correlated to the stress changes. We show that seismicity rates increase with increasing magnitude of stress change. In many cases the increase in seismicity is larger for reductions in vertical stress than it is for stress increases. We speculate that the dependency of seismic rate on combined vertical stress is acting through a combination of two mechanisms: (1) reduced stresses reduce the normal force on faults, triggering failure in critically stressed faults; (2) increased stresses may similarly reduce the normal force on faults due to increases in pore pressure induced in the fault region.
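
The analysis pipeline described here (sum three stress time series, then bin seismicity rate by the size of the stress change) can be sketched with entirely synthetic data. Everything below is fabricated for illustration: the stress amplitudes, the event-generation rule, and the bin edges are not from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
hours = 24 * 365
t = np.arange(hours)

# hypothetical hourly vertical-stress contributions (Pa):
baro = 300 * np.sin(2 * np.pi * t / (24 * 30)) + 50 * rng.standard_normal(hours)
tide = 200 * np.sin(2 * np.pi * t / 12.42)          # semidiurnal lunar tide
rain = np.maximum(0.0, 100 * rng.standard_normal(hours))
stress = baro + tide + rain                          # combined vertical stress

# synthetic catalog whose hourly event probability rises with |stress change|
dstress = np.gradient(stress)
rate = 0.01 * (1 + 5 * np.abs(dstress) / np.std(dstress))
events = rng.random(hours) < rate

# bin the observed seismicity rate by magnitude of stress change
edges = np.quantile(np.abs(dstress), [0.5, 0.9])     # median and 90th percentile
idx = np.digitize(np.abs(dstress), edges)            # 0: low, 1: mid, 2: high
rates = [events[idx == i].mean() for i in range(3)]
```

Recovering a higher event rate in the high-|Δstress| bin from the catalog alone is the signature the study looks for in the real data.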

  18. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  19. Image Recognition Techniques for Earthquake Early Warning

    Science.gov (United States)

    Boese, M.; Heaton, T. H.; Hauksson, E.

    2011-12-01

    An experienced person monitoring a map of seismic stations, whose colors scale with the ground-motion amplitudes transmitted in real time from a dense seismic network, will fairly easily recognize when and where an earthquake occurs. Using the maximum amplitudes at stations at close epicentral distances, he/she might even be able to roughly estimate the size of the event. From the number and distribution of stations turning 'red', the person might also be able to recognize the rupturing fault in a large earthquake (M>>7.0) and to estimate the rupture dimensions while the rupture is still developing. Following this concept, we are adopting techniques for automatic image recognition to provide earthquake early warning. We rapidly correlate a set of templates with real-time ground motion observations in a seismic network. If a 'suspicious' pattern of ground motion amplitudes is detected, the algorithm starts estimating the location of the earthquake and its magnitude. For large earthquakes the algorithm estimates finite source dimensions and the direction of rupture propagation. These predictions are continuously updated using the current 'image' of ground motion observations. A priori information, such as the orientation of major faults, helps enhance estimates in less dense networks. The approach will be demonstrated for multiple simulated and real events in California.
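
The template-correlation idea can be illustrated with a toy point-source locator: slide expected amplitude patterns over candidate epicenters and keep the best normalized cross-correlation with the observed amplitude "image." The attenuation law, grid, and noise level below are invented, and real networks have irregular station geometry rather than a dense grid.

```python
import numpy as np

def amplitude_template(shape, center, decay=2.0):
    """Expected peak-amplitude pattern for a point source at `center`:
    amplitude falls off with epicentral distance (a toy attenuation law)."""
    y, x = np.indices(shape)
    r = np.hypot(y - center[0], x - center[1])
    return 1.0 / (1.0 + (r / decay) ** 2)

def locate_by_correlation(observed, shape):
    """Correlate templates for every candidate epicenter against the observed
    amplitudes; return the best-matching epicenter and its correlation."""
    obs = (observed - observed.mean()) / observed.std()
    best, best_cc = None, -np.inf
    for i in range(shape[0]):
        for j in range(shape[1]):
            tmpl = amplitude_template(shape, (i, j))
            tmpl = (tmpl - tmpl.mean()) / tmpl.std()
            cc = np.mean(obs * tmpl)        # normalized cross-correlation
            if cc > best_cc:
                best, best_cc = (i, j), cc
    return best, best_cc

shape = (20, 20)
true_epicenter = (12, 7)
noise = 0.05 * np.random.default_rng(1).standard_normal(shape)
observed = amplitude_template(shape, true_epicenter) + noise
est, cc = locate_by_correlation(observed, shape)
```

A 'suspicious' pattern in the abstract's sense corresponds to the best correlation exceeding a detection threshold, after which the winning template's parameters seed the location and magnitude estimates.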

  20. Imaging Active Faults From Earthquakes in the San Gorgonio Pass - San Bernardino Mountains-San Jacinto Region, California and the Deep Continuity of the San Jacinto and San Andreas Faults

    Science.gov (United States)

    Yue, L.; Carena, S.; Suppe, J.; Kao, H.

    2001-12-01

    We imaged and mapped in 3-D over 50 active faults and fault segments using earthquake locations and focal mechanisms. The majority of these faults are previously unknown or unnamed. The 3-D fault maps better define the active structure of this complex region marked by profound uncertainties over the fundamental structural framework, including the subsurface continuity and geometry of the first-order San Andreas and San Jacinto faults, as well as the existence and role of major blind faults. We used the catalog of 43,500 relocated 1975-1998 earthquakes of Richards-Dinger and Shearer (2000), separating them into coplanar clusters associated with different faults and fault strands and fitting optimized surfaces to them. A clustering algorithm was applied to the relocated earthquakes in order to obtain tighter earthquake clouds and thus better-defined fault surfaces. We used the catalog of 13,000 focal mechanisms of Hauksson (2000) to confirm the nature of the mapped faults. Examples of our results are as follows: [1] The major San Jacinto strike-slip fault is offset by an east-dipping thrust fault near Anza at a depth of 11-15 km, and a similar fault geometry may exist near San Bernardino. [2] We do not see any seismic illumination of an active through-going San Andreas fault at depth in the San Gorgonio Pass area, but we can place several constraints on its possible location and geometry on the basis of the 3-D geometry and distribution of other faults. Between 5 and 20 km depth, this area is dominated by closely spaced faults trending SE-NW, which in map view occupy a triangle delimited by the Mission Creek fault to the N, the San Jacinto fault zone to the E, and the San Jacinto Mountains to the S. To the E, some of these faults terminate against an E-W trending fault. These faults do not show any sign of having been displaced by an intersecting fault. Some of the faults we imaged have a surface area comparable to the size of the rupture on the Northridge thrust

  1. Crowd-Sourced Global Earthquake Early Warning

    Science.gov (United States)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
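
The sensor-fusion step mentioned here (a Kalman filter combining noisy GNSS displacements with consumer-grade acceleration data) can be sketched in one dimension. This is a minimal illustration, not the authors' implementation: the noise levels, the step-like coseismic displacement, and all tuning constants are invented.

```python
import numpy as np

def fuse_gnss_accel(gps, accel, dt, sig_gps=0.5, sig_acc=0.05):
    """1-D Kalman filter: integrate accelerometer data in the prediction
    step and correct the resulting drift with noisy GNSS displacements."""
    x = np.zeros(2)                       # state: [displacement, velocity]
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]]) # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])       # acceleration input vector
    H = np.array([[1.0, 0.0]])            # GNSS observes displacement only
    Q = sig_acc**2 * np.outer(B, B)       # process noise from accel error
    R = np.array([[sig_gps**2]])
    out = []
    for z, a in zip(gps, accel):
        x = F @ x + B * a                 # predict using the accelerometer
        P = F @ P @ F.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
        x = x + (K @ (np.array([z]) - H @ x)).ravel()   # GNSS correction
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

# synthetic test: a smooth 0.2 m coseismic displacement step
rng = np.random.default_rng(2)
dt, n = 0.01, 2000
true = 0.2 / (1 + np.exp(-(np.arange(n) - 1000) * 0.01))
vel = np.gradient(true, dt)
acc = np.gradient(vel, dt) + 0.05 * rng.standard_normal(n)  # noisy accelerometer
gps = true + 0.5 * rng.standard_normal(n)                   # noisy C/A-code GPS
est = fuse_gnss_accel(gps, acc, dt)
```

The fused estimate tracks the displacement far better than the raw GPS stream alone, which is the mechanism that lowers the detection threshold from ~1 m toward the few-centimeter level claimed in the abstract.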

  2. Redefining Earthquakes and the Earthquake Machine

    Science.gov (United States)

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  3. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes received in primary schools is considered…

  4. California Air Basins

    Data.gov (United States)

    California Department of Resources — Air Resources. California Air Resources Board. The following datasets are from the California Air Resources Board: * arb_california_airbasins - California Air Basins. The...

  5. Operational earthquake forecasting can enhance earthquake preparedness

    Science.gov (United States)

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).
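
Converting a time-dependent forecast rate into the occurrence probabilities that OEF disseminates is a short calculation. The sketch below assumes a Poisson process with a locally constant rate; the rate values are hypothetical, chosen only to contrast a quiet week with a week whose rate has been elevated by nearby clustering.

```python
import math

def occurrence_probability(rate_per_day, horizon_days):
    """Probability of at least one event within the horizon, assuming a
    Poisson process with the given (locally constant) forecast rate."""
    return 1.0 - math.exp(-rate_per_day * horizon_days)

# hypothetical numbers: background rate vs. a rate elevated 100x by clustering
background = occurrence_probability(1e-5, 7)   # quiet week
elevated = occurrence_probability(1e-3, 7)     # week following a nearby sequence
```

Even the elevated weekly probability stays below one percent here, which is why OEF products emphasize relative changes in hazard rather than deterministic predictions.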

  6. Perspectives on earthquake hazards in the New Madrid seismic zone, Missouri

    Science.gov (United States)

    Thenhaus, P.C.

    1990-01-01

    A sequence of three great earthquakes struck the Central United States during the winter of 1811-1812 in the area of New Madrid, Missouri. They are considered to be the greatest earthquakes in the conterminous U.S. because they were felt and caused damage at far greater distances than any other earthquakes in U.S. history. The large population currently living within the damage area of these earthquakes means that widespread destruction and loss of life are likely if the sequence were repeated. In contrast to California, where earthquakes are felt frequently, the damaging earthquakes that have occurred in the Eastern U.S. - in 1755 (Cape Ann, Mass.), 1811-12 (New Madrid, Mo.), 1886 (Charleston, S.C.), and 1897 (Giles County, Va.) - are generally regarded as only historical phenomena (fig. 1). The social memory of these earthquakes no longer exists. A fundamental problem in the Eastern U.S., therefore, is that the earthquake hazard is not generally considered today in land-use and civic planning. This article offers perspectives on the earthquake hazard of the New Madrid seismic zone through discussions of the geology of the Mississippi Embayment, the historical earthquakes that have occurred there, the earthquake risk, and the "tools" that geoscientists have to study the region. The earthquake hazard is defined by the characterization of the physical attributes of the geological structures that cause earthquakes, the estimation of the recurrence times of the earthquakes, their potential size, and the expected ground motions. The term "earthquake risk," on the other hand, refers to aspects of the expected damage to man-made structures and to lifelines as a result of the earthquake hazard.

  7. Least-Total-Cost Analysis for Earthquake Design Levels.

    Science.gov (United States)

    1980-06-01

    Keywords: Earthquakes, Structural design, Costs, Damage, Least cost, Optimal design, Seismic risk. Abstract (recoverable fragments only): ...Oct 2-3, 1975, University of Illinois, Urbana, Ill. 16. University of California. EERC 75-27: Identification of research needs for improving aseismic

  8. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  9. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

    Science.gov (United States)

    Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

    2012-01-01

    The behavior of individual events in repeating earthquake sequences in California, Taiwan and Japan is better predicted by a model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript that shows similar results for laboratory experiments, we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, but its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the time and slip of these events are predicted quite well by fixed slip and fixed recurrence models, so in some sense they are time- and slip-predictable. While fixed recurrence and slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.
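
The model comparison at the heart of this abstract can be made concrete on a toy sequence. The event times and slips below are invented (not the Parkfield data) and are constructed so that the fixed-recurrence model out-predicts the time-predictable model, mirroring the abstract's conclusion; with differently constructed data the comparison could go the other way.

```python
import numpy as np

# hypothetical repeating-earthquake sequence: event times (yr) and slips (cm)
times = np.array([0.0, 2.1, 4.0, 6.2, 8.1, 10.3])
slips = np.array([2.0, 2.3, 1.7, 2.2, 1.8, 2.0])

# fixed-recurrence model: next event one mean inter-event time after the last
intervals = np.diff(times)
fixed_pred = times[:-1] + intervals.mean()

# time-predictable model: the interval after an event is proportional to its
# slip (slip / loading rate); estimate the loading rate from the sequence
load_rate = slips[:-1].sum() / (times[-1] - times[0])
tp_pred = times[:-1] + slips[:-1] / load_rate

# mean absolute error of each model's predicted next-event times
fixed_err = np.abs(fixed_pred - times[1:]).mean()
tp_err = np.abs(tp_pred - times[1:]).mean()
```

Because the synthetic slips are uncorrelated with the following intervals, conditioning on slip adds noise rather than information, and the fixed-recurrence prediction wins.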

  10. Seismic resistance of equipment and building service systems: review of earthquake damage design requirements, and research applications in the USA

    International Nuclear Information System (INIS)

    Skjei, R.E.; Chakravartula, B.C.; Yanev, P.I.

    1979-01-01

    The history of earthquake damage and the resulting code design requirements for earthquake hazard mitigation for equipment in the USA is reviewed. Earthquake damage to essential service systems is summarized; observations for the 1964 Alaska and the 1971 San Fernando, California, earthquakes are stressed, and information from other events is included. USA building codes that reflect lessons learned from these earthquakes are discussed; brief summaries of widely used codes are presented. In conclusion there is a discussion of the desirability of adapting advanced technological concepts from the nuclear industry to equipment in conventional structures. (author)

  11. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  12. Evaluation and implementation of an improved methodology for earthquake ground response analysis: uniform treatment of source, path and site effects.

    Science.gov (United States)

    2008-12-01

    Shortly after the 1994 Northridge Earthquake, Caltrans geotechnical engineers charged with developing site-specific : response spectra for high priority California bridges initiated a research project aimed at broadening their perspective : from simp...

  13. Coast of California Storm and Tidal Waves Study. Southern California Coastal Processes Data Summary,

    Science.gov (United States)

    1986-02-01

    southern California coastline and is now in the Gulf of California. The Gulf of California spreading center is joined to the Gorda spreading center (off...the influence of a semi-permanent high pressure ridge, the Sierra High (Douglas and Fritz, 1972). The ridge remained strong and overdeveloped for more...previous 800 years. Each of these droughts has been flanked by flood periods when the Sierra High was broken down allowing sub-tropical disturbances to

  14. CyberShake Physics-Based PSHA in Central California

    Science.gov (United States)

    Callaghan, S.; Maechling, P. J.; Goulet, C. A.; Milner, K. R.; Graves, R. W.; Olsen, K. B.; Jordan, T. H.

    2017-12-01

    The Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, which performs physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a wavefield of Strain Green Tensors. An earthquake rupture forecast (ERF) is then extended by varying hypocenters and slips on finite faults, generating about 500,000 events per site of interest. Seismic reciprocity is used to calculate synthetic seismograms, which are processed to obtain intensity measures (IMs) such as RotD100. These are combined with ERF probabilities to produce hazard curves. PSHA results from hundreds of locations across a region are interpolated to produce a hazard map. CyberShake simulations with SCEC 3D Community Velocity Models have shown how the site and path effects vary with differences in upper crustal structure, and they are particularly informative about epistemic uncertainties in basin effects, which are not well parameterized by depths to iso-velocity surfaces, common inputs to GMPEs. In 2017, SCEC performed CyberShake Study 17.3, expanding into Central California for the first time. Seismic hazard calculations were performed at 1 Hz at 438 sites, using both a 3D tomographically-derived central California velocity model and a regionally averaged 1D model. Our simulation volumes extended outside of Central California, so we included other SCEC velocity models and developed a smoothing algorithm to minimize reflection and refraction effects along interfaces. CyberShake Study 17.3 ran for 31 days on NCSA's Blue Waters and ORNL's Titan supercomputers, burning 21.6 million core-hours and producing 285 million two-component seismograms and 43 billion IMs. These results demonstrate that CyberShake can be successfully expanded into new regions, and lend insights into the effects of directivity-basin coupling associated with basins near major faults such as the San Andreas.
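
The final step described here (combining simulated IMs with ERF probabilities into a hazard curve) reduces to a small calculation per site. The sketch below is illustrative only: the IM values, annual rupture rates, and thresholds are invented, and real CyberShake aggregates hundreds of thousands of ruptures per site.

```python
import numpy as np

def hazard_curve(im_values, annual_rates, thresholds):
    """Hazard curve for one site: at each IM threshold, sum the annual rates
    of all ruptures whose simulated IM exceeds it, then convert the total
    exceedance rate to an annual exceedance probability (Poisson assumption)."""
    im = np.asarray(im_values)
    r = np.asarray(annual_rates)
    exceed_rate = np.array([r[im > x].sum() for x in thresholds])
    return 1.0 - np.exp(-exceed_rate)

# hypothetical site: simulated intensity measures (g) and annual rupture rates
ims = np.array([0.05, 0.10, 0.22, 0.35, 0.60])
rates = np.array([1e-2, 5e-3, 1e-3, 4e-4, 5e-5])
thresholds = np.array([0.08, 0.30, 0.50])
probs = hazard_curve(ims, rates, thresholds)
```

Repeating this at every site and interpolating the resulting probabilities over the region yields the hazard maps the abstract describes.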

  15. Identifying and Addressing Genetic Counseling Challenges among Indigenous People of Oaxaca-One Center's Experience with Two Immigrant Farmworker Families in the Central Valley of California.

    Science.gov (United States)

    Shen, Joseph J; Carmichael, Jason; Vásquez Santos, Leoncio

    2018-02-03

    An important aspect of genetic counseling is the recognition of and adaptation to the socio-cultural uniqueness of the different populations that a genetics clinic serves. The Central Valley of California is home to a large population from Mexico, with a significant proportion of indigenous ancestry originating from the state of Oaxaca. We report on our experience with two families of this community: one extended family with an early lethal inborn error of metabolism and the other with a chronic disfiguring form of ichthyosis. We identified multiple important factors that needed to be considered, including the matching of language dialects, adaptation to different social interaction conventions, acknowledgement of traditional medicine beliefs, and effective transmission of genetic terms and concepts, all of which should be incorporated into the interactions with these families when aiming to provide comprehensive genetic counseling.

  16. Deeper penetration of large earthquakes on seismically quiescent faults.

    Science.gov (United States)

    Jiang, Junle; Lapusta, Nadia

    2016-06-10

    Why many major strike-slip faults known to have had large earthquakes are silent in the interseismic period is a long-standing enigma. One would expect small earthquakes to occur at least at the bottom of the seismogenic zone, where deeper aseismic deformation concentrates loading. We suggest that the absence of such concentrated microseismicity indicates deep rupture past the seismogenic zone in previous large earthquakes. We support this conclusion with numerical simulations of fault behavior and observations of recent major events. Our modeling implies that the 1857 Fort Tejon earthquake on the San Andreas Fault in Southern California penetrated below the seismogenic zone by at least 3 to 5 kilometers. Our findings suggest that such deeper ruptures may occur on other major fault segments, potentially increasing the associated seismic hazard. Copyright © 2016, American Association for the Advancement of Science.

  17. Earthquake-induced water-level fluctuations at Yucca Mountain, Nevada, June 1992

    International Nuclear Information System (INIS)

    O'Brien, G.M.

    1993-01-01

    This report presents earthquake-induced water-level and fluid-pressure data for wells in the Yucca Mountain area, Nevada, during June 1992. Three earthquakes occurred that caused significant water-level and fluid-pressure responses in wells. Wells USW H-5 and USW H-6 are continuously monitored to detect short-term responses caused by earthquakes. Two wells, monitored hourly, had significant, longer-term responses in water level following the earthquakes. On June 28, 1992, a 7.5-magnitude earthquake occurred near Landers, California, causing an estimated maximum water-level change of 90 centimeters in well USW H-5. Three hours later a 6.6-magnitude earthquake occurred near Big Bear Lake, California; the maximum water-level fluctuation was 20 centimeters in well USW H-5. A 5.6-magnitude earthquake occurred at Little Skull Mountain, Nevada, on June 29, approximately 23 kilometers from Yucca Mountain. The maximum estimated short-term water-level fluctuation from the Little Skull Mountain earthquake was 40 centimeters in well USW H-5. The water level in well UE-25p #1, monitored hourly, decreased approximately 50 centimeters over 3 days following the Little Skull Mountain earthquake and returned to pre-earthquake levels in approximately 6 months. The water level in the lower interval of well USW H-3 increased 28 centimeters following the Little Skull Mountain earthquake. The Landers and Little Skull Mountain earthquakes caused responses in 17 intervals of 14 hourly monitored wells; however, most responses were small and of short duration. For several days following the major earthquakes, many smaller-magnitude aftershocks occurred, causing measurable responses in the continuously monitored wells.

  18. Earthquake correlations and networks: A comparative study

    International Nuclear Information System (INIS)

    Krishna Mohan, T. R.; Revathi, P. G.

    2011-01-01

    We quantify the correlation between earthquakes and use it to extract causally connected earthquake pairs. Our correlation metric is a variation on the one introduced by Baiesi and Paczuski [M. Baiesi and M. Paczuski, Phys. Rev. E 69, 066106 (2004)]. A network of earthquakes is then constructed from the time-ordered catalog, with links between the more strongly correlated pairs. A list of recurrences for each earthquake is identified, with correlation thresholds demarcating the most meaningful links in each cluster. Data pertaining to three different seismic regions (viz., California, Japan, and the Himalayas) are comparatively analyzed using such a network model. The distributions of recurrence lengths and recurrence times are two of the key features analyzed to draw conclusions about the universal aspects of such a network model. We find that the unimodal feature of the recurrence length distribution, which helps to associate typical rupture lengths with different magnitude earthquakes, is robust across the different seismic regions. The out-degree of the networks shows a hub structure rooted on the large magnitude earthquakes. The in-degree distribution is seen to be dependent on the density of events in the neighborhood. The recurrence time distribution follows a power law with two regimes of different exponents. The first regime confirms the Omori law for aftershocks, while the second regime, with a faster falloff for the larger recurrence times, establishes that pure spatial recurrences also follow a power-law distribution. The crossover to the second power-law regime can be taken as signaling the end of the aftershock regime in an objective fashion.
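
    A space-time-magnitude correlation metric in the spirit of Baiesi and Paczuski can be sketched in a few lines. This is only an illustration: the exact form used in the paper is a variation on the original, and the b-value and fractal dimension below (b=1.0, d_f=1.6) are assumed values, not taken from the study.

```python
import numpy as np

def bp_correlation(t_i, x_i, m_i, t_j, x_j, b=1.0, d_f=1.6):
    """Correlation metric in the spirit of Baiesi & Paczuski (2004):
    n_ij = dt * r**d_f * 10**(-b * m_i) for an earlier event i and a
    later event j; a SMALLER value means a more strongly correlated
    pair.  b (Gutenberg-Richter) and d_f (fractal dimension of
    epicenters) are assumed illustrative values."""
    dt = t_j - t_i                                     # time separation
    r = float(np.hypot(x_j[0] - x_i[0], x_j[1] - x_i[1]))  # epicentral distance
    return dt * r ** d_f * 10.0 ** (-b * m_i)

# Tiny synthetic catalog: (time, (x, y) in km, magnitude)
catalog = [(0.0, (0.0, 0.0), 6.0),
           (1.0, (0.5, 0.0), 3.0),
           (2.0, (50.0, 50.0), 3.5)]

# Link each event to its most correlated (smallest-metric) predecessor
parents = []
for j in range(1, len(catalog)):
    tj, xj, mj = catalog[j]
    scores = [bp_correlation(ti, xi, mi, tj, xj) for ti, xi, mi in catalog[:j]]
    parents.append(int(np.argmin(scores)))

print(parents)  # the M6 event ends up as the parent of both later events
```

    Even in this toy catalog the large event captures both later events as its "offspring", which is the hub structure in the out-degree distribution that the abstract describes.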

  19. Simulating Earthquake Early Warning Systems in the Classroom as a New Approach to Teaching Earthquakes

    Science.gov (United States)

    D'Alessio, M. A.

    2010-12-01

    A discussion of P- and S-waves seems a ubiquitous part of studying earthquakes in the classroom. Textbooks from middle school through university level typically define the differences between the waves and illustrate the sense of motion. While many students successfully memorize the differences between wave types (often utilizing the first letter as a memory aid), textbooks rarely give tangible examples of how the two waves would "feel" to a person sitting on the ground. One reason for introducing the wave types is to explain how to calculate earthquake epicenters using seismograms and travel time charts -- very abstract representations of earthquakes. Even when the skill is mastered using paper-and-pencil activities or one of the excellent online interactive versions, locating an epicenter simply does not excite many of our students because it evokes little emotional impact, even in students located in earthquake-prone areas. Despite these limitations, huge numbers of students are mandated to complete the task. At the K-12 level, California requires that all students be able to locate earthquake epicenters in Grade 6; in New York, the skill is a required part of the Regent's Examination. Recent innovations in earthquake early warning systems around the globe give us the opportunity to address the same content standard, but with substantially more emotional impact on students. I outline a lesson about earthquakes focused on earthquake early warning systems. The introductory activities include video clips of actual earthquakes and emphasize the differences between the way P- and S-waves feel when they arrive (P arrives first, but is weaker). I include an introduction to the principle behind earthquake early warning (including a summary of possible uses of a few seconds warning about strong shaking) and show examples from Japan. Students go outdoors to simulate P-waves, S-waves, and occupants of two different cities who are talking to one another on cell phones
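
    The epicenter-location exercise mentioned above rests on one relation: the lag between the P and S arrivals grows with distance. A minimal sketch, assuming typical crustal wave speeds (vp ≈ 6 km/s, vs ≈ 3.5 km/s; these values are illustrative, not from the lesson):

```python
def sp_distance(sp_lag_s, vp=6.0, vs=3.5):
    """Epicentral distance (km) from the S-minus-P arrival lag (s).
    From d/vs - d/vp = lag it follows that d = lag * vp*vs / (vp - vs).
    vp and vs are assumed typical crustal speeds in km/s."""
    return sp_lag_s * vp * vs / (vp - vs)

# A 10 s S-P lag puts the quake 84 km away; distances from three
# stations triangulate the epicenter on a map.
print(round(sp_distance(10.0), 1))
```

    The same relation is what makes early warning possible: the farther you are from the epicenter, the longer the gap between the weak P-wave and the damaging S-wave.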

  20. Geologic investigations of Australian earthquakes: Paleoseismicity and the recurrence of surface faulting in the stable regions of continents

    Science.gov (United States)

    Machette, Michael; Crone, Anthony

    1993-01-01

    Earthquakes that occur in the stable regions of continents are very rare compared to those that occur along plate margins, such as the San Andreas fault system of western California. Worldwide, only 11 historic earthquakes in stable continental regions are known to have produced surface ruptures. Five of these have occurred in Australia since 1968 (see map, next page).

  1. Earthquake research in the Soviet Union

    Science.gov (United States)

    Spall, H.

    1979-01-01

    Henry Spall talked recently with Robert L. Wesson, the new Chief, Office of Earthquake Studies at the U.S. Geological Survey National Center, Reston, Va. Wesson has spent almost a year altogether in the U.S.S.R., including 6 months in the Garm area of Soviet Tadzhikistan in 1974.

  2. Learning from physics-based earthquake simulators: a minimal approach

    Science.gov (United States)

    Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele

    2017-04-01

    Physics-based earthquake simulators aim to generate synthetic seismic catalogs of arbitrary length, accounting for fault interaction, elastic rebound, realistic fault networks, and a simple earthquake nucleation process such as rate-and-state friction. By comparing synthetic and real catalogs, seismologists can gain insight into the earthquake occurrence process. Moreover, earthquake simulators can be used to infer aspects of the statistical behavior of earthquakes within the simulated region by analyzing timescales not accessible through observation. The development of earthquake simulators is commonly led by the approach "the more physics, the better", pushing seismologists toward ever more Earth-like simulators. Despite its immediate attractiveness, we argue that this approach makes it increasingly difficult to understand which physical parameters are really relevant to the features of the seismic catalog in which we are interested. For this reason, here we take the opposite, minimal approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that, for some specific scientific objectives, a simple model may be more informative than a complex one because it is more understandable. The model has three main components: a realistic tectonic setting, i.e., a fault dataset of California; quantitative laws for earthquake generation on each single fault; and the Coulomb Failure Function for modeling fault interaction. The final goal of this work is twofold. On one hand, we aim to identify the minimum set of physical ingredients that can satisfactorily reproduce the features of the real seismic catalog, such as short-term seismic clustering, and to investigate the hypothetical long-term behavior and fault synchronization. On the other hand, we want to investigate the limits of predictability of the model itself.

  3. Future Earth: Reducing Loss By Automating Response to Earthquake Shaking

    Science.gov (United States)

    Allen, R. M.

    2014-12-01

    Earthquakes pose a significant threat to society in the U.S. and around the world. The risk is easily forgotten given the infrequent recurrence of major damaging events, yet the likelihood of a major earthquake in California in the next 30 years is greater than 99%. As our societal infrastructure becomes ever more interconnected, the potential impacts of these future events are difficult to predict. Yet, the same interconnected infrastructure also allows us to rapidly detect earthquakes as they begin and provide seconds, tens of seconds, or a few minutes of warning. A demonstration earthquake early warning system is now operating in California and is being expanded to the west coast (www.ShakeAlert.org). In recent earthquakes in the Los Angeles region, alerts were generated that could have provided warning to the vast majority of Los Angelinos who experienced the shaking. Efforts are underway to build a public system. Smartphone technology will be used not only to issue the alerts, but also to collect data and improve the warnings. The MyShake project at UC Berkeley is currently testing an app that attempts to turn millions of smartphones into earthquake detectors. As our development of the technology continues, we can anticipate ever-more automated response to earthquake alerts. Already, the BART system in the San Francisco Bay Area automatically stops trains based on the alerts. In the future, elevators will stop, machinery will pause, hazardous materials will be isolated, and self-driving cars will pull over to the side of the road. In this presentation we will review the current status of the earthquake early warning system in the US. We will illustrate how smartphones can contribute to the system. Finally, we will review applications of the information to reduce future losses.

  4. Mexican Earthquakes and Tsunamis Catalog Reviewed

    Science.gov (United States)

    Ramirez-Herrera, M. T.; Castillo-Aja, R.

    2015-12-01

    Today the availability of information on the internet makes online catalogs easy to access for both scholars and the general public. The "Significant Earthquake Database" catalog, managed by NOAA's National Center for Environmental Information (NCEI, formerly NCDC), provides tabular and cartographic access to the earthquakes and tsunamis contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in the places where a disaster occurred and on descriptions being preserved in documents and oral tradition. For instrumental data, availability depends on the distribution and quality of seismic stations. The availability of information for the first half of the 20th century can therefore be improved by careful analysis of existing sources and by searching for and resolving inconsistencies. This study shows the advances we made in upgrading and refining data for the earthquake and tsunami catalog of Mexico from 1500 CE until today, presented as a table and a map. Data analysis allowed us to identify the following sources of error in the location of epicenters in existing catalogs: • Incorrect coordinate entry • Erroneous or mistaken place names • Data too general to locate the epicenter precisely, mainly for older earthquakes • Inconsistency between earthquake and tsunami occurrence: epicenters located too far inland reported as tsunamigenic. The process of completing the catalogs depends directly on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and

  5. Preparing for a "Big One": The great southern California shakeout

    Science.gov (United States)

    Jones, L.M.; Benthien, M.

    2011-01-01

    The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. On November 13, 2008, over 5 million Southern Californians pretended that the magnitude-7.8 ShakeOut scenario earthquake was occurring and practiced actions derived from results of the ShakeOut Scenario, to reduce the impact of a real San Andreas Fault event. The communications campaign was based on four principles: 1) consistent messaging from multiple sources; 2) visual reinforcement; 3) encouragement of "milling"; and 4) focus on concrete actions. The goals of the ShakeOut, established in Spring 2008, were: 1) to register 5 million people to participate in the drill; 2) to change the culture of earthquake preparedness in Southern California; and 3) to reduce earthquake losses in Southern California. Over 90% of the registrants surveyed the next year reported improvement in earthquake preparedness at their organization as a result of the ShakeOut. © 2011, Earthquake Engineering Research Institute.

  6. Learning from Earthquakes: 2014 Napa Valley Earthquake Reconnaissance Report

    OpenAIRE

    Fischer, Erica

    2014-01-01

    Structural damage was observed during reconnaissance after the 2014 South Napa Earthquake, and included damage to wine storage and fermentation tanks, collapse of wine storage barrel racks, unreinforced masonry building partial or full collapse, and residential building damage. This type of damage is not unique to the South Napa Earthquake, and was observed after other earthquakes such as the 1977 San Juan Earthquake, and the 2010 Maule Earthquake. Previous research and earthquakes have demon...

  7. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on basic design processes important to both non-specialists and engineers, so that readers become suitably well informed without needing to deal with the details of specialist understanding. The content of this encyclopedia informs technically inclined readers about the ways in which earthquakes can affect our infrastructure and how engineers would go about designing against, mitigating, and remediating these effects. The coverage ranges across buildings, foundations, underground construction, lifelines and bridges, roads, embankments, and slopes. The encycl...

  8. Recovering from the ShakeOut earthquake

    Science.gov (United States)

    Wein, Anne; Johnson, Laurie; Bernknopf, Richard

    2011-01-01

    Recovery from an earthquake like the M7.8 ShakeOut Scenario will be a major endeavor taking many years to complete. Hundreds of Southern California municipalities will be affected; most lack recovery plans or previous disaster experience. To support recovery planning, this paper 1) extends the regional ShakeOut Scenario analysis into the recovery period using a recovery model, 2) localizes analyses to identify longer-term impacts and issues in two communities, and 3) considers the regional context of local recovery. Key community insights about preparing for post-disaster recovery include the need to: geographically diversify city procurement; set earthquake mitigation priorities for critical infrastructure (e.g., the airport); plan to replace mobile homes with earthquake safety measures; consider post-earthquake redevelopment opportunities ahead of time; and develop post-disaster recovery management and governance structures. This work also showed that communities with minor damage are still sensitive to regional infrastructure damage and its potential long-term impacts on community recovery, highlighting the importance of community and infrastructure resilience strategies as well.

  9. Center for Multiscale Plasma Dynamics (UCLA/MIT), The Regents of the University of California, Los Angeles: Report on Activities, 2009-2010

    International Nuclear Information System (INIS)

    Carter, Troy

    2011-01-01

    The final 'phaseout' year of the CMPD ended July 2010; a no cost extension was requested until May 2011 in order to enable the MIT subcontract funds to be fully utilized. Research progress over this time included verification and validation activities for the BOUT and BOUT++ code, studies of spontaneous reconnection in the VTF facility at MIT, and studies of the interaction between Alfven waves and drift waves in LAPD. The CMPD also hosted the 6th plasma physics winter school in 2010 (jointly with the NSF frontier center the Center for Magnetic Self-Organization, significant funding came from NSF for this most recent iteration of the Winter School).

  10. Earthquake at 40 feet

    Science.gov (United States)

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet underwater. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography, when the earthquake struck.

  11. Determining on-fault earthquake magnitude distributions from integer programming

    Science.gov (United States)

    Geist, Eric L.; Parsons, Tom

    2018-02-01

    Earthquake magnitude distributions among faults within a fault system are determined from regional seismicity and fault slip rates using binary integer programming. A synthetic earthquake catalog (i.e., a list of randomly sampled magnitudes) that spans millennia is first formed, assuming that regional seismicity follows a Gutenberg-Richter relation. Each earthquake in the synthetic catalog can occur on any fault and at any location. The objective is to minimize misfits in the target slip rate for each fault, where slip for each earthquake is scaled from its magnitude. The decision vector consists of binary variables indicating which locations are optimal among all possibilities. Uncertainty estimates in fault slip rates provide explicit upper and lower bounding constraints to the problem. An implicit constraint is that an earthquake can only be located on a fault if the fault is long enough to contain that earthquake. A general mixed-integer programming solver, consisting of a number of different algorithms, is used to determine the optimal decision vector. A case study is presented for the State of California, where a 4 kyr synthetic earthquake catalog is created and faults with slip rates ≥3 mm/yr are considered, resulting in >10^6 variables. The optimal magnitude distributions for each of the faults in the system span a rich diversity of shapes, ranging from characteristic to power-law distributions.
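
    The first step of such a method, sampling a synthetic Gutenberg-Richter catalog, can be sketched by inverse-transform sampling. The minimum magnitude and b-value below are assumptions for illustration, not the study's values:

```python
import numpy as np

def gr_catalog(n, m_min=6.0, b=1.0, seed=0):
    """Sample n magnitudes from a Gutenberg-Richter distribution by
    inverse transform: P(M > m) = 10**(-b * (m - m_min)), so
    M = m_min - log10(1 - u) / b for u ~ Uniform(0, 1)."""
    u = np.random.default_rng(seed).random(n)
    return m_min - np.log10(1.0 - u) / b

mags = gr_catalog(100_000)
# Sanity check: for an exponential in ln(10), the mean excess magnitude
# above m_min recovers the b-value.
b_est = 1.0 / (np.log(10) * (mags.mean() - 6.0))
print(round(b_est, 2))
```

    In the actual optimization each sampled magnitude would then become a binary placement decision (which fault, which location), with slip-rate misfit as the objective; the sampling step shown here only fixes the catalog the solver works on.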

  12. Simulation of rockfalls triggered by earthquakes

    Science.gov (United States)

    Kobayashi, Y.; Harp, E.L.; Kagawa, T.

    1990-01-01

    A computer program to simulate the downslope movement of boulders in rolling or bouncing modes has been developed and applied to actual rockfalls triggered by the Mammoth Lakes, California, earthquake sequence in 1980 and the Central Idaho earthquake in 1983. In order to reproduce a movement mode in which bouncing predominated, we introduced an artificial unevenness to the slope surface by adding a small random number to the interpolated value at the mid-points between adjacent surveyed points. Three hundred simulations were computed for each site by changing the random number series, which determined travel distances and bouncing intervals. The movement of the boulders was, in general, rather erratic depending on the random numbers employed, and the results should be viewed as stochastic rather than deterministic. The closest agreement between calculated and actual movements was obtained at the site with the most detailed and accurate topographic measurements. © 1990 Springer-Verlag.
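
    The slope-roughening step can be sketched as follows. The perturbation amplitude and the example profile are hypothetical; the paper does not specify them:

```python
import random

def roughen_profile(surveyed, amplitude=0.05, seed=1):
    """Insert a midpoint between each pair of surveyed (x, z) points,
    offset vertically by a small uniform random number, mimicking the
    artificial slope unevenness described for the bouncing mode."""
    rng = random.Random(seed)
    out = []
    for (x0, z0), (x1, z1) in zip(surveyed, surveyed[1:]):
        out.append((x0, z0))
        zm = 0.5 * (z0 + z1) + rng.uniform(-amplitude, amplitude)
        out.append((0.5 * (x0 + x1), zm))
    out.append(surveyed[-1])
    return out

profile = [(0.0, 100.0), (10.0, 95.0), (20.0, 88.0)]
# Each of the 300 Monte Carlo runs would use a different seed here,
# giving a different realization of the rough surface.
print(len(roughen_profile(profile)))  # 5 points: 3 surveyed + 2 midpoints
```

    Re-running the boulder trajectory over many such realizations is what turns the deterministic equations of motion into a stochastic ensemble of runout distances.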

  13. Magmatic unrest beneath Mammoth Mountain, California

    Science.gov (United States)

    Hill, David P.; Prejean, Stephanie

    2005-09-01

    Mammoth Mountain, which stands on the southwest rim of Long Valley caldera in eastern California, last erupted ~57,000 years BP. Episodic volcanic unrest detected beneath the mountain since late 1979, however, emphasizes that the underlying volcanic system is still active and capable of producing future volcanic eruptions. The unrest symptoms include swarms of small (M ≤ 3) earthquakes, spasmodic bursts (rapid-fire sequences of brittle-failure earthquakes with overlapping coda), long-period (LP) and very-long-period (VLP) volcanic earthquakes, ground deformation, diffuse emission of magmatic CO2, and fumarole gases with elevated 3He/4He ratios. Spatial-temporal relations defined by the multi-parameter monitoring data, together with earthquake source mechanisms, suggest that this Mammoth Mountain unrest is driven by the episodic release of a volume of CO2-rich hydrous magmatic fluid derived from the upper reaches of a plexus of basaltic dikes and sills at mid-crustal depths (10-20 km). As the mobilized fluid ascends through the brittle-plastic transition zone and into the overlying brittle crust, it triggers earthquake swarm activity and, in the case of the prolonged, 11-month-long earthquake swarm of 1989, crustal deformation and the onset of diffuse CO2 emissions. Future volcanic activity from this system would most likely involve steam explosions or small-volume basaltic eruptions in strombolian or Hawaiian style. The impact of such an event would depend critically on vent location and season.

  14. African American Students in a California Community College: Perceptions of Cultural Congruity and Academic Self-Concept within a Black Culture Center

    Science.gov (United States)

    James, Tenisha Celita

    2017-01-01

    This study focused on the cultural congruity and academic self-concept of African American students in a community college setting who participated in a Black Culture Center. The purpose of this quantitative correlational study was to examine the relationship between cultural congruity and academic self-concept through the following two research…

  15. Slip in the 1857 and earlier large earthquakes along the Carrizo Plain, San Andreas Fault.

    Science.gov (United States)

    Zielke, Olaf; Arrowsmith, J Ramón; Grant Ludwig, Lisa; Akçiz, Sinan O

    2010-02-26

    The moment magnitude (Mw) 7.9 Fort Tejon earthquake of 1857, with an approximately 350-kilometer-long surface rupture, was the most recent major earthquake along the south-central San Andreas Fault, California. Based on previous measurements of its surface slip distribution, rupture along the approximately 60-kilometer-long Carrizo segment was thought to control the recurrence of 1857-like earthquakes. New high-resolution topographic data show that the average slip along the Carrizo segment during the 1857 event was 5.3 +/- 1.4 meters, eliminating the core assumption of a linkage between Carrizo segment rupture and the recurrence of major earthquakes along the south-central San Andreas Fault. Earthquake slip along the Carrizo segment may recur in earthquake clusters with cumulative slip of approximately 5 meters.

  16. Training Neural Networks Based on Imperialist Competitive Algorithm for Predicting Earthquake Intensity

    OpenAIRE

    Moradi, Mohsen

    2017-01-01

    In this study we determined neural network weights and biases with the Imperialist Competitive Algorithm (ICA) in order to train a network for predicting earthquake intensity on the Richter scale. We used parameters such as earthquake occurrence time, epicenter latitude and longitude in degrees, focal depth in kilometers, and the distance from the seismological center to the epicenter and the earthquake focal center in kilometers, provided by the Berkeley database. The studied neural network...

  17. Seismic Intensity, PGA and PGV for the South Napa Earthquake, August 24, 2014

    Science.gov (United States)

    Chen, S.; Pickering, A.; Mooney, W. D.; Crewdson, E.

    2014-12-01

    Numerous studies have investigated the statistical relationship between Modified Mercalli Intensity (MMI) and peak ground acceleration (PGA) and peak ground velocity (PGV). The Mw 6.0 South Napa (California) earthquake of August 24, 2014 provides valuable data to examine these relationships for both urban and rural environments within northern California. The finite fault model (D. Dreger, 2014) indicates that the fault rupture propagated predominantly NNW and up-dip from an 11-km-deep hypocenter. The epicentral location was 8 km SSW of downtown Napa. Recorded PGA north of the epicenter is as high as 600 cm/s² and PGV locally reaches 80 cm/s. Field studies by two teams of investigators were conducted on August 24-26 and September 8-11, 2014 to assign MMI values at 108 sites. The highest MMI values are strongly localized along the NNW-SSE rupture trend north of the epicenter. Parts of the city of Napa and some communities several km farther north on Dry Creek Road were found to have experienced shaking intensities of MMI VII to VIII. The observed effects include houses moved off their foundations, chimney collapse or damage, cracked foundations and/or interior walls, broken windows, and the lateral displacement of heavy furniture. Communities to the east and west of this zone of high seismic intensity reported significantly lower values of MMI, typically IV and V, even at distances as close as 10 km from the mapped surface rupture. In comparison with previous estimates by Wald et al. (1999) and Dangkua and Cramer (2011), we find that MMI III-VIII uniformly correspond to significantly larger (>150%) PGA and PGV values, as reported by the Center for Engineering Strong Motion Data (2014).
We attribute this observation to two primary factors: (1) improved earthquake engineering in the post-Loma Prieta earthquake era that has led to building construction, both new and retrofitted, that is more resistant to earthquake strong ground motions; and (2) a frequency band relevant
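
    The kind of regression being compared against can be sketched as follows. The coefficients are the commonly quoted Wald et al. (1999) California values, reproduced here from memory and intended only as an illustration of the MMI-PGA relationship, not as the values used in this study:

```python
import math

def mmi_from_pga(pga_cms2):
    """Instrumental intensity from PGA (cm/s^2), using a regression of
    the Wald et al. (1999) form for the upper intensity range:
    MMI = 3.66 * log10(PGA) - 1.66.  Coefficients are illustrative."""
    return 3.66 * math.log10(pga_cms2) - 1.66

# The ~600 cm/s^2 recorded north of the Napa epicenter maps to about
# MMI 8.5 under this regression, above the VII-VIII observed in the
# field, consistent with the abstract's >150% finding.
print(round(mmi_from_pga(600.0), 1))
```

    A gap of this kind between regression-predicted and field-assigned intensity is exactly what motivates re-deriving MMI-PGA relationships for modern building stock.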

  18. Initial impressions from the Northern California 2008 lightning siege: A report by a Wildland Fire Lessons Learned Center Information Collection Team

    Science.gov (United States)

    Jonetta T. Holt; David Christenson; Anne Black; Brett Fay; Kim Round

    2009-01-01

    "This event in NorCal is another of the major events we have experienced in fire management. In line with our desire to learn, we ought to line up a team to help us capture lessons learned from this event." This statement, and a regional delegation, were the impetus for an information collection team from the Wildland Fire Lessons Learned Center to visit with...

  19. California's restless giant: the Long Valley Caldera

    Science.gov (United States)

    Hill, David P.; Bailey, Roy A.; Hendley, James W.; Stauffer, Peter H.; Marcaida, Mae

    2014-01-01

    Scientists have monitored geologic unrest in the Long Valley, California, area since 1980. In that year, following a swarm of strong earthquakes, they discovered that the central part of the Long Valley Caldera had begun actively rising. Unrest in the area persists today. The U.S. Geological Survey (USGS) continues to provide the public and civil authorities with current information on the volcanic hazard at Long Valley and is prepared to give timely warnings of any impending eruption.

  20. Seasonal water storage modulating seismicity on California faults

    Science.gov (United States)

    Johnson, C. W.; Fu, Y.; Burgmann, R.

    2016-12-01

    In California, the accumulation of winter snowpack in the Sierra Nevada, surface water in lakes and reservoirs, and groundwater in sedimentary basins follows the annual cycle of wet winters and dry summers. The surface loads resulting from these seasonal changes in water storage produce elastic deformation of the Earth's crust. Micro-earthquakes in California appear to follow a subtle annual cycle, possibly in response to the water load. Previous studies posit that temperature, atmospheric pressure, or hydrologic changes may strain the lithosphere and promote additional earthquakes above background levels. Here we use GPS vertical time series (2006-2015) to constrain models of monthly hydrospheric loading and compute annual peak-to-peak stresses on faults throughout northern California, which can exceed 1 kPa. Depending on fault geometry, the addition or removal of water increases the Coulomb failure stress. The largest stress amplitudes occur on dipping reverse faults in the Coast Ranges and along the eastern Sierra Nevada range front. We analyze M≥2.0 earthquakes with known focal mechanisms in northern and central California to resolve the fault-normal and shear stresses for each focal geometry. Our results reveal more earthquakes occurring during slip-encouraging stress conditions and suggest that earthquake populations are modulated at the periods of natural loading cycles, which promote failure by subtle stress changes. The most notable shear-stress change occurs on more shallowly dipping structures. However, vertically dipping strike-slip faults are common throughout California; they experience smaller-amplitude stress changes but still exhibit a positive correlation with seasonal loading cycles. Our seismicity analysis suggests the annual hydrologic cycle is a viable mechanism to promote earthquakes and provides new insight into fault mechanical properties.
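
    Extracting an annual peak-to-peak amplitude from a GPS vertical time series is a least-squares harmonic fit. This is a generic sketch with synthetic data, not the study's processing chain (which also models the loading itself):

```python
import numpy as np

def annual_peak_to_peak(t_yr, disp):
    """Least-squares fit of offset + linear trend + annual sinusoid to a
    displacement series; returns the peak-to-peak amplitude of the
    annual term, the quantity that scales seasonal stress changes."""
    G = np.column_stack([np.ones_like(t_yr), t_yr,
                         np.sin(2 * np.pi * t_yr), np.cos(2 * np.pi * t_yr)])
    m, *_ = np.linalg.lstsq(G, disp, rcond=None)
    # amplitude of A*sin + B*cos is hypot(A, B); peak-to-peak is twice that
    return 2.0 * np.hypot(m[2], m[3])

# Synthetic series: offset, slow trend, and a 4-unit annual sinusoid,
# so the recovered peak-to-peak amplitude should be 8.
t = np.linspace(2006, 2015, 400)
synthetic = 1.5 + 0.2 * (t - 2006) + 4.0 * np.sin(2 * np.pi * t + 0.3)
print(round(annual_peak_to_peak(t, synthetic), 2))
```

    In practice a semiannual pair of harmonics is usually added to the design matrix, and the fitted amplitudes then feed an elastic loading model to get stresses on the faults.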

  1. A fault and seismicity based composite simulation in northern California

    Directory of Open Access Journals (Sweden)

    M. B. Yıkılmaz

    2011-12-01

    We generate synthetic catalogs of seismicity in northern California using a composite simulation. The basis of the simulation is the fault-based "Virtual California" (VC) earthquake simulator. Back-slip velocities and mean recurrence intervals are specified on model strike-slip faults. A catalog of characteristic earthquakes is generated for a period of 100,000 yr. These earthquakes are predominantly in the range M = 6 to M = 8, but do not follow Gutenberg-Richter (GR) scaling at lower magnitudes. In order to model seismicity on unmapped faults we introduce background seismicity, which occurs randomly in time with GR scaling and is spatially associated with the VC model faults. These earthquakes fill in the GR scaling down to M = 4 (the smallest earthquakes modeled). The rate of background seismicity is constrained by the observed rate of occurrence of M > 4 earthquakes in northern California. These earthquakes are then used to drive the BASS (branching aftershock sequence) model of aftershock occurrence. The BASS model is the self-similar limit of the ETAS (epidemic-type aftershock sequence) model. Families of aftershocks are generated following each Virtual California and background main shock. In the simulations the rate of occurrence of aftershocks is essentially equal to the rate of occurrence of main shocks in the magnitude range 4 < M < 7. We generate frequency-magnitude and recurrence-interval statistics, both regionally and fault-specific. We compare our modeled rates of seismicity and spatial variability with observations.
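
    Generating an aftershock family after each main shock requires sampling occurrence times from an Omori-type decay. A minimal sketch using inverse-transform sampling of a modified Omori law; the c and p values are illustrative, not the BASS parameters used in the paper:

```python
import numpy as np

def omori_times(n, c=0.01, p=1.2, t_max=365.0, seed=0):
    """Sample n aftershock times (days after the main shock) from a
    modified Omori rate ~ (t + c)**(-p) on [0, t_max], by inverting
    the cumulative distribution analytically."""
    u = np.random.default_rng(seed).random(n)
    lo = c ** (1.0 - p)
    hi = (t_max + c) ** (1.0 - p)
    return (lo + u * (hi - lo)) ** (1.0 / (1.0 - p)) - c

t = omori_times(10_000)
# Omori decay front-loads the sequence: the median time is a tiny
# fraction of the one-year window.
print(round(float(np.median(t)), 2))
```

    In a branching model such as BASS, each sampled aftershock can itself spawn offspring by the same rule, which is what produces self-similar aftershock families.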

  2. Earthquakes and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  3. The Antiquity of Earthquakes

    Indian Academy of Sciences (India)

    Department of Earth Sciences, University of Roorkee. Her interest is in computer-based solutions to geophysical and other earth science problems. If we adopt the definition that an earthquake is shaking of the earth due to natural causes, then we may argue that earthquakes have been occurring since the very beginning.

  4. Bam Earthquake in Iran

    CERN Document Server

    2004-01-01

    Following their request for help from members of international organisations, the permanent Mission of the Islamic Republic of Iran has given the following bank account number, where you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  5. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  6. The Antiquity of Earthquakes

    Indian Academy of Sciences (India)

    there are few estimates about this earthquake as it probably occurred in that early period of the earth's history about which astronomers, physicists, chemists and earth scientists are still sorting out their ideas. Yet, the notion of the earliest earthquake excites interest. We explore this theme here partly also because.

  7. Analysis of the spatial distribution between successive earthquakes

    International Nuclear Information System (INIS)

    Davidsen, Joern; Paczuski, Maya

    2005-01-01

    Spatial distances between subsequent earthquakes in southern California exhibit scale-free statistics, with a critical exponent δ≅0.6, as well as finite size scaling. The statistics are independent of the threshold magnitude as long as the catalog is complete, but depend strongly on the temporal ordering of events, rather than the geometry of the spatial epicenter distribution. Nevertheless, the spatial distance and waiting time between subsequent earthquakes are uncorrelated with each other. These observations contradict the theory of aftershock zone scaling with main shock magnitude

  8. Shallow Crustal Structure in the Northern Salton Trough, California: Insights from a Detailed 3-D Velocity Model

    Science.gov (United States)

    Ajala, R.; Persaud, P.; Stock, J. M.; Fuis, G. S.; Hole, J. A.; Goldman, M.; Scheirer, D. S.

    2017-12-01

    The Coachella Valley is the northern extent of the Gulf of California-Salton Trough. It contains the southernmost segment of the San Andreas Fault (SAF) for which a magnitude 7.8 earthquake rupture was modeled to help produce earthquake planning scenarios. However, discrepancies in ground motion and travel-time estimates from the current Southern California Earthquake Center (SCEC) velocity model of the Salton Trough highlight inaccuracies in its shallow velocity structure. An improved 3-D velocity model that better defines the shallow basin structure and enables the more accurate location of earthquakes and identification of faults is therefore essential for seismic hazard studies in this area. We used recordings of 126 explosive shots from the 2011 Salton Seismic Imaging Project (SSIP) to SSIP receivers and Southern California Seismic Network (SCSN) stations. A set of 48,105 P-wave travel time picks constituted the highest-quality input to a 3-D tomographic velocity inversion. To improve the ray coverage, we added network-determined first arrivals at SCSN stations from 39,998 recently relocated local earthquakes, selected to a maximum focal depth of 10 km, to develop a detailed 3-D P-wave velocity model for the Coachella Valley with 1-km grid spacing. Our velocity model shows good resolution ( 50 rays/cubic km) down to a minimum depth of 7 km. Depth slices from the velocity model reveal several interesting features. At shallow depths ( 3 km), we observe an elongated trough of low velocity, attributed to sediments, located subparallel to and a few km SW of the SAF, and a general velocity structure that mimics the surface geology of the area. The persistence of the low-velocity sediments to 5-km depth just north of the Salton Sea suggests that the underlying basement surface, shallower to the NW, dips SE, consistent with interpretation from gravity studies (Langenheim et al., 2005). 
On the western side of the Coachella Valley, we detect depth-restricted regions of

  9. Using a geographic information system (GIS) to assess pediatric surge potential after an earthquake.

    Science.gov (United States)

    Curtis, Jacqueline W; Curtis, Andrew; Upperman, Jeffrey S

    2012-06-01

    Geographic information systems (GIS) and geospatial technology (GT) can help hospitals improve plans for postdisaster surge by assessing numbers of potential patients in a catchment area and providing estimates of special needs populations, such as pediatrics. In this study, census-derived variables are computed for blockgroups within a 3-mile radius from Children's Hospital Los Angeles (CHLA) and from Los Angeles County-University of Southern California (LAC-USC) Medical Center. Landslide and liquefaction zones are overlaid on US Census Bureau blockgroups. Units that intersect with the hazard zones are selected for computation of pediatric surge potential in case of an earthquake. In addition, cartographic visualization and cluster analysis are performed on the entire 3-mile study area to identify hot spots of socially vulnerable populations. The results suggest the need for locally specified vulnerability models for pediatric populations. GIS and GT have untapped potential to contribute local specificity to planning for surge potential after a disaster. Although this case focuses on an earthquake hazard, the methodology is appropriate for an all-hazards approach. With the advent of Google Earth, GIS output can now be easily shared with medical personnel for broader application and improvement in planning.
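    The core GIS operation described here is selecting census units that intersect hazard zones and totaling a special-needs population over the selection. A schematic stand-in using axis-aligned bounding boxes in place of real polygon geometry (production work would use a GIS overlay, e.g. in geopandas); all identifiers and counts below are hypothetical:

```python
def boxes_intersect(a, b):
    """Axis-aligned bounding-box intersection test; boxes are
    (xmin, ymin, xmax, ymax)."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

# Hypothetical blockgroups: (id, bbox, pediatric population)
blockgroups = [
    ("bg-001", (0.0, 0.0, 1.0, 1.0), 320),
    ("bg-002", (2.5, 0.0, 3.5, 1.0), 150),
    ("bg-003", (0.5, 1.2, 1.5, 2.2), 275),
]
liquefaction_zone = (0.8, 0.8, 2.0, 2.0)

# Select units touching the hazard zone and total their pediatric count.
exposed = [bg for bg in blockgroups if boxes_intersect(bg[1], liquefaction_zone)]
pediatric_surge = sum(bg[2] for bg in exposed)
```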

  10. California Ocean Uses Atlas: Fishing sector

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset is a result of the California Ocean Uses Atlas Project: a collaboration between NOAA's National Marine Protected Areas Center and Marine Conservation...

  11. Coccidioidomycosis among Prison Inmates, California, USA, 2011

    Centers for Disease Control (CDC) Podcasts

    2015-02-26

    Dr. Charlotte Wheeler discusses Coccidioidomycosis among Prison Inmates in California.  Created: 2/26/2015 by National Center for Emerging and Zoonotic Infectious Diseases (NCEZID).   Date Released: 2/26/2015.

  12. California Ocean Uses Atlas: Industrial sector

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset is a result of the California Ocean Uses Atlas Project: a collaboration between NOAA's National Marine Protected Areas Center and Marine Conservation...

  13. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes; not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding the nature of strong ground motion near the source of the earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times; field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities; and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that, for future construction of NPPs, near-field strong motion must be explicitly considered in design

  14. CyberShake: A Physics-Based Seismic Hazard Model for Southern California

    Science.gov (United States)

    Graves, R.; Jordan, T.H.; Callaghan, S.; Deelman, E.; Field, E.; Juve, G.; Kesselman, C.; Maechling, P.; Mehta, G.; Milner, K.; Okaya, D.; Small, P.; Vahi, K.

    2011-01-01

    CyberShake, as part of the Southern California Earthquake Center's (SCEC) Community Modeling Environment, is developing a methodology that explicitly incorporates deterministic source and wave propagation effects within seismic hazard calculations through the use of physics-based 3D ground motion simulations. To calculate a waveform-based seismic hazard estimate for a site of interest, we begin with Uniform California Earthquake Rupture Forecast, Version 2.0 (UCERF2.0) and identify all ruptures within 200 km of the site of interest. We convert the UCERF2.0 rupture definition into multiple rupture variations with differing hypocenter locations and slip distributions, resulting in about 415,000 rupture variations per site. Strain Green Tensors are calculated for the site of interest using the SCEC Community Velocity Model, Version 4 (CVM4), and then, using reciprocity, we calculate synthetic seismograms for each rupture variation. Peak intensity measures are then extracted from these synthetics and combined with the original rupture probabilities to produce probabilistic seismic hazard curves for the site. Being explicitly site-based, CyberShake directly samples the ground motion variability at that site over many earthquake cycles (i. e., rupture scenarios) and alleviates the need for the ergodic assumption that is implicitly included in traditional empirically based calculations. Thus far, we have simulated ruptures at over 200 sites in the Los Angeles region for ground shaking periods of 2 s and longer, providing the basis for the first generation CyberShake hazard maps. Our results indicate that the combination of rupture directivity and basin response effects can lead to an increase in the hazard level for some sites, relative to that given by a conventional Ground Motion Prediction Equation (GMPE). Additionally, and perhaps more importantly, we find that the physics-based hazard results are much more sensitive to the assumed magnitude-area relations and
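    The final step of the CyberShake workflow, combining per-rupture intensity measures with rupture probabilities into a site-specific hazard curve, can be sketched with toy numbers. This assumes Poissonian occurrence and equal weighting of rupture variations; all rates and spectral-acceleration values below are hypothetical:

```python
import math

# Hypothetical ruptures: (annual rate, simulated peak SA values in g,
# one per rupture variation, equally weighted).
ruptures = [
    (0.01,  [0.05, 0.08, 0.12, 0.20]),
    (0.002, [0.15, 0.25, 0.40, 0.60]),
]

def exceedance_rate(x):
    """Annual rate of exceeding ground-motion level x: sum over ruptures
    of (rupture rate) x (fraction of its variations exceeding x)."""
    rate = 0.0
    for annual_rate, sims in ruptures:
        frac = sum(1 for s in sims if s > x) / len(sims)
        rate += annual_rate * frac
    return rate

def poisson_prob(x, years=50.0):
    """Probability of at least one exceedance of level x in `years`."""
    return 1.0 - math.exp(-exceedance_rate(x) * years)

# Three points on the hazard curve for a 50-year exposure window.
curve = [(x, poisson_prob(x)) for x in (0.1, 0.2, 0.4)]
```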

  15. Hotspots, Lifelines, and the SAFRR Haywired Earthquake Sequence

    Science.gov (United States)

    Ratliff, J. L.; Porter, K.

    2014-12-01

    Though California has experienced many large earthquakes (San Francisco, 1906; Loma Prieta, 1989; Northridge, 1994), the San Francisco Bay Area has not had a damaging earthquake for 25 years. Earthquake risk and surging reliance on smartphones and the Internet to handle everyday tasks raise the question: is an increasingly technology-reliant Bay Area prepared for potential infrastructure impacts caused by a major earthquake? How will a major earthquake on the Hayward Fault affect lifelines (roads, power, water, communication, etc.)? The U.S. Geological Survey Science Application for Risk Reduction (SAFRR) program's Haywired disaster scenario, a hypothetical two-year earthquake sequence triggered by a M7.05 mainshock on the Hayward Fault, addresses these and other questions. We explore four geographic aspects of lifeline damage from earthquakes: (1) geographic lifeline concentrations, (2) areas where lifelines pass through high shaking or potential ground-failure zones, (3) areas with diminished lifeline service demand due to severe building damage, and (4) areas with increased lifeline service demand due to displaced residents and businesses. Potential mainshock lifeline vulnerability and spatial demand changes will be discerned by superimposing earthquake shaking, liquefaction probability, and landslide probability damage thresholds with lifeline concentrations and with large-capacity shelters. Intersecting high hazard levels and lifeline clusters represent potential lifeline susceptibility hotspots. We will also analyze possible temporal vulnerability and demand changes using an aftershock shaking threshold. The results of this analysis will inform regional lifeline resilience initiatives and response and recovery planning, as well as reveal potential redundancies and weaknesses for Bay Area lifelines. Identified spatial and temporal hotspots can provide stakeholders with a reference for possible systemic vulnerability resulting from an earthquake sequence.

  16. The SMART CLUSTER METHOD - adaptive earthquake cluster analysis and declustering

    Science.gov (United States)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2016-04-01

    Earthquake declustering is an essential part of almost any statistical analysis of the spatial and temporal properties of seismic activity, with usual applications comprising probabilistic seismic hazard assessments (PSHAs) and earthquake prediction methods. The nature of earthquake clusters and the subsequent declustering of earthquake catalogues play a crucial role in determining the magnitude-dependent earthquake return period and its spatial variation. Various methods have been developed by other researchers to address this issue, ranging in complexity from rather simple statistical window methods to complex epidemic-type models. This study introduces the smart cluster method (SCM), a new methodology to identify earthquake clusters, which uses an adaptive point process for spatio-temporal identification. Hereby, an adaptive search algorithm for data-point clusters is adopted: it uses the earthquake density in the spatio-temporal neighbourhood of each event to adjust the search properties. The identified clusters are subsequently analysed to determine directional anisotropy, focussing on a strong correlation along the rupture plane, and the search space is adjusted with respect to these directional properties. In the case of rapid subsequent ruptures, such as the 1992 Landers sequence or the 2010/2011 Darfield-Christchurch events, an adaptive classification procedure is applied to disassemble subsequent ruptures that may have been grouped into an individual cluster, using near-field searches, support vector machines and temporal splitting. The steering parameters of the search behaviour are linked to local earthquake properties such as magnitude of completeness, earthquake density and Gutenberg-Richter parameters. The method is capable of identifying and classifying earthquake clusters in space and time. It is tested and validated using earthquake data from California and New Zealand.
As a result of the cluster identification process, each event in
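    The cluster-identification idea can be illustrated with a much simpler fixed-window linking scheme (a classical window method, not the adaptive SCM): an event joins the cluster of any earlier event that lies within assumed distance and time windows. Catalog values below are hypothetical:

```python
def link_clusters(events, dist_km=50.0, days=30.0):
    """Greedy single-link clustering over a time-sorted catalog of
    (t_days, x_km, y_km, magnitude) tuples. An event joins the cluster
    of the first earlier event within the space-time window; otherwise
    it starts a new cluster. Returns one cluster id per event."""
    ids = []
    next_id = 0
    for i, (t, x, y, _mag) in enumerate(events):
        cid = None
        for j in range(i):
            tj, xj, yj, _mj = events[j]
            close = ((x - xj) ** 2 + (y - yj) ** 2) ** 0.5 <= dist_km
            recent = 0.0 <= t - tj <= days
            if close and recent:
                cid = ids[j]
                break
        if cid is None:
            cid = next_id
            next_id += 1
        ids.append(cid)
    return ids

# A mainshock with two nearby aftershocks, plus one distant independent event.
catalog = [(0.0, 0.0, 0.0, 6.5), (1.0, 5.0, 3.0, 4.2),
           (2.0, 400.0, 0.0, 5.0), (10.0, 8.0, -4.0, 3.9)]
labels = link_clusters(catalog)
```

SCM replaces the fixed `dist_km`/`days` windows with search properties adapted to local event density and directional anisotropy.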

  17. The ShakeOut earthquake scenario: Verification of three simulation sets

    Science.gov (United States)

    Bielak, J.; Graves, R.W.; Olsen, K.B.; Taborda, R.; Ramirez-Guzman, L.; Day, S.M.; Ely, G.P.; Roten, D.; Jordan, T.H.; Maechling, P.J.; Urbanic, J.; Cui, Y.; Juve, G.

    2010-01-01

    This paper presents a verification of three simulations of the ShakeOut scenario, an Mw 7.8 earthquake on a portion of the San Andreas fault in southern California, conducted by three different groups at the Southern California Earthquake Center using the SCEC Community Velocity Model for this region. We conducted two simulations using the finite difference method, and one using the finite element method, and performed qualitative and quantitative comparisons between the corresponding results. The results are in good agreement with each other; only small differences occur, both in amplitude and phase, between the various synthetics at ten observation points located near and away from the fault, as far as 150 km from it. Using an available goodness-of-fit criterion, all the comparisons scored above 8, with most above 9.2. This score would be regarded as excellent if the measurements were between recorded and synthetic seismograms. We also report results of comparisons based on time-frequency misfit criteria. Results from these two criteria can be used for calibrating the two methods for comparing seismograms. In those cases in which noticeable discrepancies occurred between the seismograms generated by the three groups, we found that they were the product of inherent characteristics of the various numerical methods used and their implementations. In particular, we found that the major source of discrepancy lies in the difference between mesh and grid representations of the same material model. Overall, however, even the largest differences in the synthetic seismograms are small. Thus, given the complexity of the simulations used in this verification, it appears that the three schemes are consistent, reliable and sufficiently accurate and robust for use in future large-scale simulations. © 2009 The Authors; Journal compilation © 2009 RAS.

  18. Earthquakes and faults in the San Francisco Bay area (1970-2003)

    Science.gov (United States)

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.; Wong, Florence L.; Saucedo, George J.

    2004-01-01

    The map depicts both active and inactive faults and earthquakes magnitude 1.5 to 7.0 in the greater San Francisco Bay area. Twenty-two earthquakes magnitude 5.0 and greater are indicated on the map and listed chronologically in an accompanying table. The data are compiled from records from 1970-2003. The bathymetry was generated from a digital version of NOAA maps and hydrogeographic data for San Francisco Bay. Elevation data are from the USGS National Elevation Database. Landsat satellite image is from seven Landsat 7 Enhanced Thematic Mapper Plus scenes. Fault data are reproduced with permission from the California Geological Survey. The earthquake data are from the Northern California Earthquake Catalog.

  19. Spatiotemporal correlations of earthquakes

    International Nuclear Information System (INIS)

    Farkas, J.; Kun, F.

    2007-01-01

    Complete text of publication follows. An earthquake is the result of a sudden release of energy in the Earth's crust that creates seismic waves. At the present technological level, earthquakes of magnitude larger than three can be recorded all over the world. In spite of the apparent randomness of earthquake occurrence, long term measurements have revealed interesting scaling laws of earthquake characteristics: the rate of aftershocks following major earthquakes has a power law decay (Omori law); the magnitude distribution of earthquakes exhibits a power law behavior (Gutenberg-Richter law), furthermore, it has recently been pointed out that epicenters form fractal networks in fault zones (Kagan law). The theoretical explanation of earthquakes is based on plate tectonics: the earth's crust has been broken into plates which slowly move under the action of the flowing magma. Neighboring plates touch each other along ridges (fault zones) where a large amount of energy is stored in deformation. Earthquakes occur when the stored energy exceeds a material dependent threshold value and gets released in a sudden jump of the plate. The Burridge-Knopoff (BK) model of earthquakes represents earth's crust as a coupled system of driven oscillators where nonlinearity occurs through a stick-slip frictional instability. Laboratory experiments have revealed that under a high pressure the friction of rock interfaces exhibits a weakening with increasing velocity. In the present project we extend recent theoretical studies of the BK model by taking into account a realistic velocity weakening friction force between tectonic plates. Varying the strength of weakening a broad spectrum of interesting phenomena is obtained: the model reproduces the Omori and Gutenberg-Richter laws of earthquakes, furthermore, it provides information on the correlation of earthquake sequences. We showed by computer simulations that the spatial and temporal correlations of consecutive earthquakes are very
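    The stick-slip picture behind the Burridge-Knopoff model can be sketched with a minimal quasi-static chain of driven blocks. This simplified variant omits the velocity-weakening friction studied in the record and uses an assumed dissipative stress-transfer factor; it only illustrates how uniform slow loading plus threshold slip produces avalanche-like events:

```python
def run_bk(n_blocks=10, steps=200, threshold=1.0, k_c=0.4, drive=0.01):
    """Quasi-static stick-slip chain: each step adds uniform tectonic
    loading; a block whose force reaches the static threshold slips to
    zero local force and transfers a fraction k_c of it to each
    neighbour (boundary blocks lose the rest). Returns the sizes
    (number of slips) of the resulting events."""
    force = [0.0] * n_blocks
    events = []
    for _ in range(steps):
        force = [f + drive for f in force]
        size = 0
        unstable = [i for i, f in enumerate(force) if f >= threshold]
        while unstable:
            i = unstable.pop()
            f = force[i]
            force[i] = 0.0
            size += 1
            for j in (i - 1, i + 1):
                if 0 <= j < n_blocks:
                    force[j] += k_c * f
                    if force[j] >= threshold and j not in unstable:
                        unstable.append(j)
        if size:
            events.append(size)
    return events

events = run_bk()
```

Because the loading is uniform, the first event is a system-wide avalanche; with heterogeneous thresholds or loading rates, the model produces the broad event-size distributions discussed in the abstract.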

  20. Geology and petrology of the Woods Mountains Volcanic Center, southeastern California: Implications for the genesis of peralkaline rhyolite ash flow tuffs

    Science.gov (United States)

    McCurry, Michael

    1988-12-01

    The Woods Mountains Volcanic Center is a middle Miocene silicic caldera complex located at the transition from the northern to the southern Basin and Range provinces of the western United States. It consists of a trachyte-trachydacite-rhyolite-peralkaline rhyolite association of lava flows, domes, plugs, pyroclastic rocks, and epiclastic breccia. Volcanism began at about 16.4 Ma, near the end of a local resurgence of felsic to intermediate magmatism and associated crustal extension. Numerous metaluminous high-K trachyte, trachydacite, and rhyolite lava flows, domes, and pyroclastic deposits accumulated from vents scattered over an area of 200 km2, forming a broad volcanic field with an initial volume of about 10 km3. At 15.8 Ma, about 80 km3 of metaluminous to mildly peralkaline high-K rhyolite ash flows were erupted from vents in the western part of the field in three closely spaced pulses, resulting in the formation of a trap-door caldera 10 km in diameter. The ash flows formed the Wild Horse Mesa Tuff, a compositionally zoned ash flow sheet that originally covered an area of about 600 km2 to a maximum thickness of at least 320 m. High-K trachyte pumice lapilli, some of which are intimately banded with rhyolite, were produced late in the two later eruptions. Intracaldera volcanism from widely distributed vents rapidly filled the caldera with about 10 km3 of high-K, mildly peralkaline, high-silica rhyolite lava flows and pyroclastic deposits. These are interlayered with breccia derived from the caldera scarp. They are intruded by numerous compositionally similar plugs, some of which structurally uplifted and fractured the center of the caldera. The center evolved above a high-K trachyte magma chamber about 10 km in diameter that had developed and differentiated within the upper crust at about 15.8 Ma. 
Petrological, geochemical, and geophysical data are consistent with the idea that a cap of peralkaline rhyolite magma formed within the trachyte chamber as a result

  1. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500 people.

  2. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has provided an opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the 1908 Messina-Reggio Calabria earthquake, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-existing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods now in use. Of the three centennial earthquakes considered, only the 1906 Valparaiso earthquake has been repeated, as the 1985 Central Chile Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study the only three centennial three-story buildings that survived both earthquakes almost undamaged were identified. Since accelerograms were recorded for the 1985 earthquake both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings considers instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand
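    The destructiveness potential factor mentioned at the end of this record is usually defined (after Araya and Saragoni) as P_D = I_A / ν0², where I_A is the Arias intensity and ν0 the rate of zero crossings of the accelerogram. A sketch under that definition, applied to a synthetic decaying-sinusoid accelerogram (all signal parameters assumed):

```python
import math

def arias_intensity(acc, dt, g=9.81):
    """Arias intensity (m/s): (pi / 2g) * integral of a(t)^2 dt."""
    return math.pi / (2.0 * g) * sum(a * a for a in acc) * dt

def destructiveness_potential(acc, dt):
    """P_D = I_A / nu0^2, with nu0 the number of zero crossings of the
    accelerogram per unit time (Araya-Saragoni definition, as assumed)."""
    crossings = sum(1 for a0, a1 in zip(acc, acc[1:]) if a0 * a1 < 0)
    nu0 = crossings / (len(acc) * dt)
    return arias_intensity(acc, dt) / nu0 ** 2

# Synthetic accelerogram (m/s^2): a 2 Hz decaying sinusoid at 100 Hz.
dt = 0.01
acc = [3.0 * math.exp(-0.5 * t) * math.sin(2 * math.pi * 2.0 * t)
       for t in (i * dt for i in range(1000))]
pd_value = destructiveness_potential(acc, dt)
```

Dividing by ν0² penalizes high-frequency motions, which is why P_D correlates better with observed damage than peak acceleration alone.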

  3. A Real-Time Earthquake Moment Tensor Scanning Code for the Antelope System (BRTT, Inc.)

    Science.gov (United States)

    Macpherson, K. A.; Ruppert, N. A.; Freymueller, J. T.; Lindquist, K.; Harvey, D.; Dreger, D. S.; Lombard, P. N.; Guilhem, A.

    2015-12-01

    While all seismic observatories routinely determine hypocentral location and local magnitude within a few minutes of an earthquake's occurrence, the ability to estimate seismic moment and sense of slip in a similar time frame is less widespread. This is unfortunate, because moment and mechanism are critical parameters for rapid hazard assessment; for larger events, moment magnitude is more reliable due to the tendency of local magnitude to saturate, and certain mechanisms such as off-shore thrust events might indicate earthquakes with tsunamigenic potential. In order to increase access to this capability, we have developed a continuous moment tensor scanning code for Antelope, the ubiquitous open-architecture seismic acquisition and processing software in use around the world. The scanning code, which uses an algorithm that has previously been employed for real-time monitoring at the University of California, Berkeley, is able to produce full moment tensor solutions for moderate events from regional seismic data. The algorithm monitors a grid of potential sources by continuously cross-correlating pre-computed synthetic seismograms with long-period recordings from a sparse network of broad-band stations. The code package consists of 3 modules. One module is used to create a monitoring grid by constructing source-receiver geometry, calling a frequency-wavenumber code to produce synthetics, and computing the generalized linear inverse of the array of synthetics. There is a real-time scanning module that correlates streaming data with pre-inverted synthetics, monitors the variance reduction, and writes the moment tensor solution to a database if an earthquake detection occurs. Finally, there is an 'off-line' module that is very similar to the real-time scanner, with the exception that it utilizes pre-recorded data stored in Antelope databases and is useful for testing purposes or for quickly producing moment tensor catalogs for long time series. 
The code is open source
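    The inner step of the scanner, fitting streaming data as a linear combination of precomputed synthetics and monitoring the variance reduction, can be sketched in reduced form. Here only two basis synthetics stand in for the six moment-tensor components, and all waveform values are hypothetical; the real module pre-inverts the full synthetic array for every grid source:

```python
def invert_two_sources(g1, g2, data):
    """Least-squares weights for two precomputed synthetics (a reduced
    stand-in for the six-component moment tensor inverse): solve the
    2x2 normal equations G^T G m = G^T d."""
    a = sum(x * x for x in g1)
    b = sum(x * y for x, y in zip(g1, g2))
    c = sum(y * y for y in g2)
    r1 = sum(x * d for x, d in zip(g1, data))
    r2 = sum(y * d for y, d in zip(g2, data))
    det = a * c - b * b
    return (c * r1 - b * r2) / det, (a * r2 - b * r1) / det

def variance_reduction(g1, g2, data, m1, m2):
    """VR = 1 - ||d - Gm||^2 / ||d||^2; values near 1 at some grid
    source flag a detection."""
    resid = sum((d - m1 * x - m2 * y) ** 2
                for x, y, d in zip(g1, g2, data))
    return 1.0 - resid / sum(d * d for d in data)

# Hypothetical long-period synthetics and "observed" data built from them.
g1 = [0.0, 1.0, 2.0, 1.0, 0.0, -1.0]
g2 = [1.0, 0.0, -1.0, 0.0, 1.0, 0.0]
data = [2.0 * x + 0.5 * y for x, y in zip(g1, g2)]
m1, m2 = invert_two_sources(g1, g2, data)
vr = variance_reduction(g1, g2, data, m1, m2)
```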

  4. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed
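    The sensitivity to the lower-bound magnitude can be made concrete with a Gutenberg-Richter recurrence relation, log10 N(≥M) = a - bM. With b = 1 (the a-value below is hypothetical), lowering the cutoff from 5.0 to 3.75 admits roughly 18 times as many events into the hazard integral:

```python
def gr_rate_above(m, a=4.0, b=1.0):
    """Annual number of earthquakes with magnitude >= m from a
    Gutenberg-Richter relation log10 N = a - b*M (hypothetical a, b)."""
    return 10.0 ** (a - b * m)

rate_llnl = gr_rate_above(3.75)   # lower-bound magnitude used by LLNL
rate_epri = gr_rate_above(5.0)    # lower-bound magnitude used by EPRI
ratio = rate_llnl / rate_epri     # = 10**1.25, about 17.8x more events
```

Whether those extra small events contribute meaningful hazard then depends on the ground-motion model and the damage threshold assumed, which is the crux of the debate described above.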

  5. Earthquake Resilient Bridge Columns Utilizing Damage Resistant Hybrid Fiber Reinforced Concrete

    OpenAIRE

    Trono, William Dean

    2014-01-01

    Modern reinforced concrete bridges are designed to avoid collapse and to prevent loss of life during earthquakes. To meet these objectives, bridge columns are typically detailed to form ductile plastic hinges when large displacements occur. California seismic design criteria acknowledges that damage such as concrete cover spalling and reinforcing bar yielding may occur in columns during a design-level earthquake. The seismic resilience of bridge columns can be improved through the use of a da...

  6. Magnitude and location of historical earthquakes in Japan and implications for the 1855 Ansei Edo earthquake

    Science.gov (United States)

    Bakun, W.H.

    2005-01-01

    Japan Meteorological Agency (JMA) intensity assignments IJMA are used to derive intensity attenuation models suitable for estimating the location and an intensity magnitude Mjma for historical earthquakes in Japan. The intensity for shallow crustal earthquakes on Honshu is equal to -1.89 + 1.42MJMA - 0.00887Δh - 1.66 log Δh, where MJMA is the JMA magnitude, Δh = (Δ^2 + h^2)^(1/2), and Δ and h are epicentral distance and focal depth (km), respectively. Four earthquakes located near the Japan Trench were used to develop a subducting-plate intensity attenuation model where intensity is equal to -8.33 + 2.19MJMA - 0.00550Δh - 1.14 log Δh. The IJMA assignments for the MJMA 7.9 great 1923 Kanto earthquake on the Philippine Sea-Eurasian plate interface are consistent with the subducting-plate model. Using the subducting-plate model and 226 IJMA IV-VI assignments, the location of the intensity center is 25 km north of the epicenter, Mjma is 7.7, and MJMA is 7.3-8.0 at the 1σ confidence level. Intensity assignments and reported aftershock activity for the enigmatic 11 November 1855 Ansei Edo earthquake are consistent with an MJMA 7.2 Philippine Sea-Eurasian interplate source or Philippine Sea intraslab source at about 30 km depth. If the 1855 earthquake was a Philippine Sea-Eurasian interplate event, the intensity center was adjacent to and downdip of the rupture area of the great 1923 Kanto earthquake, suggesting that the 1855 and 1923 events ruptured adjoining sections of the Philippine Sea-Eurasian plate interface.
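    The crustal attenuation relation quoted in this record (coefficients -1.89, 1.42, -0.00887, -1.66, with the slant distance as the distance metric) can be evaluated directly; the example magnitude and distances below are illustrative only:

```python
import math

def ijma_crustal(m_jma, delta_km, depth_km):
    """Predicted JMA intensity for shallow crustal earthquakes on Honshu:
    I = -1.89 + 1.42*M - 0.00887*Dh - 1.66*log10(Dh),
    where Dh = sqrt(delta^2 + h^2) is the slant distance in km."""
    dh = math.hypot(delta_km, depth_km)
    return -1.89 + 1.42 * m_jma - 0.00887 * dh - 1.66 * math.log10(dh)

# An MJMA 7.0 event at 30 km epicentral distance and 10 km depth
# predicts an intensity near the strong-shaking range.
i_near = ijma_crustal(7.0, 30.0, 10.0)
i_far = ijma_crustal(7.0, 100.0, 10.0)
```

In the inverse problem of the paper, trial epicenters and magnitudes are adjusted so that predictions like these best match the historical intensity assignments.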

  7. Earthquakes and emergence

    Science.gov (United States)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  8. Earthquake early Warning ShakeAlert system: West coast wide production prototype

    Science.gov (United States)

    Kohler, Monica D.; Cochran, Elizabeth S.; Given, Douglas; Guiwits, Stephen; Neuhauser, Doug; Hensen, Ivan; Hartog, Renate; Bodin, Paul; Kress, Victor; Thompson, Stephen; Felizardo, Claude; Brody, Jeff; Bhadha, Rayo; Schwarz, Stan

    2017-01-01

    Earthquake early warning (EEW) is an application of seismological science that can give people, as well as mechanical and electrical systems, up to tens of seconds to take protective actions before peak earthquake shaking arrives at a location. Since 2006, the U.S. Geological Survey has been working in collaboration with several partners to develop EEW for the United States. The goal is to create and operate an EEW system, called ShakeAlert, for the highest risk areas of the United States, starting with the West Coast states of California, Oregon, and Washington. In early 2016, the Production Prototype v.1.0 was established for California; then, in early 2017, v.1.2 was established for the West Coast, with earthquake notifications being distributed to a group of beta users in California, Oregon, and Washington. The new ShakeAlert Production Prototype was an outgrowth from an earlier demonstration EEW system that began sending test notifications to selected users in California in January 2012. ShakeAlert leverages the considerable physical, technical, and organizational earthquake monitoring infrastructure of the Advanced National Seismic System, a nationwide federation of cooperating seismic networks. When fully implemented, the ShakeAlert system may reduce damage and injury caused by large earthquakes, improve the nation’s resilience, and speed recovery.

  9. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities that has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests.
We show how the Reliability and
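    A minimal sketch of the seismicity-based idea described above, assuming Gutenberg-Richter scaling for the expected number of small events per large-event cycle and a Weibull form in "natural time" (the accumulated small-event count). The b-value, Weibull exponent, and magnitude thresholds below are illustrative choices, not the published NTW fit:

```python
import math

def large_quake_probability(n_small, m_small=4.0, m_large=6.0, b=1.0, beta=1.4):
    """Sketch of a natural-time Weibull forecast.

    Gutenberg-Richter scaling gives N* = 10**(b*(m_large - m_small)),
    the expected number of small (>= m_small) events per large
    (>= m_large) event cycle; the Weibull CDF in natural time n/N*
    is read as a conditional probability of the next large event.
    """
    n_star = 10 ** (b * (m_large - m_small))  # expected small events per cycle
    return 1.0 - math.exp(-((n_small / n_star) ** beta))

# More small events accumulated since the last large earthquake
# implies a higher estimated probability of the next one
p_50 = large_quake_probability(50)
p_150 = large_quake_probability(150)
```

    In a real implementation the counts would come from a declustered regional catalog (e.g. ANSS) and the parameters from a maximum-likelihood fit to past cycles.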

  10. Shalginsk earthquake in Central Kazakhstan of August 22, 2001

    International Nuclear Information System (INIS)

    Mikhajlova, N.N.; Nedelkov, A.I.; Sokolova, I.N.; Kazakov, E.N.; Belyashov, A.V.

    2001-01-01

    The results of a study of the earthquake that occurred in Central Kazakhstan on August 22, 2001, in a region traditionally considered aseismic, are presented in the paper. Instrumental and macro-seismic hypocenter parameters and a catalog of aftershocks are given, along with a geological and tectonic description of the epicentral region. Conclusions are drawn about the actual accuracy of epicenter coordinate estimates by different data processing centers. (author)

  11. GEM - The Global Earthquake Model

    Science.gov (United States)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments on the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public-at-large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  12. DOE SciDAC’s Earth System Grid Center for Enabling Technologies Final Report for University of Southern California Information Sciences Institute

    Energy Technology Data Exchange (ETDEWEB)

    Chervenak, Ann Louise [Univ. of Southern California Information Sciences Inst., Marina del Rey, CA (United States)

    2013-12-19

    The mission of the Earth System Grid Federation (ESGF) is to provide the worldwide climate-research community with access to the data, information, model codes, analysis tools, and intercomparison capabilities required to make sense of enormous climate data sets. Its specific goals are to (1) provide an easy-to-use and secure web-based data access environment for data sets; (2) add value to individual data sets by presenting them in the context of other data sets and tools for comparative analysis; (3) address the specific requirements of participating organizations with respect to bandwidth, access restrictions, and replication; (4) ensure that the data are readily accessible through the analysis and visualization tools used by the climate research community; and (5) transfer infrastructure advances to other domain areas. For the ESGF, the U.S. Department of Energy’s (DOE’s) Earth System Grid Center for Enabling Technologies (ESG-CET) team has led international development and delivered a production environment for managing and accessing ultra-scale climate data. This production environment includes multiple national and international climate projects (such as the Community Earth System Model and the Coupled Model Intercomparison Project), ocean model data (such as the Parallel Ocean Program), observation data (Atmospheric Radiation Measurement Best Estimate, Carbon Dioxide Information and Analysis Center, Atmospheric Infrared Sounder, etc.), and analysis and visualization tools, all serving a diverse user community. These data holdings and services are distributed across multiple ESG-CET sites (such as ANL, LANL, LBNL/NERSC, LLNL/PCMDI, NCAR, and ORNL) and at unfunded partner sites, such as the Australian National University National Computational Infrastructure, the British Atmospheric Data Centre, the National Oceanic and Atmospheric Administration Geophysical Fluid Dynamics Laboratory, the Max Planck Institute for Meteorology, the German Climate Computing

  13. Earthquake Ground Motion Selection

    Science.gov (United States)

    2012-05-01

    Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

  14. 1988 Spitak Earthquake Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 1988 Spitak Earthquake database is an extensive collection of geophysical and geological data, maps, charts, images and descriptive text pertaining to the...

  15. Real-time earthquake monitoring: Early warning and rapid response

    Science.gov (United States)

    1991-01-01

    A panel was established to investigate the subject of real-time earthquake monitoring (RTEM) and suggest recommendations on the feasibility of using a real-time earthquake warning system to mitigate earthquake damage in regions of the United States. The findings of the investigation and the related recommendations are described in this report. A brief review of existing real-time seismic systems is presented with particular emphasis given to the current California seismic networks. Specific applications of a real-time monitoring system are discussed along with issues related to system deployment and technical feasibility. In addition, several non-technical considerations are addressed including cost-benefit analysis, public perceptions, safety, and liability.

  16. Earthquake magnitude time series: scaling behavior of visibility networks

    Science.gov (United States)

    Aguilar-San Juan, B.; Guzmán-Vargas, L.

    2013-11-01

    We present a statistical analysis of earthquake magnitude sequences in terms of the visibility graph method. Magnitude time series from Italy, Southern California, and Mexico are transformed into networks and some organizational graph properties are discussed. Connectivities are characterized by a scale-free distribution with a noticeable effect for large scales due to either the presence or the lack of large events. Also, a scaling behavior is observed between different node measures like betweenness centrality, clustering coefficient, nearest neighbor connectivity, and earthquake magnitude. Moreover, parameters that quantify the difference between forward and backward links are proposed to evaluate the asymmetry of the visibility attachment mechanism. Our results show an alternating average behavior of these parameters as earthquake magnitude changes. Finally, we evaluate the effects of reducing temporal and spatial windows of observation upon visibility network properties for main-shocks.
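    The transformation referenced above can be sketched with the standard natural-visibility criterion (two samples are linked if the straight line between them clears every intermediate sample); the magnitude series here is made up for illustration:

```python
def visibility_edges(series):
    """Natural visibility graph of a time series: nodes are samples
    (index, value), and samples i < j are connected if every
    intermediate sample k lies strictly below the straight line
    joining them."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[j] + (series[i] - series[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.add((i, j))
    return edges

# Illustrative magnitude sequence; the peak event becomes a hub
magnitudes = [4.1, 3.2, 5.6, 3.0, 3.4, 4.8, 2.9]
edges = visibility_edges(magnitudes)
degree = {i: sum(1 for e in edges if i in e) for i in range(len(magnitudes))}
```

    Large-magnitude events "see" many neighbors and acquire high degree, which is the mechanism behind the scale-free connectivity distributions the abstract reports.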

  17. Hospital disaster operations during the 1989 Loma Prieta earthquake.

    Science.gov (United States)

    Martchenke, J; Pointer, J E

    1994-01-01

    Objective: To study hospital disaster operations following a major United States disaster. Methods: Researchers interviewed all 51 hospital administrators and 49 of 51 emergency department (ED) charge nurses and emergency physicians who were on duty at the study hospitals during the 13-hour period immediately following the 1989 Loma Prieta earthquake. Setting: The 51 acute-care hospitals in the six northern California counties most affected by the Loma Prieta earthquake. Design: Questionnaires and in-person interviews. Results: The most frequently noted problem was lack of communications within and among organizations. Hospitals received inadequate information about the disaster from local governmental agencies. Forty-three percent of hospitals had inadequate back-up power configurations, and five hospitals sustained total back-up generator failures. Twenty hospitals performed partial evacuations. Conclusions: The Loma Prieta earthquake did not cause total disruption of hospital services. Hospitals need to work with local governmental agencies and internal hospital departments to improve disaster communications.

  18. Electromagnetic Manifestation of Earthquakes

    OpenAIRE

    Uvarov Vladimir

    2017-01-01

    In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.

  19. Electromagnetic Manifestation of Earthquakes

    Directory of Open Access Journals (Sweden)

    Uvarov Vladimir

    2017-01-01

    In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.

  20. Non-Stationary Modelling and Simulation of Near-Source Earthquake Ground Motion

    DEFF Research Database (Denmark)

    Skjærbæk, P. S.; Kirkegaard, Poul Henning; Fouskitakis, G. N.

    This paper is concerned with modelling and simulation of near-source earthquake ground motion. Recent studies have revealed that these motions show heavy non-stationary behaviour with very low frequencies dominating parts of the earthquake sequence. Modelling and simulation of this behaviour...... by an epicentral distance of 16 km and measured during the 1979 Imperial Valley earthquake in California (USA). The results of the study indicate that while all three approaches can successfully predict near-source ground motions, the Neural Network based one gives somewhat poorer simulation results....

  1. Non-Stationary Modelling and Simulation of Near-Source Earthquake Ground Motion

    DEFF Research Database (Denmark)

    Skjærbæk, P. S.; Kirkegaard, Poul Henning; Fouskitakis, G. N.

    1997-01-01

    This paper is concerned with modelling and simulation of near-source earthquake ground motion. Recent studies have revealed that these motions show heavy non-stationary behaviour with very low frequencies dominating parts of the earthquake sequence. Modeling and simulation of this behaviour...... by an epicentral distance of 16 km and measured during the 1979 Imperial Valley earthquake in California (U .S .A.). The results of the study indicate that while all three approaches can successfully predict near-source ground motions, the Neural Network based one gives somewhat poorer simulation results....

  2. ShakeAlert—An earthquake early warning system for the United States west coast

    Science.gov (United States)

    Burkett, Erin R.; Given, Douglas D.; Jones, Lucile M.

    2014-08-29

    Earthquake early warning systems use earthquake science and the technology of monitoring systems to alert devices and people when shaking waves generated by an earthquake are expected to arrive at their location. The seconds to minutes of advance warning can allow people and systems to take actions to protect life and property from destructive shaking. The U.S. Geological Survey (USGS), in collaboration with several partners, has been working to develop an early warning system for the United States. ShakeAlert, a system currently under development, is designed to cover the West Coast States of California, Oregon, and Washington.

  3. Observations of an ionospheric perturbation arising from the Coalinga earthquake of May 2, 1983

    International Nuclear Information System (INIS)

    Wolcott, J.H.; Simons, D.J.; Lee, D.D.; Nelson, R.A.

    1984-01-01

    An ionospheric perturbation that was produced by the Coalinga earthquake of May 2, 1983, was detected by a network of high-frequency radio links in northern California. The ionospheric refraction regions of all five HF propagation paths, at distances between 160 and 285 km (horizontal range) from the epicenter, were affected by a ground-motion-induced acoustic pulse that propagated to ionospheric heights. The acoustic pulse was produced by the earthquake-induced seismic waves rather than the vertical ground motion above the epicenter. These observations appear to be the first ionospheric disturbances to be reported this close to an earthquake epicenter

  4. Baja California: literatura y frontera

    Directory of Open Access Journals (Sweden)

    Gabriel Trujillo Muñoz

    2014-06-01

    Baja California is a region marked not only by migration problems, by the criminal violence of the drug war, and by border conflicts arising from its close neighborhood with the United States of America. Baja California is also a geographic space of culture and art, of creative writing, and of the struggle to narrate the things and people that live here, in plain sight, as their house, their home, their center of creation. This text gives a cultural context for border literature in northern Mexico as a phenomenon worthy of notice for its own merits, books, and writers.

  5. Injection-induced earthquakes.

    Science.gov (United States)

    Ellsworth, William L

    2013-07-12

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  6. Geologic Wonders of Yosemite at Two Miles High: an Undergraduate, Learner-Centered, Team Research Program at the University of Southern California

    Science.gov (United States)

    Wagner, R.; Anderson, J. L.; Cao, W.; Gao, Y.; Ikeda, T.; Jacobs, R.; Johanesen, K.; Mai, J.; Memeti, V.; Padilla, A.; Paterson, S. R.; Seyum, S.; Shimono, S.; Thomas, T.; Thompson, J.; Zhang, T.

    2007-12-01

    This program is a multidisciplinary student research experience that is largely outside of the classroom, involving undergraduate students in an international-level research project looking at the magmatic plumbing systems formed underneath volcanoes. We bring together a blend of students across the disciplines, both from within and outside the sciences. Following a "learner-centered" teaching philosophy, we formed student teams where more advanced students worked with and taught those more junior, under the guidance of mentors, which include USC professors, graduate students, and visiting international scholars. This program truly covers the full breadth of the research process, from field work and data collection to analysis to presentation. In the summers of 2006 and 2007, research groups of undergraduates and mentors camped in the high Sierra backcountry and worked in small mapping groups by day, generating a detailed geologic map of the field area. Evenings consisted of student-led science meetings where the group discussed major research problems and developed a plan to address them. Upon returning from the field, the research group transitions to more lab-based work, including rock dating, XRF geochemistry, microscope, and mineral microprobe analyses, and by spring semester the group also begins writing up and presenting the results. The summer 2006 research group consisted of 5 undergraduate students and 5 mentors, and was a huge success, resulting in presentations at a university undergraduate research symposium as well as the Cordilleran Section meeting of GSA. The summer 2007 group was even larger, with 10 undergraduates and 6 mentors, including two international scholars. Undergraduates also participated in research in China and Mongolia. Aside from rewarding research experiences, students learned rapidly, were much more engaged in the learning process, and benefited from teaching their peers.
Several students expressed

  7. Periodic, chaotic, and doubled earthquake recurrence intervals on the deep San Andreas fault.

    Science.gov (United States)

    Shelly, David R

    2010-06-11

    Earthquake recurrence histories may provide clues to the timing of future events, but long intervals between large events obscure full recurrence variability. In contrast, small earthquakes occur frequently, and recurrence intervals are quantifiable on a much shorter time scale. In this work, I examine an 8.5-year sequence of more than 900 recurring low-frequency earthquake bursts composing tremor beneath the San Andreas fault near Parkfield, California. These events exhibit tightly clustered recurrence intervals that, at times, oscillate between approximately 3 and approximately 6 days, but the patterns sometimes change abruptly. Although the environments of large and low-frequency earthquakes are different, these observations suggest that similar complexity might underlie sequences of large earthquakes.

  8. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    As it is the 200th anniversary of Darwin's birth, 2009 has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked, volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after the event. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘ …the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the earth's evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with results of the latest publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  9. Seismic experience in power and industrial facilities as it relates to small magnitude earthquakes

    International Nuclear Information System (INIS)

    Swan, S.W.; Horstman, N.G.

    1987-01-01

    The data base on the performance of power and industrial facilities in small magnitude earthquakes (M = 4.0 - 5.5) is potentially very large. In California alone many earthquakes in this magnitude range occur every year, often near industrial areas. In 1986, for example, there were 76 earthquakes in northern California between Richter magnitudes 4.0 and 5.5. Experience has shown that the effects of small magnitude earthquakes are seldom significant to well-engineered facilities. (The term well-engineered is here defined to include most modern industrial installations, as well as power plants and substations.) Therefore detailed investigations of small magnitude earthquakes are normally not considered worthwhile. The purpose of this paper is to review the tendency toward seismic damage of equipment installations representative of nuclear power plant safety systems. Estimates are made of the thresholds of seismic damage to certain types of equipment in terms of conventional means of measuring the damage potential of an earthquake. The objective is to define thresholds of damage that can be correlated with Richter magnitude. In this manner an earthquake magnitude might be chosen below which damage to nuclear plant safety systems is not considered credible.

  10. Aseismic blocks and destructive earthquakes in the Aegean

    Science.gov (United States)

    Stiros, Stathis

    2017-04-01

    Aseismic areas are identified not only in vast, geologically stable regions, but also within regions of active, intense, distributed deformation such as the Aegean. In the latter, "aseismic blocks" about 200 km wide were recognized in the 1990s on the basis of the absence of instrumentally derived earthquake foci, in contrast to surrounding areas. This pattern was supported by the available historical seismicity data, as well as by geologic evidence. Interestingly, GPS evidence indicates that such blocks are among the areas characterized by small deformation rates relative to surrounding areas of higher deformation. Still, the largest and most destructive earthquake of the 1990s, the 1995 M6.6 earthquake, occurred at the center of one of these "aseismic" zones in the northern part of Greece, which had been left unprotected against seismic hazard. This was a repeat of the case of the tsunami-associated 1956 Amorgos Island M7.4 earthquake, the largest 20th-century event in the Aegean back-arc region: the 1956 earthquake occurred at the center of a geologically distinct region (the Cyclades Massif in the Central Aegean), till then assumed aseismic. Interestingly, after 1956 the overall idea of aseismic regions remained valid, though a "promontory" of earthquake-prone areas intruding into the aseismic central Aegean was assumed. Exploitation of archaeological excavation evidence and careful, combined analysis of historical and archaeological data and other palaeoseismic, mostly coastal, data indicated that destructive and major earthquakes have left their traces in previously assumed aseismic blocks. In the latter, earthquakes typically recur at long intervals (>200-300 years), much longer than in adjacent active areas. Interestingly, areas assumed aseismic in antiquity are among the most active in the last centuries, while areas hit by major earthquakes in the past are usually classified as areas of low seismic risk in official maps. Some reasons

  11. CISN Earthquake Early Warning: ShakeAlert Hybrid Branch

    Science.gov (United States)

    Brown, H.; Lim, I.; Allen, R. M.; Böse, M.; Cua, G. B.; Heaton, T. H.; Cisn Earthquake Early Warning Project Team

    2010-12-01

    The California Integrated Seismic Network (CISN) is developing an integrated, statewide earthquake early warning (EEW) system for California. In summer 2009 the CISN completed a three-year proof-of-concept study, analyzing three EEW algorithms for viability in California: (1) OnSite, run by the California Institute of Technology, (2) Virtual Seismologist, run by the Swiss Seismological Service, and (3) ElarmS, run by the University of California at Berkeley. The study successfully detected earthquakes and accurately predicted the resulting ground shaking. As of December 2010 the CISN EEW team is halfway through a second three-year project to build an end-to-end prototype early warning system capable of delivering warning to a small group of test users. This new system is called CISN ShakeAlert. An area of ongoing research is the Hybrid Branch: a new, integrated algorithm to calculate event magnitude and location in real time. The Hybrid Branch takes advantage of the best aspects of each of the original test algorithms. The Hybrid Branch will be able to rapidly recognize and assess an event using only a single station's P-wave data, as OnSite does, but it will also combine data from multiple stations in a network-based approach, as Virtual Seismologist and ElarmS do. This will give the Hybrid Branch the speed of a single-station EEW method with the reliability of a multi-station method. One of the challenges of the Hybrid Branch is how to progress from a single-station description of a given event to a multi-station view of the same event. The authors use a Bayesian approach to combine event information and adapt to changing data availability. Output from the Hybrid Branch will be sent to the ShakeAlert Decision Module, which consolidates event information from a variety of sources and generates earthquake alerts.
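    One simple way to move from a single-station estimate to a multi-station view, in the spirit of the Bayesian combination mentioned above, is inverse-variance fusion of Gaussian magnitude estimates. This is a generic sketch, not the Hybrid Branch implementation; the estimates and uncertainties are invented:

```python
def combine_estimates(estimates):
    """Inverse-variance (Gaussian Bayesian) fusion of per-source
    magnitude estimates. Each input is (magnitude, standard deviation);
    the posterior mean weights each estimate by 1/sigma^2, and the
    posterior variance shrinks as estimates accumulate."""
    weights = [1.0 / (s * s) for _, s in estimates]
    m = sum(w * mag for w, (mag, _) in zip(weights, estimates)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return m, sigma

# A fast but uncertain single-station P-wave estimate combined with a
# slower, tighter network-based estimate (illustrative numbers)
m_combined, sigma_combined = combine_estimates([(6.2, 0.5), (5.9, 0.2)])
```

    As more stations report, additional (magnitude, sigma) pairs are simply appended to the list and the fused estimate tightens, which mirrors the "adapt to changing data availability" behavior described in the abstract.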

  12. California Political Districts

    Data.gov (United States)

    California Department of Resources — This is a series of district layers pertaining to California's political districts, that are derived from the California State Senate and State Assembly information....

  13. California Political Districts

    Data.gov (United States)

    California Natural Resource Agency — This is a series of district layers pertaining to California's political districts, that are derived from the California State Senate and State Assembly information....

  14. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk
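    The ranking step described above, applying the large region's statistics to the small region, can be sketched as an earthquake potential score: the fraction of historical large-quake cycles whose small-event counts fall at or below the count accumulated so far. The counts below are invented for illustration:

```python
def nowcast_eps(current_count, historical_counts):
    """Earthquake potential score (sketch): the fraction of historical
    cycles in the large surrounding region whose small-event counts
    were <= the small-event count accumulated in the small region
    since its last large earthquake. A score near 1 means the small
    region is late in its cycle relative to regional statistics."""
    below = sum(1 for c in historical_counts if c <= current_count)
    return below / len(historical_counts)

# Counts of M>=4 events between successive M>=6 events in the large
# region (illustrative data), and the small region's current count
cycles = [12, 45, 80, 150, 220, 300, 410, 520, 700, 950]
eps = nowcast_eps(400, cycles)
```

    Because the score is a pure rank statistic of catalog counts, it needs no fitted model parameters, which is the first advantage the abstract claims for nowcasting over forecasting.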

  15. Magnitude 8.1 Earthquake off the Solomon Islands

    Science.gov (United States)

    2007-01-01

    On April 1, 2007, a magnitude 8.1 earthquake rattled the Solomon Islands, 2,145 kilometers (1,330 miles) northeast of Brisbane, Australia. Centered less than ten kilometers beneath the Earth's surface, the earthquake displaced enough water in the ocean above to trigger a small tsunami. Though officials were still assessing damage to remote island communities on April 3, Reuters reported that the earthquake and the tsunami killed an estimated 22 people and left as many as 5,409 homeless. The most serious damage occurred on the island of Gizo, northwest of the earthquake epicenter, where the tsunami damaged the hospital, schools, and hundreds of houses, said Reuters. This image, captured by the Landsat-7 satellite, shows the location of the earthquake epicenter in relation to the nearest islands in the Solomon Island group. Gizo is beyond the left edge of the image, but its triangular fringing coral reefs are shown in the upper left corner. Though dense rain forest hides volcanic features from view, the very shape of the islands testifies to the geologic activity of the region. The circular Kolombangara Island is the tip of a dormant volcano, and other circular volcanic peaks are visible in the image. The image also shows that the Solomon Islands run on a northwest-southeast axis parallel to the edge of the Pacific plate, the section of the Earth's crust that carries the Pacific Ocean and its islands. The earthquake occurred along the plate boundary, where the Australia/Woodlark/Solomon Sea plates slide beneath the denser Pacific plate. Friction between the sinking (subducting) plates and the overriding Pacific plate led to the large earthquake on April 1, said the United States Geological Survey (USGS) summary of the earthquake. Large earthquakes are common in the region, though the section of the plate that produced the April 1 earthquake had not caused any quakes of magnitude 7 or larger since the early 20th century, said the USGS.

  16. Georesistivity precursors to the Tangshan earthquake of 1976

    Directory of Open Access Journals (Sweden)

    F. Lu

    1997-06-01

    Full Text Available Georesistivity precursors and corresponding coseismic effects of the Tangshan earthquake of 1976 are given as follows: 1) Resistivity measurements with accuracies of 0.5% or better for over 20 years show that resistivity decreases of several percent, which began approximately 3 years prior to the Tangshan earthquake, were larger than the background fluctuations and hence statistically significant. An outstanding example of an intermediate-term resistivity precursor is given. 2) Georesistivity decreases of several percent observed simultaneously at 9 stations beginning 2-3 years prior to the 1976 Tangshan earthquake are such a pervasive phenomenon that the mean decrease, in percent, can be contoured on a map of the Beijing-Tianjin-Tangshan region. This shows the maximum decrease centered over the epicenter. 3) Corresponding coseismic resistivity changes, Δρc/ρc, during the M 7.8 Tangshan earthquake were observed at all 16 stations within 240 km of the epicenter. These observed Δρc/ρc are opposite in sense but similar in spatial distribution to the corresponding georesistivity precursors. This observation suggests that the Tangshan earthquake was a rebound process. Calculation indicates that these georesistivity precursors could be represented by a virtual dislocation of opposite sign to the real dislocation produced at the time of the Tangshan earthquake. These reported Δρc/ρc offer very convincing evidence for accepting the corresponding anomalies prior to the earthquake as its precursors. 4) It is inferred from observed anisotropic decreases in georesistivity that before the Tangshan earthquake the crust was compressed and that the angle between the maximum principal stress σ1 and the earthquake fault was about 80°, i.e., the fault was locked by σ1, which is almost normal to the fault.

  17. Clustering and periodic recurrence of microearthquakes on the san andreas fault at parkfield, california.

    Science.gov (United States)

    Nadeau, R M; Foxall, W; McEvilly, T V

    1995-01-27

    The San Andreas fault at Parkfield, California, apparently late in an interval between repeating magnitude 6 earthquakes, is yielding to tectonic loading partly by seismic slip concentrated in a relatively sparse distribution of small clusters (<20-meter radius) of microearthquakes. Within these clusters, which account for 63% of the earthquakes in a 1987-92 study interval, virtually identical small earthquakes occurred with a regularity that can be described by the statistical model used previously in forecasting large characteristic earthquakes. Sympathetic occurrence of microearthquakes in nearby clusters was observed within a range of about 200 meters at communication speeds of 10 to 100 centimeters per second. The rate of earthquake occurrence, particularly at depth, increased significantly during the study period, but the fraction of earthquakes that were cluster members decreased.

  18. Indoor radon and earthquake

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time, on the basis of experience from the Spitak earthquake of December 1988 (Armenia), it is found that an earthquake causes intensive and prolonged radon splashes which, rapidly dispersing in the open near-ground atmosphere, are contrastingly displayed in enclosed premises (dwellings, schools, kindergartens) even at considerable distance from the earthquake epicenter, multiplying the radiation influence on the population. The interval of splashes spans the period from the first foreshock to the last aftershock, i.e. several months. The area affected by radiation is larger than Armenia's territory. The scale of this impact on the population is 12 times higher than the number of people injured in Spitak, Leninakan and other settlements (toll of injured: 25,000 people; radiation-induced diseases: over 300,000 people). The influence of radiation directly correlates with the earthquake force. This conclusion is underpinned by indoor radon monitoring data for Yerevan (120 km from the epicenter) collected since 1987, comprising 5,450 measurements, and by multivariate analysis identifying cause-and-effect linkages between the geodynamics of indoor radon under stable crustal conditions, the behavior of radon in different geological media during earthquakes, levels of indoor radon concentration, the effective equivalent dose of radiation, the impact of radiation dose on health, and statistical data on public health provided by the Ministry of Health. The following hitherto unexplained facts can be considered consequences of prolonged radiation influence on the human organism: the long-lasting state of apathy and indifference typical of the population of Armenia for more than a year after the earthquake, the prevalence of malignant cancer forms in disaster zones, dominated by lung cancer, and so on. All urban territories of seismically active regions are exposed to the threat of natural earthquake-provoked radiation influence.

  19. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    Science.gov (United States)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally-distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of
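    A minimal STA/LTA detector over a tweet-count time series might look like the sketch below; the window lengths and threshold are hypothetical stand-ins, since the abstract does not give the USGS detector's actual tuning.

```python
import numpy as np

def sta_lta_triggers(counts, sta_len=3, lta_len=30, threshold=5.0):
    """Flag minutes where the short-term average (STA) of per-minute
    'earthquake' tweet counts exceeds `threshold` times the long-term
    average (LTA). Window lengths and threshold are assumed values."""
    counts = np.asarray(counts, dtype=float)
    triggers = []
    for i in range(lta_len, len(counts)):
        sta = counts[i - sta_len:i].mean()
        lta = counts[i - lta_len:i].mean()
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Synthetic series: quiet background with a sudden burst of tweets.
series = [2] * 40 + [40, 50, 45] + [2] * 10
burst_minutes = sta_lta_triggers(series)
```

    Raising `threshold` trades missed events for fewer false triggers, mirroring the sensitivity trade-off described above.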

  20. Refining Southern California Geotherms Using Seismologic, Geologic, and Petrologic Constraints

    Science.gov (United States)

    Thatcher, W. R.; Chapman, D. S.; Allam, A. A.; Williams, C. F.

    2017-12-01

    Lithospheric deformation in tectonically active regions depends on the 3D distribution of rheology, which is in turn critically controlled by temperature. Under the auspices of the Southern California Earthquake Center (SCEC) we are developing a 3D Community Thermal Model (CTM) to constrain rheology and so better understand deformation processes within this complex but densely monitored and relatively well-understood region. The San Andreas transform system has sliced southern California into distinct blocks, each with characteristic lithologies, seismic velocities and thermal structures. Guided by the geometry of these blocks we use more than 250 surface heat-flow measurements to define 13 geographically distinct heat flow regions (HFRs). Model geotherms within each HFR are constrained by averages and variances of surface heat flow q0 and the 1D depth distribution of thermal conductivity (k) and radiogenic heat production (A), which are strongly dependent on rock type. Crustal lithologies are not always well known and we turn to seismic imaging for help. We interrogate the SCEC Community Velocity Model (CVM) to determine averages and variances of Vp, Vs and Vp/Vs versus depth within each HFR. We bound (A, k) versus depth by relying on empirical relations between seismic wave speed and rock type and laboratory and modeling methods relating (A, k) to rock type. Many 1D conductive geotherms for each HFR are allowed by the variances in surface heat flow and subsurface (A, k). An additional constraint on the lithosphere temperature field is provided by comparing lithosphere-asthenosphere boundary (LAB) depths identified seismologically with those defined thermally as the depth of onset of partial melting. Receiver function studies in Southern California indicate LAB depths that range from 40 km to 90 km. Shallow LAB depths are correlated with high surface heat flow and deep LAB with low heat flow. The much-restricted families of geotherms that intersect peridotite
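    For a single layer with uniform conductivity and heat production, the 1D steady-state conductive geotherm constrained by (q0, k, A) has a closed form; the sketch below uses illustrative parameter values, not values from the SCEC CTM.

```python
def geotherm(z_km, q0_mWm2=70.0, k=2.5, A_uWm3=1.0, T0=10.0):
    """Steady-state 1D conductive geotherm for a layer with uniform
    thermal conductivity k (W/m/K) and heat production A (uW/m^3):
        T(z) = T0 + (q0 / k) * z - A * z^2 / (2 * k)
    q0 is surface heat flow (mW/m^2); parameter values are
    illustrative, not from the SCEC Community Thermal Model."""
    z = z_km * 1e3          # depth in meters
    q0 = q0_mWm2 * 1e-3     # W/m^2
    A = A_uWm3 * 1e-6       # W/m^3
    return T0 + (q0 / k) * z - A * z ** 2 / (2.0 * k)
```

    Sweeping q0 and (A, k) over the variances observed within a heat flow region generates the family of allowable geotherms described above, which can then be tested against the seismologically determined LAB depth.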

  1. Energy efficient data centers

    Energy Technology Data Exchange (ETDEWEB)

    Tschudi, William; Xu, Tengfang; Sartor, Dale; Koomey, Jon; Nordman, Bruce; Sezgen, Osman

    2004-03-30

    Data Center facilities, prevalent in many industries and institutions, are essential to California's economy. Energy intensive data centers are crucial to California's industries, and many other institutions (such as universities) in the state, and they play an important role in the constantly evolving communications industry. To better understand the impact of the energy requirements and energy efficiency improvement potential in these facilities, the California Energy Commission's PIER Industrial Program initiated this project with two primary focus areas: first, to characterize current data center electricity use; and second, to develop a research "roadmap" defining and prioritizing possible future public interest research and deployment efforts that would improve energy efficiency. Although there are many opinions concerning the energy intensity of data centers and the aggregate effect on California's electrical power systems, there is very little publicly available information. Through this project, actual energy consumption at its end use was measured in a number of data centers. This benchmark data was documented in case study reports, along with site-specific energy efficiency recommendations. Additionally, other data center energy benchmarks were obtained through synergistic projects, prior PG&E studies, and industry contacts. In total, energy benchmarks for sixteen data centers were obtained. For this project, a broad definition of "data center" was adopted which included internet hosting, corporate, institutional, governmental, educational and other miscellaneous data centers. Typically these facilities require specialized infrastructure to provide high quality power and cooling for IT equipment. All of these data center types were considered in the development of an estimate of the total power consumption in California. Finally, a research "roadmap" was developed

  2. Analysis of rupture area of aftershocks caused by twin earthquakes (Case study: 11 April 2012 earthquakes of Aceh-North Sumatra)

    International Nuclear Information System (INIS)

    Diansari, Angga Vertika; Purwana, Ibnu; Subakti, Hendri

    2015-01-01

    The 11 April 2012 earthquakes offshore Aceh-North Sumatra are unique events in the history of Indonesian earthquakes: they have similar magnitudes (8.5 Mw and 8.1 Mw), closely spaced epicenters, similar strike-slip focal mechanisms, and both occurred in the outer-rise area. The purposes of this research are: (1) comparing rupture areas of the earthquakes based on models with those based on calculation, (2) fitting the shape and area of the earthquake rupture zones, and (3) analyzing the relationship between rupture area and earthquake magnitude. Rupture areas of the earthquake faults are determined using 4 different formulas, i.e. Utsu and Seki (1954), Wells and Coppersmith (1994), Ellsworth (2003), and Christophersen and Smith (2000). The aftershock parameters are taken from the PGN (Pusat Gempabumi Nasional, or National Earthquake Information Center) of BMKG (the Indonesian Agency for Meteorology, Climatology and Geophysics). The aftershock epicenters are plotted with the GMT software, after which ellipse and rectangular models of the aftershock spreading are made. The results show that: (1) rupture areas calculated from magnitude relationships are larger than those of the aftershock-distribution models, (2) the best-fitting model for the aftershock distribution is the rectangle associated with the Utsu and Seki (1954) formula, and (3) the larger the magnitude of the earthquake, the larger the area of the fault.
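    As an illustration of one of the magnitude-area relationships cited, the Wells and Coppersmith (1994) all-slip-type rupture-area regression can be evaluated directly. The coefficients below are the commonly quoted published values, but treat them as approximate; this is a sketch, not the paper's computation.

```python
def wells_coppersmith_area(mw):
    """Rupture area in km^2 from moment magnitude, using the Wells &
    Coppersmith (1994) all-slip-type regression log10(A) = a + b * Mw
    with a = -3.49, b = 0.91 (coefficients quoted from the published
    regression; treat as approximate)."""
    return 10 ** (-3.49 + 0.91 * mw)

# The two 2012 Aceh events:
area_85 = wells_coppersmith_area(8.5)
area_81 = wells_coppersmith_area(8.1)
```

    The exponential dependence on magnitude is what drives result (3) above: a 0.4-unit magnitude difference changes the predicted rupture area by more than a factor of two.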

  3. Analysis of rupture area of aftershocks caused by twin earthquakes (Case study: 11 April 2012 earthquakes of Aceh-North Sumatra)

    Energy Technology Data Exchange (ETDEWEB)

    Diansari, Angga Vertika, E-mail: anggav.bmkg@gmail.com; Purwana, Ibnu; Subakti, Hendri [Academy of Meteorology and Geophysics, Jalan Perhubungan I no.5 Tangerang 15221 (Indonesia)

    2015-04-24

    The 11 April 2012 earthquakes offshore Aceh-North Sumatra are unique events in the history of Indonesian earthquakes: they have similar magnitudes (8.5 Mw and 8.1 Mw), closely spaced epicenters, similar strike-slip focal mechanisms, and both occurred in the outer-rise area. The purposes of this research are: (1) comparing rupture areas of the earthquakes based on models with those based on calculation, (2) fitting the shape and area of the earthquake rupture zones, and (3) analyzing the relationship between rupture area and earthquake magnitude. Rupture areas of the earthquake faults are determined using 4 different formulas, i.e. Utsu and Seki (1954), Wells and Coppersmith (1994), Ellsworth (2003), and Christophersen and Smith (2000). The aftershock parameters are taken from the PGN (Pusat Gempabumi Nasional, or National Earthquake Information Center) of BMKG (the Indonesian Agency for Meteorology, Climatology and Geophysics). The aftershock epicenters are plotted with the GMT software, after which ellipse and rectangular models of the aftershock spreading are made. The results show that: (1) rupture areas calculated from magnitude relationships are larger than those of the aftershock-distribution models, (2) the best-fitting model for the aftershock distribution is the rectangle associated with the Utsu and Seki (1954) formula, and (3) the larger the magnitude of the earthquake, the larger the area of the fault.

  4. Earthquake impact scale

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lead to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures.
Useful alerts should
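    The color thresholds quoted above map directly to a small lookup. The function below is a sketch of that mapping using only the thresholds stated in the abstract, not the PAGER implementation.

```python
def eis_alert(fatalities=None, losses_usd=None):
    """Map a PAGER-style estimate to an EIS alert color, using the
    thresholds from the abstract: 1/100/1,000 fatalities, or
    $1M/$100M/$1B in estimated losses."""
    def level(value, thresholds):
        colors = ["green", "yellow", "orange", "red"]
        # Count how many thresholds the estimate meets or exceeds.
        return colors[sum(value >= t for t in thresholds)]
    if fatalities is not None:
        return level(fatalities, [1, 100, 1000])
    return level(losses_usd, [1e6, 1e8, 1e9])
```

    For example, an estimate of 150 fatalities meets the yellow and orange thresholds but not red, so it maps to an orange (national-scale) alert.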

  5. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are required, in particular, to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study higher statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues.
The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
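    The moment comparison described above is easy to reproduce: for a Poisson law with rate λ the skewness is λ^(-1/2) and the excess kurtosis is 1/λ, so overdispersed (NBD-like) counts reveal themselves by exceeding these values. A minimal sketch:

```python
import numpy as np

def poisson_skew_kurt(lam):
    """Theoretical skewness and excess kurtosis of a Poisson(lam) law."""
    return lam ** -0.5, 1.0 / lam

def sample_skew_kurt(counts):
    """Empirical skewness and excess kurtosis of observed counts,
    via standardized third and fourth moments."""
    x = np.asarray(counts, dtype=float)
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean(), (z ** 4).mean() - 3.0
```

    Earthquake counts per time bin whose sample skewness and kurtosis sit well above the Poisson predictions argue for the NBD, as the study finds for small magnitude thresholds and short time bins.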

  6. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the (1) study of earthquakes; (2) origin, propagation, and energy of seismic phenomena; (3) prediction of these phenomena; and (4) investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  7. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much previous investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain with this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable.

  8. Stress modulation of earthquakes: A study of long and short period stress perturbations and the crustal response

    Science.gov (United States)

    Johnson, Christopher W.

    Decomposing fault mechanical processes advances our understanding of active fault systems and properties of the lithosphere, thereby increasing the effectiveness of seismic hazard assessment and preventative measures implemented in urban centers. Along plate boundaries earthquakes are inevitable as tectonic forces reshape the Earth's surface. Earthquakes, faulting, and surface displacements are related systems that require multidisciplinary approaches to characterize deformation in the lithosphere. Modern geodetic instrumentation can resolve displacements to millimeter precision and provide valuable insight into secular deformation in near real-time. The expansion of permanent seismic networks as well as temporary deployments allows unprecedented detection of microseismic events that image fault interfaces and fracture networks in the crust. The research presented in this dissertation is at the intersection of seismology and geodesy to study the Earth's response to transient deformation, and explores research questions focusing on earthquake triggering, induced seismicity, and seasonal loading while utilizing seismic data, geodetic data, and modeling tools. The focus is to quantify stress changes in the crust, explore seismicity rate variations and migration patterns, and model crustal deformation in order to characterize the evolving state of stress on faults and the migration of fluids in the crust. The problems investigated all address the question: Why do earthquakes nucleate following a low magnitude stress perturbation? Answers to this question are fundamental to understanding the time-dependent failure processes of the lithosphere. Dynamic triggering, the interaction of faults in which one earthquake triggers another, represents stress transfer from one system to another at both local and remote distances [Freed, 2005]. The passage of teleseismic surface waves from the largest earthquakes produces dynamic stress fields and provides a natural

  9. InSAR Analysis of the 2011 Hawthorne (Nevada) Earthquake Swarm: Implications of Earthquake Migration and Stress Transfer

    Science.gov (United States)

    Zha, X.; Dai, Z.; Lu, Z.

    2015-12-01

    The 2011 Hawthorne earthquake swarm occurred in the central Walker Lane zone, neighboring the border between California and Nevada. The swarm included an Mw 4.4 on April 13, Mw 4.6 on April 17, and Mw 3.9 on April 27. Due to the lack of near-field seismic instruments, it is difficult to obtain accurate source information from the seismic data for these moderate-magnitude events. ENVISAT InSAR observations captured the deformation caused mainly by three events during the 2011 Hawthorne earthquake swarm. The surface traces of the three seismogenic sources could be identified according to the local topography and interferogram phase discontinuities. The epicenters could be determined using the interferograms and the relocated earthquake distribution. An apparent earthquake migration is revealed by the InSAR observations and the earthquake distribution. Analysis and modeling of the InSAR data show that three moderate-magnitude earthquakes were produced by slip on three previously unrecognized faults in the central Walker Lane. Two seismogenic sources are northwest-striking, right-lateral strike-slip faults with some thrust-slip components, and the other source is a northeast-striking, thrust-slip fault with some strike-slip components. The former two faults are roughly parallel to each other and almost perpendicular to the latter one. This spatial correlation between the three seismogenic faults, together with their nature, suggests that the central Walker Lane has been undergoing southeast-northwest horizontal compressive deformation, consistent with the regional crustal movement revealed by GPS measurements. The Coulomb failure stresses on the fault planes were calculated using the preferred slip model and the Coulomb 3.4 software package. For the Mw 4.6 earthquake, the Coulomb stress change caused by the Mw 4.4 event increased by ~0.1 bar. For the Mw 3.9 event, the Coulomb stress change caused by the Mw 4.6 earthquake increased by ~1.0 bar. This indicates that the preceding
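    The static stress transfer invoked above is conventionally quantified by the Coulomb failure stress change, ΔCFS = Δτ + μ′Δσn, with Δσn positive in extension (unclamping) and μ′ an effective friction coefficient. Packages such as Coulomb 3.4 resolve the full tensor change onto a receiver fault; the final scalar combination is just the sketch below, with μ′ = 0.4 as an assumed value.

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Coulomb failure stress change (same units as the inputs, e.g. bar)
    on a receiver fault: dCFS = d_shear + mu_eff * d_normal, where
    d_normal is positive for unclamping. mu_eff = 0.4 is an assumed
    effective friction coefficient, not a value from the study."""
    return d_shear + mu_eff * d_normal
```

    A positive ΔCFS, like the ~0.1 and ~1.0 bar increases reported above, moves the receiver fault toward failure, which is the basis for the inferred triggering sequence within the swarm.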

  10. Earthquake safety program at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Freeland, G.E.

    1985-01-01

    Within three minutes on the morning of January 24, 1980, an earthquake and three aftershocks, with Richter magnitudes of 5.8, 5.1, 4.0, and 4.2, respectively, struck the Livermore Valley. Two days later, a Richter magnitude 5.4 earthquake occurred, which had its epicenter about 4 miles northwest of the Lawrence Livermore National Laboratory (LLNL). Although no one at the Lab was seriously injured, these earthquakes caused considerable damage and disruption. Masonry and concrete structures cracked and broke, trailers shifted and fell off their pedestals, office ceilings and overhead lighting fell, and bookcases overturned. The Laboratory was suddenly immersed in a site-wide program of repairing earthquake-damaged facilities, and protecting our many employees and the surrounding community from future earthquakes. Over the past five years, LLNL has spent approximately $10 million on its earthquake restoration effort for repairs and upgrades. The discussion in this paper centers upon the earthquake damage that occurred, the clean-up and restoration efforts, the seismic review of LLNL facilities, our site-specific seismic design criteria, computer-floor upgrades, ceiling-system upgrades, unique building seismic upgrades, geologic and seismologic studies, and seismic instrumentation. 10 references

  11. Accounting for orphaned aftershocks in the earthquake background rate

    Science.gov (United States)

    van der Elst, Nicholas J.

    2017-11-01

    Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.
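    The "heavy tail" of Omori's law makes the orphaned-aftershock fraction easy to sketch: for a modified Omori rate proportional to (c + t)^(−p) with p > 1, the fraction of a sequence falling after an observation span t_obs has a closed form. The parameter values below are illustrative only, not those fitted in the study.

```python
def omori_tail_fraction(t_obs, c=0.01, p=1.1, t_max=1e5):
    """Fraction of a modified-Omori aftershock sequence, with rate
    proportional to (c + t)^(-p) and truncated at t_max days, that
    occurs after time t_obs -- i.e. aftershocks liable to be 'orphaned'
    when the main shock predates the catalogue. Requires p != 1."""
    total = c ** (1.0 - p) - (c + t_max) ** (1.0 - p)
    tail = (c + t_obs) ** (1.0 - p) - (c + t_max) ** (1.0 - p)
    return tail / total
```

    With these illustrative parameters, even decades into a sequence (t_obs of order 1e4 days) a few percent of the aftershocks are still to come, which is why long-past main shocks can contaminate the apparent background rate.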

  12. Accounting for orphaned aftershocks in the earthquake background rate

    Science.gov (United States)

    Van Der Elst, Nicholas

    2017-01-01

    Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.

  13. Preliminary Results from SCEC Earthquake Simulator Comparison Project

    Science.gov (United States)

    Tullis, T. E.; Barall, M.; Richards-Dinger, K. B.; Ward, S. N.; Heien, E.; Zielke, O.; Pollitz, F. F.; Dieterich, J. H.; Rundle, J. B.; Yikilmaz, M. B.; Turcotte, D. L.; Kellogg, L. H.; Field, E. H.

    2010-12-01

    realistic fault geometries and slip rates taken from California, excluding the Cascadia subduction zone. In order to make as close comparisons between the simulators as possible, we have developed shared data formats for both input and output, and a growing set of tools that can be used to make statistical comparisons between the simulator outputs. To date, all five simulators have run a Northern California fault model and are in various stages of working on an All California fault model. The plan in the near future is to run them on the UCERF2 fault model. Initial comparisons show significant differences among the simulators and some differences from observed earthquake statistics. However, it is too early in the process to infer too much from these preliminary results. For example, the differences in how each simulator treats fault friction mean that each may need to use values for the assumed stress drops that are better tuned to its approach than are the common values used in the first comparison.

  14. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 17. Interpretation of Strong Ground Motion Records.

    Science.gov (United States)

    1981-10-01

    RECORDS by Bruce A. Bolt, Department of Geology and Geophysics, University of California, Berkeley, Calif. 94720 ... Report 17 of a ... The Parkfield, California earthquake of June 27, 1966, preliminary seismological and engineering seismological report, U.S. Coast & Geodetic Surv. Das

  15. Seismic risk analysis for General Electric Plutonium Facility, Pleasanton, California. Final report, part II

    International Nuclear Information System (INIS)

    1980-01-01

    This report is the second of a two-part study addressing the seismic risk or hazard of the special nuclear materials (SNM) facility of the General Electric Vallecitos Nuclear Center at Pleasanton, California. The Part I companion to this report, dated July 31, 1978, presented the seismic hazard at the site that resulted from exposure to earthquakes on the Calaveras, Hayward, and San Andreas faults and, additionally, from smaller unassociated earthquakes that could not be attributed to these specific faults. However, while this study was in progress, certain additional geologic information became available that could be interpreted in terms of the existence of a nearby fault. Although substantial geologic investigations were subsequently undertaken, the existence of this postulated fault, called the Verona Fault, remained very controversial. The purpose of the Part II study was to assume the existence of such a capable fault and, under this assumption, to examine the loads that the fault could impose on the SNM facility. This report first reviews the geologic setting with a focus on specifying sufficient geologic parameters to characterize the postulated fault. The report next presents the methodology used to calculate the vibratory ground motion hazard. Because of the complexity of the fault geometry, a slightly different methodology is used here compared to the Part I report. This section ends with the results of the calculation applied to the SNM facility. Finally, the report presents the methodology and results of the rupture hazard calculation.

  16. Real-Time Earthquake Monitoring with Spatio-Temporal Fields

    Science.gov (United States)

    Whittier, J. C.; Nittel, S.; Subasinghe, I.

    2017-10-01

    With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope need to write tedious application code to integrate potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open-source Data Stream Engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real time.
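The windowed query described in this abstract can be sketched without a stream engine. Below is a toy per-station sliding-window maximum with a neighbor-agreement check, in plain Python rather than Spark; the station names, window length, threshold, and neighbor map are all hypothetical, not taken from the paper:

```python
from collections import defaultdict, deque

WINDOW = 5          # samples per query window (hypothetical)
THRESHOLD = 3.0     # displacement threshold, arbitrary units (hypothetical)
NEIGHBORS = {"P494": ["P495", "P496"], "P495": ["P494"], "P496": ["P494"]}

# one fixed-length buffer per station; maxlen makes it a sliding window
windows = defaultdict(lambda: deque(maxlen=WINDOW))

def ingest(station, displacement):
    """Append one sample, then report stations whose window maximum
    exceeds the threshold together with at least one neighbor."""
    windows[station].append(displacement)
    events = []
    for s, buf in list(windows.items()):
        if max(buf) < THRESHOLD:
            continue
        if any(n in windows and max(windows[n]) >= THRESHOLD
               for n in NEIGHBORS.get(s, [])):
            events.append(s)
    return events

# quiet background, then a displacement pulse on two neighboring stations
for _ in range(WINDOW):
    ingest("P494", 0.1)
    ingest("P495", 0.2)
ingest("P494", 4.2)
detected = ingest("P495", 5.0)
```

Requiring agreement between spatial neighbors is what separates a regional event from a single-sensor glitch, which is the role the spatial relation plays in the field-based query.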

  17. Turkish Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, as natural disasters, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good earthquake training received in primary schools is considered…

  18. Organizational changes at Earthquakes & Volcanoes

    Science.gov (United States)

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  19. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of scientific content by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what the principles underlying that organization are. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" earthquakes. We try to implement these results in a choreographic model, with the aim of converting earthquake sound into a visual dance system that could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience in a multisensory exploration of the earthquake phenomenon, through the stimulation of hearing, eyesight, and the perception of movement (the neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  20. Is there a relationship between solar activity and earthquakes?

    Science.gov (United States)

    L'Huissier, P.; Dominguez, M.; Gallo, N.; Tapia, M.; Pinto, V. A.; Moya, P. S.; Stepanova, M. V.; Munoz, V.; Rogan, J.; Valdivia, J. A.

    2012-12-01

    Several statistical studies have suggested a connection between solar and geomagnetic activity and seismicity. Some studies claim there are global effects, relating solar activity, for instance, with earthquake occurrence on the Earth. Other studies intend to find effects on a local scale, where perturbations in geomagnetic activity are followed by seismic events. We investigate this issue by means of a surrogate-data method. First, we analyze the statistical validity of reported correlations between the number of sunspots and the annual number of earthquakes during the last century. Second, in relation to local geomagnetic variations prior to an important earthquake, we carry out a study of the magnetic field fluctuations using the SAMBA array in a two-year window centered on the February 27th, 2010 M = 8.8 earthquake in Chile. We expect these studies to be useful in the search for measurable precursors before an important seismic event.
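A surrogate test of the kind alluded to here can be sketched as a permutation test: shuffle one series to destroy any temporal alignment and build a null distribution for the correlation. The series below are synthetic stand-ins, not the sunspot or earthquake catalogues used in the study:

```python
import random

def pearson(x, y):
    """Sample Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def surrogate_pvalue(x, y, n_surrogates=999, seed=0):
    """P-value for |r| against a shuffle (surrogate) null distribution."""
    rng = random.Random(seed)
    observed = abs(pearson(x, y))
    ys = list(y)
    exceed = 0
    for _ in range(n_surrogates):
        rng.shuffle(ys)
        if abs(pearson(x, ys)) >= observed:
            exceed += 1
    return (exceed + 1) / (n_surrogates + 1)

# synthetic "sunspot" series and a linearly related "earthquake count" series
sunspots = [float(i % 11) for i in range(60)]
quakes = [2.0 * s + 1.0 for s in sunspots]
p = surrogate_pvalue(sunspots, quakes)
```

A reported sunspot-seismicity correlation is only meaningful if it exceeds what such surrogates produce by chance, which is the statistical-validity check the abstract describes.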

  1. Seismological investigation of earthquakes in the New Madrid Seismic Zone

    International Nuclear Information System (INIS)

    Herrmann, R.B.; Nguyen, B.

    1993-08-01

    Earthquake activity in the New Madrid Seismic Zone has been monitored by regional seismic networks since 1975. During this time period, over 3,700 earthquakes have been located within the region bounded by latitudes 35 degrees--39 degrees N and longitudes 87 degrees--92 degrees W. Most of these earthquakes occur within a 1.5 degrees x 2 degrees zone centered on the Missouri Bootheel. Source parameters of larger earthquakes in the zone and in eastern North America are determined using surface-wave spectral amplitudes and broadband waveforms for the purpose of determining the focal mechanism, source depth, and seismic moment. Waveform modeling of broadband data is shown to be a powerful tool in defining these source parameters when used complementarily with regional seismic network data and, in addition, in verifying the correctness of previously published focal mechanism solutions.

  2. PAGER--Rapid assessment of an earthquake's impact

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.
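The exposure-times-rate logic PAGER applies can be illustrated schematically. The intensity bins, population counts, and per-intensity fatality rates below are invented for illustration; they are not PAGER's calibrated, country-specific loss models:

```python
def expected_fatalities(exposure_by_mmi, fatality_rate_by_mmi):
    """Sum, over shaking-intensity bins, of the population exposed at
    each intensity times an empirical fatality rate for that intensity."""
    return sum(pop * fatality_rate_by_mmi.get(mmi, 0.0)
               for mmi, pop in exposure_by_mmi.items())

# hypothetical exposure (people per Modified Mercalli Intensity bin)
exposure = {6: 500_000, 7: 120_000, 8: 30_000, 9: 4_000}
# hypothetical fatality rates per person at each intensity
rates = {6: 0.0, 7: 1e-5, 8: 1e-4, 9: 1e-3}
loss = expected_fatalities(exposure, rates)
```

In the real system the rates are fit country by country from past earthquake losses, which is why the same shaking footprint yields very different alert levels in different regions.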

  3. Engineering geological aspect of Gorkha Earthquake 2015, Nepal

    Science.gov (United States)

    Adhikari, Basanta Raj; Andermann, Christoff; Cook, Kristen

    2016-04-01

    Strong shaking by an earthquake causes massive landsliding with severe effects on infrastructure and human lives. The distribution of landslides and other hazards depends on the combination of earthquake and local characteristics that influence the dynamic response of hillslopes. The Himalayas are one of the most active mountain belts, with several kilometers of relief, and are very prone to catastrophic mass failure. Strong and shallow earthquakes are very common and cause widespread collapse of hillslopes, increasing the background landslide rate by several orders of magnitude. The Himalaya has faced many small and large earthquakes in the past, e.g., the Bihar-Nepal earthquake of 1934 (Ms 8.2), the large Kangra earthquake of 1905 (Ms 7.8), and the Gorkha earthquake of 2015 (Mw 7.8). The Gorkha earthquake occurred on and around the Main Himalayan Thrust with a hypocentral depth of 15 km (GEER 2015), followed by a Mw 7.3 aftershock in Kodari, causing 8,700+ deaths and leaving hundreds of thousands homeless. Most of the 3,000 aftershocks located by the National Seismological Center (NSC) within the first 45 days following the Gorkha earthquake are concentrated in a narrow, 40 km-wide band at midcrustal to shallow depth along the strike of the southern slope of the high Himalaya (Adhikari et al. 2015), and the ground shaking was substantially lower in the short-period range than would be expected for an earthquake of this magnitude (Moss et al. 2015). The effects of this earthquake are distinctive in the affected areas, showing topographic effects, liquefaction, and land subsidence. More than 5,000 landslides were triggered by this earthquake (Earthquakes without Frontiers, 2015). Most of the landslides are shallow, occurred in weathered bedrock, and appear to have mobilized primarily as raveling failures, rock slides, and rock falls. The majority of landslides are limited to a zone that runs east-west, approximately parallel to the Lesser and Higher Himalaya. There are numerous cracks in

  4. The 1976 Tangshan earthquake

    Science.gov (United States)

    Fang, Wang

    1979-01-01

    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m. Beijing (Peking) local time, and had a magnitude of 7.8, a focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt.

  5. Food habits studies of Steller sea lions in Washington, California conducted by Alaska Fisheries Science Center, National Marine Mammal Laboratory from 1993-05-01 to 1999-10-01 (NCEI Accession 0145304)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — From 1993 to 1999, The National Marine Mammal Laboratories' California Current Ecosystem Program (AFSC/NOAA) collected fecal samples from Steller sea lions in...

  6. Food habit studies of pinnipeds conducted at San Miguel Island, California by Alaska Fisheries Science Center, National Marine Mammal Laboratory from 1980-02-01 to 2014-01-31 (NCEI Accession 0145166)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The National Marine Mammal Laboratories' California Current Ecosystem Program (AFSC/NOAA) collects fecal samples to examine the diet of pinnipeds, including...

  7. Earthquake and ambient vibration monitoring of the steel frame UCLA Factor building

    OpenAIRE

    Kohler, Monica D.; Davis, Paul M.; Safak, Erdal

    2005-01-01

    Dynamic property measurements of the moment-resisting steel-frame University of California, Los Angeles, Factor building are being made to assess how forces are distributed over the building. Fourier amplitude spectra have been calculated from several intervals of ambient vibrations, a 24-hour period of strong winds, and from the 28 March 2003 Encino, California (M_L =2.9), the 3 September 2002 Yorba Linda, California (M_L=4.7), and the 3 November 2002 Central Alaska (M_w=7.9) earthquak...

  8. Structure and Velocities of the Northeastern Santa Cruz Mountains and the Western Santa Clara Valley, California, from the SCSI-LR Seismic Survey

    Science.gov (United States)

    Catchings, R.D.; Goldman, M.R.; Gandhok, G.

    2006-01-01

    Introduction: The Santa Clara Valley is located in the southern San Francisco Bay area of California and generally includes the area south of the San Francisco Bay between the Santa Cruz Mountains on the southwest and the Diablo Range on the northeast. The area has a population of approximately 1.7 million, including the city of San Jose, numerous smaller cities, and much of the high-technology manufacturing and research area commonly referred to as the Silicon Valley. Major active strands of the San Andreas Fault system bound the Santa Clara Valley, including the San Andreas fault to the southwest and the Hayward and Calaveras faults to the northeast; related faults likely underlie the alluvium of the valley. This report focuses on subsurface structures of the western Santa Clara Valley and the northeastern Santa Cruz Mountains and their potential effects on earthquake hazards and ground-water resource management in the area. Earthquake hazards and ground-water resources in the Santa Clara Valley are important considerations to California and the Nation because of the valley's preeminence as a major technical and industrial center, proximity to major earthquake faults, and large population. To better assess the earthquake hazards of the Santa Clara Valley, the U.S. Geological Survey (USGS) has undertaken a program to evaluate potential earthquake sources and potential effects of strong ground shaking within the valley. As part of that program, and to better assess water resources of the valley, the USGS and the Santa Clara Valley Water District (SCVWD) began conducting collaborative studies in 2000 to characterize the faults, stratigraphy, and structures beneath the alluvial cover of the Santa Clara Valley. Such geologic features are important to both agencies because they directly influence the availability and management of groundwater resources in the valley, and they affect the severity and distribution of strong shaking from local or regional

  9. Seismicity and focal mechanisms for the southern Great Basin of Nevada and California: 1987 through 1989

    Energy Technology Data Exchange (ETDEWEB)

    Harmsen, S.C.; Bufe, C.G.

    1991-12-31

    For the calendar year 1987, the southern Great Basin seismic network (SGBSN) recorded about 820 earthquakes in the southern Great Basin (SGB). Local magnitudes ranged from 0.2 to 4.2 (December 30, 1987, 22:50:42 UTC, at Hot Creek Valley). Five earthquake epicenters in 1987 within the detection threshold of the seismic network are at Yucca Mountain, the site of a potential national high-level nuclear waste repository. The maximum magnitude of those five earthquakes is 1.1, and their estimated depths of focus ranged from 3.1 to 7.6 km below sea level. For the calendar year 1988, about 1,280 SGB earthquakes were catalogued, with a maximum magnitude of 4.4 for an Owens Valley, California, earthquake on July 5, 1988. Eight earthquake epicenters in 1988 are at Yucca Mountain, with depths ranging from 3 to 12 km below sea level and a maximum magnitude of 2.1. For the calendar year 1989, about 1,190 SGB earthquakes were located and catalogued, with a maximum magnitude of 3.5 for an earthquake about ten miles north of Las Vegas, Nevada, on January 9. No Yucca Mountain earthquakes were recorded in 1989. An earthquake having a well-constrained depth of about 30 km below sea level was observed on August 21, 1989, at the eastern Nevada Test Site (NTS).

  10. Coherency analysis of accelerograms recorded by the UPSAR array during the 2004 Parkfield earthquake

    DEFF Research Database (Denmark)

    Konakli, Katerina; Kiureghian, Armen Der; Dreger, Douglas

    2014-01-01

    Spatial variability of near-fault strong motions recorded by the US Geological Survey Parkfield Seismograph Array (UPSAR) during the 2004 Parkfield (California) earthquake is investigated. Behavior of the lagged coherency for two horizontal and the vertical components is analyzed by separately...

  11. Neural Network Methodology for Earthquake Early Warning - first applications

    Science.gov (United States)

    Wenzel, F.; Koehler, N.; Cua, G.; Boese, M.

    2007-12-01

    PreSEIS is a method for earthquake early warning for finite faults (Böse, 2006) that is based on Artificial Neural Networks (ANNs), which are used to map seismic observations onto likely source parameters, including the moment magnitude and the location of an earthquake. PreSEIS integrates all available information on ground shaking at different sensors in a seismic network and updates the estimates of seismic source parameters regularly as time proceeds. PreSEIS has been developed and tested with synthetic waveform data using the example of Istanbul, Turkey (Böse, 2006). We will present first results of the application of PreSEIS to real data from Southern California, recorded at stations of the Southern California Seismic Network. The dataset consists of 69 shallow local earthquakes with moment magnitudes ranging between 1.96 and 7.1. The data come from broadband (20 or 40 Hz) or high-broadband (80 or 100 Hz), high-gain, three-component channels. The Southern California dataset will allow a comparison of our results with those of the Virtual Seismologist (Cua, 2004). We used the envelopes of the waveforms defined by Cua (2004) as input for the ANNs. The envelopes were obtained by taking the maximum absolute amplitude of the recorded ground-motion time history over a 1-second time window. Because not all of the considered stations recorded each earthquake, the missing records were replaced by synthetic envelopes calculated from envelope attenuation relationships developed by Cua (2004).
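The envelope construction this abstract attributes to Cua (2004) reduces each second of a record to its peak absolute amplitude. A minimal sketch, with an illustrative sampling rate and synthetic amplitudes:

```python
def envelope(samples, rate_hz):
    """Maximum absolute amplitude in consecutive 1-second windows."""
    return [max(abs(s) for s in samples[i:i + rate_hz])
            for i in range(0, len(samples), rate_hz)]

# a 2-second synthetic record sampled at 4 Hz
trace = [0.1, -0.3, 0.2, 0.5, 0.0, -1.2, 0.4, 0.3]
env = envelope(trace, 4)
```

Feeding such low-dimensional envelopes, rather than raw waveforms, to the ANNs is what keeps the input size tractable across a whole network of stations.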

  12. Lecture Demonstrations on Earthquakes for K-12 Teachers and Students

    Science.gov (United States)

    Dry, M. D.; Patterson, G. L.

    2005-12-01

    Since 1975, the Center for Earthquake Research and Information (CERI) at The University of Memphis has strived to satisfy its information transfer directives through diverse education and outreach efforts, providing technical and non-technical earthquake information to the general public, K-16 teachers and students, professional organizations, and state and federal organizations via all forms of written and electronic communication. Through these education and outreach efforts, CERI tries to increase earthquake hazard awareness to help limit future losses. In the past three years, education programs have reached over 20,000 K-16 students and teachers through in-service training workshops for teachers and earthquake/earth-science lecture demonstrations for students. The presentations include an hour-long lecture demonstration featuring graphics and an informal question-and-answer format. Graphics used include seismic hazard maps, damage photos, plate tectonic maps, layers of the Earth, and more, all adapted for the audience. Throughout the presentation, manipulatives such as a Slinky, Silly Putty, a foam Earth with depth and temperature features, and Popsicle sticks are used to demonstrate seismic waves, the elasticity of the Earth, the Earth's layers and their features, and the brittleness of the crust. Toward the end, a demonstration featuring a portable shake table with a dollhouse mounted on it is used to illustrate earthquake-shaking effects. This presentation is also taken to schools when they are unable to visit CERI. Following the presentation, groups are taken to the Public Earthquake Resource Center at CERI, a space featuring nine displays, seven of which are interactive. The interactive displays include a shake table and building blocks, a trench with paleoliquefaction features, computers with web access to seismology sites, a liquefaction model, an oscilloscope and attached

  13. Seismic Imaging of the West Napa Fault in Napa, California

    Science.gov (United States)

    Goldman, M.; Catchings, R.; Chan, J. H.; Sickler, R. R.; Nevitt, J. M.; Criley, C.

    2017-12-01

    In October 2016, we acquired high-resolution P- and S-wave seismic data along a 120-m-long, SW-NE-trending profile in Napa, California. Our seismic survey was designed to image a strand of the West Napa Fault Zone (WNFZ), which ruptured during the 24 August 2014 Mw 6.0 South Napa earthquake. We separately acquired P- and S-wave data at every station using multiple hammer hits, which were edited and stacked into individual shot gathers in the lab. Each shot was co-located with and recorded by 118 P-wave (40-Hz) geophones, spaced at 1 m, and by 180 S-wave (4.5-Hz) geophones, spaced at 1 m. We developed both P- and S-wave tomographic velocity models, as well as Poisson's ratio and Vp/Vs ratio models. We observed a well-defined zone of elevated Vp/Vs ratios below about 10 m depth, centered beneath the observed surface rupture. P-wave reflection images show that the fault forms a flower structure in the upper few tens of meters. This method has been shown to delineate fault structures even in areas of rough terrain.
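The Poisson's ratio model mentioned in this abstract follows directly from the two velocity models: for an isotropic elastic medium, nu = (Vp^2 - 2*Vs^2) / (2*(Vp^2 - Vs^2)). A small helper, with illustrative velocity values:

```python
def poisson_ratio(vp, vs):
    """Poisson's ratio of an isotropic elastic medium from the
    P- and S-wave velocities (standard elasticity relation)."""
    r2 = (vp / vs) ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

# a Vp/Vs ratio of sqrt(3) corresponds to the classic Poisson solid, nu = 0.25
nu = poisson_ratio(3 ** 0.5, 1.0)
```

Elevated Vp/Vs (hence elevated nu) in fault zones is commonly read as fluid-saturated, fractured material, which is why the ratio is a useful fault indicator here.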

  14. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing an earthquake culture. Iran was taken as a research case study, and fifteen large earthquake disasters in Iran were investigated and analyzed over a period of more than a century. It was found that the earthquake culture in Iran was and still is conditioned by many factors or parameters which are not integrated and...

  15. Discussing epigenetics in Southern California

    Science.gov (United States)

    2012-01-01

    With the goal of discussing how epigenetic control and chromatin remodeling contribute to the various processes that lead to cellular plasticity and disease, this symposium marks the collaboration between the Institut National de la Santé et de la Recherche Médicale (INSERM) in France and the University of California, Irvine (UCI). Organized by Paolo Sassone-Corsi (UCI) and held at the Beckman Center of the National Academy of Sciences at the UCI campus December 15–16, 2011, this was the first of a series of international conferences on epigenetics dedicated to the scientific community in Southern California. The meeting also served as the official kick off for the newly formed Center for Epigenetics and Metabolism at the School of Medicine, UCI (http://cem.igb.uci.edu). PMID:22414797

  16. The 2014 update to the National Seismic Hazard Model in California

    Science.gov (United States)

    Powers, Peter; Field, Edward H.

    2015-01-01

    The 2014 update to the U. S. Geological Survey National Seismic Hazard Model in California introduces a new earthquake rate model and new ground motion models (GMMs) that give rise to numerous changes to seismic hazard throughout the state. The updated earthquake rate model is the third version of the Uniform California Earthquake Rupture Forecast (UCERF3), wherein the rates of all ruptures are determined via a self-consistent inverse methodology. This approach accommodates multifault ruptures and reduces the overprediction of moderate earthquake rates exhibited by the previous model (UCERF2). UCERF3 introduces new faults, changes to slip or moment rates on existing faults, and adaptively smoothed gridded seismicity source models, all of which contribute to significant changes in hazard. New GMMs increase ground motion near large strike-slip faults and reduce hazard over dip-slip faults. The addition of very large strike-slip ruptures and decreased reverse fault rupture rates in UCERF3 further enhances these effects.

  17. Identification of seismic precursors before large earthquakes: Decelerating and accelerating seismic patterns

    Science.gov (United States)

    Papadimitriou, Panayotis

    2008-04-01

    A useful way of understanding both seismotectonic processes and earthquake prediction research is to conceive seismic patterns as a function of space and time. The present work investigates seismic precursors before the occurrence of an earthquake. It does so by means of a methodology designed to study spatiotemporal characteristics of seismicity in a selected area. This methodology is based on two phenomena: the decelerating moment release (DMR) and the accelerating moment release (AMR), as they occur within a period ranging from several months to a few years before the oncoming event. The combination of these two seismic sequences leads to the proposed decelerating-accelerating moment release (DAMR) earthquake sequence, which appears as the last stage of loading in the earthquake cycle. This seismic activity appears as a foreshock sequence and can be supported by the stress accumulation model (SAM). The DAMR earthquake sequence constitutes a double seismic precursor identified in space and time before the occurrence of an earthquake and can be used to improve seismic hazard assessment research. In this study, the developed methodology is applied to the data of the 1989 Loma Prieta (California), the 1995 Kobe (Japan), and the 2003 Lefkada (Greece) earthquakes. The last part of this study focuses on the application of the methodology to the Ionian Sea (western Greece) and forecasts two earthquakes in that area.
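Accelerating/decelerating moment release analyses of the kind described here are usually cast in terms of cumulative Benioff strain, the running sum of the square roots of event energies. A sketch, assuming the standard Gutenberg-Richter energy relation log10(E) = 1.5*M + 4.8 (E in joules); the magnitude list is synthetic:

```python
import math

def cumulative_benioff_strain(magnitudes):
    """Running sum of sqrt(energy), with energy from the Gutenberg-Richter
    relation log10(E) = 1.5*M + 4.8 (E in joules)."""
    total, out = 0.0, []
    for m in magnitudes:
        total += math.sqrt(10.0 ** (1.5 * m + 4.8))
        out.append(total)
    return out

strain = cumulative_benioff_strain([3.0, 3.5, 4.0, 5.2])
```

AMR studies then test whether this cumulative curve bends upward toward the main-shock time (often fit with a power law of time-to-failure), while DMR corresponds to the opposite, decelerating bend.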

  18. The mechanism of earthquake

    Science.gov (United States)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    The physical mechanism of earthquakes remains a challenging issue to be clarified. Seismologists used to attribute shallow earthquakes to the elastic rebound of crustal rocks. The seismic energy calculated following the elastic rebound theory and with the data of experimental results on rocks, however, shows a large discrepancy with measurements — a fact that has been dubbed “the heat flow paradox”. For the intermediate-focus and deep-focus earthquakes, both occurring in the region of the mantle, there is no reasonable explanation either. This paper discusses the physical mechanism of earthquakes from a new perspective, starting from the fact that both the crust and the mantle are discrete collective systems of matter with slow dynamics, as well as from basic principles of physics, especially some new concepts of condensed matter physics that have emerged in recent years. (1) Stress distribution in the earth’s crust: Without taking the tectonic force into account, according to the rheological principle that “everything flows”, the normal stress and transverse stress must be balanced due to the effect of gravitational pressure over a long period of time; thus no differential stress is to be expected in the original crustal rocks. The tectonic force is successively transferred and accumulated via stick-slip motions of rock blocks to squeeze the fault gouge and is then exerted upon other rock blocks. The superposition of such additional lateral tectonic force and the original stress gives rise to the real-time stress in crustal rocks. The mechanical characteristics of fault gouge differ from those of rocks, as it consists of granular matter. The elastic moduli of the fault gouges are much less than those of rocks, and they become larger with increasing pressure. This peculiarity of the fault gouge leads to a tectonic force that increases with depth in a nonlinear fashion. The distribution and variation of the tectonic stress in the crust are specified. (2) The

  19. Fumigation success for California facility.

    Science.gov (United States)

    Hacker, Robert

    2010-02-01

    As Robert Hacker, at the time director of facilities management at the St John's Regional Medical Center in Oxnard, California, explains, the hospital, one of the area's largest, recently successfully utilised a new technology to eliminate mould, selecting a cost and time-saving fumigation process in place of the traditional "rip and tear" method. Although hospital managers knew the technology had been used extremely effectively in other US buildings, this was reportedly among the first ever healthcare applications.

  20. Stable Continental Region Earthquakes in South China

    Science.gov (United States)

    Liu, L.

    This paper reviews some remarkable characteristics of earthquakes in a Stable Continental Region (SCR), the South China Block (SCB). The kernel of the SCB is the Yangtze platform, solidified in late Proterozoic time, with continental growth to the southeast by a series of fold belts in Paleozoic time. The low deviatoric stress, the orientation of the major tectonic features in the SCB substantially normal to the maximum horizontal principal stress, and a relatively uniform crust appear to be the major reasons for the lack of significant seismicity in most regions of the SCB. Earthquakes in this region are mainly associated with three seismic zones: (1) the Southeast China Coast seismic zone, related to the Guangdong-Fujian coastal folding belt (associated with the Eurasia-Philippine Sea plate collision); (2) the Southern Yellow Sea seismic zone, associated with continental shelf rifts and basins; and (3) the Downstream Yangtze River seismic zone, spatially coinciding with Tertiary rifts and basin development. All three seismic zones are close to one or two major economic and population centers in the SCB, so they pose significant seismic hazards. Earthquake focal mechanisms in the SCB are consistent with strike-slip to normal faulting stress regimes. Because of the global and national economic significance of the SCB and its dense population, the seismic hazard of the region is of outstanding importance: compared with a less developed region, an earthquake of the same size and tectonic setting would cause substantially more severe social and economic losses in the SCB. This paper also compiles an inventory of historic moderate to great earthquakes in the SCB; most of these data are not widely available in the English literature.

  1. Earthquake in Haiti

    DEFF Research Database (Denmark)

    Holm, Isak Winkel

    2012-01-01

    In the vocabulary of modern disaster research, Heinrich von Kleist's seminal short story "The Earthquake in Chile" from 1806 is a tale of disaster vulnerability. The story is not just about a natural disaster destroying the innocent city of Santiago but also about the ensuing social disaster...

  2. Earthquake-proof plants

    International Nuclear Information System (INIS)

    Francescutti, P.

    2008-01-01

    In the wake of the damage suffered by the Kashiwazaki-Kariwa nuclear power plant as a result of an earthquake last July, this article looks at the seismic risk affecting the Spanish plants and the safety measures in place to prevent it. (Author)

  3. Earthquakes and market crashes

    Indian Academy of Sciences (India)

    We find prominent similarities between the features of the time series for the (model earthquakes or) overlap of two Cantor sets, when one set moves with uniform relative velocity over the other, and time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations.
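The overlap time series the abstract describes can be reproduced in a few lines. The sketch below is a minimal illustration, not the authors' code: it builds a finite-level Cantor set as a binary mask and counts the overlap as a copy of the set slides cyclically over the original.

```python
def cantor(level):
    """Binary mask of a finite-level Cantor set (1 = kept segment)."""
    if level == 0:
        return [1]
    c = cantor(level - 1)
    return c + [0] * len(c) + c  # keep, remove middle third, keep

def overlap(mask, shift):
    """Overlap of the set with a copy of itself shifted cyclically."""
    n = len(mask)
    return sum(mask[i] * mask[(i + shift) % n] for i in range(n))

# Time series of "model earthquake" magnitudes vs. relative shift.
c = cantor(2)  # [1, 0, 1, 0, 0, 0, 1, 0, 1]
series = [overlap(c, s) for s in range(len(c))]
```

Plotting `series` for higher levels reproduces the bursty, self-similar fluctuations that the paper compares with stock-price time series.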

  4. Global Review of Induced and Triggered Earthquakes

    Science.gov (United States)

    Foulger, G. R.; Wilson, M.; Gluyas, J.; Julian, B. R.; Davies, R. J.

    2016-12-01

    ongoing at the time of the relevant earthquakes, e.g., oil production and brine injections. We present a first analysis of the data in our database and comment on some related issues with which scientists are currently grappling. Figure: Induced or triggered seismicity world-wide in Mollweide projection, centered on the Greenwich meridian.

  5. Summary of November 2010 meeting to evaluate turbidite data for constraining the recurrence parameters of great Cascadia earthquakes for the update of national seismic hazard maps

    Science.gov (United States)

    Frankel, Arthur D.

    2011-01-01

    This report summarizes a meeting of geologists, marine sedimentologists, geophysicists, and seismologists that was held on November 18–19, 2010 at Oregon State University in Corvallis, Oregon. The overall goal of the meeting was to evaluate observations of turbidite deposits to provide constraints on the recurrence time and rupture extent of great Cascadia subduction zone (CSZ) earthquakes for the next update of the U.S. national seismic hazard maps (NSHM). The meeting was convened at Oregon State University because this is the major center, led by Chris Goldfinger and his colleagues, for collecting and evaluating turbidite evidence of great Cascadia earthquakes. We especially wanted the participants to see some of the numerous deep sea cores this group has collected that contain the turbidite deposits. Great earthquakes on the CSZ pose a major tsunami, ground-shaking, and ground-failure hazard to the Pacific Northwest. Figure 1 shows a map of the Pacific Northwest with a model for the rupture zone of a moment magnitude Mw 9.0 earthquake on the CSZ and the ground shaking intensity (in ShakeMap format) expected from such an earthquake, based on empirical ground-motion prediction equations. The damaging effects of such an earthquake would occur over a wide swath of the Pacific Northwest, and an accompanying tsunami would likely cause devastation along the Pacific Northwest coast and possibly cause damage and loss of life in other areas of the Pacific. A magnitude 8 earthquake on the CSZ would cause damaging ground shaking and ground failure over a substantial area and could also generate a destructive tsunami. The recent tragic occurrence of the 2011 Mw 9.0 Tohoku-Oki, Japan, earthquake highlights the importance of having accurate estimates of the recurrence times and magnitudes of great earthquakes on subduction zones. For the U.S. 
national seismic hazard maps, estimating the hazard from the Cascadia subduction zone has been based on coastal paleoseismic evidence of great

  6. Systematic Detection of Remotely Triggered Seismicity in Africa Following Recent Large Earthquakes

    Science.gov (United States)

    Ayorinde, A. O.; Peng, Z.; Yao, D.; Bansal, A. R.

    2016-12-01

    It is well known that large distant earthquakes can trigger micro-earthquakes and tectonic tremor during or immediately following their surface waves. Globally, triggered earthquakes have mostly been found in active plate boundary regions. It is not clear whether they also occur within the stable intraplate regions of Africa or along the active East African Rift Zone. In this study we conduct a systematic search for remote triggering in Africa following recent large earthquakes, including the 2004 Mw 9.1 Sumatra and 2012 Mw 8.6 Indian Ocean earthquakes. In particular, the 2012 Indian Ocean earthquake is the largest known strike-slip earthquake and triggered a global increase in earthquakes of magnitude larger than 5.5, as well as numerous micro-earthquakes and tectonic tremors around the world. The entire African region was examined for possible remotely triggered seismicity using seismic data downloaded from the Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) and the GFZ German Research Centre for Geosciences. We apply a 5-Hz high-pass filter to the continuous waveforms and visually identify high-frequency signals during and immediately after the large-amplitude surface waves. Spectrograms are computed as additional tools to identify triggered seismicity, and we further confirm candidates by statistical analysis comparing the high-frequency signals before and after the distant mainshocks. So far we have identified possible triggered seismicity in Botswana and northern Madagascar. This study could help us understand dynamic triggering in the diverse tectonic settings of the African continent.
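The detection step the abstract describes, high-pass filtering followed by a statistical comparison of high-frequency energy before and after the mainshock, can be sketched as follows. This is a simplified illustration, not the authors' processing code; the 5 Hz corner comes from the abstract, while the RMS-ratio statistic, window length, and all parameter names are assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def hf_ratio(trace, fs, t_event, corner=5.0, window=60.0):
    """Ratio of high-frequency RMS in the window after the distant
    mainshock's surface-wave arrival to the window before; values
    well above 1 suggest locally triggered seismicity."""
    sos = butter(4, corner, btype="highpass", fs=fs, output="sos")
    hf = sosfiltfilt(sos, trace)
    i, n = int(t_event * fs), int(window * fs)
    rms = lambda x: float(np.sqrt(np.mean(x ** 2)))
    return rms(hf[i:i + n]) / rms(hf[i - n:i])

# Synthetic check: long-period "surface waves" plus background noise,
# with a high-frequency burst (the "triggered" events) after t_event.
fs, t_event = 100.0, 120.0
t = np.arange(0.0, 300.0, 1.0 / fs)
rng = np.random.default_rng(0)
trace = 0.01 * np.sin(2 * np.pi * 0.05 * t) + 1e-3 * rng.standard_normal(t.size)
burst = (t > 130.0) & (t < 150.0)
trace[burst] += 0.05 * rng.standard_normal(int(burst.sum()))
ratio = hf_ratio(trace, fs, t_event)  # >> 1 when triggering is present
```

In practice the authors combine such a statistic with spectrograms and visual inspection before declaring a detection.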

  7. Do I Really Sound Like That? Communicating Earthquake Science Following Significant Earthquakes at the NEIC

    Science.gov (United States)

    Hayes, G. P.; Earle, P. S.; Benz, H.; Wald, D. J.; Yeck, W. L.

    2017-12-01

    The U.S. Geological Survey's National Earthquake Information Center (NEIC) responds to about 160 magnitude 6.0 and larger earthquakes every year and is regularly inundated with information requests following earthquakes that cause significant impact. These requests often start within minutes after the shaking occurs and come from a wide user base including the general public, media, emergency managers, and government officials. Over the past several years, the NEIC has evolved its earthquake-response communications strategy to meet the changing needs of users and the evolving media landscape. The NEIC produces a cascade of products starting with basic hypocentral parameters and culminating with estimates of fatalities and economic loss. We speed the delivery of content by prepositioning and automatically generating products such as aftershock plots, regional tectonic summaries, maps of historical seismicity, and event summary posters. Our goal is to have information immediately available so we can quickly address the response needs of a particular event or sequence. This information is distributed to hundreds of thousands of users through social media, email alerts, programmatic data feeds, and webpages. Many of our products are included in event summary posters that can be downloaded and printed for local display. After significant earthquakes, keeping up with direct inquiries and interview requests from TV, radio, and print reporters is always challenging. The NEIC works with the USGS Office of Communications and the USGS Science Information Services to organize and respond to these requests. Written executive summary reports are produced and distributed to USGS personnel and collaborators throughout the country. These reports are updated during the response to keep our message consistent and our information up to date. This presentation will focus on communications during NEIC's rapid earthquake response but will also touch on the broader USGS traditional and

  8. Simulated earthquake ground motions

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.; Gasparini, D.A.

    1977-01-01

    The paper reviews current methods for generating synthetic earthquake ground motions. Emphasis is on the special requirements demanded of procedures to generate motions for use in nuclear power plant seismic response analysis. Specifically, very close agreement is usually sought between the response spectra of the simulated motions and prescribed, smooth design response spectra. The features and capabilities of the computer program SIMQKE, which has been widely used in power plant seismic work, are described. Problems and pitfalls associated with the use of synthetic ground motions in seismic safety assessment are also pointed out. The limitations and paucity of recorded accelerograms, together with the widespread use of time-history dynamic analysis for obtaining structural and secondary systems' response, have motivated the development of earthquake simulation capabilities. A common model for synthesizing earthquakes is the superposition of sinusoidal components with random phase angles. The input parameters for such a model are the amplitudes and phase angles of the contributing sinusoids, as well as the characteristics of the variation of motion intensity with time, especially the duration of the motion. The amplitudes are determined from estimates of the Fourier spectrum or the spectral density function of the ground motion, and may be assumed to vary in time or to remain constant for the duration of the earthquake. In the nuclear industry, the common procedure is to specify a set of smooth response spectra for use in aseismic design. This development and the need for time histories have generated much practical interest in synthesizing earthquakes whose response spectra 'match', or are compatible with, a set of specified smooth response spectra.
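The superposition model described above can be sketched in a few lines. This is a toy illustration of the idea, not SIMQKE itself: sinusoids with random phases and amplitudes drawn from an assumed flat spectral density are summed and shaped by a trapezoidal intensity envelope; the iterative amplitude adjustment that SIMQKE performs to match a target response spectrum is omitted, and all parameter values are illustrative.

```python
import numpy as np

def synth_motion(duration=20.0, dt=0.01, f_lo=0.2, f_hi=25.0,
                 n_comp=200, ramp=2.0, decay=5.0, seed=0):
    """Synthetic accelerogram: sum of sinusoids with random phase
    angles, shaped by a trapezoidal intensity envelope."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, duration, dt)
    freqs = np.linspace(f_lo, f_hi, n_comp)
    phases = rng.uniform(0.0, 2.0 * np.pi, n_comp)
    amps = np.full(n_comp, 1.0 / np.sqrt(n_comp))  # flat spectral density
    acc = sum(a * np.sin(2 * np.pi * f * t + p)
              for a, f, p in zip(amps, freqs, phases))
    # Trapezoid: linear build-up, flat strong phase, linear decay.
    envelope = np.minimum(1.0, np.minimum(t / ramp, (duration - t) / decay))
    return t, acc * envelope

t, acc = synth_motion()
```

A production workflow would then compute the response spectrum of `acc` and rescale the component amplitudes until it matches the prescribed smooth design spectrum.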

  9. Caspian Earthquake of November 25, 2000

    International Nuclear Information System (INIS)

    Arif, Hasanov; Behruz, Panahi

    2002-01-01

    Full text: A strong earthquake was felt on the Absheron peninsula and in the surrounding Caspian Sea region at 18:09 GMT on November 25, 2000. According to alert information from the Republic Center of Seismic Service of the Azerbaijan Academy of Sciences, the macroseismic epicenter of the main shock was located to the north of the Absheron peninsula. Preliminary information from the Alert Service of the Geophysical Service of the Russian Academy of Sciences confirmed this location of the epicenter and the aftershock sequence, and an epicenter north of the Absheron peninsula was also determined from data of the Dagestan Experimental Methodic Expedition of the Russian Academy of Sciences. Instrumental estimates of the source depth of the November 25, 2000 earthquake carried out by different seismic services vary over a wide range, from 10 to 97.4 km. Preliminary information about the focal mechanism of the earthquake was taken from the CMT Harvard catalogue, the Potsdam Center, and AS GS RAS sources. The earthquake of November 25, 2000 occurred with maximum intensity on the Absheron peninsula and along the north-western and north-eastern coastal zones of the Caspian Sea. By contrast, the macroseismic intensity reached only 3-4 on the MSK scale within the south-eastern coastal strip. This orientation can be explained by the fault system of NW-SE strike developed in the region, along which the macroseismic intensity attenuation coefficient is much greater than in neighboring areas. This fact, together with the known divergence between the coordinates of the macroseismic and instrumental epicenters, justifies placing the macroseismic epicenter north of the Absheron peninsula.

  10. Great earthquakes along the Western United States continental margin: implications for hazards, stratigraphy and turbidite lithology

    Science.gov (United States)

    Nelson, C. H.; Gutiérrez Pastor, J.; Goldfinger, C.; Escutia, C.

    2012-11-01

    We summarize the importance of great earthquakes (Mw ≳ 8) for hazards, stratigraphy of basin floors, and turbidite lithology along the active tectonic continental margins of the Cascadia subduction zone and the northern San Andreas Transform Fault by utilizing studies of swath bathymetry, visual core descriptions, grain size analysis, X-ray radiographs, and physical properties. Recurrence times of Holocene turbidites as proxies for earthquakes on the Cascadia and northern California margins are analyzed using two methods: (1) radiometric dating (14C method), and (2) relative dating, using hemipelagic sediment thickness and sedimentation rates (H method). The H method provides (1) the best estimate of minimum recurrence times, which are the most important for seismic hazards risk analysis, and (2) the most complete dataset of recurrence times, which shows a normal distribution pattern for paleoseismic turbidite frequencies. We observe that, on these tectonically active continental margins, during the sea-level highstand of Holocene time, triggering of turbidity currents is controlled dominantly by earthquakes, and paleoseismic turbidites have an average recurrence time of ~550 yr in northern Cascadia Basin and ~200 yr along the northern California margin. The minimum recurrence times for great earthquakes are approximately 300 yr for the Cascadia subduction zone and 130 yr for the northern San Andreas Fault, which indicates both fault systems are within (Cascadia) or very close to (San Andreas) the early window for another great earthquake. On active tectonic margins with great earthquakes, the volumes of mass transport deposits (MTDs) are limited on basin floors along the margins. The maximum run-out distances of MTD sheets across abyssal-basin floors along active margins are an order of magnitude less (~100 km) than on passive margins (~1000 km). The great earthquakes along the Cascadia and northern California margins cause seismic strengthening of the sediment, which

  11. Great earthquakes along the Western United States continental margin: implications for hazards, stratigraphy and turbidite lithology

    Directory of Open Access Journals (Sweden)

    C. H. Nelson

    2012-11-01

    Full Text Available We summarize the importance of great earthquakes (Mw ≳ 8) for hazards, stratigraphy of basin floors, and turbidite lithology along the active tectonic continental margins of the Cascadia subduction zone and the northern San Andreas Transform Fault by utilizing studies of swath bathymetry, visual core descriptions, grain size analysis, X-ray radiographs and physical properties. Recurrence times of Holocene turbidites as proxies for earthquakes on the Cascadia and northern California margins are analyzed using two methods: (1) radiometric dating (14C method), and (2) relative dating, using hemipelagic sediment thickness and sedimentation rates (H method). The H method provides (1) the best estimate of minimum recurrence times, which are the most important for seismic hazards risk analysis, and (2) the most complete dataset of recurrence times, which shows a normal distribution pattern for paleoseismic turbidite frequencies. We observe that, on these tectonically active continental margins, during the sea-level highstand of Holocene time, triggering of turbidity currents is controlled dominantly by earthquakes, and paleoseismic turbidites have an average recurrence time of ~550 yr in northern Cascadia Basin and ~200 yr along the northern California margin. The minimum recurrence times for great earthquakes are approximately 300 yr for the Cascadia subduction zone and 130 yr for the northern San Andreas Fault, which indicates both fault systems are within (Cascadia) or very close to (San Andreas) the early window for another great earthquake.

    On active tectonic margins with great earthquakes, the volumes of mass transport deposits (MTDs) are limited on basin floors along the margins. The maximum run-out distances of MTD sheets across abyssal-basin floors along active margins are an order of magnitude less (~100 km) than on passive margins (~1000 km). The great earthquakes along the Cascadia and northern California margins
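The H method described in the two abstracts above reduces to a simple division: the thickness of hemipelagic sediment deposited between two turbidites, divided by the local background sedimentation rate, gives the time between the triggering earthquakes. A minimal sketch (the numbers below are illustrative, not the paper's data):

```python
def recurrence_time_yr(hemipelagic_thickness_cm, sed_rate_cm_per_yr):
    """H-method recurrence time: hemipelagic sediment thickness between
    two turbidites divided by the background sedimentation rate."""
    return hemipelagic_thickness_cm / sed_rate_cm_per_yr

# e.g. 5.5 cm of hemipelagic mud accumulating at 0.01 cm/yr implies
# roughly 550 yr between the two triggering events.
interval = recurrence_time_yr(5.5, 0.01)
```

The method's appeal, per the abstracts, is that it yields minimum recurrence times even where radiocarbon dates are sparse or absent.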

  12. The accommodation of relative motion at depth on the San Andreas fault system in California

    Science.gov (United States)

    Prescott, W. H.; Nur, A.

    1981-01-01

    Plate motion below the seismogenic layer along the San Andreas fault system in California may occur either by aseismic slip along a deeper extension of the fault or by laterally distributed deformation below the seismogenic layer. The shallow depth of California earthquakes, the depth of coseismic slip during the 1906 San Francisco earthquake, and the presence of widely separated parallel faults indicate that relative motion is distributed below the seismogenic zone, occurring by inelastic flow rather than by aseismic slip on discrete fault planes.

  13. Preliminary report: The Little Skull Mountain earthquake, June 29, 1992

    International Nuclear Information System (INIS)

    Anderson, J.G.; Brune, J.N.; Polo, D. de; Savage, M.K.; Sheehan, A.F.; Smith, K.D.; Gomberg, J.; Harmsen, S.C.

    1993-01-01

    The Little Skull Mountain earthquake occurred about 20 km from the potential high-level nuclear repository at Yucca Mountain. The magnitude was 5.6, and the focal mechanism indicates normal faulting on a northeast-trending structure. There is evidence that the earthquake was triggered by the magnitude Ms = 7.5 earthquake in Landers, California, which occurred less than 24 hours earlier. Preliminary locations of the hypocenter and several aftershocks define an L-shaped pattern near the southern boundary of the Nevada Test Site. One arm trends to the northeast beneath Little Skull Mountain, and a shorter, more diffuse zone trends to the southeast. The aftershocks are mostly located at depths between 7 km and 11 km and may suggest a southeast-dipping plane. There is no clear correlation with previously mapped surface faulting. The strongest recorded acceleration is about 0.21 g at Lathrop Wells, Nevada, 15 km from the epicenter. An extensive network of aftershock recorders was installed by the Seismological Laboratory, University of Nevada, Reno; by the US Geological Survey, Golden, Colorado; and by Lawrence Livermore Laboratory, Livermore, California. Aftershock experiments are ongoing as of November 1992 and include experiments to improve estimates of location, depth, focal mechanism, and stress drop; to study basin and ridge response near the epicenter and at Midway Valley; and to study the response of a tunnel at Little Skull Mountain. Analysis of these data, which include thousands of aftershocks, has only begun.

  14. Sensitive Wildlife - Center for Natural Lands Management [ds431

    Data.gov (United States)

    California Natural Resource Agency — This dataset represents sensitive wildlife data collected for the Center for Natural Lands Management (CNLM) at dedicated nature preserves in San Diego County,...

  15. Surface slip during large Owens Valley earthquakes

    KAUST Repository

    Haddon, E. K.

    2016-01-10

    The 1872 Owens Valley earthquake is the third largest known historical earthquake in California. Relatively sparse field data and a complex rupture trace, however, inhibited attempts to fully resolve the slip distribution and reconcile the total moment release. We present a new, comprehensive record of surface slip based on lidar and field investigation, documenting 162 new measurements of laterally and vertically displaced landforms for the 1872 and prehistoric Owens Valley earthquakes. Our lidar analysis uses a newly developed analytical tool to measure fault slip based on cross-correlation of sublinear topographic features and to produce a uniquely shaped probability density function (PDF) for each measurement. Stacking PDFs along strike to form cumulative offset probability distribution plots (COPDs) highlights common values corresponding to single and multiple-event displacements. Lateral offsets for 1872 vary systematically from ≈ 1.0 to 6.0 m and average 3.3 ± 1.1 m (2σ). Vertical offsets are predominantly east-down between ≈ 0.1 and 2.4 m, with a mean of 0.8 ± 0.5 m. The average lateral-to-vertical ratio compiled at specific sites is ≈ 6:1. Summing displacements across subparallel, overlapping rupture traces implies a maximum of 7-11 m and a net average of 4.4 ± 1.5 m, corresponding to a geologic Mw ≈ 7.5 for the 1872 event. We attribute progressively higher-offset lateral COPD peaks at 7.1 ± 2.0 m, 12.8 ± 1.5 m, and 16.6 ± 1.4 m to three earlier large surface ruptures. Evaluating cumulative displacements in context with previously dated landforms in Owens Valley suggests relatively modest rates of fault slip, averaging between ≈ 0.6 and 1.6 mm/yr (1σ) over the late Quaternary.
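The COPD construction described above, summing per-measurement offset PDFs along strike, can be sketched with Gaussian PDFs standing in for the paper's empirically shaped ones. The measurement values below are hypothetical, chosen only to show a single-event cluster and one larger multi-event offset.

```python
import numpy as np

def copd(offsets, sigmas, x):
    """Cumulative offset probability distribution: stack one normalized
    Gaussian PDF per displacement measurement on a common offset axis."""
    stack = np.zeros_like(x)
    for mu, s in zip(offsets, sigmas):
        stack += np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    return stack

x = np.linspace(0.0, 20.0, 2001)
# Hypothetical measurements: a cluster of single-event lateral offsets
# near 3.3 m plus one larger (two-event) offset near 7.1 m.
stack = copd([3.1, 3.3, 3.5, 7.1], [0.4, 0.3, 0.5, 0.6], x)
peak = x[np.argmax(stack)]  # dominant single-event slip value
```

Peaks in `stack` play the role of the COPD peaks the authors interpret as single- and multiple-event displacements.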

  16. Turkish earthquakes reveal dynamics of fracturing along a major strike-slip fault zone

    Science.gov (United States)

    Çemen, Ibrahim; Gökten, Ergun; Varol, Baki; Kiliç, Recep; Özaksoy, Volkan; Erkmen, Cenk; Pinar, Ali

    During the last 5 months of 1999, northwestern Turkey experienced two major earthquakes along the North Anatolian Fault Zone (NAFZ). The first earthquake struck the country at 3:01 A.M. local time on August 17, and caused extensive damage in the towns of Yalova, Gölcük, Izmit, Adapazari, and Düzce (Figure 1). The second earthquake occurred at 6:57 P.M. local time on November 12 and caused damage mostly in Düzce and Kaynasli. The 7.4-Mw main shock of the August 17 Izmit earthquake was centered at 40.702°N, 29.987°E and originated at a depth of 17 km. The epicenter was about 11 km southeast of Izmit, a major industrial town (Figure 1). The earthquake was a devastating natural disaster that claimed close to 20,000 lives and left more than 100,000 people homeless.

  17. Earthquake Risk Reduction to Istanbul Natural Gas Distribution Network

    Science.gov (United States)

    Zulfikar, Can; Kariptas, Cagatay; Biyikoglu, Hikmet; Ozarpa, Cevat

    2017-04-01

    Earthquake Risk Reduction to Istanbul Natural Gas Distribution Network. Istanbul Natural Gas Distribution Corporation (IGDAS) is one of the end users of the Istanbul Earthquake Early Warning (EEW) signal. IGDAS, the primary natural gas provider in Istanbul, operates an extensive system of 9,867 km of gas lines with 750 district regulators and 474,000 service boxes. The natural gas reaches the Istanbul city borders at 70 bar in a 30-inch-diameter steel pipeline. The gas pressure is reduced to 20 bar at RMS stations and distributed to district regulators inside the city. 110 of the 750 district regulators are instrumented with strong-motion accelerometers in order to cut gas flow during an earthquake if ground-motion parameters exceed certain threshold levels. In addition, state-of-the-art protection systems automatically cut natural gas flow when breaks in the gas pipelines are detected. IGDAS uses a sophisticated SCADA (supervisory control and data acquisition) system to monitor the state of health of its pipeline network. This system provides real-time information about quantities related to pipeline monitoring, including input-output pressure, drawing information, positions of station and RTU (remote terminal unit) gates, and slam-shut mechanism status at the 750 district regulator sites. The IGDAS real-time earthquake risk reduction algorithm follows four stages: (1) real-time ground-motion data are transmitted from 110 IGDAS and 110 KOERI (Kandilli Observatory and Earthquake Research Institute) acceleration stations to the IGDAS SCADA center and the KOERI data center; (2) during an earthquake, EEW information is sent from the IGDAS SCADA center to the IGDAS stations; (3) automatic shut-off is applied at IGDAS district regulators, and calculated parameters are sent from the stations to the IGDAS SCADA center and KOERI; (4) integrated building and gas pipeline damage maps are prepared immediately after the earthquake. Today's technology allows rapid estimation of the
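The threshold logic in stages 2 and 3 can be sketched as a simple decision function. This is an illustration of the scheme as described in the abstract, not IGDAS code; the function name, parameters, and the example 0.15 g threshold are all assumptions.

```python
def should_shut_off(local_pga_g, eew_alert, pga_threshold_g=0.15,
                    break_detected=False):
    """Cut gas flow at a district regulator if the local accelerometer
    exceeds its threshold, an EEW alert has arrived, or a pipeline
    break is detected. The 0.15 g threshold is illustrative only."""
    return bool(local_pga_g >= pga_threshold_g or eew_alert or break_detected)

# Examples: strong local shaking, an EEW alert only, and quiet conditions.
print(should_shut_off(0.20, eew_alert=False))  # True
print(should_shut_off(0.02, eew_alert=True))   # True
print(should_shut_off(0.02, eew_alert=False))  # False
```

Combining a local exceedance check with the EEW message mirrors the abstract's design: the early-warning path buys seconds, while the on-site accelerometer and break detectors act as a fail-safe.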