WorldWideScience

Sample records for term earthquake prediction

  1. Four Examples of Short-Term and Imminent Prediction of Earthquakes

    Science.gov (United States)

    Zeng, Zuoxun; Liu, Genshen; Wu, Dabin; Sibgatulin, Victor

    2014-05-01

    We show here four examples of short-term and imminent prediction of earthquakes in China last year: the Nima earthquake (Ms5.2), the Minxian earthquake (Ms6.6), the Nantou earthquake (Ms6.7) and the Dujiangyan earthquake (Ms4.1). Imminent prediction of the Nima earthquake (Ms5.2): based on a comprehensive analysis of the prediction by Victor Sibgatulin using natural electromagnetic pulse anomalies, the prediction by Song Song and Song Kefu using observation of a precursory halo, and his own observation of the locations of degasification of the earth in Naqu, Tibet, the first author predicted, at 0:54 on May 8th, 2013, an earthquake of around Ms6 within 10 days in the area of the degasification point (31.5N, 89.0E). He supplied another degasification point (31N, 86E) for the epicenter prediction at 8:34 of the same day. At 18:54:30 on May 15th, 2013, an earthquake of Ms5.2 occurred in Nima County, Naqu, China. Imminent prediction of the Minxian earthquake (Ms6.6): at 7:45 on July 22nd, 2013, an earthquake of magnitude Ms6.6 occurred at the border between Minxian and Zhangxian of Dingxi City (34.5N, 104.2E), Gansu Province. We review the imminent prediction process and its basis for this earthquake using the fingerprint method. Curves of 9 or 15 anomalous components versus time can be output by the SW monitor for earthquake precursors; these components include geomagnetism, geoelectricity, crustal stresses, resonance and crustal inclination. When we compress the time axis, the output curves become distinct geometric images. The precursor images differ for earthquakes in different regions, while alike or similar images correspond to earthquakes in a certain region. According to seven years of observation of the precursor images and their corresponding earthquakes, we usually obtain the fingerprint 6 days before the corresponding earthquake. The magnitude prediction requires comparison between the amplitudes of the fingerprints from the same

  2. Stabilizing intermediate-term medium-range earthquake predictions

    International Nuclear Information System (INIS)

    Kossobokov, V.G.; Romashkova, L.L.; Panza, G.F.; Peresan, A.

    2001-12-01

    A new scheme for the application of the intermediate-term medium-range earthquake prediction algorithm M8 is proposed. The scheme accounts for the natural distribution of seismic activity, eliminates the subjectivity in the positioning of the areas of investigation and provides additional stability of the predictions with respect to the original variant. According to the retroactive testing in Italy and adjacent regions, this improvement is achieved without any significant change of the alarm volume in comparison with the results published so far. (author)

  3. Long-term predictability of regions and dates of strong earthquakes

    Science.gov (United States)

    Kubyshen, Alexander; Doda, Leonid; Shopin, Sergey

    2016-04-01

    Results on the long-term predictability of strong earthquakes are discussed. It is shown that the dates of earthquakes with M>5.5 can be determined several months in advance of the event, while the magnitude and region of an approaching earthquake can be specified within a month before the event. The number of M6+ earthquakes expected to occur during the analyzed year is determined using a special sequence diagram of seismic activity for the century time frame; this date analysis can be performed with a lead time of 15-20 years, and the data are verified by a monthly sequence diagram of seismic activity. The number of strong earthquakes expected to occur in the analyzed month is determined by several methods having different prediction horizons. The days of potential earthquakes with M5.5+ are determined using astronomical data: earthquakes occur on days of oppositions of Solar System planets (when they are arranged in a single line), and the strongest earthquakes occur at particular locations of the "Sun-Solar System barycenter" vector in the ecliptic plane. The details of this astronomical multivariate indicator still require further research, but its practical significance is confirmed in practice. Another empirical indicator of an approaching M6+ earthquake is a synchronous variation of meteorological parameters: an abrupt decrease of the minimum daily temperature, an increase of relative humidity, and an abrupt change of atmospheric pressure (RAMES method). The time difference between the predicted and actual dates is no more than one day. This indicator is registered 104 days before the earthquake, so it was named Harmonic 104, or H-104. This fact looks paradoxical, but the works of A. Sytinskiy and V. Bokov on the correlation of global atmospheric circulation and seismic events provide a physical basis for it. Also, 104 days is a quarter of the Chandler period, so this fact gives insight into the correlation between the anomalies of Earth orientation

  4. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  5. Earthquake prediction theory and its relation to precursors

    International Nuclear Information System (INIS)

    Negarestani, A.; Setayeshi, S.; Ghannadi-Maragheh, M.; Akasheh, B.

    2001-01-01

    Since we do not have enough knowledge about the physics of earthquakes, the study of seismic precursors plays an important role in earthquake prediction. Earthquake prediction is a science that discusses precursory phenomena during the seismogenic process, investigates the correlations and associations among them and the intrinsic relation between precursors and the seismogenic process, judges the seismic status comprehensively and finally makes the prediction. There are two approaches to earthquake prediction. The first is to study the physics of the seismogenic process and to determine the parameters of the process based on source theories; the second is to use seismic precursors. In this paper the theory of earthquakes is reviewed. We also study this theory using models of earthquake origin and the relation between the seismogenic process and the various accompanying precursory phenomena. Earthquake prediction is divided into three categories: long-term, medium-term and short-term. We study anomalous seismic behavior, the electric field, crustal deformation, gravity, the Earth's magnetism, changes in groundwater level, groundwater geochemistry and changes in radon gas emission. Finally, it is concluded that there is a correlation between radon gas emission and earthquake phenomena. Some examples of actual data processing in this area are also given.

  6. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

    Science.gov (United States)

    Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

    2012-01-01

    The behavior of individual events in repeating earthquake sequences in California, Taiwan and Japan is better predicted by a model with fixed inter-event time or fixed slip than by the time- and slip-predictable models of earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript, which shows similar results for laboratory experiments, we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or a fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, although its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the time and slip of these events are predicted quite well by fixed-slip and fixed-recurrence models, so in some sense they are time- and slip-predictable. While fixed-recurrence and fixed-slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.
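
The distinction between the competing models can be sketched numerically. Below, a synthetic repeating sequence with nearly constant slip and recurrence (illustrative data, not the study's catalogs) is fit by a fixed-recurrence predictor and by a time-predictable one that scales the next interval with the preceding slip; all parameter values are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
slips = 1.0 + 0.05 * rng.standard_normal(n)      # quasi-constant slip per event (m)
intervals = 2.0 + 0.1 * rng.standard_normal(n)   # quasi-constant recurrence (yr)

# Fixed-recurrence model: the next interval is just the mean past interval.
fixed_pred = np.full(n - 1, intervals[:-1].mean())

# Time-predictable model: the next interval scales with the preceding slip,
# calibrated so that its mean matches the observed mean interval.
k = intervals.mean() / slips.mean()
tp_pred = k * slips[:-1]

obs = intervals[1:]
rms_fixed = float(np.sqrt(np.mean((obs - fixed_pred) ** 2)))
rms_tp = float(np.sqrt(np.mean((obs - tp_pred) ** 2)))
print(f"RMS error, fixed recurrence: {rms_fixed:.3f} yr")
print(f"RMS error, time-predictable: {rms_tp:.3f} yr")
```

When the slip carries no information about the next interval, as in this synthetic case, the slip term only injects noise into the forecast, which is the sense in which the fixed models can win.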

  7. Statistical short-term earthquake prediction.

    Science.gov (United States)

    Kagan, Y Y; Knopoff, L

    1987-06-19

    A statistical procedure, derived from a theoretical model of fracture growth, is used to identify a foreshock sequence while it is in progress. As a predictor, the procedure reduces the average uncertainty in the rate of occurrence for a future strong earthquake by a factor of more than 1000 when compared with the Poisson rate of occurrence. About one-third of all main shocks with local magnitude greater than or equal to 4.0 in central California can be predicted in this way, starting from a 7-year database with a lower magnitude cutoff of 1.5. The time scale of such predictions is of the order of a few hours to a few days for foreshocks in the magnitude range from 2.0 to 5.0.
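
The "factor of more than 1000" is a probability gain, the ratio of the conditional event rate during an alarm to the long-term Poisson background rate. A minimal sketch with assumed rates (the numbers below are hypothetical, not the paper's catalog values):

```python
# Probability gain of an alarm over the Poisson background (illustrative numbers).
background_rate = 0.5 / 365.0   # assumed: strong shocks per day, long-term average
alarm_rate = 2.0                # assumed: expected shocks per day while an alarm is on

gain = alarm_rate / background_rate
print(f"probability gain ≈ {gain:.0f}x")   # on the order of 10^3
```

Gains of this magnitude are what make short foreshock-based alarms informative despite their brief duration.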

  8. Feasibility study of short-term earthquake prediction using ionospheric anomalies immediately before large earthquakes

    Science.gov (United States)

    Heki, K.; He, L.

    2017-12-01

    We showed that positive and negative electron density anomalies emerge above the fault immediately before it ruptures, 40/20/10 minutes before Mw 9/8/7 earthquakes (Heki, 2011 GRL; Heki and Enomoto, 2013 JGR; He and Heki, 2017 JGR). These signals are stronger for earthquakes with larger Mw and under higher background vertical TEC (total electron content) (Heki and Enomoto, 2015 JGR). The epicenter and the positive and negative anomalies align along the local geomagnetic field (He and Heki, 2016 GRL), suggesting that electric fields within the ionosphere are responsible for the anomalies (Kuo et al., 2014 JGR; Kelley et al., 2017 JGR). Here we consider the next Nankai Trough earthquake, which may occur within a few tens of years in Southwest Japan, and discuss whether we could recognize its preseismic signatures in TEC by real-time observations with GNSS. During high geomagnetic activity, large-scale traveling ionospheric disturbances (LSTID) often propagate from the auroral ovals toward mid-latitude regions and leave signatures similar to preseismic anomalies. This is a main obstacle to using preseismic TEC changes for practical short-term earthquake prediction. In this presentation, we show that the same anomalies appeared 40 minutes before the mainshock above northern Australia, the geomagnetically conjugate point of the 2011 Tohoku-oki earthquake epicenter. This not only demonstrates that electric fields play a role in making the preseismic TEC anomalies, but also offers a possibility to discriminate preseismic anomalies from those caused by LSTID. By monitoring TEC in conjugate areas in the two hemispheres, we can recognize anomalies with simultaneous onset as those caused by within-ionosphere electric fields (e.g. preseismic anomalies, night-time MSTID) and anomalies without simultaneous onset as gravity-wave-origin disturbances (e.g. LSTID, daytime MSTID).

  9. Quantitative Earthquake Prediction on Global and Regional Scales

    International Nuclear Information System (INIS)

    Kossobokov, Vladimir G.

    2006-01-01

    The Earth is a hierarchy of volumes of different sizes. Driven by planetary convection, these volumes are involved in joint and relative movement. The movement is controlled by a wide variety of processes on and around the fractal mesh of boundary zones, and it produces earthquakes. This hierarchy of movable volumes composes a large non-linear dynamical system. Prediction of such a system, in the sense of extrapolating its trajectory into the future, is futile. However, upon coarse-graining, integral empirical regularities emerge, opening possibilities of prediction in the sense of the commonly accepted consensus definition worked out in 1976 by the US National Research Council. Understanding the hierarchical nature of the lithosphere and its dynamics, based on systematic monitoring and on evidence of its unified space-energy similarity at different scales, helps to avoid basic errors in earthquake prediction claims, and suggests rules and recipes for adequate classification, comparison and optimization of earthquake predictions. The approach has already led to the design of a reproducible intermediate-term middle-range earthquake prediction technique. Its real-time testing, aimed at prediction of the largest earthquakes worldwide, has proved beyond any reasonable doubt the effectiveness of practical earthquake forecasting. In the first approximation, the accuracy is about 1-5 years and 5-10 times the anticipated source dimension. Further analysis allows reducing the spatial uncertainty down to 1-3 source dimensions, although at the cost of additional failures-to-predict. Despite the limited accuracy, considerable damage could be prevented by timely, knowledgeable use of the existing predictions and earthquake prediction strategies. The December 26, 2004 Indian Ocean disaster seems to be the first indication that the methodology, designed for prediction of M8.0+ earthquakes, can be rescaled for prediction of both smaller magnitude earthquakes (e.g., down to M5.5+ in Italy) and


  11. Testing earthquake prediction algorithms: Statistically significant advance prediction of the largest earthquakes in the Circum-Pacific, 1992-1997

    Science.gov (United States)

    Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.

    1999-01-01

    Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first; then, the areas of alarm are reduced by MSc, at the cost of missing some earthquakes in this second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc identified the locations of four of them correctly. The space-time volume of the alarms is 36% and 18%, respectively, when estimated with a normalized product measure of the empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% for both M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40%, and five were predicted by M8-MSc in 13%, of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change of seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reverse faults. The predictions are fully reproducible; the algorithms M8 and MSc were published in complete formal definitions before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8. Phys. Earth Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction. J. Geophys. Res. 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier

  12. Earthquake prediction in Japan and natural time analysis of seismicity

    Science.gov (United States)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    The M9 super-giant earthquake with huge tsunami devastated East Japan on 11 March 2011, causing more than 20,000 casualties and serious damage to the Fukushima nuclear plant. This earthquake was predicted neither in the short term nor in the long term. Seismologists were shocked because it was not even considered possible at the East Japan subduction zone. However, it was not the only unpredicted earthquake: throughout several decades of the National Earthquake Prediction Project, not a single earthquake was predicted. In reality, practically no effective research has been conducted on the most important task, short-term prediction. This happened because the Japanese national project was devoted to the construction of elaborate seismic networks, which was not the best way toward short-term prediction. After the Kobe disaster, in order to parry the mounting criticism of their history of no success, they defiantly changed their policy to "stop aiming at short-term prediction because it is impossible and concentrate resources on fundamental research", which meant obtaining more funding for no-prediction research. The public were, and are, not informed about this change. Obviously, earthquake prediction will be possible only when reliable precursory phenomena are caught, and we have insisted that this will most likely be done through non-seismic means such as geochemical/hydrological and electromagnetic monitoring. Admittedly, the lack of convincing precursors for the M9 super-giant earthquake has an adverse effect on this case, although its epicenter was far offshore, out of the range of the operating monitoring systems. In this presentation, we show a new possibility of finding remarkable precursory signals, ironically, in ordinary seismological catalogs. In the frame of the new time domain termed natural time, an order parameter of seismicity, κ1, has been introduced. This is the variance of natural time χ weighted by the normalised energy release at χ. In the case that Seismic Electric Signals
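
The order parameter named in the abstract has a compact published definition that can be sketched directly: for N events with energies E_k, the natural time of the k-th event is χ_k = k/N, and κ1 is the variance of χ weighted by the normalised energies p_k = E_k / ΣE. The implementation below is a minimal illustration of that formula, not the authors' analysis code.

```python
import numpy as np

def kappa1(energies):
    """Natural-time order parameter: energy-weighted variance of chi_k = k/N."""
    e = np.asarray(energies, dtype=float)
    n = len(e)
    chi = np.arange(1, n + 1) / n        # natural time of each event
    p = e / e.sum()                      # normalised energy release
    return float(np.sum(p * chi**2) - np.sum(p * chi)**2)

# With equal energies, kappa_1 reduces to the plain variance of k/N,
# which approaches 1/12 ≈ 0.0833 for large N.
print(kappa1([1.0] * 100))
```

In the natural-time literature, departures of κ1 from this uniform-seismicity value are what signal the approach to criticality.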

  13. Short-term and long-term earthquake occurrence models for Italy: ETES, ERS and LTST

    Directory of Open Access Journals (Sweden)

    Maura Murru

    2010-11-01

    This study describes three earthquake occurrence models as applied to the whole Italian territory, to assess the occurrence probabilities of future (M ≥ 5.0) earthquakes: two short-term (24-hour) models, and one long-term (5- and 10-year) model. The first model for short-term forecasts is a purely stochastic epidemic-type earthquake sequence (ETES) model. The second short-term model is an epidemic rate-state (ERS) forecast based on a model that is physically constrained by applying the Dieterich rate-state constitutive law to earthquake clustering. The third forecast is based on a long-term stress-transfer (LTST) model that considers the perturbations of earthquake probability for interacting faults by static Coulomb stress changes. These models have been submitted to the Collaboratory for the Study of Earthquake Predictability (CSEP) for forecast testing for Italy (ETH Zurich), and they were locked down to test their validity on real data in a future setting starting from August 1, 2009.
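
Epidemic-type models such as ETES build their short-term rates on the Omori-Utsu aftershock decay. A minimal sketch of the triggered-rate kernel follows; the parameter values K, c and p are illustrative assumptions, not the calibrated Italian values.

```python
def omori_utsu_rate(t, K=0.05, c=0.01, p=1.1):
    """Omori-Utsu law: triggered-event rate (events/day) t days after a mainshock.
    K, c, p are illustrative; real models fit them per region/catalog."""
    return K / (t + c) ** p

# The rate decays roughly as a power law in time since the mainshock.
print(omori_utsu_rate(1.0), omori_utsu_rate(10.0))
```

In a full ETES/ETAS model, the total rate at any time sums one such kernel per past earthquake, with K scaled by each event's magnitude.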

  14. VAN method of short-term earthquake prediction shows promise

    Science.gov (United States)

    Uyeda, Seiya

    Although optimism prevailed in the 1970s, the present consensus on earthquake prediction appears to be quite pessimistic. However, short-term prediction based on geoelectric potential monitoring has stood the test of time in Greece for more than a decade [Varotsos and Kulhanek, 1993; Lighthill, 1996]. The method used is called the VAN method. The geoelectric potential changes constantly due to causes such as magnetotelluric effects, lightning, rainfall, leakage from manmade sources, and electrochemical instabilities of electrodes. All of this noise must be eliminated before preseismic signals are identified, if they exist at all. The VAN group apparently accomplished this task for the first time. They installed multiple short (100-200 m) dipoles with different lengths in both north-south and east-west directions and long (1-10 km) dipoles in appropriate orientations at their stations (one of their mega-stations, Ioannina, for example, now has 137 dipoles in operation) and found that practically all of the noise could be eliminated by applying a set of criteria to the data.

  15. Intermediate-term earthquake prediction and seismic zoning in Northern Italy

    International Nuclear Information System (INIS)

    Panza, G.F.; Orozova Stanishkova, I.; Costa, G.; Vaccari, F.

    1993-12-01

    The algorithm CN for intermediate-term earthquake prediction has been applied to an area in Northern Italy, chosen according to a recently proposed seismotectonic model. Earthquakes with magnitude ≥ 5.4 occur in the area with significant frequency, and their occurrence is predicted by algorithm CN. Therefore, a seismic hazard analysis has been performed using a deterministic procedure based on the computation of complete synthetic seismograms. The results are summarized in a map giving the distribution of peak ground acceleration, but the complete time series are also available and can be used by civil engineers in the design of new seismo-resistant constructions and in the retrofitting of existing ones. This risk-reduction action should be intensified in connection with warnings issued on the basis of the forward predictions made by CN. (author). Refs, 7 figs, 1 tab

  16. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  17. The USGS plan for short-term prediction of the anticipated Parkfield earthquake

    Science.gov (United States)

    Bakun, W.H.

    1988-01-01

    Aside from the goal of better understanding the Parkfield earthquake cycle, it is the intention of the U.S. Geological Survey to attempt to issue a warning shortly before the anticipated earthquake. Although short-term earthquake warnings are not yet generally feasible, the wealth of information available for the previous significant Parkfield earthquakes suggests that if the next earthquake follows the pattern of "characteristic" Parkfield shocks, such a warning might be possible. Focusing on earthquake precursors reported for the previous "characteristic" shocks, particularly the 1934 and 1966 events, the USGS developed a plan in late 1985 on which to base earthquake warnings for Parkfield, and has assisted state, county, and local officials in the Parkfield area to prepare a coordinated, reasonable response to a warning, should one be issued.

  18. Earthquake prediction with electromagnetic phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Hayakawa, Masashi, E-mail: hayakawa@hi-seismo-em.jp [Hayakawa Institute of Seismo Electomagnetics, Co. Ltd., University of Electro-Communications (UEC) Incubation Center, 1-5-1 Chofugaoka, Chofu Tokyo, 182-8585 (Japan); Advanced Wireless & Communications Research Center, UEC, Chofu Tokyo (Japan); Earthquake Analysis Laboratory, Information Systems Inc., 4-8-15, Minami-aoyama, Minato-ku, Tokyo, 107-0062 (Japan); Fuji Security Systems. Co. Ltd., Iwato-cho 1, Shinjyuku-ku, Tokyo (Japan)

    2016-02-01

    Short-term earthquake (EQ) prediction is defined as prospective prediction on a time scale of about one week, and it is considered one of the most important and urgent topics for human beings. If such short-term prediction is realized, casualties will be drastically reduced. Unlike conventional seismic measurement, we have proposed the use of electromagnetic phenomena as precursors to EQs, and an extensive amount of progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper reviews short-term EQ prediction, including the myth of the impossibility of EQ prediction by seismometers, the reasons why we are interested in electromagnetics, the history of seismo-electromagnetics, ionospheric perturbation as the most promising candidate for EQ prediction, the future of EQ predictology from the two standpoints of a practical science and a pure science, and finally a brief summary.

  19. Earthquake predictions using seismic velocity ratios

    Science.gov (United States)

    Sherburne, R. W.

    1979-01-01

    Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions for the reduction of human and economic losses, and the value of long-range earthquake prediction for planning, are obvious. Not as clear are the long-range economic and social impacts of earthquake prediction on a specific area. The general consensus among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency.

  20. Earthquake prediction by Kina Method

    International Nuclear Information System (INIS)

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has been one of the earliest desires of man, and scientists have long worked hard to predict earthquakes. The results of these efforts can generally be divided into two methods of prediction: 1) statistical methods, and 2) empirical methods. In the first, earthquakes are predicted using statistics and probabilities, while the second utilizes a variety of precursors for earthquake prediction; the latter is more time-consuming and costly. However, neither method has fully satisfied us up to now. In this paper a new method, entitled the 'Kiana Method', is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method the electrical and magnetic precursors are measured in an area; the time and the magnitude of a future earthquake are then calculated using electrical formulas, in particular those for electrical capacitors. In this method, by daily measurement of electrical resistance in an area, we determine whether or not the area is capable of producing an earthquake in the future. If the result is positive, then the occurrence time and the magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  1. Earthquake Prediction in a Big Data World

    Science.gov (United States)

    Kossobokov, V. G.

    2016-12-01

    The digital revolution, which started just about 15 years ago, has already pushed the global information storage capacity past 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provide unprecedented opportunities for enhancing studies of the Earth System. However, they also open wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task, and it implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regretfully, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and of short-term earthquake forecasting (StEF), the claims of a high potential of the method are based on a flawed application of statistics and are therefore hardly suitable for communication to decision makers. Self-testing must be done in advance of any claim to predict hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies, in particular the error diagram introduced by G.M. Molchan in the early 1990s and the Seismic Roulette null hypothesis as a metric of the alerted space, are evident. The set of errors, i.e. the rates of failure and of the alerted space-time volume, can easily be compared to random guessing, a comparison that permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function.
This and other information obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if these are reliable but not necessarily perfect, with related recommendations on the level of risks for decision making in regard to engineering design, insurance
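The error-diagram comparison described above can be sketched in a few lines. This is a minimal illustration of the idea, not Molchan's implementation; the function name, variable names, and the toy alarm intervals are ours:

```python
# Minimal sketch of one point on a Molchan error diagram, as used to compare an
# alarm strategy against random guessing. Names and toy values are illustrative.
def molchan_point(quake_times, alarm_intervals, total_duration):
    """Return (miss rate nu, alerted fraction tau) for one alarm strategy."""
    def on_alert(t):
        return any(start <= t < end for start, end in alarm_intervals)

    hits = sum(1 for t in quake_times if on_alert(t))
    nu = 1 - hits / len(quake_times)  # rate of failures to predict
    tau = sum(end - start for start, end in alarm_intervals) / total_duration
    return nu, tau

# Three target quakes, two caught by alarm windows covering 20% of the time:
nu, tau = molchan_point([12.0, 40.0, 75.0], [(10.0, 20.0), (70.0, 80.0)], 100.0)
```

Random guessing lies on the diagonal nu = 1 - tau; a strategy is informative only if its points fall significantly below that line, which is the comparison the abstract recommends making before any claim of predictive skill.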

  2. Foreshock sequences and short-term earthquake predictability on East Pacific Rise transform faults.

    Science.gov (United States)

    McGuire, Jeffrey J; Boettcher, Margaret S; Jordan, Thomas H

    2005-03-24

    East Pacific Rise transform faults are characterized by high slip rates (more than ten centimetres a year), predominantly aseismic slip and maximum earthquake magnitudes of about 6.5. Using recordings from a hydroacoustic array deployed by the National Oceanic and Atmospheric Administration, we show here that East Pacific Rise transform faults also have a low number of aftershocks and high foreshock rates compared to continental strike-slip faults. The high ratio of foreshocks to aftershocks implies that such transform-fault seismicity cannot be explained by seismic triggering models in which there is no fundamental distinction between foreshocks, mainshocks and aftershocks. The foreshock sequences on East Pacific Rise transform faults can be used to predict (retrospectively) earthquakes of magnitude 5.4 or greater, in narrow spatial and temporal windows and with a high probability gain. The predictability of such transform earthquakes is consistent with a model in which slow slip transients trigger earthquakes, enrich their low-frequency radiation and accommodate much of the aseismic plate motion.
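The probability gain reported for such retrospective foreshock-based predictions can be illustrated with a back-of-envelope calculation; the function name and the numbers below are toy values, not the paper's results:

```python
# Sketch of the probability gain of an alarm-based prediction scheme: the
# earthquake rate inside alarm windows divided by the unconditional background
# rate. Toy values only, not the paper's data.
def probability_gain(events_in_alarms, alarm_time, events_total, total_time):
    rate_in_alarms = events_in_alarms / alarm_time
    background_rate = events_total / total_time
    return rate_in_alarms / background_rate

# 4 of 10 mainshocks fall inside alarms covering 1% of the observation period:
gain = probability_gain(4, 0.01 * 1000.0, 10, 1000.0)  # gain of about 40
```

Narrow spatial and temporal alarm windows keep `alarm_time` small, which is what allows a modest hit count to translate into a high gain.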

  3. Prospective testing of Coulomb short-term earthquake forecasts

    Science.gov (United States)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily and allows daily updates of the models. However, lots can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be computed without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of

  4. Collaboratory for the Study of Earthquake Predictability

    Science.gov (United States)

    Schorlemmer, D.; Jordan, T. H.; Zechar, J. D.; Gerstenberger, M. C.; Wiemer, S.; Maechling, P. J.

    2006-12-01

    Earthquake prediction is one of the most difficult problems in physical science and, owing to its societal implications, one of the most controversial. The study of earthquake predictability has been impeded by the lack of an adequate experimental infrastructure---the capability to conduct scientific prediction experiments under rigorous, controlled conditions and evaluate them using accepted criteria specified in advance. To remedy this deficiency, the Southern California Earthquake Center (SCEC) is working with its international partners, which include the European Union (through the Swiss Seismological Service) and New Zealand (through GNS Science), to develop a virtual, distributed laboratory with a cyberinfrastructure adequate to support a global program of research on earthquake predictability. This Collaboratory for the Study of Earthquake Predictability (CSEP) will extend the testing activities of SCEC's Working Group on Regional Earthquake Likelihood Models, from which we will present first results. CSEP will support rigorous procedures for registering prediction experiments on regional and global scales, community-endorsed standards for assessing probability-based and alarm-based predictions, access to authorized data sets and monitoring products from designated natural laboratories, and software to allow researchers to participate in prediction experiments. CSEP will encourage research on earthquake predictability by supporting an environment for scientific prediction experiments that allows the predictive skill of proposed algorithms to be rigorously compared with standardized reference methods and data sets. It will thereby reduce the controversies surrounding earthquake prediction, and it will allow the results of prediction experiments to be communicated to the scientific community, governmental agencies, and the general public in an appropriate research context.

  5. Stigma in science: the case of earthquake prediction.

    Science.gov (United States)

    Joffe, Helene; Rossetto, Tiziana; Bradley, Caroline; O'Connor, Cliodhna

    2018-01-01

    This paper explores how earthquake scientists conceptualise earthquake prediction, particularly given the conviction of six earthquake scientists for manslaughter (subsequently overturned) on 22 October 2012 for having given inappropriate advice to the public prior to the L'Aquila earthquake of 6 April 2009. In the first study of its kind, semi-structured interviews were conducted with 17 earthquake scientists and the transcribed interviews were analysed thematically. The scientists primarily denigrated earthquake prediction, showing strong emotive responses and distancing themselves from earthquake 'prediction' in favour of 'forecasting'. Earthquake prediction was regarded as impossible and harmful. The stigmatisation of the subject is discussed in the light of research on boundary work and stigma in science. The evaluation reveals how mitigation becomes the more favoured endeavour, creating a normative environment that disadvantages those who continue to pursue earthquake prediction research. Recommendations are made for communication with the public on earthquake risk, with a focus on how scientists portray uncertainty. © 2018 The Author(s). Disasters © Overseas Development Institute, 2018.

  6. Intermediate-term medium-range earthquake prediction algorithm M8: A new spatially stabilized application in Italy

    International Nuclear Information System (INIS)

    Romashkova, L.L.; Kossobokov, V.G.; Peresan, A.; Panza, G.F.

    2001-12-01

    A series of experiments based on the intermediate-term earthquake prediction algorithm M8 has been performed for the retrospective simulation of forward predictions in the Italian territory, with the aim of designing an experimental routine for real-time predictions. These experiments revealed two main difficulties for the application of M8 in Italy. The first is that regional catalogues are usually limited in space. The second concerns a certain arbitrariness and instability with respect to the positioning of the circles of investigation. Here we design a new scheme for the application of the algorithm M8 that is less subjective and less sensitive to the position of the circles of investigation. To perform this test, we consider a recent revision of the Italian catalogue, named UCI2001, composed of CCI1996, NEIC and ALPOR data for the period 1900-1985 and updated with NEIC data thereafter; this compilation reduces the spatial heterogeneity of the data at the boundaries of Italy. The new variant of the M8 algorithm application reduces the number of spurious alarms and increases the reliability of predictions. As a result, three out of four earthquakes with magnitude M_max larger than 6.0 are predicted in the retrospective simulation of forward prediction during the period 1972-2001, with a space-time volume of alarms comparable to that obtained with the non-stabilized variant of the M8 algorithm in Italy. (author)

  7. Can Vrancea earthquakes be accurately predicted from unusual bio-system behavior and seismic-electromagnetic records?

    International Nuclear Information System (INIS)

    Enescu, D.; Chitaru, C.; Enescu, B.D.

    1999-01-01

    The relevance of bio-seismic research for the short-term prediction of strong Vrancea earthquakes is underscored. Unusual animal behavior before and during Vrancea earthquakes is described and illustrated for the individual case of the major earthquake of March 4, 1977. Several hypotheses to account for the uncommon behavior of bio-systems in relation to earthquakes in general, and strong Vrancea earthquakes in particular, are discussed in the second section. We recall that promising preliminary results concerning the identification of seismic-electromagnetic precursor signals have been obtained in the Vrancea seismogenic area using special, highly sensitive equipment. The need to correlate bio-seismic and seismic-electromagnetic research is evident. Further investigations are suggested and urgent steps are proposed in order to achieve a successful short-term prediction of strong Vrancea earthquakes. (authors)

  8. Testing for the 'predictability' of dynamically triggered earthquakes in The Geysers geothermal field

    Science.gov (United States)

    Aiken, Chastity; Meng, Xiaofeng; Hardebeck, Jeanne

    2018-03-01

    The Geysers geothermal field is well known for being susceptible to dynamic triggering of earthquakes by large distant earthquakes, owing to the introduction of fluids for energy production. Yet, it is unknown whether dynamic triggering of earthquakes is 'predictable' or whether dynamic triggering could lead to a potential hazard for energy production. In this paper, our goal is to investigate the characteristics of triggering and the physical conditions that promote triggering, to determine whether or not triggering is in any way foreseeable. We find that, at present, triggering in The Geysers is not easily 'predictable' in terms of when and where, based on observable physical conditions. However, triggered earthquake magnitude correlates positively with peak imparted dynamic stress, and larger dynamic stresses tend to trigger sequences similar to mainshock-aftershock sequences. Thus, we may be able to 'predict' what size earthquakes to expect at The Geysers following a large distant earthquake.

  9. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    Science.gov (United States)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of an earthquake's epicenter is one difficult challenge; the others are its timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity, and the reported FDL method belongs to this category. The basis of the FDL method is the creation of FDL future dates based on the onset dates of significant earthquakes, the assumption being that each earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported, with sample earthquakes since 1900. Using clustering methods, it has been shown that significant earthquakes correlate with planetary trigger dates (Sun conjunct Moon, Moon opposite Sun, Moon conjunct or opposite the North or South Nodes). In order to test the improvement of the method we used all earthquakes of magnitude 8R and above recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3 days, i.e. ±1 day of the target date) and for earthquakes above 6.5R. The successes are counted for each of the 86 earthquake seeds, and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with the earthquake date, in which case the FDL method coincides with the MFDL. Based on the MFDL method we present a prediction method capable of predicting global events or localized earthquakes, and we discuss the accuracy of the method as far as its prediction and location parts are concerned. We show example calendar-style predictions for global events as well as for the Greek region using
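The hit-rate bookkeeping described above (a hit is an earthquake within ±1 day of a predicted target date) can be sketched as follows; the function name and example dates are illustrative, not from the paper:

```python
from datetime import date

# Sketch of hit-rate counting for calendar-style predictions: a predicted target
# date scores a hit if an earthquake occurs within +-1 day of it (3-day window).
def hit_rate(target_dates, quake_dates, window_days=1):
    hits = sum(
        any(abs((q - t).days) <= window_days for q in quake_dates)
        for t in target_dates
    )
    return hits / len(target_dates)

# Two predicted dates, one matched by a quake the following day:
rate = hit_rate([date(2000, 1, 10), date(2000, 2, 1)], [date(2000, 1, 11)])  # 0.5
```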

  10. Three Millennia of Seemingly Time-Predictable Earthquakes, Tell Ateret

    Science.gov (United States)

    Agnon, Amotz; Marco, Shmuel; Ellenblum, Ronnie

    2014-05-01

    Among various idealized recurrence models of large earthquakes, the "time-predictable" model has a straightforward mechanical interpretation, consistent with simple friction laws. On a time-predictable fault, the time interval between an earthquake and its predecessor is proportional to the slip during the predecessor. The alternative "slip-predictable" model states that the slip during earthquake rupture is proportional to the preceding time interval. Verifying these models requires extended records of high-precision data for both the timing and the amount of slip. The precision of paleoearthquake data can rarely confirm or rule out predictability, and recent papers argue for either time- or slip-predictable behavior. The Ateret site, on the trace of the Dead Sea fault at the Jordan Gorge segment, offers unique precision for determining space-time patterns. Five consecutive slip events, each associated with deformed and offset sets of walls, are correlated with historical earthquakes. Two correlations are based on detailed archaeological, historical, and numismatic evidence. The other three are tentative. The offsets of three of the events are determined with high precision; the other two are not as certain. Accepting all five correlations, the fault exhibits a striking time-predictable behavior, with a long-term slip rate of 3 mm/yr. However, the 30 October 1759 ~0.5 m rupture predicts a subsequent rupture along the Jordan Gorge toward the end of the last century. We speculate that earthquakes on secondary faults (the 25 November 1759 on the Rachaya branch and the 1 January 1837 on the Roum branch, both M≥7) have disrupted the 3 kyr time-predictable pattern.
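The time-predictable recurrence rule has a one-line arithmetic form: with a long-term slip rate v, the expected wait after an event with slip u is u/v. A minimal sketch, using the abstract's 3 mm/yr rate; the function name and example values are hypothetical:

```python
# One-line arithmetic form of the time-predictable model: the wait after an
# earthquake is proportional to the slip in that earthquake.
def next_event_year(event_year, slip_mm, slip_rate_mm_per_yr=3.0):
    """Expected year of the next rupture on a time-predictable fault."""
    return event_year + slip_mm / slip_rate_mm_per_yr

# A hypothetical 300 mm slip event in 1900 at a 3 mm/yr long-term rate:
year = next_event_year(1900, 300.0)  # 2000.0
```

The slip-predictable alternative simply inverts the roles: slip in the next event is proportional to the elapsed time, u = v * (t - t_prev).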

  11. Is It Possible to Predict Strong Earthquakes?

    Science.gov (United States)

    Polyakov, Y. S.; Ryabinin, G. V.; Solovyeva, A. B.; Timashev, S. F.

    2015-07-01

    The possibility of earthquake prediction is one of the key open questions in modern geophysics. We propose an approach based on the analysis of common short-term candidate precursors (2 weeks to 3 months prior to a strong earthquake) with the subsequent processing of brain activity signals generated in specific types of rats (kept in laboratory settings) that reportedly sense an impending earthquake a few days prior to the event. We illustrate the identification of short-term precursors using the groundwater sodium-ion concentration data in the time frame from 2010 to 2014 (a major earthquake occurred on 28 February 2013) recorded at two different sites in the southeastern part of the Kamchatka Peninsula, Russia. The candidate precursors are observed as synchronized peaks in the nonstationarity factors, introduced within the flicker-noise spectroscopy framework for signal processing, for the high-frequency components of both time series. These peaks correspond to local reorganizations of the underlying geophysical system that are believed to precede strong earthquakes. The rodent brain activity signals are selected as potential "immediate" (up to 2 weeks) deterministic precursors because of recent scientific reports confirming that rodents sense imminent earthquakes and the population-genetic model of Kirschvink (Bull Seismol Soc Am 90, 312-323, 2000) showing how a reliable genetic seismic escape response system may have developed over a period of several hundred million years in certain animals. The use of brain activity signals, such as electroencephalograms, in contrast to conventional abnormal animal behavior observations, enables one to apply the standard "input-sensor-response" approach to determine what input signals trigger specific seismic escape brain activity responses.

  12. Strong ground motion prediction using virtual earthquakes.

    Science.gov (United States)

    Denolle, M A; Dunham, E M; Prieto, G A; Beroza, G C

    2014-01-24

    Sedimentary basins increase the damaging effects of earthquakes by trapping and amplifying seismic waves. Simulations of seismic wave propagation in sedimentary basins capture this effect; however, there exists no method to validate these results for earthquakes that have not yet occurred. We present a new approach for ground motion prediction that uses the ambient seismic field. We apply our method to a suite of magnitude 7 scenario earthquakes on the southern San Andreas fault and compare our ground motion predictions with simulations. Both methods find strong amplification and coupling of source and structure effects, but they predict substantially different shaking patterns across the Los Angeles Basin. The virtual earthquake approach thus offers a new means of predicting long-period strong ground motion.

  13. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    Science.gov (United States)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not common understanding in China when the M 7.9 Wenchuan earthquake struck the Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue an imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save lives in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less damage.

  14. 76 FR 69761 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2011-11-09

    ... DEPARTMENT OF THE INTERIOR U.S. Geological Survey National Earthquake Prediction Evaluation... 96-472, the National Earthquake Prediction Evaluation Council (NEPEC) will hold a 1\\1/2\\-day meeting.... Geological Survey on proposed earthquake predictions, on the completeness and scientific validity of the...

  15. 76 FR 19123 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2011-04-06

    ... Earthquake Prediction Evaluation Council (NEPEC) AGENCY: U.S. Geological Survey, Interior. ACTION: Notice of meeting. SUMMARY: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council... proposed earthquake predictions, on the completeness and scientific validity of the available data related...

  16. The October 1992 Parkfield, California, earthquake prediction

    Science.gov (United States)

    Langbein, J.

    1992-01-01

    A magnitude 4.7 earthquake occurred near Parkfield, California, on October 20, 1992, at 05:28 UTC (October 19 at 10:28 p.m. local, or Pacific Daylight, Time). This moderate shock, interpreted as a potential foreshock of a damaging earthquake on the San Andreas fault, triggered long-standing federal, state and local government plans to issue a public warning of an imminent magnitude 6 earthquake near Parkfield. Although the predicted earthquake did not take place, sophisticated suites of instruments deployed as part of the Parkfield Earthquake Prediction Experiment recorded valuable data associated with an unusual series of events. This article describes the geological aspects of these events, which occurred near Parkfield in October 1992. The accompanying article, an edited version of a press conference by Richard Andrews, the Director of the California Office of Emergency Services (OES), describes the governmental response to the prediction.

  17. 78 FR 64973 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2013-10-30

    ... DEPARTMENT OF THE INTERIOR Geological Survey [GX14GG009950000] National Earthquake Prediction...: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council (NEPEC) will hold a... Council shall advise the Director of the U.S. Geological Survey on proposed earthquake predictions, on the...

  18. CN earthquake prediction algorithm and the monitoring of the future strong Vrancea events

    International Nuclear Information System (INIS)

    Moldoveanu, C.L.; Radulian, M.; Novikova, O.V.; Panza, G.F.

    2002-01-01

    The strong earthquakes originating at intermediate depth in the Vrancea region (located in the SE corner of the highly bent Carpathian arc) represent one of the most important natural disasters able to induce heavy effects (a high toll of casualties and extensive damage) in the Romanian territory. The occurrence of these earthquakes is irregular, but not infrequent. Their effects are felt over a large territory, from Central Europe to Moscow and from Greece to Scandinavia. The largest cultural and economic center exposed to the seismic risk due to the Vrancea earthquakes is Bucharest. This metropolitan area (230 km² in extent) is characterized by the presence of 2.5 million inhabitants (10% of the country's population) and by a considerable number of high-risk structures and infrastructures. The best way to face strong earthquakes is to mitigate the seismic risk using the two complementary approaches of (a) antiseismic design of structures and infrastructures (able to withstand strong earthquakes without significant damage), and (b) strong earthquake prediction (in terms of alarm intervals declared for long-, intermediate- or short-term space- and time-windows). Intermediate-term medium-range earthquake prediction represents the most realistic target to be reached at the present state of knowledge. The alarm declared in this case extends over a time window of about one year or more, and a space window of a few hundred kilometers. In the case of Vrancea events the spatial uncertainty is much smaller, about 100 km. The main measures for the mitigation of the seismic risk allowed by intermediate-term medium-range prediction are: (a) verification of the stability of buildings and infrastructures, with reinforcement measures when required, (b) elaboration of emergency plans of action, and (c) scheduling of the main actions required to restore the normality of social and economic life after the earthquake. 
The paper presents the

  19. Earthquake Prediction Research In Iceland, Applications For Hazard Assessments and Warnings

    Science.gov (United States)

    Stefansson, R.

    Earthquake prediction research in Iceland, applications for hazard assessments and warnings. The first multinational earthquake prediction research project in Iceland was the European Council encouraged SIL project of the Nordic countries, 1988-1995. The path selected for this research was to study the physics of crustal processes leading to earthquakes. It was considered that small earthquakes, down to magnitude zero, were the most significant for this purpose, because of the detailed information which they provide both in time and space. The test area for the project was the earthquake-prone region of the South Iceland seismic zone (SISZ). The PRENLAB and PRENLAB-2 projects, 1996-2000, supported by the European Union, were a direct continuation of the SIL project, but with a more multidisciplinary approach. PRENLAB stands for "Earthquake prediction research in a natural laboratory". The basic objective was to advance our understanding in general of where, when and how dangerous earthquake motion might strike. Methods were developed to study crustal processes and conditions using microearthquake information, continuous GPS, InSAR, theoretical modelling, fault mapping and paleoseismology. New algorithms were developed for short-term warnings. A very useful short-term warning was issued twice in the year 2000: one for the sudden start of an eruption of the volcano Hekla on February 26, and the other 25 hours before the second (in a sequence of two) magnitude 6.6 (Ms) earthquake in the South Iceland seismic zone on June 21, with the correct location and approximate size. A formal short-term warning, although not released to the public, was also issued before a magnitude 5 earthquake in November 1998. The presentation will briefly describe what these warnings were based on. A general hazard assessment was presented in scientific journals 10-15 years ago, assessing within a few kilometers the location of the faults of the two 2000 earthquakes and suggesting

  20. 77 FR 53225 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2012-08-31

    ... DEPARTMENT OF THE INTERIOR Geological Survey [USGS-GX12GG00995NP00] National Earthquake Prediction... meeting. SUMMARY: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council... National Earthquake Information Center (NEIC), 1711 Illinois Avenue, Golden, Colorado 80401. The Council is...

  1. Moment-ratio imaging of seismic regions for earthquake prediction

    Science.gov (United States)

    Lomnitz, Cinna

    1993-10-01

    An algorithm for predicting large earthquakes is proposed. The reciprocal ratio (mri) of the residual seismic moment to the total moment release in a region is used for imaging seismic moment precursors. Peaks in mri predict recent major earthquakes, including the 1985 Michoacan, 1985 central Chile, and 1992 Eureka, California earthquakes.
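One plausible reading of the moment-ratio measure can be sketched as follows. This is our interpretation for illustration only; Lomnitz's exact definition of the residual moment may differ, and the magnitude cutoff below is an assumption:

```python
# Hedged sketch of a moment-ratio style measure: catalogue magnitudes are
# converted to scalar moments, and the reciprocal ratio of residual
# (small-event) moment to total moment release is formed, so the measure peaks
# where small events carry only a tiny share of the released moment.
def seismic_moment(mw):
    """Scalar seismic moment in N*m (standard Hanks-Kanamori relation)."""
    return 10 ** (1.5 * mw + 9.1)

def moment_ratio(mags, mainshock_cutoff=6.0):
    total = sum(seismic_moment(m) for m in mags)
    residual = sum(seismic_moment(m) for m in mags if m < mainshock_cutoff)
    return total / residual

mri = moment_ratio([5.0, 7.0])  # about 1001: one M 7 dwarfs the residual moment
```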

  2. Roles of Radon-222 and other natural radionuclides in earthquake prediction

    International Nuclear Information System (INIS)

    Smith, A.R.; Wollenberg, H.A.; Mosier, D.F.

    1980-01-01

    The concentration of ²²²Rn in subsurface waters is one of the natural parameters being investigated to help develop the capability to predict destructive earthquakes. Since 1966, scientists in several nations have sought to link radon variations with ongoing seismic activity, primarily through the dilatancy model for earthquake occurrence. Within the range of these studies, alpha-, beta-, and gamma-radiation detection techniques have been used in both discrete-sampling and continuous-monitoring programs. These measurement techniques are reviewed in terms of instrumentation adapted to seismic-monitoring purposes. A recent Lawrence Berkeley Laboratory study conducted in central California incorporated discrete sampling of wells in the aftershock area of the 1975 Oroville earthquake and continuous monitoring of water radon in a well on the San Andreas Fault. The results presented show short-term radon variations that may be associated with aftershocks, and diurnal changes that may reflect earth tidal forces.

  3. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    Science.gov (United States)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

    headquarters until 9 p.m.: families, school classes with and without teachers, civil protection groups, journalists. This initiative, built up in a few weeks, had a very large feedback, partly due to the media highlighting the presumed prediction. Although we could not rule out the possibility of a strong earthquake in central Italy (with effects in Rome), we tried to explain the meaning of short-term earthquake prediction vs. probabilistic seismic hazard assessment. Although many people remained fearful (many decided to take the day off and leave the town or stay in public parks), we contributed to reducing this feeling and therefore the social cost of this strange Roman day. Moreover, another lesson learned is that these (fortunately sporadic) circumstances, when people's attention is high, are important opportunities for science communication. We thank all the INGV colleagues who contributed to the May 11 Open Day, in particular the Press Office, the Educational and Outreach Laboratory, the Graphics Laboratory and SissaMedialab. P.S. No large earthquake happened.

  4. 75 FR 63854 - National Earthquake Prediction Evaluation Council (NEPEC) Advisory Committee

    Science.gov (United States)

    2010-10-18

    ... DEPARTMENT OF THE INTERIOR Geological Survey National Earthquake Prediction Evaluation Council...: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council (NEPEC) will hold a 2... proposed earthquake predictions, on the completeness and scientific validity of the available data related...

  5. Radon observation for earthquake prediction

    Energy Technology Data Exchange (ETDEWEB)

    Wakita, Hiroshi [Tokyo Univ. (Japan)]

    1998-12-31

    Systematic observation of groundwater radon for the purpose of earthquake prediction began in Japan in late 1973. Continuous observations are conducted at fixed stations using deep wells and springs. During the observation period, significant precursory changes, including those before the 1978 Izu-Oshima-kinkai (M7.0) earthquake, as well as numerous coseismic changes, were observed. At the time of the 1995 Kobe (M7.2) earthquake, significant changes in chemical components, including radon dissolved in groundwater, were observed near the epicentral region. Precursory changes are presumably caused by permeability changes due to micro-fracturing in basement rock or by migration of water from different sources during the preparation stage of earthquakes. Coseismic changes may be caused by seismic shaking and by changes in regional stress. Significant drops in radon concentration in groundwater have been observed after earthquakes at the KSM site. The occurrence of such drops appears to be time-dependent, and possibly reflects changes in the regional stress state of the observation area. The absence of radon drops seems to be correlated with periods of reduced regional seismic activity. Experience accumulated over the past two decades allows us to reach some conclusions: 1) changes in groundwater radon do occur prior to large earthquakes; 2) some sites are particularly sensitive to earthquake occurrence; and 3) the sensitivity changes over time. (author)

  6. Retrospective Evaluation of the Long-Term CSEP-Italy Earthquake Forecasts

    Science.gov (United States)

    Werner, M. J.; Zechar, J. D.; Marzocchi, W.; Wiemer, S.

    2010-12-01

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We considered the twelve time-independent earthquake forecasts among this set and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. Here, we present the results of tests that measure the consistency of the forecasts with the past observations. Besides being an evaluation of the submitted time-independent forecasts, this exercise provided insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between the robustness of results and experiment duration.

  7. The ordered network structure and prediction summary for M ≥ 7 earthquakes in Xinjiang region of China

    International Nuclear Information System (INIS)

    Men, Ke-Pei; Zhao, Kai

    2014-01-01

M ≥ 7 earthquakes have shown an obvious commensurability and orderliness in Xinjiang of China and its adjacent region since 1800. The main orderly values are 30 a × k (k = 1, 2, 3), 11-12 a, 41-43 a, 18-19 a, and 5-6 a. Guided by the information forecasting theory of Wen-Bo Weng and building on previous research results, we combine ordered network structure analysis with complex network technology, focus on a prediction summary of M ≥ 7 earthquakes using the ordered network structure, and add new information to further optimize the network, constructing the 2D and 3D ordered network structures of M ≥ 7 earthquakes. The network structure fully reveals the regularity of M ≥ 7 seismic activity in the study region during the past 210 years. On this basis, the Karakorum M7.1 earthquake in 1996, the M7.9 earthquake on the frontier of Russia, Mongolia, and China in 2003, and the two Yutian M7.3 earthquakes in 2008 and 2014 were predicted successfully. At the same time, a new prediction is presented: the next two M ≥ 7 earthquakes in this region will probably occur around 2019-2020 and 2025-2026. The results show that large earthquakes occurring in a defined region can be predicted, and that the method of ordered network structure analysis produces satisfactory results for the mid- to long-term prediction of M ≥ 7 earthquakes.

  8. Probabilistic approach to earthquake prediction.

    Directory of Open Access Journals (Sweden)

    G. D'Addezio

    2002-06-01

The evaluation of any earthquake forecast hypothesis requires the application of rigorous statistical methods. It implies a univocal definition of the model characterising the anomaly or precursor concerned, so that it can be objectively recognised in any circumstance and by any observer. A valid forecast hypothesis is expected to maximise successes and minimise false alarms. The probability gain associated with a precursor is also a popular way to estimate the quality of the predictions based on that precursor. Some scientists use a statistical approach based on the computation of the likelihood of an observed realisation of seismic events, and on the comparison of the likelihoods obtained under different hypotheses. This method can be extended to algorithms that allow the computation of the density distribution of the conditional probability of earthquake occurrence in space, time and magnitude. Whatever method is chosen for building up a new hypothesis, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this test could, however, be problematic for seismicity characterised by long-term recurrence intervals. Even using the historical record, which may span time windows ranging from a few centuries to a few millennia, we have a low probability of catching more than one or two events on the same fault. By extending the record of past earthquakes back in time up to several millennia, paleoseismology represents a great opportunity to study how earthquakes recur through time and thus to provide innovative contributions to time-dependent seismic hazard assessment. Sets of paleoseismologically dated earthquakes have been established for some faults in the Mediterranean area: the Irpinia fault in Southern Italy, the Fucino fault in Central Italy, the El Asnam fault in Algeria and the Skinos fault in Central Greece. By using the age of the

  9. Load-Unload Response Ratio and Accelerating Moment/Energy Release Critical Region Scaling and Earthquake Prediction

    Science.gov (United States)

    Yin, X. C.; Mora, P.; Peng, K.; Wang, Y. C.; Weatherley, D.

The main idea of the Load-Unload Response Ratio (LURR) is that when a system is stable, its response to loading corresponds to its response to unloading, whereas when the system is approaching an unstable state, the responses to loading and unloading become quite different. High LURR values and observations of Accelerating Moment/Energy Release (AMR/AER) prior to large earthquakes have led different research groups to suggest that intermediate-term earthquake prediction is possible, and imply that the LURR and AMR/AER observations may have a similar physical origin. To study this possibility, we conducted a retrospective examination of several Australian and Chinese earthquakes with magnitudes ranging from 5.0 to 7.9, including Australia's deadly Newcastle earthquake and the devastating Tangshan earthquake. Both LURR values and best-fit power-law time-to-failure functions were computed using data within a range of distances from the epicenter. As with the best-fit power-law fits in AMR/AER, the LURR value was optimal using data within a certain epicentral distance, implying a critical region for LURR. Furthermore, the LURR critical region size scales with mainshock magnitude and is similar to the AMR/AER critical region size. These results suggest a common physical origin for both the AMR/AER and LURR observations. Further research may provide clues that yield an understanding of this mechanism and help build a solid foundation for intermediate-term earthquake prediction.
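The LURR computation summarized above can be sketched in a few lines. This is a minimal illustration, not the authors' code: it assumes the catalogue has already been split into loading and unloading phases (in practice this is done from Earth-tide-induced stress on the fault plane), and it uses the standard Gutenberg-Richter energy relation to convert magnitudes into Benioff strain.

```python
import math

def benioff_strain(magnitude):
    """Benioff strain of one event: the square root of its seismic energy,
    using the Gutenberg-Richter relation log10 E = 1.5*M + 4.8 (E in joules)."""
    return math.sqrt(10 ** (1.5 * magnitude + 4.8))

def lurr(loading_mags, unloading_mags):
    """Load-Unload Response Ratio: cumulative Benioff strain released while
    tidal stress loads the fault, divided by that released while it unloads.
    Values near 1 indicate stability; high values suggest the region is
    approaching an unstable state."""
    x_plus = sum(benioff_strain(m) for m in loading_mags)
    x_minus = sum(benioff_strain(m) for m in unloading_mags)
    return x_plus / x_minus

# Toy catalogue: magnitudes of small events, split by tidal loading phase.
# The loading-phase release dominates here, so the ratio exceeds 1.
print(lurr([3.1, 2.8, 3.4, 2.9], [2.7, 2.6]))
```

In a real application the ratio would be tracked through time within trial epicentral distances, which is how the critical-region scaling discussed above is obtained.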

  10. Using remote sensing to predict earthquake impacts

    Science.gov (United States)

    Fylaktos, Asimakis; Yfantidou, Anastasia

    2017-09-01

Natural hazards like earthquakes can result in enormous property damage and human casualties, especially in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. In 2016, two seismic events occurred near Norcia (central Italy), leading to substantial loss of life and extensive damage to property, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel-1A and optical imagery from Landsat 8 to examine differences in topography with the aid of the multi-temporal monitoring technique, which is well suited to observing surface deformation. The database clusters information on the consequences of the earthquakes into groups, such as property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformations, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of such earthquakes. In the future, the database can be enriched with more regions and a wider variety of applications. For instance, it could be used to estimate the future impacts of earthquakes in several areas and to design a preliminary emergency model for immediate evacuation and quick recovery response. It is important to know how the surface moves, particularly in geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research we may assess the damage that could be caused in the future.

  11. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  12. Predicting Posttraumatic Stress Symptom Prevalence and Local Distribution after an Earthquake with Scarce Data.

    Science.gov (United States)

    Dussaillant, Francisca; Apablaza, Mauricio

    2017-08-01

After a major earthquake, the assignment of scarce mental health emergency personnel to different geographic areas is crucial to the effective management of the crisis. The scarce information available in the aftermath of a disaster may be valuable in helping predict which populations are in greatest need. The objectives of this study were to derive algorithms to predict posttraumatic stress (PTS) symptom prevalence and local distribution after an earthquake, and to test whether there are algorithms that require few input data yet remain reasonably predictive. A rich database of PTS symptoms, collected after Chile's 2010 earthquake and tsunami, was used. Several model specifications for the mean and centiles of the distribution of PTS symptoms, together with posttraumatic stress disorder (PTSD) prevalence, were estimated via linear and quantile regressions. The models varied in the set of covariates included. Adjusted R2 for the most liberal specifications (in terms of the number of covariates included) ranged from 0.62 to 0.74, depending on the outcome. When including only peak ground acceleration (PGA), poverty rate, and household damage in linear and quadratic form, predictive capacity was still good (adjusted R2 from 0.59 to 0.67). Information about local poverty, household damage, and PGA can thus be used as an aid to predict PTS symptom prevalence and local distribution after an earthquake, helping improve the assignment of mental health personnel to the affected localities. Dussaillant F, Apablaza M. Predicting posttraumatic stress symptom prevalence and local distribution after an earthquake with scarce data. Prehosp Disaster Med. 2017;32(4):357-367.
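The sparse specification described above (PGA, poverty rate, and household damage in linear and quadratic form) can be illustrated with an ordinary least-squares fit. The data below are hypothetical and purely for illustration; they are not the Chilean survey data, and the study itself also used quantile regressions for the centiles.

```python
import numpy as np

# Hypothetical per-locality data (illustrative only):
pga     = np.array([0.10, 0.25, 0.40, 0.55, 0.30, 0.65, 0.20, 0.50, 0.35, 0.45, 0.15, 0.60])  # peak ground accel. (g)
poverty = np.array([0.08, 0.15, 0.22, 0.30, 0.12, 0.35, 0.10, 0.28, 0.18, 0.25, 0.09, 0.33])  # poverty rate
damage  = np.array([0.05, 0.20, 0.35, 0.50, 0.25, 0.60, 0.10, 0.45, 0.30, 0.40, 0.08, 0.55])  # household damage share
pts     = np.array([0.06, 0.14, 0.24, 0.33, 0.17, 0.41, 0.09, 0.30, 0.21, 0.27, 0.07, 0.38])  # PTS symptom prevalence

# Design matrix: intercept plus linear and quadratic terms of the three covariates.
X = np.column_stack([np.ones_like(pga),
                     pga, poverty, damage,
                     pga**2, poverty**2, damage**2])
beta, *_ = np.linalg.lstsq(X, pts, rcond=None)

# Adjusted R2, penalizing for the six predictors.
resid = pts - X @ beta
r2 = 1 - np.sum(resid**2) / np.sum((pts - pts.mean())**2)
n, p = X.shape
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p)

# Predict PTS prevalence for a new locality (hypothetical covariate values).
x_new = np.array([1, 0.45, 0.20, 0.40, 0.45**2, 0.20**2, 0.40**2])
print(f"adjusted R2 = {adj_r2:.3f}, predicted prevalence = {x_new @ beta:.3f}")
```

With real survey data the fit would be looser; the point of the sketch is only the shape of the design matrix and the adjusted-R2 bookkeeping.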

  13. Prediction of strong earthquake motions on rock surface using evolutionary process models

    International Nuclear Information System (INIS)

    Kameda, H.; Sugito, M.

    1984-01-01

Stochastic process models are developed for the prediction of strong earthquake motions for engineering design purposes. Earthquake motions with nonstationary frequency content are modeled using the concept of evolutionary processes. Discussion is focused on earthquake motions on bedrock, which are important for the construction of nuclear power plants in seismic regions. On this basis, two earthquake motion prediction models are developed: one (EMP-IB Model) for prediction with given magnitude and epicentral distance, and the other (EMP-IIB Model) to account for successive fault ruptures and the site location relative to the fault of great earthquakes. (Author)

  14. A Deterministic Approach to Earthquake Prediction

    Directory of Open Access Journals (Sweden)

    Vittorio Sgrigna

    2012-01-01

The paper aims at giving suggestions for a deterministic approach to investigate possible earthquake prediction and warning. A fundamental contribution can come from observations and physical modeling of earthquake precursors, aiming to place the earthquake phenomenon within the framework of a unified theory able to explain the causes of its genesis, and the dynamics, rheology, and microphysics of its preparation, occurrence, postseismic relaxation, and interseismic phases. Studies based on combined ground and space observations of earthquake precursors are essential to address the issue. Unfortunately, what has been lacking up to now is the demonstration of a causal relationship, with explained physical processes, between data gathered simultaneously and continuously by space observations and ground-based measurements. To achieve this, modern and/or new methods and technologies have to be adopted. Coordinated space- and ground-based observations imply available test sites on the Earth's surface where ground data, collected by appropriate networks of instruments, can be correlated with space data detected on board Low-Earth-Orbit (LEO) satellites. Moreover, a strong new theoretical effort is necessary to try to understand the physics of the earthquake.

  15. Study of Earthquake Disaster Prediction System of Langfang city Based on GIS

    Science.gov (United States)

    Huang, Meng; Zhang, Dian; Li, Pan; Zhang, YunHui; Zhang, RuoFei

    2017-07-01

In view of China's need to improve its earthquake disaster prevention capability, this paper puts forward an implementation plan for an earthquake disaster prediction system for Langfang city based on GIS. Built on a GIS spatial database and using coordinate transformation, GIS spatial analysis and PHP development technology, the system applies a seismic damage factor algorithm to predict the damage to the city under earthquake disasters of different intensities. The system follows a B/S (browser/server) architecture and provides damage degree and spatial distribution results with two-dimensional visualization, comprehensive query and analysis, and efficient auxiliary decision-making functions, so that seismically weak parts of the city can be identified and warned rapidly. The system has transformed the city's earthquake disaster reduction work from static planning to dynamic management and improved the city's earthquake and disaster prevention capability.

  16. Some considerations regarding earthquake prediction - The case of Vrancea region -

    International Nuclear Information System (INIS)

    Enescu, Bogdan; Enescu, Dumitru

    2000-01-01

Earthquake prediction research has been conducted for over 100 years with no obvious success. In recent years, new concepts regarding earthquake dynamics have added another source of skepticism about the possibility of predicting earthquakes. However, there are some recognizable trends, refined in recent years, which may give rise to more reliable and solid approaches to this complex subject. In the light of these trends, emphasized by Aki, we try to analyze the new developments in the field, especially concerning the Vrancea region. (authors)

  17. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    Science.gov (United States)

    Kanamori, H

    1996-04-30

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.

  18. Prediction of earthquakes: a data evaluation and exchange problem

    Energy Technology Data Exchange (ETDEWEB)

    Melchior, Paul

    1978-11-15

    Recent experiences in earthquake prediction are recalled. Precursor information seems to be available from geodetic measurements, hydrological and geochemical measurements, electric and magnetic measurements, purely seismic phenomena, and zoological phenomena; some new methods are proposed. A list of possible earthquake triggers is given. The dilatancy model is contrasted with a dry model; they seem to be equally successful. In conclusion, the space and time range of the precursors is discussed in relation to the magnitude of earthquakes. (RWR)

  19. Applications of the gambling score in evaluating earthquake predictions and forecasts

    Science.gov (United States)

    Zhuang, Jiancang; Zechar, Jeremy D.; Jiang, Changsheng; Console, Rodolfo; Murru, Maura; Falcone, Giuseppe

    2010-05-01

This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, a forecaster who makes a prediction or forecast is assumed to have bet some of his reputation points. The reference model, which plays the role of the house, determines according to a fair rule how many reputation points the forecaster gains if he succeeds, and takes away the points bet if he loses. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. For discrete predictions, we apply this method to evaluate the performance of Shebalin's predictions made using the Reverse Tracing of Precursors (RTP) algorithm and of the predictions issued by the Annual Consultation Meeting on Earthquake Tendency held by the China Earthquake Administration. For the continuous case, we use it to compare the probability forecasts of seismicity in the Abruzzo region before and after the L'Aquila earthquake based on the ETAS model and the PPE model.
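For the discrete case, the fair rule described above can be sketched directly: a correct alarm against a reference probability p0 pays (1 - p0)/p0 points, a correct "no earthquake" call pays p0/(1 - p0), and a wrong call loses the point bet. This is a minimal sketch of that payoff structure, not the authors' implementation.

```python
def gambling_score(predictions, reference_probs):
    """Gambling score for binary alarm-based predictions.

    predictions: list of (alarm, occurred) pairs, one per space-time-magnitude bin.
    reference_probs: reference model's occurrence probability p0 for each bin.
    The forecaster bets one reputation point per bin; the fair rule pays more
    for successes that the reference model deems unlikely.
    """
    r = 0.0
    for (alarm, occurred), p0 in zip(predictions, reference_probs):
        if alarm and occurred:
            r += (1 - p0) / p0        # rewarded hit
        elif (not alarm) and (not occurred):
            r += p0 / (1 - p0)        # rewarded correct rejection
        else:
            r -= 1.0                  # lost bet (false alarm or miss)
    return r

# Three bins with a 10% reference probability each:
# one hit (earns 9), one correct rejection (earns 1/9), one false alarm (loses 1).
print(gambling_score([(True, True), (False, False), (True, False)],
                     [0.1, 0.1, 0.1]))  # about 8.11
```

Because the payoff is p0-weighted, a forecaster cannot accumulate reputation by trivially alarming on everything or by predicting only near-certain events.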

  20. Signals of ENPEMF Used in Earthquake Prediction

    Science.gov (United States)

    Hao, G.; Dong, H.; Zeng, Z.; Wu, G.; Zabrodin, S. M.

    2012-12-01

The signals of the Earth's natural pulse electromagnetic field (ENPEMF) are a combination of the abnormal crustal magnetic field pulses affected by earthquakes, the induced field of the Earth's endogenous magnetic field, the induced magnetic field of the exogenous variation magnetic field, geomagnetic pulsation disturbances, and other energy coupling processes between the Sun and the Earth. As an instantaneous disturbance of the variation field of natural geomagnetism, ENPEMF can be used to predict earthquakes. This theory was introduced by A. A. Vorobyov, who hypothesized that pulses can arise not only in the atmosphere but also within the Earth's crust, due to processes of tectonic-to-electric energy conversion (Vorobyov, 1970; Vorobyov, 1979). The global field time scale of ENPEMF signals is notably stable. Although the wave curves may not overlap completely in different regions, the smoothed diurnal ENPEMF patterns always exhibit the same trend per month. This feature is a good reference for observing abnormalities of the Earth's natural magnetic field in a specific region. The frequencies of ENPEMF signals generally lie in the kilohertz range; frequencies within 5-25 kHz can be applied to monitor earthquakes, and in Wuhan the best observation frequency is 14.5 kHz. Two special devices are placed along the S-N and W-E directions. A dramatic deviation of the pulse waveform obtained from the instruments relative to the normal reference envelope diagram indicates a high possibility of an earthquake. The proposed ENPEMF-based detection method can improve geodynamic monitoring and enrich earthquake prediction methods. Promising directions for further research concern the exact source composition of ENPEMF signals, the distinction between noise and useful signals, and the effects of the Earth's gravity tide and solid tidal waves. This method may also provide a promising application in

  1. Dim prospects for earthquake prediction

    Science.gov (United States)

    Geller, Robert J.

I was misquoted by C. Lomnitz's [1998] Forum letter (Eos, August 4, 1998, p. 373), which said: "I wonder whether Sasha Gusev [1998] actually believes that branding earthquake prediction a 'proven nonscience' [Geller, 1997a] is a paradigm for others to copy." Readers are invited to verify for themselves that neither "proven nonscience" nor any similar phrase was used by Geller [1997a].

  2. Monitoring of the future strong Vrancea events by using the CN formal earthquake prediction algorithm

    International Nuclear Information System (INIS)

    Moldoveanu, C.L.; Novikova, O.V.; Panza, G.F.; Radulian, M.

    2003-06-01

The preparation process of the strong subcrustal events originating in the Vrancea region, Romania, is monitored using an intermediate-term medium-range earthquake prediction method, the CN algorithm (Keilis-Borok and Rotwain, 1990). We present the results of the monitoring of the preparation of future strong earthquakes for the time interval from January 1, 1994 (1994.1.1), to January 1, 2003 (2003.1.1), using the updated catalogue of the Romanian local network. The database considered for the CN monitoring covers the period from 1966.3.1 to 2003.1.1 and the geographical rectangle 44.8 deg - 48.4 deg N, 25.0 deg - 28.0 deg E. The algorithm correctly identifies, by retrospective prediction, the Times of Increased Probability (TIPs) for all three strong earthquakes (M0 = 6.4) that occurred in Vrancea during this period. The cumulated duration of the TIPs represents 26.5% of the total period considered (1966.3.1-2003.1.1). The monitoring of current seismicity using the CN algorithm has been carried out since 1994. No strong earthquakes occurred from 1994.1.1 to 2003.1.1, but CN declared an extended false alarm from 1999.5.1 to 2000.11.1. No alarm was declared in the region on January 1, 2003, as can be seen from the TIPs diagram shown. (author)

  3. Short- and Long-Term Earthquake Forecasts Based on Statistical Models

    Science.gov (United States)

    Console, Rodolfo; Taroni, Matteo; Murru, Maura; Falcone, Giuseppe; Marzocchi, Warner

    2017-04-01

The epidemic-type aftershock sequence (ETAS) models have been used experimentally to forecast the space-time earthquake occurrence rate during the sequence that followed the 2009 L'Aquila earthquake and for the 2012 Emilia earthquake sequence. These forecasts represented the first two pioneering attempts to check the feasibility of providing operational earthquake forecasting (OEF) in Italy. After the 2009 L'Aquila earthquake the Italian Department of Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) for the development of the first official OEF in Italy, implemented for testing purposes by the newly established "Centro di Pericolosità Sismica" (CPS, the Seismic Hazard Center) at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). According to the ICEF guidelines, the system is open, transparent, reproducible and testable. The scientific information delivered by OEF-Italy is shaped in different formats according to the interested stakeholders, such as scientists, national and regional authorities, and the general public. Communication to the public is certainly the most challenging issue, and careful pilot tests are necessary to check the effectiveness of the communication strategy before opening the information to the public. With regard to long-term time-dependent earthquake forecasting, the application of a newly developed simulation algorithm to the Calabria region reproduced typical features of the time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.
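The core of any ETAS forecast is the model's conditional intensity: a background rate plus an Omori-law aftershock contribution from every past event, scaled exponentially by magnitude. The sketch below shows the standard temporal form with illustrative parameter values; these are not the CPS/INGV settings, and operational systems use the full space-time version.

```python
import math

def etas_intensity(t, history, mu=0.2, K=0.02, alpha=1.2, c=0.01, p=1.1, m0=3.0):
    """Conditional intensity lambda(t) of a temporal ETAS model:
    lambda(t) = mu + sum over past events (t_i, m_i) of
                K * exp(alpha * (m_i - m0)) / (t - t_i + c)**p,
    i.e. a background rate mu plus modified-Omori aftershock terms
    boosted exponentially by each parent's magnitude above threshold m0.
    Parameter values here are illustrative only."""
    rate = mu
    for t_i, m_i in history:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate

# Expected daily rate one day after an M 6.3 mainshock (hypothetical times/magnitudes)
# followed by an M 5.0 aftershock: well above the background rate of 0.2 events/day.
history = [(0.0, 6.3), (0.4, 5.0)]
print(etas_intensity(1.0, history))
```

A forecast for a coming interval is obtained by integrating this intensity over the interval (often by simulating synthetic catalogues), which is how the daily occurrence-rate maps mentioned above are produced.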

  4. Intermediate-term middle-range predictions in Italy: a review

    International Nuclear Information System (INIS)

    Peresan, A.; Kossobokov, V.; Romashkova, L.; Panza, G.F.

    2003-11-01

The Italian territory has been the object of several studies devoted to the analysis of seismicity and to research on earthquake precursors. Although a number of observations have been claimed to precede large earthquakes, only a few systematic studies have been carried out and almost no test of their performance is available up to now. In this paper we review the application to the Italian territory of two formally defined intermediate-term middle-range earthquake prediction algorithms, namely CN and M8S. The general methodology common to the two algorithms makes use of general concepts of pattern recognition that permit dealing with multiple sets of seismic precursors, and allows systematic monitoring of seismicity as well as widespread testing of prediction performance. Italy is the only region of moderate seismic activity where the M8S and CN algorithms are applied simultaneously for routine monitoring. Significant efforts have been made to minimize the intrinsic spatial uncertainty of the predictions and the subjectivity of the definition of the areas where precursors should be identified. Several experiments have been dedicated to assessing the robustness of the methodology against the unavoidable uncertainties in the data. With these results acquired, predictions have been routinely issued by the CN algorithm since January 1998 and by the M8S algorithm since January 2002. In July 2003 an experiment was launched for the real-time testing of M8S and CN predictions. (author)

  5. Implications of fault constitutive properties for earthquake prediction.

    Science.gov (United States)

    Dieterich, J H; Kilgore, B

    1996-04-30

The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip-rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, the scaling of Dc is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from the sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. In the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes, including the eventual mainshock. In the second, accelerating fault slip on the mainshock nucleation zone triggers the foreshocks.

  6. Tsunami Prediction and Earthquake Parameters Estimation in the Red Sea

    KAUST Repository

    Sawlan, Zaid A

    2012-12-01

Tsunami concerns have increased worldwide after the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami. Consequently, tsunami models have developed rapidly in the last few years. One of the advanced tsunami models is the GeoClaw tsunami model introduced by LeVeque (2011), which is adaptive and consistent. Because of the different sources of uncertainty in the model, observations are needed to improve model prediction through a data assimilation framework. The model inputs are earthquake parameters and topography. This thesis introduces a real-time tsunami forecasting method that combines the tsunami model with observations using a hybrid ensemble Kalman filter and ensemble Kalman smoother. The filter is used for state prediction, while the smoother estimates the earthquake parameters. This method reduces the error produced by uncertain inputs. In addition, a state-parameter EnKF is implemented to estimate the earthquake parameters. Although the number of observations is small, the estimated parameters generate a better tsunami prediction than the model alone. Methods and results of prediction experiments in the Red Sea are presented, and the prospect of developing an operational tsunami prediction system in the Red Sea is discussed.
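The ensemble Kalman filter at the heart of such an assimilation framework can be illustrated with a single stochastic analysis step. This is a generic textbook sketch, not the thesis's hybrid filter/smoother, and the toy state (two abstract source parameters observed through one noisy gauge) is purely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_std, H):
    """One stochastic EnKF analysis step.
    ensemble: (n_members, n_state) forecast ensemble;
    obs: (n_obs,) observation vector; H: (n_obs, n_state) linear operator.
    The Kalman gain is built from the ensemble covariances, and each member
    assimilates a perturbed copy of the observation."""
    n, _ = ensemble.shape
    Hx = ensemble @ H.T                         # predicted observations
    Xp = ensemble - ensemble.mean(axis=0)       # state anomalies
    Yp = Hx - Hx.mean(axis=0)                   # observation anomalies
    Pxy = Xp.T @ Yp / (n - 1)                   # state-obs covariance
    Pyy = Yp.T @ Yp / (n - 1) + obs_std**2 * np.eye(len(obs))
    K = Pxy @ np.linalg.inv(Pyy)                # Kalman gain
    perturbed = obs + obs_std * rng.standard_normal((n, len(obs)))
    return ensemble + (perturbed - Hx) @ K.T

# Toy example: estimate a 2-component state from one noisy gauge observing its sum.
truth = np.array([1.0, 2.0])
H = np.array([[1.0, 1.0]])
ens = truth + rng.standard_normal((50, 2))      # scattered prior ensemble
obs = H @ truth + 0.1 * rng.standard_normal(1)  # one noisy observation
ens = enkf_update(ens, obs, 0.1, H)             # analysis ensemble, pulled toward obs
```

In the thesis's setting the state would be the tsunami wave field, the observations sea-level gauges, and the same machinery (augmented with the parameters) yields the state-parameter EnKF mentioned above.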

  7. Time-predictable model applicability for earthquake occurrence in northeast India and vicinity

    Directory of Open Access Journals (Sweden)

    A. Panthi

    2011-03-01

Northeast India and its vicinity is one of the seismically most active regions in the world, where a few large and several moderate earthquakes have occurred in the past. In this study the region of northeast India has been considered for an earthquake generation model using earthquake data reported in the catalogues of the National Geophysical Data Center and the National Earthquake Information Center of the United States Geological Survey, and in the book prepared by Gupta et al. (1986), for the period 1906-2008. Events having a surface wave magnitude of Ms ≥ 5.5 were considered for statistical analysis. In this region, nineteen seismogenic sources were identified by the observation of clustering of earthquakes. It is observed that the time interval between two consecutive mainshocks depends upon the magnitude of the preceding mainshock (Mp) and not on that of the following mainshock (Mf). This result corroborates the validity of the time-predictable model in northeast India and its adjoining regions. A linear relation between the logarithm of the repeat time (T) of two consecutive events and the magnitude of the preceding mainshock is established in the form log T = cMp + a, where "c" is the positive slope of the line and "a" is a function of the minimum magnitude of the earthquakes considered. The values of the parameters "c" and "a" are estimated to be 0.21 and 0.35 in northeast India and its adjoining regions. The value of c, lower than the average, implies that earthquake occurrence in this region differs from that at plate boundaries. The result derived can be used for long-term seismic hazard estimation in the delineated seismogenic regions.
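The time-predictable relation with the estimated parameters can be turned directly into an expected repeat time. The sketch below assumes the logarithm is base 10 and T is in years, as is conventional for such relations; it is an illustration of the formula, not the authors' code.

```python
def repeat_time(mp, c=0.21, a=0.35):
    """Expected repeat time T (years) until the next mainshock in a
    seismogenic source, from log10(T) = c*Mp + a with the study's
    estimates c = 0.21 and a = 0.35 (Mp = preceding mainshock magnitude)."""
    return 10 ** (c * mp + a)

for mp in (5.5, 6.5, 7.5):
    print(f"Mp = {mp}: next mainshock expected after ~{repeat_time(mp):.0f} yr")
```

For preceding magnitudes of 5.5, 6.5 and 7.5 this gives roughly 32, 52 and 84 years, which illustrates the model's central point: the larger the preceding mainshock, the longer the wait until the next one.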

  8. Earthquake prediction rumors can help in building earthquake awareness: the case of May the 11th 2011 in Rome (Italy)

    Science.gov (United States)

    Amato, A.; Arcoraci, L.; Casarotti, E.; Cultrera, G.; Di Stefano, R.; Margheriti, L.; Nostro, C.; Selvaggi, G.; May-11 Team

    2012-04-01

    Banner headlines in an Italian newspaper on May 11, 2011 read: "Absenteeism boom in offices: the urban legend in Rome becomes psychosis". This was the effect of the prediction of a large-magnitude earthquake in Rome on May 11, 2011. The prediction was never officially released; it grew on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions and related them to earthquakes. Indeed, around May 11, 2011, there was a planetary alignment, and this increased the credibility of the earthquake prediction. Given the echo of this prediction, INGV decided to organize an Open Day at its headquarters in Rome on May 11 (the very day the earthquake was predicted to happen) to inform the public about Italian seismicity and earthquake physics. The Open Day was preceded two days earlier by a press conference attended by about 40 journalists from newspapers, local and national TV stations, press agencies and web news magazines. Hundreds of articles appeared in the following two days, advertising the May 11 Open Day. On May 11 the INGV headquarters was peacefully invaded by over 3,000 visitors from 9 am to 9 pm: families, students, civil protection groups and many journalists. The program included conferences on a wide variety of subjects (from the social impact of rumors to seismic risk reduction) and distribution of books and brochures, in addition to several activities: meetings with INGV researchers to discuss scientific issues, visits to the seismic monitoring room (staffed 24/7 all year), and guided tours through interactive exhibitions on earthquakes and the Earth's deep structure. During the same day, thirteen new videos were also posted on our youtube/INGVterremoti channel to explain the earthquake process and hazard, and to provide periodic real-time updates on seismicity in Italy. On May 11 no large earthquake happened in Italy. The initiative, built up in a few weeks, had a very large feedback

  9. An application of earthquake prediction algorithm M8 in eastern ...

    Indian Academy of Sciences (India)

    2Institute of Earthquake Prediction Theory and Mathematical Geophysics, ... located about 70 km from a preceding M7.3 earthquake that occurred in ... local extremes of the seismic density distribution, and in the third approach, CI centers were distributed ...... Bird P 2003 An updated digital model of plate boundaries;.

  10. Predicting Dynamic Response of Structures under Earthquake Loads Using Logical Analysis of Data

    Directory of Open Access Journals (Sweden)

    Ayman Abd-Elhamed

    2018-04-01

    Full Text Available In this paper, logical analysis of data (LAD) is used to predict the seismic response of building structures from captured dynamic responses. To prepare the data, computational simulations using a single-degree-of-freedom (SDOF) building model under different ground motion records are carried out. The selected excitation records are real and span different peak ground accelerations (PGA). The sensitivity of the seismic response, in terms of floor displacements, to variations in earthquake characteristics, such as soil class, characteristic period, time step of the records, peak ground displacement, and peak ground velocity, has also been considered. The dynamic equation of motion describing the building model and the applied earthquake load is presented and solved incrementally using the Runge-Kutta method. LAD then finds the characteristic patterns that lead to forecasts of the seismic response of building structures. The accuracy of LAD is compared to that of an artificial neural network (ANN), the best-known machine learning technique. Based on the conducted study, the proposed LAD model has proven to be an efficient technique to learn, simulate, and blindly predict the dynamic response behaviour of building structures subjected to earthquake loads.
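
    The simulation stage the abstract describes (an SDOF oscillator driven by a ground-motion record, integrated with Runge-Kutta) can be sketched minimally. The natural frequency, damping ratio, and input record below are illustrative assumptions, not the paper's values:

```python
import math

def sdof_response(accel, dt, omega=2 * math.pi, zeta=0.05):
    """Peak relative displacement of a linear SDOF oscillator
    u'' + 2*zeta*omega*u' + omega**2*u = -ag(t), integrated with
    classical 4th-order Runge-Kutta. omega (rad/s) and zeta are
    assumed illustrative values."""
    def f(state, ag):
        u, v = state
        return (v, -2 * zeta * omega * v - omega ** 2 * u - ag)

    u, v = 0.0, 0.0
    peak = 0.0
    for i in range(len(accel) - 1):
        ag0, ag1 = accel[i], accel[i + 1]
        agm = 0.5 * (ag0 + ag1)          # midpoint ground acceleration
        k1 = f((u, v), ag0)
        k2 = f((u + 0.5 * dt * k1[0], v + 0.5 * dt * k1[1]), agm)
        k3 = f((u + 0.5 * dt * k2[0], v + 0.5 * dt * k2[1]), agm)
        k4 = f((u + dt * k3[0], v + dt * k3[1]), ag1)
        u += dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6
        v += dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6
        peak = max(peak, abs(u))
    return peak

# toy pulse-like ground acceleration (m/s^2), dt = 0.01 s
record = [math.sin(2 * math.pi * t * 0.01) for t in range(200)]
```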

  11. Failures and suggestions in Earthquake forecasting and prediction

    Science.gov (United States)

    Sacks, S. I.

    2013-12-01

    Seismologists have had poor success in earthquake prediction. However, wide-ranging observations from earlier great earthquakes show that precursory data can exist. In particular, two aspects seem promising. In agreement with simple physical modeling, b-values decrease in highly loaded fault zones for years before failure. Potentially more usefully, in high-stress regions the breakdown of dilatant patches leading to failure can yield observations related to expelled water. The volume increase (dilatancy) caused by high shear stresses decreases the pore pressure. Eventually, water flows back in, restoring the pore pressure, promoting failure and expelling the extra water. Of course, in a generally stressed region there may be many small patches that fail, as observed before the 1975 Haicheng earthquake. Only in the last few days before the major event does most of the dilatancy breakdown occur in the fault zone itself, as for the destructive 1976 Tangshan event. 'Water release' effects were observed before the great 1923 Kanto earthquake, the 1984 Yamasaki event, the 1975 Haicheng and 1976 Tangshan earthquakes, and also the 1995 Kobe earthquake. While there are obvious difficulties with water-release observations, not least because there is currently no observational network anywhere, the historical data do suggest some promise if we broaden our approach to this difficult subject.

  12. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    Science.gov (United States)

    Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.

    2014-01-01

    Predicting how large an earthquake can be, and where and when it will strike, remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467
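
    The Gutenberg-Richter law mentioned here says that magnitudes above a completeness threshold are exponentially distributed; its b-value can be estimated from a catalog with Aki's maximum-likelihood formula. This is standard seismological practice, not a method specific to this paper:

```python
import math
import random

def b_value(mags, mc):
    """Aki (1965) maximum-likelihood b-value estimate for magnitudes at or
    above the completeness magnitude mc: b = log10(e) / (mean(M) - mc)."""
    m = [x for x in mags if x >= mc]
    return math.log10(math.e) / (sum(m) / len(m) - mc)

# synthetic Gutenberg-Richter catalog with true b = 1.0 above mc = 2.0
rng = random.Random(42)
catalog = [2.0 - math.log10(rng.random()) for _ in range(20000)]
```

    On the synthetic catalog the estimate recovers the true value to within sampling error (roughly b / sqrt(N)).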

  13. A numerical simulation strategy on occupant evacuation behaviors and casualty prediction in a building during earthquakes

    Science.gov (United States)

    Li, Shuang; Yu, Xiaohui; Zhang, Yanjuan; Zhai, Changhai

    2018-01-01

    Casualty prediction in a building during earthquakes supports economic loss estimation in the performance-based earthquake engineering methodology. Although post-earthquake observations reveal that evacuation affects the number of occupant casualties during earthquakes, few current studies consider occupant movements in the building in casualty prediction procedures. To bridge this knowledge gap, a numerical simulation method using a refined cellular automaton model is presented, which can describe various occupant dynamic behaviors and building dimensions. The simulation of occupant evacuation is verified against a recorded evacuation process from a school classroom during the real-life 2013 Ya'an earthquake in China. The occupant casualties in a building under earthquakes are evaluated by coupling the finite element simulation of the building collapse process, the occupant evacuation simulation, and the casualty occurrence criteria, with time and space synchronization. A case study of casualty prediction in a building during an earthquake demonstrates the effect of occupant movements on casualty prediction.
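
    The idea of a cellular automaton evacuation model can be illustrated with a deliberately tiny 1-D corridor: cells are empty or occupied, and each update moves occupants one cell toward the exit. This is a toy sketch of the general technique, far simpler than the paper's refined 2-D model:

```python
def evacuate_step(cells):
    """One update of a toy 1-D CA corridor with the exit at index 0:
    the occupant at the exit leaves, then each occupant advances one
    cell toward the exit if that cell is free (sweep from the exit side)."""
    new = list(cells)
    new[0] = 0                              # occupant at the exit cell exits
    for i in range(1, len(new)):
        if new[i] == 1 and new[i - 1] == 0:
            new[i - 1], new[i] = 1, 0       # move one cell toward the exit
    return new

def steps_to_evacuate(cells):
    """Number of updates until the corridor is empty."""
    n = 0
    while any(cells):
        cells = evacuate_step(cells)
        n += 1
    return n
```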

  14. VLF/LF Radio Sounding of Ionospheric Perturbations Associated with Earthquakes

    Directory of Open Access Journals (Sweden)

    Masashi Hayakawa

    2007-07-01

    Full Text Available It has recently been recognized that the ionosphere is very sensitive to seismic effects, and the detection of ionospheric perturbations associated with earthquakes seems very promising for short-term earthquake prediction. We have proposed the use of VLF/LF (very low frequency, 3-30 kHz / low frequency, 30-300 kHz) radio sounding of seismo-ionospheric perturbations. A brief history of the use of subionospheric VLF/LF propagation for short-term earthquake prediction is given, followed by a significant finding of an ionospheric perturbation for the Kobe earthquake in 1995. After reviewing previous VLF/LF results, we present the latest VLF/LF findings: one is the statistical correlation of ionospheric perturbations with earthquakes, and the second is a case study of the Sumatra earthquake of December 2004, indicating the spatial scale and dynamics of the ionospheric perturbation for this earthquake.

  15. A forecast experiment of earthquake activity in Japan under Collaboratory for the Study of Earthquake Predictability (CSEP)

    Science.gov (United States)

    Hirata, N.; Yokoi, S.; Nanjo, K. Z.; Tsuruoka, H.

    2012-04-01

    One major focus of the current Japanese earthquake prediction research program (2009-2013), which is now integrated with the research program for the prediction of volcanic eruptions, is to move toward creating testable earthquake forecast models. For this purpose we started an experiment in forecasting earthquake activity in Japan under the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) through an international collaboration. We established the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan and to conduct verifiable prospective tests of their model performance, and began the first earthquake forecast testing experiment in Japan within the CSEP framework. We use the earthquake catalogue maintained and provided by the Japan Meteorological Agency (JMA). The experiment consists of 12 categories: 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called "All Japan," "Mainland," and "Kanto." A total of 105 models were submitted and are currently under the CSEP official suite of tests for evaluating forecast performance. The experiments have been completed for 92 rounds of the 1-day class, 6 rounds of the 3-month class, and 3 rounds of the 1-year class. In the 1-day testing class, all models passed all of CSEP's evaluation tests in more than 90% of rounds. The results of the 3-month testing class also gave us new knowledge concerning statistical forecasting models. All models showed good performance in magnitude forecasting; on the other hand, the observed spatial distribution is hardly consistent with most models when many earthquakes occur at a single spot. We are now preparing a 3-D forecasting experiment with a depth range of 0 to 100 km in the Kanto region. The testing center is also improving the evaluation system for the 1-day class experiment so that forecasting and testing results are finished within one day. The special issue of 1st part titled Earthquake Forecast

  16. Predicting the Maximum Earthquake Magnitude from Seismic Data in Israel and Its Neighboring Countries.

    Science.gov (United States)

    Last, Mark; Rabinowitz, Nitzan; Leonard, Gideon

    2016-01-01

    This paper explores several data mining and time series analysis methods for predicting the magnitude of the largest seismic event in the next year, based on previously recorded seismic events in the same region. The methods are evaluated on a catalog of 9,042 earthquake events, which took place between 01/01/1983 and 31/12/2010 in the area of Israel and its neighboring countries. The data was obtained from the Geophysical Institute of Israel. Each earthquake record in the catalog is associated with one of 33 seismic regions. The data was cleaned by removing foreshocks and aftershocks. In our study, we have focused on the ten most active regions, which account for more than 80% of the total number of earthquakes in the area. The goal is to predict whether the maximum earthquake magnitude in the following year will exceed the median of maximum yearly magnitudes in the same region. Since the analyzed catalog includes only 28 years of complete data, the last five annual records of each region (referring to the years 2006-2010) are kept for testing, while the previous annual records are used for training. The predictive features are based on the Gutenberg-Richter ratio as well as on some new seismic indicators based on moving averages of the number of earthquakes in each area. The new predictive features prove to be much more useful than the indicators traditionally used in the earthquake prediction literature. The most accurate result (AUC = 0.698) is reached by the Multi-Objective Info-Fuzzy Network (M-IFN) algorithm, which takes into account the association between two target variables: the number of earthquakes and the maximum earthquake magnitude during the same year.
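
    The target definition and the moving-average rate features can be sketched as follows. The window length and the exact feature form are assumptions, since the abstract does not give the precise indicator definitions:

```python
from statistics import median

def label_targets(yearly_max):
    """Binary targets per the paper's goal: 1 if a year's maximum
    magnitude exceeds the region's median yearly maximum, else 0."""
    med = median(yearly_max)
    return [1 if m > med else 0 for m in yearly_max]

def rate_features(yearly_counts, window=3):
    """Trailing moving average of yearly earthquake counts
    (the window length is an assumed illustrative choice)."""
    feats = []
    for i in range(window - 1, len(yearly_counts)):
        feats.append(sum(yearly_counts[i - window + 1:i + 1]) / window)
    return feats

labels = label_targets([4.1, 5.3, 4.8, 5.9, 4.0])   # median is 4.8
rates = rate_features([12, 9, 15, 21, 18])
```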

  17. Earthquake recurrence models fail when earthquakes fail to reset the stress field

    Science.gov (United States)

    Tormann, Thessa; Wiemer, Stefan; Hardebeck, Jeanne L.

    2012-01-01

    Parkfield's regularly occurring M6 mainshocks, about every 25 years, stoked seismologists' hopes for over two decades of successfully predicting an earthquake of significant size. However, with the longest known inter-event time of 38 years, the latest M6 in the series (28 Sep 2004) did not conform to any of the applied forecast models, questioning once more the predictability of earthquakes in general. Our study investigates the spatial pattern of b-values along the Parkfield segment through the seismic cycle and documents a stably stressed structure. The forecast rate of M6 earthquakes based on Parkfield's microseismicity b-values corresponds well to the observed rate. We interpret the observed b-value stability in terms of the evolution of the stress field in the area: the M6 Parkfield earthquakes do not fully unload the stress on the fault, explaining why time-recurrent models fail. We present the 1989 M6.9 Loma Prieta earthquake as a counterexample: it did release a significant portion of the stress along its fault segment and yielded a substantial change in b-values.

  18. Long-term impact of earthquakes on sleep quality.

    Science.gov (United States)

    Tempesta, Daniela; Curcio, Giuseppe; De Gennaro, Luigi; Ferrara, Michele

    2013-01-01

    We investigated the impact of the magnitude 6.3 2009 L'Aquila (Italy) earthquake on standardized self-report measures of sleep quality (Pittsburgh Sleep Quality Index, PSQI) and the frequency of disruptive nocturnal behaviours (Pittsburgh Sleep Quality Index-Addendum, PSQI-A) two years after the natural disaster. Self-reported sleep quality was assessed in 665 L'Aquila citizens exposed to the earthquake and compared with a different sample (n = 754) of L'Aquila citizens tested 24 months before the earthquake. In addition, the sleep quality and disruptive nocturnal behaviours (DNB) of people exposed to the traumatic experience were compared with those of people who, in the same period, lived in areas between 40 and 115 km from the earthquake epicenter (n = 3574). The comparison between L'Aquila citizens before and after the earthquake showed a significant deterioration of sleep quality after exposure to the trauma. In addition, two years after the earthquake, L'Aquila citizens showed the highest PSQI scores and the highest incidence of DNB compared to subjects living in the surroundings. Interestingly, above-threshold PSQI scores were found in participants living within 70 km of the epicenter, while trauma-related DNB was found in people living within 40 km. Multiple regressions confirmed that proximity to the epicenter is predictive of sleep disturbances and DNB, and also suggested a possible mediating effect of depression on PSQI scores. The psychological effects of an earthquake may thus be much more pervasive and longer-lasting than its building destruction, lasting for years and involving a much larger population. Reduced sleep quality and an increased frequency of DNB after two years may be a risk factor for the development of depression and posttraumatic stress disorder.

  19. A mathematical model for predicting earthquake occurrence ...

    African Journals Online (AJOL)

    We consider the continental crust under damage. We use microseism observations from many seismic stations around the world, established to study the time series of continental-crust activity, with a view to predicting the possible time of occurrence of an earthquake. We consider microseism time series ...

  20. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    NARCIS (Netherlands)

    Cheong, S.A.; Tan, T.L.; Chen, C.-C.; Chang, W.-L.; Liu, Z.; Chew, L.Y.; Sloot, P.M.A.; Johnson, N.F.

    2014-01-01

    Predicting how large an earthquake can be, and where and when it will strike, remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data.

  1. EPOS1 - a multiparameter measuring system for earthquake prediction research

    Energy Technology Data Exchange (ETDEWEB)

    Streil, T.; Oeser, V. [SARAD GmbH, Dresden (Germany); Heinicke, J.; Koch, U.; Wiegand, J.

    1998-12-31

    The approach to earthquake prediction via geophysical, geochemical and hydrological measurements is a long and winding road. Nevertheless, the results show progress in the field (e.g. Kobe). This progress is also a result of a new generation of measuring equipment. SARAD has developed a versatile measuring system (EPOS1) based on the experience and recent results of different research groups. It is able to record selected parameters suitable for earthquake prediction research. A micro-computer system handles data exchange, data management and control, and is connected to a modular sensor system. Sensor modules can be selected according to the actual needs at the measuring site. (author)

  2. Fault Branching and Long-Term Earthquake Rupture Scenario for Strike-Slip Earthquake

    Science.gov (United States)

    Klinger, Y.; CHOI, J. H.; Vallage, A.

    2017-12-01

    Careful examination of the surface ruptures of large continental strike-slip earthquakes reveals that, for the majority of earthquakes, at least one major branch is involved in the rupture pattern. Often, branching is either related to the location of the epicenter or located toward the end of the rupture, possibly related to the stopping of the rupture. In this work, we examine large continental earthquakes that show significant branches at different scales and for which the ground surface rupture has been mapped in great detail. In each case, the rupture conditions are described, including dynamic parameters, past earthquake history, and regional stress orientation, to see whether the dynamic stress field would a priori favor branching. In one case we show that rupture propagation and branching are directly impacted by preexisting geological structures, which serve as pathways for a rupture attempting to propagate out of its shear plane. At larger scale, we show that in some cases rupturing a branch might be systematic, hampering the development of a larger seismic rupture. Long-term geomorphology hints at the existence of a strong asperity in the zone where the rupture branched off the main fault: there, no evidence of throughgoing rupture can be seen along the main fault, while the branch is well connected to it. This set of observations suggests that for specific configurations, some rupture scenarios involving systematic branching are more likely than others.

  3. Statistical validation of earthquake related observations

    Science.gov (United States)

    Kossobokov, V. G.

    2011-12-01

    The confirmed fractal nature of earthquakes and of their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable or, conversely, delicately designed models. The widespread practice of deceptive modeling, considered a "reasonable proxy" of the natural seismic process, leads to seismic hazard assessment of unknown quality, whose errors propagate non-linearly into the inflicted estimates of risk and, eventually, into unexpected societal losses of unacceptable level. Studies aimed at the forecast/prediction of earthquakes must include validation in retrospective (at least) and, eventually, prospective tests. In the absence of such control, a suggested "precursor/signal" remains a "candidate" whose link to the target seismic event is a model assumption. Predicting in advance is the only decisive test of forecasts/predictions; therefore, the score-card of any "established precursor/signal", represented by the empirical probabilities of alarms and failures-to-predict achieved in prospective testing, must prove statistical significance by rejecting the null hypothesis of random coincidental occurrence in advance of target earthquakes. We reiterate the so-called "Seismic Roulette" null hypothesis as the most adequate undisturbed random alternative accounting for the empirical spatial distribution of earthquakes: (i) consider a roulette wheel with as many sectors as the number of earthquake locations in a sample catalog representing the seismic locus, one sector per location; (ii) make your bet according to the prediction (i.e., determine which locations are inside the area of alarm, and put one chip in each of the corresponding sectors); (iii) Nature turns the wheel; (iv) accumulate statistics of wins and losses along with the number of chips spent. If a precursor in charge of prediction exposes an imperfection of Seismic Roulette then, having in mind
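
    A Monte Carlo form of the Seismic Roulette test can be sketched as follows: randomly re-place the target events over the catalog's locations and count how often chance alone matches the alarm at least as well as observed. The locations, alarm set, and targets below are hypothetical toy inputs:

```python
import random

def seismic_roulette_pvalue(locations, alarm, targets, n_trials=5000, seed=1):
    """Fraction of random placements of the target events over the
    catalog's locations that hit the alarmed set at least as often as
    actually observed (a Monte Carlo stand-in for the roulette wheel)."""
    rng = random.Random(seed)
    hits = sum(loc in alarm for loc in targets)
    worse = 0
    for _ in range(n_trials):
        sim = rng.choices(locations, k=len(targets))
        if sum(loc in alarm for loc in sim) >= hits:
            worse += 1
    return worse / n_trials

locations = list(range(100))   # 100 catalog locations = 100 wheel sectors
alarm = set(range(10))         # the alarm covers 10% of them
p_in = seismic_roulette_pvalue(locations, alarm, targets=[1, 2, 3, 4])
p_out = seismic_roulette_pvalue(locations, alarm, targets=[50, 60, 70, 80])
```

    Four target events all inside a 10% alarm area are very unlikely under the roulette null hypothesis, while targets entirely outside the alarm carry no significance at all.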

  4. WHY WE CANNOT PREDICT STRONG EARTHQUAKES IN THE EARTH’S CRUST

    Directory of Open Access Journals (Sweden)

    Iosif L. Gufeld

    2011-01-01

    needed to address the issues raised in this publication, including the problems and possibilities of predicting earthquakes in the crust. Incontrovertible achievements of the Earth sciences are reviewed, considering specific features of seismic events and variations of various parameters of the lithosphere, the block structure of the lithosphere, and processes in the lithosphere. Much attention is given to analyses of the driving forces of the seismotectonic process. Studies of variations of parameters of the medium, including rapid (hourly or daily) changes, show that the processes that predetermine the state of stresses, or the energy capacity, of the medium in the lithosphere (Figures 2 and 3) are overlooked. The analyses are based on the processes of interaction between ascending flows of hydrogen and helium and the solid lithosphere. A consequence of such processes is gas porosity, which controls many parameters of the medium and the oscillation regime of the three-dimensional state of stresses of the block structures (Figures 6, 7, and 12), which in turn impacts the dynamics of block movements. The endogenous activity of the lithosphere and its instability are controlled by the degassing of light gases. The paper reviews the processes of preparation for strong earthquakes in the crust with regard to the block structure of platform areas and subduction zones (Figures 13 and 14). It is demonstrated that the conventional methods yield ambiguous assessments of seismic hazard both in terms of time and locations of epicenter zones, and that focal areas of subduction zones are out of control in principle. The processes that actually take place in the lithosphere are the causes of such ambiguity, i.e. the lack of any deterministic relations in the development of critical seismotectonic situations. Methods for identification of the geological medium characterized by continuously variable parameters are considered. Directions of fundamental studies of the seismic process and principles of seismic activity monitoring are

  5. Real-time numerical shake prediction and updating for earthquake early warning

    Science.gov (United States)

    Wang, Tianyun; Jin, Xing; Wei, Yongxiang; Huang, Yandan

    2017-12-01

    Ground motion prediction is important for earthquake early warning systems, because a region's peak ground motion indicates the potential disaster. In order to predict the peak ground motion quickly and precisely with limited station wave records, we propose a real-time numerical shake prediction and updating method. Our method first predicts the ground motion based on a ground motion prediction equation after P-wave detection at several stations; this is denoted the initial prediction. In order to correct the error of the initial prediction, an updating scheme based on real-time simulation of wave propagation is designed. A data assimilation technique is incorporated to predict the distribution of seismic wave energy precisely. Radiative transfer theory and Monte Carlo simulation are used for modeling wave propagation in 2-D space, and the peak ground motion is calculated as quickly as possible. Our method has the potential to predict a shakemap, so that the potential disaster can be anticipated before the real disaster happens. The 2008 MS 8.0 Wenchuan earthquake is studied as an example to show the validity of the proposed method.

  6. Antioptimization of earthquake excitation and response

    Directory of Open Access Journals (Sweden)

    G. Zuccaro

    1998-01-01

    Full Text Available The paper presents a novel approach to predicting the response of earthquake-excited structures. The earthquake excitation is expanded in a series of deterministic functions, and the coefficients of the series are represented as a point in N-dimensional space. Each available accelerogram at a certain site is then represented as a point in this space, modeling the available fragmentary historical data. The minimum-volume ellipsoid containing all points is constructed, and ellipsoidal models of uncertainty pertinent to earthquake excitation are developed. The maximum response of a structure subjected to earthquake excitation, within the ellipsoidal model of the latter, is then determined. This procedure of determining the least favorable response was termed in the literature (Elishakoff, 1991) an antioptimization. It appears that, under the inherent uncertainty of earthquake excitation, antioptimization analysis is a viable alternative to the stochastic approach.
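
    When the response is linear in the excitation coefficients, the antioptimization over an ellipsoid has a closed form. A minimal sketch, assuming an axis-aligned ellipsoid for simplicity (the paper's minimum-volume ellipsoid is a general one), with made-up coefficients:

```python
import math

def worst_case_response(c, center, semi_axes):
    """Antioptimization for a response linear in the excitation
    coefficients: maximize sum(c_i * x_i) over the axis-aligned ellipsoid
    sum(((x_i - center_i) / a_i)**2) <= 1. Closed form:
    c.center + sqrt(sum((c_i * a_i)**2))."""
    nominal = sum(ci * x0 for ci, x0 in zip(c, center))
    return nominal + math.sqrt(sum((ci * ai) ** 2
                                   for ci, ai in zip(c, semi_axes)))

# two excitation coefficients, nominal point (1, 0), semi-axes (0.5, 0.2)
worst = worst_case_response([2.0, 1.0], [1.0, 0.0], [0.5, 0.2])
```

    The least favorable response always exceeds the nominal one, by an amount set by the ellipsoid's extent along the response gradient.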

  7. The use of radon gas techniques for earthquake prediction

    International Nuclear Information System (INIS)

    Al-Hilal, M.

    1993-01-01

    This scientific article explains the application of radon gas measurements in water and soil to monitoring fault activity and earthquake prediction. It also emphasizes, through worldwide examples from the Tashkent Basin in the U.S.S.R. and the San Andreas fault in the U.S.A., that the use of the radon gas technique in fault-derived water as well as in soil gases can be considered an important geological tool within the general framework of earthquake prediction, because of the coherent, time-anomalous relationship between the density of alpha particles due to radon decay and the level of tectonic activity along fault zones. The article also points, through the practical experience of the author, to the possibility of applying such techniques in certain parts of Syria. (author). 6 refs., 4 figs

  8. Application of geochemical methods in earthquake prediction in China

    Energy Technology Data Exchange (ETDEWEB)

    Fong-liang, J.; Gui-ru, L.

    1981-05-01

    Several geochemical anomalies were observed before the Haicheng, Longling, Tangshan, and Songpan earthquakes and their strong aftershocks. They included changes in groundwater radon levels; in the chemical composition of the groundwater (concentrations of Ca²⁺, Mg²⁺, Cl⁻, SO₄²⁻, and HCO₃⁻ ions); in conductivity; and in dissolved gases such as H₂, CO₂, etc. In addition, anomalous changes in water color and quality were observed before these large earthquakes. Before some events gases escaped from the surface, and there were reports of ''ground odors'' being smelled by local residents. The large body of radon data can be grouped into long-term and short-term anomalies. The long-term anomalies have a radon emission build-up time of a few months to more than a year; the short-term anomalies have durations from a few hours or less to a few months.

  9. Earthquake cycles and physical modeling of the process leading up to a large earthquake

    Science.gov (United States)

    Ohnaka, Mitiyasu

    2004-08-01

    A thorough discussion is made of what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture, on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology for forecasting large earthquakes, the entire process of one cycle of a typical large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in the physical modeling of intermediate-term forecasting, and phase III in the physical modeling of short-term (immediate) forecasting. The seismogenic layer and the individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step toward establishing a methodology for forecasting large earthquakes.

  10. Gambling scores for earthquake predictions and forecasts

    Science.gov (United States)

    Zhuang, Jiancang

    2010-04-01

    This paper presents a new method, the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecast scheme and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, a forecaster who makes a prediction or forecast is assumed to have bet some points of his reputation. The reference model, which plays the role of the house, determines according to a fair rule how many reputation points the forecaster gains if he succeeds, and takes away the reputation points bet by the forecaster if he loses. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model, or the ETAS model, and the reference model is the Poisson model.
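
    One common way to make such odds fair is to pay out in inverse proportion to the reference model's probability of the predicted event. The sketch below illustrates that idea; it is an assumed simplification, not necessarily the paper's exact bookkeeping:

```python
def gambling_score(bets, outcomes, ref_probs):
    """Net reputation change under a fair-odds rule: betting r points on
    an event the reference model assigns probability p0 returns
    r*(1 - p0)/p0 points on success and loses r on failure, so the
    expected score is zero whenever the reference model is correct."""
    score = 0.0
    for r, hit, p0 in zip(bets, outcomes, ref_probs):
        score += r * (1.0 - p0) / p0 if hit else -r
    return score

# two alarms against a reference probability of 0.1: one hit, one miss
s = gambling_score([1.0, 1.0], [True, False], [0.1, 0.1])
```

    A rarer event under the reference model pays out more, which is exactly how the score compensates for the risk the forecaster took.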

  11. A Hybrid Ground-Motion Prediction Equation for Earthquakes in Western Alberta

    Science.gov (United States)

    Spriggs, N.; Yenier, E.; Law, A.; Moores, A. O.

    2015-12-01

    Estimation of the ground-motion amplitudes that may be produced by future earthquakes constitutes the foundation of seismic hazard assessment and earthquake-resistant structural design. This is typically done by using a prediction equation that quantifies amplitudes as a function of key seismological variables such as magnitude, distance and site condition. In this study, we develop a hybrid empirical prediction equation for earthquakes in western Alberta, where evaluation of the seismic hazard associated with induced seismicity is of particular interest. We use peak ground motions and response spectra from recorded seismic events to model the regional source and attenuation attributes. The available empirical data are limited in the magnitude range of engineering interest (M>4). Therefore, we combine empirical data with a simulation-based model in order to obtain seismologically informed predictions for moderate-to-large magnitude events. The methodology is twofold. First, we investigate the shape of geometrical spreading in Alberta. We supplement the seismic data with ground motions obtained from mining/quarry blasts, in order to gain insight into the regional attenuation over a wide distance range. A comparison of ground-motion amplitudes for earthquakes and mining/quarry blasts shows that both event types decay at similar rates with distance and demonstrate a significant Moho-bounce effect. In the second stage, we calibrate the source and attenuation parameters of a simulation-based prediction equation to match the available amplitude data from seismic events. We model the geometrical spreading using a trilinear function with attenuation rates obtained from the first stage, and calculate coefficients of anelastic attenuation and site amplification via regression analysis. This provides a hybrid ground-motion prediction equation that is calibrated to observed motions in western Alberta and is applicable to moderate-to-large magnitude events.
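
    The trilinear geometrical-spreading shape described above can be illustrated as a piecewise linear function of log distance. The hinge distances and decay rates below are illustrative placeholders, not the study's calibrated values; the flat middle segment mimics the Moho-bounce effect noted in the abstract:

```python
import math

def trilinear_spreading(R, hinges=(50.0, 140.0), rates=(-1.3, 0.1, -0.5)):
    """Log10 geometric-attenuation term of a trilinear spreading model.

    The decay rate changes at the hinge distances (km). All values here
    are hypothetical; in the study such coefficients come from
    regression on regional amplitude data.
    """
    h1, h2 = hinges
    b1, b2, b3 = rates
    if R <= h1:
        return b1 * math.log10(R)
    if R <= h2:
        return b1 * math.log10(h1) + b2 * math.log10(R / h1)
    return (b1 * math.log10(h1) + b2 * math.log10(h2 / h1)
            + b3 * math.log10(R / h2))
```

    Writing each segment relative to the previous hinge keeps the function continuous, which is required when the attenuation rate of each branch is estimated separately.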

  12. Seismic-electromagnetic precursors of Romania's Vrancea earthquakes

    International Nuclear Information System (INIS)

    Enescu, B.D.; Enescu, C.; Constantin, A. P.

    1999-01-01

    Diagrams were plotted from electromagnetic data recorded at the Muntele Rosu Observatory during December 1996 to January 1997 and December 1997 to September 1998. The times when Vrancea earthquakes of magnitude M ≥ 3.9 occurred within these periods are marked on the diagrams. The parameters of the earthquakes are given in a table, which also includes information on the magnetic and electric anomalies (perturbations) preceding these earthquakes. The magnetic data prove that Vrancea earthquakes are preceded by magnetic perturbations that may be regarded as their short-term precursors. Perturbations that could likewise be seen as short-term precursors of Vrancea earthquakes are also noticed in the electric records. Still, a number of the electric data cast doubt on their forerunning nature. Some suggestions are made at the end of the paper on how electromagnetic research should proceed to be of use for Vrancea earthquake prediction. (authors)

  13. The seismic cycles of large Romanian earthquakes: The physical foundation, and the next large earthquake in Vrancea

    International Nuclear Information System (INIS)

    Purcaru, G.

    2002-01-01

    The occurrence patterns of large/great earthquakes at subduction-zone interfaces and in-slab are complex in their space-time dynamics, and make even long-term forecasts very difficult. For some favourable cases where a predictive (empirical) law was found, successful predictions were possible (e.g. Aleutians, Kuriles, etc.). For the large Romanian events (M > 6.7), occurring in the Vrancea seismic slab below 60 km, Purcaru (1974) first found the law of the occurrence time and magnitude: the law of 'quasicycles' and 'supercycles', for large and largest events (M > 7.25), respectively. The quantitative model of Purcaru with these seismic cycles has three time-bands (periods of large earthquakes) per century, discovered using the earthquake history (1100-1973) (however incomplete) of large Vrancea earthquakes for which M was initially estimated (Purcaru, 1974, 1979). Our long-term prediction model is essentially quasideterministic: it predicts the time and magnitude uniquely; since it is not strictly deterministic, the forecast is interval-valued. It predicted the next large earthquake for 1980, in the 3rd time-band (1970-1990); the event occurred in 1977 (M 7.1, Mw 7.5). The prediction was successful in the long-term sense. We discuss the unpredicted events of 1986 and 1990. Since the laws are phenomenological, we give their physical foundation based on the large scale of the rupture zone (RZ) and the subscale of the rupture process (RP). First results show that: (1) the 1940 event (h = 122 km) ruptured the lower part of the oceanic slab entirely along strike and down dip, and similarly for 1977 but in its upper part; (2) the RZs of the 1977 and 1990 events overlap, and the first asperity of the 1977 event was rebroken in 1990. This shows that the size of the events strongly depends on the RZ, asperity size/strength and thus on the failure stress level (FSL), but not on depth; (3) when the FSL of high-strength (HS) larger zones is critical, the largest events (e.g. 1802, 1940) occur, thus explaining the supercycles (the 1940

  14. Radon/helium studies for earthquake prediction N-W Himalaya

    International Nuclear Information System (INIS)

    Virk, H.S.

    1999-01-01

    The paper presents preliminary data from radon monitoring started in the Himalayan orogenic belt. Radon anomalies are correlated with microseismic activity in the N-W Himalaya. The He/Rn ratio will be used as a predictive tool for earthquakes.

  15. Recent Achievements of the Collaboratory for the Study of Earthquake Predictability

    Science.gov (United States)

    Jackson, D. D.; Liukis, M.; Werner, M. J.; Schorlemmer, D.; Yu, J.; Maechling, P. J.; Zechar, J. D.; Jordan, T. H.

    2015-12-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) supports a global program to conduct prospective earthquake forecasting experiments. CSEP testing centers are now operational in California, New Zealand, Japan, China, and Europe with 435 models under evaluation. The California testing center, operated by SCEC, has been operational since Sept 1, 2007, and currently hosts 30-minute, 1-day, 3-month, 1-year and 5-year forecasts, both alarm-based and probabilistic, for California, the Western Pacific, and worldwide. We have reduced testing latency, implemented prototype evaluation of M8 forecasts, and are currently developing formats and procedures to evaluate externally-hosted forecasts and predictions. These efforts are related to CSEP support of the USGS program in operational earthquake forecasting and a DHS project to register and test external forecast procedures from experts outside seismology. A retrospective experiment for the 2010-2012 Canterbury earthquake sequence has been completed, and the results indicate that some physics-based and hybrid models outperform purely statistical (e.g., ETAS) models. The experiment also demonstrates the power of the CSEP cyberinfrastructure for retrospective testing. Our current development includes evaluation strategies that increase computational efficiency for high-resolution global experiments, such as the evaluation of the Global Earthquake Activity Rate (GEAR) model. We describe the open-source CSEP software that is available to researchers as they develop their forecast models (http://northridge.usc.edu/trac/csep/wiki/MiniCSEP). We also discuss applications of CSEP infrastructure to geodetic transient detection and how CSEP procedures are being

  16. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time-Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    Science.gov (United States)

    Serata, S.

    2006-12-01

    The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. The development work of the past 25 years has established the Stressmeter as an automatic stress measurement system for studying the timing of forthcoming major earthquakes, in support of current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area over a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress condition of the mine, the direct cause of the earthquakes was disclosed, and the cause was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development, and the Stressmeter was introduced into the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of maximum and minimum lateral stresses in excellent agreement with the theoretical value, i.e., G = 0.25. All the conventional methods of overcoring, hydrofracturing and deformation, which were introduced to compete with the Serata method, failed, demonstrating the fundamental difficulties of the conventional methods. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be the same as all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained in the major international stress measurement work of the Hot Dry Rock Projects conducted in the USA, England and Germany are found to be in good agreement with the Stressmeter results obtained in Japan. 
Based on this broad agreement, a solid geomechanical

  17. Prediction of the occurrence of related strong earthquakes in Italy

    International Nuclear Information System (INIS)

    Vorobieva, I.A.; Panza, G.F.

    1993-06-01

    In seismic flow it is often observed that a Strong Earthquake (SE) is followed by Related Strong Earthquakes (RSEs), which occur near the epicentre of the SE with origin times rather close to that of the SE. An algorithm for predicting the occurrence of an RSE has been developed, applied for the first time to the seismicity data of the California-Nevada region, and successfully tested in several regions of the World, the statistical significance of the result being 97%. So far, it has been possible to make five successful forward predictions, with no false alarms or failures to predict. The algorithm is applied here to the Italian territory, where the occurrence of RSEs is a particularly rare phenomenon. Our results show that the standard algorithm is directly applicable, with success, without any adjustment of the parameters. Eleven SEs are considered. Of these, three are followed by an RSE, as predicted by the algorithm; eight SEs are not followed by an RSE, and the algorithm predicts this behaviour for seven of them, giving rise to only one false alarm. Since, in Italy, series of strong earthquakes are quite often relatively short, the algorithm has been extended to handle such situations. The result of this experiment indicates that it is possible to test an SE for the occurrence of an RSE soon after the occurrence of the SE itself, by performing timely 'preliminary' recognition on reduced data sets. This fact, the high confidence level of the retrospective analysis, and the first successful forward predictions made in different parts of the World indicate that, even if additional tests are desirable, the algorithm can already be considered for routine application to Civil Defence. (author). Refs, 3 figs, 7 tabs

  18. Radon monitoring and its application for earthquake prediction

    International Nuclear Information System (INIS)

    Ramchandran, T.V.; Shaikh, A.N.; Khan, A.H.; Mayya, Y.S.; Puranik, V.D.; Venkat Raj, V.

    2004-12-01

    Concentrations of a wide range of terrestrial gases, including radionuclides like 222 Rn (radon) as well as H 2 (hydrogen), Hg (mercury), CO 2 (carbon dioxide) and 4 He (helium), in ground water and soil air have commonly been found to be anomalously high along active faults, suggesting that these faults may be the path of least resistance for the outgassing processes of the solid earth. Among the naturally occurring radionuclides, the 238 U decay series has received great attention in connection with earthquake prediction and monitoring research all over the world. Owing to its nearly ubiquitous occurrence, appreciable abundance, chemical inactivity and convenient half-life (3.823 d), 222 Rn in the 238 U series is the most extensively studied in this regard. In this report, a brief account is presented of the applications of 222 Rn monitoring carried out all over the world, studies carried out in India, modeling for earthquake prediction, measurement techniques, measuring equipment and its availability in India, and the Indian radon monitoring programme and its prospects. (author)

  19. Use of Kazakh nuclear explosions for testing dilatancy diffusion model of earthquake prediction

    International Nuclear Information System (INIS)

    Srivastava, H.N.

    1979-01-01

    P wave travel time anomalies from Kazakh explosions during the years 1965-1972 were studied with reference to the Jeffreys-Bullen (1952) and Herrin (1968) travel time tables and discussed using the F ratio test at seven stations in Himachal Pradesh. For these events, the temporal and spatial variations of travel time residuals were examined from the point of view of the long-term changes in velocity known to precede earthquakes, and of local geology. The results show a preference for the Herrin travel time tables at these epicentral distances from the Kazakh explosions. The F ratio test indicated that the variation between the sample means of different stations in the network was greater than can be attributed to sampling error. Although the spatial variation of the mean residuals (1965-1972) could generally be explained on the basis of the local geology, the temporal variations of such residuals from the Kazakh explosions offer limited application in testing the dilatancy model of earthquake prediction. (auth.)

  20. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    Science.gov (United States)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. 
(c) All operational models should be evaluated

  1. Automated radon-thoron monitoring for earthquake prediction research

    International Nuclear Information System (INIS)

    Shapiro, M.H.; Melvin, J.D.; Copping, N.A.; Tombrello, T.A.; Whitcomb, J.H.

    1980-01-01

    This paper describes an automated instrument for earthquake prediction research which monitors the emission of radon ( 222 Rn) and thoron ( 220 Rn) from rock. The instrument uses aerosol filtration techniques and beta counting to determine radon and thoron levels. Data from the first year of operation of a field prototype suggest an annual cycle in the radon level at the site, related to thermoelastic strains in the crust. Two anomalous short-duration increases in the radon level were observed during the first year of operation. One anomaly appears to have been a precursor of a nearby earthquake (magnitude 2.8 on the Richter scale); the other may have been associated with changing hydrological conditions resulting from heavy rainfall.

  2. Study on China’s Earthquake Prediction by Mathematical Analysis and its Application in Catastrophe Insurance

    Science.gov (United States)

    Jianjun, X.; Bingjie, Y.; Rongji, W.

    2018-03-01

    The purpose of this paper is to help improve the level of catastrophe insurance. Firstly, earthquake predictions were carried out using mathematical analysis methods. Secondly, foreign catastrophe insurance policies and models were compared. Thirdly, suggestions on catastrophe insurance for China are discussed. Further study should pay more attention to earthquake prediction by introducing big data.

  3. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    Science.gov (United States)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and

  4. Can radon gas measurements be used to predict earthquakes?

    International Nuclear Information System (INIS)

    2009-01-01

    After the tragic earthquake of April 6, 2009 in L'Aquila (Abruzzo), a debate began in Italy regarding the alleged prediction of this earthquake, based on radon content measurements, by a scientist working at the Gran Sasso National Laboratory. Radon is a radioactive gas originating from the decay of natural radioactive elements present in the soil. IRSN specialists are actively involved in ongoing research projects on the impact of mechanical stresses on radon emissions from underground structures, and some of their results dating from several years ago are being brought up in this debate. These specialists therefore present here their perspective on the relationships between radon emissions and seismic activity, based on publications on the subject. (authors)

  5. Ground water and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ts' ai, T H

    1977-11-01

    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water as well as changes in color, bubbling, gas emission, and noises and geysers are also often observed before earthquakes. Analysis of these features can help predict earthquakes. Other factors unrelated to earthquakes can cause some of these changes, too. As a first step it is necessary to find sites which are sensitive to changes in ground stress to be used as sensor points for predicting earthquakes. The necessary features are described. Recording of seismic waves of earthquake aftershocks is also an important part of earthquake predictions.

  6. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  7. Long-Term Fault Memory: A New Time-Dependent Recurrence Model for Large Earthquake Clusters on Plate Boundaries

    Science.gov (United States)

    Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.; Campbell, M. R.

    2017-12-01

    A challenge for earthquake hazard assessment is that geologic records often show large earthquakes occurring in temporal clusters separated by periods of quiescence. For example, in Cascadia, a paleoseismic record going back 10,000 years shows four to five clusters separated by approximately 1,000-year gaps. If we are still in the cluster that began 1700 years ago, a large earthquake is likely to happen soon. If the cluster has ended, a great earthquake is less likely. For a Gaussian distribution of recurrence times, the probability of an earthquake in the next 50 years is six times larger if we are still in the most recent cluster. Earthquake hazard assessments typically employ one of two recurrence models, neither of which directly incorporates clustering. In one, earthquake probability is time-independent and modeled as Poissonian, so an earthquake is equally likely at any time. The fault has no "memory" because when a prior earthquake occurred has no bearing on when the next will occur. The other common model is a time-dependent earthquake cycle in which the probability of an earthquake increases with time until one happens, after which the probability resets to zero. Because the probability is reset after each earthquake, the fault "remembers" only the last earthquake. This approach can be used with any assumed probability density function for recurrence times. We propose an alternative, Long-Term Fault Memory (LTFM), a modified earthquake cycle model where the probability of an earthquake increases with time until one happens, after which it decreases, but not necessarily to zero. Hence the probability of the next earthquake depends on the fault's history over multiple cycles, giving "long-term memory". Physically, this reflects an earthquake releasing only part of the elastic strain stored on the fault. We use the LTFM to simulate earthquake clustering along the San Andreas Fault and Cascadia. 
In some portions of the simulated earthquake history, events would
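
    The recurrence rule sketched in this abstract (probability grows with stored strain and drops only partially at each event) can be illustrated with a toy state-variable simulation. All parameter values and names below are ours and purely illustrative, not calibrated to the San Andreas Fault or Cascadia:

```python
import random

def simulate_ltfm(n_steps, loading=1.0, release_fraction=0.6,
                  hazard_scale=0.01, seed=1):
    """Toy Long-Term Fault Memory (LTFM) simulation.

    Strain accumulates linearly each step; the per-step earthquake
    probability grows with stored strain; an event releases only a
    fraction of the strain, so the probability does not reset to zero
    and the fault retains memory over multiple cycles.
    """
    rng = random.Random(seed)
    strain = 0.0
    events = []
    for t in range(n_steps):
        strain += loading
        p = min(1.0, hazard_scale * strain)  # hazard rises with strain
        if rng.random() < p:
            events.append(t)
            strain *= (1.0 - release_fraction)  # partial strain release
    return events

events = simulate_ltfm(5000)
```

    Because strain left over from one event raises the hazard at the start of the next cycle, runs of short recurrence intervals (clusters) emerge naturally, unlike in the Poissonian or reset-to-zero models described above.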

  8. Analysing earthquake slip models with the spatial prediction comparison test

    KAUST Repository

    Zhang, L.; Mai, Paul Martin; Thingbaijam, Kiran Kumar; Razafindrakoto, H. N. T.; Genton, Marc G.

    2014-01-01

    Earthquake rupture models inferred from inversions of geophysical and/or geodetic data exhibit remarkable variability due to uncertainties in modelling assumptions, the use of different inversion algorithms, or variations in data selection and data processing. A robust statistical comparison of different rupture models obtained for a single earthquake is needed to quantify the intra-event variability, both for benchmark exercises and for real earthquakes. The same approach may be useful to characterize (dis-)similarities in events that are typically grouped into a common class of events (e.g. moderate-size crustal strike-slip earthquakes or tsunamigenic large subduction earthquakes). For this purpose, we examine the performance of the spatial prediction comparison test (SPCT), a statistical test developed to compare spatial (random) fields by means of a chosen loss function that describes an error relation between a 2-D field (‘model’) and a reference model. We implement and calibrate the SPCT approach for a suite of synthetic 2-D slip distributions, generated as spatial random fields with various characteristics, and then apply the method to results of a benchmark inversion exercise with known solution. We find the SPCT to be sensitive to different spatial correlation lengths, and different heterogeneity levels of the slip distributions. The SPCT approach proves to be a simple and effective tool for ranking the slip models with respect to a reference model.
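
    The core quantity behind the SPCT is a pointwise loss differential between two candidate fields measured against the reference. A minimal sketch with a squared-error loss (one of several loss functions one might choose; the full test also accounts for spatial correlation when judging whether the mean differential departs from zero):

```python
def loss_differential(model_a, model_b, reference):
    """Pointwise squared-error loss differential of two fields vs a reference.

    Returns the differential field and its naive mean; a positive mean
    suggests model_a fits the reference worse than model_b at these points.
    """
    diffs = [(a - r) ** 2 - (b - r) ** 2
             for a, b, r in zip(model_a, model_b, reference)]
    return diffs, sum(diffs) / len(diffs)
```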

  10. Earthquake prediction using extinct monogenetic volcanoes: A possible new research strategy

    Science.gov (United States)

    Szakács, Alexandru

    2011-04-01

    Volcanoes are extremely effective transmitters of matter, energy and information from the deep Earth towards its surface. Their capacities as information carriers are so far far from fully exploited. Volcanic conduits can be viewed in general as rod-like or sheet-like vertical features of relatively homogeneous composition and structure, crosscutting geological structures of far greater complexity and compositional heterogeneity. Information-carrying signals, such as earthquake precursor signals originating deep below the Earth's surface, are transmitted with much less loss of information through homogeneous, vertically extended structures than through the horizontally segmented, heterogeneous lithosphere or crust. Volcanic conduits can thus be viewed as upside-down "antennas" or waveguides that can be used as privileged pathways for any possible earthquake precursor signal. In particular, the conduits of monogenetic volcanoes are promising transmitters of deep Earth information, to be received and decoded at surface monitoring stations, because of the expected more homogeneous nature of their rock fill compared with polygenetic volcanoes. Among monogenetic volcanoes, those with dominantly effusive activity appear to be the best candidates for privileged earthquake monitoring sites. In more detail, effusive monogenetic volcanic conduits filled with rocks of primitive parental magma composition, indicating direct ascent from sub-lithospheric magma-generating regions, are the most suitable. Further selection criteria may include the age of the volcanism and the presence of mantle xenoliths in surface volcanic products, indicating a direct and straightforward link through the conduit between the deep lithospheric mantle and the surface. 
Innovative earthquake prediction research strategies can be developed on these grounds by considering the conduits of selected extinct monogenetic volcanoes and deep trans-crustal fractures as privileged emplacement sites for seismic monitoring stations

  11. Lessons of L'Aquila for Operational Earthquake Forecasting

    Science.gov (United States)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. 
Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms

  12. Long-term earthquake forecasts based on the epidemic-type aftershock sequence (ETAS) model for short-term clustering

    Directory of Open Access Journals (Sweden)

    Jiancang Zhuang

    2012-07-01

    Full Text Available Based on the ETAS (epidemic-type aftershock sequence) model, which is used for describing the features of short-term clustering of earthquake occurrence, this paper presents some theories and techniques related to evaluating the probability distribution of the maximum magnitude in a given space-time window, where the Gutenberg-Richter law for earthquake magnitude distribution cannot be directly applied. It is seen that the distribution of the maximum magnitude in a given space-time volume is determined in the long term by the background seismicity rate and the magnitude distribution of the largest events in each earthquake cluster. The techniques introduced were applied to the seismicity in the Japan region in the period from 1926 to 2009. It was found that the regions most likely to have big earthquakes are along the Tohoku (northeastern Japan) Arc and the Kuril Arc, both with much higher probabilities than the offshore Nankai and Tokai regions.

  13. Long-term perspectives on giant earthquakes and tsunamis at subduction zones

    Science.gov (United States)

    Satake, K.; Atwater, B.F.; ,

    2007-01-01

    Histories of earthquakes and tsunamis, inferred from geological evidence, aid in anticipating future catastrophes. This natural warning system now influences building codes and tsunami planning in the United States, Canada, and Japan, particularly where geology demonstrates the past occurrence of earthquakes and tsunamis larger than those known from written and instrumental records. Under favorable circumstances, paleoseismology can thus provide long-term advisories of unusually large tsunamis. The extraordinary Indian Ocean tsunami of 2004 resulted from a fault rupture more than 1000 km in length that included and dwarfed fault patches that had broken historically during lesser shocks. Such variation in rupture mode, known from written history at a few subduction zones, is also characteristic of earthquake histories inferred from geology on the Pacific Rim. Copyright © 2007 by Annual Reviews. All rights reserved.

  14. On a report that the 2012 M 6.0 earthquake in Italy was predicted after seeing an unusual cloud formation

    Science.gov (United States)

    Thomas, J.N.; Masci, F; Love, Jeffrey J.

    2015-01-01

    Several recently published reports have suggested that semi-stationary linear-cloud formations might be causally precursory to earthquakes. We examine the report of Guangmeng and Jie (2013), who claim to have predicted the 2012 M 6.0 earthquake in the Po Valley of northern Italy after seeing a satellite photograph (a digital image) showing a linear-cloud formation over the eastern Apennine Mountains of central Italy. From inspection of 4 years of satellite images we find numerous examples of linear-cloud formations over Italy. A simple test shows no obvious statistical relationship between the occurrence of these cloud formations and earthquakes that occurred in and around Italy. All of the linear-cloud formations we have identified in satellite images, including that which Guangmeng and Jie (2013) claim to have used to predict the 2012 earthquake, appear to be orographic – formed by the interaction of moisture-laden wind flowing over mountains. Guangmeng and Jie (2013) have not clearly stated how linear-cloud formations can be used to predict the size, location, and time of an earthquake, and they have not published an account of all of their predictions (including any unsuccessful predictions). We are skeptical of the validity of the claim by Guangmeng and Jie (2013) that they have managed to predict any earthquakes.

  15. Flicker-noise Spectroscopy In Earthquake Prediction Research

    Science.gov (United States)

    Desherevsky, A. V.; Lukk, A. A.; Sidorin, A. Y.; Timashev, S. F.

    It has been found that a two-component model comprising a seasonal component and a flicker-noise component is a more adequate model of the statistical structure of long-term geophysical observation time series. Unlike white noise, which signifies the absence of any relation between a system's current dynamics and its past, the presence of flicker noise indicates that such a relation does exist. Flicker noise has the property of scale invariance. It seems natural to relate the self-similarity of the statistical properties of geophysical parameter variations on different scales to the self-similar (fractal) properties of the geophysical medium. At the same time, self-similar time variations of geophysical parameters may indicate the presence of deterministic chaos in the evolution of the geophysical system. An important element of the proposed approach is the application of stochastic models of the preparation of each individual large seismic event. Instead of regular precursory variations (for example, bay-form anomalies), precursors of another kind, associated in particular with changes in parameter fluctuations, should be expected. To address the problem of large-earthquake prediction we use Flicker-Noise Spectroscopy (FNS) as the basis of a new approach. The FNS methodology rests on a postulate about the informational significance of sequences of various dynamic irregularities (bursts or spikes, jumps with different characteristic values, discontinuities of derivatives) in the measured temporal, spatial, and energetic variables at each level of hierarchical organization of the studied systems. The proposed method, which uses integral characteristics of the analyzed signals (power spectra and moments, or "structural functions", of different orders) as information measures, has demonstrated principally new opportunities in the search for large-earthquake precursors already at a preliminary stage of data analysis.
This research was supported by

  16. Relaxation creep model of impending earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Morgounov, V. A. [Russian Academy of Sciences, Institute of Physics of the Earth, Moscow (Russian Federation)

    2001-04-01

    The alternative view of the current status and perspective of seismic prediction studies is discussed. In ascertaining the uncertainty relation between the cognoscibility and unpredictability of earthquakes, priority is given to work on short-term earthquake prediction, which has the advantage that the final stage of earthquake nucleation is characterized by substantial activation of the process: its strain rate increases by orders of magnitude and the signal-to-noise ratio is considerably higher. Based on the phenomenon of creep under stress-relaxation conditions, a model is proposed to explain the different signatures of precursors of impending tectonic earthquakes. The onset of tertiary creep appears to correspond to the onset of instability, and the material inevitably fails unless it is unloaded. At this stage the process acquires a self-regulating character and, to the greatest extent, the property of irreversibility, one of the important components of prediction reliability. In situ data suggest the possibility, in principle, of diagnosing the preparation process by ground measurements of acoustic and electromagnetic emission; measurements in rocks held under constant strain in a condition of self-relaxed stress until the moment of fracture are discussed in this context. It was found that electromagnetic emission precedes, but does not accompany, the phase of macrocrack development.

  17. Real-time 3-D space numerical shake prediction for earthquake early warning

    Science.gov (United States)

    Wang, Tianyun; Jin, Xing; Huang, Yandan; Wei, Yongxiang

    2017-12-01

    In earthquake early warning systems, real-time shake prediction through wave propagation simulation is a promising approach. Compared with traditional methods, it does not suffer from inaccurate estimation of source parameters. For computational efficiency, these methods assume that the wave propagates across the 2-D surface of the earth. In fact, since seismic waves propagate through the 3-D volume of the earth, 2-D modeling of wave direction results in inaccurate wave estimation. In this paper, we propose a 3-D space numerical shake prediction method that simulates wave propagation in 3-D space using radiative transfer theory and incorporates a data assimilation technique to estimate the distribution of wave energy. The 2011 Tohoku earthquake is studied as an example to show the validity of the proposed model. The 2-D and 3-D space models are compared in this article, and the prediction results show that numerical shake prediction based on the 3-D space model can estimate real-time ground motion precisely, and that overprediction is alleviated when the 3-D space model is used.

  18. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the study of (1) earthquakes; (2) the origin, propagation, and energy of seismic phenomena; (3) the prediction of these phenomena; and (4) the structure of the earth. Earthquake engineering or engineering seismology includes the…

  19. Prediction of Global Damage and Reliability Based Upon Sequential Identification and Updating of RC Structures Subject to Earthquakes

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Skjærbæk, P. S.; Köylüoglu, H. U.

    The paper deals with the prediction of global damage and future structural reliability, with special emphasis on the sensitivity, bias, and uncertainty of these predictions dependent on the statistically equivalent realizations of the future earthquake. The predictions are based on a modified Clough-Johnston single-degree-of-freedom (SDOF) oscillator with three parameters which are calibrated to fit the displacement response and the damage development in the past earthquake.

  20. Earthquake chemical precursors in groundwater: a review

    Science.gov (United States)

    Paudel, Shukra Raj; Banjara, Sushant Prasad; Wagle, Amrita; Freund, Friedemann T.

    2018-03-01

    We review changes in groundwater chemistry as precursory signs of earthquakes. In particular, we discuss pH, total dissolved solids (TDS), electrical conductivity, and dissolved gases in relation to their significance for earthquake prediction or forecasting. These parameters are widely believed to vary in response to seismic and pre-seismic activity. However, the same parameters also vary in response to non-seismic processes. The inability to reliably distinguish changes caused by seismic or pre-seismic activity from changes caused by non-seismic activity has impeded progress in earthquake science. Short-term earthquake prediction is unlikely to be achieved by pH, TDS, electrical conductivity, and dissolved gas measurements alone. On the other hand, the production of free hydroxyl radicals (•OH) and subsequent reactions, such as the formation of H2O2 and the oxidation of As(III) to As(V) in groundwater, have distinctive precursory characteristics. This study deviates from the prevailing mechanical mantra: it addresses earthquake-related non-seismic mechanisms, focusing on the stress-induced electrification of rocks, the generation of positive hole charge carriers and their long-distance propagation through the rock column, and electrochemical processes at the rock-water interface.

  1. Impact of Short-term Changes In Earthquake Hazard on Risk In Christchurch, New Zealand

    Science.gov (United States)

    Nyst, M.

    2012-12-01

    The recent Mw 7.1, 4 September 2010 Darfield and Mw 6.2, 22 February 2011 Christchurch, New Zealand, earthquakes and the aftershock activity that followed completely changed the existing view of earthquake hazard in the Christchurch area. Not only have several faults been added to the New Zealand fault database; the main shocks were also followed by significant increases in seismicity due to high aftershock activity throughout the Christchurch region that is still ongoing. Probabilistic seismic hazard assessment (PSHA) models take into account a stochastic event set: the full range of possible events that can cause damage or loss at a particular location. This allows insurance companies to examine their risk profiles via average annual losses (AAL) and loss-exceedance curves. The loss-exceedance curve is derived from the full suite of seismic events that could impact the insured exposure and plots the probability of exceeding a particular loss level over a certain period. Insurers manage their risk by focusing on a certain return-period exceedance benchmark, typically between the 100- and 250-year return-period loss level, and then reserve the amount of money needed to cover that loss level, their so-called capacity. This component of risk management is not very sensitive to short-term changes in risk due to aftershock seismicity, as it is mostly dominated by longer-return-period, larger-magnitude, more damaging events. However, because secondary uncertainties are taken into account when calculating the exceedance probability, even the longer-return-period losses can still be significantly affected by the inclusion of time-dependent earthquake behavior. AAL is calculated by summing the product of the expected loss level and the annual rate for all events in the event set that cause damage or loss at a particular location. This relatively simple metric is an important factor in setting annual premiums.
By annualizing the expected losses
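    The AAL and loss-exceedance computations described above reduce to simple sums over the stochastic event set. A minimal sketch, with an entirely hypothetical three-event set (real event sets contain many thousands of events and carry secondary uncertainty):

```python
# Hypothetical stochastic event set: (annual rate, expected loss) pairs.
events = [
    (0.10, 1e6),    # frequent, small loss
    (0.01, 5e7),    # rare, damaging
    (0.002, 4e8),   # very rare, severe
]

# Average annual loss: sum of expected loss times annual rate.
aal = sum(rate * loss for rate, loss in events)

def exceedance_rate(threshold):
    """Annual rate of events whose loss exceeds `threshold`
    (a bare-bones loss-exceedance curve, ignoring secondary uncertainty)."""
    return sum(rate for rate, loss in events if loss > threshold)

# e.g. approximate return period of losses above 10 million:
rp = 1.0 / exceedance_rate(1e7)
```

    With these made-up numbers the AAL is 1.4 million per year and losses above 10 million recur roughly every 83 years; an insurer's capacity benchmark would be read off the same curve at the 100-250-year level.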

  2. CSEP-Japan: The Japanese node of the collaboratory for the study of earthquake predictability

    Science.gov (United States)

    Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

    2011-12-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project on earthquake predictability research. The final goal of this project is to probe the intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, called CSEP-Japan. This testing center provides open access for researchers contributing earthquake forecast models applied to Japan. A total of 91 earthquake forecast models were submitted to the prospective experiment starting from 1 November 2009. The models are separated into 4 testing classes (1 day, 3 months, 1 year, and 3 years) and 3 testing regions covering an area of Japan including the sea area, the Japanese mainland, and the Kanto district. We evaluate the performance of the models in the official suite of tests defined by the CSEP. The experiments of the 1-day, 3-month, 1-year, and 3-year forecasting classes have been run for 92 rounds, 4 rounds, 1 round, and 0 rounds (now in progress), respectively. The results of the 3-month class gave us new knowledge concerning statistical forecasting models. All models showed good performance for magnitude forecasting. On the other hand, the observations are hardly consistent in spatial distribution with most models in some cases where many earthquakes occurred at the same spot. Throughout the experiment, it has become clear that some properties of the CSEP evaluation tests, such as the L-test, show strong correlation with the N-test. We are now building our own (cyber-)infrastructure to support the forecast experiment as follows. (1) Japanese seismicity has changed since the 2011 Tohoku earthquake. The 3rd call for forecasting models was announced in order to promote model improvement for forecasting earthquakes after this earthquake. So, we provide the Japanese seismicity catalog maintained by JMA for modelers to study how seismicity

  3. Report by the 'Mega-earthquakes and mega-tsunamis' subgroup

    International Nuclear Information System (INIS)

    Friedel, Jacques; Courtillot, Vincent; Dercourt, Jean; Jaupart, Claude; Le Pichon, Xavier; Poirier, Jean-Paul; Salencon, Jean; Tapponnier, Paul; Dautray, Robert; Carpentier, Alain; Taquet, Philippe; Blanchet, Rene; Le Mouel, Jean-Louis; BARD, Pierre-Yves; Bernard, Pascal; Montagner, Jean-Paul; Armijo, Rolando; Shapiro, Nikolai; Tait, Steve; Cara, Michel; Madariaga, Raul; Pecker, Alain; Schindele, Francois; Douglas, John

    2011-06-01

    This report comprises a presentation of scientific data on subduction earthquakes, on tsunamis and on the Tohoku earthquake. It proposes a detailed description of the French situation (in the West Indies, in metropolitan France, and in terms of soil response), and a discussion of social and economic issues (governance, seismic regulation and nuclear safety, para-seismic protection of constructions). The report is completed by other large documents: presentation of data on the Japanese earthquake, discussion on prediction and governance errors in the management of earthquake mitigation in Japan, discussions on tsunami prevention, on needs of research on accelerometers, and on the seismic risk in France

  4. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

    Science.gov (United States)

    Jordan, Thomas H.

    2013-04-01

    Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. 
They should

  5. A global earthquake discrimination scheme to optimize ground-motion prediction equation selection

    Science.gov (United States)

    Garcia, Daniel; Wald, David J.; Hearne, Michael

    2012-01-01

    We present a new automatic earthquake discrimination procedure to determine in near-real time the tectonic regime and seismotectonic domain of an earthquake, its most likely source type, and the corresponding ground-motion prediction equation (GMPE) class to be used in the U.S. Geological Survey (USGS) Global ShakeMap system. This method makes use of the Flinn–Engdahl regionalization scheme, seismotectonic information (plate boundaries, global geology, seismicity catalogs, and regional and local studies), and the source parameters available from the USGS National Earthquake Information Center in the minutes following an earthquake to give the best estimation of the setting and mechanism of the event. Depending on the tectonic setting, additional criteria based on hypocentral depth, style of faulting, and regional seismicity may be applied. For subduction zones, these criteria include the use of focal mechanism information and detailed interface models to discriminate among outer-rise, upper-plate, interface, and intraslab seismicity. The scheme is validated against a large database of recent historical earthquakes. Though developed to assess GMPE selection in Global ShakeMap operations, we anticipate a variety of uses for this strategy, from real-time processing systems to any analysis involving tectonic classification of sources from seismic catalogs.
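    The regime-to-GMPE mapping can be pictured as a small decision function. The sketch below is illustrative only: the category names and the 50-km intraslab depth cutoff are assumptions for the example, not the USGS operational values.

```python
def gmpe_class(regime, depth_km, interface_model=None):
    """Toy discrimination: map an event's tectonic regime and hypocentral
    depth to a GMPE class. `interface_model`, if given, is a callable that
    says whether a shallow subduction event lies on the plate interface."""
    if regime == "active_crustal":
        return "active-shallow-crustal"
    if regime == "stable_continental":
        return "stable-continental"
    if regime == "subduction":
        if depth_km >= 50:                       # assumed cutoff
            return "subduction-intraslab"
        if interface_model and interface_model(depth_km):
            return "subduction-interface"
        return "active-shallow-crustal"          # upper-plate / outer-rise fallback
    return "global-default"

cls = gmpe_class("subduction", depth_km=80)      # -> intraslab branch
```

    A real scheme would take far more inputs (Flinn-Engdahl region, focal mechanism, detailed slab geometry), but the control flow has this shape.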

  6. Predictability of Landslide Timing From Quasi-Periodic Precursory Earthquakes

    Science.gov (United States)

    Bell, Andrew F.

    2018-02-01

    Accelerating rates of geophysical signals are observed before a range of material failure phenomena. They provide insights into the physical processes controlling failure and the basis for failure forecasts. However, examples of accelerating seismicity before landslides are rare, and their behavior and forecasting potential are largely unknown. Here I use a Bayesian methodology to apply a novel gamma point process model to investigate a sequence of quasiperiodic repeating earthquakes preceding a large landslide at Nuugaatsiaq in Greenland in June 2017. The evolution in earthquake rate is best explained by an inverse power law increase with time toward failure, as predicted by material failure theory. However, the commonly accepted power law exponent value of 1.0 is inconsistent with the data. Instead, the mean posterior value of 0.71 indicates a particularly rapid acceleration toward failure and suggests that only relatively short warning times may be possible for similar landslides in future.
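    The inverse power-law rate model can be illustrated by simulating an accelerating event sequence and recovering the exponent by profile maximum likelihood. This is a simplified stand-in for the paper's Bayesian gamma point-process analysis; all parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an inhomogeneous Poisson process with rate
# lam(t) = k * (tf - t)**(-p): the inverse power-law acceleration
# toward failure time tf predicted by material-failure theory.
tf, T, k_true, p_true = 100.0, 99.0, 30.0, 0.71

def cum_rate_integral(p, t0, t1):
    """Integral of (tf - t)**(-p) from t0 to t1 (valid for p != 1)."""
    return ((tf - t1) ** (1 - p) - (tf - t0) ** (1 - p)) / (p - 1)

big_lambda = k_true * cum_rate_integral(p_true, 0.0, T)
n = rng.poisson(big_lambda)
u = rng.uniform(size=n)
# invert the cumulative rate to draw event times on [0, T]
times = tf - (tf ** (1 - p_true)
              - u * (tf ** (1 - p_true) - (tf - T) ** (1 - p_true))
              ) ** (1.0 / (1 - p_true))

# Profile maximum likelihood over the exponent p (amplitude k profiled out):
# logL(p) = n*log(k_hat) - p * sum(log(tf - t_i)) - n, with k_hat = n / I(p).
s = np.sum(np.log(tf - times))
best_p, best_ll = None, -np.inf
for p in np.linspace(0.3, 1.5, 121):
    if abs(p - 1.0) < 1e-6:
        continue  # p = 1 needs the logarithmic integral; skipped in this sketch
    integral = cum_rate_integral(p, 0.0, T)
    k_hat = n / integral
    ll = n * np.log(k_hat) - p * s - n
    if ll > best_ll:
        best_p, best_ll = p, ll
```

    With a few hundred simulated events the recovered exponent lands near the true 0.71, well below the commonly assumed value of 1.0, mirroring the distinction the abstract draws.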

  7. Parallel Earthquake Simulations on Large-Scale Multicore Supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-01-01

    Earthquakes are among the most destructive natural hazards on our planet. Huge earthquakes striking offshore may cause devastating tsunamis, as evidenced by the 11 March 2011 Japan (moment magnitude Mw 9.0) and the 26 December 2004 Sumatra (Mw 9.1) earthquakes. Earthquake prediction (in terms of the precise time, place, and magnitude of a coming earthquake) is arguably unfeasible in the foreseeable future. To mitigate seismic hazards from future earthquakes in earthquake-prone areas, such as California and Japan, scientists have been using numerical simulations to study earthquake rupture propagation along faults and seismic wave propagation in the surrounding media on ever-advancing modern computers over the past several decades. In particular, ground motion simulations for past and future (possible) significant earthquakes have been performed to understand factors that affect ground shaking in populated areas, and to provide ground shaking characteristics and synthetic seismograms for emergency preparation and the design of earthquake-resistant structures. These simulation results can guide the development of more rational seismic provisions, leading to safer, more efficient, and more economical structures in earthquake-prone regions.

  8. Measuring the effectiveness of earthquake forecasting in insurance strategies

    Science.gov (United States)

    Mignan, A.; Muir-Wood, R.

    2009-04-01

    Given the difficulty of judging whether the skill of a particular methodology of earthquake forecasts is offset by the inevitable false alarms and missed predictions, it is important to find a means to weigh the successes and failures according to a common currency. Rather than judge subjectively the relative costs and benefits of predictions, we develop a simple method to determine if the use of earthquake forecasts can increase the profitability of active financial risk management strategies employed in standard insurance procedures. Three types of risk management transactions are employed: (1) insurance underwriting, (2) reinsurance purchasing and (3) investment in CAT bonds. For each case premiums are collected based on modelled technical risk costs and losses are modelled for the portfolio in force at the time of the earthquake. A set of predetermined actions follow from the announcement of any change in earthquake hazard, so that, for each earthquake forecaster, the financial performance of an active risk management strategy can be compared with the equivalent passive strategy in which no notice is taken of earthquake forecasts. Overall performance can be tracked through time to determine which strategy gives the best long term financial performance. This will be determined by whether the skill in forecasting the location and timing of a significant earthquake (where loss is avoided) is outweighed by false predictions (when no premium is collected). This methodology is to be tested in California, where catastrophe modeling is reasonably mature and where a number of researchers issue earthquake forecasts.

  9. Assigning probability gain for precursors of four large Chinese earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, T.; Aki, K.

    1983-03-10

    We extend the concept of probability gain associated with a precursor (Aki, 1981) to a set of precursors which may be mutually dependent. Making use of a new formula, we derive a criterion for selecting precursors from a given data set in order to calculate the probability gain. The probabilities per unit time immediately before four large Chinese earthquakes are calculated. They are approximately 0.09, 0.09, 0.07 and 0.08 per day for the 1975 Haicheng (M = 7.3), 1976 Tangshan (M = 7.8), 1976 Longling (M = 7.6), and 1976 Songpan (M = 7.2) earthquakes, respectively. These results are encouraging because they suggest that the investigated precursory phenomena may have included the complete information for earthquake prediction, at least for the above earthquakes. With this method, the step-by-step approach to prediction used in China may be quantified in terms of the probability of earthquake occurrence. The ln P versus t curve (where P is the probability of earthquake occurrence at time t) shows that ln P does not increase linearly with t but more rapidly as the time of the earthquake approaches.
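    If the precursors were mutually independent, their gains would simply multiply the baseline rate; relaxing exactly that independence assumption is the point of Cao and Aki's extended formula. A toy sketch of the independent case, with hypothetical gain values:

```python
# Hypothetical baseline daily probability of a large earthquake in a region.
p0 = 1e-4

# Hypothetical probability gains of three observed precursors.
# Multiplying the gains assumes mutual independence -- the simplification
# that the formula for dependent precursors is designed to relax.
gains = [30.0, 10.0, 3.0]

p = p0
for g in gains:
    p *= g
# p is now about 0.09 per day, comparable to the values quoted above
```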

  10. Long-term impact of earthquake stress on fasting glucose control and diabetes prevalence among Chinese adults of Tangshan.

    Science.gov (United States)

    An, Cuixia; Zhang, Yun; Yu, Lulu; Li, Na; Song, Mei; Wang, Lan; Zhao, Xiaochuan; Gao, Yuanyuan; Wang, Xueyi

    2014-01-01

    To investigate the long-term influence of stress from the 1976 Tangshan earthquake on blood glucose control and the incidence of diabetes mellitus in Chinese people of Tangshan, 1,551 adults ≥ 37 years of age were recruited in Tangshan city of China, where one of the deadliest earthquakes occurred in 1976. All subjects completed a questionnaire. The 1,030 who had experienced that earthquake were assigned to the exposure group, while the 521 who had not been exposed to any earthquake formed the control group. Subjects who were first identified with diabetes, or who had normal fasting blood glucose (FBG) but a diabetic history, were counted in the calculation of diabetes prevalence. Statistical analysis was applied to the baseline data and to the incidence of impaired fasting glucose (IFG) and diabetes in all groups. The comparisons indicate no significant difference in average fasting glucose levels between the control group and the exposure group. However, the prevalence of IFG and diabetes in the exposure group differs significantly from that in the control group: the prevalence of diabetes in the exposure group is significantly higher, and women are more likely than men to develop diabetes after experiencing earthquake stress. The earthquake stress was linked to higher diabetes incidence as an independent risk factor, with long-term impacts on diabetes incidence. Ongoing, long-term management of IFG and diabetes in populations exposed to earthquake stress deserves attention.

  11. Damage Level Prediction of Reinforced Concrete Building Based on Earthquake Time History Using Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Suryanita Reni

    2017-01-01

    Full Text Available A strong-motion earthquake can damage a building if the earthquake was not considered in the building's design. This study aims to predict the damage level of a building due to an earthquake using the Artificial Neural Network method. The building model is a reinforced concrete building with ten floors and a story height of 3.6 m. The model building was subjected to earthquake loads based on nine earthquake time-history records, each scaled to 0.5g, 0.75g, and 1.0g. The Artificial Neural Networks were designed in four architectural models using the MATLAB program. Model 1 used displacement, velocity, and acceleration as inputs; Model 2 used displacement only; Model 3 used velocity only; and Model 4 used acceleration only. The output of the neural networks is the damage level of the building, with the categories Safe (1), Immediate Occupancy (2), Life Safety (3), and Collapse Prevention (4). According to the results, the neural network models predict the damage level with an accuracy of 85%-95%. Therefore, one solution for analyzing structural responses and the damage level promptly and efficiently when an earthquake occurs is to use an Artificial Neural Network.
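    As a rough illustration of such a network's forward pass (not the authors' trained MATLAB model; the layer sizes and weights below are random placeholders), a three-feature input can be mapped to four damage-level probabilities:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy forward pass for "Model 1": peak displacement, velocity, and
# acceleration in, four damage-level probabilities out
# (1 Safe, 2 Immediate Occupancy, 3 Life Safety, 4 Collapse Prevention).
# The weights are untrained placeholders, so the predicted level is arbitrary.
W1 = rng.normal(size=(3, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 4))
b2 = np.zeros(4)

def predict_damage_level(features):
    h = np.tanh(features @ W1 + b1)          # hidden layer
    z = h @ W2 + b2
    p = np.exp(z - z.max())                  # numerically stable softmax
    p /= p.sum()
    return int(np.argmax(p)) + 1, p          # damage level 1..4, probabilities

level, probs = predict_damage_level(np.array([0.12, 0.8, 0.5]))
```

    Training such a network would fit W1, b1, W2, b2 against structural-analysis results for the scaled time-history records; the sketch only shows the input/output shape of the classifier.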

  12. Tokyo Metropolitan Earthquake Preparedness Project - A Progress Report

    Science.gov (United States)

    Hayashi, H.

    2010-12-01

    Munich Re once ranked the Tokyo metropolitan region, the capital of Japan, as the most vulnerable area in the world for earthquake disasters, followed by the San Francisco Bay Area, US, and Osaka, Japan. Seismologists also predict that the Tokyo metropolitan region may have at least one near-field earthquake with a probability of 70% over the next 30 years. Given this prediction, the Japanese Government conducted damage estimations and revealed that, in the worst-case scenario, a magnitude-7.3 earthquake under heavy winds (as shown in fig. 1) would kill a total of 11,000 people, and total direct and indirect losses would amount to 112,000,000,000,000 yen ($1,300,000,000,000 at $1 = 85 yen). In addition to mortality and financial losses, a total of 25 million people in four prefectures would be severely impacted by this earthquake. If this earthquake occurs, 300,000 elevators will stop suddenly, and 12,500 persons would be confined in them for a long time. Seven million people will come to use over 20,000 public shelters spread over the impacted area. Over one million temporary housing units would have to be built to accommodate the 4.6 million people who lost their dwellings, and 2.5 million people would relocate outside of the damaged area. In short, an earthquake disaster of unprecedented scale is expected, and we must prepare for it. Even though disaster mitigation is undoubtedly the best solution, it is more realistic to assume that the expected earthquake will hit before this work is complete. In other words, we must consider another solution: making the people and assets in this region more resilient to the Tokyo metropolitan earthquake. This is the question we have been tackling for the last four years. To increase societal resilience to the Tokyo metropolitan earthquake, we adopted a holistic approach that integrates both emergency response and long-term recovery. There are three goals for long-term recovery, which consists of Physical recovery, Economic

  13. [Medium- and long-term health effects of the L'Aquila earthquake (Central Italy, 2009) and of other earthquakes in high-income Countries: a systematic review].

    Science.gov (United States)

    Ripoll Gallardo, Alba; Alesina, Marta; Pacelli, Barbara; Serrone, Dario; Iacutone, Giovanni; Faggiano, Fabrizio; Della Corte, Francesco; Allara, Elias

    2016-01-01

    To compare the methodological characteristics of the studies investigating the medium- and long-term health effects of the L'Aquila earthquake with the features of studies conducted after other earthquakes that occurred in high-income countries. A systematic comparison between the studies which evaluated the health effects of the L'Aquila earthquake (Central Italy, 6th April 2009) and those conducted after other earthquakes that occurred in comparable settings. Medline, Scopus, and 6 sources of grey literature were systematically searched. Inclusion criteria comprised measurement of health outcomes at least one month after the earthquake, investigation of earthquakes that occurred in high-income countries, and presence of at least one temporal or geographical control group. Out of 2,976 titles, 13 studies regarding the L'Aquila earthquake and 51 studies concerning other earthquakes were included. The L'Aquila and the Kobe/Hanshin-Awaji (Japan, 17th January 1995) earthquakes were the most investigated. Studies on the L'Aquila earthquake had a median sample size of 1,240 subjects, a median duration of 24 months, and most frequently used a cross-sectional design (7/13). Studies on other earthquakes had a median sample size of 320 subjects, a median duration of 15 months, and most frequently used a time series design (19/51). The L'Aquila studies often focussed on mental health, while the earthquake effects on mortality, cardiovascular outcomes, and health systems were less frequently evaluated. A more intensive use of routine data could benefit future epidemiological surveillance in the aftermath of earthquakes.

  14. Gambling score in earthquake prediction analysis

    Science.gov (United States)

    Molchan, G.; Romashkova, L.

    2011-03-01

    The number of successes and the space-time alarm rate are commonly used to characterize the strength of an earthquake prediction method and the significance of prediction results. It has recently been suggested to evaluate the forecaster's skill with a new characteristic, the gambling score (GS), which incorporates the difficulty of guessing each target event by using different weights for different alarms. We expand the parametrization of the GS and use the M8 prediction algorithm to illustrate the difficulties of the new approach in the analysis of prediction significance. We show that the level of significance strongly depends (1) on the choice of alarm weights, (2) on the partitioning of the entire alarm volume into component parts, and (3) on the accuracy of the spatial rate measure of target events. These tools are at the disposal of the researcher and can affect the significance estimate. Formally, all reasonable GSs discussed here corroborate that the M8 method is non-trivial in the prediction of 8.0 ≤ M < 8.5 events, because the point estimates of the significance are in the range 0.5-5 per cent. However, the conservative estimate of 3.7 per cent based on the number of successes seems preferable owing to two circumstances: (1) it is based on relative values of the spatial rate and hence is more stable, and (2) the statistic of successes enables us to construct analytically an upper estimate of the significance taking into account the uncertainty of the spatial rate measure.
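    A minimal numeric sketch of the generic gambling-score idea may help fix intuition: the forecaster wagers one point per alarm, and a success against a low reference probability pays out more than one against a high one. This follows the simple pari-mutuel form of Zhuang (2010), not the authors' expanded parametrization, and the alarm probabilities below are made up.

    ```python
    import numpy as np

    def gambling_score(outcomes, ref_probs):
        """Gambling score: for each alarm the forecaster wagers one point;
        a success whose reference probability is p returns (1 - p) / p
        points, while a failure loses the wagered point."""
        outcomes = np.asarray(outcomes, dtype=bool)
        p = np.asarray(ref_probs, dtype=float)
        gains = np.where(outcomes, (1.0 - p) / p, -1.0)
        return gains.sum()

    # Two successes on low-probability targets, one failed alarm:
    score = gambling_score([True, True, False], [0.1, 0.2, 0.5])
    print(round(score, 2))  # 9 + 4 - 1 = 12.0
    ```

    Guessing a rare event (p = 0.1) is rewarded nine times more than the point risked, which is exactly how the GS weighs the difficulty of each target event.
    
    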

  15. Long-term effects of earthquake experience of young persons on cardiovascular disease risk factors

    Science.gov (United States)

    Li, Na; Wang, Yumei; Yu, Lulu; Song, Mei; Wang, Lan; Ji, Chunpeng

    2016-01-01

    Introduction: The aim was to investigate the long-term effect on cardiovascular disease risk factors of stress from direct experience of an earthquake as a young person. Material and methods: We selected workers born between July 1, 1958 and July 1, 1976 who were examined at Kailuan General Hospital between May and October of 2013. Data on cardiovascular events were taken during the workers' annual health examination conducted between 2006 and 2007. All subjects were divided into three groups according to their experience of the Tangshan earthquake of July 28, 1976, as follows: control group, exposed group 1 and exposed group 2. We compared cardiovascular disease risk factors between the three groups as well as by gender and age. Results: One thousand one hundred and ninety-six workers were included in the final statistical analysis. Among all subjects, resting heart rate (p = 0.003), total cholesterol, and fasting plasma glucose were higher in those exposed to the earthquake compared with unexposed controls, but were unrelated to loss of relatives. No significant difference in triglyceride levels was observed between the three groups (p = 0.900). Further refinement showed that the effects were restricted to males 40 years of age or older at the time of analysis, but were due primarily to age at the time of earthquake exposure (p = 0.002). Conclusions: Earthquake experience in the early years of life has long-term effects on adult resting heart rate, total cholesterol, and fasting plasma glucose, especially among men. PMID:28144258

  16. Predicted Attenuation Relation and Observed Ground Motion of Gorkha Nepal Earthquake of 25 April 2015

    Science.gov (United States)

    Singh, R. P.; Ahmad, R.

    2015-12-01

    A comparison of observed ground motion parameters of the recent Gorkha, Nepal earthquake of 25 April 2015 (Mw 7.8) with the ground motion parameters predicted using existing attenuation relations for the Himalayan region will be presented. The earthquake took about 8,000 lives, destroyed thousands of poorly built buildings, and was felt by millions of people living in Nepal, China, India, Bangladesh, and Bhutan. Knowledge of ground motion parameters is very important in developing seismic codes for seismically active regions like the Himalaya for better design of buildings. The ground motion parameters recorded during the mainshock and aftershocks are compared with attenuation relations for the Himalayan region; the predicted ground motion parameters show good correlation with the observed ones. The results will be of great use to civil engineers in updating existing building codes in the Himalayan and surrounding regions and also for the evaluation of seismic hazards. The results clearly show that only attenuation relations developed for the Himalayan region should be used; attenuation relations based on other regions fail to provide good estimates of the observed ground motion parameters.

  17. Toward real-time regional earthquake simulation of Taiwan earthquakes

    Science.gov (United States)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  18. Statistical aspects and risks of human-caused earthquakes

    Science.gov (United States)

    Klose, C. D.

    2013-12-01

    The seismological community invests ample human capital and financial resources to research and predict risks associated with earthquakes. Industries such as the insurance and re-insurance sector are equally interested in using probabilistic risk models developed by the scientific community to transfer risks. These models are used to predict expected losses due to naturally occurring earthquakes. But what about the risks associated with human-caused earthquakes? Such risk models are largely absent from both industry and academic discourse. In countries around the world, informed citizens are becoming increasingly aware and concerned that this economic bias is not sustainable for long-term economic growth, environmental and human security. Ultimately, citizens look to their government officials to hold industry accountable. In the Netherlands, for example, the hydrocarbon industry is held accountable for causing earthquakes near Groningen. In Switzerland, geothermal power plants were shut down or suspended because they caused earthquakes in the cantons of Basel and St. Gallen. The public and the private non-extractive industry need access to information about earthquake risks in connection with sub/urban geoengineering activities, including natural gas production through fracking, geothermal energy production, carbon sequestration, mining and water irrigation. This presentation illuminates statistical aspects of human-caused earthquakes with respect to different geologic environments. Statistical findings are based on the first catalog of human-caused earthquakes (Klose, 2013). Findings are discussed which include the odds of dying during a medium-size earthquake that is set off by geomechanical pollution. Any kind of geoengineering activity causes this type of pollution and increases the likelihood of triggering nearby faults to rupture.

  19. Prediction of site specific ground motion for large earthquake

    International Nuclear Information System (INIS)

    Kamae, Katsuhiro; Irikura, Kojiro; Fukuchi, Yasunaga.

    1990-01-01

    In this paper, we apply the semi-empirical synthesis method of IRIKURA (1983, 1986) to the estimation of site-specific ground motion, using accelerograms observed at Kumatori in Osaka prefecture. The target earthquakes used here are a comparatively distant earthquake (Δ=95 km, M=5.6) caused by the YAMASAKI fault and a near earthquake (Δ=27 km, M=5.6). The results obtained are as follows. 1) The accelerograms from the distant earthquake (M=5.6) are synthesized using the aftershock records (M=4.3) of the 1983 YAMASAKI fault earthquake, whose source parameters have been obtained by other authors from the hypocentral distribution of the aftershocks. The resultant synthetic motions show a good agreement with the observed ones. 2) The synthesis for a near earthquake (M=5.6, which we call the target earthquake) is made using a small earthquake which occurred in the neighborhood of the target earthquake. Here, we apply two methods for setting the parameters of the synthesis: one uses the parameters of the YAMASAKI fault earthquake, which has the same magnitude as the target earthquake, and the other uses parameters obtained from several existing empirical formulas. The resultant synthetic motion with the former parameters shows a good agreement with the observed one, but that with the latter does not. 3) We estimate the source parameters from the source spectra of several earthquakes which have been observed at this site. Consequently, we find that small earthquakes (M<4) used as Green's functions should be treated carefully because their stress drops are not constant. 4) We propose that not only the magnitudes but also the seismic moments of the target earthquake and the small earthquake should be specified. (J.P.N.)

  20. Moment Magnitudes and Local Magnitudes for Small Earthquakes: Implications for Ground-Motion Prediction and b-values

    Science.gov (United States)

    Baltay, A.; Hanks, T. C.; Vernon, F.

    2016-12-01

    We illustrate two essential consequences of the systematic difference between moment magnitude and local magnitude for small earthquakes, illuminating the underlying earthquake physics. Moment magnitude, M ~ 2/3 log M0, is uniformly valid for all earthquake sizes [Hanks and Kanamori, 1979]. However, the relationship between local magnitude ML and moment is itself magnitude dependent: for moderate events ML tracks M, but for small earthquakes ML scales roughly 1.5 times faster with log M0, because their corner frequencies lie above the recording bandwidth fmax. Just as importantly, if this relation is overlooked, prediction of large-magnitude ground motion from small earthquakes will be misguided. We also consider the effect of this magnitude scale difference on b-value. The oft-cited b-value of 1 should hold for small magnitudes, given M. Use of ML necessitates b = 2/3 for the same data set; use of mixed, or unknown, magnitudes complicates the matter further. This is of particular import when estimating the rate of large earthquakes when one has limited data on their recurrence, as is the case for induced earthquakes in the central US.
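    The two consequences can be sketched numerically. The moment-magnitude constant is the standard Hanks-Kanamori value; the 1.5 scaling factor for small-event ML is the one implied by the abstract's b-value argument (b = 1 in M maps to b = 2/3 in ML), so this is an illustrative sketch, not the paper's fitted relation.

    ```python
    import numpy as np

    # Moment magnitude from seismic moment M0 in dyne-cm (Hanks & Kanamori, 1979):
    #   M = (2/3) log10(M0) - 10.7
    def moment_magnitude(M0):
        return (2.0 / 3.0) * np.log10(M0) - 10.7

    # If ML grows ~1.5x faster than M for small events, a Gutenberg-Richter
    # b-value of 1 measured in M maps to b = 1 / 1.5 = 2/3 measured in ML.
    b_in_M = 1.0
    b_in_ML = b_in_M / 1.5

    print(round(moment_magnitude(1e24), 2))  # 5.3
    print(round(b_in_ML, 3))                 # 0.667
    ```

    Mixing the two scales in one catalog therefore biases the apparent b-value, and with it any extrapolated rate of large events.
    
    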

  1. Development of compact long-term broadband ocean bottom seismometer for seafloor observation of slow earthquakes

    Science.gov (United States)

    Yamashita, Y.; Shinohara, M.; Yamada, T.; Shiobara, H.

    2017-12-01

    It is important to understand coupling between plates in a subduction zone for studies of earthquake generation. Recently, low-frequency tremor and very low frequency earthquakes (VLFEs) were discovered at the plate boundary near a trench. These events (slow earthquakes) at the shallow plate boundary should be related to slow slip on the plate boundary. For observation of slow earthquakes, the Broadband Ocean Bottom Seismometer (BBOBS) is useful; however, the number of BBOBSs is limited due to cost. On the other hand, a number of Long-term OBSs (LT-OBSs) with a recording period of one year are available. However, the LT-OBS carries a seismometer with a natural period of 1 second, so its frequency band is somewhat narrow for slow earthquakes. We therefore developed a compact long-term broadband OBS by replacing the seismic sensor of the LT-OBS with a broadband seismometer. We adopted a seismic sensor with a natural period of 20 seconds (Trillium Compact Broadband Seismometer, Nanometrics). Because the tilt of an OBS on the seafloor cannot be controlled due to free-fall deployment, a leveling system for the seismic sensor is necessary. The broadband seismic sensor has a cylindrical shape with a diameter of 90 mm and a height of 100 mm, and the developed leveling system can mount the sensor with no modification of its shape. The leveling system has a diameter of 160 mm and a height of 110 mm, the same size as the existing leveling system of the LT-OBS. It has two horizontal axes, each driven by a motor. Leveling can be performed up to 20 degrees using a micro-processor (Arduino), with a resolution of less than one degree. The system starts leveling immediately on power-on of the controller. After leveling, the seismic sensor is powered and the controller records the leveling angles to SD RAM. The controller is then shut down to consume no power. This compact long-term broadband ocean bottom seismometer is useful for seafloor observation of slow earthquakes.

  2. Earthquake prediction research with plastic nuclear track detectors

    International Nuclear Information System (INIS)

    Woith, H.; Enge, W.; Beaujean, R.; Oschlies, K.

    1988-01-01

    Since 1984 a German-Turkish project on earthquake prediction research has been operating at the North Anatolian fault zone in Turkey. Among many other parameters changes in Radon emission have also been investigated. Plastic nuclear track detectors (Kodak cellulose nitrate LR 115) are used to record alpha-particles emitted from Radon and Thoron atoms and their daughter isotopes. The detectors are replaced and analyzed every 3 weeks. Thus a quasi-continuous time sequence of the Radon soil gas emission is recorded. We present a comparison between measurements made with electronic counters and plastic track detectors. (author)

  3. Earthquake prediction in California using regression algorithms and cloud-based big data infrastructure

    Science.gov (United States)

    Asencio-Cortés, G.; Morales-Esteban, A.; Shang, X.; Martínez-Álvarez, F.

    2018-06-01

    Earthquake magnitude prediction is a challenging problem that has been widely studied during the last decades. Statistical, geophysical and machine learning approaches can be found in the literature, with no particularly satisfactory results. In recent years, powerful computational techniques to analyze big data have emerged, making possible the analysis of massive datasets. These new methods make use of physical resources such as cloud-based architectures. California is known for being one of the regions with the highest seismic activity in the world, and many data are available. In this work, the use of several regression algorithms combined with ensemble learning is explored in the context of big data (a 1 GB catalog is used), in order to predict earthquake magnitude within the next seven days. The Apache Spark framework, the H2O library in the R language and Amazon cloud infrastructure were used, reporting very promising results.
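    The regression-plus-ensemble idea can be sketched in plain NumPy: fit several regressors on bootstrap resamples and average their predictions. This toy version uses bagged least-squares fits on synthetic stand-ins for seismicity indicators; the study itself ran other regression algorithms on Apache Spark, H2O in R, and Amazon cloud infrastructure.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic features standing in for seismicity indicators, and
    # synthetic target magnitudes (not the California catalog).
    X = rng.normal(size=(500, 5))
    w_true = rng.normal(size=5)
    y = 3.0 + (X @ w_true) * 0.3 + rng.normal(scale=0.2, size=500)
    Xtr, ytr, Xte = X[:400], y[:400], X[400:]

    def fit_least_squares(X, y):
        """Ordinary least squares with an intercept column."""
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coef

    # "Ensemble": average predictions from regressors fit on bootstrap resamples.
    member_preds = []
    for _ in range(10):
        idx = rng.integers(0, len(Xtr), len(Xtr))
        c = fit_least_squares(Xtr[idx], ytr[idx])
        member_preds.append(c[0] + Xte @ c[1:])
    pred = np.mean(member_preds, axis=0)
    print(pred.shape)  # (100,)
    ```

    Averaging over resampled fits is the simplest ensemble scheme; gradient-boosted or random-forest regressors would slot into the same loop.
    
    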

  4. Earthquake precursory events around epicenters and local active faults

    Science.gov (United States)

    Valizadeh Alvan, H.; Mansor, S. B.; Haydari Azad, F.

    2013-05-01

    The chain of underground events which are triggered by seismic activities and physical/chemical interactions prior to a shake in the earth's crust may produce surface and above-surface phenomena. During the past decades many researchers have sought the possibility of short-term earthquake prediction using remote sensing data. Currently, there are several theories about the preparation stages of earthquakes, most of which stress rises in heat and seismic waves as the main signs of an impending earthquake; they differ only in the secondary phenomena triggered by these events. In any case, with the recent advances in remote sensing sensors and techniques, we are now able to provide wider, more accurate monitoring of land, ocean and atmosphere. Among all theoretical factors, changes in Surface Latent Heat Flux (SLHF), Sea and Land Surface Temperature (SST and LST) and surface chlorophyll-a are easier to record from earth-observing satellites. SLHF is the amount of energy exchanged in the form of water vapor between the earth's surface and the atmosphere. Abnormal variations in this factor have frequently been reported as an earthquake precursor during the past years. The stress accumulated in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shakes. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. In the case of oceanic earthquakes, higher temperature at the ocean bed may lead to higher amounts of Chl-a at the sea surface. On the other hand, it has also been said that the leak of radon gas, which occurs as rocks break during earthquake preparation, causes the formation of airborne ions and higher Air Temperature (AT). We have chosen to perform a statistical, long-term, and short-term approach by considering the reoccurrence intervals of past

  5. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    Science.gov (United States)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 min after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model is developed for the whole of Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  6. The ordered network structure of M ≥ 6 strong earthquakes and its prediction in the Jiangsu-South Yellow Sea region

    Energy Technology Data Exchange (ETDEWEB)

    Men, Ke-Pei [Nanjing Univ. of Information Science and Technology (China). College of Mathematics and Statistics; Cui, Lei [California Univ., Santa Barbara, CA (United States). Applied Probability and Statistics Dept.

    2013-05-15

    The Jiangsu-South Yellow Sea region is one of the key seismic monitoring and defence areas in the eastern part of China. Since 1846, M ≥ 6 strong earthquakes have shown an obvious commensurability and orderliness in this region. The main orderly values are 74-75 a, 57-58 a, 11-12 a, and 5-6 a, with 74-75 a and 57-58 a playing an outstanding predictive role. According to the information prediction theory of Wen-Bo Weng, we conceived the M ≥ 6 strong earthquake ordered network structure for the South Yellow Sea and the whole region. Based on this, we analyzed and discussed the variation of seismicity in detail and also made a trend prediction of M ≥ 6 strong earthquakes in the future. The results showed that since 1998 the region has entered a new quiet episode which may continue until about 2042; the first M ≥ 6 strong earthquake of the next active episode will probably occur around 2053, likely in the sea area of the South Yellow Sea; the second and third events, or a strong earthquake swarm, will probably occur around 2058 and 2070, respectively. (orig.)

  7. Long-term change of activity of very low-frequency earthquakes in southwest Japan

    Science.gov (United States)

    Baba, S.; Takeo, A.; Obara, K.; Kato, A.; Maeda, T.; Matsuzawa, T.

    2017-12-01

    Near the seismogenic zone of megathrust earthquakes on the plate interface, various types of slow earthquakes have been detected, including non-volcanic tremors, slow slip events (SSEs) and very low frequency earthquakes (VLFEs). VLFEs are classified into deep VLFEs, which occur on the downdip side of the seismogenic zone, and shallow VLFEs, which occur on the updip side, i.e. at several kilometers depth in southwest Japan. As a member of the slow earthquake family, VLFE activity is expected to be a proxy of interplate slip, because VLFEs have the same mechanisms as interplate slip and are detected during episodic tremor and slip (ETS). However, the long-term change of VLFE seismicity has not been well constrained compared to deep low-frequency tremor. We thus studied long-term changes in the activity of VLFEs in southwest Japan, where ETS and long-term SSEs have been most intensive. We used continuous seismograms of F-net broadband seismometers operated by NIED from April 2004 to March 2017. After applying a band-pass filter with a frequency range of 0.02-0.05 Hz, we adopted the matched-filter technique for detecting VLFEs. We prepared templates by calculating synthetic waveforms for each hypocenter grid, assuming typical focal mechanisms of VLFEs. The correlation coefficients between templates and continuous F-net seismograms were calculated at each grid every 1 s in all components; the grid interval is 0.1 degree in both longitude and latitude. A VLFE was declared detected if the average of the correlation coefficients exceeded a threshold, defined as eight times the median absolute deviation of the distribution. At grids in the Bungo channel, where long-term SSEs occur frequently, the cumulative number of detected VLFEs increased rapidly in 2010 and 2014, modulated by stress loading from the long-term SSEs. At inland grids near the Bungo channel, the cumulative number increases steeply every half a year. This stepwise
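    The detection recipe described above (template cross-correlation with an 8x-MAD threshold) can be sketched as follows. This is a toy single-channel version on synthetic data, not the multi-grid, multi-component F-net implementation.

    ```python
    import numpy as np

    def matched_filter_detect(trace, template, k=8.0):
        """Slide a template along a continuous trace, computing the
        normalized cross-correlation at each offset; declare detections
        where the correlation exceeds k times the median absolute
        deviation (MAD) of the correlation series."""
        n = len(template)
        t = (template - template.mean()) / (template.std() * n)
        cc = np.empty(len(trace) - n + 1)
        for i in range(len(cc)):
            w = trace[i:i + n]
            cc[i] = np.sum(t * (w - w.mean()) / (w.std() + 1e-12))
        mad = np.median(np.abs(cc - np.median(cc)))
        return np.where(cc > k * mad)[0], cc

    # Synthetic demo: one "VLFE" waveform buried in noise at sample 500.
    rng = np.random.default_rng(1)
    template = np.sin(np.linspace(0, 6 * np.pi, 60))
    trace = 0.2 * rng.normal(size=2000)
    trace[500:560] += template
    hits, cc = matched_filter_detect(trace, template)
    print(500 in hits)  # True
    ```

    The MAD-based threshold adapts to the noise level of each grid's correlation series, which is why it is preferred over a fixed correlation cutoff.
    
    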

  8. The bayesian probabilistic prediction of the next earthquake in the ometepec segment of the mexican subduction zone

    Science.gov (United States)

    Ferraes, Sergio G.

    1992-06-01

    A predictive equation to estimate the interoccurrence time (τ) of the next earthquake (M ≥ 6) in the Ometepec segment is presented, based on Bayes' theorem and the Gaussian process. Bayes' theorem is used to relate the Gaussian process to both a log-normal distribution of recurrence times (τ) and a log-normal distribution of magnitudes (M) (Nishenko and Buland, 1987; Lomnitz, 1964). We constructed two new random variables X = ln M and Y = ln τ with normal marginal densities, and based on the Gaussian process model we assume that their joint density is normal. Using this information, we determine the Bayesian conditional probability. Finally, a predictive equation is derived, based on the criterion of maximizing the Bayesian conditional probability. The model forecasts the next interoccurrence time, conditional on the magnitude of the last event. Realistic estimates of future damaging earthquakes are based on relocated historical earthquakes. However, at the present time there is a controversy between Nishenko-Singh and González-Ruíz-McNally concerning the rupturing process of the 1907 earthquake. We use our Bayesian analysis to examine and discuss this very important controversy. To clarify the full significance of the analysis, we present the results using two catalogues: (1) the Ometepec catalogue without the 1907 earthquake (González-Ruíz-McNally), and (2) the Ometepec catalogue including the 1907 earthquake (Nishenko-Singh). The comparison of the prediction errors reveals that for the Nishenko-Singh catalogue the errors are considerably smaller than the average error for the González-Ruíz-McNally catalogue of relocated events. Finally, using the Nishenko-Singh catalogue, which locates the 1907 event inside the Ometepec segment, we conclude that the next expected damaging earthquake (M ≥ 6.0) will occur approximately within the next time interval τ = 11.82 years from the last event (which occurred on July 2, 1984), or equivalently will
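    The core of the method is the conditional mean of a bivariate normal in (ln M, ln τ). A minimal sketch, with entirely hypothetical catalog values rather than the Ometepec data:

    ```python
    import numpy as np

    # With X = ln M and Y = ln tau jointly normal, the conditional mean
    # E[Y | X = x] = mu_Y + rho * (sigma_Y / sigma_X) * (x - mu_X)
    # yields the most probable next interoccurrence time given the last
    # event's magnitude. The values below are made up for illustration.
    M   = np.array([6.2, 6.9, 7.0, 6.4, 6.6, 7.3])       # magnitudes (hypothetical)
    tau = np.array([10.5, 14.0, 16.2, 9.8, 12.1, 18.0])  # gaps in years (hypothetical)

    X, Y = np.log(M), np.log(tau)
    rho = np.corrcoef(X, Y)[0, 1]
    y_given_x = Y.mean() + rho * (Y.std() / X.std()) * (np.log(7.0) - X.mean())
    tau_pred = float(np.exp(y_given_x))  # predicted interoccurrence time (yr)
    print(round(tau_pred, 1))
    ```

    With a real catalog, the same conditional-mean formula (evaluated at the magnitude of the last event) reproduces the kind of τ estimate quoted in the abstract.
    
    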

  9. Near-fault earthquake ground motion prediction by a high-performance spectral element numerical code

    International Nuclear Information System (INIS)

    Paolucci, Roberto; Stupazzini, Marco

    2008-01-01

    Near-fault effects have been widely recognised to produce specific features of earthquake ground motion that cannot be reliably predicted by the 1D seismic wave propagation modelling used as a standard in engineering applications. These features may have a relevant impact on the structural response, especially in the nonlinear range, which is hard to predict and to cast in a design format due to the scarcity of significant earthquake records and of reliable numerical simulations. In this contribution a pilot study is presented for the evaluation of seismic ground motions in the near-fault region, based on a high-performance numerical code for 3D seismic wave propagation analyses, including the seismic fault, the wave propagation path and the near-surface geological or topographical irregularity. For this purpose, the software package GeoELSE, based on the spectral element method, is adopted. The numerical benchmark of 3D ground motion simulation in the valley of Grenoble (French Alps) is chosen to study the effect of the complex interaction between basin geometry and radiation mechanism on the variability of earthquake ground motion

  10. Analyses of computer programs for the probabilistic estimation of design earthquake and seismological characteristics of the Korean Peninsula

    International Nuclear Information System (INIS)

    Lee, Gi Hwa

    1997-11-01

    The purpose of the present study is to develop predictive equations, adequate for the Korean Peninsula, from simulated motions, and to analyze and utilize computer programs for the probabilistic estimation of design earthquakes. In Part I of the report, computer programs for the probabilistic estimation of design earthquakes are analyzed and applied to seismic hazard characterization of the Korean Peninsula. In Part II, available instrumental earthquake records are analyzed to estimate earthquake source characteristics and medium properties, which are incorporated into the simulation process, and earthquake records are simulated using the estimated parameters. Finally, predictive equations constructed from the simulations are given in terms of magnitude and hypocentral distance
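    Predictive equations "in terms of magnitude and hypocentral distance" are typically of the standard attenuation-relation form. A sketch with that generic form and purely illustrative coefficients (the report's actual regression constants are not reproduced here):

    ```python
    import numpy as np

    # Generic attenuation-relation form often fit by such studies:
    #   ln(Y) = c0 + c1*M - c2*ln(R) - c3*R
    # where Y is a ground-motion amplitude, M is magnitude, and R is
    # hypocentral distance in km. Coefficients are hypothetical.
    def predict_ground_motion(M, R_km, c=(0.5, 1.2, 1.0, 0.004)):
        c0, c1, c2, c3 = c
        return np.exp(c0 + c1 * M - c2 * np.log(R_km) - c3 * R_km)

    # Predicted amplitude decays with distance and grows with magnitude:
    print(predict_ground_motion(6.5, 30.0) > predict_ground_motion(6.5, 100.0))  # True
    print(predict_ground_motion(7.0, 30.0) > predict_ground_motion(6.5, 30.0))   # True
    ```

    The c2 term captures geometric spreading and the c3 term anelastic attenuation; regression against simulated records yields region-specific values of the coefficients.
    
    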

  11. Case history of an anticipated event: The major (Mw = 7.0) Vrancea, Romania earthquake of 1986 - revisited

    International Nuclear Information System (INIS)

    Marza, V.; Burlacu, B.V.; Pantea, A.; Malita, Z.

    2002-01-01

    This is a reissue of a paper initially published in the European Seismological Commission Proceedings of the XXI General Assembly, held on 23-27 August 1988 in Sofia, Bulgaria, p. 515-523, and released in 1989. We present here an excerpt of the original paper, taking advantage of modern digital graphics, removing some typing mistakes and adding some explanatory late notes, in order to recall the notable earthquake prediction research results achieved by Romanian seismology after the forecasted 1977 Vrancea major event. For the sake of understanding, we distinguish between earthquake forecasting (long-term prediction, i.e. a time window of years, but less than 20% of the mean return period for the involved magnitude, and a lead time of years) and earthquake anticipation (medium-term prediction, i.e. a time window of a few months and a lead time of months), stages which proved to be feasible for the Vrancea seismogenic zone. Analysis and discussion of a variety of precursory seismicity patterns (p.s.p.) belonging to all temporal developmental stages of the preparatory (geo)physical process leading to the killer and damaging major subcrustal Vrancea, Romania, earthquake of August 30, 1986 (epicenter 45.5°N/26.4°E; depth 144 km; magnitude(s) mw=7.0, Mw=7.3, ML=7.0; I0=VIII 1/2 MSK) are performed and documented, clearly proving that the earthquake would not have been unexpected. The salient features of the Vrancea Seismogenic Zone (VSZ) and its tectonic setting have been presented elsewhere. The seismological database used in this study is the earthquake master catalogue of Constantinescu and Marza, updated on the basis of the data supplied by the real-time telemetered seismographic network of Romania, centered on the VSZ. The contents of the paper are as follows: 1. Introduction; 2. The Vrancea 1986 Major (mw=7.0) Subcrustal Earthquake Related Precursors; 2.1. Regularity Patterns; 2.2. Preseismic Quiescence; 2.3. Hypocentral migration

  12. Inelastic spectra to predict period elongation of structures under earthquake loading

    DEFF Research Database (Denmark)

    Katsanos, Evangelos; Sextos, A.G.

    2015-01-01

    Period lengthening, exhibited by structures when subjected to strong ground motions, constitutes an implicit proxy of structural inelasticity and associated damage. However, reliable prediction of the inelastic period is a tedious, multi-parametric task, which is related to both epistemic ...... for period lengthening as a function of Ry and Tel. These equations may be used in the framework of earthquake record selection and scaling.

  13. Combining multiple earthquake models in real time for earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Wu, Stephen; Beck, James L; Heaton, Thomas H.

    2017-01-01

    The ultimate goal of earthquake early warning (EEW) is to provide local shaking information to users before the strong shaking from an earthquake reaches their location. This is accomplished by operating one or more real‐time analyses that attempt to predict shaking intensity, often by estimating the earthquake’s location and magnitude and then predicting the ground motion from that point source. Other EEW algorithms use finite rupture models or may directly estimate ground motion without first solving for an earthquake source. EEW performance could be improved if the information from these diverse and independent prediction models could be combined into one unified, ground‐motion prediction. In this article, we set the forecast shaking at each location as the common ground to combine all these predictions and introduce a Bayesian approach to creating better ground‐motion predictions. We also describe how this methodology could be used to build a new generation of EEW systems that provide optimal decisions customized for each user based on the user’s individual false‐alarm tolerance and the time necessary for that user to react.
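    The Bayesian framework itself is not reproduced in the abstract, but the core idea of fusing diverse shaking estimates into one prediction can be illustrated by a minimal precision-weighted combination of independent Gaussian estimates. All numbers below are hypothetical, and the actual EEW methodology is considerably richer:

    ```python
    # Minimal sketch: combining independent Gaussian ground-motion predictions.
    # Each algorithm i reports a mean shaking estimate mu_i with uncertainty
    # sigma_i. Under a flat prior, the posterior is Gaussian with a
    # precision-weighted mean and a variance smaller than any single input.
    # All values are hypothetical illustrations, not the paper's framework.

    def combine_predictions(mus, sigmas):
        """Return posterior (mean, sigma) from independent Gaussian estimates."""
        precisions = [1.0 / s**2 for s in sigmas]
        total_precision = sum(precisions)
        mean = sum(p * m for p, m in zip(precisions, mus)) / total_precision
        return mean, total_precision ** -0.5

    # Hypothetical case: a point-source algorithm estimates intensity 6.0 +/- 1.0,
    # while a finite-fault algorithm estimates 7.0 +/- 0.5.
    mean, sigma = combine_predictions([6.0, 7.0], [1.0, 0.5])
    ```

    The combined estimate is pulled toward the more certain input, and its uncertainty is smaller than either input's, which is the property that makes combining independent EEW analyses attractive.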

  14. AN EFFECTIVE HYBRID SUPPORT VECTOR REGRESSION WITH CHAOS-EMBEDDED BIOGEOGRAPHY-BASED OPTIMIZATION STRATEGY FOR PREDICTION OF EARTHQUAKE-TRIGGERED SLOPE DEFORMATIONS

    Directory of Open Access Journals (Sweden)

    A. A. Heidari

    2015-12-01

    Full Text Available Earthquakes can pose severe hazards to natural slopes and land infrastructure. One of the chief consequences of earthquakes can be landsliding, triggered by prolonged shaking. In this research, an efficient procedure is proposed to assist the prediction of earthquake-originated slope displacements (EIDS). A new hybrid SVM-CBBO strategy is implemented to predict the EIDS. For this purpose, first, a chaos paradigm is combined with the initialization of BBO to enhance the diversification and intensification capacity of the conventional BBO optimizer. Then, chaotic BBO is developed as the search scheme to investigate the best values of the SVR parameters. This paper confirms the effectiveness of the new computing approach in predicting EIDS. The outcomes affirm that the SVR-BBO strategy with chaos can be employed effectively as a predictive tool for evaluating the EIDS.

  15. Earthquake Forecasting Methodology Catalogue - A collection and comparison of the state-of-the-art in earthquake forecasting and prediction methodologies

    Science.gov (United States)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

    Earthquake forecasting and prediction has been one of the key struggles of modern geosciences for the last few decades. A large number of approaches for various time periods have been developed for different locations around the world. A categorization and review of more than 20 new and old methods was undertaken to develop a state-of-the-art catalogue of forecasting algorithms and methodologies. The methods have been categorized into time-independent, time-dependent and hybrid methods, where the last group comprises methods that use additional data beyond historical earthquake statistics. Such a categorization is necessary to distinguish purely statistical approaches, for which historical earthquake data is the only direct data source, from algorithms that incorporate further information, e.g. spatial data on fault distributions, or physical models such as static triggering, to indicate future earthquakes. Furthermore, the location of application has been taken into account, to identify methods that can be applied, for example, in active tectonic regions like California or in less active continental regions. In general, most of the methods cover well-known high-seismicity regions like Italy, Japan or California. Many more elements have been reviewed, including the application of established theories and methods, e.g. for the determination of the completeness magnitude, or whether the modified Omori law was used. Target temporal scales are identified, as well as the publication history. All these aspects have been reviewed and catalogued to provide an easy-to-use tool for the development of earthquake forecasting algorithms and an overview of the state-of-the-art.

  16. Geosphere Stability for long-term isolation of radioactive waste. Case study for hydrological change with earthquakes and faulting

    International Nuclear Information System (INIS)

    Niwa, Masakazu

    2016-01-01

    Appropriate estimation and safety assessment of long-term changes in the geological environment are essential to improving the reliability of geological disposal. Specifically, the study of faults is important both for understanding regional groundwater flow and for assessing faults as triggers of future earthquakes. Here, the possibility of earthquake-induced changes in the permeability of fault materials was examined based on monitoring data of groundwater pressure before and after the 2011 off the Pacific coast of Tohoku Earthquake. (author)

  17. Constraining the Long-Term Average of Earthquake Recurrence Intervals From Paleo- and Historic Earthquakes by Assimilating Information From Instrumental Seismicity

    Science.gov (United States)

    Zoeller, G.

    2017-12-01

    Paleo- and historic earthquakes are the most important source of information for the estimation of long-term recurrence intervals in fault zones, because sequences of paleoearthquakes cover more than one seismic cycle. On the other hand, these events are often rare, dating uncertainties are enormous, and missing or misinterpreted events create additional problems. Given these shortcomings, estimates of long-term recurrence intervals are usually unstable unless additional information is included. In the present study, we assume that the time to the next major earthquake depends on the rate of small and intermediate events between the large ones, in terms of a "clock-change" model that leads to a Brownian Passage Time distribution for recurrence intervals. We take advantage of an earlier finding that the aperiodicity of this distribution can be related to the Gutenberg-Richter b-value, which is usually around one and can be estimated easily from instrumental seismicity in the region under consideration. This allows us to reduce the uncertainties in the estimation of the mean recurrence interval significantly, especially for short paleoearthquake sequences and high dating uncertainties. We present illustrative case studies from Southern California and compare the method with the commonly used approach of exponentially distributed recurrence times assuming a stationary Poisson process.
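    The Brownian Passage Time distribution mentioned above coincides with the inverse Gaussian distribution, so recurrence intervals can be sampled with standard tools. A minimal sketch (the mean recurrence interval and aperiodicity values below are hypothetical, not taken from the study):

    ```python
    # Sketch: sampling earthquake recurrence intervals from a Brownian Passage
    # Time (BPT) distribution, which is the inverse Gaussian distribution.
    # In scipy's parameterization, mean = mu * scale and the coefficient of
    # variation (the BPT "aperiodicity") equals sqrt(mu), so for a target mean
    # T and aperiodicity alpha we set mu = alpha**2 and scale = T / alpha**2.
    from scipy.stats import invgauss

    def bpt(mean_interval, aperiodicity):
        """BPT distribution with given mean and aperiodicity (coeff. of variation)."""
        mu = aperiodicity ** 2
        return invgauss(mu, scale=mean_interval / mu)

    # Hypothetical fault: 150-year mean recurrence, aperiodicity 0.5
    dist = bpt(150.0, 0.5)
    samples = dist.rvs(size=100_000, random_state=0)  # simulated intervals, years
    ```

    Constraining the aperiodicity externally (e.g. via the b-value, as the abstract proposes) leaves only the mean interval to be estimated from the short paleoearthquake sequence, which is what stabilizes the estimate.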

  18. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Directory of Open Access Journals (Sweden)

    Mustafa ULAS

    2013-01-01

    Full Text Available A lot of people die in earthquakes every year. It is therefore crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are reviewed to guide new research toward novel prediction methods.

  19. State Vector: A New Approach to Prediction of the Failure of Brittle Heterogeneous Media and Large Earthquakes

    Science.gov (United States)

    Yu, Huai-Zhong; Yin, Xiang-Chu; Zhu, Qing-Yong; Yan, Yu-Ding

    2006-12-01

    The concept of a state vector stems from statistical physics, where it is usually used to describe activity patterns of a physical field in a coarse-grained manner. In this paper, we propose an approach in which the state vector is applied to describe quantitatively the damage evolution of brittle heterogeneous systems, and some interesting results are presented: prior to the macro-fracture of rock specimens and the occurrence of a strong earthquake, the four relevant scalar time series derived from the state vectors evolved anomalously. As retrospective studies, some prominent large earthquakes that occurred in the Chinese Mainland (e.g., the M 7.4 Haicheng earthquake on February 4, 1975, and the M 7.8 Tangshan earthquake on July 28, 1976) were investigated. Results show considerable promise that the time-dependent state vectors could serve as a kind of precursor to predict earthquakes.

  20. Analysis of Earthquake Catalogs for CSEP Testing Region Italy

    International Nuclear Information System (INIS)

    Peresan, A.; Romashkova, L.; Nekrasova, A.; Kossobokov, V.; Panza, G.F.

    2010-07-01

    A comprehensive analysis shows that the set of catalogs provided by the Istituto Nazionale di Geofisica e Vulcanologia (INGV, Italy) as the authoritative database for the Collaboratory for the Study of Earthquake Predictability - Testing Region Italy (CSEP-TRI) is hardly unified enough to be acceptable for the necessary tuning of models/algorithms, or for running rigorous prospective predictability tests at intermediate- or long-term scale. (author)

  1. Prediction of the area affected by earthquake-induced landsliding based on seismological parameters

    Science.gov (United States)

    Marc, Odin; Meunier, Patrick; Hovius, Niels

    2017-07-01

    We present an analytical, seismologically consistent expression for the surface area of the region within which most landslides triggered by an earthquake are located (landslide distribution area). This expression is based on scaling laws relating seismic moment, source depth, and focal mechanism with ground shaking and fault rupture length and assumes a globally constant threshold of acceleration for onset of systematic mass wasting. The seismological assumptions are identical to those recently used to propose a seismologically consistent expression for the total volume and area of landslides triggered by an earthquake. To test the accuracy of the model we gathered geophysical information and estimates of the landslide distribution area for 83 earthquakes. To reduce uncertainties and inconsistencies in the estimation of the landslide distribution area, we propose an objective definition based on the shortest distance from the seismic wave emission line containing 95 % of the total landslide area. Without any empirical calibration the model explains 56 % of the variance in our dataset, and predicts 35 to 49 out of 83 cases within a factor of 2, depending on how we account for uncertainties on the seismic source depth. For most cases with comprehensive landslide inventories we show that our prediction compares well with the smallest region around the fault containing 95 % of the total landslide area. Aspects ignored by the model that could explain the residuals include local variations of the threshold of acceleration and processes modulating the surface ground shaking, such as the distribution of seismic energy release on the fault plane, the dynamic stress drop, and rupture directivity. Nevertheless, its simplicity and first-order accuracy suggest that the model can yield plausible and useful estimates of the landslide distribution area in near-real time, with earthquake parameters issued by standard detection routines.

  2. Long-term impact of earthquake stress on fasting glucose control and diabetes prevalence among Chinese adults of Tangshan

    OpenAIRE

    An, Cuixia; Zhang, Yun; Yu, Lulu; Li, Na; Song, Mei; Wang, Lan; Zhao, Xiaochuan; Gao, Yuanyuan; Wang, Xueyi

    2014-01-01

    Objective: To investigate the long-term influence of stresses from the 1976 Tangshan earthquake on blood glucose control and the incidence of diabetes mellitus in Chinese people of Tangshan. Methods: 1,551 adults ≥ 37 years of age were recruited for this investigation in Tangshan city of China, where one of the deadliest earthquakes occurred in 1976. All subjects finished a questionnaire. 1,030 of them who experienced that earthquake were selected into the exposure group, while 521 were gathe...

  3. Limits on the potential accuracy of earthquake risk evaluations using the L’Aquila (Italy) earthquake as an example

    Directory of Open Access Journals (Sweden)

    John Douglas

    2015-06-01

    Full Text Available This article is concerned with attempting to ‘predict’ (hindcast) the damage caused by the L’Aquila 2009 earthquake (Mw 6.3) and, more generally, with the question of how close predicted damage can ever be to observations. Damage is hindcast using a well-established empirical approach based on vulnerability indices and macroseismic intensities, adjusted for local site effects. Using information that was available before the earthquake and assuming the same event characteristics as the L’Aquila mainshock, the overall damage is reasonably well predicted, but there are considerable differences in the damage pattern. To understand the reasons for these differences, information that was only available after the event was included in the calculation. Despite some improvement in the predicted damage, in particular through modification of the vulnerability indices and the parameter controlling the width of the damage distribution, these hindcasts do not match all the details of the observations. This is because of local effects: both in terms of the ground shaking, which is only detectable through the installation of a much denser strong-motion network and a detailed microzonation, and in terms of the building vulnerability, which cannot be modeled using a statistical approach but would require detailed analytical modeling, for which calibration data are likely to be lacking. Future studies should concentrate on adjusting the generic components of the approach to make them more applicable to the location of interest. To increase the number of observations available to make these adjustments, we encourage the collection of damage states (and not just habitability classes) following earthquakes, and also the installation of dense strong-motion networks in built-up areas.

  4. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
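    The short-term-average/long-term-average (STA/LTA) trigger used on the tweet-frequency series can be sketched in a few lines. Window lengths, the threshold, and the toy series below are hypothetical illustrations, not the USGS operational settings:

    ```python
    # Sketch of an STA/LTA trigger on a per-minute tweet-count series: flag the
    # minutes where the short-window mean jumps well above the long-window mean.
    # Window lengths and threshold are hypothetical, not operational values.

    def sta_lta_triggers(counts, sta_len=2, lta_len=10, threshold=3.0):
        """Return indices where STA/LTA of the count series exceeds threshold."""
        triggers = []
        for i in range(lta_len, len(counts)):
            sta = sum(counts[i - sta_len:i]) / sta_len   # short-term average
            lta = sum(counts[i - lta_len:i]) / lta_len   # long-term average
            if lta > 0 and sta / lta >= threshold:
                triggers.append(i)
        return triggers

    # Quiet background of ~2 tweets/min, then a burst after a widely felt event
    series = [2, 1, 2, 3, 2, 1, 2, 2, 3, 2, 2, 1, 40, 80, 60, 30, 10, 5]
    print(sta_lta_triggers(series))
    ```

    Because the LTA adapts to the background chatter level, the same threshold works across regions with different baseline tweet rates, which is the reason STA/LTA detectors are preferred over a fixed count threshold.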

  5. Pattern recognition methodologies and deterministic evaluation of seismic hazard: A strategy to increase earthquake preparedness

    International Nuclear Information System (INIS)

    Peresan, Antonella; Panza, Giuliano F.; Gorshkov, Alexander I.; Aoudia, Abdelkrim

    2001-05-01

    Several algorithms, structured according to a general pattern-recognition scheme, have been developed for the space-time identification of strong events. Currently, two such algorithms are applied to the Italian territory: one for the recognition of earthquake-prone areas and the other, the CN algorithm, for earthquake prediction purposes. These procedures can be viewed as independent experts, hence they can be combined to better constrain the alerted seismogenic area. We examine here the possibility of integrating CN intermediate-term medium-range earthquake predictions, pattern recognition of earthquake-prone areas and deterministic hazard maps, in order to associate CN Times of Increased Probability (TIPs) with a set of appropriate scenarios of ground motion. The advantage of this procedure consists mainly in the time information provided by the predictions, useful for increasing the preparedness of safety measures and for indicating a priority for detailed seismic risk studies to be performed at a local scale. (author)

  6. Interevent times in a new alarm-based earthquake forecasting model

    Science.gov (United States)

    Talbi, Abdelhak; Nanjo, Kazuyoshi; Zhuang, Jiancang; Satake, Kenji; Hamdache, Mohamed

    2013-09-01

    This study introduces a new earthquake forecasting model that uses the moment ratio (MR) of the first to second order moments of earthquake interevent times as a precursory alarm index to forecast large earthquake events. This MR model is based on the idea that the MR is associated with anomalous long-term changes in background seismicity prior to large earthquake events. In a given region, the MR statistic is defined as the inverse of the index of dispersion or Fano factor, with MR values (or scores) providing a biased estimate of the relative regional frequency of background events, here termed the background fraction. To test the forecasting performance of the proposed MR model, a composite Japan-wide earthquake catalogue for the years between 679 and 2012 was compiled using the Japan Meteorological Agency catalogue for the period between 1923 and 2012, and the Utsu historical seismicity records between 679 and 1922. MR values were estimated by sampling interevent times from events with magnitude M ≥ 6 using an earthquake random sampling (ERS) algorithm developed during previous research. Three retrospective tests of M ≥ 7 target earthquakes were undertaken to evaluate the long-, intermediate- and short-term performance of MR forecasting, using mainly Molchan diagrams and optimal spatial maps obtained by minimizing a forecasting error defined as the sum of miss and alarm rates. This testing indicates that the MR forecasting technique performs well at long, intermediate and short term. The MR maps produced during long-term testing indicate significant alarm levels before 15 of the 18 shallow earthquakes within the testing region during the past two decades, with an alarm region covering about 20 per cent (alarm rate) of the testing region. The number of shallow events missed by forecasting was reduced by about 60 per cent after using the MR method instead of the relative intensity (RI) forecasting method. At short term, our model succeeded in forecasting the
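    The MR statistic is defined as the inverse of the index of dispersion (Fano factor). One simple reading of that definition, applied to event counts in fixed time windows, can be sketched as follows; the windowing scheme and all parameters here are illustrative assumptions, not the paper's ERS-based estimator:

    ```python
    # Sketch: an MR-style score as the inverse of the Fano factor (index of
    # dispersion) of event counts in fixed-length windows. For a Poisson
    # process the Fano factor is ~1, so MR ~1; clustered seismicity inflates
    # the variance, giving Fano > 1 and MR < 1. Windowing is a hypothetical
    # simplification of the paper's interevent-time sampling scheme.
    import statistics

    def mr_score(event_times, window):
        """MR = mean/variance of event counts per window of fixed length."""
        t_max = max(event_times)
        n_windows = int(t_max // window) + 1
        counts = [0] * n_windows
        for t in event_times:
            counts[int(t // window)] += 1
        mean = statistics.fmean(counts)
        var = statistics.pvariance(counts)
        return mean / var if var > 0 else float("inf")
    ```

    Low MR scores then mark regions/periods where event occurrence is more clustered than Poisson, which is the kind of background anomaly the model uses as an alarm index.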

  7. The severity of an earthquake

    Science.gov (United States)

    ,

    1997-01-01

    The severity of an earthquake can be expressed in terms of both intensity and magnitude. However, the two terms are quite different, and they are often confused. Intensity is based on the observed effects of ground shaking on people, buildings, and natural features. It varies from place to place within the disturbed region depending on the location of the observer with respect to the earthquake epicenter. Magnitude is related to the amount of seismic energy released at the hypocenter of the earthquake. It is based on the amplitude of the earthquake waves recorded on instruments

  8. Earthquake forewarning in the Cascadia region

    Science.gov (United States)

    Gomberg, Joan S.; Atwater, Brian F.; Beeler, Nicholas M.; Bodin, Paul; Davis, Earl; Frankel, Arthur; Hayes, Gavin P.; McConnell, Laura; Melbourne, Tim; Oppenheimer, David H.; Parrish, John G.; Roeloffs, Evelyn A.; Rogers, Gary D.; Sherrod, Brian; Vidale, John; Walsh, Timothy J.; Weaver, Craig S.; Whitmore, Paul M.

    2015-08-10

    This report, prepared for the National Earthquake Prediction Evaluation Council (NEPEC), is intended as a step toward improving communications about earthquake hazards between information providers and users who coordinate emergency-response activities in the Cascadia region of the Pacific Northwest. NEPEC charged a subcommittee of scientists with writing this report about forewarnings of increased probabilities of a damaging earthquake. We begin by clarifying some terminology; a “prediction” refers to a deterministic statement that a particular future earthquake will or will not occur. In contrast to the 0- or 100-percent likelihood of a deterministic prediction, a “forecast” describes the probability of an earthquake occurring, which may range from >0 to <100 percent, based on processes or conditions, which may include increased rates of M>4 earthquakes on the plate interface north of the Mendocino region

  9. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Science.gov (United States)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters have highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep-drilling and other geophysical and geological observations, together with comprehensive modeling of earthquakes and forecasting of extreme events. Extreme earthquakes (large-magnitude and rare events) are manifestations of the complex behavior of the lithosphere, structured as a hierarchical system of blocks of different sizes. Understanding of the physics and dynamics of these extreme events comes from observations, measurements and modeling. A quantitative approach to simulating earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow the study of extreme events and of the influence of fault-network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of the predictability of large earthquakes (how well can large earthquakes be predicted today?) will also be discussed, along with possibilities for the mitigation of earthquake disasters (e.g., 'inverse' forensic investigations of earthquake disasters).

  10. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    A geographic information system (GIS) for the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of damage and the isoseismals of the earthquake are studied. By comparison with a standard earthquake intensity attenuation relationship, anomalous damage distributions of the earthquake are found, and the relationship of these anomalies to tectonics, site conditions and basins is analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied. The implications of the anomalous damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction and earthquake emergency response are discussed.

  11. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.

  12. Comparison of Ground Motion Prediction Equations (GMPE) for Chile and Canada With Recent Chilean Megathust Earthquakes

    Science.gov (United States)

    Herrera, C.; Cassidy, J. F.; Dosso, S. E.

    2017-12-01

    Ground shaking assessment allows quantification of the hazards associated with the occurrence of earthquakes. Chile and western Canada are two areas that have experienced, and are susceptible to, imminent large crustal, in-slab and megathrust earthquakes that can affect the population significantly. In this context, we compare the current GMPEs used in the 2015 National Building Code of Canada, and the most recent GMPEs calculated for Chile, with observed accelerations generated by four recent Chilean megathrust earthquakes (MW ≥ 7.7) that have occurred during the past decade, which is essential to quantify how well current models predict observations of major events. We collected 3-component waveform data from more than 90 stations of the Centro Sismologico Nacional and the Universidad de Chile, and processed them by removing the trend and applying a band-pass filter. Then, for each station, we obtained the Peak Ground Acceleration (PGA) and, using a damped response spectrum, calculated the Pseudo-Spectral Acceleration (PSA). Finally, we compared those observations with the most recent Chilean and Canadian GMPEs. Given the lack of geotechnical information for most of the Chilean stations, we also used a new method to obtain VS30 by inverting H/V ratios with a trans-dimensional Bayesian inversion, which allows us to improve the correction of observations according to soil conditions. As expected, our results show a good fit between observations and the Chilean GMPEs, but we observe that, although the shape of the Canadian GMPEs is consistent with the distribution of observations, in general they underpredict the observations for PGA and PSA at shorter periods for most of the considered earthquakes. An example of this can be seen in the attached figure for the case of the 2014 Iquique earthquake. These results have important implications for the hazards associated with large earthquakes, especially for western Canada, where the probability of a
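    A GMPE comparison of this kind reduces to evaluating a predictive equation at each station's magnitude and distance, then computing residuals against the observed PGA. A minimal sketch using a generic attenuation form with hypothetical placeholder coefficients, not the actual Chilean or NBCC 2015 models:

    ```python
    # Sketch: comparing observed PGA against a generic GMPE of the common form
    # ln(PGA) = a + b*M - c*ln(R + d). Coefficients a, b, c, d are hypothetical
    # placeholders; real GMPEs add site, depth and mechanism terms.
    import math

    def gmpe_ln_pga(magnitude, distance_km, a=-2.0, b=1.0, c=1.5, d=10.0):
        """Predicted natural-log PGA for a generic attenuation form."""
        return a + b * magnitude - c * math.log(distance_km + d)

    def residual(observed_pga, magnitude, distance_km):
        """ln(observed) - ln(predicted); positive means the GMPE underpredicts."""
        return math.log(observed_pga) - gmpe_ln_pga(magnitude, distance_km)
    ```

    Systematically positive residuals across stations, of the kind the abstract reports for the Canadian GMPEs at short periods, indicate that the model underpredicts the observed shaking.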

  13. Radon anomaly in soil gas as an earthquake precursor

    International Nuclear Information System (INIS)

    Miklavcic, I.; Radolic, V.; Vukovic, B.; Poje, M.; Varga, M.; Stanic, D.; Planinic, J.

    2008-01-01

    The mechanical processes of earthquake preparation are always accompanied by deformations; afterwards, complex short- or long-term precursory phenomena can appear. Anomalies of radon concentration in soil gas are registered a few weeks or months before many earthquakes. Radon concentrations in soil gas were continuously measured with LR-115 nuclear track detectors at site A (Osijek) during a 4-year period, as well as with a Barasol semiconductor detector at site B (Kasina) during 2 years. We investigated the influence of meteorological parameters on the temporal radon variations, and we determined a multiple regression equation that enabled the reduction (deconvolution) of the radon variation caused by barometric pressure, rainfall and temperature. The pre-earthquake radon anomalies at site A indicated 46% of the seismic events, on the criterion M≥3, R<200 km, and 21% at site B. Empirical equations relating earthquake magnitude, epicentral distance and precursor time enabled estimation or prediction of an earthquake arising at epicentral distance R from the monitoring site within the expected precursor time T.

  14. Radon anomaly in soil gas as an earthquake precursor

    Energy Technology Data Exchange (ETDEWEB)

    Miklavcic, I.; Radolic, V.; Vukovic, B.; Poje, M.; Varga, M.; Stanic, D. [Department of Physics, University of Osijek, Trg Ljudevita Gaja 6, POB 125, 31000 Osijek (Croatia); Planinic, J. [Department of Physics, University of Osijek, Trg Ljudevita Gaja 6, POB 125, 31000 Osijek (Croatia)], E-mail: planinic@ffos.hr

    2008-10-15

    The mechanical processes of earthquake preparation are always accompanied by deformations; afterwards, complex short- or long-term precursory phenomena can appear. Anomalies of radon concentration in soil gas are registered a few weeks or months before many earthquakes. Radon concentrations in soil gas were continuously measured with LR-115 nuclear track detectors at site A (Osijek) during a 4-year period, as well as with a Barasol semiconductor detector at site B (Kasina) during 2 years. We investigated the influence of meteorological parameters on the temporal radon variations, and we determined a multiple regression equation that enabled the reduction (deconvolution) of the radon variation caused by barometric pressure, rainfall and temperature. The pre-earthquake radon anomalies at site A indicated 46% of the seismic events, on the criterion M≥3, R<200 km, and 21% at site B. Empirical equations relating earthquake magnitude, epicentral distance and precursor time enabled estimation or prediction of an earthquake arising at epicentral distance R from the monitoring site within the expected precursor time T.

  15. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the Landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
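    The mechanism can be sketched as a rate ratio: the chance that a new event is a foreshock is the foreshock rate divided by the total rate, where the total now includes an Omori-decaying aftershock term. The parameter values below are illustrative only, not those calibrated for the Landers sequence:

```python
def foreshock_probability(t_days, r_fore, r_bg, K, c=0.05, p=1.1):
    """Probability that an earthquake occurring t_days after a previous
    mainshock is a foreshock to the major fault.

    r_fore : long-term rate of foreshocks near the fault (events/day)
    r_bg   : long-term background rate of non-foreshocks (events/day)
    K,c,p  : modified-Omori parameters of the aftershock sequence."""
    r_after = K / (t_days + c) ** p          # aftershock rate decays with time
    return r_fore / (r_fore + r_bg + r_after)

p_early = foreshock_probability(1.0, r_fore=0.001, r_bg=0.01, K=5.0)
p_late = foreshock_probability(365.0, r_fore=0.001, r_bg=0.01, K=5.0)
```

    As the aftershock term decays, the probability recovers toward its long-term value r_fore / (r_fore + r_bg), matching the behaviour described in the abstract.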

  16. Physical bases of the generation of short-term earthquake precursors: A complex model of ionization-induced geophysical processes in the lithosphere-atmosphere-ionosphere-magnetosphere system

    Science.gov (United States)

    Pulinets, S. A.; Ouzounov, D. P.; Karelin, A. V.; Davidenko, D. V.

    2015-07-01

    This paper describes the current understanding of the interaction between geospheres from a complex set of physical and chemical processes under the influence of ionization. The sources of ionization involve the Earth's natural radioactivity and its intensification before earthquakes in seismically active regions, anthropogenic radioactivity caused by nuclear weapon testing and accidents in nuclear power plants and radioactive waste storage, the impact of galactic and solar cosmic rays, and active geophysical experiments using artificial ionization equipment. This approach treats the environment as an open complex system with dissipation, where inherent processes can be considered in the framework of the synergistic approach. We demonstrate the synergy between the evolution of thermal and electromagnetic anomalies in the Earth's atmosphere, ionosphere, and magnetosphere. This makes it possible to determine the direction of the interaction process, which is especially important in applications related to short-term earthquake prediction. That is why the emphasis in this study is on the processes preceding the final stage of earthquake preparation; the effects of other ionization sources are used to demonstrate that the model is versatile and broadly applicable in geophysics.

  17. My Road to Transform Faulting 1963; Long-Term Precursors to Recent Great Earthquakes

    Science.gov (United States)

    Sykes, L. R.

    2017-12-01

    My road to plate tectonics started serendipitously in 1963 in a remote area of the southeast Pacific when I was studying the propagation of short-period seismic surface waves for my PhD. The earthquakes I used as sources were poorly located. I discovered that my relocated epicenters followed the crest of the East Pacific Rise but then suddenly took a sharp turn to the east at what I interpreted to be a major fracture zone 1000 km long before turning again to the north near 55 degrees south. I noted that earthquakes along that zone occurred only between the two ridge crests, an observation Tuzo Wilson used to develop his hypothesis of transform faulting. Finding a great, unknown fracture zone led me to conclude that work on similar faults that intersect the Mid-Oceanic Ridge System was more important than my study of surface waves. I found similar great faults over the next two years and obtained refined locations of earthquakes along several island arcs. When I was in Fiji and Tonga during 1965 studying deep earthquakes, James Dorman wrote to me about Wilson's paper and I thought about testing his hypothesis. I started work on it in the spring of 1966, immediately after I learned about the symmetrical "magic magnetic anomaly profile" across the East Pacific Rise of Pitman and Heirtzler. I quickly obtained earthquake mechanisms that verified the transform hypothesis and its related concepts of seafloor spreading and continental drift. When I was an undergraduate in the late 1950s, my mentor told me that respectable young earth scientists should not work on vague and false mobilistic concepts like continental drift, since continents cannot plow through strong oceanic crust. Hence, until spring 1966, I did not take continental drift seriously. The second part of my presentation involves new evidence from seismology and GPS of what appear to be long-term precursors to a number of great earthquakes of the past decade.

  18. Seismic methodology in determining basis earthquake for nuclear installation

    International Nuclear Information System (INIS)

    Ameli Zamani, Sh.

    2008-01-01

    Design basis earthquake ground motions for nuclear installations should be determined to assure the design purpose of reactor safety: reactors should be built and operated so as to pose no undue risk to public health and safety from earthquakes and other hazards. Regarding the seismic hazard at a site, large numbers of earthquake ground motions can be predicted considering the possible variability among source, path, and site parameters. However, seismic safety design using all predicted ground motions is practically impossible. In the determination of design basis earthquake ground motions it is therefore important to represent the influences of the large numbers of earthquake ground motions derived from the seismic ground motion prediction methods for the surrounding seismic sources. Viewing the relations between current design basis earthquake ground motion determination and modern earthquake ground motion estimation, the development of a risk-informed design basis earthquake ground motion methodology is discussed for insight into the ongoing modernization of the Examination Guide for Seismic Design on NPP.

  19. Prediction of strong ground motion based on scaling law of earthquake

    International Nuclear Information System (INIS)

    Kamae, Katsuhiro; Irikura, Kojiro; Fukuchi, Yasunaga.

    1991-01-01

    In order to predict strong ground motion more practically, it is important to study how to use a semi-empirical method when no appropriate observation records of actual small events are available as empirical Green's functions. We propose a prediction procedure using artificially simulated small ground motions as a substitute for the actual motions. First, we simulate small-event motion by means of the stochastic simulation method proposed by Boore (1983), considering path effects such as attenuation and the broadening of the waveform envelope empirically in the target region. Finally, we attempt to predict the strong ground motion due to a future large earthquake (M 7, Δ = 13 km) using the same summation procedure as the empirical Green's function method. We obtained the result that the characteristics of the synthetic motion using the M 5 motion were in good agreement with those from the empirical Green's function method. (author)
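    The summation step can be illustrated with a deliberately crude sketch: delayed, scaled copies of a small-event record are stacked to synthesize a larger event. Real empirical-Green's-function summation (e.g. the Irikura procedure) follows source scaling laws for the number of subevents, delays and scaling factors; the record and parameters below are synthetic placeholders:

```python
import numpy as np

def egf_sum(small, n_sub, dt_delay, c=1.0):
    """Stack n_sub delayed, scaled copies of a small-event record to
    mimic rupture of a larger fault made of n_sub subfaults.
    dt_delay is the delay between subevents, in samples."""
    big = np.zeros(small.size + n_sub * dt_delay)
    for k in range(n_sub):
        start = k * dt_delay                 # rupture reaches subfault k
        big[start:start + small.size] += c * small
    return big

rng = np.random.default_rng(7)
# Toy small-event record: noise with a decaying envelope
small = rng.standard_normal(200) * np.exp(-np.arange(200) / 50.0)
big = egf_sum(small, n_sub=10, dt_delay=20)
```

    The synthetic large-event record is longer and richer than the small one because it aggregates the delayed subevent contributions, which is the essence of the scaling-law-based summation the abstract refers to.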

  20. A Trial for Earthquake Prediction by Precise Monitoring of Deep Ground Water Temperature

    Science.gov (United States)

    Nasuhara, Y.; Otsuki, K.; Yamauchi, T.

    2006-12-01

    A near-future large earthquake is estimated to occur off Miyagi prefecture, northeast Japan, within 20 years with a probability of about 80%. In order to predict this earthquake, we have observed groundwater temperature in a borehole at Sendai city, 100 km west of the asperity. This borehole penetrates the fault zone of the NE-trending active reverse fault, the Nagamachi-Rifu fault zone, at 820 m depth. Our concept of the groundwater observation is that fault zones are natural amplifiers of crustal strain, and hence at 820 m depth we set a very precise quartz temperature sensor with a resolution of 0.0002 deg. C. We confirmed that our observation system works normally by both pumping-up tests and the systematic temperature changes at different depths. Since the observation started on June 20, 2004, we have found mysterious intermittent temperature fluctuations of two types: one with a period of 5-10 days and an amplitude of ca. 0.1 deg. C, and the other with a period of 11-21 days and an amplitude of ca. 0.2 deg. C. Based on an examination using the product of the Grashof and Prandtl numbers, natural convection of water can occur in the borehole. However, since these temperature fluctuations are observed only at depths around 820 m, it is likely that they represent hydrological behaviour specific to the Nagamachi-Rifu fault zone. It is noteworthy that small temperature changes correlatable with the earth tide are superposed on the long-term, large-amplitude fluctuations. The amplitude on the days of the full moon and new moon is ca. 0.001 deg. C. The bottoms of these temperature fluctuations always lag about 6 hours behind the peaks of the earth tide. This is interpreted as water in the borehole being sucked into the fault zone, on which tensional normal stress acts on the days of the full moon and new moon. The amplitude of the crustal strain due to the earth tide was measured at ca. 2×10^-8 strain near our observation site. High frequency temperature noise of
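    The 6-hour lag between the tidal peaks and the temperature minima is the kind of quantity one would estimate by cross-correlation of the two series. A minimal sketch with synthetic hourly data (the tidal constituents and noise level are invented for illustration, not the station's actual records):

```python
import numpy as np

def best_lag(x, y, max_lag):
    """Lag (in samples) at which y best correlates with x, searched over
    -max_lag..max_lag. A positive result means y trails x."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = np.arange(-max_lag, max_lag + 1)
    corr = [np.mean(x[max(0, -L):len(x) - max(0, L)] *
                    y[max(0, L):len(y) - max(0, -L)]) for L in lags]
    return int(lags[int(np.argmax(corr))])

t = np.arange(2000.0)                                  # hourly samples
# Toy tide: semidiurnal (12.42 h) plus diurnal-like (25.82 h) components
tide = np.sin(2 * np.pi * t / 12.42) + 0.5 * np.sin(2 * np.pi * t / 25.82)
rng = np.random.default_rng(6)
temp = np.roll(tide, 6) + 0.05 * rng.standard_normal(t.size)  # trails by 6 h

lag = best_lag(tide, temp, 12)   # recovers the 6-hour delay
```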

  1. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is apparently changed after the time series is rearranged. This suggests that the SOC theory should not be used to oppose the efforts of earthquake prediction.
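    The spirit of the test can be illustrated with a much-simplified statistic: temporal clustering inflates the coefficient of variation of first-return (inter-event) times well above the value near 1 that a randomized catalog exhibits, so the distribution is not invariant under rearrangement. The catalogs below are synthetic; this is not the Letter's P_M(T) computation:

```python
import numpy as np

def return_time_cv(times):
    """Coefficient of variation of first-return (inter-event) times."""
    dt = np.diff(np.sort(times))
    return dt.std() / dt.mean()

rng = np.random.default_rng(2)

# Synthetic clustered catalog: 50 aftershock-like bursts of 20 events
starts = rng.uniform(0, 10_000, 50)
clustered = np.concatenate([s + rng.exponential(0.2, 20) for s in starts])

# Randomized catalog over the same span (what reshuffling tends toward)
randomized = rng.uniform(clustered.min(), clustered.max(), clustered.size)

cv_clustered = return_time_cv(clustered)   # >> 1: clustering
cv_random = return_time_cv(randomized)     # ~ 1: Poisson-like
```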

  2. Multiparameter monitoring of short-term earthquake precursors and its physical basis. Implementation in the Kamchatka region

    Directory of Open Access Journals (Sweden)

    Pulinets Sergey

    2016-01-01

    We apply an experimental approach of multiparameter monitoring of short-term earthquake precursors whose reliability was confirmed by the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model created recently [1]. A key element of the model is the process of Ion-induced Nucleation (IIN) and formation of cluster ions occurring as a result of the ionization of the near-surface air layer by radon emanating from the Earth's crust within the earthquake preparation zone. This process is similar to the formation of droplet embryos for cloud formation under the action of galactic cosmic rays. The consequence of this process is the generation of a number of precursors that can be divided into two groups: (a) thermal and meteorological, and (b) electromagnetic and ionospheric. We demonstrate elements of prospective monitoring of some strong earthquakes in the Kamchatka region and statistical results for the chemical potential correction parameter for more than 10 years of observations for earthquakes with M≥6. As an experimental attempt, data from the monitoring of Kamchatka volcanoes are also demonstrated.

  3. Safety and survival in an earthquake

    Science.gov (United States)

    ,

    1969-01-01

    Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations.

  4. Analysis methods for predicting the behaviour of isolators and formulation of simplified models for use in predicting response of structures to earthquake type input

    International Nuclear Information System (INIS)

    2002-01-01

    This report describes the simplified models for predicting the response of high-damping natural rubber bearings (HDNRB) to earthquake ground motions and benchmark problems for assessing the accuracy of finite element analyses in designing base-isolators. (author)

  5. Long-term Postseismic Deformation Following the 1964 Alaska Earthquake

    Science.gov (United States)

    Freymueller, J. T.; Cohen, S. C.; Hreinsdöttir, S.; Suito, H.

    2003-12-01

    Geodetic data provide a rich data set describing the postseismic deformation that followed the 1964 Alaska earthquake (Mw 9.2). This is particularly true for vertical deformation, since tide gauges and leveling surveys provide extensive spatial coverage. Leveling was carried out over all of the major roads of Alaska in 1964-65, and over the last several years we have resurveyed an extensive data set using GPS. Along Turnagain Arm of Cook Inlet, south of Anchorage, a trench-normal profile was surveyed repeatedly over the first decade after the earthquake, and many of these sites have been surveyed with GPS. After using a geoid model to correct for the difference between geometric and orthometric heights, the leveling+GPS surveys reveal up to 1.25 meters of uplift since 1964. The largest uplifts are concentrated in the northern part of the Kenai Peninsula, SW of Turnagain Arm. In some places, steep gradients in the cumulative uplift measurements point to a very shallow source for the deformation. The average 1964-late 1990s uplift rates were substantially higher than the present-day uplift rates, which rarely exceed 10 mm/yr. Both leveling and tide gauge data document a decay in uplift rate over time as the postseismic signal decreases. However, even today the postseismic deformation represents a substantial portion of the total observed deformation signal, illustrating that very long-lived postseismic deformation is an important element of the subduction zone earthquake cycle for the very largest earthquakes. This is in contrast to much smaller events, such as M~8 earthquakes, for which postseismic deformation in many cases decays within a few years. This suggests that the very largest earthquakes may excite different processes than smaller events.

  6. Diagnosis of time of increased probability of volcanic earthquakes at Mt. Vesuvius zone

    International Nuclear Information System (INIS)

    Rotwain, I.; Kuznetsov, I.; De Natale, G.; Peresan, A.; Panza, G.F.

    2003-06-01

    The possibility of intermediate-term earthquake prediction at Mt. Vesuvius by means of the algorithm CN is explored. CN was originally designed to identify the Times of Increased Probability (TIPs) for the occurrence of strong tectonic earthquakes, with magnitude M ≥ M0, within a region delimited a priori. Here the algorithm CN is applied, for the first time, to the analysis of volcanic seismicity. The earthquakes recorded at Mt. Vesuvius during the period from February 1972 to October 2002 are considered, and the magnitude threshold M0, selecting the events to be predicted, is varied within the range 3.0 - 3.3. Satisfactory prediction results are obtained, by retrospective analysis, when a time scaling is introduced. In particular, when the length of the time windows is reduced by a factor 2.5 - 3 with respect to the standard version of the CN algorithm, more than 90% of the events with M ≥ M0 occur within the TIP intervals, with TIPs occupying about 30% of the total time considered. The control experiment 'Seismic History' demonstrates the stability of the obtained results and indicates that the algorithm CN can be applied to monitor the preparation of impending earthquakes with M ≥ 3.0 at Mt. Vesuvius. (author)

  7. Rethinking earthquake-related DC-ULF electromagnetic phenomena: towards a physics-based approach

    Directory of Open Access Journals (Sweden)

    Q. Huang

    2011-11-01

    Numerous electromagnetic changes possibly related to earthquakes have been independently reported, and attempts have even been made to apply them to short-term prediction of earthquakes. However, there are active debates on the above issue, because the seismogenic process is rather complicated and the studies have been mainly empirical (i.e. a kind of experience-based approach). Thus, a physics-based study would be helpful for understanding earthquake-related electromagnetic phenomena and strengthening their applications. As a potential physics-based approach, I present an integrated research scheme, taking into account the interaction among observation, methodology, and physical model. For simplicity, this work focuses only on the earthquake-related DC-ULF electromagnetic phenomena. The main approach includes the following key problems: (1) how to perform a reliable and appropriate observation with some clear physical quantities; (2) how to develop a robust methodology to reveal weak earthquake-related electromagnetic signals from a noisy background; and (3) how to develop plausible physical models based on theoretical analyses and/or laboratory experiments for the explanation of the earthquake-related electromagnetic signals observed in field conditions.

  8. Diagnosis of time of increased probability of volcanic earthquakes at Mt. Vesuvius zone

    CERN Document Server

    Rotwain, I; Kuznetsov, I V; Panza, G F; Peresan, A

    2003-01-01

    The possibility of intermediate-term earthquake prediction at Mt. Vesuvius by means of the algorithm CN is explored. CN was originally designed to identify the Times of Increased Probability (TIPs) for the occurrence of strong tectonic earthquakes, with magnitude M ≥ M0, within a region a priori delimited. Here the algorithm CN is applied, for the first time, to the analysis of volcanic seismicity. The earthquakes recorded at Mt. Vesuvius, during the period from February 1972 to October 2002, are considered and the magnitude threshold M0, selecting the events to be predicted, is varied within the range: 3.0 - 3.3. Satisfactory prediction results are obtained, by retrospective analysis, when a time scaling is introduced. In particular, when the length of the time windows is reduced by a factor 2.5 - 3, with respect to the standard version of CN algorithm, more than 90% of the events with M ≥ M0 occur within the TIP intervals, with TIPs occupying about 30% of the total time considered. The co...

  9. Earthquake Hazard Assessment: an Independent Review

    Science.gov (United States)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and different accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space, is evident, and such testing must be done in advance of claiming hazardous areas and/or times. The set of errors, i.e. the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of the SHA method and determining the optimal choice of parameters in regard to specified cost-benefit functions. This and other information obtained in such testing may supply us with a realistic estimate of confidence in SHA results and related recommendations on the level of risks for decision making in regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".
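    The Error Diagram evaluation reduces each strategy to a point (τ, ν), where τ is the alerted fraction of space-time and ν the rate of failures-to-predict; random guessing lies on the diagonal ν = 1 - τ, and a skillful method plots below it. A toy computation with synthetic alarms and events (all arrays invented for illustration):

```python
import numpy as np

def molchan_point(alarms, events):
    """One point of the Molchan error diagram.
    alarms : boolean array, True where an alert is declared (space-time bins)
    events : boolean array, True in bins containing target earthquakes
    Returns (tau, nu): alerted fraction and miss rate."""
    tau = alarms.mean()
    nu = np.mean(~alarms[events])        # fraction of events outside alarms
    return tau, nu

rng = np.random.default_rng(3)
n = 10_000
events = np.zeros(n, bool)
events[rng.choice(n, 100, replace=False)] = True

# A skill-less "method": random alarms covering ~30% of space-time
tau_r, nu_r = molchan_point(rng.random(n) < 0.3, events)

# A skillful method: alarms that deliberately cover 80 of the 100 events
skilled = rng.random(n) < 0.25
skilled[np.where(events)[0][:80]] = True
tau_s, nu_s = molchan_point(skilled, events)
```

    The random strategy lands near the diagonal (τ + ν ≈ 1), while the skillful one sits well below it; that gap is what the advance testing described in the abstract is designed to measure.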

  10. Modeling Seismic Cycles of Great Megathrust Earthquakes Across the Scales With Focus at Postseismic Phase

    Science.gov (United States)

    Sobolev, Stephan V.; Muldashev, Iskander A.

    2017-12-01

    Subduction is a substantially multiscale process in which stresses are built up by long-term tectonic motions, modified by sudden jerky deformations during earthquakes, and then restored by multiple subsequent relaxation processes. Here we develop a cross-scale thermomechanical model aimed at simulating the subduction process from a 1-minute to a million-year time scale. The model employs elasticity, nonlinear transient viscous rheology, and rate-and-state friction. It generates spontaneous earthquake sequences and, by using an adaptive time step algorithm, recreates the deformation process as observed naturally during the seismic cycle and multiple seismic cycles. The model predicts that viscosity in the mantle wedge drops by more than three orders of magnitude during a great earthquake with a magnitude above 9. As a result, the surface velocities just an hour or a day after the earthquake are controlled by viscoelastic relaxation in the several hundred kilometers of mantle landward of the trench, and not by afterslip localized at the fault as is currently believed. Our model replicates centuries-long seismic cycles exhibited by the greatest earthquakes and is consistent with the postseismic surface displacements recorded after the Great Tohoku Earthquake. We demonstrate that there is no contradiction between the extremely low mechanical coupling at the subduction megathrust in South Chile inferred from long-term geodynamic models and the occurrence of the largest earthquakes, like the Great Chile 1960 Earthquake.
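    The rate-and-state friction law the model employs can be written down compactly: μ = μ0 + a ln(V/V0) + b ln(V0 θ/Dc), with the aging-law steady state θ_ss = Dc/V giving μ_ss = μ0 + (a - b) ln(V/V0), so a < b produces velocity weakening (the unstable, earthquake-capable regime). The parameter values below are textbook-style illustrations, not the paper's calibration:

```python
import numpy as np

def rs_friction(V, theta, mu0=0.6, a=0.010, b=0.015, V0=1e-6, Dc=1e-4):
    """Rate-and-state friction coefficient, Dieterich form.
    V in m/s, theta in s, Dc in m; a < b gives velocity weakening."""
    return mu0 + a * np.log(V / V0) + b * np.log(V0 * theta / Dc)

def steady_state(V, mu0=0.6, a=0.010, b=0.015, V0=1e-6, Dc=1e-4):
    theta_ss = Dc / V                      # aging-law steady state
    return rs_friction(V, theta_ss, mu0, a, b, V0, Dc)

mu_slow = steady_state(1e-6)   # sliding at the reference (plate-like) rate
mu_fast = steady_state(1e-3)   # sliding 1000x faster: friction drops
```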

  11. Seismic activity prediction using computational intelligence techniques in northern Pakistan

    Science.gov (United States)

    Asim, Khawaja M.; Awais, Muhammad; Martínez-Álvarez, F.; Iqbal, Talat

    2017-10-01

    An earthquake prediction study is carried out for the region of northern Pakistan. The prediction methodology includes the interdisciplinary interaction of seismology and computational intelligence. Eight seismic parameters are computed based upon past earthquakes. The predictive ability of these eight seismic parameters is evaluated in terms of information gain, which leads to the selection of six parameters to be used in prediction. Multiple computationally intelligent models have been developed for earthquake prediction using the selected seismic parameters. These models include a feed-forward neural network, recurrent neural network, random forest, multilayer perceptron, radial basis neural network, and support vector machine. The performance of every prediction model is evaluated, and McNemar's statistical test is applied to assess the statistical significance of the computational methodologies. The feed-forward neural network shows statistically significant predictions along with an accuracy of 75% and a positive predictive value of 78% in the context of northern Pakistan.
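    The information-gain screening of features can be sketched as follows: discretize each continuous seismic parameter, then compute the reduction in label entropy it provides. The feature names, bin count and synthetic data below are illustrative stand-ins for the paper's eight parameters:

```python
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels, bins=4):
    """Information gain of a continuous feature about binary labels,
    after discretizing the feature into equal-frequency bins."""
    edges = np.quantile(feature, np.linspace(0, 1, bins + 1))
    binned = np.clip(np.searchsorted(edges, feature, side="right") - 1,
                     0, bins - 1)
    cond = 0.0
    for v in np.unique(binned):
        mask = binned == v
        cond += mask.mean() * entropy(labels[mask])   # H(label | bin)
    return entropy(labels) - cond

rng = np.random.default_rng(4)
n = 2000
informative = rng.standard_normal(n)                  # correlated with label
labels = (informative + 0.5 * rng.standard_normal(n) > 0).astype(int)
noise = rng.standard_normal(n)                        # unrelated feature

ig_informative = information_gain(informative, labels)
ig_noise = information_gain(noise, labels)
```

    Ranking features by this score and keeping the top ones mirrors the step in which six of the eight parameters were retained.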

  12. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    Science.gov (United States)

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.
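    A forecast based on a nearly constant repeat time can be sketched as a renewal calculation: given the 83 ± 9 yr repeat time treated as a Gaussian, the conditional probability of rupture in the next interval grows sharply as the elapsed time approaches the mean. This is a generic renewal-model illustration, not the forecast method actually used:

```python
import math

def conditional_rupture_prob(t_elapsed, dt, mean=83.0, sigma=9.0):
    """P(rupture within the next dt years | quiescent for t_elapsed years)
    under a Gaussian renewal model of the repeat time."""
    Phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    F = lambda t: Phi((t - mean) / sigma)          # CDF of the repeat time
    return (F(t_elapsed + dt) - F(t_elapsed)) / (1 - F(t_elapsed))

p_early = conditional_rupture_prob(40, 10)   # 40 yr after the last event
p_late = conditional_rupture_prob(80, 10)    # 80 yr after the last event
```

    The contrast between the two values is why a gap approaching the mean repeat time, like the region south of the 1985 rupture zone, is flagged as having high seismic potential.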

  13. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.
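    The first-order effect of the magnitude-frequency assumption can be caricatured in one line: under Gutenberg-Richter scaling with b = 1, the relative rate of M ≥ 7 events given activity at M 4.8 is 10^(-b(7-4.8)), and a characteristic-earthquake bump multiplies that rate. The boost factor below is chosen only to echo the ~37x spread between the abstract's 0.00035 and 0.013 probabilities; this toy is not the paper's calibrated model:

```python
def relative_mainshock_rate(m_target, m_fore, b=1.0, char_boost=1.0):
    """Relative rate of M >= m_target mainshocks implied by activity at
    magnitude m_fore: pure Gutenberg-Richter when char_boost == 1, with
    char_boost > 1 mimicking a characteristic-earthquake excess on the
    fault. Illustrative numbers only."""
    return char_boost * 10 ** (-b * (m_target - m_fore))

p_gr = relative_mainshock_rate(7.0, 4.8)                  # pure G-R scaling
p_char = relative_mainshock_rate(7.0, 4.8, char_boost=37) # characteristic bump
```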

  14. The use of radon as an earthquake precursor

    International Nuclear Information System (INIS)

    Ramola, R.C.; Singh, M.; Sandhu, A.S.; Singh, S.; Virk, H.S.

    1990-01-01

    Radon monitoring for earthquake prediction has been part of an integrated approach since the discovery of coherent, temporally anomalous radon concentrations prior to, during and after the 1966 Tashkent earthquake. In this paper some studies of groundwater and soil-gas radon content in relation to earthquake activity are reviewed. Laboratory experiments and the development of groundwater and soil-gas radon monitoring systems are described. In addition, radon monitoring studies conducted at the Guru Nanak Dev University Campus since 1986 are presented in detail. During these studies some anomalous changes in radon concentration were recorded before earthquakes occurred in the region. The anomalous radon increases are independent of meteorological conditions and appear to be caused by strain changes that precede the earthquake. Anomalous changes in radon concentration before an earthquake suggest that radon monitoring can serve as an additional technique in the earthquake prediction programme in India. (author)

  15. Earthquakes and economic growth

    OpenAIRE

    Fisker, Peter Simonsen

    2012-01-01

    This study explores the economic consequences of earthquakes. In particular, it investigates how exposure to earthquakes affects economic growth both across and within countries. The key result of the empirical analysis is that while there are no observable effects at the country level, earthquake exposure significantly decreases 5-year economic growth at the local level. Areas at lower stages of economic development suffer more in terms of economic growth than richer areas. In addition,...

  16. Thermal infrared anomalies of several strong earthquakes.

    Science.gov (United States)

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies, which have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitudes of Ms 7.0 and above by using satellite infrared remote sensing information. We used new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The overall behaviour of the anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by the method of the "time-frequency relative power spectrum." (2) There exist evident and distinct characteristic periods and magnitudes of anomalous thermal radiation for each case. (3) The thermal radiation anomalies are closely related to the geological structure. (4) The thermal radiation anomalies have obvious characteristics in duration, range, and morphology. In summary, earthquake thermal infrared anomalies can serve as a useful precursor in earthquake prediction and forecasting.

  17. The effects of spatially varying earthquake impacts on mood and anxiety symptom treatments among long-term Christchurch residents following the 2010/11 Canterbury earthquakes, New Zealand.

    Science.gov (United States)

    Hogg, Daniel; Kingham, Simon; Wilson, Thomas M; Ardagh, Michael

    2016-09-01

    This study investigates the effects of disruptions to different community environments, community resilience and cumulated felt earthquake intensities on yearly mood and anxiety symptom treatments from the New Zealand Ministry of Health's administrative databases between September 2009 and August 2012. The sample includes 172,284 long-term residents from different Christchurch communities. Living in a better physical environment was associated with lower mood and anxiety treatment rates after the beginning of the Canterbury earthquake sequence whereas an inverse effect could be found for social community environment and community resilience. These results may be confounded by pre-existing patterns, as well as intensified treatment-seeking behaviour and intervention programmes in severely affected areas. Nevertheless, the findings indicate that adverse mental health outcomes can be found in communities with worse physical but stronger social environments or community resilience post-disaster. Also, they do not necessarily follow felt intensities since cumulative earthquake intensity did not show a significant effect. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Published attenuation functions compared to 6/29/1992 Little Skull Mountain earthquake motion

    International Nuclear Information System (INIS)

    Hofmann, R.B.; Ibrahim, A.K.

    1994-01-01

    Several western U.S. strong motion acceleration earthquake attenuation functions are compared to peak accelerations recorded during the 6/29/1992 Little Skull Mountain, Nevada earthquake. The comparison revealed that there are several definitions of site-to-source distance and at least two definitions of peak acceleration in use. Probabilistic seismic hazard analysis (PSHA) codes typically estimate accelerations assuming point sources. The computer code, SEISM 1, was developed for the eastern U.S., where ground acceleration is usually defined in terms of epicentral distance. Formulae whose distance definitions require knowledge of the earthquake fault slip zone dimensions may predict very different near-field accelerations when epicentral distance is used. Approximations to achieve more consistent PSHA results are derived.

  19. CyberShake-derived ground-motion prediction models for the Los Angeles region with application to earthquake early warning

    Science.gov (United States)

    Bose, Maren; Graves, Robert; Gill, David; Callaghan, Scott; Maechling, Phillip J.

    2014-01-01

    Real-time applications such as earthquake early warning (EEW) typically use empirical ground-motion prediction equations (GMPEs) along with event magnitude and source-to-site distances to estimate expected shaking levels. In this simplified approach, effects due to finite-fault geometry, directivity and site and basin response are often generalized, which may lead to a significant under- or overestimation of shaking from large earthquakes (M > 6.5) in some locations. For enhanced site-specific ground-motion predictions considering 3-D wave-propagation effects, we develop support vector regression (SVR) models from the SCEC CyberShake low-frequency (<0.5 Hz) simulations of >415 000 finite-fault rupture scenarios (6.5 ≤ M ≤ 8.5) for southern California defined in UCERF 2.0. We use CyberShake to demonstrate the application of synthetic waveform data to EEW as a ‘proof of concept’, being aware that these simulations are not yet fully validated and might not appropriately sample the range of rupture uncertainty. Our regression models predict the maximum and the temporal evolution of instrumental intensity (MMI) at 71 selected test sites using only the hypocentre, magnitude and rupture ratio, which characterizes uni- and bilateral rupture propagation. Our regression approach is completely data-driven (where here the CyberShake simulations are considered data) and does not enforce pre-defined functional forms or dependencies among input parameters. The models were established from a subset (∼20 per cent) of CyberShake simulations, but can explain MMI values of all >400 k rupture scenarios with a standard deviation of about 0.4 intensity units. We apply our models to determine threshold magnitudes (and warning times) that earthquakes on various active faults in southern California need to exceed to cause at least ‘moderate’, ‘strong’ or ‘very strong’ shaking in the Los Angeles (LA) basin. These thresholds are used to construct a simple and robust EEW algorithm: to

  20. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region, in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.

  1. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk
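    The counting procedure described above can be sketched in a few lines. The following is a minimal illustration of the nowcast idea, not the authors' implementation: the earthquake potential score is taken as the empirical cumulative fraction of historical inter-event "small"-earthquake counts, gathered from the large surrounding region, that do not exceed the count accumulated in the small target region since its last "large" earthquake. All catalog numbers below are hypothetical.

```python
import numpy as np

def nowcast_eps(small_counts_between_large, current_count):
    """Earthquake potential score: empirical CDF of the number of
    'small' events observed between successive 'large' events in the
    larger surrounding region, evaluated at the count accumulated in
    the small target region since its last large event."""
    counts = np.sort(np.asarray(small_counts_between_large))
    # Fraction of historical inter-event counts <= the current count.
    return np.searchsorted(counts, current_count, side="right") / counts.size

# Hypothetical inter-event small-earthquake counts from the large region:
hist = [12, 30, 45, 60, 80, 100, 140, 200]
eps = nowcast_eps(hist, 90)  # 5 of 8 historical counts are <= 90
print(eps)  # 0.625
```

A score near 1 means the small region has accumulated unusually many small events since its last large one, i.e. it is late in the cycle relative to the regional statistics; ranking cities by this score gives the relative-risk comparison mentioned above.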

  2. Electrical streaming potential precursors to catastrophic earthquakes in China

    Directory of Open Access Journals (Sweden)

    F. Qian

    1997-06-01

    The majority of self-potential anomalies at 7 stations within 160 km of the epicentre showed a similar pattern of rapid onset and slow decay before and during the M 7.8 Tangshan earthquake of 1976. Considering that some of these anomalies were associated with episodic spouting from boreholes or with increases in pore pressure in wells, the observed anomalies are interpreted as streaming potentials generated by local events of sudden movement and by the diffusion of high-pressure fluid in parallel faults. These transient events, triggered by tidal forces, exhibited a periodic nature and a statistical tendency to migrate towards the epicentre about one month before the earthquake. As a result of these events, the pore pressure reached a final equilibrium state higher than the initial state over a large enough section of the fault region; consequently, the local effective shear strength of the material in the fault zone decreased and the catastrophic earthquake was finally induced. Similar phenomena also occurred one month before the M 7.3 Haicheng earthquake of 1975. Therefore, short-term earthquake prediction may be possible by electrical measurements, which are the geophysical measurements most closely related to pore-fluid behaviour in the deep crust.

  3. Quantitative prediction of strong motion for a potential earthquake fault

    Directory of Open Access Journals (Sweden)

    Shamita Das

    2010-02-01

    This paper describes a new method for calculating strong motion records for a given seismic region on the basis of the laws of physics, using information on the tectonics and physical properties of the earthquake fault. Our method is based on an earthquake model, called a "barrier model", which is characterized by five source parameters: fault length, width, maximum slip, rupture velocity, and barrier interval. The first three parameters may be constrained from plate tectonics, and the fourth parameter is roughly a constant. The most important parameter controlling the earthquake strong motion is the last one, the "barrier interval". There are three methods to estimate the barrier interval for a given seismic region: 1) surface measurement of slip across fault breaks, 2) model fitting with observed near- and far-field seismograms, and 3) scaling-law data for small earthquakes in the region. The barrier intervals were estimated for a dozen earthquakes and four seismic regions by the above three methods. Our preliminary results for California suggest that the barrier interval may be determined if the maximum slip is given. The relation between the barrier interval and maximum slip varies from one seismic region to another. For example, the interval appears to be unusually long for Kilauea, Hawaii, which may explain why only scattered evidence of strong ground shaking was observed in the epicentral area of the Island of Hawaii earthquake of November 29, 1975. The stress drop associated with an individual fault segment, estimated from the barrier interval and maximum slip, lies between 100 and 1000 bars. These values are about one order of magnitude greater than those estimated earlier by the use of crack models without barriers. Thus, the barrier model can resolve, at least partially, the well-known discrepancy between stress drops measured in the laboratory and those estimated for earthquakes.

  4. Magnitude Estimation for Large Earthquakes from Borehole Recordings

    Science.gov (United States)

    Eshaghi, A.; Tiampo, K. F.; Ghofrani, H.; Atkinson, G.

    2012-12-01

    We present a simple and fast magnitude determination technique for earthquake and tsunami early warning systems, based on strong ground motion prediction equations (GMPEs) in Japan. The method incorporates borehole strong-motion records provided by the Kiban Kyoshin network (KiK-net) stations. We analyzed strong ground motion data from large earthquakes (5.0 ≤ M ≤ 8.1) with focal depths < 50 km and epicentral distances of up to 400 km from 1996 to 2010, and derived GMPEs for Japan using both peak ground acceleration (PGA) and peak ground velocity (PGV). These GMPEs serve as the basis for regional magnitude determination: magnitudes predicted from PGA values (Mpga) and from PGV values (Mpgv) both correlate strongly with the moment magnitude of the event, provided sufficient records for each event are available. The results show that Mpgv has a smaller standard deviation than Mpga when compared with the estimated magnitudes and thus provides a more accurate early assessment of earthquake magnitude. We tested the new method on the 2011 Tohoku earthquake: PGA and PGV from borehole recordings allow us to estimate the magnitude of this event 156 s and 105 s after the earthquake onset, respectively. We demonstrate that incorporating borehole strong ground-motion records immediately available after the occurrence of large earthquakes significantly increases the accuracy of earthquake magnitude estimation and correspondingly improves the performance of earthquake and tsunami early warning systems.
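    The core idea of estimating magnitude from a GMPE can be illustrated schematically. For a generic relation of the form log10(PGV) = a + b·M + c·log10(R), solving for M given an observed PGV and distance yields an Mpgv-style estimate. The coefficients below are placeholders chosen for illustration only, not the regression values derived from the KiK-net data.

```python
import math

def magnitude_from_pgv(pgv_cm_s, dist_km, a=-4.0, b=1.0, c=-1.5):
    """Invert a generic GMPE of the form
        log10(PGV) = a + b*M + c*log10(R)
    for magnitude M. Coefficients a, b, c are hypothetical placeholders,
    not the published KiK-net regression values."""
    return (math.log10(pgv_cm_s) - a - c * math.log10(dist_km)) / b

# Example: 10 cm/s of PGV observed 100 km from the source.
m = magnitude_from_pgv(pgv_cm_s=10.0, dist_km=100.0)
print(round(m, 2))  # 8.0
```

In practice such single-station estimates are averaged over many stations, which is why the abstract stresses that sufficient records per event must be available.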

  5. Strong ground motion of the 2016 Kumamoto earthquake

    Science.gov (United States)

    Aoi, S.; Kunugi, T.; Suzuki, W.; Kubo, H.; Morikawa, N.; Fujiwara, H.

    2016-12-01

    The 2016 Kumamoto earthquake sequence comprises an Mw 6.1 event that occurred in the Kumamoto region at 21:26 on April 14, 2016 (JST) and an Mw 7.1 event 28 hours later, at 1:25 on April 16. These earthquakes are considered to have ruptured mainly the Hinagu fault zone (Mw 6.1 event) and the Futagawa fault zone (Mw 7.1 event), for both of which the Headquarters for Earthquake Research Promotion had performed long-term evaluation as well as seismic hazard assessment prior to the 2016 Kumamoto earthquake. Strong shaking with seismic intensity 7 on the JMA scale was observed four times in total: in Mashiki town for the Mw 6.1 and Mw 7.1 events, in Nishihara village for the Mw 7.1 event, and at NIED/KiK-net Mashiki (KMMH16) for the Mw 7.1 event. KiK-net Mashiki (KMMH16) recorded peak ground acceleration of more than 1000 cm/s/s, and Nishihara village recorded peak ground velocity of more than 250 cm/s. Ground motions were observed over a wider area for the Mw 7.1 event than for the Mw 6.1 event. Peak ground accelerations and peak ground velocities at K-NET/KiK-net stations are consistent with the ground motion prediction equations of Si and Midorikawa (1999). Peak ground velocities at distances beyond 200 km attenuate slowly, which can be attributed to a large Love wave with a dominant period around 10 seconds. The 5%-damped pseudo-spectral velocity at Mashiki town shows a peak at periods of 1-2 s that exceeds the ground motion response of JR Takatori in the 1995 Kobe earthquake and of Kawaguchi town in the 2004 Chuetsu earthquake. The 5%-damped pseudo-spectral velocity at Nishihara village shows a 350 cm/s peak at periods of 3-4 s, similar to several stations in the Kathmandu basin reported by Takai et al. (2016) for the 2015 Gorkha earthquake in Nepal. Ground motions at several stations in Oita exceed the ground motion prediction equations due to an earthquake induced by the Mw 7.1 event; peak ground acceleration at K-NET Yufuin (OIT009) records 90 cm/s/s for the Mw 7

  6. An Ensemble Approach for Improved Short-to-Intermediate-Term Seismic Potential Evaluation

    Science.gov (United States)

    Yu, Huaizhong; Zhu, Qingyong; Zhou, Faren; Tian, Lei; Zhang, Yongxian

    2017-06-01

    Pattern informatics (PI), load/unload response ratio (LURR), state vector (SV), and accelerating moment release (AMR) are four previously unrelated methods that are sensitive, in different ways, to the earthquake source. Previous studies have indicated that the spatial extent of the stress perturbation caused by an earthquake scales with the moment of the event, allowing us to combine these methods for seismic hazard evaluation. The long-range earthquake forecasting method PI is applied to search for seismic hotspots and identify the areas where large earthquakes could be expected. The LURR and SV methods are then adopted to assess short-to-intermediate-term seismic potential in each of the critical regions derived from the PI hotspots, while the AMR method provides asymptotic estimates of the time and magnitude of the potential earthquakes. This new approach, combining the LURR, SV and AMR methods within the identified PI hotspot areas, is devised to augment current techniques for seismic hazard estimation. Using the approach, we examined the strong earthquakes that occurred in the Yunnan-Sichuan region, China, between January 1, 2013 and December 31, 2014, and found that most of the large earthquakes, especially those with magnitude greater than 6.0, occurred in the predicted seismic hazard regions. Similar results were obtained in the prediction of annual earthquake tendency in the Chinese mainland in 2014 and 2015. These studies suggest that the ensemble approach can be a useful tool for detecting short-to-intermediate-term precursory information on future large earthquakes.
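    The AMR ingredient of the ensemble rests on the cumulative Benioff strain, the running sum of the square roots of seismic energies, which is then fit with a power law ε(t) = A + B(tf − t)^m to estimate the failure time tf. A minimal sketch of the strain computation, using the standard Gutenberg-Richter energy relation log10 E = 1.5M + 4.8 (energy in joules); the magnitude list is hypothetical:

```python
import numpy as np

def cumulative_benioff_strain(magnitudes):
    """Running sum of the square roots of seismic energies (J),
    with energy from log10 E = 1.5*M + 4.8."""
    energies = 10.0 ** (1.5 * np.asarray(magnitudes, dtype=float) + 4.8)
    return np.cumsum(np.sqrt(energies))

# Hypothetical pre-mainshock catalog in time order:
strain = cumulative_benioff_strain([4.0, 4.5, 5.0, 4.2, 5.5])
# AMR looks for accelerating (power-law) growth of this curve
# in the years before a large event.
```

The fitted exponent m is typically around 0.3 in AMR studies; the quality of the power-law fit relative to a straight line is what flags a region as accelerating.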

  7. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    Science.gov (United States)

    Kossobokov, Vladimir

    2013-04-01

    demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete learning of the actual range of earthquake hazards to local communities and populations, and (c) a more ethically responsible control over how seismic hazard and seismic risk assessment is implemented to protect public safety. It follows that the international project GEM is on the wrong track if it continues to base seismic risk estimates on the standard method of seismic hazard assessment. The situation is not hopeless and could be improved dramatically thanks to available geological, geomorphological, seismic, and tectonic evidence and data combined with deterministic pattern recognition methodologies, specifically when intending to PREDICT the PREDICTABLE, though not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized systems of blocks and faults has already led to methodologies of neo-deterministic seismic hazard analysis and to intermediate-term, middle- to narrow-range earthquake prediction algorithms tested in real-time applications over recent decades. This proves that contemporary science can do a better job in disclosing natural hazards, assessing risks, and delivering such information in advance of extreme catastrophes, which are LOW PROBABILITY EVENTS THAT HAPPEN WITH CERTAINTY. Geoscientists must initiate a shift in the community's mindset from pessimistic disbelief to the optimistic challenge of neo-deterministic hazard predictability.

  8. Earthquake forecasting test for Kanto district to reduce vulnerability of urban mega earthquake disasters

    Science.gov (United States)

    Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

    2012-12-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project on earthquake predictability research whose final goal is to search for the intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, CSEP-Japan, which provides open access to researchers contributing earthquake forecast models applied to Japan. More than 100 earthquake forecast models have now been submitted to the prospective experiment. The models are separated into 4 testing classes (1 day, 3 months, 1 year and 3 years) and 3 testing regions: an area of Japan including the sea area, the Japanese mainland, and the Kanto district. We evaluate the performance of the models in the official suite of tests defined by CSEP; approximately 300 rounds of experiments have been implemented, and the results provide new knowledge concerning statistical forecasting models. We have started a study to construct a 3-dimensional earthquake forecasting model for the Kanto district in Japan based on CSEP experiments, under the Special Project for Reducing Vulnerability to Urban Mega Earthquake Disasters. Because seismicity in the area ranges from the shallow part down to a depth of 80 km due to the subducting Philippine Sea and Pacific plates, we need to study the effect of the depth distribution. We will develop forecasting models based on the results of 2-D modeling. We defined the 3-D forecasting area in the Kanto region with test classes of 1 day, 3 months, 1 year and 3 years, and magnitudes from 4.0 to 9.0, as in CSEP-Japan. As a first step, we will install the RI10K model (Nanjo, 2011) and the HIST-ETAS models (Ogata, 2011) to see whether those models perform as well as in the 3-month 2-D CSEP-Japan experiments in the Kanto region before the 2011 Tohoku event (Yokoi et al., in preparation).
We use CSEP

  9. Prediction of maximum earthquake intensities for the San Francisco Bay region

    Science.gov (United States)

    Borcherdt, Roger D.; Gibbs, James F.

    1975-01-01

    The intensity data for the California earthquake of April 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the empirical relation derived between 1906 intensities and distance perpendicular to the fault for 917 sites underlain by rocks of the Franciscan Formation is: Intensity = 2.69 - 1.90 log (Distance) (km). For sites on other geologic units, intensity increments, derived with respect to this empirical relation, correlate strongly with the Average Horizontal Spectral Amplifications (AHSA) determined from 99 three-component recordings of ground motion generated by nuclear explosions in Nevada. The resulting empirical relation is: Intensity Increment = 0.27 + 2.70 log (AHSA), and average intensity increments for the various geologic units are -0.29 for granite, 0.19 for Franciscan Formation, 0.64 for the Great Valley Sequence, 0.82 for Santa Clara Formation, 1.34 for alluvium, 2.43 for bay mud. The maximum intensity map predicted from these empirical relations delineates areas in the San Francisco Bay region of potentially high intensity from future earthquakes on either the San Andreas fault or the Hayward fault.
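    Taken together, the two empirical relations above give a simple recipe for a predicted 1906-type intensity: compute the Franciscan-baseline intensity from fault distance, then add the site's increment from its AHSA (or from the tabulated average for its geologic unit). A sketch in code, assuming the logarithms are base 10 (the abstract does not specify):

```python
import math

def baseline_intensity(distance_km):
    """1906 intensity at Franciscan Formation sites versus perpendicular
    distance from the fault (base-10 logarithm assumed)."""
    return 2.69 - 1.90 * math.log10(distance_km)

def unit_increment(ahsa):
    """Intensity increment for a site from its Average Horizontal
    Spectral Amplification (AHSA)."""
    return 0.27 + 2.70 * math.log10(ahsa)

# E.g. a hypothetical site 10 km from the fault with AHSA = 2:
i = baseline_intensity(10.0) + unit_increment(2.0)
```

For mapping, the tabulated average increments quoted in the abstract (from -0.29 for granite up to 2.43 for bay mud) would replace the AHSA term unit by unit.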

  10. Prediction of maximum earthquake intensities for the San Francisco Bay region

    Energy Technology Data Exchange (ETDEWEB)

    Borcherdt, R.D.; Gibbs, J.F.

    1975-01-01

    The intensity data for the California earthquake of Apr 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the empirical relation derived between 1906 intensities and distance perpendicular to the fault for 917 sites underlain by rocks of the Franciscan formation is intensity = 2.69 - 1.90 log (distance) (km). For sites on other geologic units, intensity increments, derived with respect to this empirical relation, correlate strongly with the average horizontal spectral amplifications (AHSA) determined from 99 three-component recordings of ground motion generated by nuclear explosions in Nevada. The resulting empirical relation is intensity increment = 0.27 + 2.70 log (AHSA), and average intensity increments for the various geologic units are -0.29 for granite, 0.19 for Franciscan formation, 0.64 for the Great Valley sequence, 0.82 for Santa Clara formation, 1.34 for alluvium, and 2.43 for bay mud. The maximum intensity map predicted from these empirical relations delineates areas in the San Francisco Bay region of potentially high intensity from future earthquakes on either the San Andreas fault or the Hayward fault.

  11. Predicted Liquefaction in the Greater Oakland and Northern Santa Clara Valley Areas for a Repeat of the 1868 Hayward Earthquake

    Science.gov (United States)

    Holzer, T. L.; Noce, T. E.; Bennett, M. J.

    2008-12-01

    Probabilities of surface manifestations of liquefaction due to a repeat of the 1868 (M6.7-7.0) earthquake on the southern segment of the Hayward Fault were calculated for two areas along the margin of San Francisco Bay, California: greater Oakland and the northern Santa Clara Valley. Liquefaction is predicted to be more common in the greater Oakland area than in the northern Santa Clara Valley owing to the presence of 57 km2 of susceptible sandy artificial fill. Most of the fills were placed into San Francisco Bay during the first half of the 20th century to build military bases, port facilities, and shoreline communities like Alameda and Bay Farm Island. Probabilities of liquefaction in the area underlain by this sandy artificial fill range from 0.2 to ~0.5 for a M7.0 earthquake, and decrease to 0.1 to ~0.4 for a M6.7 earthquake. In the greater Oakland area, liquefaction probabilities generally are less than 0.05 for Holocene alluvial fan deposits, which underlie most of the remaining flat-lying urban area. In the northern Santa Clara Valley for a M7.0 earthquake on the Hayward Fault and an assumed water-table depth of 1.5 m (the historically shallowest water level), liquefaction probabilities range from 0.1 to 0.2 along Coyote and Guadalupe Creeks, but are less than 0.05 elsewhere. For a M6.7 earthquake, probabilities are greater than 0.1 along Coyote Creek but decrease along Guadalupe Creek to less than 0.1. Areas with high probabilities in the Santa Clara Valley are underlain by latest Holocene alluvial fan levee deposits where liquefaction and lateral spreading occurred during large earthquakes in 1868 and 1906. The liquefaction scenario maps were created with ArcGIS ModelBuilder. Peak ground accelerations first were computed with the new Boore and Atkinson NGA attenuation relation (2008, Earthquake Spectra, 24:1, p. 99-138), using VS30 to account for local site response. 
Spatial liquefaction probabilities were then estimated using the predicted ground motions.

  12. Small discussion of electromagnetic wave anomalies preceding earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    Six brief pieces on various aspects of electromagnetic wave anomalies are presented. They cover: earthquake electromagnetic emanations; the use of magnetic induction information for earthquake forecasting; electromagnetic pulse emissions as pre-earthquake indicators; the use of magnetic sensors to determine medium-wavelength field strength for earthquake prediction purposes; magnetic deviation indicators inside reinforced-concrete buildings; and a discussion of the general physical principles involved.

  13. Geodetic Finite-Fault-based Earthquake Early Warning Performance for Great Earthquakes Worldwide

    Science.gov (United States)

    Ruhl, C. J.; Melgar, D.; Grapenthin, R.; Allen, R. M.

    2017-12-01

    GNSS-based earthquake early warning (EEW) algorithms estimate fault finiteness and unsaturated moment magnitude for the largest, most damaging earthquakes. Because large events are infrequent, such algorithms are not regularly exercised and are insufficiently tested on the few available datasets. The Geodetic Alarm System (G-larmS) is a GNSS-based finite-fault algorithm developed as part of the ShakeAlert EEW system in the western US. Performance evaluations using synthetic earthquakes offshore Cascadia showed that G-larmS satisfactorily recovers magnitude and fault length, providing useful alerts 30-40 s after origin time and timely warnings of ground motion for onshore urban areas. An end-to-end test of the ShakeAlert system demonstrated the need for GNSS data to accurately estimate ground motions in real time. We replay real data from several subduction-zone earthquakes worldwide to demonstrate the value of GNSS-based EEW for the largest, most damaging events. We compare peak ground accelerations (PGA) predicted from first-alert solutions with those recorded in major urban areas and, where applicable, compare observed tsunami heights to those predicted from the G-larmS solutions. We show that finite-fault inversion based on GNSS data is essential to achieving the goals of EEW.

  14. Scenario-based earthquake hazard and risk assessment for Baku (Azerbaijan

    Directory of Open Access Journals (Sweden)

    G. Babayev

    2010-12-01

    A rapid growth of population, intensive civil and industrial building, land and water instabilities (e.g. landslides and significant underground water level fluctuations), and the lack of public awareness regarding seismic hazard contribute to the increasing vulnerability of Baku (the capital city of the Republic of Azerbaijan) to earthquakes. In this study, we assess earthquake risk in the city, determined as a convolution of seismic hazard (in terms of the surface peak ground acceleration, PGA), vulnerability (due to building construction fragility, population features, the gross domestic product per capita, and landslide occurrence), and exposure of infrastructure and critical facilities. The earthquake risk assessment provides useful information for identifying the factors influencing the risk. A deterministic seismic hazard for Baku is analysed for four earthquake scenarios: near, far, local, and extreme events. The seismic hazard models demonstrate the level of ground shaking in the city: high PGA values are predicted in the southern coastal and north-eastern parts of the city and in some parts of the downtown, and the PGA attains its maximal values for the local and extreme earthquake scenarios. We show that the quality of buildings and the probability of their damage, the distribution of the urban population, exposure, and the pattern of peak ground acceleration all contribute to the seismic risk, while the vulnerability factors play the most prominent role for all earthquake scenarios. Our results can support the elaboration of strategic countermeasure plans for earthquake risk mitigation in the city of Baku.
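    The "convolution" of hazard, vulnerability, and exposure described above is, at its simplest, a cell-wise combination of gridded layers. A toy sketch with made-up normalized grids follows; the multiplicative combination shown is one common choice, and the paper's actual weighting scheme may differ:

```python
import numpy as np

# Hypothetical normalized layers on a common 2x2 grid (values in [0, 1]).
hazard = np.array([[0.8, 0.4], [0.6, 0.9]])        # scaled scenario PGA
vulnerability = np.array([[0.7, 0.5], [0.9, 0.6]]) # fragility, population, GDP, landslides
exposure = np.array([[1.0, 0.3], [0.8, 0.5]])      # infrastructure / critical facilities

# Risk as the cell-wise product ("convolution" in the risk-analysis sense).
risk = hazard * vulnerability * exposure
```

Repeating this for each of the four scenarios (near, far, local, extreme) and comparing the resulting grids is what reveals which factor dominates the risk pattern.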

  15. NGA-West 2 Equations for predicting PGA, PGV, and 5%-Damped PSA for shallow crustal earthquakes

    Science.gov (United States)

    Boore, David M.; Stewart, Jon P.; Seyhan, Emel; Atkinson, Gail M.

    2013-01-01

    We provide ground-motion prediction equations for computing medians and standard deviations of average horizontal component intensity measures (IMs) for shallow crustal earthquakes in active tectonic regions. The equations were derived from a global database with M 3.0–7.9 events. We derived equations for the primary M- and distance-dependence of the IMs after fixing the VS30-based nonlinear site term from a parallel NGA-West 2 study. We then evaluated additional effects using mixed effects residuals analysis, which revealed no trends with source depth over the M range of interest, indistinct Class 1 and 2 event IMs, and basin depth effects that increase and decrease long-period IMs for depths larger and smaller, respectively, than means from regional VS30-depth relations. Our aleatory variability model captures decreasing between-event variability with M, as well as within-event variability that increases or decreases with M depending on period, increases with distance, and decreases for soft sites.
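    The between-event versus within-event split underlying the aleatory model above can be illustrated with a crude moment-style decomposition; a full mixed-effects fit (as the authors use) weights events properly, but the idea is the same. All residual values below are made up.

```python
import numpy as np

# Hypothetical total residuals (ln units) for records grouped by event.
event_ids = np.array([0, 0, 0, 1, 1, 2, 2, 2, 2])
resid = np.array([0.3, 0.1, 0.2, -0.4, -0.2, 0.05, -0.05, 0.1, -0.1])

# Event term = mean residual per event (a crude stand-in for the
# random-effects estimate of a mixed-effects regression).
event_term = np.array([resid[event_ids == e].mean()
                       for e in np.unique(event_ids)])
within = resid - event_term[event_ids]

tau = event_term.std(ddof=1)  # between-event variability
phi = within.std(ddof=1)      # within-event variability
```

Trends of tau with M and of phi with distance and period are then modeled separately, which is how a variability model like the one described above is assembled.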

  16. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru.

    Science.gov (United States)

    Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan

    2015-01-01

    Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were attained from Demographic and Health Surveys. Earthquake data were obtained from U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case--an 8.4 magnitude earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.

  17. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru.

    Directory of Open Access Journals (Sweden)

    Henny Rydberg

    Full Text Available Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were obtained from Demographic and Health Surveys. Earthquake data were obtained from U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake-affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case--an 8.4 magnitude earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS-based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.

  18. A prospective earthquake forecast experiment in the western Pacific

    Science.gov (United States)

    Eberhard, David A. J.; Zechar, J. Douglas; Wiemer, Stefan

    2012-09-01

    Since the beginning of 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) has been conducting an earthquake forecast experiment in the western Pacific. This experiment is an extension of the Kagan-Jackson experiments begun 15 years earlier and is a prototype for future global earthquake predictability experiments. At the beginning of each year, seismicity models make a spatially gridded forecast of the number of Mw≥ 5.8 earthquakes expected in the next year. For the three participating statistical models, we analyse the first two years of this experiment. We use likelihood-based metrics to evaluate the consistency of the forecasts with the observed target earthquakes and we apply measures based on Student's t-test and the Wilcoxon signed-rank test to compare the forecasts. Overall, a simple smoothed seismicity model (TripleS) performs the best, but there are some exceptions that indicate continued experiments are vital to fully understand the stability of these models, the robustness of model selection and, more generally, earthquake predictability in this region. We also estimate uncertainties in our results that are caused by uncertainties in earthquake location and seismic moment. Our uncertainty estimates are relatively small and suggest that the evaluation metrics are relatively robust. Finally, we consider the implications of our results for a global earthquake forecast experiment.
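The forecast comparison described above (paired tests on per-cell log-likelihoods) can be sketched as follows. This is an illustrative reconstruction with synthetic data, not the CSEP testing-center code; the rates and cell count are assumptions:

```python
import numpy as np
from scipy.stats import ttest_rel, wilcoxon, poisson

rng = np.random.default_rng(42)

# Synthetic gridded forecasts: expected event counts per spatial cell.
# Real CSEP western Pacific experiments use annual Mw >= 5.8 forecasts.
forecast_a = rng.gamma(2.0, 0.05, size=200)
forecast_b = forecast_a * rng.lognormal(0.0, 0.3, size=200)
observed = rng.poisson(forecast_a)  # synthetic "target earthquakes"

# Per-cell Poisson log-likelihood of the observations under each forecast
ll_a = poisson.logpmf(observed, forecast_a)
ll_b = poisson.logpmf(observed, forecast_b)

# Paired comparison of the forecasts, as in the abstract:
t_stat, t_p = ttest_rel(ll_a, ll_b)       # Student's t-test on paired scores
w_stat, w_p = wilcoxon(ll_a - ll_b)       # Wilcoxon signed-rank test
print(f"t-test p={t_p:.3f}, Wilcoxon p={w_p:.3f}")
```

The Wilcoxon test is the natural companion to the t-test here because per-cell likelihood differences are typically far from normally distributed.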

  19. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake"

    Science.gov (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering incurred building losses in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140-300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses would not be indemnified; instead, payouts would be calculated directly on the basis of indexed ground motion levels and damages. 
The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing
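The parametric trigger idea mentioned in the abstract can be sketched as a payout schedule keyed to an indexed ground motion level. The tier thresholds and payout fractions below are hypothetical assumptions for illustration, not TCIP figures:

```python
# Hypothetical parametric insurance payout: instead of indemnifying surveyed
# losses, pay a pre-agreed fraction of the sum insured determined by the
# indexed peak ground acceleration (PGA, in g) at the insured location.
# Tier thresholds and fractions are illustrative assumptions.

def parametric_payout(pga_g, sum_insured):
    """Stepwise payout fraction keyed to ground motion level."""
    tiers = [(0.4, 1.00), (0.3, 0.60), (0.2, 0.30), (0.1, 0.10)]
    for threshold, fraction in tiers:
        if pga_g >= threshold:
            return fraction * sum_insured
    return 0.0  # below the lowest trigger: no payout

print(parametric_payout(0.25, 100_000))  # 0.30 * 100000 -> 30000.0
```

Because the payout depends only on the measured ground motion index, claim processing reduces to reading instrument records rather than surveying individual buildings.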

  20. Long-term effect of early-life stress from earthquake exposure on working memory in adulthood.

    Science.gov (United States)

    Li, Na; Wang, Yumei; Zhao, Xiaochuan; Gao, Yuanyuan; Song, Mei; Yu, Lulu; Wang, Lan; Li, Ning; Chen, Qianqian; Li, Yunpeng; Cai, Jiajia; Wang, Xueyi

    2015-01-01

    The present study aimed to investigate the long-term effect of 1976 Tangshan earthquake exposure in early life on performance of working memory in adulthood. A total of 907 study subjects born and raised in Tangshan were enrolled in this study. They were divided into three groups according to the dates of birth: infant exposure (3-12 months, n=274), prenatal exposure (n=269), and no exposure (born at least 1 year after the earthquake, n=364). The prenatal group was further divided into first, second, and third trimester subgroups based on the timing of exposure during pregnancy. Hopkins Verbal Learning Test-Revised and Brief Visuospatial Memory Test-Revised (BVMT-R) were used to measure the performance of working memory. Unconditional logistic regression analysis was used to analyze the influential factors for impaired working memory. The Hopkins Verbal Learning Test-Revised scores did not show significant difference across the three groups. Compared with no exposure group, the BVMT-R scores were slightly lower in the prenatal exposure group and markedly decreased in the infant exposure group. When the BVMT-R scores were analyzed in three subgroups, the results showed that the subjects whose mothers were exposed to earthquake in the second and third trimesters of pregnancy had significantly lower BVMT-R scores compared with those in the first trimester. Education level and early-life earthquake exposure were identified as independent risk factors for reduced performance of visuospatial memory indicated by lower BVMT-R scores. Infant exposure to earthquake-related stress impairs visuospatial memory in adulthood. Fetuses in the middle and late stages of development are more vulnerable to stress-induced damage that consequently results in impaired visuospatial memory. Education and early-life trauma can also influence the performance of working memory in adulthood.

  1. Recent applications for rapid estimation of earthquake shaking and losses with ELER Software

    International Nuclear Information System (INIS)

    Demircioglu, M.B.; Erdik, M.; Kamer, Y.; Sesetyan, K.; Tuzun, C.

    2012-01-01

    A methodology and software package entitled Earthquake Loss Estimation Routine (ELER) was developed for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region. The work was carried out under the Joint Research Activity-3 (JRA3) of the EC FP6 project entitled Network of Research Infrastructures for European Seismology (NERIES). The ELER methodology comprises: 1) finding the most likely location of the source of the earthquake using a regional seismo-tectonic database; 2) estimation of the spatial distribution of selected ground motion parameters at engineering bedrock through region-specific ground motion prediction models, bias-correcting the ground motion estimates with strong ground motion data, if available; 3) estimation of the spatial distribution of site-corrected ground motion parameters using a regional geology database and appropriate amplification models; and 4) estimation of the losses and uncertainties at various orders of sophistication (buildings, casualties). The multi-level methodology developed for real-time estimation of losses is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships coded into ELER. The present paper provides brief information on the ELER methodology and an example application to the recent major earthquake that hit the Van province in the east of Turkey on 23 October 2011 with a moment magnitude (Mw) of 7.2. For this earthquake, Kandilli Observatory and Earthquake Research Institute (KOERI) provided almost real-time estimates of building damage and casualty distribution using ELER. (author)
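The multi-step logic of such a loss-estimation routine (bedrock ground motion, site amplification, then vulnerability) can be sketched as below. All functional forms and coefficients here are hypothetical placeholders, not the actual ELER models:

```python
import numpy as np

# Illustrative sketch of a multi-step shaking-and-loss pipeline in the spirit
# of ELER (NOT the actual ELER code or coefficients):
# 1) bedrock ground motion from a generic attenuation relation,
# 2) site correction via an amplification factor per soil class,
# 3) mean damage ratio from an assumed fragility curve.

def bedrock_pga(magnitude, distance_km):
    """Hypothetical attenuation: PGA grows with M, decays with distance."""
    return np.exp(0.5 * magnitude - 1.2 * np.log(distance_km + 10.0) - 1.0)

def site_pga(pga_rock, amplification):
    """Apply a site amplification factor to the bedrock estimate."""
    return pga_rock * amplification

def mean_damage_ratio(pga, threshold=0.1, slope=3.0):
    """Assumed sigmoid fragility: mean damage ratio in [0, 1]."""
    return 1.0 / (1.0 + (threshold / np.maximum(pga, 1e-6)) ** slope)

distances = np.array([5.0, 20.0, 50.0])   # km from the source
soil_amp = np.array([1.0, 1.4, 1.8])      # hypothetical soil-class factors
building_counts = np.array([1000, 5000, 20000])

pga = site_pga(bedrock_pga(7.2, distances), soil_amp)
expected_damaged = mean_damage_ratio(pga) * building_counts
print(pga, expected_damaged)
```

The real system additionally bias-corrects the ground motion field with recorded strong-motion data and propagates the uncertainties of every step into the loss estimates.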

  2. Earthquake precursory events around epicenters and local active faults; the cases of two inland earthquakes in Iran

    Science.gov (United States)

    Valizadeh Alvan, H.; Mansor, S.; Haydari Azad, F.

    2012-12-01

    The possibility of earthquake prediction in the time frame of several days to a few minutes before occurrence has recently stirred interest among researchers. Scientists believe that new theories and explanations of the mechanism of this natural phenomenon are trustworthy and can be the basis of future prediction efforts. During the last thirty years, experimental research has identified several pre-earthquake events which are now recognized as confirmed warning signs (precursors) of past known earthquakes. With the advances in in-situ measurement devices and data analysis capabilities and the emergence of satellite-based data collectors, monitoring the earth's surface is now routine. Data providers supply researchers all over the world with high-quality, validated imagery and non-imagery data. Surface Latent Heat Flux (SLHF), the amount of energy exchanged in the form of water vapor between the earth's surface and the atmosphere, has frequently been reported as an earthquake precursor in past years. The stress accumulated in the earth's crust during the preparation phase of earthquakes is considered the main cause of temperature anomalies weeks to days before the main event and subsequent shocks. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. On the other hand, the leakage of radon gas, which occurs as rocks break during earthquake preparation, causes the formation of airborne ions and higher Air Temperature (AT) prior to the main event. Although co-analysis of direct and indirect observations of precursory events is considered a promising method for future successful earthquake prediction, without proper and thorough knowledge of the geological setting, atmospheric factors and geodynamics of earthquake-prone regions we will not be able to identify anomalies due to seismic activity in the earth's crust. 
Active faulting is a key factor in identification of the
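A minimal sketch of flagging anomalies in a precursor time series such as SLHF is shown below. This is illustrative only: the data are synthetic, and real studies compare against multi-year climatological baselines rather than a single series' own mean:

```python
import numpy as np

# Toy anomaly detector for a precursor time series (e.g. daily SLHF).
# Assumption: an "anomaly" is any value exceeding the series mean by
# k standard deviations; real analyses use climatological baselines.

def flag_anomalies(series, k=4.0):
    """Return the indices where the series exceeds mean + k * std."""
    mu, sd = np.mean(series), np.std(series)
    return np.where(series > mu + k * sd)[0]

rng = np.random.default_rng(0)
slhf = rng.normal(100.0, 10.0, 365)   # synthetic daily values (W/m^2)
slhf[200] += 100.0                    # inject a hypothetical pre-event spike
print(flag_anomalies(slhf))
```

As the abstract stresses, a statistical flag like this is only a starting point: without knowledge of the geological setting and atmospheric factors, such excursions cannot be attributed to seismic activity.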

  3. Mortality in the L'Aquila (central Italy) earthquake of 6 April 2009.

    Science.gov (United States)

    Alexander, David; Magni, Michele

    2013-01-07

    This paper presents the results of an analysis of data on mortality in the magnitude 6.3 earthquake that struck the central Italian city and province of L'Aquila during the night of 6 April 2009. The aim is to create a profile of the deaths in terms of age, gender, location, behaviour during the tremors, and other aspects. This could help predict the pattern of casualties and priorities for protection in future earthquakes. To establish a basis for analysis, the literature on seismic mortality is surveyed. The conclusions of previous studies are synthesised regarding patterns of mortality, entrapment, survival times, self-protective behaviour, gender and age. These factors are investigated for the data set covering the 308 fatalities in the L'Aquila earthquake, with help from interview data on behavioural factors obtained from 250 survivors. In this data set, there is a strong bias towards victimisation of young people, the elderly and women. Part of this can be explained by geographical factors regarding building performance: the rest of the explanation refers to the vulnerability of the elderly and the relationship between perception and action among female victims, who tend to be more fatalistic than men and thus did not abandon their homes between a major foreshock and the main shock of the earthquake, three hours later. In terms of casualties, earthquakes commonly discriminate against the elderly and women. Age and gender biases need further investigation and should be taken into account in seismic mitigation initiatives.

  4. Countermeasures to earthquakes in nuclear plants

    International Nuclear Information System (INIS)

    Sato, Kazuhide

    1979-01-01

    The contribution of atomic energy to mankind is immeasurable, but the danger of radioactivity demands special consideration. Therefore, in the design of nuclear power plants safety has been regarded as paramount, and in Japan, where earthquakes occur frequently, countermeasures to earthquakes have naturally been incorporated in the examination of safety. The radioactive substances handled in nuclear power stations and spent fuel reprocessing plants are briefly explained. The occurrence of earthquakes cannot be predicted reliably, and the damage due to earthquakes tends to be remarkably large. In nuclear plants, the prevention of damage to the facilities and the maintenance of their functions are required at the time of earthquakes. Regarding the siting of nuclear plants, the history of earthquakes, the possible magnitude of earthquakes, the properties of the ground and the position of the plants should be examined. After the place of installation has been decided, the earthquake used for design is selected by evaluating active faults and determining the standard earthquakes. As the fundamentals of aseismic design, the classification according to importance, the design earthquakes corresponding to the classes of importance, the combination of loads and allowable stress are explained. (Kako, I.)

  5. Debris-flows scale predictions based on basin spatial parameters calculated from Remote Sensing images in Wenchuan earthquake area

    International Nuclear Information System (INIS)

    Zhang, Huaizhen; Chi, Tianhe; Liu, Tianyue; Wang, Wei; Yang, Lina; Zhao, Yuan; Shao, Jing; Yao, Xiaojing; Fan, Jianrong

    2014-01-01

    Debris flow is a common hazard in the Wenchuan earthquake area. Collapse and Landslide Regions (CLR) caused by earthquakes can be located from remote sensing images. CLR are the direct material source regions for debris flow. The Spatial Distribution of Collapse and Landslide Regions (SDCLR) strongly impacts debris-flow formation. In order to depict SDCLR, we referred to Strahler's hypsometric analysis method and developed three functional models to depict SDCLR quantitatively. These models mainly depict SDCLR relative to altitude, the basin mouth and the main gullies of debris flow. We used the integrals of the functions as the spatial parameters of SDCLR, and these parameters were employed in debris-flow scale predictions. The group of debris flows triggered by the rainstorm of 24 September 2008 in Beichuan County, Sichuan province, China, was selected to build the empirical equations for debris-flow scale predictions. Given the existing data, only debris-flow runout zone parameters (maximum runout distance L and lateral width B) were estimated in this paper. The results indicate that the predictions were more accurate when the spatial parameters were used. Accordingly, we suggest that the spatial parameters of SDCLR be considered in debris-flow scale prediction, and we propose several strategies to prevent debris flow in the future.
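The idea of using a curve integral as a scalar spatial parameter, analogous to Strahler's hypsometric integral, can be sketched as below. The distribution curve here is hypothetical; the paper's actual functional models are not reproduced:

```python
import numpy as np

# Sketch: integrate a distribution curve (e.g. CLR area fraction vs.
# normalized altitude) to obtain a single scalar spatial parameter,
# in the spirit of Strahler's hypsometric integral. The curve below is
# a hypothetical example, not data from the Wenchuan study.

def spatial_parameter(x, y):
    """Trapezoidal integral of y(x), used as a scalar spatial descriptor."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

h = np.linspace(0.0, 1.0, 11)   # normalized altitude, 0 = basin mouth
frac = 1.0 - h ** 2             # hypothetical CLR fraction curve
print(spatial_parameter(h, frac))
```

A scalar like this can then enter empirical regression equations for runout-zone parameters such as the maximum runout distance and lateral width.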

  6. Report by the 'Mega-earthquakes and mega-tsunamis' subgroup; Rapport du sous-groupe Sismique 'Megaseismes et megatsunamis'

    Energy Technology Data Exchange (ETDEWEB)

    Friedel, Jacques; Courtillot, Vincent; Dercourt, Jean; Jaupart, Claude; Le Pichon, Xavier; Poirier, Jean-Paul; Salencon, Jean; Tapponnier, Paul; Dautray, Robert; Carpentier, Alain; Taquet, Philippe; Blanchet, Rene; Le Mouel, Jean-Louis [Academie des sciences, 23, quai de Conti, 75006 Paris (France); BARD, Pierre-Yves [Observatoire des sciences de l' Univers de l' universite de Grenoble - OSUG, Universite Joseph Fourier, BP 53, 38041 Grenoble Cedex 9 (France); Bernard, Pascal; Montagner, Jean-Paul; Armijo, Rolando; Shapiro, Nikolai; Tait, Steve [Institut de physique du globe de Paris, 1, rue Jussieu - 75238 Paris cedex 05 (France); Cara, Michel [Ecole et Observatoire des sciences de la Terre de l' universite de Strasbourg - EOST, F-67084 Strasbourg cedex (France); Madariaga, Raul [Ecole normale superieure, 45, rue d' Ulm / 29, rue d' Ulm, F-75230 Paris cedex 05 (France); Pecker, Alain [Academie des technologies, Grand Palais des Champs Elysees - Porte C - Avenue Franklin D. Roosevelt - 75008 Paris (France); Schindele, Francois [CEA/DAM, DIF/DASE/SLDG, 91297 ARPAJON Cedex (France); Douglas, John [BRGM, 3 avenue Claude-Guillemin - BP 36009 - 45060 Orleans Cedex 2 (France)

    2011-06-15

    This report comprises a presentation of scientific data on subduction earthquakes, on tsunamis and on the Tohoku earthquake. It proposes a detailed description of the French situation (in the West Indies, in metropolitan France, and in terms of soil response), and a discussion of social and economic issues (governance, seismic regulation and nuclear safety, para-seismic protection of constructions). The report is completed by other large documents: presentation of data on the Japanese earthquake, discussion on prediction and governance errors in the management of earthquake mitigation in Japan, discussions on tsunami prevention, on needs of research on accelerometers, and on the seismic risk in France

  7. Seismic dynamics in advance and after the recent strong earthquakes in Italy and New Zealand

    Science.gov (United States)

    Nekrasova, A.; Kossobokov, V. G.

    2017-12-01

    We consider seismic events as a sequence of avalanches in the self-organized system of blocks and faults of the Earth's lithosphere and characterize earthquake series with the distribution of the control parameter η = τ × 10^(B(5−M)) × L^C of the Unified Scaling Law for Earthquakes, USLE (where τ is the inter-event time, B is analogous to the Gutenberg-Richter b-value, and C is the fractal dimension of the seismic locus). A systematic analysis of earthquake series in Central Italy and New Zealand, 1993-2017, suggests the existence, in the long term, of different, rather steady levels of seismic activity characterized by near-constant values of η, which, in the mid-term, intermittently switch at times of transitions associated with strong catastrophic events. At such a transition, seismic activity may, in the short term, follow different scenarios with inter-event time scaling of different kinds, including constant, logarithmic, power-law, exponential rise/decay, or a mixture of these. The results do not support the presence of universality in seismic energy release. The observed variability of seismic activity before and after strong (M6.0+) earthquakes in Italy and significant (M7.0+) earthquakes in New Zealand provides important constraints on modelling realistic earthquake sequences by geophysicists and can be used to improve local seismic hazard assessments, including earthquake forecast/prediction methodologies. The transitions of seismic regime in Central Italy and New Zealand that started in 2016 are still in progress and require special attention and geotechnical monitoring. It would be premature to make any definitive conclusions on the level of seismic hazard, which is evidently high at this particular moment in both regions. The study was supported by the Russian Science Foundation, Grant No. 16-17-00093.
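The control parameter η = τ × 10^(B(5−M)) × L^C is a simple product of observables, so its computation can be sketched directly. The input values below are arbitrary illustrative numbers, not data from the study:

```python
# Control parameter of the Unified Scaling Law for Earthquakes (USLE),
# as written in the abstract: eta = tau * 10**(B * (5 - M)) * L**C.
# tau: inter-event time; B: analogous to the Gutenberg-Richter b-value;
# M: magnitude; C: fractal dimension of the seismic locus; L: the size
# parameter entering the law. Example inputs below are illustrative only.

def usle_eta(tau, b_value, magnitude, size, fractal_dim):
    """Compute the USLE control parameter eta."""
    return tau * 10.0 ** (b_value * (5.0 - magnitude)) * size ** fractal_dim

# Example: tau = 30 days, B = 1.0, M = 6.0, L = 0.5, C = 1.2
eta = usle_eta(30.0, 1.0, 6.0, 0.5, 1.2)
print(eta)
```

Tracking the distribution of η over a sliding window is what reveals the steady activity levels and regime transitions described in the abstract.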

  8. Retrospective evaluation of the five-year and ten-year CSEP-Italy earthquake forecasts

    Directory of Open Access Journals (Sweden)

    Stefan Wiemer

    2010-11-01

    Full Text Available On August 1, 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of this CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented 18 five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We have considered here the twelve time-independent earthquake forecasts among this set, and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. We present the results of the tests that measure the consistency of the forecasts with past observations. As well as being an evaluation of the time-independent forecasts submitted, this exercise provides insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between robustness of results and experiment duration. We conclude with suggestions for the design of future earthquake predictability experiments.

  9. Spatial Distribution of the Coefficient of Variation for the Paleo-Earthquakes in Japan

    Science.gov (United States)

    Nomura, S.; Ogata, Y.

    2015-12-01

    Renewal processes, point processes in which intervals between consecutive events are independently and identically distributed, are frequently used to describe the mechanism of repeating earthquakes and to forecast the next earthquakes. However, one of the difficulties in applying recurrent earthquake models is the scarcity of historical data. Most studied fault segments have few, or only one, observed earthquakes, which often have poorly constrained historic and/or radiocarbon ages. The maximum likelihood estimate from such a small data set can have a large bias and error, which tends to yield a high probability for the next event in a very short time span when the recurrence intervals have similar lengths. On the other hand, recurrence intervals at a fault depend on average on the long-term slip rate caused by tectonic motion. In addition, recurrence times are fluctuated by nearby earthquakes or fault activities, which encourage or discourage surrounding seismicity. These factors have spatial trends due to the heterogeneity of tectonic motion and seismicity. Thus, this paper introduces a spatial structure on the key parameters of renewal processes for recurrent earthquakes and estimates it using spatial statistics. Spatial variations of the mean and variance parameters of recurrence times are estimated in a Bayesian framework, and the next earthquakes are forecasted by Bayesian predictive distributions. The proposed model is applied to a catalog of recurrent earthquakes in Japan, and its results are compared with the current forecast adopted by the Earthquake Research Committee of Japan.
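A minimal renewal-process forecast can be sketched as follows. This is not the paper's Bayesian spatial model: it assumes a lognormal recurrence distribution with known mean and coefficient of variation, and computes the conditional probability of an event in the next interval given the time elapsed since the last one:

```python
import math

# Minimal renewal-model forecast sketch (illustrative assumptions:
# lognormal recurrence distribution, known mean and coefficient of
# variation; the paper instead estimates these in a Bayesian framework).

def lognormal_cdf(x, mu, sigma):
    """CDF of a lognormal distribution with log-mean mu, log-sd sigma."""
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

def conditional_prob(mean, cov, elapsed, dt):
    """P(event within dt | no event in the elapsed time since the last)."""
    sigma2 = math.log(1.0 + cov ** 2)          # lognormal from mean and CoV
    mu = math.log(mean) - 0.5 * sigma2
    sigma = math.sqrt(sigma2)
    survive = 1.0 - lognormal_cdf(elapsed, mu, sigma)
    gain = lognormal_cdf(elapsed + dt, mu, sigma) - lognormal_cdf(elapsed, mu, sigma)
    return gain / survive

# Example: 1000-year mean recurrence, CoV 0.4, 900 years elapsed, next 50 years
print(conditional_prob(1000.0, 0.4, 900.0, 50.0))
```

The abstract's point is that with few observed intervals, plugging in maximum-likelihood parameter estimates here can badly overstate the short-term probability, which motivates borrowing strength spatially across faults.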

  10. Earthquake Clusters and Spatio-temporal Migration of earthquakes in Northeastern Tibetan Plateau: a Finite Element Modeling

    Science.gov (United States)

    Sun, Y.; Luo, G.

    2017-12-01

    Seismicity in a region is usually characterized by earthquake clusters and earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for us to investigate these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and spatio-temporal migration of earthquakes along major fault zones in northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore effects of topographic loading and viscosity of middle-lower crust and upper mantle on model results. Model results show that earthquakes and fault interactions increase Coulomb stress on the neighboring faults or segments, accelerating the future earthquakes in this region. Thus, earthquakes occur sequentially in a short time, leading to regional earthquake clusters. Through long-term evolution, stresses on some seismogenic faults, which are far apart, may almost simultaneously reach the critical state of fault failure, probably also leading to regional earthquake clusters and earthquake migration. Based on our model synthetic seismic catalog and paleoseismic data, we analyze probability of earthquake migration between major faults in northeastern Tibetan Plateau. We find that following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M≥7) in northeastern Tibetan Plateau would be most likely to occur on the Haiyuan fault.
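The fault-interaction mechanism invoked above is usually quantified by the Coulomb failure stress change. The sketch below uses the standard textbook definition with illustrative stress values, not the paper's finite-element output:

```python
# Coulomb failure stress change on a receiver fault (standard definition,
# not the paper's finite-element computation):
#   dCFS = d_tau + mu_eff * d_sigma_n
# d_tau: shear stress change resolved in the slip direction,
# d_sigma_n: normal stress change (unclamping positive),
# mu_eff: effective friction coefficient. Values below are illustrative.

def coulomb_stress_change(d_shear_mpa, d_normal_mpa, mu_eff=0.4):
    """Positive dCFS brings the receiver fault closer to failure."""
    return d_shear_mpa + mu_eff * d_normal_mpa

print(coulomb_stress_change(0.05, 0.02))  # 0.05 + 0.4*0.02 = 0.058 MPa
```

When one earthquake raises dCFS on a neighboring segment, that segment's failure is advanced, which is the clustering mechanism the model explores.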

  11. An Earthquake Prediction System Using The Time Series Analyses of Earthquake Property And Crust Motion

    International Nuclear Information System (INIS)

    Takeda, Fumihide; Takeo, Makoto

    2004-01-01

    We have developed a short-term deterministic earthquake (EQ) forecasting system similar to those used for typhoons and hurricanes, which has been under test operation at the website http://www.tec21.jp/ since June 2003. We use the focus and crust displacement data recently opened to the public by the Japanese seismograph and global positioning system (GPS) networks, respectively. Our system divides the forecasting area into five regional areas of Japan, each of which is about 5 deg. by 5 deg. We have found that it can forecast the focus, date of occurrence and magnitude (M) of an impending EQ (whose M is larger than about 6), all within narrow limits. We describe the system with two examples. One is the 2003/09/26 EQ of M 8 in the Hokkaido area, which is a retrospective (hindsight) case. The other is a successful rollout of the most recent forecast, of the 2004/05/30 EQ of M 6.7 off the coast of the southern Kanto (Tokyo) area.

  12. Modelling earth current precursors in earthquake prediction

    Directory of Open Access Journals (Sweden)

    R. Di Maio

    1997-06-01

    Full Text Available This paper deals with the theory of earth current precursors of earthquakes. A dilatancy-diffusion-polarization model is proposed to explain the anomalies of the electric potential which are observed on the ground surface prior to some earthquakes. The electric polarization is believed to be an electrokinetic effect due to the invasion of fluids into new pores which are opened inside a stressed, dilated rock body. The time and space variation of the distribution of the electric potential in a layered earth as well as in a faulted half-space is studied in detail. The results show that the surface response depends on the underground conductivity distribution and on the relative disposition of the measuring dipole with respect to the buried bipole source. A field procedure based on the use of an areal layout of the recording sites is proposed, in order to obtain the most complete information on the time and space evolution of the precursory phenomena in any given seismic region.

  13. Earthquake prediction analysis based on empirical seismic rate: the M8 algorithm

    Science.gov (United States)

    Molchan, G.; Romashkova, L.

    2010-12-01

    The quality of space-time earthquake prediction is usually characterized by a 2-D error diagram (n, τ), where n is the fraction of failures-to-predict and τ is the local rate of alarm averaged in space. The most reasonable averaging measure for analysis of a prediction strategy is the normalized rate of target events λ(dg) in a subarea dg. In that case the quantity H = 1 - (n + τ) determines the prediction capability of the strategy. The uncertainty of λ(dg) causes difficulties in estimating H and the statistical significance, α, of prediction results. We investigate this problem theoretically and show how the uncertainty of the measure can be taken into account in two situations, viz., the estimation of α and the construction of a confidence zone for the (n, τ)-parameters of the random strategies. We use our approach to analyse the results from prediction of M >= 8.0 events by the M8 method for the period 1985-2009 (the M8.0+ test). The model of λ(dg) based on the events Mw >= 5.5, 1977-2004, and the magnitude range of target events 8.0 <= M < 8.5 are considered as basic to this M8 analysis. We find the point and upper estimates of α and show that they are still unstable because the number of target events in the experiment is small. However, our results argue in favour of non-triviality of the M8 prediction algorithm.
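The (n, τ) error diagram and the prediction capability H = 1 - (n + τ) defined above are straightforward to compute from binary alarm and event fields. The sketch below uses tiny made-up arrays and a uniform measure λ(dg), purely for illustration:

```python
import numpy as np

def error_diagram(alarms, events, event_rate):
    """Compute (n, tau) and the prediction capability H = 1 - (n + tau).
    alarms: boolean array over space-time bins, True where an alarm is on.
    events: boolean array, True where a target event occurred.
    event_rate: normalized measure lambda(dg) over the same bins (sums to 1),
    used to weight the alarm rate, as in the abstract."""
    n = np.sum(events & ~alarms) / np.sum(events)  # fraction of failures-to-predict
    tau = np.sum(event_rate[alarms])               # measure-weighted alarm rate
    return n, tau, 1.0 - (n + tau)

# Toy example: 5 bins, uniform measure, alarms that catch both events
alarms = np.array([True, False, False, True, False])
events = np.array([True, False, False, True, False])
rate = np.full(5, 0.2)
n, tau, H = error_diagram(alarms, events, rate)
print(n, tau, H)
```

A random strategy has H near 0, so the statistical question the abstract addresses is whether the observed H could arise by chance given the uncertainty in λ(dg).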

  14. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered as a phenomenon of wave energy radiation by rupture (fracture) of solid Earth. However, the physics of dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and generation of strong waves, has not been fully understood yet. Instead, much of former investigation in seismology evaluated earthquake characteristics in terms of kinematics that does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to be explained by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable.

  15. Possible deep fault slip preceding the 2004 Parkfield earthquake, inferred from detailed observations of tectonic tremor

    Science.gov (United States)

    Shelly, David R.

    2009-01-01

    Earthquake predictability depends, in part, on the degree to which sudden slip is preceded by slow aseismic slip. Recently, observations of deep tremor have enabled inferences of deep slow slip even when detection by other means is not possible, but these data are limited to certain areas and mostly the last decade. The region near Parkfield, California, provides a unique convergence of several years of high-quality tremor data bracketing a moderate earthquake, the 2004 magnitude 6.0 event. Here, I present detailed observations of tectonic tremor from mid-2001 through 2008 that indicate deep fault slip both before and after the Parkfield earthquake that cannot be detected with surface geodetic instruments. While there is no obvious short-term precursor, I find unidirectional tremor migration accompanied by elevated tremor rates in the 3 months prior to the earthquake, which suggests accelerated creep on the fault ∼16 km beneath the eventual earthquake hypocenter.

  16. Long-term characteristics of geological conditions in Japan. Pt. 1. Fundamental concept for future's prediction of geological conditions and the subjects

    International Nuclear Information System (INIS)

    Tanaka, Kazuhiro; Chigira, Masahiro.

    1997-01-01

    It is very important to evaluate the long-term stability of geological conditions such as volcanic activity, uplift-subsidence, earthquakes, faulting and sea-level change when the long-term safety performance of HLW geological disposal is investigated. We proposed an extrapolation method using the geological data obtained for the last 500 ka of geologic time to predict future tectonic movements in Japan. Furthermore, we extracted the geological conditions that would affect the long-term safety of HLW geological disposal with regard to direct and indirect radionuclide release scenarios. As a result, it was concluded that volcanic activity and tectonic movements, including faulting and uplift-subsidence, should be considered, and that systems for surveying them and methods for evaluating them should be developed. (author)

  17. Global earthquake fatalities and population

    Science.gov (United States)

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
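    The population-proportional Poisson scaling described above can be sketched numerically: the expected count of catastrophic earthquakes scales with the population-time integral. This is an illustrative reconstruction, not the authors' code; the decadal population figures below are rough assumed values, and the resulting count depends entirely on them.

```python
# Sketch of a nonstationary Poisson rate proportional to world population:
# the expected number of catastrophic earthquakes scales with the
# person-year integral of the population curve.

def expected_events(obs_past, pop_past, pop_future):
    """Scale an observed past count by the ratio of population-time
    integrals (here crude decadal sums)."""
    return obs_past * sum(pop_future) / sum(pop_past)

# Rough decadal world population in billions (assumed values, one per decade).
pop_20th = [1.6, 1.7, 1.9, 2.1, 2.3, 2.5, 3.0, 3.7, 4.4, 5.3]
pop_21st = [6.1, 6.9, 7.8, 8.5, 9.0, 9.4, 9.7, 9.9, 10.0, 10.1]

# 4 earthquakes with >100,000 deaths were observed in the 20th century.
n_21st = expected_events(4, pop_20th, pop_21st)
```

    With any population curve that grows through 2100, the projected count exceeds the 20th-century count, which is the qualitative point of the abstract.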

  18. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for the Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of the parameters of both the Poisson and NBD distributions on the catalogue magnitude threshold and on the temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study higher statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values of the skewness and the kurtosis increase for smaller magnitude thresholds, and increase even more strongly for small temporal subdivisions of the catalogues. The Poisson distribution for large rate values approaches the Gaussian law; its skewness (1/√λ) and excess kurtosis (1/λ) therefore tend to zero as the rate grows.
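    The moment comparison described in this abstract can be illustrated with the closed-form moments of the two distributions. This is a generic sketch of Poisson versus negative-binomial overdispersion, not the paper's analysis; the parameter values are arbitrary, chosen so both distributions share the same mean.

```python
import math

# Closed-form moments: Poisson (mean = variance = lam, skewness = 1/sqrt(lam))
# versus the negative-binomial distribution (NBD), which at the same mean is
# overdispersed (variance > mean) and more skewed.

def poisson_moments(lam):
    return lam, lam, 1.0 / math.sqrt(lam)

def nbd_moments(r, p):
    # NBD counting failures before the r-th success with success probability p.
    mean = r * (1 - p) / p
    var = r * (1 - p) / p ** 2
    skew = (2 - p) / math.sqrt(r * (1 - p))
    return mean, var, skew

m_p, v_p, s_p = poisson_moments(100.0)
m_n, v_n, s_n = nbd_moments(r=10.0, p=10.0 / 110.0)  # same mean of 100
```

    Comparing empirical skewness and kurtosis of observed counts against these theoretical values is the essence of the moment test the abstract describes.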

  19. Radon anomalies prior to earthquakes (1). Review of previous studies

    International Nuclear Information System (INIS)

    Ishikawa, Tetsuo; Tokonami, Shinji; Yasuoka, Yumi; Shinogi, Masaki; Nagahama, Hiroyuki; Omori, Yasutaka; Kawada, Yusuke

    2008-01-01

    The relationship between radon anomalies and earthquakes has been studied for more than 30 years. However, most of these studies dealt with radon in soil gas or in groundwater. Before the 1995 Hyogoken-Nanbu earthquake, an anomalous increase of atmospheric radon was observed at Kobe Pharmaceutical University. The increase was well fitted by a mathematical model related to earthquake fault dynamics. This paper reports the significance of this observation, reviewing previous studies on radon anomalies before earthquakes. Groundwater and soil radon measurements for earthquake prediction began in the 1970s in Japan as well as in other countries. One of the most famous studies in Japan is the groundwater radon anomaly before the 1978 Izu-Oshima-kinkai earthquake. The significance of radon in earthquake prediction research has long been recognized, but its limitations have also recently been pointed out. Some researchers are looking for better precursor indicators; simultaneous measurements of radon and other gases are new trials in recent studies. In contrast to soil and groundwater radon, little attention has been paid to atmospheric radon before earthquakes. However, it might be possible to detect precursors in atmospheric radon before a large earthquake. In subsequent issues, we will discuss the details of the anomalous atmospheric radon data observed before the Hyogoken-Nanbu earthquake. (author)

  20. Co-Seismic Effect of the 2011 Japan Earthquake on the Crustal Movement Observation Network of China

    Directory of Open Access Journals (Sweden)

    Shaomin Yang

    2013-01-01

    Full Text Available Great earthquakes introduce measurable co-seismic displacements over regions hundreds to thousands of kilometers in width which, if not accounted for, may significantly bias the long-term surface velocity field constrained by GPS observations performed during a period encompassing the event. Here, we first present an estimation of the far-field co-seismic offsets associated with the 2011 Japan Mw 9.0 earthquake using GPS measurements from the Crustal Movement Observation Network of China (CMONOC) in North China. The uncertainties of the co-seismic offsets, whether at cGPS stations or at campaign sites, are better than 5-6 mm on average. We compare three methods to constrain the co-seismic offsets at the campaign sites in northeastern China: (1) interpolating cGPS co-seismic offsets, (2) estimating them from sparsely sampled time series, and (3) predicting them using a well-constrained slip model. We show that the interpolation of cGPS co-seismic offsets onto the campaign sites yields the best co-seismic offset solution for these sites. The source model gives a consistent prediction based on finite dislocation in a layered spherical Earth, which agrees with the best prediction with discrepancies of 2-10 mm for 32 campaign sites. Thus, the co-seismic offset model prediction is still a reasonable choice if a cGPS network with good coverage is not available, as in a very active region like the Tibetan Plateau, where numerous campaign GPS sites were displaced by recent large earthquakes.

  1. GIS BASED SYSTEM FOR POST-EARTHQUAKE CRISIS MANAGMENT USING CELLULAR NETWORK

    OpenAIRE

    Raeesi, M.; Sadeghi-Niaraki, A.

    2013-01-01

    Earthquakes are among the most destructive natural disasters. Earthquakes happen mainly near the edges of tectonic plates, but they may happen just about anywhere. Earthquakes cannot be predicted. Quick response after a disaster such as an earthquake decreases loss of life and costs. Massive earthquakes often cause structures to collapse, trapping victims under dense rubble for long periods of time. After an earthquake has destroyed an area, several teams are sent to find the location of the d...

  2. Retrospective Evaluation of the Five-Year and Ten-Year CSEP-Italy Earthquake Forecasts

    OpenAIRE

    Werner, M. J.; Zechar, J. D.; Marzocchi, W.; Wiemer, S.

    2010-01-01

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten...

  3. Surface rupturing earthquakes repeated in the 300 years along the ISTL active fault system, central Japan

    Science.gov (United States)

    Katsube, Aya; Kondo, Hisao; Kurosawa, Hideki

    2017-06-01

    Surface rupturing earthquakes produced by intraplate active faults generally have long recurrence intervals of a few thousand to tens of thousands of years. We here report the first evidence for an extremely short recurrence interval of 300 years for surface rupturing earthquakes on an intraplate system in Japan. The Kamishiro fault of the Itoigawa-Shizuoka Tectonic Line (ISTL) active fault system generated a Mw 6.2 earthquake in 2014. A paleoseismic trench excavation across the 2014 surface rupture showed evidence for the 2014 event and two prior paleoearthquakes. The slip of the penultimate earthquake was similar to that of the 2014 earthquake, and its timing was constrained to after A.D. 1645. Judging from the timing, the damaged area, and the amount of slip, the penultimate earthquake most probably corresponds to a historical earthquake in A.D. 1714. The recurrence interval of the two most recent earthquakes is thus extremely short compared with intervals on other active faults known globally. Furthermore, the slip repetition during the last three earthquakes accords with the time-predictable recurrence model rather than the characteristic earthquake model. In addition, the spatial extent of the 2014 surface rupture coincides with the distribution of a serpentinite block, suggesting that a relatively low coefficient of friction may account for the unusually frequent earthquakes. These findings would affect long-term forecasts of earthquake probability and seismic hazard assessments on active faults.

  4. Prediction of long-term creep curves

    International Nuclear Information System (INIS)

    Oikawa, Hiroshi; Maruyama, Kouichi

    1992-01-01

    This paper aims at discussing how to predict long-term irradiation enhanced creep properties from short-term tests. The predictive method based on the θ concept was examined by using creep data of ferritic steels. The method was successful in predicting creep curves including the tertiary creep stage as well as rupture lifetimes. Some material constants involved in the method are insensitive to the irradiation environment, and their values obtained in thermal creep are applicable to irradiation enhanced creep. The creep mechanisms of most engineering materials definitely change at the athermal yield stress in the non-creep regime. One should be aware that short-term tests must be carried out at stresses lower than the athermal yield stress in order to predict the creep behavior of structural components correctly. (orig.)

  5. Sun, Moon and Earthquakes

    Science.gov (United States)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km × 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences for magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat.+3.30, Long+95.980, Mb 9.0, EQ count 376); the right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All 376 events including the main event faithfully follow the straight-line curve.

  6. A Comparison of Geodetic and Geologic Rates Prior to Large Strike-Slip Earthquakes: A Diversity of Earthquake-Cycle Behaviors?

    Science.gov (United States)

    Dolan, James F.; Meade, Brendan J.

    2017-12-01

    Comparison of preevent geodetic and geologic rates in three large-magnitude (Mw = 7.6-7.9) strike-slip earthquakes reveals a wide range of behaviors. Specifically, geodetic rates of 26-28 mm/yr for the North Anatolian fault along the 1999 Mw = 7.6 Izmit rupture are ˜40% faster than Holocene geologic rates. In contrast, geodetic rates of ˜6-8 mm/yr along the Denali fault prior to the 2002 Mw = 7.9 Denali earthquake are only approximately half as fast as the latest Pleistocene-Holocene geologic rate of ˜12 mm/yr. In the third example where a sufficiently long pre-earthquake geodetic time series exists, the geodetic and geologic rates along the 2001 Mw = 7.8 Kokoxili rupture on the Kunlun fault are approximately equal at ˜11 mm/yr. These results are not readily explicable with extant earthquake-cycle modeling, suggesting that they may instead be due to some combination of regional kinematic fault interactions, temporal variations in the strength of lithospheric-scale shear zones, and/or variations in local relative plate motion rate. Whatever the exact causes of these variable behaviors, these observations indicate that either the ratio of geodetic to geologic rates before an earthquake may not be diagnostic of the time to the next earthquake, as predicted by many rheologically based geodynamic models of earthquake-cycle behavior, or different behaviors characterize different fault systems in a manner that is not yet understood or predictable.

  7. Adaptively smoothed seismicity earthquake forecasts for Italy

    Directory of Open Access Journals (Sweden)

    Yan Y. Kagan

    2010-11-01

    Full Text Available We present a model for estimation of the probabilities of future earthquakes of magnitudes m ≥ 4.95 in Italy. This model is a modified version of that proposed for California, USA, by Helmstetter et al. [2007] and Werner et al. [2010a], and it approximates seismicity using a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled. Magnitudes are independently and identically distributed according to a tapered Gutenberg-Richter magnitude distribution. We have estimated the spatial distribution of future seismicity by smoothing the locations of past earthquakes listed in two Italian catalogs: a short instrumental catalog, and a longer instrumental and historic catalog. The bandwidth of the adaptive spatial kernel is estimated by optimizing the predictive power of the kernel estimate of the spatial earthquake density in retrospective forecasts. When available and reliable, we used small earthquakes of m ≥ 2.95 to reveal active fault structures and probable future epicenters. By calibrating the model with these two catalogs of different durations to create two forecasts, we intend to quantify the loss (or gain) of predictability incurred when only a short, but recent, data record is available. Both forecasts were scaled to five and ten years, and have been submitted to the Italian prospective forecasting experiment of the global Collaboratory for the Study of Earthquake Predictability (CSEP). An earlier forecast from the model was submitted by Helmstetter et al. [2007] to the Regional Earthquake Likelihood Model (RELM) experiment in California, and with more than half of the five-year experimental period over, the forecast has performed better than the others.
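    The kernel-smoothing idea behind this forecast can be sketched as follows. This is a simplified stand-in, assuming a Gaussian kernel whose bandwidth is set by each epicenter's k-th-nearest-neighbor distance; the paper instead optimizes the bandwidth via retrospective forecasts, and the toy epicenters here are invented.

```python
import math

def adaptive_density(x, y, epicenters, k=2):
    """Kernel estimate of seismicity density at (x, y): each past epicenter
    contributes a Gaussian whose bandwidth adapts to its k-th-nearest-neighbor
    distance, so dense clusters get narrow kernels and isolated events broad
    ones."""
    total = 0.0
    for (ex, ey) in epicenters:
        dists = sorted(math.hypot(ex - ox, ey - oy)
                       for (ox, oy) in epicenters if (ox, oy) != (ex, ey))
        h = max(dists[k - 1], 1e-3)          # adaptive bandwidth
        r2 = (x - ex) ** 2 + (y - ey) ** 2
        total += math.exp(-r2 / (2 * h * h)) / (2 * math.pi * h * h)
    return total

# Invented epicenters: a tight cluster near the origin plus one remote event.
quakes = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0)]
near = adaptive_density(0.05, 0.05, quakes)  # inside the cluster
far = adaptive_density(3.0, 3.0, quakes)     # away from past activity
```

    Normalizing such a density over the study region and pairing it with a magnitude distribution yields the decoupled space-magnitude forecast structure the abstract describes.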

  8. Short-term volcano-tectonic earthquake forecasts based on a moving mean recurrence time algorithm: the El Hierro seismo-volcanic crisis experience

    Science.gov (United States)

    García, Alicia; De la Cruz-Reyna, Servando; Marrero, José M.; Ortiz, Ramón

    2016-05-01

    Under certain conditions, volcano-tectonic (VT) earthquakes may pose significant hazards to people living in or near active volcanic regions, especially on volcanic islands; however, hazard arising from VT activity caused by localized volcanic sources is rarely addressed in the literature. The evolution of VT earthquakes resulting from a magmatic intrusion shows some orderly behaviour that may allow the occurrence and magnitude of major events to be forecast. Thus governmental decision makers can be supplied with warnings of the increased probability of larger-magnitude earthquakes on the short-term timescale. We present here a methodology for forecasting the occurrence of large-magnitude VT events during volcanic crises; it is based on a mean recurrence time (MRT) algorithm that translates the Gutenberg-Richter distribution parameter fluctuations into time windows of increased probability of a major VT earthquake. The MRT forecasting algorithm was developed after observing a repetitive pattern in the seismic swarm episodes occurring between July and November 2011 at El Hierro (Canary Islands). From then on, this methodology has been applied to the consecutive seismic crises registered at El Hierro, achieving a high success rate in the real-time forecasting, within 10-day time windows, of volcano-tectonic earthquakes.
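    The Gutenberg-Richter parameter fluctuations that the MRT algorithm tracks are commonly estimated with the Aki maximum-likelihood b-value. The sketch below is a generic moving-window b-value estimator, not the authors' MRT implementation; the synthetic catalogue is generated deterministically with b = 1 by construction.

```python
import math

def b_value(mags, m_min):
    """Aki maximum-likelihood estimate of the Gutenberg-Richter b-value
    for magnitudes at or above the completeness threshold m_min."""
    m = [x for x in mags if x >= m_min]
    return math.log10(math.e) / (sum(m) / len(m) - m_min)

def moving_b(mags, m_min, window=50):
    """b-value over a moving event window: a crude proxy for the parameter
    fluctuations that the MRT algorithm translates into alarm windows."""
    return [b_value(mags[i:i + window], m_min)
            for i in range(len(mags) - window + 1)]

# Deterministic synthetic catalogue with b = 1 by construction (events are
# ordered by size here, so the moving windows only illustrate the mechanics).
n = 200
mags = [1.0 - math.log(1 - (i + 0.5) / n) / math.log(10) for i in range(n)]
b_series = moving_b(mags, m_min=1.0)
```

    In a real application the windows would run over events in time order, and drops in b would flag time windows of increased probability of a larger VT event.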

  9. Long-term geomagnetic changes observed in association with earthquake swarm activities in the Izu Peninsula, Japan

    Energy Technology Data Exchange (ETDEWEB)

    Oshiman, N. [Kyoto University Kyoto (Japan). Disaster Prevention Research Institute; Sasai, Y.; Ishikawa, Y.; Koyama, S. [Tokyo Univ., Tokyo (Japan). Earthquake Research Institute; Honkura, Y. [Tokyo Univ., Tokyo (Japan). Dept. of Earth and Planetary Sciences

    2001-04-01

    Anomalous crustal uplift has continued since 1976 in the Izu Peninsula, Japan. Earthquake swarms have also occurred intermittently off the coast of Ito since 1978. Observations of the total intensity of the geomagnetic field in the peninsula started in 1976 to detect anomalous changes in association with those crustal activities. In particular, a dense continuous observation network using proton magnetometers was established in the northeastern part of the peninsula, immediately after the sea-floor eruption off the coast of Ito in 1989. No remarkable swarm activities were observed there from 1990 to 1992. However, after the occurrence of a small swarm in January 1993, five large swarm activities were observed. At some observation sites, a remarkable long-term trend in the total geomagnetic field was observed, in association with changes in the spatial distribution of the earthquake swarm seismicity.

  10. Earthquake Prediction Analysis Based on Empirical Seismic Rate: The M8 Algorithm

    International Nuclear Information System (INIS)

    Molchan, G.; Romashkova, L.

    2010-07-01

    The quality of space-time earthquake prediction is usually characterized by a two-dimensional error diagram (n,τ), where n is the rate of failures-to-predict and τ is the normalized measure of space-time alarm. The most reasonable space measure for analysis of a prediction strategy is the rate of target events λ(dg) in a sub-area dg. In that case the quantity H = 1-(n +τ) determines the prediction capability of the strategy. The uncertainty of λ(dg) causes difficulties in estimating H and the statistical significance, α, of prediction results. We investigate this problem theoretically and show how the uncertainty of the measure can be taken into account in two situations, viz., the estimation of α and the construction of a confidence zone for the (n,τ)-parameters of the random strategies. We use our approach to analyse the results from prediction of M ≥ 8.0 events by the M8 method for the period 1985-2009 (the M8.0+ test). The model of λ(dg) based on the events Mw ≥ 5.5, 1977-2004, and the magnitude range of target events 8.0 ≤ M < 8.5 are considered as basic to this M8 analysis. We find the point and upper estimates of α and show that they are still unstable because the number of target events in the experiment is small. However, our results argue in favour of non-triviality of the M8 prediction algorithm. (author)
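    The error-diagram bookkeeping defined above (n, τ, and H = 1-(n+τ)) is simple enough to state directly in code. The function and the toy numbers below are illustrative assumptions, not the M8.0+ test data.

```python
def prediction_skill(missed, total_events, alarm_measure, total_measure):
    """Molchan error-diagram summary: n = failure-to-predict rate,
    tau = normalized space-time alarm measure, H = 1 - (n + tau).
    H = 0 corresponds to a trivial (random) strategy; H near 1 is ideal."""
    n = missed / total_events
    tau = alarm_measure / total_measure
    return n, tau, 1.0 - (n + tau)

# Toy numbers: 2 of 10 target events missed, alarms covering 30% of the
# (normalized) space-time volume.
n, tau, H = prediction_skill(2, 10, 0.3, 1.0)
```

    The abstract's point is that when the space measure is the (uncertain) rate of target events λ(dg), both τ and hence H inherit that uncertainty, which is what must be propagated into the significance estimate α.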

  11. Controls on the long term earthquake behavior of an intraplate fault revealed by U-Th and stable isotope analyses of syntectonic calcite veins

    Science.gov (United States)

    Williams, Randolph; Goodwin, Laurel; Sharp, Warren; Mozley, Peter

    2017-04-01

    U-Th dates on calcite precipitated in coseismic extension fractures in the Loma Blanca normal fault zone, Rio Grande rift, NM, USA, constrain earthquake recurrence intervals from 150-565 ka. This is the longest direct record of seismicity documented for a fault in any tectonic environment. Combined U-Th and stable isotope analyses of these calcite veins define 13 distinct earthquake events. These data show that for more than 400 ka the Loma Blanca fault produced earthquakes with a mean recurrence interval of 40 ± 7 ka. The coefficient of variation for these events is 0.40, indicating strongly periodic seismicity consistent with a time-dependent model of earthquake recurrence. Stochastic statistical analyses further validate the inference that earthquake behavior on the Loma Blanca was time-dependent. The time-dependent nature of these earthquakes suggests that the seismic cycle was fundamentally controlled by a stress renewal process. However, this periodic cycle was punctuated by an episode of clustered seismicity at 430 ka. Recurrence intervals within the earthquake cluster were as low as 5-11 ka. Breccia veins formed during this episode exhibit carbon isotope signatures consistent with having formed through pronounced degassing of a CO2 charged brine during post-failure, fault-localized fluid migration. The 40 ka periodicity of the long-term earthquake record of the Loma Blanca fault is similar in magnitude to recurrence intervals documented through paleoseismic studies of other normal faults in the Rio Grande rift and Basin and Range Province. We propose that it represents a background rate of failure in intraplate extension. The short-term, clustered seismicity that occurred on the fault records an interruption of the stress renewal process, likely by elevated fluid pressure in deeper structural levels of the fault, consistent with fault-valve behavior. 
The relationship between recurrence interval and inferred fluid degassing suggests that pore fluid pressure
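    The recurrence statistics quoted above (mean interval and coefficient of variation) follow directly from a list of dated events. The sketch below uses an invented quasi-periodic age series, not the Loma Blanca U-Th dates; a CoV well below 1 indicates quasi-periodic rather than random (Poissonian, CoV = 1) recurrence.

```python
import math

def recurrence_stats(event_ages_ka):
    """Mean recurrence interval and coefficient of variation (CoV = std/mean)
    from a list of event ages (in ka before present)."""
    ages = sorted(event_ages_ka, reverse=True)   # oldest event first
    intervals = [a - b for a, b in zip(ages, ages[1:])]
    mean = sum(intervals) / len(intervals)
    var = sum((x - mean) ** 2 for x in intervals) / len(intervals)
    return mean, math.sqrt(var) / mean

# Hypothetical quasi-periodic record (ages in ka), not the Loma Blanca data.
mean_ri, cov = recurrence_stats([565, 520, 485, 440, 400, 362, 318, 280])
```

    For these invented ages the intervals cluster tightly around their mean, so the CoV is small, the signature of the time-dependent, stress-renewal behaviour the abstract infers.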

  12. The limits of earthquake early warning: Timeliness of ground motion estimates

    OpenAIRE

    Minson, Sarah E.; Meier, Men-Andrin; Baltay, Annemarie S.; Hanks, Thomas C.; Cochran, Elizabeth S.

    2018-01-01

    The basic physics of earthquakes is such that strong ground motion cannot be expected from an earthquake unless the earthquake itself is very close or has grown to be very large. We use simple seismological relationships to calculate the minimum time that must elapse before such ground motion can be expected at a distance from the earthquake, assuming that the earthquake magnitude is not predictable. Earthquake early warning (EEW) systems are in operation or development for many regions aroun...

  13. Experimental study of structural response to earthquakes

    International Nuclear Information System (INIS)

    Clough, R.W.; Bertero, V.V.; Bouwkamp, J.G.; Popov, E.P.

    1975-01-01

    The objectives, methods, and some of the principal results obtained from experimental studies of the behavior of structures subjected to earthquakes are described. Although such investigations are being conducted in many laboratories throughout the world, the information presented deals specifically with projects being carried out at the Earthquake Engineering Research Center (EERC) of the University of California, Berkeley. A primary purpose of these investigations is to obtain detailed information on the inelastic response mechanisms in typical structural systems so that the experimentally observed performance can be compared with computer-generated analytical predictions. Only by such comparisons can the mathematical models used in dynamic nonlinear analyses be verified and improved. Two experimental procedures for investigating earthquake structural response are discussed: the earthquake simulator facility, which subjects the base of the test structure to acceleration histories similar to those recorded in actual earthquakes, and systems of hydraulic rams, which impose specified displacement histories on the test components, equivalent to motions developed in structures subjected to actual earthquakes. The general concept and performance of the 20-ft-square EERC earthquake simulator is described, and the testing of a two-story concrete frame building is outlined. Correlation of the experimental results with analytical predictions demonstrates that satisfactory agreement can be obtained only if the mathematical model incorporates a stiffness deterioration mechanism which simulates the cracking and other damage suffered by the structure.

  14. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.

  15. Analysis of the earthquake data and estimation of source parameters in the Kyungsang basin

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong-Moon; Lee, Jun-Hee [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-04-01

    The purpose of the present study is to determine the response spectrum for the Korean Peninsula, to estimate the seismic source parameters, and to analyze and simulate ground motion from the seismic characteristics of the Korean Peninsula and compare it with the real data. The estimated seismic source parameters, such as the apparent seismic stress drop, are somewhat unstable because the data are insufficient. As instrumental earthquake data accumulate in the future, these parameters may be refined. Although the equations presented in this report are derived from limited data, they can be utilized both in seismology and in earthquake engineering. Finally, predictive equations may be given in terms of magnitude and hypocentral distance using these parameters. The estimation of the predictive equation constructed from the simulation is the object of further study. 34 refs., 27 figs., 10 tabs. (Author)

  16. Does paleoseismology forecast the historic rates of large earthquakes on the San Andreas fault system?

    Science.gov (United States)

    Biasi, Glenn; Scharer, Katherine M.; Weldon, Ray; Dawson, Timothy E.

    2016-01-01

    The 98-year open interval since the most recent ground-rupturing earthquake in the greater San Andreas boundary fault system would not be predicted by the quasi-periodic recurrence statistics from paleoseismic data. We examine whether the current hiatus could be explained by uncertainties in earthquake dating. Using seven independent paleoseismic records, 100-year intervals may have occurred circa 1150, 1400, and 1700 AD, but they occur in a third or less of sample records drawn at random. A second method, sampling from dates conditioned on the existence of a gap of varying length, suggests century-long gaps occur 3-10% of the time. A combined record with more sites would lead to lower probabilities. Systematic data over-interpretation is considered an unlikely explanation. Instead some form of non-stationary behaviour seems required, perhaps through long-range fault interaction. Earthquake occurrence since 1000 AD is not inconsistent with long-term cyclicity suggested from long runs of earthquake simulators.
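    The gap-probability question posed in this abstract can be explored with a small Monte Carlo, as sketched below. The normal recurrence-interval model, the parameter values, and the function name are assumptions for illustration, not the paper's method of sampling dated paleoseismic records.

```python
import random

def gap_probability(mean_ri, cov, record_len, gap_len, trials=2000, seed=1):
    """Monte Carlo estimate of the chance that a quasi-periodic earthquake
    record of length record_len (years) contains an interval >= gap_len.
    Intervals are drawn from a normal distribution with mean mean_ri and
    standard deviation cov * mean_ri (floored at 1 year), a simplification
    of paleoseismic recurrence models."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        t, found = 0.0, False
        while t < record_len:
            dt = max(rng.gauss(mean_ri, cov * mean_ri), 1.0)
            if dt >= gap_len:
                found = True
            t += dt
        if found:
            hits += 1
    return hits / trials

# Assumed parameters: 50-year mean recurrence, CoV 0.3, 1000-year record.
p_gap = gap_probability(mean_ri=50, cov=0.3, record_len=1000, gap_len=100)
```

    Under these assumed quasi-periodic parameters a century-long gap is rare, which mirrors the abstract's finding that the current hiatus is hard to reconcile with stationary quasi-periodic recurrence.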

  17. Modelling the elements of country vulnerability to earthquake disasters.

    Science.gov (United States)

    Asef, M R

    2008-09-01

    Earthquakes have probably been the most deadly form of natural disaster in the past century. Diversity of earthquake specifications in terms of magnitude, intensity and frequency at the semicontinental scale has initiated various kinds of disasters at a regional scale. Additionally, diverse characteristics of countries in terms of population size, disaster preparedness, economic strength and building construction development often cause an earthquake of given characteristics to have different impacts on the affected regions. This research focuses on appropriate criteria for identifying the severity of major earthquake disasters based on key observed symptoms. Accordingly, the article presents a methodology for the identification and relative quantification of the severity of earthquake disasters. This has led to an earthquake disaster vulnerability model at the country scale. Data analysis based on this model suggested a quantitative, comparative and meaningful interpretation of the vulnerability of the countries concerned, and successfully explained which countries are more vulnerable to major disasters.

  18. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  19. Parallelization of the Coupled Earthquake Model

    Science.gov (United States)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.

    2007-01-01

    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, tsunami prediction over the Internet had never been done before. This new code directly couples the earthquake model and the ocean model on parallel computers, improving simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  20. Earthquake-induced water-level fluctuations at Yucca Mountain, Nevada, June 1992

    International Nuclear Information System (INIS)

    O'Brien, G.M.

    1993-01-01

    This report presents earthquake-induced water-level and fluid-pressure data for wells in the Yucca Mountain area, Nevada, during June 1992. Three earthquakes occurred which caused significant water-level and fluid-pressure responses in wells. Wells USW H-5 and USW H-6 are continuously monitored to detect short-term responses caused by earthquakes. Two wells, monitored hourly, had significant, longer-term responses in water level following the earthquakes. On June 28, 1992, a 7.5-magnitude earthquake occurred near Landers, California, causing an estimated maximum water-level change of 90 centimeters in well USW H-5. Three hours later a 6.6-magnitude earthquake occurred near Big Bear Lake, California; the maximum water-level fluctuation was 20 centimeters in well USW H-5. A 5.6-magnitude earthquake occurred at Little Skull Mountain, Nevada, on June 29, approximately 23 kilometers from Yucca Mountain. The maximum estimated short-term water-level fluctuation from the Little Skull Mountain earthquake was 40 centimeters in well USW H-5. The water level in well UE-25p #1, monitored hourly, decreased approximately 50 centimeters over 3 days following the Little Skull Mountain earthquake. The water level in UE-25p #1 returned to pre-earthquake levels in approximately 6 months. The water level in the lower interval of well USW H-3 increased 28 centimeters following the Little Skull Mountain earthquake. The Landers and Little Skull Mountain earthquakes caused responses in 17 intervals of 14 hourly monitored wells; however, most responses were small and of short duration. For several days following the major earthquakes, many smaller magnitude aftershocks occurred, causing measurable responses in the continuously monitored wells.

  1. Predicting earthquakes by analyzing accelerating precursory seismic activity

    Science.gov (United States)

    Varnes, D.J.

    1989-01-01

    During 11 sequences of earthquakes that in retrospect can be classed as foreshocks, the accelerating rate at which seismic moment is released follows, at least in part, a simple equation. This equation (1) is dΣ/dt = C/(tf - t)^n, where Σ is the cumulative sum until time, t, of the square roots of seismic moments of individual foreshocks computed from reported magnitudes; C and n are constants; and tf is a limiting time at which the rate of seismic moment accumulation becomes infinite. The possible time of a major foreshock or main shock, tf, is found by the best fit of equation (1), or its integral, to step-like plots of Σ versus time, using successive estimates of tf in linearized regressions until the maximum coefficient of determination, r^2, is obtained. Analyzed examples include sequences preceding earthquakes at Cremasta, Greece, 2/5/66; Haicheng, China, 2/4/75; Oaxaca, Mexico, 11/29/78; Petatlan, Mexico, 3/14/79; and Central Chile, 3/3/85. In 29 estimates of main-shock time, made as the sequences developed, the errors in 20 were less than one-half, and in 9 less than one-tenth, the time remaining between the time of the last data used and the main shock. Some precursory sequences, or parts of them, yield no solution. Two sequences appear to include in their first parts the aftershocks of a previous event; plots using the integral of equation (1) show that the sequences are easily separable into aftershock and foreshock segments. Synthetic seismic sequences of shocks at equal time intervals were constructed to follow equation (1), using four values of n. In each series the resulting distributions of magnitudes closely follow the linear Gutenberg-Richter relation log N = a - bM, and the product n times b for each series is the same constant. In various forms and for decades, equation (1) has been used successfully to predict failure times of stressed metals and ceramics, landslides in soil and rock slopes, and volcanic
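The fitting procedure described in the abstract (successive trial values of tf in linearized regressions, keeping the one that maximizes the coefficient of determination) can be sketched on synthetic data; all constants below are arbitrary illustration values:

```python
import math

# Synthetic foreshock sequence following the integral of equation (1):
# Sigma(t) = Sigma_f - (C / (1 - n)) * (tf - t)**(1 - n), with n < 1.
tf_true, C, n, sigma_f = 100.0, 5.0, 0.5, 50.0
times = list(range(0, 96, 5))
sigma = [sigma_f - (C / (1 - n)) * (tf_true - t) ** (1 - n) for t in times]

def r_squared(x, y):
    """Coefficient of determination of the simple linear regression y ~ x."""
    N = len(x)
    mx, my = sum(x) / N, sum(y) / N
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy)

# For a trial failure time tf, the model is linear in (tf - t)**(1 - n);
# scan candidate tf values and keep the one maximizing r^2.
best_tf, best_r2 = None, -1.0
for tf in [x / 10.0 for x in range(970, 1200)]:
    xs = [(tf - t) ** (1 - n) for t in times]
    r2 = r_squared(xs, sigma)
    if r2 > best_r2:
        best_tf, best_r2 = tf, r2

print(round(best_tf, 1))  # best-fit failure time, close to tf_true
```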

  2. Two critical tests for the Critical Point earthquake

    Science.gov (United States)

    Tzanis, A.; Vallianatos, F.

    2003-04-01

    It has been credibly argued that the earthquake generation process is a critical phenomenon culminating with a large event that corresponds to some critical point. In this view, a great earthquake represents the end of a cycle on its associated fault network and the beginning of a new one. The dynamic organization of the fault network evolves as the cycle progresses and a great earthquake becomes more probable, thereby rendering possible the prediction of the cycle’s end by monitoring the approach of the fault network toward a critical state. This process may be described by a power-law time-to-failure scaling of the cumulative seismic release rate. Observational evidence has confirmed the power-law scaling in many cases and has empirically determined that the critical exponent in the power law is typically of the order n=0.3. There are also two theoretical predictions for the value of the critical exponent. Ben-Zion and Lyakhovsky (Pure appl. geophys., 159, 2385-2412, 2002) give n=1/3. Rundle et al. (Pure appl. geophys., 157, 2165-2182, 2000) show that the power-law activation associated with a spinodal instability is essentially identical to the power-law acceleration of Benioff strain observed prior to earthquakes; in this case n=0.25. More recently, the CP model has gained support from the development of more dependable models of regional seismicity with realistic fault geometry that show accelerating seismicity before large events. Essentially, these models involve stress transfer to the fault network during the cycle such that the region of accelerating seismicity will scale with the size of the culminating event, as for instance in Bowman and King (Geophys. Res. Lett., 38, 4039-4042, 2001). It is thus possible to understand the observed characteristics of distributed accelerating seismicity in terms of a simple process of increasing tectonic stress in a region already subjected to stress inhomogeneities at all scale lengths. Then, the region of

  3. Implications of next generation attenuation ground motion prediction equations for site coefficients used in earthquake resistant design

    Science.gov (United States)

    Borcherdt, Roger D.

    2014-01-01

    Proposals are developed to update Tables 11.4-1 and 11.4-2 of Minimum Design Loads for Buildings and Other Structures published as American Society of Civil Engineers Structural Engineering Institute standard 7-10 (ASCE/SEI 7–10). The updates are mean next generation attenuation (NGA) site coefficients inferred directly from the four NGA ground motion prediction equations used to derive the maximum considered earthquake response maps adopted in ASCE/SEI 7–10. Proposals include the recommendation to use straight-line interpolation to infer site coefficients at intermediate values of vS30 (average shear velocity to 30-m depth). The NGA coefficients are shown to agree well with adopted site coefficients at low levels of input motion (0.1 g) and those observed from the Loma Prieta earthquake. For higher levels of input motion, the majority of the adopted values are within the 95% epistemic-uncertainty limits implied by the NGA estimates, with the exceptions being the mid-period site coefficient, Fv, for site class D and the short-period coefficient, Fa, for site class C, both of which are slightly less than the corresponding 95% limit. The NGA database shows that the median vS30 value of 913 m/s for site class B is more typical than 760 m/s as a value to characterize firm to hard rock sites as the uniform ground condition for future maximum considered earthquake response ground motion estimates. Future updates of NGA ground motion prediction equations can be incorporated easily into future adjustments of adopted site coefficients using procedures presented herein.
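The recommended straight-line interpolation of a site coefficient at an intermediate vS30 can be sketched as follows (the tabulated coefficient values are hypothetical placeholders, not the ASCE/SEI 7–10 tables):

```python
# Hypothetical (vS30 m/s, Fa) pairs standing in for one row of a
# site-coefficient table; values are placeholders for illustration only.
table = [(180.0, 1.6), (360.0, 1.2), (760.0, 1.0)]

def interp_fa(vs30):
    """Straight-line interpolation of Fa at an intermediate vS30 (m/s)."""
    pts = sorted(table)
    if vs30 <= pts[0][0]:
        return pts[0][1]   # clamp below the tabulated range
    if vs30 >= pts[-1][0]:
        return pts[-1][1]  # clamp above the tabulated range
    for (v0, f0), (v1, f1) in zip(pts, pts[1:]):
        if v0 <= vs30 <= v1:
            return f0 + (f1 - f0) * (vs30 - v0) / (v1 - v0)

print(round(interp_fa(560.0), 3))  # halfway between 360 and 760 m/s -> 1.1
```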

  4. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  5. Analysis of the Source and Ground Motions from the 2017 M8.2 Tehuantepec and M7.1 Puebla Earthquakes

    Science.gov (United States)

    Melgar, D.; Sahakian, V. J.; Perez-Campos, X.; Quintanar, L.; Ramirez-Guzman, L.; Spica, Z.; Espindola, V. H.; Ruiz-Angulo, A.; Cabral-Cano, E.; Baltay, A.; Geng, J.

    2017-12-01

    The September 2017 Tehuantepec and Puebla earthquakes were intra-slab earthquakes that together caused significant damage in broad regions of Mexico, including the states of Oaxaca, Chiapas, Morelos, Puebla, Mexico, and Mexico City. Ground motions in Mexico City have approximately the same angle of incidence from both earthquakes and potentially sample similar paths close to the city. We examine site effects and source terms by analysis of residuals between Ground-Motion Prediction Equations (GMPEs) and observed ground motions for both of these events at stations from the Servicio Sismológico Nacional, Instituto de Ingeniería, and the Instituto de Geofísica Red del Valle de Mexico networks. GMPEs are a basis for seismic design, but also provide median ground motion values to act as a basis for comparison of individual earthquakes and site responses. First, we perform finite-fault slip inversions for Tehuantepec with high-rate GPS, static GPS, tide gauge and DART buoy data, and for Puebla with high-rate GPS and strong motion data. Using the distance from the stations with ground motion observations to the derived slip models, we use the GMPEs of Garcia et al. (2005), Zhao et al. (2006), and Abrahamson, Silva and Kamai (2014), to compute predicted values of peak ground acceleration and velocity (PGA and PGV) and response spectral accelerations (SA). Residuals between observed and predicted ground motion parameters are then computed for each recording, and are decomposed into event and site components using a mixed effects regression. We analyze these residuals as an adjustment away from median ground motions in the region to glean information about the earthquake source properties, as well as local site response in and outside of the Mexico City basin. The event and site terms are then compared with available values of stress drop for the two earthquakes, and Vs30 values for the sites, respectively. This analysis is useful in determining which GMPE is most
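The decomposition of residuals into event and site components can be illustrated with a toy calculation; the full mixed-effects regression described in the abstract is simplified here to sequential averaging, and all residual values are hypothetical:

```python
# Hypothetical total residuals ln(observed) - ln(GMPE median) for two
# events recorded at two stations (values invented for illustration).
residuals = {
    ("tehuantepec", "A"): 0.40, ("tehuantepec", "B"): -0.10,
    ("puebla", "A"): 0.60, ("puebla", "B"): 0.10,
}

events = {e for e, _ in residuals}
stations = {s for _, s in residuals}

# Event term: mean residual over stations recording that event.
event_term = {
    e: sum(r for (ev, _), r in residuals.items() if ev == e)
       / sum(1 for (ev, _) in residuals if ev == e)
    for e in events
}

# Site term: mean of what remains after removing the event term.
site_term = {
    s: sum(r - event_term[ev] for (ev, st), r in residuals.items() if st == s)
       / sum(1 for (_, st) in residuals if st == s)
    for s in stations
}

print(sorted(event_term.items()))
print(sorted(site_term.items()))
```

A positive site term (station A here) marks systematic amplification relative to the GMPE median, the kind of signal basin sites in Mexico City would show.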

  6. Cyclic migration of weak earthquakes between Lunigiana earthquake of October 10, 1995 and Reggio Emilia earthquake of October 15, 1996 (Northern Italy)

    Science.gov (United States)

    di Giovambattista, R.; Tyupkin, Yu

    The cyclic migration of weak earthquakes (M 2.2) which occurred during the year prior to the October 15, 1996 (M = 4.9) Reggio Emilia earthquake is discussed in this paper. The onset of this migration was associated with the occurrence of the October 10, 1995 (M = 4.8) Lunigiana earthquake about 90 km southwest from the epicenter of the Reggio Emilia earthquake. At least three series of earthquakes migrating from the epicentral area of the Lunigiana earthquake in the northeast direction were observed. The migration of earthquakes of the first series terminated at a distance of about 30 km from the epicenter of the Reggio Emilia earthquake. The earthquake migration of the other two series halted at about 10 km from the Reggio Emilia epicenter. The average rate of earthquake migration was about 200-300 km/year, while the time of recurrence of the observed cycles varied from 68 to 178 days. Weak earthquakes migrated along the transversal fault zones and sometimes jumped from one fault to another. A correlation between the migrating earthquakes and tidal variations is analysed. We discuss the hypothesis that the analyzed area is in a state of stress approaching the limit of the long-term durability of crustal rocks and that the observed cyclic migration is a result of a combination of a more or less regular evolution of tectonic and tidal variations.

  7. Recovering from the ShakeOut earthquake

    Science.gov (United States)

    Wein, Anne; Johnson, Laurie; Bernknopf, Richard

    2011-01-01

    Recovery from an earthquake like the M7.8 ShakeOut Scenario will be a major endeavor taking many years to complete. Hundreds of Southern California municipalities will be affected; most lack recovery plans or previous disaster experience. To support recovery planning, this paper 1) extends the regional ShakeOut Scenario analysis into the recovery period using a recovery model, 2) localizes analyses to identify longer-term impacts and issues in two communities, and 3) considers the regional context of local recovery. Key community insights about preparing for post-disaster recovery include the need to: geographically diversify city procurement; set earthquake mitigation priorities for critical infrastructure (e.g., the airport); plan to replace mobile homes with earthquake safety measures; consider post-earthquake redevelopment opportunities ahead of time; and develop post-disaster recovery management and governance structures. This work also showed that communities with minor damage are still sensitive to regional infrastructure damage and its potential long-term impacts on community recovery. This highlights the importance of community and infrastructure resilience strategies as well.

  8. Implication of conjugate faulting in the earthquake brewing and originating process

    Energy Technology Data Exchange (ETDEWEB)

    Jones, L.M. (Massachusetts Inst. of Tech., Cambridge); Deng, Q.; Jiang, P.

    1980-03-01

    The earthquake sequences, precursory anomalies and geological-structural background of the Haicheng, Tangshan and Songpan-Pingwu earthquakes are discussed in this article. All of these earthquakes occurred in a seismic zone controlled by the main boundary faults of an intraplate fault block. However, the fault plane of a main earthquake does not consist of the same faults but is rather a related secondary fault. Together they formed a conjugate shear rupture zone under the action of a regional tectonic stress field. As to the earthquake sequence, the foreshocks and aftershocks might occur on the conjugate fault planes within an epicentral region rather than be limited to the fault plane of the main earthquake, as in the distribution of foreshocks and aftershocks of the Haicheng earthquake. The characteristics of the long-, medium-, and imminent-term earthquake precursory anomalies of the three mentioned earthquakes, especially the character of well-studied anomalous phenomena in electrical resistivity, radon emission, groundwater and animal behavior, have been investigated. The studies of these earthquake precursors show that they were distributed in an area rather more extensive than the epicentral region. Some fault zones in the conjugate fault network usually appeared as distributed belts or concentrated zones of earthquake precursory anomalies, and can be traced in the medium-long-term precursory field, but seem more distinct in the short-imminent-term precursory anomalous field. These characteristics can be explained by the rupture and sliding originating along the conjugate shear network and the concentration of stress in the regional stress field.

  9. Cooperative earthquake research between the United States and the People's Republic of China

    Energy Technology Data Exchange (ETDEWEB)

    Russ, D.P.; Johnson, L.E.

    1986-01-01

    This paper describes cooperative research by scientists of the US and the People's Republic of China (PRC) which has resulted in important new findings concerning the fundamental characteristics of earthquakes and new insight into mitigating earthquake hazards. There have been over 35 projects cooperatively sponsored by the Earthquake Studies Protocol in the past 5 years. The projects are organized into seven annexes, including investigations in earthquake prediction, intraplate faults and earthquakes, earthquake engineering and hazards investigation, deep crustal structure, rock mechanics, seismology, and data exchange. Operational earthquake prediction experiments are currently being developed at two primary sites: western Yunnan Province near the town of Xiaguan, where there are several active faults, and the northeast China plain, where the devastating 1976 Tangshan earthquake occurred.

  10. An interdisciplinary approach to study Pre-Earthquake processes

    Science.gov (United States)

    Ouzounov, D.; Pulinets, S. A.; Hattori, K.; Taylor, P. T.

    2017-12-01

    We will summarize a multi-year research effort on wide-ranging observations of pre-earthquake processes. Based on space and ground data, we present some new results relevant to the existence of pre-earthquake signals. Over the past 15-20 years there has been a major revival of interest in pre-earthquake studies in Japan, Russia, China, the EU, Taiwan and elsewhere. Recent large magnitude earthquakes in Asia and Europe have shown the importance of these various studies in the search for earthquake precursors, whether for forecasting or prediction. Some new results were obtained from modeling of the atmosphere-ionosphere connection and analyses of seismic records (foreshocks/aftershocks), geochemical, electromagnetic, and thermodynamic processes related to stress changes in the lithosphere, along with their statistical and physical validation. This cross-disciplinary approach could make an impact on our further understanding of the physics of earthquakes and the phenomena that precede their energy release. We also present the potential impact of these interdisciplinary studies on earthquake predictability. A detailed summary of our approach and that of several international researchers will be part of this session and will subsequently be published in a new AGU/Wiley volume. This book is part of the Geophysical Monograph series and is intended to show the variety of parameters (seismic, atmospheric, geochemical and historical) involved in this important field of research, and to bring this knowledge and awareness to a broader geosciences community.

  11. Ionospheric precursors for crustal earthquakes in Italy

    Directory of Open Access Journals (Sweden)

    L. Perrone

    2010-04-01

    Crustal earthquakes with magnitude 6.0>M≥5.5 observed in Italy for the period 1979–2009, including the last one at L'Aquila on 6 April 2009, were considered to check whether the relationships obtained earlier for ionospheric precursors of strong Japanese earthquakes are valid for moderate Italian earthquakes. The ionospheric precursors are based on the observed variations of the sporadic E-layer parameters (h'Es, fbEs) and foF2 at the ionospheric station Rome. Empirical dependencies for the seismo-ionospheric disturbances relating the earthquake magnitude and the epicenter distance are obtained, and they have been shown to be similar to those obtained earlier for Japanese earthquakes. The dependencies indicate the process of spreading of the disturbance from the epicenter towards the periphery during the earthquake preparation process. Large lead times for the precursor occurrence (up to 34 days for M=5.8–5.9) indicate a prolonged preparation period. A possibility of using the obtained relationships for earthquake prediction is discussed.

  12. Prediction of accident sequence probabilities in a nuclear power plant due to earthquake events

    International Nuclear Information System (INIS)

    Hudson, J.M.; Collins, J.D.

    1980-01-01

    This paper presents a methodology for predicting accident probabilities in nuclear power plants subject to earthquakes. The resulting computer program accesses response data to compute component failure probabilities using fragility functions. Using logical failure definitions for systems and the calculated component failure probabilities, initiating-event and safety-system failure probabilities are synthesized. The incorporation of accident sequence expressions allows the calculation of terminal event probabilities. Accident sequences, with their occurrence probabilities, are finally coupled to a specific release category. A unique aspect of the methodology is an analytical procedure for calculating top event probabilities based on the correlated failure of primary events.
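The chain from fragility functions to system failure probabilities can be sketched as follows (the lognormal fragility form is a standard choice; the median capacities, beta values, demand level, and component names are hypothetical):

```python
import math

def fragility(pga, median, beta):
    """P(component fails | peak ground acceleration), lognormal fragility."""
    z = math.log(pga / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

pga = 0.5  # g, demand from the seismic response analysis (hypothetical)
p_pump = fragility(pga, median=0.9, beta=0.4)
p_valve = fragility(pga, median=1.2, beta=0.5)

# Simple system logic, assuming independent component failures:
# OR gate  -> system fails if either component fails
# AND gate -> system fails only if both (redundant train) fail
p_or = 1.0 - (1.0 - p_pump) * (1.0 - p_valve)
p_and = p_pump * p_valve

print(round(p_pump, 4), round(p_valve, 4), round(p_or, 4), round(p_and, 4))
```

The independence assumption is the simplification; the paper's distinctive contribution is handling correlated failures of primary events, which this sketch omits.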

  13. Long‐term creep rates on the Hayward Fault: evidence for controls on the size and frequency of large earthquakes

    Science.gov (United States)

    Lienkaemper, James J.; McFarland, Forrest S.; Simpson, Robert W.; Bilham, Roger; Ponce, David A.; Boatwright, John; Caskey, S. John

    2012-01-01

    The Hayward fault (HF) in California exhibits large (Mw 6.5–7.1) earthquakes with short recurrence times (161±65 yr), probably kept short by a 26%–78% aseismic release rate (including postseismic). Its interseismic release rate varies locally over time, as we infer from many decades of surface creep data. Earliest estimates of creep rate, primarily from infrequent surveys of offset cultural features, revealed distinct spatial variation in rates along the fault, but no detectable temporal variation. Since the 1989 Mw 6.9 Loma Prieta earthquake (LPE), monitoring on 32 alinement arrays and 5 creepmeters has greatly improved the spatial and temporal resolution of creep rate. We now identify significant temporal variations, mostly associated with local and regional earthquakes. The largest rate change was a 6‐yr cessation of creep along a 5‐km length near the south end of the HF, attributed to a regional stress drop from the LPE, ending in 1996 with a 2‐cm creep event. North of there near Union City starting in 1991, rates apparently increased by 25% above pre‐LPE levels on a 16‐km‐long reach of the fault. Near Oakland in 2007 an Mw 4.2 earthquake initiated a 1–2 cm creep event extending 10–15 km along the fault. Using new better‐constrained long‐term creep rates, we updated earlier estimates of depth to locking along the HF. The locking depths outline a single, ∼50‐km‐long locked or retarded patch with the potential for an Mw∼6.8 event equaling the 1868 HF earthquake. We propose that this inferred patch regulates the size and frequency of large earthquakes on HF.

  14. Information Theoric Framework for the Earthquake Recurrence Models : Methodica Firma Per Terra Non-Firma

    International Nuclear Information System (INIS)

    Esmer, Oezcan

    2006-01-01

    This paper first evaluates the earthquake prediction method (1999) used by the US Geological Survey as the lead example and also reviews the recent models. Secondly, it points out the ongoing debate on the predictability of earthquake recurrences and lists the main claims of both sides. The traditional methods and the 'frequentist' approach used in determining earthquake probabilities have not ended the complaints that earthquakes are unpredictable. It is argued that the prevailing 'crisis' in seismic research corresponds to the Pre-Maxent Age of the current situation. The period of Kuhnian 'Crisis' should give rise to a new paradigm based on the Information-Theoric framework including the inverse problem, Maxent and Bayesian methods. The paper aims to show that the information-theoric methods shall provide the required 'Methodica Firma' for the earthquake prediction models.

  15. Centrality in earthquake multiplex networks

    Science.gov (United States)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

    Seismic time series have been mapped as complex networks, where a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
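The single-layer version of the catalog-to-network mapping can be sketched as follows (the catalog coordinates are invented for illustration; eigenvector centrality is computed by power iteration):

```python
# Hypothetical catalog: (lat, lon) of successive events in time order.
quakes = [(35.1, 51.2), (35.9, 50.8), (35.1, 51.2), (36.4, 52.1), (35.3, 51.7)]

def cell(lat, lon):
    """Map an epicenter to a 1-degree grid cell (a network node)."""
    return (int(lat), int(lon))

nodes = sorted({cell(*q) for q in quakes})
idx = {c: i for i, c in enumerate(nodes)}
n = len(nodes)
A = [[0.0] * n for _ in range(n)]
for a, b in zip(quakes, quakes[1:]):
    i, j = idx[cell(*a)], idx[cell(*b)]
    if i != j:
        A[i][j] = A[j][i] = 1.0  # link the cells of successive events

# Eigenvector centrality by power iteration on I + A (the diagonal shift
# keeps the iteration from oscillating on bipartite graphs).
v = [1.0] * n
for _ in range(100):
    w = [v[i] + sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]

centrality = dict(zip(nodes, v))
print(max(centrality, key=centrality.get))  # most central cell
```

In the multiplex extension of the paper, each time window forms its own layer and the centrality is tracked across layers; the sketch above covers only one layer.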

  16. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    Science.gov (United States)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    getting a hit is N%" or "the probability of an earthquake is N%" involves specifying the assumptions made. Different plausible assumptions yield a wide range of estimates. In both seismology and sports, how to better predict future performance remains an important question.

  17. Memory effect in M ≥ 7 earthquakes of Taiwan

    Science.gov (United States)

    Wang, Jeen-Hwa

    2014-07-01

    The M ≥ 7 earthquakes that occurred in the Taiwan region during 1906-2006 are taken to study the possibility of a memory effect existing in the sequence of those large earthquakes. Those events are all mainshocks. The fluctuation analysis technique is applied to analyze two sequences, in terms of earthquake magnitude and inter-event time, represented in the natural time domain. For both magnitude and inter-event time, the calculations are made for three data sets, i.e., the original-order data, the reverse-order data, and the mean values. Calculated results show that the exponents of the scaling law of fluctuation versus window length are less than 0.5 for the sequences of both magnitude and inter-event time data. In addition, the phase portraits of two sequent magnitudes and two sequent inter-event times are also applied to explore whether large (or small) earthquakes are followed by large (or small) events. Results lead to a negative answer. Together with all types of information in this study, we conclude that the earthquake sequence under study is short-term correlated and thus a short-term memory effect would be operative.
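The window-scaling idea behind the fluctuation analysis can be sketched on a surrogate series (for uncorrelated data the exponent comes out near 0.5; exponents below 0.5, as reported in the abstract, indicate anti-persistent short-term correlation):

```python
import math
import random

# Surrogate uncorrelated series standing in for the magnitude sequence in
# natural time (a real catalog would replace this list).
random.seed(7)
series = [random.gauss(0.0, 1.0) for _ in range(4096)]

# Profile: cumulative sum of deviations from the mean.
mean = sum(series) / len(series)
walk, acc = [], 0.0
for x in series:
    acc += x - mean
    walk.append(acc)

def fluctuation(L):
    """Mean range of the profile over non-overlapping windows of length L."""
    ranges = []
    for s in range(0, len(walk) - L + 1, L):
        seg = walk[s:s + L]
        ranges.append(max(seg) - min(seg))
    return sum(ranges) / len(ranges)

# Scaling exponent: slope of log F(L) versus log L between two window sizes.
L1, L2 = 16, 256
H = math.log(fluctuation(L2) / fluctuation(L1)) / math.log(L2 / L1)
print(round(H, 2))  # near 0.5 for uncorrelated data
```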

  18. Implementation of short-term prediction

    Energy Technology Data Exchange (ETDEWEB)

    Landberg, L.; Joensen, A.; Giebel, G. [and others]

    1999-03-01

    This paper will give a general overview of the results from an EU JOULE funded project ('Implementing short-term prediction at utilities', JOR3-CT95-0008). References will be given to specialised papers where applicable. The goal of the project was to implement wind farm power output prediction systems in operational environments at a number of utilities in Europe. Two models were developed, one by Risoe and one by the Technical University of Denmark (DTU). Both prediction models used HIRLAM predictions from the Danish Meteorological Institute (DMI). (au) EFP-94; EU-JOULE. 11 refs.

  19. Sensitivity of tsunami wave profiles and inundation simulations to earthquake slip and fault geometry for the 2011 Tohoku earthquake

    KAUST Repository

    Goda, Katsuichiro; Mai, Paul Martin; Yasuda, Tomohiro; Mori, Nobuhito

    2014-01-01

    In this study, we develop stochastic random-field slip models for the 2011 Tohoku earthquake and conduct a rigorous sensitivity analysis of tsunami hazards with respect to the uncertainty of earthquake slip and fault geometry. Synthetic earthquake slip distributions generated from the modified Mai-Beroza method captured key features of inversion-based source representations of the mega-thrust event, which were calibrated against rich geophysical observations of this event. Using original and synthesised earthquake source models (varied for strike, dip, and slip distributions), tsunami simulations were carried out and the resulting variability in tsunami hazard estimates was investigated. The results highlight significant sensitivity of the tsunami wave profiles and inundation heights to the coastal location and the slip characteristics, and indicate that earthquake slip characteristics are a major source of uncertainty in predicting tsunami risks due to future mega-thrust events.
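A one-dimensional sketch of a stochastic slip distribution with spatial correlation is given below; the exponential kernel, correlation length, and target mean slip are hypothetical stand-ins, not the modified Mai-Beroza parameterization used in the study:

```python
import math
import random

# White noise smoothed with an exponential kernel gives a spatially
# correlated along-strike slip profile; values are then shifted positive
# and scaled to a chosen mean slip. All parameters are hypothetical.
random.seed(3)
nx, dx, corr_len = 200, 2.0, 40.0  # cells, cell size (km), correlation (km)
noise = [random.gauss(0.0, 1.0) for _ in range(nx)]

slip = []
for i in range(nx):
    w = [math.exp(-abs(i - j) * dx / corr_len) for j in range(nx)]
    slip.append(sum(wi * ni for wi, ni in zip(w, noise)) / sum(w))

# Enforce non-negative slip with a target mean of 5 m.
lo = min(slip)
slip = [s - lo for s in slip]
scale = 5.0 / (sum(slip) / nx)
slip = [s * scale for s in slip]

print(round(sum(slip) / nx, 2), round(min(slip), 2))
```

Re-running with different seeds (and varied strike/dip in the 2-D case) yields the ensemble of source models whose spread drives the tsunami-hazard variability discussed above.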

  20. Sensitivity of tsunami wave profiles and inundation simulations to earthquake slip and fault geometry for the 2011 Tohoku earthquake

    KAUST Repository

    Goda, Katsuichiro

    2014-09-01

    In this study, we develop stochastic random-field slip models for the 2011 Tohoku earthquake and conduct a rigorous sensitivity analysis of tsunami hazards with respect to the uncertainty of earthquake slip and fault geometry. Synthetic earthquake slip distributions generated from the modified Mai-Beroza method captured key features of inversion-based source representations of the mega-thrust event, which were calibrated against rich geophysical observations of this event. Using original and synthesised earthquake source models (varied for strike, dip, and slip distributions), tsunami simulations were carried out and the resulting variability in tsunami hazard estimates was investigated. The results highlight significant sensitivity of the tsunami wave profiles and inundation heights to the coastal location and the slip characteristics, and indicate that earthquake slip characteristics are a major source of uncertainty in predicting tsunami risks due to future mega-thrust events.

  1. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude-6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  2. An information infrastructure for earthquake science

    Science.gov (United States)

    Jordan, T. H.; Scec/Itr Collaboration

    2003-04-01

The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, IRIS, and the USGS, has received a large five-year grant from the NSF's ITR Program and its Geosciences Directorate to build a new information infrastructure for earthquake science. In many respects, the SCEC/ITR Project presents a microcosm of the IT efforts now being organized across the geoscience community, including the EarthScope initiative. The purpose of this presentation is to discuss the experience gained by the project thus far and lay out the challenges that lie ahead; our hope is to encourage cross-discipline collaboration in future IT advancements. Project goals have been formulated in terms of four "computational pathways" related to seismic hazard analysis (SHA). For example, Pathway 1 involves the construction of an open-source, object-oriented, and web-enabled framework for SHA computations that can incorporate a variety of earthquake forecast models, intensity-measure relationships, and site-response models, while Pathway 2 aims to utilize the predictive power of wavefield simulation in modeling time-dependent ground motion for scenario earthquakes and constructing intensity-measure relationships. The overall goal is to create a SCEC "community modeling environment" or collaboratory that will comprise the curated (on-line, documented, maintained) resources needed by researchers to develop and use these four computational pathways. Current activities include (1) the development and verification of the computational modules, (2) the standardization of data structures and interfaces needed for syntactic interoperability, (3) the development of knowledge representation and management tools, (4) the construction of SCEC computational and data grid testbeds, and (5) the creation of user interfaces for knowledge-acquisition, code execution, and visualization. 
I will emphasize the increasing role of standardized

  3. Earthquake precursors: spatial-temporal gravity changes before the great earthquakes in the Sichuan-Yunnan area

    Science.gov (United States)

    Zhu, Yi-Qing; Liang, Wei-Feng; Zhang, Song

    2018-01-01

Using multiple-scale mobile gravity data in the Sichuan-Yunnan area, we systematically analyzed the relationships between spatial-temporal gravity changes and the 2014 Ludian, Yunnan Province Ms6.5 earthquake and the 2014 Kangding Ms6.3, 2013 Lushan Ms7.0, and 2008 Wenchuan Ms8.0 earthquakes in Sichuan Province. Our main results are as follows. (1) Before the occurrence of large earthquakes, gravity anomalies occur in a large area around the epicenters. The directions of gravity change gradient belts usually agree roughly with the directions of the main fault zones of the study area. Such gravity changes might reflect the increase of crustal stress, as well as the significant active tectonic movements and surface deformations along fault zones, during the period of gestation of great earthquakes. (2) Continuous significant changes of the multiple-scale gravity fields, as well as greater gravity changes with larger time scales, can be regarded as medium-range precursors of large earthquakes. The subsequent large earthquakes always occur in the area where the gravity changes greatly. (3) The spatial-temporal gravity changes are very useful in determining the epicenter of coming large earthquakes. The large gravity networks are useful for determining the general areas of coming large earthquakes, whereas local gravity networks with high spatial-temporal resolution are suitable for determining the location of epicenters. Therefore, denser gravity observation networks are necessary for better forecasts of the epicenters of large earthquakes. (4) Using gravity changes from mobile observation data, we made medium-range forecasts of the Kangding, Ludian, Lushan, and Wenchuan earthquakes, with especially successful forecasts of the location of their epicenters. Based on the above discussions, we emphasize that medium- to long-term potential for large earthquakes may currently exist in some areas with significant gravity anomalies in the study region. Thus, the monitoring

  4. The potential of continuous, local atomic clock measurements for earthquake prediction and volcanology

    Directory of Open Access Journals (Sweden)

    Bondarescu Mihai

    2015-01-01

Modern optical atomic clocks, along with the optical fiber technology currently being developed, can measure the geoid, which is the equipotential surface that extends the mean sea level on continents, to a precision that competes with existing technology. In this proceeding, we point out that atomic clocks have the potential not only to map the sea level surface on continents, but also to track variations of the geoid as a function of time with unprecedented timing resolution. The local time series of the geoid has a plethora of applications. These include potential improvements in the prediction of earthquakes and volcanic eruptions, and closer monitoring of ground uplift in areas where hydraulic fracturing is performed.
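The link between clock comparisons and geoid height rests on the gravitational redshift: a fractional frequency shift df/f equals the geopotential difference over c², approximately g·Δh/c² near the surface. A minimal sketch (function name and parameters are our own, not from the paper):

```python
def fractional_frequency_shift(delta_h_m: float, g: float = 9.81,
                               c: float = 2.99792458e8) -> float:
    """Fractional frequency shift df/f between two clocks separated by a
    height difference delta_h_m (metres), using df/f ~ g * dh / c**2."""
    return g * delta_h_m / c**2

# A 1 cm geoid-height change corresponds to df/f of roughly 1e-18, which is
# why clocks at the 1e-18 stability level can resolve centimetre-scale
# variations of the geoid over time.
```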

  5. Data base pertinent to earthquake design basis

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1988-01-01

Mitigation of earthquake risk from impending strong earthquakes is possible provided the hazard can be assessed, and translated into appropriate design inputs. This requires defining the seismic risk problem, isolating the risk factors and quantifying risk in terms of physical parameters, which are suitable for application in design. Like all other geological phenomena, past earthquakes hold the key to the understanding of future ones. Quantification of seismic risk at a site calls for investigating the earthquake aspects of the site region and building a data base. The scope of such investigations is illustrated in Figures 1 and 2. A more detailed definition of the earthquake problem in engineering design is given elsewhere (Sharma, 1987). The present document discusses the earthquake data base, which is required to support a seismic risk evaluation programme in the context of the existing state of the art. (author). 8 tables, 10 figs., 54 refs

  6. Critical behavior in earthquake energy dissipation

    Science.gov (United States)

    Wanliss, James; Muñoz, Víctor; Pastén, Denisse; Toledo, Benjamín; Valdivia, Juan Alejandro

    2017-09-01

    We explore bursty multiscale energy dissipation from earthquakes flanked by latitudes 29° S and 35.5° S, and longitudes 69.501° W and 73.944° W (in the Chilean central zone). Our work compares the predictions of a theory of nonequilibrium phase transitions with nonstandard statistical signatures of earthquake complex scaling behaviors. For temporal scales less than 84 hours, time development of earthquake radiated energy activity follows an algebraic arrangement consistent with estimates from the theory of nonequilibrium phase transitions. There are no characteristic scales for probability distributions of sizes and lifetimes of the activity bursts in the scaling region. The power-law exponents describing the probability distributions suggest that the main energy dissipation takes place due to largest bursts of activity, such as major earthquakes, as opposed to smaller activations which contribute less significantly though they have greater relative occurrence. The results obtained provide statistical evidence that earthquake energy dissipation mechanisms are essentially "scale-free", displaying statistical and dynamical self-similarity. Our results provide some evidence that earthquake radiated energy and directed percolation belong to a similar universality class.

  7. Investigation of the relationship between ionospheric foF2 and earthquakes

    Science.gov (United States)

    Karaboga, Tuba; Canyilmaz, Murat; Ozcan, Osman

    2018-04-01

Variations of the ionospheric F2 region critical frequency (foF2) before earthquakes in the Japan area have been investigated statistically for the 1980-2008 period. Ionosonde data were taken from the Kokubunji station, which lies within the earthquake preparation zone of all the earthquakes considered. Standard deviation and inter-quartile range methods were applied to the foF2 data. It is observed that there are anomalous variations in foF2 before earthquakes. These variations can be regarded as ionospheric precursors and may be used for earthquake prediction.

  8. Depressive symptoms and associated psychosocial factors among adolescent survivors 30 months after 2008 Wenchuan earthquake: A follow-up study

    Directory of Open Access Journals (Sweden)

    Xuliang eShi

    2016-03-01

Purpose: This study longitudinally investigated changes in depressive symptoms among adolescent survivors over the two and a half years after the 2008 Wenchuan earthquake in China, as well as the predictive effects of demographic characteristics, earthquake exposure, negative life events, social support, and dispositional resilience on the risk of depressive symptoms at two time points after the earthquake. Methods: Participants were 1573 adolescent survivors (720 males and 853 females; mean age at initial survey = 15 ± 1.26), whose depressive symptoms were assessed at 6 months (T6m) and 30 months (T30m) post-earthquake. Data on demographics, earthquake exposure, and dispositional resilience were collected at T6m. Negative life events and social support were measured at T6m and 24 months (T24m) post-earthquake. Results: The prevalence rates of probable depression, 27.5% at T6m and 27.2% at T30m, remained relatively stable over time. Female gender was associated with a higher risk of depressive symptoms at both T6m and T30m, while being an only child predicted a higher risk of depressive symptoms only at T30m. Negative life events and social support at T6m, as well as earthquake exposure, were concurrently associated with an increased risk of depressive symptoms at T6m but not at T30m, while negative life events and social support at T24m predicted depressive symptoms at T30m; together these findings suggest that these variables may have a strong but short-term effect on adolescents’ depressive symptoms post-earthquake. In addition, dispositional resilience proved a relatively stable negative predictor of depressive symptoms. Conclusions: These findings could inform mental health professionals regarding how to screen adolescent survivors at high risk for depression, so as to provide them with timely and appropriate mental health services based on the identified risk and protective factors for depressive

  9. Mental health in L'Aquila after the earthquake

    Directory of Open Access Journals (Sweden)

    Paolo Stratta

    2012-06-01

INTRODUCTION: In the present work we describe the mental health condition of the L'Aquila population in the aftermath of the earthquake in terms of structural, process, and outcome perspectives. METHOD: A literature review of the published reports on the L'Aquila earthquake was performed. RESULTS: Although important psychological distress has been reported by the population, a capacity for resilience can be observed. However, while resilient mechanisms intervened in the immediate aftermath of the earthquake, important dangers are conceivable in the current medium- to long-term perspective, owing to the long-lasting alterations of day-to-day life and the disruption of social networks, both of which are strongly associated with mental health problems. CONCLUSIONS: In a condition such as an earthquake, the immediate physical, medical, and emergency rescue needs must be addressed first. However, training first responders to identify psychological distress symptoms would be important for mental health triage in the field.

  10. Continuous borehole strain and pore pressure in the near field of the 28 September 2004 M 6.0 parkfield, California, earthquake: Implications for nucleation, fault response, earthquake prediction and tremor

    Science.gov (United States)

    Johnston, M.J.S.; Borcherdt, R.D.; Linde, A.T.; Gladwin, M.T.

    2006-01-01

Near-field observations of high-precision borehole strain and pore pressure show no indication of coherent accelerating strain or pore pressure during the weeks to seconds before the 28 September 2004 M 6.0 Parkfield earthquake. Minor changes in strain rate did occur at a few sites during the last 24 hr before the earthquake, but these changes are neither significant nor have the form expected for strain during slip coalescence initiating fault failure. Seconds before the event, strain is stable at the 10^-11 level. Final prerupture nucleation slip in the hypocentral region is constrained to have a moment less than 2 × 10^12 N m (M 2.2) and a source size less than 30 m. Ground displacement data indicate similar constraints. Localized rupture nucleation and runaway precludes useful prediction of damaging earthquakes. Coseismic dynamic strains of about 10 microstrain peak-to-peak were superimposed on volumetric strain offsets of about 0.5 microstrain to the northwest of the epicenter and about 0.2 microstrain to the southeast of the epicenter, consistent with right lateral slip. Observed strain and Global Positioning System (GPS) offsets can be simply fit with 20 cm of slip between 4 and 10 km on a 20-km segment of the fault north of Gold Hill (M0 = 7 × 10^17 N m). Variable slip inversion models using GPS data and seismic data indicate similar moments. Observed postseismic strain is 60% to 300% of the coseismic strain, indicating incomplete release of accumulated strain. No measurable change in fault zone compliance preceding or following the earthquake is indicated by stable earth tidal response. No indications of strain change accompany nonvolcanic tremor events reported prior to and following the earthquake.

  11. Filling a gap: Public talks about earthquake preparation and the 'Big One'

    Science.gov (United States)

    Reinen, L. A.

    2013-12-01

Residents of southern California are aware they live in a seismically active area, and earthquake drills have trained us to Duck-Cover-Hold On. While many of my acquaintances are familiar with what to do during an earthquake, few have made preparations for living with the aftermath of a large earthquake. The ShakeOut Scenario (Jones et al., USGS Open File Report 2008-1150) describes the physical, social, and economic consequences of a plausible M7.8 earthquake on the southernmost San Andreas Fault. While not detailing an actual event, the ShakeOut Scenario illustrates how individual and community preparation may mitigate the potential aftereffects of a major earthquake in the region. To address the gap between earthquake drills and preparation in my community, for the past several years I have been giving public talks to promote understanding of: the science behind the earthquake predictions; why individual, as well as community, preparation is important; and ways in which individuals can prepare their home and work environments. The public presentations occur in an array of venues, including elementary school and college classes, a community forum linked with the annual ShakeOut Drill, and local businesses including the local microbrewery. While based on the same fundamental information, each presentation is modified for audience and setting. Assessment of the impact of these talks is primarily anecdotal and includes an increase in the number of venues requesting these talks, repeat invitations, and comments from audience members (sometimes months or years after a talk). I will present elements of these talks, the background information used, and examples of how they have effected change in the earthquake preparedness of audience members. Discussion and suggestions (particularly about effective means of conducting rigorous long-term assessment) are strongly encouraged.

  12. An interdisciplinary approach for earthquake modelling and forecasting

    Science.gov (United States)

    Han, P.; Zhuang, J.; Hattori, K.; Ogata, Y.

    2016-12-01

Earthquakes are among the most serious disasters and may cause heavy casualties and economic losses. Especially in the past two decades, huge/mega earthquakes have hit many countries. Effective earthquake forecasting (including time, location, and magnitude) becomes extremely important and urgent. To date, various heuristically derived algorithms have been developed for forecasting earthquakes. Generally, they can be classified into two types: catalog-based approaches and non-catalog-based approaches. Thanks to the rapid development of statistical seismology in the past 30 years, we are now able to evaluate the performances of these earthquake forecast approaches quantitatively. Although a certain amount of precursory information is available in both earthquake catalogs and non-catalog observations, earthquake forecasting is still far from satisfactory. In most cases, the precursory phenomena were studied individually. An earthquake model that combines self-exciting and mutually exciting elements was developed by Ogata and Utsu from the Hawkes process. The core idea of this combined model is that the status of the event at present is controlled by the event itself (self-exciting) and all the external factors (mutually exciting) in the past. In essence, the conditional intensity function is a time-varying Poisson process with rate λ(t), which is composed of the background rate, the self-exciting term (the information from past seismic events), and the external excitation term (the information from past non-seismic observations). This model shows us a way to integrate the catalog-based forecast and non-catalog-based forecast. Against this background, we are trying to develop a new earthquake forecast model that combines catalog-based and non-catalog-based approaches.
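The three-term conditional intensity described above can be sketched numerically. Exponential decay kernels and all parameter values here are illustrative assumptions, not the model actually fitted by Ogata and Utsu:

```python
import numpy as np

def conditional_intensity(t, quake_times, obs_times,
                          mu=0.1, alpha=0.5, beta=1.0, gamma=0.3, delta=0.5):
    """lambda(t) = background rate (mu)
                 + self-exciting term over past earthquakes
                 + external excitation term over past non-seismic observations.
    Exponential kernels exp(-beta*dt), exp(-delta*dt) are illustrative."""
    past_quakes = quake_times[quake_times < t]
    past_obs = obs_times[obs_times < t]
    self_term = alpha * np.sum(np.exp(-beta * (t - past_quakes)))
    external_term = gamma * np.sum(np.exp(-delta * (t - past_obs)))
    return mu + self_term + external_term
```

With no past events the intensity reduces to the background rate mu; each past earthquake or anomalous observation raises it transiently.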

  13. Scenario for a Short-Term Probabilistic Seismic Hazard Assessment (PSHA) in Chiayi, Taiwan

    Directory of Open Access Journals (Sweden)

    Chung-Han Chan

    2013-01-01

Using seismic activity and the Meishan earthquake sequence that occurred from 1904 to 1906, a scenario for short-term probabilistic seismic hazards in the Chiayi region of Taiwan is assessed. The long-term earthquake occurrence rate in Taiwan was evaluated using a smoothing kernel. The highest seismicity rate was calculated around the Chiayi region. To consider earthquake interactions, the rate-and-state friction model was introduced to estimate the evolution of the seismicity rate due to the Coulomb stress change. As imparted by the 1904 Touliu earthquake, the stress changes near the 1906 Meishan and Yangshuigang epicenters were higher than the magnitude of tidal triggering. With regard to the impact of the Meishan earthquake, the region close to the Yangshuigang earthquake epicenter experienced a stress increase of +0.75 bar. The results indicated significant interaction between the three damaging events. Considering the path and site effects using ground motion prediction equations, a probabilistic seismic hazard was assessed in the form of a hazard evolution and a hazard map. A significant elevation in hazard following the three earthquakes in the sequence was determined. The results illustrate a possible scenario for seismic hazards in the Chiayi region which may take place repeatedly in the future. Such a scenario provides essential information on earthquake preparation, devastation estimates, emergency sheltering, utility restoration, and structure reconstruction.
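The rate-and-state evolution of seismicity after a Coulomb stress step is commonly written in Dieterich's (1994) form: the rate jumps instantaneously by exp(ΔCFS/Aσ) and relaxes back to the background rate over the aftershock duration. A minimal sketch with illustrative parameter values (Aσ and t_a are not taken from the study):

```python
import math

def seismicity_rate(t, dcfs_bar, r=1.0, a_sigma_bar=0.1, t_a=10.0):
    """Dieterich (1994) seismicity rate at time t after a stress step dcfs_bar:
        R(t) = r / ((exp(-dCFS/(A*sigma)) - 1) * exp(-t/t_a) + 1)
    At t = 0 the rate is r * exp(dCFS/(A*sigma)); as t >> t_a it returns to
    the background rate r. Units: stresses in bar, t and t_a in the same
    arbitrary time unit. Parameter values are illustrative."""
    transient = (math.exp(-dcfs_bar / a_sigma_bar) - 1.0) * math.exp(-t / t_a)
    return r / (transient + 1.0)
```

For the +0.75 bar step cited above, the instantaneous rate ratio under these assumed parameters would be exp(0.75/0.1); the point of the sketch is the functional form, not the numbers.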

  14. Characteristics of the forerunner field of underground water regime during the Tangshan earthquake and its focal evolution process

    Energy Technology Data Exchange (ETDEWEB)

    Wu, J.; Wang, Y.; Li, S.

    1980-03-01

On the basis of data obtained from long-term observations of the groundwater regime, and taking the Tangshan earthquake as an example, the authors first propose principles and approaches for recognizing the precursory anomalies of earthquakes, classifying the subsurface water regime into underground water, interlayer water, and confined water. For the Tangshan earthquake, the forerunner field of underground water is characterized by its spatial distribution (limited to a certain quadrant), its stages of development in time, and the synchronism of anomaly changes during and after a shock. In addition, this study stresses the importance of the principles mentioned above in predicting the space, time, and magnitude of future earthquakes, and discusses the relationship between the focal stress field and the regional stress field and the indicators that distinguish them. It is suggested that the former develops progressively and that its principal axis of compressive stress changes direction just before an earthquake, thus enabling us to divide the focal process into two basically different stages: a brewing stage and an originating stage.

  15. Earthquake risk assessment of building structures

    International Nuclear Information System (INIS)

    Ellingwood, Bruce R.

    2001-01-01

    During the past two decades, probabilistic risk analysis tools have been applied to assess the performance of new and existing building structural systems. Structural design and evaluation of buildings and other facilities with regard to their ability to withstand the effects of earthquakes requires special considerations that are not normally a part of such evaluations for other occupancy, service and environmental loads. This paper reviews some of these special considerations, specifically as they pertain to probability-based codified design and reliability-based condition assessment of existing buildings. Difficulties experienced in implementing probability-based limit states design criteria for earthquake are summarized. Comparisons of predicted and observed building damage highlight the limitations of using current deterministic approaches for post-earthquake building condition assessment. The importance of inherent randomness and modeling uncertainty in forecasting building performance is examined through a building fragility assessment of a steel frame with welded connections that was damaged during the Northridge Earthquake of 1994. The prospects for future improvements in earthquake-resistant design procedures based on a more rational probability-based treatment of uncertainty are examined

  16. Dynamic strains for earthquake source characterization

    Science.gov (United States)

    Barbour, Andrew J.; Crowell, Brendan W

    2017-01-01

    Strainmeters measure elastodynamic deformation associated with earthquakes over a broad frequency band, with detection characteristics that complement traditional instrumentation, but they are commonly used to study slow transient deformation along active faults and at subduction zones, for example. Here, we analyze dynamic strains at Plate Boundary Observatory (PBO) borehole strainmeters (BSM) associated with 146 local and regional earthquakes from 2004–2014, with magnitudes from M 4.5 to 7.2. We find that peak values in seismic strain can be predicted from a general regression against distance and magnitude, with improvements in accuracy gained by accounting for biases associated with site–station effects and source–path effects, the latter exhibiting the strongest influence on the regression coefficients. To account for the influence of these biases in a general way, we include crustal‐type classifications from the CRUST1.0 global velocity model, which demonstrates that high‐frequency strain data from the PBO BSM network carry information on crustal structure and fault mechanics: earthquakes nucleating offshore on the Blanco fracture zone, for example, generate consistently lower dynamic strains than earthquakes around the Sierra Nevada microplate and in the Salton trough. Finally, we test our dynamic strain prediction equations on the 2011 M 9 Tohoku‐Oki earthquake, specifically continuous strain records derived from triangulation of 137 high‐rate Global Navigation Satellite System Earth Observation Network stations in Japan. Moment magnitudes inferred from these data and the strain model are in agreement when Global Positioning System subnetworks are unaffected by spatial aliasing.
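The general regression of peak dynamic strain against distance and magnitude has the familiar log-linear form of a ground-motion prediction equation. The coefficients below are made up for illustration (the paper's fitted values, and its site-station and source-path bias terms, are not reproduced here):

```python
import math

def peak_dynamic_strain(magnitude, distance_km, c0=-9.0, c1=1.0, c2=-1.5):
    """Predict peak dynamic strain via
        log10(strain) = c0 + c1 * M + c2 * log10(R),
    i.e. strain grows with magnitude and decays as a power law of distance.
    Coefficients are illustrative placeholders, not fitted values."""
    return 10.0 ** (c0 + c1 * magnitude + c2 * math.log10(distance_km))
```

In the study, site-station and source-path corrections are added to this baseline; the source-path effects (e.g. crustal type from CRUST1.0) exert the strongest influence on the regression coefficients.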

  17. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance, and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the 1904 earthquake 100 km south of Oslo is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as of interest. This probability gives ground accelerations in the range of 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get an epicentral acceleration for this earthquake of 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with the existing distant instrumental data. (author)

  18. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes.

    Science.gov (United States)

    Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor

    2016-01-01

A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and to the built environment in order to improve communities' preparedness and response capabilities and to mitigate future consequences. An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model's algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher rates of at-risk populations were found to be more vulnerable in this regard. The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience prior to an occurrence of an earthquake could lead to a possible decrease in the expected number of casualties.
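Folding epidemiological effect measures into an existing loss model's casualty probabilities can be sketched on the odds scale. This is a simplification of the logistic-regression integration described above, assuming independent multiplicative effects; the function and its inputs are our own illustration, not the HAZUS algorithm:

```python
def adjust_casualty_probability(p_baseline, odds_ratios):
    """Adjust a baseline casualty probability (e.g. from a loss model such as
    HAZUS) by the odds ratios of human-related risk factors (age, gender,
    physical disability, socioeconomic status). Converts to odds, multiplies
    in each factor's odds ratio, and converts back to a probability."""
    odds = p_baseline / (1.0 - p_baseline)
    for or_factor in odds_ratios:
        odds *= or_factor
    return odds / (1.0 + odds)
```

An odds ratio of 1.0 leaves the baseline unchanged; factors with OR > 1 raise the estimated casualty probability, which is how a higher at-risk population share raises an area's modelled vulnerability.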

  19. The NHV rehabilitation services program improves long-term physical functioning in survivors of the 2008 Sichuan earthquake: a longitudinal quasi experiment.

    Directory of Open Access Journals (Sweden)

    Xia Zhang

BACKGROUND: Long-term disability following natural disasters significantly burdens survivors and the impacted society. Nevertheless, medical rehabilitation programming has been historically neglected in disaster relief planning. 'NHV' is a rehabilitation services program comprised of non-governmental organizations (NGOs) (N), local health departments (H), and professional rehabilitation volunteers (V) which aims to improve long-term physical functioning in survivors of the 2008 Sichuan earthquake. We aimed to evaluate the effectiveness of the NHV program. METHODS/FINDINGS: 510 of 591 enrolled earthquake survivors participated in this longitudinal quasi-experimental study (86.3%). The early intervention group (NHV-E) consisted of 298 survivors who received institutional-based rehabilitation (IBR) followed by community-based rehabilitation (CBR); the late intervention group (NHV-L) was comprised of 101 survivors who began rehabilitation one year later. The control group of 111 earthquake survivors did not receive IBR/CBR. Physical functioning was assessed using the Barthel Index (BI). Data were analyzed with a mixed-effects Tobit regression model. Physical functioning was significantly increased in the NHV-E and NHV-L groups at follow-up but not in the control group after adjustment for gender, age, type of injury, and time to measurement. We found significant effects of both NHV (11.14, 95% CI 9.0-13.3) and spontaneous recovery (5.03, 95% CI 1.73-8.34). The effect of NHV-E (11.3, 95% CI 9.0-13.7) was marginally greater than that of NHV-L (10.7, 95% CI 7.9-13.6). It could, however, not be determined whether specific IBR or CBR program components were effective, since individual component exposures were not evaluated. CONCLUSION: Our analysis shows that the NHV improved the long-term physical functioning of Sichuan earthquake survivors with disabling injuries. 
The comprehensive rehabilitation program benefitted the individual and society, rehabilitation services

  20. The NHV rehabilitation services program improves long-term physical functioning in survivors of the 2008 Sichuan earthquake: a longitudinal quasi experiment.

    Science.gov (United States)

    Zhang, Xia; Reinhardt, Jan D; Gosney, James E; Li, Jianan

    2013-01-01

Long-term disability following natural disasters significantly burdens survivors and the impacted society. Nevertheless, medical rehabilitation programming has been historically neglected in disaster relief planning. 'NHV' is a rehabilitation services program comprised of non-governmental organizations (NGOs) (N), local health departments (H), and professional rehabilitation volunteers (V) which aims to improve long-term physical functioning in survivors of the 2008 Sichuan earthquake. We aimed to evaluate the effectiveness of the NHV program. 510 of 591 enrolled earthquake survivors participated in this longitudinal quasi-experimental study (86.3%). The early intervention group (NHV-E) consisted of 298 survivors who received institutional-based rehabilitation (IBR) followed by community-based rehabilitation (CBR); the late intervention group (NHV-L) comprised 101 survivors who began rehabilitation one year later. The control group of 111 earthquake survivors did not receive IBR/CBR. Physical functioning was assessed using the Barthel Index (BI). Data were analyzed with a mixed-effects Tobit regression model. Physical functioning was significantly increased in the NHV-E and NHV-L groups at follow-up but not in the control group after adjustment for gender, age, type of injury, and time to measurement. We found significant effects of both NHV (11.14, 95% CI 9.0-13.3) and spontaneous recovery (5.03; 95% CI 1.73-8.34). The effect of NHV-E (11.3, 95% CI 9.0-13.7) was marginally greater than that of NHV-L (10.7, 95% CI 7.9-13.6). It could not, however, be determined whether specific IBR or CBR program components were effective, since individual component exposures were not evaluated. Our analysis shows that the NHV improved the long-term physical functioning of Sichuan earthquake survivors with disabling injuries. The comprehensive rehabilitation program benefitted the individual and society, rehabilitation services in China, and international rehabilitation
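The Barthel Index is bounded above at 100, so improvements in high-functioning survivors are censored at the ceiling; a Tobit regression handles this by modeling a latent functioning score. The study fit a mixed-effects Tobit model; the sketch below is a simplified version that omits the random effects and fits a plain right-censored regression by maximum likelihood on simulated data. All variable names and numbers here are illustrative assumptions, not the study's data.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulated data (hypothetical): a latent functioning score, right-censored
# at 100 as with the Barthel Index ceiling.
n = 500
x = rng.normal(size=n)                          # a single covariate for illustration
latent = 80.0 + 11.0 * x + rng.normal(scale=10.0, size=n)
y = np.minimum(latent, 100.0)                   # right-censoring at BI = 100

def tobit_negloglik(params, x, y, upper=100.0):
    """Negative log-likelihood of a right-censored (Tobit) regression."""
    b0, b1, log_sigma = params
    sigma = np.exp(log_sigma)
    mu = b0 + b1 * x
    censored = y >= upper
    ll = np.where(
        censored,
        norm.logsf((upper - mu) / sigma),               # P(latent >= ceiling)
        norm.logpdf((y - mu) / sigma) - np.log(sigma),  # density of observed y
    )
    return -ll.sum()

res = minimize(tobit_negloglik, x0=[50.0, 0.0, np.log(5.0)], args=(x, y))
b0_hat, b1_hat, _ = res.x                       # estimates recover b0 ~ 80, b1 ~ 11
```

Parameterizing the scale as `log_sigma` keeps the optimization unconstrained; a mixed-effects version would add survivor-level random intercepts to `mu`.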

  1. Marmara Island earthquakes of 1265 and 1935, Turkey

    Directory of Open Access Journals (Sweden)

    Y. Altınok

    2006-01-01

Full Text Available The long-term seismicity of the Marmara Sea region in northwestern Turkey is relatively well recorded. Some large events and some of the smaller ones are clearly associated with fault zones known to be seismically active, which have distinct morphological expressions and have generated damaging earthquakes both before and since. Less common, moderate-size earthquakes have occurred in the vicinity of the Marmara Islands in the western Marmara Sea. This paper presents an extended summary of the most important of these, the events of 1265 and 1935, since known as the Marmara Island earthquakes. The data and approaches used have the potential to document earthquake ruptures of fault segments and to extend the earthquake record far back beyond written history, including the rock falls and abnormal sea waves observed during these events, thus improving hazard evaluations and the fundamental understanding of the earthquake process.

  2. Integrated study of geophysical and biological anomalies before earthquakes (seismic and non-seismic), in Austria and Indonesia

    Science.gov (United States)

    Straka, Wolfgang; Assef, Rizkita; Faber, Robert; Ferasyi, Reza

    2015-04-01

Earthquakes are commonly seen as unpredictable. Even when scientists believe an earthquake is likely, it is still hard to interpret the indications observed, as well as their theoretical and practical implications. The concept of using animals as earthquake precursors remains controversial. Nonetheless, several institutes at the University of Natural Resources and Life Sciences and the Vienna University of Technology, both Vienna, Austria, and Syiah Kuala University, Banda Aceh, as well as Terramath Indonesia, Buleleng, both Indonesia, cooperate in a long-term project, funded by Red Bull Media House, Salzburg, Austria, which aims to take a decisive step from anecdotal toward scientific evidence of these interdependencies and to show their possible use in forecasting seismic hazard on a short-term basis. Though no conclusive research has yet been published, one idea in this study is that even if animals do not respond to specific geophysical precursors reliably enough, and with enough notice, to enable earthquake forecasting on that basis alone, they may at least enhance, in conjunction with other indications, the degree of certainty of a prediction of an impending earthquake. In Indonesia, before the great earthquakes of 2004 and 2005, ominous geophysical as well as biological phenomena did occur, but they were recognized as precursors only in retrospect. Numerous comparable accounts exist from other times and regions. Nearly 2000 perceptible earthquakes (> M3.5) occur each year in Indonesia, and in 2007 the government launched a program, focused on West Sumatra, for investigating earthquake precursors. Indonesia is therefore an excellent target area for a study of possible interconnections between geophysical and biological earthquake precursors.
Geophysical and atmospheric measurements and behavioral observation of several animal species (elephant, domestic cattle, water buffalo, chicken, rat, catfish) are conducted in three areas

  3. A novel tree-based algorithm to discover seismic patterns in earthquake catalogs

    Science.gov (United States)

    Florido, E.; Asencio-Cortés, G.; Aznarte, J. L.; Rubio-Escudero, C.; Martínez-Álvarez, F.

    2018-06-01

A novel methodology is introduced in this research study to detect seismic precursors. Based on an existing approach, the new methodology searches for patterns in the historical data. Such patterns may contain statistical or soil dynamics information. It improves the original version in several aspects. First, new seismicity indicators have been used to characterize earthquakes. Second, a machine learning clustering algorithm has been applied in a very flexible way, thus allowing the discovery of new data groupings. Third, a novel search strategy is proposed in order to obtain non-overlapping patterns. And, fourth, patterns of arbitrary length are searched for, thus discovering long- and short-term behaviors that may influence the occurrence of medium-large earthquakes. The methodology has been applied to seven different datasets from three different regions, namely the Iberian Peninsula, Chile and Japan. Reported results show a remarkable improvement with respect to the former version in terms of all evaluated quality measures. In particular, the number of false positives has decreased and the positive predictive values have increased, both quite markedly.
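The abstract does not specify which seismicity indicators or clustering algorithm the authors used, so as a hedged illustration the sketch below computes one classic indicator, the Gutenberg-Richter b-value via Aki's maximum-likelihood estimator, over windows of a synthetic catalog, then groups the windows with a minimal k-means. The catalog, window size, and k are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic catalog (hypothetical): magnitudes follow a Gutenberg-Richter law,
# i.e. exponential above the completeness magnitude mc, with b = 1.0.
mc, b_true = 3.0, 1.0
beta = b_true * np.log(10.0)
mags = mc + rng.exponential(1.0 / beta, size=5000)

def b_value(window, mc):
    """Aki (1965) maximum-likelihood b-value estimator for one window."""
    return np.log10(np.e) / (window.mean() - mc)

# Indicator series: b-value over non-overlapping 250-event windows.
windows = mags.reshape(-1, 250)
b_series = np.array([b_value(w, mc) for w in windows])

def kmeans_1d(x, k=2, iters=50):
    """Minimal k-means on a 1-D indicator series (a stand-in for the paper's
    clustering step)."""
    centers = np.quantile(x, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        centers = np.array([x[labels == j].mean() for j in range(k)])
    return labels, centers

labels, centers = kmeans_1d(b_series)   # window groupings, e.g. low- vs high-b regimes
```

In the paper's setting the indicator vector per window would be richer (several indicators at once), but the clustering step operates the same way on the higher-dimensional series.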

  4. Radon, gas geochemistry, groundwater, and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    King, Chi-Yu [Power Reactor and Nuclear Fuel Development Corp., Tono Geoscience Center, Toki, Gifu (Japan)

    1998-12-31

Radon monitoring in groundwater, soil air, and the atmosphere has continued in many seismic areas of the world for earthquake-prediction and active-fault studies. Some recent measurements of radon and other geochemical and hydrological parameters have been made for sufficiently long periods, with reliable instruments, and together with measurements of meteorological variables and solid-earth tides. The resultant data are useful in better distinguishing earthquake-related changes from various background noises. Some measurements have been carried out in areas where other geophysical measurements are also being made. Comparative studies of the various kinds of geophysical data are helpful in ascertaining the reality of earthquake-related and fault-related anomalies and in understanding the underlying mechanisms. Spatial anomalies of radon and other terrestrial gases have been observed over many active faults. Such observations indicate that gas concentrations are very much site dependent, particularly in fault zones where terrestrial fluids may move vertically. Temporal anomalies have been reliably observed before and after some recent earthquakes, including the 1995 Kobe earthquake, and the general pattern of anomaly occurrence remains as observed before: anomalies are recorded at only relatively few sensitive sites, which can lie at much larger distances from the source than expected from existing earthquake-source models. The sensitivity of a sensitive site is also found to change with time. These results clearly show the inadequacy of the existing dilatancy-fluid diffusion and elastic-dislocation models for earthquake sources to explain earthquake-related geochemical and geophysical changes recorded at large distances. (J.P.N.)

  5. A Case Study of the Bam Earthquake to Establish a Pattern for Earthquake Management in Iran

    Directory of Open Access Journals (Sweden)

    Keramatollah Ziari

    2015-03-01

Full Text Available The field of crisis management knowledge and expertise is associated with a wide range of disciplines. Knowledge-based crisis management is a combination of science, art and practice. Iran is an earthquake-prone country; over the years several earthquakes have struck it, causing heavy human and financial losses. According to scientific standards, the first 24 hours following an earthquake are the most valuable time for saving victims, yet in the case of Bam only 5% of the victims were rescued within the first 48 hours. The success of disaster management is evaluated in terms of programming, raising public participation, organizing and deploying manpower, and supervising the management process. In this study disaster management is divided into three stages, each requiring different actions; the stages and actions are explained in detail. Moreover, the features, effects, and losses of the earthquake are described.

  6. What Can We Learn from a Simple Physics-Based Earthquake Simulator?

    Science.gov (United States)

    Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele

    2018-03-01

Physics-based earthquake simulators are becoming a popular tool for investigating the earthquake occurrence process. So far, the development of earthquake simulators has commonly been led by the approach "the more physics, the better". However, this approach may hamper the comprehension of the simulator's outcomes; in fact, within complex models, it may be difficult to understand which physical parameters are the most relevant to the features of the seismic catalog in which we are interested. For this reason, here we take the opposite approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that a simple simulator may be more informative than a complex one for some specific scientific objectives, because it is more understandable. Our earthquake simulator has three main components: the first is a realistic tectonic setting, i.e., a fault data set of California; the second is the application of quantitative laws for earthquake generation on each single fault; and the last is fault interaction modeling through the Coulomb failure function. The analysis of this simple simulator shows that: (1) short-term clustering can be reproduced by a set of faults with an almost periodic behavior, which interact according to a Coulomb failure function model; (2) a long-term behavior showing supercycles of seismic activity exists only in a markedly deterministic framework, and quickly disappears upon introducing a small degree of stochasticity in the recurrence of earthquakes on a fault; (3) faults that are strongly coupled in terms of the Coulomb failure function model are synchronized in time only in a markedly deterministic framework and, as before, such synchronization disappears upon introducing a small degree of stochasticity in the recurrence of earthquakes on a fault. Overall, the results show that even in a simple and perfectly known earthquake occurrence world, introducing a small degree of

  7. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    Science.gov (United States)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

The usable and realistic ground motion maps for urban areas are generated either from the assumption of a "reference earthquake" or directly, showing values of macroseismic intensity generated by a damaging real earthquake. In this study, applying a deterministic approach, an earthquake scenario in macroseismic intensity (a "model" earthquake scenario) is generated for the city of Sofia. The deterministic "model" intensity scenario based on the assumption of a "reference earthquake" is compared with a scenario based on the observed macroseismic effects caused by the damaging 2012 earthquake (MW5.6). The difference between observed (Io) and predicted (Ip) intensity values is analyzed.

  8. Earthquake response spectra for seismic design of nuclear power plants in the UK

    International Nuclear Information System (INIS)

    Bommer, Julian J.; Papaspiliou, Myrto; Price, Warren

    2011-01-01

Highlights: → Seismic design of UK nuclear power plants is usually based on PML response spectra. → We review the derivation of the PML spectra in terms of the earthquake data used and the procedure followed. → The data include errors and represent a small fraction of what is now available. → Seismic design loads in current practice are derived as mean uniform hazard spectra. → The need to capture epistemic uncertainty makes use of a single equation indefensible. - Abstract: Earthquake actions for the seismic design of nuclear power plants in the United Kingdom are generally based on spectral shapes anchored to peak ground acceleration (PGA) values obtained from a single predictive equation. Both the spectra and the PGA prediction equation were derived in the 1980s. The technical bases for these formulations of seismic loading are now very dated compared with the state of the art in this field. Alternative spectral shapes are explored, and the options, and associated benefits and challenges, for generating uniform hazard response spectra instead of fixed shapes anchored to PGA are discussed.

  9. Children's emotional experience two years after an earthquake: An exploration of knowledge of earthquakes and associated emotions.

    Science.gov (United States)

    Raccanello, Daniela; Burro, Roberto; Hall, Rob

    2017-01-01

We explored whether and how the exposure to a natural disaster such as the 2012 Emilia Romagna earthquake affected the development of children's emotional competence in terms of understanding, regulating, and expressing emotions, after two years, when compared with a control group not exposed to the earthquake. We also examined the role of class level and gender. The sample included two groups of children (n = 127) attending primary school: the experimental group (n = 65) experienced the 2012 Emilia Romagna earthquake, while the control group (n = 62) did not. The data collection took place two years after the earthquake, when the children were seven or ten years old. Beyond assessing the children's understanding of emotions and regulation abilities with standardized instruments, we employed semi-structured interviews to explore their knowledge of earthquakes and associated emotions, and a structured task on the intensity of some target emotions. We applied Generalized Linear Mixed Models. Exposure to the earthquake did not influence the understanding and regulation of emotions. The understanding of emotions varied according to class level and gender. Knowledge of earthquakes, emotional language, and emotions associated with earthquakes were, respectively, more complex, frequent, and intense for children who had experienced the earthquake, and at increasing ages. Our data extend the generalizability of theoretical models on children's psychological functioning following disasters, such as the dose-response model and the organizational-developmental model for child resilience, and provide further knowledge on children's emotional resources related to natural disasters, as a basis for planning educational prevention programs.

  10. Saving and Re-building Lives: Determinants of Short-term and Long-term Disaster Relief

    Directory of Open Access Journals (Sweden)

    Geethanjali SELVARETNAM

    2014-11-01

Full Text Available We analyse, both theoretically and empirically, the factors that influence the amount of humanitarian aid received by countries struck by natural disasters, distinguishing in particular between immediate disaster relief and long-term humanitarian aid. The theoretical model is able to make predictions as well as explain some of the peculiarities in the empirical results. We show that both short- and long-term humanitarian aid increase with the number of people killed, the financial loss and the level of corruption, while GDP per capita has no effect. More populated countries receive more humanitarian aid, and earthquakes, tsunamis and droughts attract more aid.

  11. Technical Note: Earthquake dates and water level changes in wells in the Eskisehir region, Turkey

    Directory of Open Access Journals (Sweden)

    G. Yuce

    2003-01-01

Full Text Available Although satisfactory results have yet to be obtained in earthquake prediction, one of the most common precursory indicators is a change in groundwater level in existing wells. Further wells should thus be drilled in unconfined aquifers, since these are more susceptible to seismic waves. The Eskisehir region lies in the transition zone between the Aegean extensional domain and the compressional northern Anatolian block. Limnigraphs installed in 19 exploration wells in the Eskisehir region recorded pre-seismic, co-seismic and post-seismic level changes during the earthquakes of 17 August (Izmit, Mw = 7.4) and 12 November (Duzce, Mw = 7.2) 1999, which occurred along the North Anatolian Fault Zone. The Izmit and Duzce earthquakes affected groundwater levels, especially in confined aquifers. The aquifer characteristics before and after the earthquakes were unchanged, so the aquifer behaviour is elastic. Further detailed geo-mechanical investigation of the confined aquifer in the Eskisehir region may improve understanding of earthquake prediction. Keywords: earthquake prediction, Eskisehir, hydrological warning, monitoring groundwater levels

  12. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1997-01-01

the arrival of the first earthquake from non-destructive vibration tests or via structural analysis. The previous excitation and displacement response time series is employed for the identification of the instantaneous softening using an ARMA model. The hysteresis parameters are updated after each earthquake. ... The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were...

  13. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1994-01-01

the arrival of the first earthquake from non-destructive vibration tests or via structural analysis. The previous excitation and displacement response time series is employed for the identification of the instantaneous softening using an ARMA model. The hysteresis parameters are updated after each earthquake. ... The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were...

  14. Tectonic feedback and the earthquake cycle

    Science.gov (United States)

    Lomnitz, Cinna

    1985-09-01

    The occurrence of cyclical instabilities along plate boundaries at regular intervals suggests that the process of earthquake causation differs in some respects from the model of elastic rebound in its simplest forms. The model of tectonic feedback modifies the concept of this original model in that it provides a physical interaction between the loading rate and the state of strain on the fault. Two examples are developed: (a) Central Chile, and (b) Mexico. The predictions of earthquake hazards for both types of models are compared.

  15. About Block Dynamic Model of Earthquake Source.

    Science.gov (United States)

    Gusev, G. A.; Gufeld, I. L.

One may note the absence of progress in earthquake prediction research. Short-term prediction (on a timescale of days, with the location also predicted) would have practical meaning. Failure so far is due to the absence of adequate notions about the geological medium, particularly its block structure, especially in fault zones. Geological and geophysical monitoring supports a notion of the geological medium as an open, dissipative block system with limit energy saturation. Variations of the volume stressed state close to critical states are associated with the interaction of an inhomogeneous ascending stream of light gases (helium and hydrogen) with the solid phase, which is more pronounced in the faults. In the background state, small blocks of the fault medium accommodate the sliding of great blocks along the faults; but under considerable variations of the ascending gas streams, bound chains of small blocks may form, so that a bound state of the great blocks may result (an earthquake source). Recently, using these notions, we proposed a dynamical earthquake source model based on a generalized chain of nonlinearly coupled oscillators of Fermi-Pasta-Ulam (FPU) type. The generalization concerns the chain's inhomogeneity and various external actions imitating physical processes in the real source. Earlier, a weakly inhomogeneous approximation without dissipation was considered; this permitted study of the FPU recurrence (return to the initial state), and probabilistic properties of the quasi-periodic motion were found. The problem of chain decay due to nonlinearity and external perturbations was posed, and the thresholds and the dependence of the lifetime of the chain were studied; great fluctuations of the lifetimes were discovered. In the present paper a rigorous treatment of the inhomogeneous chain, including dissipation, is given. For the strongly dissipative case, when oscillatory motion is suppressed, specific effects are discovered. For noise action and constantly arising
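The chain underlying this source model is the Fermi-Pasta-Ulam system. As a minimal sketch, and only of the textbook homogeneous, non-dissipative FPU-β chain (not the paper's generalized inhomogeneous dissipative version), the code below integrates the chain with velocity Verlet from a lowest-mode initial condition, the setting in which FPU recurrence is classically observed; all parameter values are illustrative.

```python
import numpy as np

def fpu_beta_accel(q, beta=1.0):
    """Accelerations for an FPU-beta chain with fixed ends and unit masses:
    F_i = (q_{i+1} - 2 q_i + q_{i-1}) + beta * ((q_{i+1}-q_i)^3 - (q_i-q_{i-1})^3)."""
    qp = np.concatenate(([0.0], q, [0.0]))   # fixed boundary particles
    dl = qp[1:-1] - qp[:-2]                  # left-bond stretch
    dr = qp[2:] - qp[1:-1]                   # right-bond stretch
    return (dr - dl) + beta * (dr**3 - dl**3)

def energy(q, p, beta=1.0):
    """Total energy: kinetic plus quadratic and quartic bond potentials."""
    qp = np.concatenate(([0.0], q, [0.0]))
    d = np.diff(qp)
    return 0.5 * (p**2).sum() + (0.5 * d**2 + 0.25 * beta * d**4).sum()

# Velocity-Verlet integration, lowest normal mode initially excited.
n, dt, steps = 32, 0.05, 4000
q = np.sin(np.pi * np.arange(1, n + 1) / (n + 1))
p = np.zeros(n)
e0 = energy(q, p)
for _ in range(steps):
    p += 0.5 * dt * fpu_beta_accel(q)
    q += dt * p
    p += 0.5 * dt * fpu_beta_accel(q)
e1 = energy(q, p)          # symplectic scheme: e1 stays close to e0
```

Tracking the energy of each normal mode over much longer runs is what reveals the near-recurrence to the initial state; adding site-dependent couplings and a damping term would move the sketch toward the paper's generalized chain.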

  16. Discoveries and Controversies in Geodetic Imaging of Deformation Before and After the M=9 Tohoku-oki Earthquake

    Science.gov (United States)

    Wang, K.; Sun, T.; Hino, R.; Iinuma, T.; Tomita, F.; Kido, M.

    2017-12-01

    Numerous observations pertaining to the M=9.0 2011 Tohoku-oki earthquake have led to new understanding of subduction zone earthquakes. By synthesizing published research results and our own findings, we explore what has been learned about fault behavior and Earth rheology from geodetic imaging of crustal deformation before and after the earthquake. Before the earthquake, megathrust locking models based on land-based geodetic observations correctly outlined the along-strike location of the future rupture zone, showing that land-based observations are capable of resolving along-strike variations in locking and creep at wavelengths comparable to distances from the network. But they predicted a locked zone that was much deeper than the actual rupture in 2011. The incorrect definition of the locking pattern in the dip direction demonstrates not only the need for seafloor geodesy but also the importance of modeling interseismic viscoelastic stress relaxation and stress shadowing. The discovery of decade-long accelerated slip downdip of the future rupture zone raises new questions on fault mechanics. After the earthquake, seafloor geodetic discovery of opposing motion offshore provided unambiguous evidence for the dominance of viscoelastic relaxation in short-term postseismic deformation. There is little deep afterslip in the fault area where the decade-long pre-earthquake slip acceleration is observed. The complementary spatial distribution of pre-slip and afterslip calls for new scientific research. However, the near absence of deep afterslip directly downdip of the main rupture is perceived to be controversial because some viscoelastic models do predict large afterslip here, although less than predicted by purely elastic models. We show that the large afterslip in these models is largely an artefact due to the use of a layered Earth model without a subducting slab. 
The slab acts as an "anchor" in the mantle and retards landward motion following a subduction earthquake

  17. Remotely Triggered Earthquakes Recorded by EarthScope's Transportable Array and Regional Seismic Networks: A Case Study Of Four Large Earthquakes

    Science.gov (United States)

    Velasco, A. A.; Cerda, I.; Linville, L.; Kilb, D. L.; Pankow, K. L.

    2013-05-01

Changes in field stress required to trigger earthquakes have been classified in two basic ways: static and dynamic triggering. Static triggering occurs when an earthquake that releases accumulated strain along a fault stress-loads a nearby fault. Dynamic triggering occurs when an earthquake is induced by the passing of seismic waves from a large mainshock located two or more fault lengths from the epicenter of the main shock. We investigate details of dynamic triggering using data collected from EarthScope's USArray and regional seismic networks located in the United States. Triggered events are identified using an optimized automated detector based on the ratio of short-term to long-term average (STA/LTA; Antelope software). Following the automated processing, the flagged waveforms are individually analyzed, in both the time and frequency domains, to determine whether the increased detection rates correspond to local earthquakes (i.e., potentially remotely triggered aftershocks). Here, we show results using this automated scheme applied to data from four large, but characteristically different, earthquakes: Chile (Mw 8.8, 2010), Tohoku-Oki (Mw 9.0, 2011), Baja California (Mw 7.2, 2010) and Wells, Nevada (Mw 6.0, 2008). For each of our four mainshocks, the number of detections within the 10-hour time windows spans a large range (1 to over 200), and statistically >20% of the waveforms show evidence of anomalous signals following the mainshock. The results will help provide a better understanding of the physical mechanisms involved in dynamic earthquake triggering and will help identify zones in the continental U.S. that may be more susceptible to dynamic earthquake triggering.
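The STA/LTA detector compares a short-term average of signal energy against a long-term average and triggers when their ratio exceeds a threshold. The sketch below is a generic cumulative-sum implementation on a synthetic trace, not the Antelope detector the abstract cites; the window lengths and threshold are assumed values for illustration.

```python
import numpy as np

def sta_lta(trace, fs, sta_win=1.0, lta_win=10.0):
    """Classic STA/LTA ratio on the squared trace (characteristic function).
    Each ratio sample compares the STA and LTA windows ending at the same point."""
    nsta, nlta = int(sta_win * fs), int(lta_win * fs)
    cf = trace.astype(float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(cf)))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta     # moving averages via cumsum
    lta = (csum[nlta:] - csum[:-nlta]) / nlta
    n = min(len(sta), len(lta))                    # align trailing windows
    return sta[len(sta) - n:] / np.maximum(lta[len(lta) - n:], 1e-12)

# Synthetic test trace: unit-variance noise with a 2-s high-amplitude "event" at 60 s.
rng = np.random.default_rng(2)
fs = 100.0
x = rng.normal(scale=1.0, size=int(120 * fs))
x[int(60 * fs):int(62 * fs)] += rng.normal(scale=10.0, size=int(2 * fs))

ratio = sta_lta(x, fs)
triggered = ratio.max() > 5.0   # threshold is an illustrative choice
```

Production detectors add refinements (recursive averages, de-triggering levels, pre-filtering to the band of interest), but the ratio logic is the same.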

  18. Surface Rupture Effects on Earthquake Moment-Area Scaling Relations

    Science.gov (United States)

    Luo, Yingdi; Ampuero, Jean-Paul; Miyakoshi, Ken; Irikura, Kojiro

    2017-09-01

Empirical earthquake scaling relations play a central role in fundamental studies of earthquake physics and in current practice of earthquake hazard assessment, and are being refined by advances in earthquake source analysis. A scaling relation between seismic moment (M0) and rupture area (A) currently in use for ground motion prediction in Japan features a transition regime of the form M0 ∝ A^2, between the well-recognized small (self-similar) and very large (W-model) earthquake regimes, which has counter-intuitive attributes and uncertain theoretical underpinnings. Here, we investigate the mechanical origin of this transition regime via earthquake cycle simulations, analytical dislocation models and numerical crack models on strike-slip faults. We find that, even if stress drop is assumed constant, the properties of the transition regime are controlled by surface rupture effects, comprising an effective rupture elongation along-dip due to a mirror effect and systematic changes of the shape factor relating slip to stress drop. Based on this physical insight, we propose a simplified formula to account for these effects in M0-A scaling relations for strike-slip earthquakes.
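The three regimes (self-similar, transition, W-model) can be written as one continuous piecewise moment-area relation. The sketch below uses illustrative constants (a 3 MPa stress-drop prefactor and arbitrary crossover areas), not the calibrated coefficients of the Japanese ground-motion relation or the authors' proposed formula.

```python
import numpy as np

def moment_from_area(a_km2, dsigma=3.0e6, a1=400.0, a2=4000.0):
    """Piecewise M0(A): M0 in N*m, A in km^2 (all constants illustrative):
         M0 ~ A^1.5  (self-similar, small events)   for A <  a1
         M0 ~ A^2    (transition regime)            for a1 <= A < a2
         M0 ~ A      (W-model, very large events)   for A >= a2
    Prefactors are matched at a1 and a2 so M0(A) is continuous."""
    a = np.asarray(a_km2, dtype=float)
    c = dsigma * 1e9                      # dsigma * (km^2 -> m^2)^1.5 unit factor
    m_small = c * a**1.5
    m_trans = (c * a1**-0.5) * a**2       # equals m_small at a = a1
    m_large = (c * a1**-0.5 * a2) * a     # equals m_trans at a = a2
    return np.where(a < a1, m_small, np.where(a < a2, m_trans, m_large))

def moment_magnitude(m0):
    """Standard moment magnitude from seismic moment in N*m."""
    return (np.log10(m0) - 9.1) / 1.5
```

Matching the prefactors at the crossover areas is the design choice that keeps M0(A) continuous while letting the exponent, and hence the implied stress-drop behavior, change between regimes.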

  19. [Comment on Earthquake precursors: Banished forever?] Comment: Unpredictability of earthquakes-Truth or fiction?

    Science.gov (United States)

    Lomnitz, Cinna

I was delighted to read Alexander Gusev's opinions on what he calls the “unpredictability paradigm” of earthquakes (Eos, February 10, 1998, p. 71). I always enjoy hearing from a good friend in the pages of Eos. I immediately looked up “paradigm” in my Oxford Dictionary and found this: paradigm n 1) set of all the different forms of a word: verb paradigms. 2) Type of something; pattern; model: a paradigm for others to copy. I wonder whether Sasha Gusev actually believes that branding earthquake prediction a “proven nonscience” [Geller, 1997] is a paradigm for others to copy. As for me, I choose to refrain from climbing on board this particular bandwagon for the following reasons.

  20. Ground Motion Prediction for Great Interplate Earthquakes in Kanto Basin Considering Variation of Source Parameters

    Science.gov (United States)

    Sekiguchi, H.; Yoshimi, M.; Horikawa, H.

    2011-12-01

Broadband ground motions are estimated in the Kanto sedimentary basin, which contains the Tokyo metropolitan area, for anticipated great interplate earthquakes along the surrounding plate boundaries. Possible scenarios of great earthquakes along the Sagami trough are modeled by combining characteristic properties of the source area with adequate variation in source parameters, in order to evaluate the possible ground motion variation due to the next Kanto earthquake. South of the rupture area of the 2011 Tohoku earthquake along the Japan trench, we consider a possible M8 earthquake. The ground motions are computed with a four-step hybrid technique. We first calculate low-frequency ground motions at the engineering basement. We then calculate higher-frequency ground motions at the same position, and combine the lower- and higher-frequency motions using a matched filter. We finally calculate ground motions at the surface by computing the response of the alluvium-diluvium layers to the combined motions at the engineering basement.
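The matched-filter combination step can be sketched as complementary low-pass/high-pass filtering of the two motions at a crossover frequency. Here the crossover (1 Hz), filter order, and the two input signals are all stand-in assumptions, not the authors' synthetics or filter design.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs, fc = 100.0, 1.0                     # sampling rate and crossover frequency (Hz)
t = np.arange(0.0, 40.0, 1.0 / fs)

# Stand-ins for the two computations (hypothetical signals, not real synthetics):
low_freq_motion = np.sin(2 * np.pi * 0.3 * t)           # deterministic long-period part
rng = np.random.default_rng(3)
high_freq_motion = rng.normal(scale=0.2, size=t.size)   # stochastic short-period part

# Matched (complementary) zero-phase filters at the crossover frequency.
b_lo, a_lo = butter(4, fc / (fs / 2), btype="low")
b_hi, a_hi = butter(4, fc / (fs / 2), btype="high")
broadband = (filtfilt(b_lo, a_lo, low_freq_motion)
             + filtfilt(b_hi, a_hi, high_freq_motion))
```

Zero-phase filtering (`filtfilt`) avoids introducing a relative time shift between the two bands, which matters when the combined motion drives a site-response calculation in the final step.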

  1. US earthquake observatories: recommendations for a new national network

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    This report is the first attempt by the seismological community to rationalize and optimize the distribution of earthquake observatories across the United States. The main aim is to increase significantly our knowledge of earthquakes and the earth's dynamics by providing access to scientifically more valuable data. Other objectives are to provide a more efficient and cost-effective system of recording and distributing earthquake data and to make as uniform as possible the recording of earthquakes in all states. The central recommendation of the Panel is that the guiding concept be established of a rationalized and integrated seismograph system consisting of regional seismograph networks run for crucial regional research and monitoring purposes in tandem with a carefully designed, but sparser, nationwide network of technologically advanced observatories. Such a national system must be thought of not only in terms of instrumentation but equally in terms of data storage, computer processing, and record availability.

  2. Insights into earthquake hazard map performance from shaking history simulations

    Science.gov (United States)

    Stein, S.; Vanneste, K.; Camelbeeck, T.; Vleminckx, B.

    2017-12-01

    Why recent large earthquakes caused shaking stronger than predicted by earthquake hazard maps is under debate. This issue has two parts. Verification involves how well maps implement probabilistic seismic hazard analysis (PSHA) ("have we built the map right?"). Validation asks how well maps forecast shaking ("have we built the right map?"). We explore how well a map can ideally perform by simulating an area's shaking history and comparing "observed" shaking to that predicted by a map generated for the same parameters. The simulations yield shaking distributions whose mean is consistent with the map, but individual shaking histories show large scatter. Infrequent large earthquakes cause shaking much stronger than mapped, as observed. Hence, PSHA seems internally consistent and can be regarded as verified. Validation is harder because an earthquake history can yield shaking higher or lower than that predicted while being consistent with the hazard map. The scatter decreases for longer observation times because the largest earthquakes and resulting shaking are increasingly likely to have occurred. For the same reason, scatter is much less for the more active plate boundary than for a continental interior. For a continental interior, where the mapped hazard is low, even an M4 event produces exceedances at some sites. Larger earthquakes produce exceedances at more sites. Thus many exceedances result from small earthquakes, but infrequent large ones may cause very large exceedances. However, for a plate boundary, an M6 event produces exceedance at only a few sites, and an M7 produces them in a larger, but still relatively small, portion of the study area. As reality gives only one history, and a real map involves assumptions about more complicated source geometries and occurrence rates, which are unlikely to be exactly correct and thus will contribute additional scatter, it is hard to assess whether misfit between actual shaking and a map — notably higher
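The verification exercise described above (simulate many shaking histories and compare them with the mapped level) can be sketched with a toy hazard model. The exponential "hazard curve" and the 10%-in-50-years mapped level below are illustrative assumptions, not the authors' model:

```python
import math
import random

def simulate_history(years, rate):
    # annual peak "shaking" drawn iid from an exponential distribution,
    # a toy stand-in for a real site hazard curve
    return max(random.expovariate(rate) for _ in range(years))

random.seed(42)
RATE = 1.0  # rate parameter of the toy hazard curve

# mapped level: annual exceedance probability 1/475 (~10% in 50 years)
mapped = -math.log(1.0 / 475.0) / RATE

trials = 20000
exceed = sum(simulate_history(50, RATE) > mapped for _ in range(trials))
frac = exceed / trials  # expected ~ 1 - (1 - 1/475)**50 ~ 0.10
```

Across many simulated histories the exceedance fraction clusters around the design value, while any single history can sit well above or below it, which is the verification-versus-validation distinction the abstract draws.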

  3. Application and analysis of debris-flow early warning system in Wenchuan earthquake-affected area

    Science.gov (United States)

    Liu, D. L.; Zhang, S. J.; Yang, H. J.; Zhao, L. Q.; Jiang, Y. H.; Tang, D.; Leng, X. P.

    2016-02-01

    The activity of debris flows (DFs) in the Wenchuan earthquake-affected area increased significantly after the earthquake on 12 May 2008, threatening the lives and property of local people. A physics-based early warning system (EWS) for DF forecasting was developed and applied in this earthquake area. This paper introduces an application of the system in the Wenchuan earthquake-affected area and analyzes the prediction results by comparison with the DF events triggered by strong rainfall events as reported by the local government. The prediction accuracy and efficiency were first compared with those of a contribution-factor-based system currently used by the weather bureau of Sichuan province, using the storm of 17 August 2012 as a case study. The comparison shows that the false negative rate and false positive rate of the new system are, respectively, 19% and 21% lower than those of the contribution-factor-based system; the new system is therefore clearly more accurate, and it also operates more efficiently. At the invitation of the weather bureau of Sichuan province, the authors upgraded the DF prediction system to the new system before the 2013 monsoon season in the Wenchuan earthquake-affected area. Two prediction cases, on 9 July 2013 and 10 July 2014, were chosen to further demonstrate that the new EWS has high stability, efficiency, and prediction accuracy.
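The false negative and false positive rates used in the comparison above can be computed from paired binary series of warnings and observed events. The sketch below assumes one 0/1 flag per monitored gully and rainfall event; the data layout is hypothetical.

```python
def rates(predicted, observed):
    """False-negative and false-positive rates for binary warning series.

    predicted: 1 if the EWS issued a warning, else 0
    observed:  1 if a debris flow actually occurred, else 0
    """
    fn = sum(1 for p, o in zip(predicted, observed) if o and not p)  # missed events
    fp = sum(1 for p, o in zip(predicted, observed) if p and not o)  # false alarms
    pos = sum(observed)
    neg = len(observed) - pos
    return fn / pos, fp / neg
```

Comparing the two systems then reduces to evaluating `rates` on the same observed-event series for each system's warning series.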

  4. The Christchurch earthquake stroke incidence study.

    Science.gov (United States)

    Wu, Teddy Y; Cheung, Jeanette; Cole, David; Fink, John N

    2014-03-01

    We examined the impact of major earthquakes on acute stroke admissions by a retrospective review of stroke admissions in the 6 weeks following the 4 September 2010 and 22 February 2011 earthquakes. The control period was the corresponding 6 weeks in the previous year. In the 6 weeks following the September 2010 earthquake there were 97 acute stroke admissions, with 79 (81.4%) ischaemic infarctions. This was similar to the 2009 control period which had 104 acute stroke admissions, of whom 80 (76.9%) had ischaemic infarction. In the 6 weeks following the February 2011 earthquake, there were 71 stroke admissions, and 61 (79.2%) were ischaemic infarction. This was less than the 96 strokes (72 [75%] ischaemic infarction) in the corresponding control period. None of the comparisons were statistically significant. There was also no difference in the rate of cardioembolic infarction from atrial fibrillation between the study periods. Patients admitted during the February 2011 earthquake period were less likely to be discharged directly home when compared to the control period (31.2% versus 46.9%, p=0.036). There was no observable trend in the number of weekly stroke admissions between the 2 weeks leading to and 6 weeks following the earthquakes. Our results suggest that severe psychological stress from earthquakes did not influence the subsequent short term risk of acute stroke, but the severity of the earthquake in February 2011 and associated civil structural damages may have influenced the pattern of discharge for stroke patients. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  6. Development of a Low Cost Earthquake Early Warning System in Taiwan

    Science.gov (United States)

    Wu, Y. M.

    2017-12-01

    The National Taiwan University (NTU) has been developing an earthquake early warning (EEW) system for research purposes using low-cost accelerometers (P-Alert) since 2010. As of 2017, a total of 650 stations have been deployed and configured. The NTU system can provide earthquake information within 15 s of an earthquake occurrence, and thus may provide early warnings for cities located more than 50 km from the epicenter. Additionally, the NTU system has an onsite alert function that triggers a warning for incoming P waves greater than a certain magnitude threshold, providing a 2-3 s lead time before peak ground acceleration (PGA) in regions close to the epicenter. Detailed shaking maps are produced by the NTU system within one or two minutes after an earthquake. Recently, a new module named ShakingAlarm has been developed. Equipped with real-time acceleration signals and a time-dependent anisotropic attenuation relationship for the PGA, ShakingAlarm can provide an accurate PGA estimate immediately before the arrival of the observed PGA. This unique advantage yields sufficient lead time for hazard assessment and emergency response, which is unavailable from traditional shakemaps based only on the PGA observed in real time. The performance of ShakingAlarm was tested with six M > 5.5 inland earthquakes from 2013 to 2016. Taking the simulation of the 2016 M6.4 Meinong earthquake as an example, the predicted PGA converges to a stable value, producing a predicted shake map and an isocontour map of the predicted PGA within 16 seconds of earthquake occurrence. Compared with a traditional regional EEW system, ShakingAlarm can effectively identify possible damage regions and provide valuable early warning information (magnitude and PGA) for risk mitigation.
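The idea of predicting PGA at a site from a magnitude estimate and an attenuation relationship, then warning when a threshold is exceeded, can be sketched as follows. The log-linear form and all coefficients here are invented placeholders for illustration, not ShakingAlarm's calibrated, time-dependent anisotropic relationship.

```python
import math

def predict_pga(magnitude, dist_km, c0=-1.0, c1=0.5, c2=1.3, c3=10.0):
    """Toy attenuation: log10(PGA) grows linearly with magnitude and decays
    with log distance. Coefficients are illustrative assumptions only."""
    log10_pga = c0 + c1 * magnitude - c2 * math.log10(dist_km + c3)
    return 10.0 ** log10_pga  # arbitrary units

def should_warn(magnitude, dist_km, threshold=0.8):
    # issue an onsite warning when the predicted PGA exceeds the threshold
    return predict_pga(magnitude, dist_km) >= threshold
```

The predicted PGA decreases with distance and increases with magnitude, so nearby sites cross the warning threshold first, which is where the extra lead time matters most.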

  7. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the 1906 San Francisco and Valparaiso earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the 1908 Messina-Reggio Calabria earthquake, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of buildings still standing today that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods now in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study, the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since accelerograms for the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings considers instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. It is therefore recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  8. Summary of the GK15 ground‐motion prediction equation for horizontal PGA and 5% damped PSA from shallow crustal continental earthquakes

    Science.gov (United States)

    Graizer, Vladimir; Kalkan, Erol

    2016-01-01

    We present a revised ground-motion prediction equation (GMPE) for computing medians and standard deviations of peak ground acceleration (PGA) and 5% damped pseudospectral acceleration (PSA) response ordinates of the horizontal component of randomly oriented ground motions, to be used for seismic-hazard analyses and engineering applications. This GMPE is derived from the expanded Next Generation Attenuation (NGA)-West 1 database (see Data and Resources; Chiou et al., 2008). The revised model includes an anelastic attenuation term as a function of quality factor (Q0), to capture regional differences in far-source (beyond 150 km) attenuation, and a new frequency-dependent sedimentary-basin scaling term as a function of depth to the 1.5 km/s shear-wave velocity isosurface, to improve ground-motion predictions at sites located on deep sedimentary basins. The new Graizer-Kalkan 2015 (GK15) model, developed to be simple, is applicable to the western United States and other similar shallow crustal continental regions in active tectonic environments, for earthquakes with moment magnitudes (M) 5.0-8.0, distances of 0-250 km, average shear-wave velocities in the upper 30 m (VS30) of 200-1300 m/s, and spectral periods (T) of 0.01-5 s. Our aleatory variability model captures interevent (between-event) variability, which decreases with magnitude and increases with distance. The mixed-effects residuals analysis reveals that GK15 has no trend with respect to the independent predictor parameters. Compared to our 2007-2009 GMPE, the PGA values are very similar, whereas the predicted spectral ordinates are larger at T < 0.2 s and smaller at longer periods.
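The structure of such a GMPE can be sketched as a sum of the terms the abstract names: magnitude scaling, geometric spreading, a Q0-dependent anelastic attenuation term, and a basin-depth term. The functional form and every coefficient below are placeholders for illustration, not the GK15 values.

```python
import math

def gmpe_median_pga(mag, dist_km, q0=150.0, z1500_km=0.0,
                    c=(-1.2, 0.6, 1.1, 8.0, 0.003, 0.05)):
    """Illustrative GMPE skeleton (NOT GK15's coefficients or exact form)."""
    c0, c1, c2, c3, c4, c5 = c
    ln_pga = (c0 + c1 * mag                       # magnitude scaling
              - c2 * math.log(dist_km + c3)       # geometric spreading
              - c4 * dist_km * (150.0 / q0)       # anelastic term, weaker for high Q0
              + c5 * z1500_km)                    # sedimentary-basin amplification
    return math.exp(ln_pga)  # median PGA, arbitrary units
```

The anelastic term matters mainly at far-source distances (its contribution grows linearly with distance), which is consistent with the abstract's "beyond 150 km" remark; the basin term adds amplification for sites with a deep 1.5 km/s isosurface.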

  9. Retrospective stress-forecasting of earthquakes

    Science.gov (United States)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, lower crust, and uppermost ~400 km of the mantle. (In the mantle, the microcracks are intergranular films of hydrolysed melt.) Earthquakes release stress, and an appropriate amount of stress for the relevant magnitude must accumulate before each event. Iceland is on an extension of the Mid-Atlantic Ridge where, uniquely, two transform zones run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes, making them the only places worldwide where SWS can be routinely monitored. Elsewhere, SWS must be monitored above temporally active occasional swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observed changes in SWS time-delays are attributed to stress-induced changes in crack aspect-ratios, allowing stress-accumulation and stress-relaxation to be identified. While monitoring SWS in SW Iceland in 1988, stress-accumulation before an impending earthquake was recognised, and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10 November 1988, EU emailed IMO that an M5 earthquake could occur soon on a seismically active fault plane where seismicity was still continuing following an M5.1 earthquake six months earlier. Three days later, IMO emailed EU that an M5 earthquake had just occurred on the specified fault plane. We suggest this is a successful earthquake stress-forecast, and we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes which have enabled us to

  10. Amplitude of foreshocks as a possible seismic precursor to earthquakes

    Science.gov (United States)

    Lindh, A.G.

    1978-01-01

    In recent years, we have made significant progress in recognizing the long-range pattern of events that precede large earthquakes. For example, a recent issue of the Earthquake Information Bulletin described how the pioneering work of S.A. Fedotov of the U.S.S.R. in the Kamchatka-Kurile Islands region has been applied worldwide to forecast where large, shallow earthquakes might occur in the coming decades. Indeed, such a "seismic gap" off the coast of Alaska was filled by the 1972 Sitka earthquake. Promising results are slowly accumulating from other techniques suggesting that intermediate-term precursors might also be seen: among these are tilt and geomagnetic anomalies and anomalous land uplift. But the crucial point remains that short-term precursors (days to hours) will be needed in many cases if there is to be a significant saving of lives.

  11. Distribution of incremental static stress caused by earthquakes

    Directory of Open Access Journals (Sweden)

    Y. Y. Kagan

    1994-01-01

    Theoretical calculations, simulations, and measurements of the rotation of earthquake focal mechanisms suggest that the stress in earthquake focal zones follows the Cauchy distribution, which is one of the stable probability distributions (with the value of the exponent α equal to 1). We review the properties of the stable distributions and show that the Cauchy distribution is expected to approximate the stress caused by earthquakes occurring over geologically long intervals of fault-zone development. However, the stress caused by recent earthquakes recorded in instrumental catalogues should follow symmetric stable distributions with a value of α significantly less than one. This is explained by the fractal distribution of earthquake hypocentres: the dimension of a hypocentre set, δ, is close to zero for short-term earthquake catalogues and asymptotically approaches 2¼ for long time intervals. We use the Harvard catalogue of seismic moment tensor solutions to investigate the distribution of incremental static stress caused by earthquakes. The stress measured in the focal zone of each event is approximated by stable distributions. In agreement with theoretical considerations, the exponent value of the distribution approaches zero as the time span of the earthquake catalogue (ΔT) decreases. For large stress values α increases; we surmise that this is caused by an increase of δ for small inter-earthquake distances due to location errors.
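The heavy tails that make the Cauchy distribution (stable, α = 1) distinctive are easy to demonstrate by inverse-CDF sampling: the sample median is a reliable location estimate, while the sample mean never settles down because the first moment does not exist. This is a generic property of the distribution, not an analysis from the paper.

```python
import math
import random

rng = random.Random(1)

# inverse-CDF sampling of the standard Cauchy: X = tan(pi * (U - 1/2))
xs = sorted(math.tan(math.pi * (rng.random() - 0.5)) for _ in range(100001))

median = xs[len(xs) // 2]                           # stable location estimate
tail_frac = sum(abs(x) > 10 for x in xs) / len(xs)  # P(|X| > 10) ~ 0.063
```

For a Gaussian, the fraction of samples beyond ten scale units would be vanishingly small; for the Cauchy it stays around 6%, which is why extreme incremental stresses dominate the statistics in such models.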

  12. Post-Earthquake Debris Management - An Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunami, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction works, can place significant demands on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous: soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be disposed of at landfill sites, reused as materials for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, taking into account the different criteria related to the operation's execution, is proposed by highlighting the key issues concerning the handling of the construction

  13. Development of a technique for long-term detection of precursors of strong earthquakes using high-resolution satellite images

    Science.gov (United States)

    Soto-Pinto, C. A.; Arellano-Baeza, A. A.; Ouzounov, D. P.

    2012-12-01

    Among the variety of processes involved in seismic activity, the principal one is the accumulation and relaxation of stress in the crust, which takes place at depths of tens of kilometers. While the Earth's surface bears at most indirect signs of the accumulation and relaxation of crustal stress, it has long been understood that there is a strong correspondence between the structure of the underlying crust and the landscape. We assume that the structure of lineaments reflects the internal structure of the Earth's crust, and that variation in lineament number and arrangement reflects changes in the stress patterns related to seismic activity. Contrary to the existing assumption that lineament structure changes only on the geological timescale, we have found that much faster seismic activity strongly affects the system of lineaments extracted from high-resolution multispectral satellite images. Previous studies have shown that the accumulation of stress in the crust prior to a strong earthquake is directly related to an increase in the number and a preferential orientation of the lineaments present in satellite images of epicenter zones. This effect increases with earthquake magnitude and can be observed from approximately one month before the event. To study this effect in detail we have developed software based on a series of algorithms for automatic detection of lineaments. We found that the Hough transform, applied after discontinuity-detection mechanisms such as the Canny edge detector or directional filters, is the most robust technique for detecting and characterizing changes in lineament patterns related to strong earthquakes, and it can be used as a robust long-term precursor of earthquakes indicating regions of strong stress accumulation.
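The Hough-transform stage named above can be sketched in a few lines: each edge pixel votes for every line (theta, rho) passing through it, and peaks in the accumulator correspond to lineaments. This minimal pure-Python version stands in for the library implementations (e.g. OpenCV) a production pipeline would use.

```python
import math
from collections import Counter

def hough_lines(points, n_theta=180, rho_step=1.0):
    """Accumulate votes in (theta, rho) space; peaks mark straight lineaments.

    points: (x, y) pixels flagged by an edge detector (e.g. Canny).
    Returns a Counter keyed by (theta index, quantized rho).
    """
    acc = Counter()
    for x, y in points:
        for i in range(n_theta):
            theta = math.pi * i / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            acc[(i, round(rho / rho_step))] += 1
    return acc

# 20 collinear points on the vertical line x = 5 (theta = 0, rho = 5)
pts = [(5, y) for y in range(20)]
(best_i, best_rho), votes = hough_lines(pts).most_common(1)[0]
```

Every point on the test line votes into the same (theta ~ 0, rho = 5) bin, so that bin collects all 20 votes; comparing peak counts between image dates is one way to quantify changes in lineament number and orientation.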

  14. Memory effect in M ≥ 6 earthquakes of South-North Seismic Belt, Mainland China

    Science.gov (United States)

    Wang, Jeen-Hwa

    2013-07-01

    The M ≥ 6 earthquakes that occurred in the South-North Seismic Belt, Mainland China, during 1901-2008 are used to study the possible existence of a memory effect in large earthquakes. The fluctuation analysis technique is applied to the sequences of earthquake magnitude and inter-event time represented in the natural time domain. The calculated results show that the exponents of the scaling law of fluctuation versus window length are less than 0.5 for both the magnitude and the inter-event time sequences. The migration of the earthquakes is examined to discuss possible correlations between events. The phase portraits of two sequent magnitudes and of two sequent inter-event times are also used to explore whether large (or small) earthquakes are followed by large (or small) events. Taken together, the evidence leads us to conclude that the earthquakes under study are short-term correlated, and thus a short-term memory effect is operative.
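A common way to estimate such a scaling exponent is detrended fluctuation analysis: integrate the demeaned series, linearly detrend it in windows of increasing length, and fit the slope of log fluctuation versus log window length. Whether this variant matches the authors' exact fluctuation-analysis implementation is an assumption; an exponent near 0.5 indicates no memory, below 0.5 anti-persistence.

```python
import math
import random

def dfa_exponent(x, scales=(4, 8, 16, 32)):
    """First-order detrended fluctuation analysis of a 1-D sequence."""
    mean = sum(x) / len(x)
    profile, acc = [], 0.0
    for v in x:                      # integrated, demeaned profile
        acc += v - mean
        profile.append(acc)
    logs, logF = [], []
    for s in scales:
        rms = []
        for start in range(0, len(profile) - s + 1, s):
            seg = profile[start:start + s]
            n = len(seg)
            t = list(range(n))
            tm, sm = sum(t) / n, sum(seg) / n
            # least-squares linear detrend of the segment
            beta = (sum((ti - tm) * (si - sm) for ti, si in zip(t, seg))
                    / sum((ti - tm) ** 2 for ti in t))
            resid = [si - (sm + beta * (ti - tm)) for ti, si in zip(t, seg)]
            rms.append(math.sqrt(sum(r * r for r in resid) / n))
        logs.append(math.log(s))
        logF.append(math.log(sum(rms) / len(rms)))
    lm, fm = sum(logs) / len(logs), sum(logF) / len(logF)
    return (sum((l - lm) * (f - fm) for l, f in zip(logs, logF))
            / sum((l - lm) ** 2 for l in logs))

random.seed(7)
alpha = dfa_exponent([random.gauss(0, 1) for _ in range(4096)])  # white noise
```

For uncorrelated white noise the exponent sits near 0.5; the sub-0.5 values reported in the abstract are what distinguish the magnitude and inter-event-time sequences from a memoryless process.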

  15. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) remains, even after 25 years, one of the most painful events in Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead; 10,000 injured; 3000 families left homeless). Nowadays, the most frequent and important question is: what if this earthquake were repeated today? In this study, we simulate the ground-motion shaking of an earthquake of the same size (12 October 1992) and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the risk assessment clearly indicates that the losses and damage could double or triple in Cairo compared to the 1992 earthquake. The risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, while three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates indicate that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk: deteriorating buildings and infrastructure make the city particularly vulnerable to earthquakes. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb), and about 75% of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  16. A Multi-parametric Climatological Approach to Study the 2016 Amatrice-Norcia (Central Italy) Earthquake Preparatory Phase

    Science.gov (United States)

    Piscini, Alessandro; De Santis, Angelo; Marchetti, Dedalo; Cianchini, Gianfranco

    2017-10-01

    Based on observations prior to earthquakes, recent theoretical considerations suggest that some geophysical quantities reveal abnormal changes that anticipate moderate and strong earthquakes within a defined spatial area (the so-called Dobrovolsky area), according to a lithosphere-atmosphere-ionosphere coupling model. One possible pre-earthquake effect is the appearance of climatological anomalies in the epicentral region weeks to months before a major earthquake. In this paper, the two months preceding the Amatrice-Norcia (Central Italy) earthquake sequence, which started on 24 August 2016 with an M6 earthquake and a few months later produced two other major shocks (an M5.9 on 26 October and then an M6.5 on 30 October), were analyzed in terms of skin temperature, total column water vapour, and total column ozone, compared with the trend of the past 37 years. The novelty of the method lies in the way the complete time series is reduced, with the possible effect of global warming properly removed. The simultaneous analysis showed the presence of persistent contemporary anomalies in all of the analysed parameters. To validate the technique, a confutation/confirmation analysis was undertaken in which these parameters were analyzed in the same months of a seismically "calm" year without significant seismicity. We also extended the analysis to all available years to construct a confusion matrix comparing the occurrence of climatological anomalies with real seismicity. This work confirms the potential of multiple parameters for anticipating the occurrence of large earthquakes in Central Italy, reinforcing the idea of using such behaviour in an integrated system for future earthquake prediction.
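The confusion-matrix step mentioned above can be sketched by treating each year as one observation: an alarm is raised only when all three climatological parameters are simultaneously anomalous, and the alarm is compared against whether significant seismicity occurred. The alarm rule and the 0/1 yearly encoding are illustrative assumptions, not the paper's exact procedure.

```python
def confusion_matrix(skin_t, water_vap, ozone, seismic):
    """Count (TP, FP, FN, TN) across years.

    skin_t, water_vap, ozone: 1 if that parameter was anomalous that year
    seismic: 1 if significant seismicity occurred that year
    Alarm = all three parameters anomalous simultaneously.
    """
    tp = fp = fn = tn = 0
    for st, wv, oz, eq in zip(skin_t, water_vap, ozone, seismic):
        alarm = st and wv and oz
        if alarm and eq:
            tp += 1
        elif alarm and not eq:
            fp += 1
        elif eq:
            fn += 1
        else:
            tn += 1
    return tp, fp, fn, tn
```

Requiring all three anomalies to coincide trades sensitivity for specificity: fewer false alarms, at the risk of missing events preceded by only one or two anomalous parameters.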

  17. On the reliability of the geomagnetic quake as a short time earthquake's precursor for the Sofia region

    Directory of Open Access Journals (Sweden)

    S. Cht. Mavrodiev

    2004-01-01

    The local 'when' for earthquake prediction is based on the connection between geomagnetic 'quakes' and the next incoming minimum or maximum of the tidal gravitational potential. The probability time window for the predicted earthquake is approximately ±1 day for the tidal minimum and ±2 days for the maximum. A preliminary statistical estimate, based on the distribution of the time differences between occurred and predicted earthquakes for the period 2002-2003 for the Sofia region, is given. The possibility of creating a local 'when, where' earthquake research and prediction NETWORK rests on accurate monitoring of the electromagnetic field, with appropriate space and time scales, under, on, and over the Earth's surface. Periodically updated information from seismic hazard maps and other standard geodetic information, as well as other precursory information, is essential.

  18. e-Science on Earthquake Disaster Mitigation by EUAsiaGrid

    Science.gov (United States)

    Yen, Eric; Lin, Simon; Chen, Hsin-Yen; Chao, Li; Huang, Bor-Shoh; Liang, Wen-Tzong

    2010-05-01

    Although earthquakes are not predictable at this moment, with the aid of accurate seismic wave propagation analysis we can simulate the potential hazards at all distances from possible fault sources by understanding the source rupture process during large earthquakes. By integrating a strong ground-motion sensor network, an earthquake data center, and seismic wave propagation analysis over the gLite e-Science infrastructure, we can gain much better knowledge of the impact and vulnerability associated with potential earthquake hazards. This application also demonstrates the e-Science way to investigate unknown earth structure. Regional integration of earthquake sensor networks can aid fast event reporting and accurate event data collection. Federation of earthquake data centers entails consolidation and sharing of seismology and geology knowledge. Capability building in seismic wave propagation analysis implies the predictability of potential hazard impacts. With the gLite infrastructure and the EUAsiaGrid collaboration framework, earth scientists from Taiwan, Vietnam, the Philippines, and Thailand are working together to alleviate potential seismic threats by making use of Grid technologies and to support seismology research by e-Science. A cross-continental e-infrastructure, based on EGEE and EUAsiaGrid, has been established for seismic wave forward simulation and risk estimation. Both the computing challenge of seismic wave analysis among five European and Asian partners and the data challenge of data center federation have been exercised and verified. A Seismogram-on-Demand service has also been developed for the automatic generation of a seismogram at any sensor point for a specific epicenter. To ease access to all the services based on user workflows and retain maximal flexibility, a Seismology Science Gateway integrating data, computation, workflow, services, and user communities would be implemented based on typical use cases. In the future, extension of the

  19. Estimation of Natural Frequencies During Earthquakes

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Rytter, A

    1997-01-01

    This paper presents two different recursive prediction error method (RPEM) implementations of multivariate Auto-Regressive Moving-Average (ARMAV) models for identification of a time-variant civil engineering structure subject to an earthquake. The two techniques are tested on measurements made
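The core idea, reading a structure's natural frequency off the poles of a fitted autoregressive model, can be sketched with a scalar AR(2) fit. A batch least-squares fit is used here as a simple stand-in for the recursive RPEM update, and the damped-oscillator test signal is an invented example, not the paper's earthquake measurements.

```python
import math

def ar2_frequency(y, dt):
    """Fit y[n] = a1*y[n-1] + a2*y[n-2] by least squares and return the
    resonant frequency implied by the complex pole pair."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for n in range(2, len(y)):
        x1, x2, t = y[n - 1], y[n - 2], y[n]
        s11 += x1 * x1; s12 += x1 * x2; s22 += x2 * x2
        b1 += x1 * t;   b2 += x2 * t
    det = s11 * s22 - s12 * s12            # normal-equation determinant
    a1 = (b1 * s22 - b2 * s12) / det
    a2 = (b2 * s11 - b1 * s12) / det
    # poles r*exp(+-i*w) satisfy a1 = 2*r*cos(w), a2 = -r**2
    r = math.sqrt(-a2)
    w = math.acos(a1 / (2.0 * r))
    return w / (2.0 * math.pi * dt)        # frequency in Hz

dt, f0 = 0.01, 2.5                         # 100 Hz sampling, 2.5 Hz mode
sig = [math.exp(-0.05 * n) * math.sin(2 * math.pi * f0 * n * dt)
       for n in range(500)]
f_est = ar2_frequency(sig, dt)
```

In the recursive setting, the same normal-equation quantities are updated sample by sample (typically with a forgetting factor), which is what lets the estimated frequency track a structure whose stiffness changes during the earthquake.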

  20. The impact of the Canterbury earthquakes on prescribing for mental health.

    Science.gov (United States)

    Beaglehole, Ben; Bell, Caroline; Frampton, Christopher; Hamilton, Greg; McKean, Andrew

    2015-08-01

    The aim of this study is to evaluate the impact of the Canterbury earthquakes on the mental health of the local population by examining prescribing patterns for psychotropic medication. Dispensing data from community pharmacies for antidepressants, antipsychotics, anxiolytics, and sedatives/hypnotics are routinely recorded in a national database. The close relationship between prescribing and dispensing provides the opportunity to assess prescribing trends for Canterbury against national data, and thereby to examine the longitudinal impact of the earthquakes on prescribing patterns. Short-term increases in the use of anxiolytics and sedatives/hypnotics were observed after the most devastating February 2011 earthquake, but this effect was not sustained. There were no observable effects of the earthquakes on antidepressant or antipsychotic dispensing. Short-term increases in dispensing were observed only for the classes of anxiolytics and sedatives/hypnotics; no sustained changes in dispensing occurred. These findings suggest that long-term detrimental effects on the mental health of the Canterbury population were either not present or have not resulted in increased prescribing of psychotropic medication. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  1. Nurse willingness to report for work in the event of an earthquake in Israel.

    Science.gov (United States)

    Ben Natan, Merav; Nigel, Simon; Yevdayev, Innush; Qadan, Mohamad; Dudkiewicz, Mickey

    2014-10-01

    To examine variables affecting nurse willingness to report for work in the event of an earthquake in Israel and whether this can be predicted through the Theory of Self-Efficacy. The nursing profession has a major role in preparing for earthquakes. Nurse willingness to report to work in the event of an earthquake has never before been examined. Self-administered questionnaires were distributed among a convenience sample of 400 nurses and nursing students in Israel during January-April 2012. High willingness to report to work in the event of an earthquake was declared by 57% of respondents. High perceived self-efficacy, level of knowledge and experience predict willingness to report to work in the event of an earthquake. Multidisciplinary collaboration and support was also cited as a meaningful factor. Perceived self-efficacy, level of knowledge, experience and the support of a multidisciplinary staff affect nurse willingness to report to work in the event of an earthquake. Nurse managers can identify factors that increase nurse willingness to report to work in the event of an earthquake and consequently develop strategies for more efficient management of their nursing workforce. © 2013 John Wiley & Sons Ltd.

  2. The role of post-earthquake structural safety in pre-earthquake retrofit decisions: guidelines and applications

    International Nuclear Information System (INIS)

    Bazzurro, P.; Telleen, K.; Maffei, J.; Yin, J.; Cornell, C.A.

    2009-01-01

    Critical structures such as hospitals, police stations, local administrative office buildings, and critical lifeline facilities are expected to be operational immediately after earthquakes. Any rational decision about whether these structures are strong enough to meet this goal, or whether pre-emptive retrofitting is needed, cannot be made without an explicit consideration of post-earthquake safety and functionality with respect to aftershocks. The Advanced Seismic Assessment Guidelines offer an improvement over previous methods for seismic evaluation of buildings where post-earthquake safety and usability is a concern. This new method allows engineers to evaluate the likelihood that a structure may have restricted access or no access after an earthquake. The building performance is measured in terms of the post-earthquake occupancy classifications Green Tag, Yellow Tag, and Red Tag, defining these performance levels quantitatively, based on the structure's remaining capacity to withstand aftershocks. These color-coded placards, which constitute an established practice in the US, could be replaced by the standard results of inspections (A to E) performed by the Italian Dept. of Civil Protection after an event. The article also shows some applications of these Guidelines to buildings of the largest utility company in California, Pacific Gas and Electric Company (PGE).

  3. Long-term associative learning predicts verbal short-term memory performance.

    Science.gov (United States)

    Jones, Gary; Macken, Bill

    2018-02-01

    Studies using tests such as digit span and nonword repetition have implicated short-term memory across a range of developmental domains. Such tests ostensibly assess specialized processes for the short-term manipulation and maintenance of information that are often argued to enable long-term learning. However, there is considerable evidence for an influence of long-term linguistic learning on performance in short-term memory tasks that brings into question the role of a specialized short-term memory system separate from long-term knowledge. Using natural language corpora, we show experimentally and computationally that performance on three widely used measures of short-term memory (digit span, nonword repetition, and sentence recall) can be predicted from simple associative learning operating on the linguistic environment to which a typical child may have been exposed. The findings support the broad view that short-term verbal memory performance reflects the application of long-term language knowledge to the experimental setting.

  4. Stress Regime in the Nepalese Himalaya from Recent Earthquakes.

    Science.gov (United States)

    Pant, M.; Karplus, M. S.; Velasco, A. A.; Nabelek, J.; Kuna, V. M.; Ghosh, A.; Mendoza, M.; Adhikari, L. B.; Sapkota, S. N.; Klemperer, S. L.; Patlan, E.

    2017-12-01

    Two recent earthquakes at the Indo-Eurasian plate margin, the April 25, 2015 Mw 7.8 Gorkha earthquake and the May 12, 2015 Mw 7.2 event, killed thousands of people and caused billions of dollars in property losses. In response to these events, we deployed a dense array of seismometers to record the aftershocks along the Gorkha earthquake rupture area. Our network NAMASTE (Nepal Array Measuring Aftershock Seismicity Trailing Earthquake) included 45 seismic stations (16 short-period, 25 broadband, and 4 strong-motion sensors) covering a large area from north-central Nepal to south of the Main Frontal Thrust at a spacing of 20 km. The instruments recorded aftershocks from June 2015 to May 2016. We used a time-domain short-term average/long-term average (STA/LTA) algorithm (window lengths of 1 s/10 s and 4 s/40 s, respectively) to detect the arrivals and then developed an earthquake catalog containing 9300 aftershocks. We are manually picking the P-wave first-motion polarity to develop a catalog of focal mechanisms for the larger-magnitude (>M3.0) events with adequate (>10) arrivals. We hope to characterize the seismicity and stress mechanisms of the complex fault geometries in the Nepalese Himalaya and to address the geophysical processes controlling seismic cycles in the Indo-Eurasian plate margin.
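The STA/LTA detection step described above can be sketched in a few lines. This is a generic textbook trigger, not the NAMASTE processing code; the window lengths, threshold and synthetic trace are illustrative assumptions:

```python
import math

def sta_lta(signal, dt, sta_win=1.0, lta_win=10.0, threshold=3.0):
    """Classic time-domain STA/LTA trigger on the absolute amplitude.

    Returns the sample indices at which the STA/LTA ratio first
    exceeds the threshold; the trigger re-arms once the ratio
    decays below 1."""
    n_sta = max(1, int(sta_win / dt))
    n_lta = max(1, int(lta_win / dt))
    env = [abs(x) for x in signal]
    picks, triggered = [], False
    for i in range(n_lta, len(env)):
        sta = sum(env[i - n_sta:i]) / n_sta
        lta = sum(env[i - n_lta:i]) / n_lta
        ratio = sta / lta if lta > 0 else 0.0
        if ratio > threshold and not triggered:
            picks.append(i)
            triggered = True
        elif ratio < 1.0:
            triggered = False
    return picks

# Synthetic trace: weak noise for 30 s, then a strong 5 Hz burst.
dt = 0.01
trace = [0.01 * math.sin(2 * math.pi * 7 * i * dt) for i in range(3000)]
trace += [math.sin(2 * math.pi * 5 * i * dt) for i in range(1000)]
picks = sta_lta(trace, dt)  # one pick, just after the burst onset
```

In practice the short window is tuned to the expected onset sharpness and the long window to the background noise level, which is why the abstract quotes two parameter pairs.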

  5. Strong motion duration and earthquake magnitude relationships

    International Nuclear Information System (INIS)

    Salmon, M.W.; Short, S.A.; Kennedy, R.P.

    1992-06-01

    Earthquake duration is the total time of ground shaking from the arrival of seismic waves until the return to ambient conditions. Much of this time is at relatively low shaking levels which have little effect on seismic structural response and on earthquake damage potential. As a result, a parameter termed 'strong motion duration' has been defined by a number of investigators to be used for the purpose of evaluating seismic response and assessing the potential for structural damage due to earthquakes. This report presents methods for determining strong motion duration and a time history envelope function appropriate for various evaluation purposes, for earthquake magnitude and distance, and for site soil properties. There are numerous definitions of strong motion duration. For most of these definitions, empirical studies have been completed which relate duration to earthquake magnitude and distance and to site soil properties. Each of these definitions recognizes that only the portion of an earthquake record which has sufficiently high acceleration amplitude, energy content, or some other parameter significantly affects seismic response. Studies have been performed which indicate that the portion of an earthquake record in which the power (average rate of energy input) is maximum correlates most closely with potential damage to stiff nuclear power plant structures. Hence, this report concentrates on energy-based strong motion duration definitions.
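One widely used energy-based definition of the kind discussed above is the "significant duration": the interval over which the cumulative squared acceleration (proportional to Arias intensity) grows from 5% to 95% of its final value. A minimal sketch, not tied to any specific relation in the report:

```python
def significant_duration(acc, dt, lo=0.05, hi=0.95):
    """5-95% significant duration: the time between the instants at
    which the cumulative squared acceleration (proportional to Arias
    intensity) reaches 5% and 95% of its final value."""
    total = 0.0
    cum = []
    for a in acc:
        total += a * a * dt  # running integral of acc^2
        cum.append(total)
    t_lo = t_hi = 0.0
    for i, e in enumerate(cum):
        if e >= lo * total:
            t_lo = i * dt
            break
    for i, e in enumerate(cum):
        if e >= hi * total:
            t_hi = i * dt
            break
    return t_hi - t_lo

# A 10 s constant-amplitude record accumulates energy linearly,
# so its 5-95% significant duration is 90% of its length:
d = significant_duration([1.0] * 1000, 0.01)
```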

  6. Earthquake forecasting studies using radon time series data in Taiwan

    Science.gov (United States)

    Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

    2017-04-01

    For a few decades, a growing number of studies have shown the usefulness of seismogeochemical data interpreted as geochemical precursory signals for impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. The continuous time series radon data for earthquake studies have been recorded and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtration of these environmental parameters, in order to create a real-time database that helps our earthquake precursory study. In recent years, an automated real-time database has been developed using R, an open-source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open-source web application stack AMP (Apache, MySQL, and PHP), creating a website that effectively displays and helps us manage the real-time database.
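A minimal sketch of the kind of precursory-anomaly screening such a database enables: flag radon readings that deviate from a trailing rolling median by more than k times the rolling MAD. The window length, threshold and data are illustrative assumptions, and the abstract's actual system is written in R rather than Python:

```python
import statistics

def flag_anomalies(series, window=30, k=3.0):
    """Indices whose value deviates from the trailing rolling median
    by more than k times the rolling MAD (median absolute deviation)."""
    flags = []
    for i in range(window, len(series)):
        ref = series[i - window:i]
        med = statistics.median(ref)
        mad = statistics.median([abs(x - med) for x in ref])
        if abs(series[i] - med) > k * max(mad, 1e-9):
            flags.append(i)
    return flags

# Stable radon baseline around 10 (arbitrary units) with one spike:
radon = [10.0 + (0.1 if i % 2 == 0 else -0.1) for i in range(100)]
radon[50] = 12.0
flags = flag_anomalies(radon)  # only the spike is flagged
```

A robust median/MAD baseline is preferred over mean/standard deviation here because a genuine precursory spike would otherwise inflate its own detection threshold.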

  7. Our response to the earthquake at Onagawa Nuclear Power Station

    International Nuclear Information System (INIS)

    Hirakawa, Tomoshi

    2008-01-01

    When the Miyagi Offshore earthquake occurred on August 16, 2005, all three units at the Onagawa NPS were shut down automatically by the 'Strong Seismic Acceleration' signal. Our inspection after the earthquake confirmed there was no damage to the equipment of the nuclear power plants, but the analysis of the response spectrum observed at the bedrock showed the earthquake had exceeded the 'design-basis earthquake' at certain periods, so we implemented a review of the seismic safety of plant facilities. In the review, the ground motion of the Miyagi Offshore earthquake that is predicted to occur in the near future was reexamined based on the observation data, and then 'The Ground Motion for Safety Check', surpassing the supposed ground motion of the largest earthquake, was established. The seismic safety of plant facilities important for safety was assured. At present, units No.1 to No.3 at Onagawa NPS have returned to normal operation. (author)

  8. Assessment of precast beam-column using capacity demand response spectrum subject to design basis earthquake and maximum considered earthquake

    Science.gov (United States)

    Ghani, Kay Dora Abd.; Tukiar, Mohd Azuan; Hamid, Nor Hayati Abdul

    2017-08-01

    Malaysia is surrounded by the tectonic features of the Sumatera area, which consists of two seismically active inter-plate boundaries: the Indo-Australian and Eurasian Plates on the west and the Philippine Plates on the east. Hence, Malaysia experiences tremors from distant earthquakes occurring in Banda Aceh, Nias Island, Padang and other parts of Sumatera, Indonesia. In order to predict the safety of precast buildings in Malaysia under near-field ground motion, response spectrum analysis can be used for dealing with future earthquakes whose specific nature is unknown. This paper aimed to develop capacity-demand response spectra subject to the Design Basis Earthquake (DBE) and Maximum Considered Earthquake (MCE) in order to assess the performance of precast beam-column joints. From the capacity-demand response spectrum analysis, it can be concluded that the precast beam-column joints would not survive when subjected to earthquake excitation with a surface-wave magnitude of more than 5.5 (Type 1 spectra). This means that a beam-column joint designed using the current code of practice (BS8110) would be severely damaged when subjected to high earthquake excitation. The capacity-demand response spectrum analysis also shows that the precast beam-column joints in the prototype studied would be severely damaged when subjected to the Maximum Considered Earthquake (MCE) with PGA = 0.22g and a surface-wave magnitude of more than 5.5, i.e. Type 1 spectra.
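The Type 1 demand spectrum referred to above follows the Eurocode 8 elastic response spectrum shape. A sketch is below; the soil parameters (here the Type 1 values for ground type B) and 5% damping are assumptions for illustration, not the paper's actual site parameters:

```python
def ec8_elastic_spectrum(T, ag, S=1.2, TB=0.15, TC=0.5, TD=2.0, eta=1.0):
    """Eurocode 8 Type 1 horizontal elastic response spectrum Se(T),
    in the units of ag. Defaults are the Type 1 parameters for
    ground type B; eta = 1.0 corresponds to 5% damping."""
    if T <= TB:                       # rising branch
        return ag * S * (1.0 + T / TB * (2.5 * eta - 1.0))
    if T <= TC:                       # constant-acceleration plateau
        return ag * S * 2.5 * eta
    if T <= TD:                       # constant-velocity branch
        return ag * S * 2.5 * eta * TC / T
    return ag * S * 2.5 * eta * TC * TD / (T * T)  # constant displacement
```

With the abstract's MCE anchor of PGA = 0.22g, the plateau demand under these assumed parameters would be 2.5 x 1.2 x 0.22 ~ 0.66g, which is the level a capacity curve is checked against.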

  9. Ionospheric precursors to large earthquakes: A case study of the 2011 Japanese Tohoku Earthquake

    Science.gov (United States)

    Carter, B. A.; Kellerman, A. C.; Kane, T. A.; Dyson, P. L.; Norman, R.; Zhang, K.

    2013-09-01

    Researchers have reported ionospheric electron distribution abnormalities, such as electron density enhancements and/or depletions, that they claimed were related to forthcoming earthquakes. In this study, the Tohoku earthquake is examined using ionosonde data to establish whether any otherwise unexplained ionospheric anomalies were detected in the days and hours prior to the event. As the choices for the ionospheric baseline are generally different between previous works, three separate baselines for the peak plasma frequency of the F2 layer, foF2, are employed here; the running 30-day median (commonly used in other works), the International Reference Ionosphere (IRI) model and the Thermosphere Ionosphere Electrodynamic General Circulation Model (TIE-GCM). It is demonstrated that the classification of an ionospheric perturbation is heavily reliant on the baseline used, with the 30-day median, the IRI and the TIE-GCM generally underestimating, approximately describing and overestimating the measured foF2, respectively, in the 1-month period leading up to the earthquake. A detailed analysis of the ionospheric variability in the 3 days before the earthquake is then undertaken, where a simultaneous increase in foF2 and the Es layer peak plasma frequency, foEs, relative to the 30-day median was observed within 1 h before the earthquake. A statistical search for similar simultaneous foF2 and foEs increases in 6 years of data revealed that this feature has been observed on many other occasions without related seismic activity. Therefore, it is concluded that one cannot confidently use this type of ionospheric perturbation to predict an impending earthquake. It is suggested that in order to achieve significant progress in our understanding of seismo-ionospheric coupling, better account must be taken of other known sources of ionospheric variability in addition to solar and geomagnetic activity, such as the thermospheric coupling.
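The running 30-day median baseline used above is straightforward to compute; a minimal sketch with synthetic foF2 values (the window length follows the paper, the data are illustrative):

```python
import statistics

def running_median_baseline(foF2, window=30):
    """Trailing running-median baseline and the relative deviation
    of each observation from that baseline."""
    baseline, deviation = [], []
    for i in range(window, len(foF2)):
        med = statistics.median(foF2[i - window:i])
        baseline.append(med)
        deviation.append((foF2[i] - med) / med)
    return baseline, deviation

# 60 days of constant 8 MHz foF2 with one enhanced day (10 MHz):
foF2 = [8.0] * 60
foF2[40] = 10.0
baseline, deviation = running_median_baseline(foF2)
```

The paper's point is that a deviation computed this way is only as meaningful as the baseline: the same observation can look anomalous against a 30-day median yet normal against a physics-based model such as IRI or TIE-GCM.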

  10. Characteristics of Earthquake Ground Motion Attenuation in Korea and Japan

    International Nuclear Information System (INIS)

    Choi, In-Kil; Choun, Young-Sun; Nakajima, Masato; Ohtori, Yasuki; Yun, Kwan-Hee

    2006-01-01

    The characteristics of ground motion attenuation in Korea and Japan were estimated by using the earthquake ground motions recorded at equal-distance observation stations by KMA, K-NET and KiK-net of Korea and Japan. The ground motion attenuation equations proposed for Korea and Japan were evaluated by comparing the values predicted for the Fukuoka earthquake with the observed records. The predicted values from the attenuation equations show good agreement with the observed records and with each other. It can be concluded from this study that the ground motion attenuation equations can be used for the prediction of strong ground motion attenuation and for an evaluation of the attenuation equations proposed for Korea.
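Attenuation equations of the kind compared here typically take a log-linear form in magnitude and distance. The sketch below shows only the generic functional form; the coefficients are illustrative placeholders, not the Korean or Japanese relations evaluated in the paper:

```python
import math

def predict_pga(mag, r_km, c0=-3.5, c1=1.0, c2=1.1, c3=10.0, c4=0.002):
    """Generic attenuation-relation (GMPE) functional form:
    ln(PGA) = c0 + c1*M - c2*ln(R + c3) - c4*R,
    combining magnitude scaling, geometric spreading (with a
    near-source saturation term c3) and anelastic attenuation.
    All coefficients here are illustrative placeholders."""
    return math.exp(c0 + c1 * mag - c2 * math.log(r_km + c3) - c4 * r_km)
```

Evaluating a proposed relation then amounts to comparing such predictions against recorded amplitudes at known magnitudes and distances, as done above for the Fukuoka earthquake.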

  11. GOPET: A tool for automated predictions of Gene Ontology terms

    Directory of Open Access Journals (Sweden)

    Glatting Karl-Heinz

    2006-03-01

    Background: Vast progress in sequencing projects has called for annotation on a large scale. A number of methods have been developed to address this challenging task. These methods, however, either apply to specific subsets, or their predictions are not formalised, or they do not provide precise confidence values for their predictions. Description: We recently established a learning system for automated annotation, trained with a broad variety of different organisms to predict the standardised annotation terms from the Gene Ontology (GO). Now, this method has been made available to the public via our web service GOPET (Gene Ontology term Prediction and Evaluation Tool). It supplies annotation for sequences of any organism. For each predicted term an appropriate confidence value is provided. The basic method had been developed for predicting molecular function GO terms. It has now been expanded to predict biological process terms. This web service is available via http://genius.embnet.dkfz-heidelberg.de/menu/biounit/open-husar Conclusion: Our web service gives experimental researchers as well as the bioinformatics community a valuable sequence annotation device. Additionally, GOPET also provides less significant annotation data which may serve as an extended discovery platform for the user.

  12. Predictors of psychological resilience amongst medical students following major earthquakes.

    Science.gov (United States)

    Carter, Frances; Bell, Caroline; Ali, Anthony; McKenzie, Janice; Boden, Joseph M; Wilkinson, Timothy; Bell, Caroline

    2016-05-06

    To identify predictors of self-reported psychological resilience amongst medical students following major earthquakes in Canterbury in 2010 and 2011. Two hundred and fifty-three medical students from the Christchurch campus, University of Otago, were invited to participate in an electronic survey seven months following the most severe earthquake. Students completed the Connor-Davidson Resilience Scale, the Depression, Anxiety and Stress Scale, the Post-traumatic Stress Disorder Checklist, the Work and Adjustment Scale, and the Eysenck Personality Questionnaire. Likert scales and other questions were also used to assess a range of variables including demographic and historical variables (eg, self-rated resilience prior to the earthquakes), plus the impacts of the earthquakes. The response rate was 78%. Univariate analyses identified multiple variables that were significantly associated with higher resilience. Multiple linear regression analyses produced a fitted model that was able to explain 35% of the variance in resilience scores. The best predictors of higher resilience were: retrospectively-rated personality prior to the earthquakes (higher extroversion and lower neuroticism); higher self-rated resilience prior to the earthquakes; not being exposed to the most severe earthquake; and less psychological distress following the earthquakes. Psychological resilience amongst medical students following major earthquakes was able to be predicted to a moderate extent.

  13. Gas and Dust Phenomena of Mega-earthquakes and the Cause

    Science.gov (United States)

    Yue, Z.

    2013-12-01

    A mega-earthquake suddenly releases a large to extremely large amount of kinetic energy within a few tens to two hundred seconds and over distances of ten to hundreds of kilometres in the Earth's crust and on the ground surface. It also generates seismic waves that can be received globally and co-seismic ground damage such as co-seismic ruptures and landslides. However, such vast, dramatic and devastating kinetic actions in the Earth's crustal rocks and on the ground soils cannot be known or predicted a few weeks, days, hours, or minutes before they happen. Although seismologists can develop and use seismometers to report the locations and magnitudes of earthquakes within minutes of their occurrence, they cannot predict earthquakes at present. Therefore, damaging earthquakes have caused and will continue to cause huge disasters, fatalities and injuries to human beings. This problem may indicate that it is necessary to re-examine the cause of mega-earthquakes in addition to the conventional cause of active-fault elastic rebound. In the last ten years, many mega-earthquakes occurred in China and around the Pacific Ocean and caused many casualties and devastating damage to environments. The author will give a brief review of the impacts of the mega-earthquakes of recent years. He will then present many gas- and dust-related phenomena associated with the sudden occurrences of these mega-earthquakes. They include the 2001 Kunlunshan Earthquake M8.1, 2008 Wenchuan Earthquake M8.0 and the 2010 Yushu Earthquake M7.1 in China, the 2010 Haiti Earthquake M7.0, the 2010 Mexicali Earthquake M7.2, the 2010 Chile Earthquake M8.8, the 2011 Christchurch Earthquake M6.3 and the 2011 Japan Earthquake M9.0 around the Pacific Ocean. He will discuss the cause of these gas- and dust-related phenomena. 
He will use these phenomena and their common cause to show that the earthquakes were caused by the rapid migration and expansion of highly compressed and

  14. Vrancea earthquakes. Courses for specific actions to mitigate seismic risk

    International Nuclear Information System (INIS)

    Marmureanu, Gheorghe; Marmureanu, Alexandru

    2005-01-01

    Earthquakes in the Carpathian-Pannonian region are confined to the crust, except in the Vrancea zone, where earthquakes with focal depths down to 200 km occur. For example, the ruptured area migrated from 150 km to 180 km depth (November 10, 1940, Mw = 7.7), from 90 km to 110 km (March 4, 1977, Mw = 7.4), from 130 km to 150 km (August 30, 1986, Mw = 7.1) and from 70 km to 90 km (May 30, 1990, Mw = 6.9). The depth interval between 110 km and 130 km has remained unruptured since October 26, 1802, when the strongest earthquake in this part of Central Europe occurred. Its magnitude is assumed to be Mw = 7.9 - 8.0, and this depth interval is a natural candidate for the next strong Vrancea event. While no country in the world is entirely safe, the lack of capacity to limit the impact of seismic hazards remains a major burden for all countries, and while the world has witnessed an exponential increase in human and material losses due to natural disasters caused by earthquakes, there is a need to reverse these trends through seismic risk mitigation for future events. The main courses of specific action to mitigate the seismic risk posed by strong deep Vrancea earthquakes should be considered key development actions: - An early warning system for industrial facilities. Early warning is more than a technological instrument to detect, monitor and submit warnings. It should become part of a management information system for decision-making in the context of national institutional frameworks for disaster management, and part of national and local strategies and programmes for risk mitigation; - A short- and long-term prediction programme for strong Vrancea earthquakes; - A seismic hazard map of Romania. A wrong assessment of the seismic hazard can lead to dramatic situations such as those in Bucharest or Kobe. 
Before the 1977 Vrancea earthquake, the city of Bucharest was designed for intensity I = VII (MMI), whereas the real intensity was I = IX1/2-X (MMI); - Seismic microzonation of large populated

  15. Earthquake early warning using P-waves that appear after initial S-waves

    Science.gov (United States)

    Kodera, Y.

    2017-12-01

    As countermeasures against underprediction for large earthquakes with finite faults and overprediction for multiple simultaneous earthquakes, Hoshiba (2013), Hoshiba and Aoki (2015), and Kodera et al. (2016) proposed earthquake early warning (EEW) methods that directly predict ground motion by computing the wave propagation of observed ground motion. These methods are expected to predict ground motion with high accuracy even for complicated scenarios because they do not need source-parameter estimation. On the other hand, there is room for improvement in their rapidity because they predict strong motion mainly from the observation of S-waves and do not explicitly use P-wave information available before the S-waves. In this research, we propose a real-time P-wave detector to incorporate P-wave information into these wavefield-estimation approaches. P-waves within a few seconds of the P-onsets are commonly used in many existing EEW methods. In addition, we focus on P-waves that may appear in the later part of seismic waves. Kurahashi and Irikura (2013) mentioned that P-waves radiated from strong motion generation areas (SMGAs) were recognizable after the S-waves of the initial rupture point in the 2011 off the Pacific coast of Tohoku earthquake (Mw 9.0) (the Tohoku-oki earthquake). Detecting these P-waves would enhance the rapidity of prediction for the peak ground motion generated by SMGAs. We constructed a real-time P-wave detector that uses a polarity analysis. Using acceleration records in boreholes of KiK-net (band-pass filtered around 0.5-10 Hz with site amplification correction), the P-wave detector performed principal component analysis with a sliding window of 4 s and calculated P-filter values (e.g. Ross and Ben-Zion, 2014). The application to the Tohoku-oki earthquake (Mw 9.0) showed that (1) peaks of the P-filter that corresponded to SMGAs appeared at several stations located near SMGAs and (2) real-time seismic intensities (Kunugi et al
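As a simplified stand-in for the PCA-based P-filter described above, one can track the ratio of vertical to total three-component energy in a sliding window, since steeply incident P-waves carry dominantly vertical particle motion while S-waves carry more horizontal energy. This is an illustrative proxy, not the authors' actual polarity analysis:

```python
def vertical_energy_ratio(z, n, e, win):
    """Sliding-window ratio of vertical to total three-component
    energy. P-wave windows push the ratio toward 1, S-wave windows
    toward 0; a crude proxy for a PCA-based P-filter."""
    out = []
    for i in range(win, len(z) + 1):
        ez = sum(v * v for v in z[i - win:i])
        eh = (sum(v * v for v in n[i - win:i])
              + sum(v * v for v in e[i - win:i]))
        total = ez + eh
        out.append(ez / total if total > 0 else 0.0)
    return out

# Vertically polarised motion -> ratio near 1; horizontal -> near 0.
p_like = vertical_energy_ratio([1.0] * 100, [0.1] * 100, [0.1] * 100, 20)
s_like = vertical_energy_ratio([0.1] * 100, [1.0] * 100, [1.0] * 100, 20)
```

A peak in such a ratio appearing within the S-wave coda is the kind of signature the abstract attributes to P-waves radiated from later-rupturing SMGAs.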

  16. Addressing earthquakes strong ground motion issues at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Wong, I.G.; Silva, W.J.; Stark, C.L.; Jackson, S.; Smith, R.P.

    1991-01-01

    In the course of reassessing seismic hazards at the Idaho National Engineering Laboratory (INEL), several key issues have been raised concerning the effects of the earthquake source and site geology on potential strong ground motions that might be generated by a large earthquake. The design earthquake for the INEL is an approximate moment magnitude (Mw) 7 event that may occur on the southern portion of the Lemhi fault, a Basin and Range normal fault that is located on the northwestern boundary of the eastern Snake River Plain and the INEL, within 10 to 27 km of several major facilities. Because these facilities lie at close distances to a large earthquake and generally along strike of the causative fault, the effects of source rupture dynamics (e.g., directivity) could be critical in enhancing potential ground shaking at the INEL. An additional source issue that has been addressed is the value of stress drop to use in ground motion predictions. In terms of site geology, it has been questioned whether the interbedded volcanic stratigraphy beneath the ESRP and the INEL attenuates ground motions to a greater degree than a typical rock site in the western US. These three issues have been investigated employing a stochastic ground motion methodology which incorporates the Band-Limited-White-Noise source model for both a point source and a finite fault, random vibration theory, and an equivalent-linear approach to model soil response.
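The source term of the Band-Limited-White-Noise methodology mentioned above is conventionally an omega-squared (Brune) spectrum. A point-source sketch follows, with the constant C reduced to a placeholder that in practice lumps radiation-pattern, free-surface and geometric-spreading factors:

```python
import math

def omega_squared_accel_spectrum(f, m0_nm, fc_hz, c=1.0):
    """Omega-squared (Brune) point-source acceleration spectrum:
    A(f) = C * M0 * (2*pi*f)**2 / (1 + (f/fc)**2).
    Rises as f**2 below the corner frequency fc and is flat above
    it; C is a placeholder for radiation-pattern, free-surface and
    geometric-spreading factors (set to 1 here)."""
    return c * m0_nm * (2.0 * math.pi * f) ** 2 / (1.0 + (f / fc_hz) ** 2)
```

The stress drop issue raised in the abstract enters through fc: for a fixed moment, a higher stress drop implies a higher corner frequency and hence a higher flat level of the acceleration spectrum.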

  17. Geoethical suggestions for reducing risk of next (not only strong) earthquakes

    Science.gov (United States)

    Nemec, Vaclav

    2013-04-01

    Three relatively recent examples of earthquakes can be used as a background for bringing geoethical views into any prediction accompanied by a risk analysis. The L'Aquila earthquake (Italy, 2009): L'Aquila was largely destroyed by earthquakes in 1315, 1319, 1452, 1461, 1501, 1646, 1703 (until that time altogether about 3000 victims) and 1786 (about 6000 victims of this event alone). The city was rebuilt and remained stable until October 2008, when tremors began again. From January 1 through April 5, 2009, an additional 304 tremors were reported. When, after measuring increased levels of radon emitted from the ground, a local citizen (for many years working for the Italian National Institute of Astrophysics) predicted a major earthquake on Italian television, he was accused of being alarmist. Italy's National Commission for Prediction and Prevention of Major Risks met in L'Aquila for one hour on March 31, 2009, without really evaluating and characterising the risks that were present. On April 6 a 6.3-magnitude earthquake struck L'Aquila and nearby towns, killing 309 people and injuring more than 1,500. The quake also destroyed roughly 20,000 buildings, temporarily displacing another 65,000 people. In July 2010, prosecutor Fabio Picuti charged the Commission members with manslaughter and negligence for failing to warn the public of the impending risk. Many international organizations joined the chorus of criticism, wrongly interpreting the accusation and sentence at the first stage as a problem of the impossibility of predicting earthquakes. - The Eyjafjallajokull volcano eruption (Iceland, 2010) is a reminder that in our globalized, interconnected world, because of the increased sensitivity of new technology, even a relatively small natural disaster may cause an unexpected range of problems. - Earthquake and tsunami (Japan, 2011): the most powerful known earthquake ever to have hit Japan, on March 11. Whereas the earthquake proper, with a magnitude of 9.0, has caused a minimum of

  18. Earthquakes trigger the loss of groundwater biodiversity

    Science.gov (United States)

    Galassi, Diana M. P.; Lombardo, Paola; Fiasca, Barbara; di Cioccio, Alessia; di Lorenzo, Tiziana; Petitta, Marco; di Carlo, Piero

    2014-09-01

    Earthquakes are among the most destructive natural events. The 6 April 2009, 6.3-Mw earthquake in L'Aquila (Italy) markedly altered the karstic Gran Sasso Aquifer (GSA) hydrogeology and geochemistry. The GSA groundwater invertebrate community is mainly comprised of small-bodied, colourless, blind microcrustaceans. We compared abiotic and biotic data from two pre-earthquake and one post-earthquake complete but non-contiguous hydrological years to investigate the effects of the 2009 earthquake on the dominant copepod component of the obligate groundwater fauna. Our results suggest that the massive earthquake-induced aquifer strain biotriggered a flushing of groundwater fauna, with a dramatic decrease in subterranean species abundance. Population turnover rates appeared to have crashed, no longer replenishing the long-standing communities from aquifer fractures, and the aquifer became almost totally deprived of animal life. Groundwater communities are notorious for their low resilience. Therefore, any major disturbance that negatively impacts survival or reproduction may lead to local extinction of species, most of them being the only survivors of phylogenetic lineages extinct at the Earth surface. Given the ecological key role played by the subterranean fauna as decomposers of organic matter and "ecosystem engineers", we urge more detailed, long-term studies on the effect of major disturbances to groundwater ecosystems.

  19. Stress drops of induced and tectonic earthquakes in the central United States are indistinguishable.

    Science.gov (United States)

    Huang, Yihe; Ellsworth, William L; Beroza, Gregory C

    2017-08-01

    Induced earthquakes currently pose a significant hazard in the central United States, but there is considerable uncertainty about the severity of their ground motions. We measure stress drops of 39 moderate-magnitude induced and tectonic earthquakes in the central United States and eastern North America. Induced earthquakes, more than half of which are shallower than 5 km, show a comparable median stress drop to tectonic earthquakes in the central United States that are dominantly strike-slip but a lower median stress drop than that of tectonic earthquakes in eastern North America that are dominantly reverse-faulting. This suggests that ground motion prediction equations developed for tectonic earthquakes can be applied to induced earthquakes if the effects of depth and faulting style are properly considered. Our observation leads to the notion that, similar to tectonic earthquakes, induced earthquakes are driven by tectonic stresses.
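Stress drops of the kind measured here are commonly estimated from the seismic moment and the spectral corner frequency via the Brune circular-crack model; a sketch under that assumption (the authors' actual measurement procedure may differ):

```python
def moment_from_mw(mw):
    """Seismic moment in N*m from moment magnitude:
    log10(M0) = 1.5*Mw + 9.1."""
    return 10.0 ** (1.5 * mw + 9.1)

def brune_stress_drop(m0_nm, fc_hz, beta_ms, k=0.3724):
    """Brune (1970) circular-crack stress drop in Pa:
    source radius r = k*beta/fc, delta_sigma = 7*M0 / (16*r**3)."""
    r = k * beta_ms / fc_hz
    return 7.0 * m0_nm / (16.0 * r ** 3)

# Mw 4.0 event, 2 Hz corner frequency, 3.5 km/s shear velocity:
ds = brune_stress_drop(moment_from_mw(4.0), 2.0, 3500.0)  # ~2 MPa
```

The cubic dependence on corner frequency is why stress drop estimates are notoriously sensitive to how fc is picked, which is part of the uncertainty in induced-earthquake ground motions noted above.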

  20. Geological and historical evidence of irregular recurrent earthquakes in Japan.

    Science.gov (United States)

    Satake, Kenji

    2015-10-28

    Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres releasing strains accumulated from decades to centuries of plate motions. Assuming a simple 'characteristic earthquake' model that similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes including geological traces from giant (M∼9) earthquakes indicate a variety of sizes and recurrence intervals of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that the average recurrence interval of great earthquakes is approximately 100 years, but the tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake with an interval of approximately 600 years is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence intervals of a few hundred years and a few thousand years had been recognized, but studies show that the recent three Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for the long-term forecast, and several attempts such as use of geological data for the evaluation of future earthquake probabilities or the estimation of maximum earthquake size in each subduction zone are being conducted by government committees. © 2015 The Author(s).
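The committee-style occurrence probabilities mentioned above are typically conditional probabilities under a renewal model. A sketch assuming a lognormal recurrence distribution with illustrative parameters, not the committee's actual model:

```python
import math

def lognormal_cdf(t, median, sigma_ln):
    """CDF of a lognormal recurrence-time distribution,
    parameterised by its median and the std. dev. of log time."""
    if t <= 0.0:
        return 0.0
    z = (math.log(t) - math.log(median)) / sigma_ln
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def conditional_probability(elapsed, horizon, median, sigma_ln=0.5):
    """P(event within the next `horizon` years, given `elapsed`
    years of quiescence), under a lognormal renewal model."""
    f_t = lognormal_cdf(elapsed, median, sigma_ln)
    f_t2 = lognormal_cdf(elapsed + horizon, median, sigma_ln)
    return (f_t2 - f_t) / (1.0 - f_t)

# With a 100-year median interval, a 30-year window is far more
# likely to contain the event after 150 quiet years than after 50:
p_early = conditional_probability(50.0, 30.0, 100.0)
p_late = conditional_probability(150.0, 30.0, 100.0)
```

The abstract's point is that such calculations presuppose a single characteristic interval and size; when geological data show variable recurrence (100 vs. 400 years off Hokkaido), the choice of distribution and parameters dominates the forecast.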

  1. Earthquake occurrence as stochastic event: (1) theoretical models

    Energy Technology Data Exchange (ETDEWEB)

    Basili, A.; Basili, M.; Cagnetti, V.; Colombino, A.; Jorio, V.M.; Mosiello, R.; Norelli, F.; Pacilio, N.; Polinari, D.

    1977-01-01

    The present article aims to link the stochastic approach to the description of earthquake processes suggested by Lomnitz with the experimental evidence obtained by Schenkova that the time distribution of some earthquake occurrences is better described by a negative binomial distribution than by a Poisson distribution. The final purpose of the stochastic approach might be a new way of labelling a given area in terms of seismic risk.
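
    The Poisson-versus-negative-binomial distinction above comes down to overdispersion: a Poisson model forces the variance of event counts to equal the mean, while clustered seismicity typically shows variance greater than the mean, which a negative binomial distribution can absorb. A minimal method-of-moments sketch (the yearly counts are invented for illustration, not data from the paper):

```python
from statistics import mean, pvariance

# Hypothetical yearly earthquake counts for one region (illustrative only).
counts = [3, 0, 1, 7, 2, 0, 0, 5, 1, 9, 2, 0, 1, 6, 3]

m = mean(counts)        # sample mean
v = pvariance(counts)   # population variance

# A Poisson model implies variance == mean; variance > mean signals
# overdispersion, which favours a negative binomial description.
overdispersed = v > m

# Method-of-moments negative binomial fit:
#   variance = m + m**2 / r  =>  r = m**2 / (v - m),  p = r / (r + m)
r = m * m / (v - m)
p = r / (r + m)

print(f"mean={m:.2f} variance={v:.2f} overdispersed={overdispersed}")
print(f"negative binomial: r={r:.2f}, p={p:.2f}")
```

    If the counts came out with `v <= m`, the moment estimate for `r` would be undefined and a plain Poisson model would already be adequate.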

  2. LASSCI2009.2: layered earthquake rupture forecast model for central Italy, submitted to the CSEP project

    Directory of Open Access Journals (Sweden)

    Francesco Visini

    2010-11-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) selected Italy as a testing region for probabilistic earthquake forecast models in October 2008. The model we have submitted for the two medium-term forecast periods of 5 and 10 years (from 2009) is a time-dependent, geologically based earthquake rupture forecast that is defined for central Italy only (11-15° E; 41-45° N). The model takes into account three separate layers of seismogenic sources: background seismicity; seismotectonic provinces; and individual faults that can produce major earthquakes (seismogenic boxes). For CSEP testing purposes, the background seismicity layer covered a range of magnitudes from 5.0 to 5.3, and the seismicity rates were obtained by truncated Gutenberg-Richter relationships for cells centred on the CSEP grid. The seismotectonic provinces layer then returned the expected rates of medium-to-large earthquakes following a traditional Cornell-type approach. Finally, for the seismogenic boxes layer, the rates were based on the geometry and kinematics of the faults, to which different earthquake recurrence models were assigned, ranging from pure Gutenberg-Richter behaviour to characteristic events, with the intermediate behaviour named the hybrid model. The results for different magnitude ranges highlight the contribution of each of the three layers to the total computation. The expected rates for M > 6.0 on April 1, 2009 (thus computed before the L'Aquila, 2009, Mw = 6.3 earthquake) are of particular interest. They showed local maxima in the two seismogenic-box sources of Paganica and Sulmona, one of which was activated by the L'Aquila earthquake of April 6, 2009. Earthquake rates as of August 1, 2009 (now under test) also showed a maximum close to the Sulmona source for Mw ~6.5; significant seismicity rates (10^-4 to 10^-3 in 5 years) for destructive events (magnitude up to 7.0) were located in other individual sources identified as being capable of such
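
    The background-seismicity layer described above draws its rates from truncated Gutenberg-Richter relationships. As a hedged illustration, the annual rate of events in a magnitude bin follows from differencing the cumulative relation N(≥m) = 10^(a − bm); the a- and b-values below are invented placeholders, not the LASSCI model's coefficients:

```python
def gr_bin_rate(a: float, b: float, m_lo: float, m_hi: float) -> float:
    """Annual rate of events with magnitude in [m_lo, m_hi), obtained by
    differencing the cumulative Gutenberg-Richter relation N(>=m) = 10**(a - b*m)."""
    return 10 ** (a - b * m_lo) - 10 ** (a - b * m_hi)

# Hypothetical cell with a = 4.0 and b = 1.0, over the 5.0-5.3 magnitude
# range used for the background layer (coefficients are illustrative only):
rate = gr_bin_rate(4.0, 1.0, 5.0, 5.3)
print(f"expected events per year in [5.0, 5.3): {rate:.4f}")
```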

  3. Thermal Radiation Anomalies Associated with Major Earthquakes

    Science.gov (United States)

    Ouzounov, Dimitar; Pulinets, Sergey; Kafatos, Menas C.; Taylor, Patrick

    2017-01-01

    Recent developments in remote sensing methods for Earth satellite data analysis contribute to our understanding of earthquake-related thermal anomalies. It was realized that the thermal heat fluxes over areas of earthquake preparation are a result of air ionization by radon (and other gases) and consequent water vapor condensation on newly formed ions. Latent heat (LH) is released as a result of this process and leads to the formation of local thermal radiation anomalies (TRA) known as OLR (outgoing longwave radiation; Ouzounov et al., 2007). We compare the LH energy, obtained by integrating the surface latent heat flux (SLHF) over area and time, with the released energies associated with these events. Extended studies of the TRA using data from the most recent major earthquakes allowed us to establish their main morphological features. It was also established that the TRA are part of a more complex chain of short-term pre-earthquake processes, which is explained within the framework of lithosphere-atmosphere coupling.

  4. Identification of radon anomalies related to earthquakes

    International Nuclear Information System (INIS)

    Ozdas, M.; Inceoglu, F.; Rahman, C.; Yaprak, G.

    2009-01-01

    Out of the many proposed earthquake precursors, temporal radon variation in soil is classified as one of the few promising geochemical signals that may be used for earthquake prediction. However, to use radon variation in soil gas as a reliable earthquake precursor, it must be realized that radon changes are controlled not only by deeper phenomena such as earthquakes, but also by meteorological parameters such as precipitation, barometric pressure and air temperature. Further studies are required to differentiate changes in the measured radon concentration caused by tectonic disturbances from those caused by meteorological parameters. In the current study, temporal radon variations in soil gas along active faults in Alasehir of the Gediz Graben System have been continuously monitored by LR-115 nuclear track detectors for two years. Additionally, the meteorological parameters, such as barometric pressure, rainfall and air temperature at the monitoring site, have been observed during the same period. Accordingly, regression analyses have been applied to the collected data to distinguish radon anomalies due to seismic activity from those due to meteorological conditions.

  5. LONG-TERM CULTURAL IMPACTS OF DISASTER DECISION-MAKING: The Case of Post Earthquake Reconstruction in Marathwada, India

    Directory of Open Access Journals (Sweden)

    Rohit Jigyasu

    2013-11-01

    Emergency situations are special since they present decision makers with a context characterized by extraordinary constraints on resources, the need for urgent action and a critical psychosocial state that is markedly different from the normal situation. However, actions taken under these extraordinary circumstances can have a profound bearing on the long-term recovery of a community and its heritage. This paper considers the critical aspects of decision-making in emergency situations that need to be taken into account for sustainable long-term recovery of cultural heritage. It is difficult, however, to judge these essential considerations beforehand without evaluating the impacts of the decisions in hindsight. They are therefore illustrated through a case study of post-earthquake reconstruction in Marathwada, India, by assessing the long-term impact of rehabilitation policies formulated in the immediate aftermath of the earthquake. Patterns of adaptation and change in these areas demonstrate how small decisions taken during an emergency can have wider socio-economic and physical implications. These cases also show the importance of understanding the local context, especially local vulnerabilities as well as capacities, skills and resources, while making decisions. They also emphasize the necessity and ways of engaging various stakeholders, especially the local community, not as passive recipients but as important actors in the decision-making process. These considerations are significant for conservation professionals making decisions during emergencies, especially with regard to immediate protection, repairs and long-term recovery of cultural heritage, while we largely remain at the periphery of the reconstruction process.

  6. Multi-Directional Seismic Assessment of Historical Masonry Buildings by Means of Macro-Element Modelling: Application to a Building Damaged during the L’Aquila Earthquake (Italy)

    Directory of Open Access Journals (Sweden)

    Francesco Cannizzaro

    2017-11-01

    The experience of the recent earthquakes in Italy caused a shocking impact in terms of loss of human life and damage to buildings. In particular, when it comes to ancient constructions, their cultural and historical value overlaps with the economic and social one. Among historical structures, churches have been the object of several studies which identified the main characteristics of their seismic response and the most probable collapse mechanisms. More rarely, academic studies have been devoted to ancient palaces, since they often exhibit an irregular and complicated arrangement of resisting elements, which makes their response very difficult to predict. In this paper, a palace located in L’Aquila, severely damaged by the seismic event of 2009, is the object of a detailed study. A historical reconstruction of past strengthening interventions as well as a detailed geometric survey was performed to implement detailed numerical models of the structure. Both global and local models are considered, and static nonlinear analyses are performed considering the influence of the input direction on the seismic vulnerability of the building. The damage pattern predicted by the numerical models is compared with that observed after the earthquake. The seismic vulnerability assessments are performed in terms of ultimate peak ground acceleration (PGA) using capacity curves and the Italian code spectrum. The results are compared in terms of ultimate ductility demand evaluated by performing nonlinear dynamic analyses with the recorded seismic input of the L’Aquila earthquake.

  7. Earthquake Signal Visible in GRACE Data

    Science.gov (United States)

    2005-01-01

    [Figure 1 removed for brevity; see original site.] The figure shows the effect of the December 2004 great Sumatra earthquake on the Earth's gravity field as observed by GRACE. The signal is expressed in terms of the relative acceleration of the two GRACE satellites, in this case a few nanometers per second squared, or about one billionth of the acceleration we experience every day at the Earth's surface. GRACE observations show comparable signals in the region of the earthquake. Other natural variations are also apparent in the expected places, whereas no other significant change would be expected in the region of the earthquake. GRACE, twin satellites launched in March 2002, are making detailed measurements of Earth's gravity field which will lead to discoveries about gravity and Earth's natural systems. These discoveries could have far-reaching benefits to society and the world's population.

  8. Global risk of big earthquakes has not recently increased.

    Science.gov (United States)

    Shearer, Peter M; Stark, Philip B

    2012-01-17

    The recent elevated rate of large earthquakes has fueled concern that the underlying global rate of earthquake activity has increased, which would have important implications for assessments of seismic hazard and our understanding of how faults interact. We examine the timing of large (magnitude M≥7) earthquakes from 1900 to the present, after removing local clustering related to aftershocks. The global rate of M≥8 earthquakes has been at a record high roughly since 2004, but rates have been almost as high before, and the rate of smaller earthquakes is close to its historical average. Some features of the global catalog are improbable in retrospect, but so are some features of most random sequences--if the features are selected after looking at the data. For a variety of magnitude cutoffs and three statistical tests, the global catalog, with local clusters removed, is not distinguishable from a homogeneous Poisson process. Moreover, no plausible physical mechanism predicts real changes in the underlying global rate of large events. Together these facts suggest that the global risk of large earthquakes is no higher today than it has been in the past.

  9. Earthquakes: no danger for deep underground nuclear waste repositories

    International Nuclear Information System (INIS)

    2010-03-01

    On the Earth, the continental plates are steadily moving. Principally at the plate boundaries, such shifts produce stresses which are released in the form of earthquakes. The higher the built-up energy, the more violent the shaking. Earthquakes have accompanied mankind since very ancient times and they disturb the population. To date, nobody is able to predict where and when they will take place, but there are regions of the Earth where, due to their geological situation, the occurrence of earthquakes is more probable than elsewhere. The impact of a very strong earthquake on structures at the Earth's surface depends on several factors. Besides the ground structure, the density of buildings and the construction styles and materials used play an important role. Construction-related technical measures can improve the safety of buildings and, together with correct behaviour of the people concerned, save many lives. Earthquakes are well known in Switzerland. Here, the stresses are due to the collision of the African and European continental plates that created the Alps. The impact of an earthquake is more limited underground than at the Earth's surface. There is no danger for deep underground repositories.

  10. Impact of earthquake source complexity and land elevation data resolution on tsunami hazard assessment and fatality estimation

    Science.gov (United States)

    Muhammad, Ario; Goda, Katsuichiro

    2018-03-01

    This study investigates the impact of model complexity in source characterization and digital elevation model (DEM) resolution on the accuracy of tsunami hazard assessment and fatality estimation through a case study in Padang, Indonesia. Two types of earthquake source models, i.e. complex and uniform slip models, are adopted by considering three resolutions of DEMs, i.e. 150 m, 50 m, and 10 m. For each of the three grid resolutions, 300 complex source models are generated using new statistical prediction models of earthquake source parameters developed from extensive finite-fault models of past subduction earthquakes, whilst 100 uniform slip models are constructed with variable fault geometry without slip heterogeneity. The results highlight that significant changes to tsunami hazard and fatality estimates are observed with regard to earthquake source complexity and grid resolution. Coarse resolution (i.e. 150 m) leads to inaccurate tsunami hazard prediction and fatality estimation, whilst 50-m and 10-m resolutions produce similar results. However, velocity and momentum flux are sensitive to the grid resolution and hence, at least 10-m grid resolution needs to be implemented when considering flow-based parameters for tsunami hazard and risk assessments. In addition, the results indicate that the tsunami hazard parameters and fatality number are more sensitive to the complexity of earthquake source characterization than the grid resolution. Thus, the uniform models are not recommended for probabilistic tsunami hazard and risk assessments. Finally, the findings confirm that uncertainties of tsunami hazard level and fatality in terms of depth, velocity and momentum flux can be captured and visualized through the complex source modeling approach. From tsunami risk management perspectives, this indeed creates big data, which are useful for making effective and robust decisions.

  11. Accounting for orphaned aftershocks in the earthquake background rate

    Science.gov (United States)

    Van Der Elst, Nicholas

    2017-01-01

    Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.
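
    The heavy tail of Omori's law that makes orphaned aftershocks hard to separate from background can be made concrete by integrating the modified Omori rate n(t) = K/(t + c)^p. A sketch with invented parameters (K, c and p here are illustrative, not values fitted in the study):

```python
def omori_count(K: float, c: float, p: float, t1: float, t2: float) -> float:
    """Expected number of aftershocks between t1 and t2 (days after the
    main shock) under the modified Omori law n(t) = K / (t + c)**p, p != 1."""
    return K / (p - 1.0) * ((t1 + c) ** (1.0 - p) - (t2 + c) ** (1.0 - p))

K, c, p = 100.0, 0.1, 1.1                       # illustrative parameters
total = omori_count(K, c, p, 0.0, 10000.0)      # whole sequence (~27 years)
late = omori_count(K, c, p, 365.0, 10000.0)     # events after the first year
late_fraction = late / total
print(f"fraction of aftershocks arriving after one year: {late_fraction:.2f}")
```

    Even though the rate has decayed enormously by day 365, a substantial fraction of the sequence's events arrive after that point; over long durations those late events are easily misread as background seismicity.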

  12. Understanding dynamic friction through spontaneously evolving laboratory earthquakes.

    Science.gov (United States)

    Rubino, V; Rosakis, A J; Lapusta, N

    2017-06-29

    Friction plays a key role in how ruptures unzip faults in the Earth's crust and release waves that cause destructive shaking. Yet dynamic friction evolution is one of the biggest uncertainties in earthquake science. Here we report on novel measurements of evolving local friction during spontaneously developing mini-earthquakes in the laboratory, enabled by our ultrahigh-speed full-field imaging technique. The technique captures the evolution of displacements, velocities and stresses of dynamic ruptures, whose rupture speeds range from sub-Rayleigh to supershear. The observed friction has a complex evolution, featuring initial velocity strengthening followed by substantial velocity weakening. Our measurements are consistent with rate-and-state friction formulations supplemented with flash heating, but not with widely used slip-weakening friction laws. This study develops a new approach for measuring the local evolution of dynamic friction and has important implications for understanding earthquake hazard, since laws governing the frictional resistance of faults are vital ingredients in physically based predictive models of the earthquake source.

  13. a Collaborative Cyberinfrastructure for Earthquake Seismology

    Science.gov (United States)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Lefebvre, S.; Steed, R.

    2013-12-01

    One of the challenges in real-time seismology is the prediction of an earthquake's impact. This is particularly true for moderate earthquakes (around magnitude 6) located close to urbanised areas, where the slightest uncertainty in event location, depth or magnitude estimates, and/or misevaluation of propagation characteristics, site effects and building vulnerability, can dramatically change the impact scenario. The Euro-Med Seismological Centre (EMSC) has developed a cyberinfrastructure to collect observations from eyewitnesses in order to provide in-situ constraints on actual damage. This cyberinfrastructure takes advantage of the natural convergence of earthquake eyewitnesses on the EMSC website (www.emsc-csem.org), the second global earthquake information website, within tens of seconds of the occurrence of a felt event. It includes classical crowdsourcing tools, such as online questionnaires available in 39 languages, and tools to collect geolocated pictures. It also comprises information derived from real-time analysis of the traffic on the EMSC website, a method named flashsourcing: in the case of a felt earthquake, eyewitnesses reach the EMSC website within tens of seconds to find out the cause of the shaking they have just been through. By analysing their geographical origin through their IP addresses, we automatically detect felt earthquakes and in some cases map the damaged areas through the loss of Internet visitors. We recently implemented a Quake Catcher Network (QCN) server in collaboration with Stanford University and the USGS to collect ground motion records made by volunteers, and we are also involved in a project to detect earthquakes from ground motion sensors in smartphones. Strategies have been developed for several social media (Facebook, Twitter...), not only to distribute earthquake information but also to engage with citizens and optimise data collection. A smartphone application is currently under development. We will present an overview of this

  14. Post-Earthquake Debris Management — An Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction works, can place significant challenges on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as material for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach that takes into account the different criteria related to the operation execution is proposed by highlighting the key issues concerning the handling of the construction

  15. Design and investigation of a continuous radon monitoring network for earthquake precursory process in Great Tehran

    International Nuclear Information System (INIS)

    Negarestani, A.; Namvaran, M.; Hashemi, S.M.; Shahpasandzadeh, M.; Fatemi, S.J.; Alavi, S.A.; Mokhtari, M.

    2014-01-01

    Earthquakes usually occur after preliminary anomalies in the physical and chemical characteristics of the environment and the Earth's interior. The construction of models which can explain these anomalies prompts scientists to monitor geophysical and geochemical characteristics in seismic areas for earthquake prediction. A review of the studies done so far indicates that radon gas is more sensitive than other geo-gases as a precursor. Based on previous research, radon is, in temporal terms, a short-term precursor of earthquakes. There are several empirical equations relating earthquake magnitude to the effective distance over which radon concentration variations are observed. In this work, an algorithm based on the Dobrovolsky equation (D = 10^(0.43M)), with defined Expectation and Investigation circles, has been applied to Great Tehran. Radon concentration was measured with a RAD7 detector in more than 40 springs. The radon concentration of a spring, the spring discharge, the water temperature and the closeness of the spring to active faults have been considered as the significant factors in selecting the best springs for implementing continuous radon monitoring sites. According to these factors, thirteen springs have been selected, as follows: Bayjan, Mahallat-Hotel, Avaj, Aala, Larijan, Delir, Lavij, Ramsar, Semnan, Lavieh, Legahi, Kooteh-Koomeh and Sarein. (author)
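
    The Dobrovolsky relation quoted above, D = 10^(0.43M), gives the radius in kilometres of the zone within which precursory effects of a magnitude-M earthquake may plausibly be detected, and is presumably what defines the Expectation and Investigation circles of the monitoring algorithm. A minimal sketch:

```python
def dobrovolsky_radius_km(magnitude: float) -> float:
    """Radius (km) of the zone of possible precursory effects for a
    magnitude-M earthquake, from the Dobrovolsky relation D = 10**(0.43*M)."""
    return 10 ** (0.43 * magnitude)

# A monitoring site could, under this relation, register precursors of a
# magnitude-6 event out to roughly 380 km:
radius_m6 = dobrovolsky_radius_km(6.0)
print(f"Dobrovolsky radius for M6: {radius_m6:.0f} km")
```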

  16. Updated earthquake catalogue for seismic hazard analysis in Pakistan

    Science.gov (United States)

    Khan, Sarfraz; Waseem, Muhammad; Khan, Muhammad Asif; Ahmed, Waqas

    2018-03-01

    A reliable and homogenized earthquake catalogue is essential for seismic hazard assessment in any area. This article describes the compilation and processing of an updated earthquake catalogue for Pakistan. The earthquake catalogue compiled in this study for the region (the quadrangle bounded by the geographical limits 40-83° E and 20-40° N) includes 36,563 earthquake events, reported with moment magnitudes (Mw) of 4.0-8.3 and spanning from 25 AD to 2016. Relationships are developed between moment magnitude and the body-wave and surface-wave magnitude scales to unify the catalogue in terms of Mw. The catalogue includes earthquakes from Pakistan and neighbouring countries to minimize the effects of geopolitical boundaries in seismic hazard assessment studies. Earthquakes reported by local and international agencies as well as individual catalogues are included. The proposed catalogue is further used to obtain the magnitude of completeness after removal of dependent events using four different algorithms. Finally, seismicity parameters of the seismic sources are reported, and recommendations are made for seismic hazard assessment studies in Pakistan.
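
    Homogenizing a catalogue "in terms of magnitude Mw", as described above, means applying regression relations that map body-wave (mb) and surface-wave (Ms) magnitudes onto the moment-magnitude scale. A hedged sketch using generic linear conversions; the coefficients below are illustrative placeholders, not the relationships actually developed in the paper:

```python
def to_moment_magnitude(value: float, scale: str) -> float:
    """Convert a reported magnitude to Mw with a linear relation
    Mw = slope * m + intercept. Coefficients are illustrative only."""
    conversions = {
        "mw": (1.00, 0.00),   # already moment magnitude
        "mb": (1.08, -0.35),  # hypothetical body-wave relation
        "ms": (0.67, 2.10),   # hypothetical surface-wave relation
    }
    slope, intercept = conversions[scale.lower()]
    return slope * value + intercept

# Unify a few reported magnitudes (all nominally 6.0 on their own scales):
unified = [to_moment_magnitude(m, s)
           for m, s in [(6.0, "mb"), (6.0, "ms"), (6.0, "mw")]]
print(unified)
```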

  17. Evaluation of Earthquake-Induced Effects on Neighbouring Faults and Volcanoes: Application to the 2016 Pedernales Earthquake

    Science.gov (United States)

    Bejar, M.; Alvarez Gomez, J. A.; Staller, A.; Luna, M. P.; Perez Lopez, R.; Monserrat, O.; Chunga, K.; Herrera, G.; Jordá, L.; Lima, A.; Martínez-Díaz, J. J.

    2017-12-01

    It has long been recognized that earthquakes change the stress in the upper crust around the fault rupture and can influence the short-term behaviour of neighbouring faults and volcanoes. Rapid estimates of these stress changes can provide the authorities managing the post-disaster situation with a useful tool to identify and monitor potential threats and to update the estimates of seismic and volcanic hazard in a region. Space geodesy is now routinely used following an earthquake to image the displacement of the ground and estimate the rupture geometry and the distribution of slip. Using the obtained source model, it is possible to evaluate the remaining moment deficit and to infer the stress changes on nearby faults and volcanoes produced by the earthquake, which can be used to identify which faults and volcanoes are brought closer to failure or activation. Although these procedures are commonly used today, the transfer of these results to the authorities managing the post-disaster situation is not straightforward, and thus their usefulness is reduced in practice. Here we propose a methodology to evaluate the potential influence of an earthquake on nearby faults and volcanoes and to create easy-to-understand maps for decision-making support after an earthquake. We apply this methodology to the Mw 7.8, 2016 Ecuador earthquake. Using Sentinel-1 SAR and continuous GPS data, we measure the coseismic ground deformation and estimate the distribution of slip. We then use this model to evaluate the moment deficit on the subduction interface and the changes of stress on the surrounding faults and volcanoes. The results are compared with the seismic and volcanic events that have occurred after the earthquake. We discuss the potential and limits of the methodology and the lessons learnt from discussions with local authorities.

  18. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    Science.gov (United States)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, various methods of fuzzy set theory, known as fuzzy mathematics, have been applied to the quantitative estimation of time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of earthquake hazard on the basis of seismicity data. By using methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively: highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in the temporal variation of seismic activity and seismic gaps can be examined, and the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity and the direct method of fuzzy pattern recognition, have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and the other is based on the fuzzy equivalence relation. (2) Quantitative estimation of earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards over different time scales can be estimated. This paper mainly deals with medium- and short-term precursors observed in Japan and China.
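
    One of the simplest tools behind fuzzy-similarity methods like the one mentioned above is a similarity measure between two fuzzy sets, e.g. the ratio of their intersection (elementwise minimum of memberships) to their union (elementwise maximum). An illustrative sketch; the membership values are invented, and this is a generic measure, not a reproduction of the paper's actual indices of seismicity:

```python
def fuzzy_similarity(a: list[float], b: list[float]) -> float:
    """Similarity of two fuzzy sets given as membership vectors in [0, 1]:
    sum of elementwise minima over sum of elementwise maxima (Jaccard-like)."""
    inter = sum(min(x, y) for x, y in zip(a, b))
    union = sum(max(x, y) for x, y in zip(a, b))
    return inter / union if union else 1.0

# Membership degrees of two periods of seismicity in the fuzzy set
# "highly active" (hypothetical values):
period_1 = [0.9, 0.8, 0.4, 0.1]
period_2 = [0.8, 0.7, 0.5, 0.2]
score = fuzzy_similarity(period_1, period_2)
print(f"fuzzy similarity: {score:.3f}")
```

    A score near 1 marks two periods whose activity patterns are nearly interchangeable, which is the kind of judgement the fuzzy-similarity method automates.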

  19. What Googling Trends Tell Us About Public Interest in Earthquakes

    Science.gov (United States)

    Tan, Y. J.; Maharjan, R.

    2017-12-01

    Previous studies have shown that immediately after large earthquakes, there is a period of increased public interest. This represents a window of opportunity for science communication and disaster relief fundraising efforts to reach more people. However, how public interest varies for different earthquakes has not been quantified systematically on a global scale. We analyze how global search interest for the term "earthquake" on Google varies following earthquakes of magnitude ≥ 5.5 from 2004 to 2016. We find that there is a spike in search interest after large earthquakes followed by an exponential temporal decay. Preliminary results suggest that the period of increased search interest scales with death toll and correlates with the period of increased media coverage. This suggests that the relationship between the period of increased public interest in earthquakes and death toll might be an effect of differences in media coverage. However, public interest never remains elevated for more than three weeks. Therefore, to take advantage of this short period of increased public interest, science communication and disaster relief fundraising efforts have to act promptly following devastating earthquakes.

  20. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society, and highlights the necessity of building and enhancing an earthquake culture. Iran was chosen as a research case study, and fifteen large earthquake disasters in Iran, spanning more than a century, were investigated and analyzed. It was found that the earthquake culture in Iran was, and still is, conditioned by many factors or parameters which are not integrated and...

  1. Non-Stationary Modelling and Simulation of Near-Source Earthquake Ground Motion

    DEFF Research Database (Denmark)

    Skjærbæk, P. S.; Kirkegaard, Poul Henning; Fouskitakis, G. N.

    1997-01-01

    This paper is concerned with modelling and simulation of near-source earthquake ground motion. Recent studies have revealed that these motions show heavy non-stationary behaviour with very low frequencies dominating parts of the earthquake sequence. Modelling and simulation of this behaviour...... by an epicentral distance of 16 km and measured during the 1979 Imperial Valley earthquake in California (U.S.A.). The results of the study indicate that while all three approaches can successfully predict near-source ground motions, the Neural Network based one gives somewhat poorer simulation results.

  2. Non-Stationary Modelling and Simulation of Near-Source Earthquake Ground Motion

    DEFF Research Database (Denmark)

    Skjærbæk, P. S.; Kirkegaard, Poul Henning; Fouskitakis, G. N.

This paper is concerned with modelling and simulation of near-source earthquake ground motion. Recent studies have revealed that these motions show heavy non-stationary behaviour with very low frequencies dominating parts of the earthquake sequence. Modelling and simulation of this behaviour...... by an epicentral distance of 16 km and measured during the 1979 Imperial Valley earthquake in California (USA). The results of the study indicate that while all three approaches can successfully predict near-source ground motions, the Neural Network based one gives somewhat poorer simulation results.

  3. Fault geometry and earthquake mechanics

    Directory of Open Access Journals (Sweden)

    D. J. Andrews

    1994-06-01

volume increment for a given slip increment becomes larger. A junction with past accumulated slip u0 is a strong barrier to earthquakes with maximum slip um < 2(P/µ)u0 = u0/50. As slip continues to occur elsewhere in the fault system, a stress concentration will grow at the old junction. A fresh fracture may occur in the stress concentration, establishing a new triple junction, and allowing continuity of slip in the fault system. The fresh fracture could provide the instability needed to explain earthquakes. Perhaps a small fraction (on the order of P/µ) of the surface that slips in any earthquake is fresh fracture. Stress drop occurs only on this small fraction of the rupture surface, the asperities. Strain change in the asperities is on the order of P/µ. Therefore this model predicts the average strain change in an earthquake to be on the order of (P/µ)² = 0.0001, as is observed.
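The numbers quoted in the abstract are mutually consistent if the stress-to-rigidity ratio P/µ is taken as 0.01; a quick arithmetic check (nothing assumed beyond that value):

```python
# With P/mu = 0.01: 2 * (P/mu) * u0 = u0/50, and the predicted average
# strain change (P/mu)^2 = 0.0001, matching the observed order of magnitude.
P_over_mu = 0.01

slip_barrier_factor = 2 * P_over_mu   # coefficient of u0 in the barrier bound
strain_change = P_over_mu ** 2        # predicted average strain change

print(f"2(P/mu) = {slip_barrier_factor:.2f} (i.e. u0/50)")
print(f"(P/mu)^2 = {strain_change:.4f}")
```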

  4. Presentation and Analysis of a Worldwide Database of Earthquake-Induced Landslide Inventories : Earthquake-Induced Landslide Inventories

    NARCIS (Netherlands)

    Tanyas, Hakan; Van Westen, Cees J.; Allstadt, Kate E.; Anna Nowicki Jessee, M.; Görüm, Tolga; Jibson, Randall W.; Godt, Jonathan W.; Sato, Hiroshi P.; Schmitt, Robert G.; Marc, Odin; Hovius, Niels

    2017-01-01

    Earthquake‐induced landslide (EQIL) inventories are essential tools to extend our knowledge of the relationship between earthquakes and the landslides they can trigger. Regrettably, such inventories are difficult to generate and therefore scarce, and the available ones differ in terms of their

  5. INTEGRATED FRAMEWORK FOR ENHANCING EARTHQUAKE RISK MITIGATION DECISIONS

    Directory of Open Access Journals (Sweden)

    Temitope Egbelakin

    2015-12-01

    Full Text Available The increasing scale of losses from earthquake disasters has reinforced the need for property owners to become proactive in seismic risk reduction programs. However, despite advancement in seismic design methods and legislative frameworks, building owners are found unwilling or lack motivation to adopt adequate mitigation measures that will reduce their vulnerability to earthquake disasters. Various theories and empirical findings have been used to explain the adoption of protective behaviours including seismic mitigation decisions, but their application has been inadequate to enhance building owners’ protective decisions. A holistic framework that incorporates the motivational orientations of decision-making, coupled with the social, cultural, economic, regulatory, institutional and political realms of earthquake risk mitigation to enhance building owners’ decisions to voluntarily implement adequate mitigation measures, is proposed. This framework attempts to address any multi-disciplinary barriers that exist in earthquake disaster management, by ensuring that stakeholders involved in seismic mitigation decisions work together to foster seismic rehabilitation of EPBs, as well as illuminate strategies that will initiate, promote and sustain the adoption of long-term earthquake mitigation.

  6. Probabilistic tsunami hazard assessment based on the long-term evaluation of subduction-zone earthquakes along the Sagami Trough, Japan

    Science.gov (United States)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Maeda, T.; Matsuyama, H.; Toyama, N.; Kito, T.; Murata, Y.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.; Hakamata, T.

    2017-12-01

For the forthcoming large earthquakes along the Sagami Trough, where the Philippine Sea Plate is subducting beneath the northeast Japan arc, the Earthquake Research Committee (ERC)/Headquarters for Earthquake Research Promotion, Japanese government (2014a) assessed that M7 and M8 class earthquakes will occur there and defined the possible extent of the earthquake source areas. They assessed occurrence probabilities within the next 30 years (from Jan. 1, 2014) of 70% and 0%-5%, respectively, for the M7 and M8 class earthquakes. First, we set 10 possible earthquake source areas (ESAs) and 920 ESAs, respectively, for M8 and M7 class earthquakes. Next, we constructed 125 characterized earthquake fault models (CEFMs) and 938 CEFMs, respectively, for M8 and M7 class earthquakes, based on the "tsunami recipe" of ERC (2017) (Kitoh et al., 2016, JpGU). All the CEFMs are allowed to have a large slip area to express fault slip heterogeneity. For all the CEFMs, we calculate tsunamis by solving a nonlinear long wave equation, using an FDM, including runup calculation, over a nesting grid system with a minimum grid size of 50 meters. Finally, we re-distributed the occurrence probability to all CEFMs (Abe et al., 2014, JpGU) and gathered excess probabilities for variable tsunami heights, calculated from all the CEFMs, at every observation point along the Pacific coast to get the PTHA. We incorporated aleatory uncertainties inherent in tsunami calculation and earthquake fault slip heterogeneity. We considered two kinds of probabilistic hazard models: one is a "present-time hazard model", under the assumption that earthquake occurrence basically follows a renewal process based on a BPT distribution if the latest faulting time is known; the other is a "long-time averaged hazard model", under the assumption that earthquake occurrence follows a stationary Poisson process.
We fixed our viewpoint, for example, on the probability that the tsunami height will exceed 3 meters at coastal points in next
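The final aggregation step described above, gathering excess (exceedance) probabilities over all fault models at a coastal point, can be sketched in miniature. The scenario probabilities and tsunami heights below are invented for illustration, and the scenarios are treated as independent:

```python
# Toy sketch of a hazard curve: each characterized fault model carries an
# occurrence probability and a computed tsunami height at one coastal point.
scenarios = [  # (30-yr occurrence probability, tsunami height in meters)
    (0.30, 1.2), (0.25, 2.1), (0.20, 3.4), (0.15, 4.8), (0.10, 6.5),
]

def exceedance(height_m):
    """Probability that at least one independent scenario exceeds height_m."""
    p_none = 1.0
    for p, h in scenarios:
        if h > height_m:
            p_none *= (1.0 - p)
    return 1.0 - p_none

for h in (1.0, 3.0, 5.0):
    print(f"P(height > {h:.0f} m) = {exceedance(h):.3f}")
```

Evaluating `exceedance` over a grid of heights gives the exceedance curve at that point; repeating it per coastal point yields the kind of probabilistic map the abstract describes.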

  7. Vrancea earthquakes. Specific actions to mitigate seismic risk

    International Nuclear Information System (INIS)

    Marmureanu, Gheorghe; Marmureanu, Alexandru

    2005-01-01

natural disasters given by earthquakes, there is a need to reverse trends in seismic risk mitigation to future events. Main courses of specific action to mitigate the seismic risks from strong deep Vrancea earthquakes should be considered as key to future development projects, including: - Early warning system for industrial facilities; - Short and long term prediction program of strong Vrancea earthquakes; - Seismic hazard map of Romania; - Seismic microzonation of large populated cities; - Shake map; - Seismic tomography of dams for avoiding disasters. The quality of life and the security of infrastructure (including human services, civil and industrial structures, financial infrastructure, information transmission and processing systems) in every nation are increasingly vulnerable to disasters caused by events that have geological, atmospheric, hydrologic, and technological origins. As UN Secretary General Kofi Annan pointed out, 'Building a culture of prevention is not easy. While the costs of prevention have to be paid in the present, its benefits lie in a distant future'. In other words: Prevention pays off. This may not always become apparent immediately, but, in the long run, the benefits from prevention measures will always outweigh their costs by far. Romania is an earthquake prone area and these main specific actions are really contributing to seismic risk mitigation. These specific actions are provided for in Law nr. 372/March 18, 2004 - 'The National Program of Seismic Risk Management'. (authors)

  8. A review on remotely sensed land surface temperature anomaly as an earthquake precursor

    Science.gov (United States)

    Bhardwaj, Anshuman; Singh, Shaktiman; Sam, Lydia; Joshi, P. K.; Bhardwaj, Akanksha; Martín-Torres, F. Javier; Kumar, Rajesh

    2017-12-01

    The low predictability of earthquakes and the high uncertainty associated with their forecasts make earthquakes one of the worst natural calamities, capable of causing instant loss of life and property. Here, we discuss the studies reporting the observed anomalies in the satellite-derived Land Surface Temperature (LST) before an earthquake. We compile the conclusions of these studies and evaluate the use of remotely sensed LST anomalies as precursors of earthquakes. The arrival times and the amplitudes of the anomalies vary widely, thus making it difficult to consider them as universal markers to issue earthquake warnings. Based on the randomness in the observations of these precursors, we support employing a global-scale monitoring system to detect statistically robust anomalous geophysical signals prior to earthquakes before considering them as definite precursors.

  9. Seismic experience in power and industrial facilities as it relates to small magnitude earthquakes

    International Nuclear Information System (INIS)

    Swan, S.W.; Horstman, N.G.

    1987-01-01

The database on the performance of power and industrial facilities in small magnitude earthquakes (M = 4.0 - 5.5) is potentially very large. In California many earthquakes in this magnitude range occur every year, often near industrial areas. In 1986, for example, there were 76 earthquakes in northern California alone between Richter magnitude 4.0 and 5.5. Experience has shown that the effects of small magnitude earthquakes are seldom significant to well-engineered facilities. (The term well-engineered is here defined to include most modern industrial installations, as well as power plants and substations.) Therefore detailed investigations of small magnitude earthquakes are normally not considered worthwhile. The purpose of this paper is to review the tendency toward seismic damage of equipment installations representative of nuclear power plant safety systems. Estimates are made of the thresholds of seismic damage to certain types of equipment in terms of conventional means of measuring the damage potential of an earthquake. The objective is to define thresholds of damage that can be correlated with Richter magnitude. In this manner an earthquake magnitude might be chosen below which damage to nuclear plant safety systems is not considered credible.

  10. Short-Term Wind Speed Prediction Using EEMD-LSSVM Model

    Directory of Open Access Journals (Sweden)

    Aiqing Kang

    2017-01-01

    Full Text Available A hybrid of Ensemble Empirical Mode Decomposition (EEMD) and Least Squares Support Vector Machine (LSSVM) is proposed to improve short-term wind speed forecasting precision. The EEMD is first utilized to decompose the original wind speed time series into a set of subseries. Then the LSSVM models are established to forecast these subseries. The partial autocorrelation function is adopted to analyze the inner relationships between the historical wind speed series in order to determine the input variables of the LSSVM models for prediction of every subseries. Finally, the superposition principle is employed to sum the predicted values of every subseries as the final wind speed prediction. The performance of the hybrid model is evaluated based on six metrics. Compared with LSSVM, Back Propagation Neural Networks (BP), Auto-Regressive Integrated Moving Average (ARIMA), the combination of Empirical Mode Decomposition (EMD) with LSSVM, and hybrid EEMD with ARIMA models, the wind speed forecasting results show that the proposed hybrid model outperforms these models in terms of the six metrics. Furthermore, scatter diagrams of predicted versus actual wind speed and histograms of prediction errors are presented to verify the superiority of the hybrid model in short-term wind speed prediction.
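The decompose-forecast-superpose pipeline described above can be sketched in miniature. As stand-ins for EEMD and LSSVM (which need dedicated libraries), this sketch uses a moving-average decomposition and a least-squares AR(1) forecaster; the synthetic series and every parameter choice are illustrative only:

```python
import math

def decompose(series, window=4):
    """Split a series into a slow 'trend' subseries and a fast residual."""
    trend = []
    for i in range(len(series)):
        lo = max(0, i - window + 1)
        trend.append(sum(series[lo:i + 1]) / (i + 1 - lo))
    residual = [x - t for x, t in zip(series, trend)]
    return [trend, residual]

def ar1_forecast(sub):
    """One-step forecast of a subseries with a least-squares AR(1) model."""
    mean = sum(sub) / len(sub)
    dev = [x - mean for x in sub]
    num = sum(dev[i - 1] * dev[i] for i in range(1, len(dev)))
    den = sum(d * d for d in dev[:-1]) or 1.0
    phi = num / den
    return mean + phi * dev[-1]

# Synthetic "wind speed": a slow oscillation plus a fast ripple.
speeds = [8 + 2 * math.sin(i / 6.0) + 0.5 * math.sin(i * 1.3) for i in range(48)]

# Superposition principle: the final forecast is the sum of the
# per-subseries forecasts.
forecast = sum(ar1_forecast(sub) for sub in decompose(speeds))
print(f"next-step wind speed forecast: {forecast:.2f} m/s")
```

The paper's version differs in both stages (EEMD subseries, LSSVM regressors with PACF-selected lags), but the structure, forecast each subseries separately and sum, is the same.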

  11. Seismomagnetic effects from the long-awaited 28 September 2004 M 6.0 parkfield earthquake

    Science.gov (United States)

    Johnston, M.J.S.; Sasai, Y.; Egbert, G.D.; Mueller, R.J.

    2006-01-01

    Precise measurements of local magnetic fields have been obtained with a differentially connected array of seven synchronized proton magnetometers located along 60 km of the locked-to-creeping transition region of the San Andreas fault at Parkfield, California, since 1976. The M 6.0 Parkfield earthquake on 28 September 2004, occurred within this array and generated coseismic magnetic field changes of between 0.2 and 0.5 nT at five sites in the network. No preseismic magnetic field changes exceeding background noise levels are apparent in the magnetic data during the month, week, and days before the earthquake (or expected in light of the absence of measurable precursive deformation, seismicity, or pore pressure changes). Observations of electric and magnetic fields from 0.01 to 20 Hz are also made at one site near the end of the earthquake rupture and corrected for common-mode signals from the ionosphere/magnetosphere using a second site some 115 km to the northwest along the fault. These magnetic data show no indications of unusual noise before the earthquake in the ULF band (0.01-20 Hz) as suggested may have preceded the 1989 ML 7.1 Loma Prieta earthquake. Nor do we see electric field changes similar to those suggested to occur before earthquakes of this magnitude from data in Greece. Uniform and variable slip piezomagnetic models of the earthquake, derived from strain, displacement, and seismic data, generate magnetic field perturbations that are consistent with those observed by the magnetometer array. A higher rate of longer-term magnetic field change, consistent with increased loading in the region, is apparent since 1993. This accompanied an increased rate of secular shear strain observed on a two-color EDM network and a small network of borehole tensor strainmeters and increased seismicity dominated by three M 4.5-5 earthquakes roughly a year apart in 1992, 1993, and 1994. 
Models incorporating all of these data indicate increased slip at depth in the region

  12. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modfications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquified natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  13. Urban MEMS based seismic network for post-earthquakes rapid disaster assessment

    Science.gov (United States)

    D'Alessandro, Antonino; Luzio, Dario; D'Anna, Giuseppe

    2014-05-01

worship. The waveforms recorded could be promptly used to determine ground-shaking parameters, like peak ground acceleration/velocity/displacement and Arias and Housner intensity, which could all be used to create, a few seconds after a strong earthquake, shaking maps at urban scale. These shaking maps would allow quick identification of the areas of the town center where the earthquake was felt most strongly. When a strong seismic event occurs, the beginning of the ground motion observed at a site could be used to predict the ensuing ground motion at the same site and so to realize a short-term earthquake early warning system. The data acquired after a moderate-magnitude earthquake would provide valuable information for detailed seismic microzonation of the area based on direct earthquake shaking observations rather than on model-based or indirect methods. In this work, we evaluate the feasibility and effectiveness of such a seismic network, taking into account technological, scientific, and economic issues. For this purpose, we have simulated the creation of a MEMS-based urban seismic network in a medium-size city. For the selected town, taking into account the instrumental specifications, the array geometry, and the environmental noise, we investigated the ability of the planned network to detect and measure earthquakes of different magnitudes generated from realistic nearby seismogenic sources.

  14. Simplified design and evaluation of liquid storage tanks relative to earthquake loading

    Energy Technology Data Exchange (ETDEWEB)

    Poole, A.B.

    1994-06-01

A summary of earthquake-induced damage in liquid storage tanks is provided. The general analysis steps for the dynamic response of fluid-filled tanks subject to horizontal ground excitation are discussed. This work gives particular attention to understanding observed tank-failure modes. These modes are quite diverse in nature, but many of the commonly appearing patterns are believed to be shell buckling. A generalized and simple-to-apply shell-loading analysis will be developed using Fluegge shell theory. The inputs to this simplified analysis will be the horizontal ground acceleration and tank shell form parameters. A dimensionless parameter will be developed and used in predictions of buckling resulting from earthquake-imposed loads. This prediction method will be applied to various tank designs that have failed during major earthquakes and during shaker-table tests. Tanks that have not failed will also be reviewed. A simplified approach will be discussed for early design and evaluation of tank shell parameters and materials to provide high confidence of a low probability of failure during earthquakes.

  15. QuakeUp: An advanced tool for a network-based Earthquake Early Warning system

    Science.gov (United States)

    Zollo, Aldo; Colombelli, Simona; Caruso, Alessandro; Elia, Luca; Brondi, Piero; Emolo, Antonio; Festa, Gaetano; Martino, Claudio; Picozzi, Matteo

    2017-04-01

predicted P-wave amplitude at a dense spatial grid, including the nodes of the accelerometer/velocimeter array deployed in the earthquake source area. Within times of the order of ten seconds from the earthquake origin, information about the area where moderate to strong ground shaking is expected to occur can be sent to inner and outer sites, allowing the activation of emergency measures to protect people, secure industrial facilities, and optimize site resilience after the disaster. Depending on the network density and spatial source coverage, this method naturally accounts for effects related to the earthquake rupture extent (e.g., source directivity) and the spatial variability of strong ground motion related to crustal wave propagation and site amplification. In QuakeUp, the P-wave parameters are continuously measured using progressively expanded P-wave time windows, providing evolutionary and reliable estimates of the ground-shaking distribution, especially in the case of very large events. Furthermore, to minimize the S-wave contamination of the P-wave signal portion, an efficient algorithm for the automatic detection of the S-wave arrival time, based on real-time polarization analysis of the three-component seismogram, has been included. The final output of QuakeUp will be an automatic alert message transmitted to sites to be secured during the earthquake emergency. The message contains all relevant information about the expected potential damage at the site and the time available for security actions (lead time) after the warning. A global view of the system performance during and after the event (in play-back mode) is obtained through an end-user visual display, where the most relevant pieces of information will be displayed and updated as soon as new data are available. 
The software platform QuakeUp is essentially aimed at improving the reliability and the accuracy in terms of parameter estimation, minimizing the uncertainties in the

  16. Long-term seismic observations along Myanmar-Sunda subduction margin: insights for 2004 Mw > 9.0 earthquake

    Science.gov (United States)

    Khan, Prosanta Kumar; Banerjee, Jayashree; Shamim, Sk; Mohanty, Manoranjan

    2018-03-01

The present study investigates the temporal variation of a few seismic parameters in the Myanmar (Zone I), Andaman-Nicobar-Northwest Sumatra (Zone II), Southeast Sumatra-West Indonesia (Zone III), and East Indonesia (Zone IV) converging boundaries in reference to the generation of the 26 December 2004 Mw > 9.0 off-Sumatra mega-earthquake. The four segments are distinguished based on tectonic parameters, distinct geological locations, great earthquake occurrences, and Wadati-Benioff zone characteristics. Two important seismic parameters, seismic energy and b value, are computed over 6-month time windows spanning the entire 1976-2013 period for these segments. The b values show a constant decrease in Zones II, III, and IV, whereas Zone I does not show any such pattern prior to the 2004 mega-event. The release of seismic energy was also gradually decreasing in Zones II and III until the 2004 event, and a somewhat similar pattern was noted in Zone IV. This distinct observation might indicate that stress accumulation was dominant near the Sumatra-Java area located towards the southeast of Zone II and northwest of Zone III. The strain energy released during the 2004 event subsequently migrated northward, rupturing 1300 km of the boundary between Northwest Sumatra and North Andaman. The occurrence of the 2004 mega-event was apparently concealed behind the long-term seismic quiescence existing near the Sumatra and Nicobar margin. A systematic study of the patterns of seismic energy release and b values, and long-term observation of the collective behaviour of the margin tectonics, might have given clues to the possibility of the 2004 mega-event.
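The windowed b-value series used above can be computed from a catalog with the standard Aki (1965) maximum-likelihood estimator; the toy catalog and completeness magnitude below are invented for illustration:

```python
import math

def b_value(mags, mc):
    """Aki (1965) maximum-likelihood b-value, b = log10(e) / (mean(M) - Mc),
    for events at or above the completeness magnitude Mc (simplified form
    without the magnitude-binning correction)."""
    above = [m for m in mags if m >= mc]
    return math.log10(math.e) / (sum(above) / len(above) - mc)

# (time in decimal years, magnitude) pairs spanning two 6-month windows.
catalog = [
    (2003.1, 4.6), (2003.2, 5.1), (2003.3, 4.8), (2003.4, 5.5),
    (2003.6, 4.5), (2003.7, 4.7), (2003.8, 4.6), (2003.9, 5.0),
]

mc = 4.5  # completeness magnitude
for start in (2003.0, 2003.5):
    window = [m for t, m in catalog if start <= t < start + 0.5]
    print(f"{start:.1f}-{start + 0.5:.1f}: b = {b_value(window, mc):.2f}")
```

Sliding this window across 1976-2013 for each zone would reproduce the kind of temporal b-value series whose decrease the study interprets as a precursor.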

  17. Limiting the effects of earthquakes on gravitational-wave interferometers

    Science.gov (United States)

    Coughlin, Michael; Earle, Paul; Harms, Jan; Biscans, Sebastien; Buchanan, Christopher; Coughlin, Eric; Donovan, Fred; Fee, Jeremy; Gabbard, Hunter; Guy, Michelle; Mukund, Nikhil; Perry, Matthew

    2017-01-01

Ground-based gravitational wave interferometers such as the Laser Interferometer Gravitational-wave Observatory (LIGO) are susceptible to ground shaking from high-magnitude teleseismic events, which can interrupt their operation in science mode and significantly reduce their duty cycle. It can take several hours for a detector to stabilize enough to return to its nominal state for scientific observations. The down time can be reduced if advance warning of impending shaking is received and the impact is suppressed in the isolation system with the goal of maintaining stable operation even at the expense of increased instrumental noise. Here, we describe an early warning system for modern gravitational-wave observatories. The system relies on near real-time earthquake alerts provided by the U.S. Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA). Preliminary low-latency hypocenter and magnitude information is generally available within 5 to 20 min of a significant earthquake depending on its magnitude and location. The alerts are used to estimate arrival times and ground velocities at the gravitational-wave detectors. In general, 90% of the predictions for ground-motion amplitude are within a factor of 5 of measured values. The error in both arrival time and ground-motion prediction introduced by using preliminary, rather than final, hypocenter and magnitude information is minimal. By using a machine learning algorithm, we develop a prediction model that calculates the probability that a given earthquake will prevent a detector from taking data. Our initial results indicate that by using detector control configuration changes, we could prevent interruption of operation from 40 to 100 earthquake events in a 6-month period.

  18. Limiting the effects of earthquakes on gravitational-wave interferometers

    International Nuclear Information System (INIS)

    Coughlin, Michael; Earle, Paul; Harms, Jan; Biscans, Sebastien; Donovan, Fred; Buchanan, Christopher; Coughlin, Eric; Fee, Jeremy; Guy, Michelle; Gabbard, Hunter; Mukund, Nikhil; Perry, Matthew

    2017-01-01

Ground-based gravitational wave interferometers such as the Laser Interferometer Gravitational-wave Observatory (LIGO) are susceptible to ground shaking from high-magnitude teleseismic events, which can interrupt their operation in science mode and significantly reduce their duty cycle. It can take several hours for a detector to stabilize enough to return to its nominal state for scientific observations. The down time can be reduced if advance warning of impending shaking is received and the impact is suppressed in the isolation system with the goal of maintaining stable operation even at the expense of increased instrumental noise. Here, we describe an early warning system for modern gravitational-wave observatories. The system relies on near real-time earthquake alerts provided by the U.S. Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA). Preliminary low-latency hypocenter and magnitude information is generally available within 5 to 20 min of a significant earthquake depending on its magnitude and location. The alerts are used to estimate arrival times and ground velocities at the gravitational-wave detectors. In general, 90% of the predictions for ground-motion amplitude are within a factor of 5 of measured values. The error in both arrival time and ground-motion prediction introduced by using preliminary, rather than final, hypocenter and magnitude information is minimal. By using a machine learning algorithm, we develop a prediction model that calculates the probability that a given earthquake will prevent a detector from taking data. Our initial results indicate that by using detector control configuration changes, we could prevent interruption of operation from 40 to 100 earthquake events in a 6-month period. (paper)
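The arrival-time estimates that drive the warning can be approximated from the preliminary hypocenter alone. A rough sketch, using illustrative round-number phase speeds and an illustrative distance (not USGS or LIGO values):

```python
# Given a preliminary hypocenter, estimate when seismic phases reach a
# detector and how much warning remains after the alert latency.
V_P, V_S, V_SURFACE = 8.0, 4.5, 3.5   # km/s, representative phase speeds

def lead_time(distance_km, alert_latency_s, phase_speed_kms):
    """Seconds between alert receipt and phase arrival (negative = too late)."""
    return distance_km / phase_speed_kms - alert_latency_s

distance = 9000.0          # teleseismic event, km from the detector
latency = 10 * 60.0        # alert available 10 minutes after origin time

for name, v in (("P", V_P), ("S", V_S), ("surface", V_SURFACE)):
    print(f"{name:>7} wave: {lead_time(distance, latency, v):7.0f} s of warning")
```

For distant events the slow, large-amplitude surface waves arrive tens of minutes after the alert, which is why even a 5-20 min alert latency still leaves time to reconfigure the detector controls.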

  19. Ionospheric anomalies detected by ionosonde and possibly related to crustal earthquakes in Greece

    Science.gov (United States)

    Perrone, Loredana; De Santis, Angelo; Abbattista, Cristoforo; Alfonsi, Lucilla; Amoruso, Leonardo; Carbone, Marianna; Cesaroni, Claudio; Cianchini, Gianfranco; De Franceschi, Giorgiana; De Santis, Anna; Di Giovambattista, Rita; Marchetti, Dedalo; Pavòn-Carrasco, Francisco J.; Piscini, Alessandro; Spogli, Luca; Santoro, Francesca

    2018-03-01

    Ionosonde data and crustal earthquakes with magnitude M ≥ 6.0 observed in Greece during the 2003-2015 period were examined to check if the relationships obtained earlier between precursory ionospheric anomalies and earthquakes in Japan and central Italy are also valid for Greek earthquakes. The ionospheric anomalies are identified on the observed variations of the sporadic E-layer parameters (h'Es, foEs) and foF2 at the ionospheric station of Athens. The corresponding empirical relationships between the seismo-ionospheric disturbances and the earthquake magnitude and the epicentral distance are obtained and found to be similar to those previously published for other case studies. The large lead times found for the ionospheric anomalies occurrence may confirm a rather long earthquake preparation period. The possibility of using the relationships obtained for earthquake prediction is finally discussed.

  20. Ionospheric anomalies detected by ionosonde and possibly related to crustal earthquakes in Greece

    Directory of Open Access Journals (Sweden)

    L. Perrone

    2018-03-01

    Full Text Available Ionosonde data and crustal earthquakes with magnitude M ≥ 6.0 observed in Greece during the 2003–2015 period were examined to check if the relationships obtained earlier between precursory ionospheric anomalies and earthquakes in Japan and central Italy are also valid for Greek earthquakes. The ionospheric anomalies are identified on the observed variations of the sporadic E-layer parameters (h′Es, foEs) and foF2 at the ionospheric station of Athens. The corresponding empirical relationships between the seismo-ionospheric disturbances and the earthquake magnitude and the epicentral distance are obtained and found to be similar to those previously published for other case studies. The large lead times found for the ionospheric anomalies occurrence may confirm a rather long earthquake preparation period. The possibility of using the relationships obtained for earthquake prediction is finally discussed.

  1. From Data-Sharing to Model-Sharing: SCEC and the Development of Earthquake System Science (Invited)

    Science.gov (United States)

    Jordan, T. H.

    2009-12-01

Earthquake system science seeks to construct system-level models of earthquake phenomena and use them to predict emergent seismic behavior—an ambitious enterprise that requires a high degree of interdisciplinary, multi-institutional collaboration. This presentation will explore model-sharing structures that have been successful in promoting earthquake system science within the Southern California Earthquake Center (SCEC). These include disciplinary working groups to aggregate data into community models; numerical-simulation working groups to investigate system-specific phenomena (process modeling) and further improve the data models (inverse modeling); and interdisciplinary working groups to synthesize predictive system-level models. SCEC has developed a cyberinfrastructure, called the Community Modeling Environment, that can distribute the community models; manage large suites of numerical simulations; vertically integrate the hardware, software, and wetware needed for system-level modeling; and promote the interactions among working groups needed for model validation and refinement. Various socio-scientific structures contribute to successful model-sharing. Two of the most important are “communities of trust” and collaborations between government and academic scientists on mission-oriented objectives. The latter include improvements of earthquake forecasts and seismic hazard models and the use of earthquake scenarios in promoting public awareness and disaster management.

  2. Generation of earthquake signals

    International Nuclear Information System (INIS)

    Kjell, G.

    1994-01-01

Seismic verification can be performed either as a full-scale test on a shaker table or as numerical calculations. In both cases it is necessary to have an earthquake acceleration time history. This report describes the generation of such time histories by filtering white noise. Analogue and digital filtering methods are compared. Different methods of predicting the response spectrum of a white noise signal filtered by a band-pass filter are discussed. Prediction of both the average response level and the statistical variation around this level is considered. Examples with both the IEEE 301 standard response spectrum and a ground spectrum suggested for Swedish nuclear power stations are included in the report
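The digital approach described in the report can be sketched as band-pass filtering Gaussian white noise in the frequency domain; the duration, sampling step, and 1-10 Hz pass band below are illustrative choices, not values from the report:

```python
import numpy as np

def bandpass_white_noise(n=2048, dt=0.01, f_lo=1.0, f_hi=10.0, seed=0):
    """Band-limited noise as a crude earthquake-like acceleration history."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(n)          # white Gaussian noise
    spec = np.fft.rfft(noise)
    freqs = np.fft.rfftfreq(n, d=dt)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0   # brick-wall band-pass
    return np.fft.irfft(spec, n)

accel = bandpass_white_noise()
print(f"peak |a| = {np.max(np.abs(accel)):.3f} (arbitrary units)")
```

In practice the filter shape would be tuned (and the record enveloped) so that the signal's response spectrum matches the target design spectrum, which is exactly the prediction problem the report analyzes.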

  3. Direct methods of soil-structure interaction analysis for earthquake loadings (V)

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, J. B.; Choi, J. S.; Lee, J. J.; Park, D. U. [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1997-07-15

    Methodologies of SSI analysis for earthquake loadings have been reviewed. Based on the finite element method incorporating an infinite element technique for the unbounded exterior region, a computer program for nonlinear seismic analysis named 'KIESSI' has been developed. The computer program has been verified using a free-field site-response problem. Post-correlation analysis for the Hualien FVT after backfill and the blind prediction of earthquake responses have been carried out utilizing the developed computer program. The earthquake response analyses for three LSST structures (Hualien, Lotung and Tepsco structures) have also been performed and compared with the measured data.

  4. Direct methods of soil-structure interaction analysis for earthquake loadings (IV)

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, J B; Kim, D S; Choi, J S; Kwon, K C; Kim, Y J; Lee, H J; Kim, S B; Kim, D K [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1996-07-15

    Methodologies of SSI analysis for earthquake loadings have been reviewed. Based on the finite element method incorporating an infinite element technique for the unbounded exterior region, a computer program for nonlinear seismic analysis named 'KIESSI-QK' has been developed. The computer program has been verified using a free-field site-response problem. The Hualien FVT stochastic finite element analysis after backfill and the blind prediction of earthquake responses have been carried out utilizing the developed computer program. The earthquake response analysis for the LSST structure has also been performed and compared with the measured data.

  6. Testing the Predictive Power of Coulomb Stress on Aftershock Sequences

    Science.gov (United States)

    Woessner, J.; Lombardi, A.; Werner, M. J.; Marzocchi, W.

    2009-12-01

    Empirical and statistical models of clustered seismicity are usually strongly stochastic and perceived to be uninformative in their forecasts, since only marginal distributions are used, such as the Omori-Utsu and Gutenberg-Richter laws. In contrast, so-called physics-based aftershock models, based on seismic rate changes calculated from Coulomb stress changes and rate-and-state friction, make more specific predictions: anisotropic stress shadows and multiplicative rate changes. We test the predictive power of models based on Coulomb stress changes against statistical models, including the popular Short Term Earthquake Probabilities and Epidemic-Type Aftershock Sequences models: we score and compare retrospective forecasts on the aftershock sequences of the 1992 Landers, USA, the 1997 Colfiorito, Italy, and the 2008 Selfoss, Iceland, earthquakes. To quantify predictability, we use likelihood-based metrics that test the consistency of the forecasts with the data, including modified and existing tests used in prospective forecast experiments within the Collaboratory for the Study of Earthquake Predictability (CSEP). Our results indicate that a statistical model performs best. Moreover, two Coulomb model classes seem unable to compete: models based on deterministic Coulomb stress changes calculated from a given fault-slip model, and those based on fixed receiver faults. One model of Coulomb stress changes does perform well and sometimes outperforms the statistical models, but its predictive information is diluted because of uncertainties included in the fault-slip model. Our results suggest that models based on Coulomb stress changes need to incorporate stochastic features that represent model and data uncertainty.
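
    The likelihood-based scoring behind such comparisons can be illustrated with a toy example. A per-bin Poisson log-likelihood of observed counts against forecast rates is the basic ingredient of CSEP-style consistency and comparison tests; the bin counts and rates below are invented for illustration, not from the study:

```python
import math

def poisson_loglik(forecast, observed):
    """Joint log-likelihood of observed counts under independent Poisson
    rates, one per space-magnitude bin: sum of -lam + n*ln(lam) - ln(n!)."""
    ll = 0.0
    for lam, n in zip(forecast, observed):
        ll += -lam + n * math.log(lam) - math.lgamma(n + 1)
    return ll

# Toy comparison over 4 bins: a "statistical" forecast that matches the
# observed spatial pattern scores higher than a mismatched "Coulomb" one.
observed = [3, 0, 1, 2]
stat_model = [2.5, 0.5, 1.0, 2.0]
coulomb_model = [0.5, 2.5, 2.0, 1.0]
better = poisson_loglik(stat_model, observed) > poisson_loglik(coulomb_model, observed)
```

    Ranking models by such joint log-likelihoods (and testing the significance of the difference) is the essence of the comparison described in the abstract.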

  7. Global observation of Omori-law decay in the rate of triggered earthquakes

    Science.gov (United States)

    Parsons, T.

    2001-12-01

    Triggered earthquakes can be large, damaging, and lethal as evidenced by the 1999 shocks in Turkey and the 2001 events in El Salvador. In this study, earthquakes with M greater than 7.0 from the Harvard CMT catalog are modeled as dislocations to calculate shear stress changes on subsequent earthquake rupture planes near enough to be affected. About 61% of earthquakes that occurred near the main shocks are associated with calculated shear stress increases, while ~39% are associated with shear stress decreases. If earthquakes associated with calculated shear stress increases are interpreted as triggered, then such events make up at least 8% of the CMT catalog. Globally, triggered earthquakes obey an Omori-law rate decay that lasts between ~7 and ~11 years after the main shock. Earthquakes associated with calculated shear stress increases occur at higher rates than background up to 240 km away from the main-shock centroid. Earthquakes triggered by smaller quakes (foreshocks) also obey Omori's law, which is one of the few time-predictable patterns evident in the global occurrence of earthquakes. These observations indicate that earthquake probability calculations which include interactions from previous shocks should incorporate a transient Omori-law decay with time. In addition, a very simple model using the observed global rate change with time and spatial distribution of triggered earthquakes can be applied to immediately assess the likelihood of triggered earthquakes following large events, and can be in place until more sophisticated analyses are conducted.
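
    The transient decay described here is easy to sketch. The modified Omori parameters K, c, p and the background rate below are illustrative values, not the study's fits:

```python
def omori_rate(t_days, K=50.0, c=0.1, p=1.1):
    """Modified Omori law: aftershock rate K / (t + c)**p at time t
    (days) after the main shock. K, c, p are illustrative, not fitted."""
    return K / (t_days + c) ** p

def triggered_rate(t_days, background=0.2):
    """Transient Omori decay superposed on a constant background rate,
    as the abstract suggests for earthquake probability calculations."""
    return background + omori_rate(t_days)

# Rate one day after the main shock dwarfs the background rate...
early = triggered_rate(1.0)
# ...but has decayed back to near background after ~10 years.
late = triggered_rate(3650.0)
```

    The ~7-11 year duration reported in the abstract corresponds to the time at which the Omori term drops below the background rate.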

  8. On the reported ionospheric precursor of the 1999 Hector Mine, California earthquake

    Science.gov (United States)

    Thomas, Jeremy N.; Love, Jeffrey J.; Komjathy, Attila; Verkhoglyadova, Olga P.; Butala, Mark; Rivera, Nicholas

    2012-01-01

    Using Global Positioning System (GPS) data from sites near the 16 Oct. 1999 Hector Mine, California earthquake, Pulinets et al. (2007) identified anomalous changes in the ionospheric total electron content (TEC) starting one week prior to the earthquake. Pulinets (2007) suggested that precursory phenomena of this type could be useful for predicting earthquakes. On the other hand, and in a separate analysis, Afraimovich et al. (2004) concluded that TEC variations near the epicenter were controlled by solar and geomagnetic activity that were unrelated to the earthquake. In an investigation of these very different results, we examine TEC time series of long duration from GPS stations near and far from the epicenter of the Hector Mine earthquake, and long before and long after the earthquake. While we can reproduce the essential time series results of Pulinets et al., we find that the signal they identify as anomalous is not actually anomalous. Instead, it is just part of normal global-scale TEC variation. We conclude that the TEC anomaly reported by Pulinets et al. is unrelated to the Hector Mine earthquake.
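
    The paper's key point, that a candidate precursor must stand out from global-scale TEC variation rather than from a station's own baseline, can be sketched as a simple differencing test. The threshold, the series, and the function itself are illustrative assumptions, not the authors' actual procedure:

```python
import statistics

def is_local_anomaly(site_series, reference_series, k=3.0):
    """Flag a local anomaly only if the near-epicenter series departs
    from a far-field reference series by more than k standard deviations
    of their difference; variation shared by both stations cancels out."""
    diff = [s - r for s, r in zip(site_series, reference_series)]
    mu = statistics.fmean(diff)
    sigma = statistics.pstdev(diff)
    return any(abs(d - mu) > k * sigma for d in diff)

# A diurnal-looking variation shared by both stations is not anomalous,
# even though the site series itself swings strongly.
reference = [10, 12, 15, 20, 26, 30, 28, 22, 16, 12, 10, 9,
             10, 12, 15, 20, 26, 30, 28, 22, 16, 12, 10, 9]
site = [v + 1.5 for v in reference]   # same shape, constant offset only
local = list(site)
local[5] += 25.0                      # excursion present only at the site
```

    In this framing, the signal identified by Pulinets et al. behaves like `site` (shared, global-scale variation), not like `local`.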

  9. Fault failure with moderate earthquakes

    Science.gov (United States)

    Johnston, M. J. S.; Linde, A. T.; Gladwin, M. T.; Borcherdt, R. D.

    1987-12-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10^-8), with borehole dilatometers (resolution 10^-10) and a 3-component borehole strainmeter (resolution 10^-9). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure.

  10. Posttraumatic stress disorder: a serious post-earthquake complication.

    Science.gov (United States)

    Farooqui, Mudassir; Quadri, Syed A; Suriya, Sajid S; Khan, Muhammad Adnan; Ovais, Muhammad; Sohail, Zohaib; Shoaib, Samra; Tohid, Hassaan; Hassan, Muhammad

    2017-01-01

    Earthquakes are unpredictable and devastating natural disasters. They can cause massive destruction and loss of life, and survivors may suffer psychological symptoms of severe intensity. Our goal in this article is to review studies published in the last 20 years to compile what is known about posttraumatic stress disorder (PTSD) occurring after earthquakes. The review also describes other psychiatric complications that can be associated with earthquakes, to provide readers with a better overall understanding, and discusses several sociodemographic factors that can be associated with post-earthquake PTSD. A literature search was conducted on major databases such as MEDLINE, PubMed, EMBASE, and PsycINFO and in neurology, psychiatry, and other medical journals. Terms used for electronic searches included, but were not limited to, posttraumatic stress disorder (PTSD), posttraumatic symptoms, anxiety, depression, major depressive disorder, earthquake, and natural disaster. The relevant information was then utilized to determine the relationships between earthquakes and posttraumatic stress symptoms. It was found that PTSD is the most commonly occurring mental health condition among earthquake survivors. Major depressive disorder, generalized anxiety disorder, obsessive compulsive disorder, social phobia, and specific phobias were also listed. The PTSD prevalence rate varied widely. It was dependent on multiple risk factors in target populations and also on the interval of time that had elapsed between the exposure to the deadly incident and measurement. Females seemed to be the most widely affected group, while elderly people and young children exhibited considerable psychosocial impact.

  11. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with the tremor rate increasing exponentially with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on a fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatra earthquake, the 2010 Maule earthquake in Chile, and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. The relationship is reasonable, considering the well-known dependence of the b-value on stress. This suggests that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
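
    The b-value analysis referred to here is commonly done with Aki's maximum-likelihood estimator. The sketch below assumes that standard estimator (not necessarily the authors' exact procedure) and checks it on a synthetic catalogue generated with b = 1:

```python
import math
import random

def b_value(magnitudes, m_min):
    """Aki (1965) maximum-likelihood b-value for the Gutenberg-Richter
    relation log10 N = a - b*M, using only events with M >= m_min:
    b = log10(e) / (mean(M) - m_min)."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

# Synthetic Gutenberg-Richter catalogue with b = 1: magnitude excesses
# above the completeness threshold are exponential with rate b*ln(10).
random.seed(1)
beta = math.log(10)  # corresponds to b = 1
cat = [4.0 + random.expovariate(beta) for _ in range(20000)]
b = b_value(cat, 4.0)
```

    Binning events by tidal stress amplitude and computing this estimator per bin would reproduce the kind of b-value trend the abstract describes.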

  12. Is earthquake rate in south Iceland modified by seasonal loading?

    Science.gov (United States)

    Jonsson, S.; Aoki, Y.; Drouin, V.

    2017-12-01

    Several temporally varying processes have the potential to modify the rate of earthquakes in the south Iceland seismic zone, one of the two most active seismic zones in Iceland. These include solid earth tides, seasonal meteorological effects, the influence of passing weather systems, and variations in snow and glacier loads. In this study we investigate the influence these processes may have on crustal stresses and stressing rates in the seismic zone and assess whether they appear to influence the earthquake rate. While historical earthquakes in south Iceland have preferentially occurred in early summer, this tendency is less clear for small earthquakes. The local earthquake catalogue (going back to 1991) has a low magnitude of completeness but is strongly influenced by aftershocks of the M6+ earthquakes which occurred in June 2000 and May 2008. Standard Reasenberg earthquake declustering and more involved model-independent stochastic declustering algorithms are not capable of fully eliminating the aftershocks from the catalogue. We therefore inspected the catalogue for the time period before 2000, and it shows limited seasonal tendency in earthquake occurrence. Our preliminary results show no clear correlation between earthquake rates and short-term stressing variations induced by solid earth tides or passing storms. Seasonal meteorological effects also appear to be too small to influence the earthquake activity. Snow and glacier load variations induce significant vertical motions in the area, with peak loading occurring in spring (April-May) and maximum unloading in fall (Sept.-Oct.). Early summer occurrence of historical earthquakes therefore correlates with early unloading rather than with the peak unloading or the unloading rate, which appears to indicate limited influence of this seasonal process on the earthquake activity.

  13. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    Science.gov (United States)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation

  14. Earthquake Safety Tips in the Classroom

    Science.gov (United States)

    Melo, M. O.; Maciel, B. A. P. C.; Neto, R. P.; Hartmann, R. P.; Marques, G.; Gonçalves, M.; Rocha, F. L.; Silveira, G. M.

    2014-12-01

    The catastrophes induced by earthquakes are among the most devastating ones, causing an elevated number of human losses and economic damages. But we have to keep in mind that earthquakes don't kill people, buildings do. Earthquakes can't be predicted, and the only way of dealing with their effects is to teach society how to be prepared for them and how to deal with their consequences. In spite of being exposed to moderate and large earthquakes, most of the Portuguese population is little aware of seismic risk, mainly due to the long recurrence intervals between strong events. The acquisition of safe and correct attitudes before, during and after an earthquake is relevant for human security. Children play a determinant role in the establishment of a real and long-lasting "culture of prevention", both through action and new attitudes. On the other hand, when children assume correct behaviors, their relatives often change their incorrect behaviors to mimic the correct behaviors of their kids. In the framework of a Parents-in-Science initiative, we started with bi-monthly sessions for children aged 5-6 years old and 9-10 years old. These sessions, in which parents, teachers and high-school students participate, became part of the school's permanent activities. We start with a short introduction to the Earth and to earthquakes through storytelling and simple science activities that trigger children's curiosity. With safety purposes, we focus on how crucial it is to know basic information about themselves and to define, with their families, an emergency communications plan in case family members are separated. Using a shaking table we teach them how to protect themselves during an earthquake. We then finish with the preparation of an individual emergency kit. This presentation will highlight the importance of encouraging preventive actions in order to reduce the impact of earthquakes on society. This project is developed by science high-school students and teachers, in

  15. Prediction of long term stability for geological disposal of radioactive waste

    International Nuclear Information System (INIS)

    Sasaki, Takeshi; Morikawa, Seiji; Koide, Hitoshi; Kono, Itoshi

    1998-01-01

    For geological disposal of radioactive wastes, the prediction of diastrophism has received much attention, and long-term prediction ranging from several thousand to several tens of thousands of years may be necessary for some target nuclides. Among the various prediction methods, a computational dynamic procedure is essential for quantitative prediction. However, advancement of prediction methods is hindered by the many uncertainties in information on the deep underground, which rests on few and indirect data. In this paper, a procedure for long-term prediction of diastrophism relevant to the geological disposal of low-level radioactive wastes requiring isolation for several thousand years was investigated, and examples are shown of the investigation flow and of the modeling method using the finite element method. How faults can be analyzed appears to be a key to upgrading the accuracy of diastrophism prediction. And, as diastrophism is a long-term and complex phenomenon whose prediction has many uncertain elements, it is important to judge the results of numerical analysis comprehensively from geological and rock-engineering standpoints. (G.K.)

  16. Characterizing Aftershock Sequences of the Recent Strong Earthquakes in Central Italy

    Science.gov (United States)

    Kossobokov, Vladimir G.; Nekrasova, Anastasia K.

    2017-10-01

    The recent strong earthquakes in Central Italy allow for a comparative analysis of their aftershocks from the viewpoint of the Unified Scaling Law for Earthquakes, USLE, which generalizes the Gutenberg-Richter relationship by making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. In particular, we consider aftershocks as a sequence of avalanches in a self-organized system of blocks-and-faults of the Earth's lithosphere, each aftershock series being characterized by the distribution of the USLE control parameter, η. We found the existence, in the long term, of different, intermittent levels of rather steady seismic activity characterized by a near-constant value of η, which switch, in the mid-term, at times of transition associated with catastrophic events. On such a transition, seismic activity may follow different scenarios with inter-event time scaling of different kinds, including constant, logarithmic, power-law, exponential rise/decay, or a mixture of those, as observed in the ongoing case associated with the three strong earthquakes in 2016. Evidently, our results do not support universality of seismic energy release, while providing constraints on modelling seismic sequences for earthquake physicists and supplying decision makers with information for improving local seismic hazard assessments.

  17. A contrast study of the traumatic condition between the wounded in 5.12 Wenchuan earthquake and 4.25 Nepal earthquake.

    Science.gov (United States)

    Ding, Sheng; Hu, Yonghe; Zhang, Zhongkui; Wang, Ting

    2015-01-01

    The 5.12 Wenchuan earthquake and the 4.25 Nepal earthquake were of similar magnitude, but the climate and geographic environment were totally different. Our team carried out medical rescue in both disasters, so we compared the traumatic conditions of the wounded in the two earthquakes. The clinical data of the wounded in the 5.12 Wenchuan earthquake and the 4.25 Nepal earthquake rescued by Chengdu Military General Hospital were retrospectively analyzed. A contrast study between the wounded was then conducted in terms of age, sex, injury mechanisms, traumatic conditions, complications and prognosis. Three days after the 5.12 Wenchuan earthquake, 465 cases of the wounded were hospitalized in Chengdu Military General Hospital, including 245 males (52.7%) and 220 females (47.3%) with an average age of (47.6±22.7) years. Our team carried out humanitarian relief in Katmandu after the 4.25 Nepal earthquake. Three days after this disaster, 71 cases were treated in our field hospital, including 37 males (52.1%) and 34 females (47.9%) with a mean age of (44.8±22.9) years. There was no obvious difference in sex or mean age between the two groups, but the age distribution differed: there were more wounded people aged over 60 years in the 4.25 Nepal earthquake. The wounded in the 5.12 Wenchuan earthquake had higher rates of bruise injury and crush injury, while the wounded in the 4.25 Nepal earthquake had a higher rate of falling injury and a much higher incidence of limb fractures. Earthquakes of similar magnitude can cause different injury mechanisms, traumatic conditions and complications in the wounded under different climatic and geographic environments. When an earthquake occurs in an area of poor traffic access, high altitude and large temperature differences, early medical rescue, injury control and wounded evacuation, as well as sufficient warmth retention and food supply, are of vital significance.

  18. The results of the pilot project in Georgia to install a network of electromagnetic radiation before the earthquake

    Science.gov (United States)

    Machavariani, Kakhaber; Khazaradze, Giorgi; Turazashvili, Ioseb; Kachakhidze, Nino; Kachakhidze, Manana; Gogoberidze, Vitali

    2016-04-01

    The world's scientific literature has recently published many important and interesting works on the VLF/LF electromagnetic emissions observed during earthquake preparation. These works suggest that reliable earthquake prediction may be possible in terms of trends. Because Georgia is located in the Trans-Asian earthquake zone, a VLF/LF electromagnetic emission network is essential there, and first steps have now been taken. Our university holds Shota Rustaveli National Science Foundation grant № DI/21/9-140/13, which included the installation of a receiver in Georgia, but funds were insufficient to buy the device. However, European colleagues (Prof. Dr. P. F. Biagi and Prof. Dr. Aydın BÜYÜKSARAÇ) made the installation of a receiver possible. An expedition of Turkish scientists to Georgia was organized in August 2015. They brought a VLF/LF electromagnetic emission receiver and, together with Georgian scientists, installed it near Tbilisi. The station was named GEO-TUR. Georgia has thus become involved in the work of the European network. It is now possible to monitor earthquakes in Georgia in terms of electromagnetic radiation, which enables scientists to obtain relevant information not only for the territory of our country but also for seismically active European countries. To maintain and develop this new direction in our country, it is necessary to keep an independent group of scientists studying electromagnetic radiation ahead of earthquakes in Georgia. At this stage, to remedy remaining shortcomings, it is necessary for appropriate specialists to come to Georgia to engage in joint international research. The work is carried out in the frame of the grant (DI/21/9-140/13, "Pilot project of before earthquake detected Very Low Frequency/Low Frequency electromagnetic emission network installation in Georgia") with the financial support of the Shota Rustaveli National Science Foundation.

  19. Chest injuries associated with earthquakes: an analysis of injuries sustained during the 2008 Wen-Chuan earthquake in China.

    Science.gov (United States)

    Hu, Jia; Guo, Ying-Qiang; Zhang, Er-Yong; Tan, Jin; Shi, Ying-Kang

    2010-08-01

    The goal of this study was to analyze the patterns, therapeutic modalities, and short-term outcomes of patients with chest injuries in the aftermath of the Wen-Chuan earthquake, which occurred on May 12, 2008 and registered 8.0 on the Richter scale. Of the 1522 patients who were referred to the West China Hospital of Sichuan University from May 12 to May 27, 169 patients (11.1%) had suffered major chest injuries. The type of injury, the presence of infection, Abbreviated Injury Score (AIS 2005), New Injury Severity Score (NISS), treatment, and short-term outcome were all documented for each case. Isolated chest injuries were diagnosed in 129 patients (76.3%), while multiple injuries with a major chest trauma were diagnosed in 40 patients (23.7%). The mean AIS and the median NISS of the hospitalized patients with chest injuries were 2.5 and 13, respectively. The mortality rate was 3.0% (5 patients). Most of the chest injuries were classified as minor to moderate trauma; however, coexistent multiple injuries and subsequent infection should be carefully considered in medical response strategies. Coordinated efforts among emergency medical support groups and prior training in earthquake preparedness and rescue in earthquake-prone areas are therefore necessary for efficient evacuation and treatment of catastrophic casualties.

  20. Physics-Based Hazard Assessment for Critical Structures Near Large Earthquake Sources

    Science.gov (United States)

    Hutchings, L.; Mert, A.; Fahjan, Y.; Novikova, T.; Golara, A.; Miah, M.; Fergany, E.; Foxall, W.

    2017-09-01

    We argue that for critical structures near large earthquake sources: (1) the ergodic assumption, recent history, and simplified descriptions of the hazard are not appropriate to rely on for earthquake ground motion prediction and can lead to a mis-estimation of the hazard and risk to structures; (2) a physics-based approach can address these issues; (3) a physics-based source model must be provided to generate realistic phasing effects from finite rupture and model near-source ground motion correctly; (4) wave propagation and site response should be site specific; (5) a much wider search of possible sources of ground motion can be achieved computationally with a physics-based approach; (6) unless one utilizes a physics-based approach, the hazard and risk to structures has unknown uncertainties; (7) uncertainties can be reduced with a physics-based approach, but not with an ergodic approach; (8) computational power and computer codes have advanced to the point that risk to structures can be calculated directly from source and site-specific ground motions. Spanning the variability of potential ground motion in a predictive situation is especially difficult for near-source areas, but that is the distance at which the hazard is the greatest. The basis of a "physics-based" approach is ground-motion syntheses derived from physics and an understanding of the earthquake process. This is an overview paper and results from previous studies are used to make the case for these conclusions. Our premise is that 50 years of strong motion records is insufficient to capture all possible ranges of site and propagation path conditions, rupture processes, and spatial geometric relationships between source and site.
Predicting future earthquake scenarios is necessary; models that have little or no physical basis but have been tested and adjusted to fit available observations can only "predict" what happened in the past, which should be considered description as opposed to prediction.

  1. Earthquake clustering in modern seismicity and its relationship with strong historical earthquakes around Beijing, China

    Science.gov (United States)

    Wang, Jian; Main, Ian G.; Musson, Roger M. W.

    2017-11-01

    Beijing, China's capital city, is located in a typical intraplate seismic belt, with relatively high-quality instrumental catalogue data available since 1970. The Chinese historical earthquake catalogue contains six strong historical earthquakes of Ms ≥ 6 around Beijing, the earliest in 294 AD. This poses a significant potential hazard to one of the most densely populated and economically active parts of China. In some intraplate areas, persistent clusters of events associated with historical events can occur over centuries, for example, the ongoing sequence in the New Madrid zone of the eastern US. Here we examine the evidence for such persistent clusters around Beijing. We introduce a metric known as the 'seismic density index' that quantifies the degree of clustering of seismic energy release. For a given map location, this multi-dimensional index depends on the number of events, their magnitudes, and the distances to the locations of the surrounding population of earthquakes. We apply the index to modern instrumental catalogue data between 1970 and 2014, and identify six clear candidate zones. We then compare these locations to earthquake epicentre and seismic intensity data for the six largest historical earthquakes. Each candidate zone contains one of the six historical events, and the location of peak intensity is within 5 km or so of the reported epicentre in five of these cases. In one case, the great Ms 8 earthquake of 1679, the peak is closer to the area of strongest shaking (Intensity XI or more) than the reported epicentre. The present-day event rates are similar to those predicted by the modified Omori law, but there is no evidence of ongoing decay in event rates. Accordingly, the index is more likely to be picking out the location of persistent weaknesses in the lithosphere. Our results imply that zones of high seismic density index could be used in principle to indicate the location of unrecorded historical or palaeoseismic events, in China and
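
    The abstract does not give the index's formula, so the following is only a hypothetical sketch of a distance- and magnitude-weighted clustering measure in the same spirit; the kernel, the reference distance, and the energy-magnitude scaling (log10 E proportional to 1.5 M) are all assumptions, and the paper's actual definition may differ:

```python
import math

def density_index(site, catalogue, r0=10.0):
    """Hypothetical clustering index for a map location: the log of a
    distance-down-weighted sum of relative seismic energy release over
    the surrounding catalogue (x, y in km, magnitude)."""
    x0, y0 = site
    total = 0.0
    for x, y, mag in catalogue:
        r = math.hypot(x - x0, y - y0)
        energy = 10.0 ** (1.5 * mag)             # relative radiated energy
        total += energy / (1.0 + (r / r0) ** 2)  # nearby events count most
    return math.log10(total) if total > 0 else float("-inf")

# A tight cluster near (0, 0) scores higher than the same events scattered.
cluster = [(1.0, 0.5, 4.0), (0.5, -0.5, 4.2), (-1.0, 1.0, 4.1)]
scattered = [(80.0, 10.0, 4.0), (5.0, -90.0, 4.2), (-70.0, 60.0, 4.1)]
```

    Mapping such an index over a grid and picking its peaks is the kind of candidate-zone identification the abstract describes.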

  2. Development of fragility functions to estimate homelessness after an earthquake

    Science.gov (United States)

    Brink, Susan A.; Daniell, James; Khazai, Bijan; Wenzel, Friedemann

    2014-05-01

    Immediately after an earthquake, many stakeholders need to make decisions about their response. These decisions often need to be made in a data-poor environment, as accurate information on the impact can take months or even years to be collected and publicized. Social fragility functions have been developed and applied to provide an estimate of the impact in terms of building damage, deaths and injuries in near real time. These rough estimates can help governments and response agencies determine what aid may be required, which can improve their emergency response and facilitate planning for the longer-term response. Due to building damage, lifeline outages, fear of aftershocks, or other causes, people may become displaced or homeless after an earthquake. Especially in cold and dangerous locations, the rapid provision of safe emergency shelter can be a lifesaving necessity. However, immediately after an event there is little information available to response agencies about the number of homeless, their locations, and whether they require public shelter. In this research, we analyze homelessness after historic earthquakes using the CATDAT Damaging Earthquakes Database, which includes information on the hazard as well as the physical and social impact of over 7200 damaging earthquakes from 1900-2013 (Daniell et al. 2011). We explore the relationship of both earthquake characteristics and area characteristics with homelessness after the earthquake, considering modelled variables over the period 1900-2013 such as population density, HDI, year, the measures of ground motion intensity developed in Daniell (2014), as well as temperature. Starting from the methodology used for the PAGER fatality fragility curves developed by Jaiswal and Wald (2010), but applying regression through time with the socioeconomic parameters developed in Daniell et al. (2012) for "socioeconomic fragility functions", we develop a set of fragility curves that can be
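
    PAGER-style fragility curves of the kind referenced above typically model the expected loss fraction as a lognormal CDF of shaking intensity. The sketch below uses that general form with invented parameters (`theta`, `beta`) to show how a homeless-fraction estimate could be read off such a curve; the actual paper fits its parameters to the CATDAT data.

```python
import math

# Sketch of a PAGER-style fragility curve: expected loss (here, the homeless
# fraction of the exposed population) as a lognormal CDF of shaking intensity.
# theta (median intensity) and beta (log std) are invented placeholders.
def fragility(intensity, theta, beta):
    return 0.5 * (1.0 + math.erf(math.log(intensity / theta) / (beta * math.sqrt(2.0))))

theta, beta = 9.0, 0.4               # hypothetical: 50% homeless at intensity IX
low = fragility(6.0, theta, beta)    # modest shaking -> small fraction
high = fragility(11.0, theta, beta)  # violent shaking -> large fraction
```

    Multiplying the curve's output by the exposed population in each shaken area gives a rapid, rough homelessness estimate of the kind the abstract motivates.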

  3. Can mine tremors be predicted? Observational studies of earthquake nucleation, triggering and rupture in South African mines

    CSIR Research Space (South Africa)

    Durrheim, RJ

    2012-05-01

    Full Text Available Earthquakes, and the tsunamis and landslides they trigger, pose a serious risk to people living close to plate boundaries, and a lesser but still significant risk to inhabitants of stable continental regions where destructive earthquakes are rare... of experiments that seek to identify reliable precursors of damaging seismic events.

  4. Time Domain Feature Extraction Technique for earth's electric field signal prior to the Earthquake

    International Nuclear Information System (INIS)

    Astuti, W; Sediono, W; Akmeliawati, R; Salami, M J E

    2013-01-01

    Earthquake is one of the most destructive natural disasters, killing many people and destroying many properties. Considering these catastrophic effects, it is highly important to know of earthquakes ahead of time in order to reduce the number of victims and material losses. The earth's electric field is one of the features that can be used to predict earthquakes (EQs), since it shows significant changes in signal amplitude prior to an earthquake. This paper presents a detailed analysis of the earth's electric field due to earthquakes which occurred in Greece between January 1, 2008 and June 30, 2008. In that period, 13 earthquakes occurred; 6 of them were recorded with magnitudes greater than Ms=5R (5R), while 7 of them were recorded with magnitudes greater than Ms=6R (6R). A time domain feature extraction technique is applied to analyze the first significant changes in the earth's electric field prior to the earthquake. Two different time domain feature extraction techniques are applied in this work, namely Simple Square Integral (SSI) and Root Mean Square (RMS). The first significant change of the earth's electric field signal at each monitoring site is extracted using these two techniques. The feature extraction result can be used as an input parameter for an earthquake prediction system
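
    The two named features have standard definitions: SSI is the sum of squared samples in a window, and RMS is the square root of their mean. A minimal sketch of using them to flag the first significant amplitude change follows; the window length and threshold factor are illustrative choices, not the paper's settings.

```python
import numpy as np

# Windowed time-domain features: Simple Square Integral (SSI) and
# Root Mean Square (RMS), applied to detect the first window whose
# amplitude departs strongly from a quiet baseline.
def ssi(window):
    return float(np.sum(np.square(window)))

def rms(window):
    return float(np.sqrt(np.mean(np.square(window))))

def first_significant_change(signal, win=100, factor=5.0):
    """Index of the first window whose RMS exceeds `factor` times the
    RMS of the initial (assumed quiet) window, else None."""
    baseline = rms(signal[:win])
    for start in range(win, len(signal) - win + 1, win):
        if rms(signal[start:start + win]) > factor * baseline:
            return start
    return None

# Synthetic electric-field record: quiet background, then an anomaly.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 2000)
x[1200:1400] += 25.0                  # injected amplitude anomaly
onset = first_significant_change(x)   # index of the anomalous window
```

    On this synthetic record the detector fires at the start of the injected anomaly; on real field data the baseline window and threshold would need site-specific tuning.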

  5. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    Directory of Open Access Journals (Sweden)

    M. Rebuffat

    1995-06-01

    Full Text Available A series of multimedia monographs concerning the main seismic events that have affected the Italian territory are in the process of being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and promote public education about earthquakes. Producing the monographs, developed in ARC/INFO and working under UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: (1) its historical-cultural context (the first destructive seismic event after the unification of Italy); (2) its features (a volcanic earthquake); (3) the socioeconomic consequences caused at such an important seaside resort.

  6. Short-term wind power prediction based on LSSVM–GSA model

    International Nuclear Information System (INIS)

    Yuan, Xiaohui; Chen, Chen; Yuan, Yanbin; Huang, Yuehua; Tan, Qingxiong

    2015-01-01

    Highlights: • A hybrid model is developed for short-term wind power prediction. • The model is based on LSSVM and the gravitational search algorithm. • The gravitational search algorithm is used to optimize the parameters of the LSSVM. • The effect of different LSSVM kernel functions on wind power prediction is discussed. • Comparative studies show that the prediction accuracy of wind power is improved. - Abstract: Wind power forecasting can improve the economical and technical integration of wind energy into the existing electricity grid, but wind power is hard to forecast accurately because of its intermittency and randomness. To utilize wind power to the utmost extent, it is very important to predict the output power of a wind farm accurately while guaranteeing the security and stability of power system operation. In this paper, a hybrid model (LSSVM–GSA) based on the least squares support vector machine (LSSVM) and the gravitational search algorithm (GSA) is proposed to forecast short-term wind power. As the kernel function and the related parameters of the LSSVM have a great influence on the performance of the prediction model, LSSVM models based on different kernel functions are established for short-term wind power prediction; an optimal kernel function is then determined and the parameters of the LSSVM model are optimized using GSA. Compared with a Back Propagation (BP) neural network and a support vector machine (SVM) model, the simulation results show that the hybrid LSSVM–GSA model based on the exponential radial basis kernel function and GSA has higher accuracy for short-term wind power prediction. The proposed LSSVM–GSA is therefore a better model for short-term wind power prediction.
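
    A compact sketch of the two ingredients, under stated assumptions: the LSSVM regression dual solved in closed form with an RBF kernel, and a bare-bones GSA tuning the regularization and kernel width on a toy series. The agent count, iteration count, gravitational-constant decay, and the synthetic data are all illustrative choices, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf(A, B, sigma):
    """RBF (Gaussian) kernel matrix between row-sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit_predict(Xtr, ytr, Xte, gamma, sigma):
    """Closed-form LSSVM regression: solve the KKT system
    [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(Xtr)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(Xtr, Xtr, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], ytr)))
    b, alpha = sol[0], sol[1:]
    return rbf(Xte, Xtr, sigma) @ alpha + b

def fitness(pos, Xtr, ytr, Xva, yva):
    """Validation RMSE for agent position = (log10 gamma, log10 sigma)."""
    gamma, sigma = 10.0 ** pos
    err = lssvm_fit_predict(Xtr, ytr, Xva, gamma, sigma) - yva
    return float(np.sqrt(np.mean(err ** 2)))

def gsa(obj, n_agents=8, iters=20, lo=-1.0, hi=2.0):
    """Minimal gravitational search: lighter (worse) agents are pulled
    toward heavier (better) ones; G decays over iterations."""
    X = rng.uniform(lo, hi, (n_agents, 2))
    V = np.zeros_like(X)
    best, best_f = X[0].copy(), obj(X[0])
    for t in range(iters):
        f = np.array([obj(x) for x in X])
        if f.min() < best_f:
            best_f, best = f.min(), X[f.argmin()].copy()
        m = (f.max() - f) / (f.max() - f.min() + 1e-12)
        M = m / (m.sum() + 1e-12)
        G = 100.0 * np.exp(-8.0 * t / iters)
        acc = np.zeros_like(X)
        for i in range(n_agents):
            for j in range(n_agents):
                if i != j:
                    diff = X[j] - X[i]
                    r = np.linalg.norm(diff) + 1e-12
                    acc[i] += rng.random() * G * M[j] * diff / r
        V = rng.random(X.shape) * V + acc
        X = np.clip(X + V, lo, hi)
    return best, best_f

# Toy "wind power" series: deterministic function plus noise.
Xtr = rng.uniform(0, 6, (60, 1)); ytr = np.sin(Xtr[:, 0]) + 0.1 * rng.normal(size=60)
Xva = rng.uniform(0, 6, (30, 1)); yva = np.sin(Xva[:, 0])
best_pos, best_rmse = gsa(lambda p: fitness(p, Xtr, ytr, Xva, yva))
```

    The same structure carries over to real wind-farm data: only the feature matrix, target series, and search bounds change.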

  7. OMG Earthquake! Can Twitter improve earthquake response?

    Science.gov (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
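
    The detection idea described above, counting keyword tweets per minute and flagging a jump far above the background rate, can be sketched as follows. The timestamps and threshold are invented for illustration; they are not the USGS system's actual values.

```python
from collections import Counter
from datetime import datetime, timedelta

# Flag an event when the per-minute count of "earthquake" tweets jumps far
# above a background rate of less than one per hour.
def detect_spike(tweet_times, threshold_per_min=20):
    """tweet_times: list of datetimes; returns the first minute (a datetime
    truncated to the minute) whose count reaches the threshold, else None."""
    per_minute = Counter(t.replace(second=0, microsecond=0) for t in tweet_times)
    for minute in sorted(per_minute):
        if per_minute[minute] >= threshold_per_min:
            return minute
    return None

t0 = datetime(2009, 3, 30, 10, 40)
background = [t0 + timedelta(minutes=60 * i) for i in range(3)]   # ~1 per hour
burst = [t0 + timedelta(hours=4, seconds=s) for s in range(150)]  # ~150 in 2.5 min
onset = detect_spike(background + burst)
```

    In a full system, the tweets within the flagged minute would then be geo-located to sketch the felt area.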

  8. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from 1901 to 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  9. Ground-motion modeling of the 1906 San Francisco earthquake, part I: Validation using the 1989 Loma Prieta earthquake

    Science.gov (United States)

    Aagaard, Brad T.; Brocher, T.M.; Dolenc, D.; Dreger, D.; Graves, R.W.; Harmsen, S.; Hartzell, S.; Larsen, S.; Zoback, M.L.

    2008-01-01

    We compute ground motions for the Beroza (1991) and Wald et al. (1991) source models of the 1989 magnitude 6.9 Loma Prieta earthquake using four different wave-propagation codes and recently developed 3D geologic and seismic velocity models. In preparation for modeling the 1906 San Francisco earthquake, we use this well-recorded earthquake to characterize how well our ground-motion simulations reproduce the observed shaking intensities and the amplitudes and durations of recorded motions throughout the San Francisco Bay Area. All of the simulations generate ground motions consistent with the large-scale spatial variations in shaking associated with rupture directivity and the geologic structure. We attribute the small variations among the synthetics to the minimum shear-wave speed permitted in the simulations and how they accommodate topography. Our long-period simulations, on average, underpredict shaking intensities by about one-half modified Mercalli intensity (MMI) unit (25%-35% in peak velocity), while our broadband simulations, on average, underpredict the shaking intensities by one-fourth MMI unit (16% in peak velocity). Discrepancies with observations arise due to errors in the source models and geologic structure. The consistency in the synthetic waveforms across the wave-propagation codes for a given source model suggests the uncertainty in the source parameters tends to exceed the uncertainty in the seismic velocity structure. In agreement with earlier studies, we find that a source model with slip more evenly distributed northwest and southeast of the hypocenter would be preferable to both the Beroza and Wald source models. Although the new 3D seismic velocity model improves upon previous velocity models, we identify two areas needing improvement. Nevertheless, we find that the seismic velocity model and the wave-propagation codes are suitable for modeling the 1906 earthquake and scenario events in the San Francisco Bay Area.

  10. Modeling of a historical earthquake in Erzincan, Turkey (Ms 7.8, in 1939) using regional seismological information obtained from a recent event

    Science.gov (United States)

    Karimzadeh, Shaghayegh; Askan, Aysegul

    2018-04-01

    Located within a basin structure at the conjunction of the North East Anatolian, North Anatolian and Ovacik Faults, the Erzincan city center (Turkey) is one of the most hazardous regions in the world. The combination of the seismotectonic and geological settings of the region has resulted in a series of significant seismic activities, including the 1939 (Ms 7.8) as well as the 1992 (Mw = 6.6) earthquakes. The devastating 1939 earthquake occurred in the pre-instrumental era in the region, with no local seismograms available; thus, only a limited number of studies exist on that earthquake. The 1992 event, by contrast, despite the sparse local network at that time, has been studied extensively. This study aims to simulate the 1939 Erzincan earthquake using available regional seismic and geological parameters. Despite the several uncertainties involved, such an effort to quantitatively model the 1939 earthquake is promising, given the historical reports of extensive damage and fatalities in the area. The results of this study are expressed in terms of anticipated acceleration time histories at certain locations, the spatial distribution of selected ground motion parameters, and felt intensity maps of the region. Simulated motions are first compared against empirical ground motion prediction equations derived from both local and global datasets. Next, anticipated intensity maps of the 1939 earthquake are obtained using local correlations between peak ground motion parameters and felt intensity values. Comparisons of the estimated intensity distributions with the corresponding observed intensities indicate a reasonable modeling of the 1939 earthquake.

  11. Wrightwood and the earthquake cycle: What a long recurrence record tells us about how faults work

    Science.gov (United States)

    Weldon, R.; Scharer, K.; Fumal, T.; Biasi, G.

    2004-01-01

    The concept of the earthquake cycle is so well established that one often hears statements in the popular media like, "the Big One is overdue" and "the longer it waits, the bigger it will be." Surprisingly, data to critically test the variability in recurrence intervals, rupture displacements, and relationships between the two are almost nonexistent. To generate a long series of earthquake intervals and offsets, we have conducted paleoseismic investigations across the San Andreas fault near the town of Wrightwood, California, excavating 45 trenches over 18 years, and can now provide some answers to basic questions about recurrence behavior of large earthquakes. To date, we have characterized at least 30 prehistoric earthquakes in a 6000-yr-long record, complete for the past 1500 yr and for the interval 3000-1500 B.C. For the past 1500 yr, the mean recurrence interval is 105 yr (31-165 yr for individual intervals) and the mean slip is 3.2 m (0.7-7 m per event). The series is slightly more ordered than random and has a notable cluster of events, during which strain was released at 3 times the long-term average rate. Slip associated with an earthquake is not well predicted by the interval preceding it, and only the largest two earthquakes appear to affect the time interval to the next earthquake. Generally, short intervals tend to coincide with large displacements and long intervals with small displacements. The most significant correlation we find is that earthquakes are more frequent following periods of net strain accumulation spanning multiple seismic cycles. The extent of paleoearthquake ruptures may be inferred by correlating event ages between different sites along the San Andreas fault. Wrightwood and other nearby sites experience rupture that could be attributed to overlap of relatively independent segments that each behave in a more regular manner. However, the data are equally consistent with a model in which the irregular behavior seen at Wrightwood
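
    Statistics like the quoted mean interval, and the degree to which a series is "more ordered than random", are commonly summarized by the coefficient of variation (COV) of the inter-event times: COV well below 1 indicates quasi-periodic recurrence, near 1 Poisson-like randomness, and above 1 clustering. A toy example with invented event dates, not the actual Wrightwood chronology:

```python
import statistics

# Mean recurrence interval and coefficient of variation for a dated
# earthquake series. Event years below are invented for illustration.
def recurrence_stats(event_years):
    intervals = [b - a for a, b in zip(event_years, event_years[1:])]
    mean = statistics.mean(intervals)
    cov = statistics.pstdev(intervals) / mean
    return mean, cov

events = [500, 610, 680, 800, 900, 1010, 1090, 1210, 1300, 1410, 1500]
mean_interval, cov = recurrence_stats(events)   # COV < 1: slightly ordered
```

    Applied to a real paleoseismic chronology, these two numbers give a first quantitative handle on whether the fault behaves periodically, randomly, or in clusters.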

  12. Probabilistic Seismic Hazard Assessment for Himalayan-Tibetan Region from Historical and Instrumental Earthquake Catalogs

    Science.gov (United States)

    Rahman, M. Moklesur; Bai, Ling; Khan, Nangyal Ghani; Li, Guohui

    2018-02-01

    The Himalayan-Tibetan region has a long history of devastating earthquakes with widespread casualties and socio-economic damage. Here, we conduct a probabilistic seismic hazard analysis by incorporating incomplete historical earthquake records along with instrumental earthquake catalogs for the Himalayan-Tibetan region. Historical earthquake records dating back more than 1000 years and an updated, homogenized and declustered instrumental earthquake catalog since 1906 are utilized. The essential seismicity parameters, namely the mean seismicity rate γ, the Gutenberg-Richter b value, and the maximum expected magnitude M max, are estimated using a maximum likelihood algorithm that allows for the incompleteness of the catalog. To compute the hazard values, three seismogenic source models (smoothed gridded, linear, and areal sources) and two sets of ground motion prediction equations are combined by means of a logic tree to account for the epistemic uncertainties. The peak ground acceleration (PGA) and spectral acceleration (SA) at 0.2 and 1.0 s are predicted for 2 and 10% probabilities of exceedance over 50 years, assuming bedrock conditions. The resulting PGA and SA maps show a significant spatio-temporal variation in the hazard values. In general, the hazard values are found to be much higher than in previous studies for regions where great earthquakes have actually occurred. The use of the historical and instrumental earthquake catalogs in combination with multiple seismogenic source models provides better seismic hazard constraints for the Himalayan-Tibetan region.
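
    The maximum-likelihood b-value estimate referred to above is conventionally the Aki (1965) estimator, b = log10(e) / (mean(M) - M_c), with a dm/2 correction when magnitudes are binned to width dm. A minimal sketch with a synthetic catalog (the catalog itself is invented, drawn with a true b of 1):

```python
import math
import random

# Aki (1965) maximum-likelihood estimator of the Gutenberg-Richter b value.
def b_value_mle(mags, m_c, dm=0.0):
    """mags: magnitudes; m_c: completeness magnitude; dm: binning width
    (the dm/2 term is the standard correction for binned magnitudes)."""
    above = [m for m in mags if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

# Magnitudes above M_c = 4.0 drawn from an exponential law with true b = 1.
random.seed(0)
catalog = [4.0 + random.expovariate(math.log(10)) for _ in range(5000)]
b = b_value_mle(catalog, m_c=4.0)   # should recover b close to 1
```

    In a real hazard study the same estimator is applied per source zone, with M_c and dm taken from the catalog's completeness analysis.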

  13. Reducing process delays for real-time earthquake parameter estimation - An application of KD tree to large databases for Earthquake Early Warning

    Science.gov (United States)

    Yin, Lucy; Andrews, Jennifer; Heaton, Thomas

    2018-05-01

    Earthquake parameter estimation by nearest-neighbor searching in a large database of observations can give reliable predictions. However, in the real-time application of Earthquake Early Warning (EEW) systems, the accuracy gained from a large database is penalized by a significant delay in processing time. We propose to use a multidimensional binary search tree (KD tree) data structure to organize large seismic databases and so reduce the processing time of the nearest-neighbor search used for prediction. We evaluated the performance of the KD tree on the Gutenberg Algorithm, a database-searching algorithm for EEW. We constructed an offline test to predict peak ground motions using a database with feature sets of waveform filter-bank characteristics, and compared the results with the observed seismic parameters. We concluded that a large database provides more accurate predictions of ground motion information, such as peak ground acceleration, velocity, and displacement (PGA, PGV, PGD), than of source parameters, such as hypocenter distance. Applying the KD tree to organize the database reduced the average search time by 85% relative to the exhaustive method, making the method feasible for real-time implementation. The algorithm is straightforward, and the results will reduce the overall warning-delivery time for EEW.
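
    The core mechanism, replacing an exhaustive scan of the feature database with a KD-tree query, can be sketched with SciPy's cKDTree. The 6-dimensional feature vectors, the targets, and k = 30 are placeholders, not the Gutenberg Algorithm's actual configuration.

```python
import numpy as np
from scipy.spatial import cKDTree

# Organize a large feature database in a KD tree so nearest-neighbor
# lookups avoid an exhaustive scan of every stored record.
rng = np.random.default_rng(0)
database = rng.normal(size=(100_000, 6))   # stored waveform feature vectors
targets = rng.normal(size=(100_000,))      # e.g. log PGA for each record

tree = cKDTree(database)
query = rng.normal(size=(1, 6))            # features of an incoming event

# k nearest stored events; predict by averaging their ground motions.
dist, idx = tree.query(query, k=30)
prediction = float(targets[idx[0]].mean())

# Brute-force equivalent (what the KD tree avoids doing exhaustively
# for every incoming waveform window):
brute_idx = np.argsort(((database - query) ** 2).sum(axis=1))[:30]
```

    The tree is built once offline; each real-time query then costs roughly logarithmic rather than linear time in the database size, which is where the reported latency reduction comes from.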

  14. Iranian earthquakes, a uniform catalog with moment magnitudes

    Science.gov (United States)

    Karimiparidari, Sepideh; Zaré, Mehdi; Memarian, Hossein; Kijko, Andrzej

    2013-07-01

    A uniform earthquake catalog is an essential tool in any seismic hazard analysis. In this study, an earthquake catalog of Iran and adjacent areas was compiled, using international and national databanks. The following priorities were applied in selecting magnitude and earthquake location: (a) local catalogs were given higher priority for establishing the location of an earthquake and (b) global catalogs were preferred for determining earthquake magnitudes. Earthquakes that have occurred within the bounds between 23-42° N and 42-65° E, with a magnitude range of M W 3.5-7.9, from the third millennium BC until April 2010 were included. In an effort to avoid the "boundary effect," since the newly compiled catalog will be mainly used for seismic hazard assessment, the study area includes the areas adjacent to Iran. The standardization of the catalog in terms of magnitude was achieved by the conversion of all types of magnitude into moment magnitude, M W, by using the orthogonal regression technique. In the newly compiled catalog, all aftershocks were detected, based on the procedure described by Gardner and Knopoff (Bull Seismol Soc Am 64:1363-1367, 1974). The seismicity parameters were calculated for the six main tectonic seismic zones of Iran, i.e., the Zagros Mountain Range, the Alborz Mountain Range, Central Iran, Kope Dagh, Azerbaijan, and Makran.
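
    Orthogonal regression treats both magnitude scales as measured with error, unlike ordinary least squares, which assumes the predictor is exact. A 2-D total-least-squares sketch follows; the conversion coefficients are fit to synthetic data and are not the catalog's actual relations.

```python
import numpy as np

# Orthogonal (total least squares) fit of y = a*x + b, minimizing
# perpendicular distances; suitable when both variables carry error.
def orthogonal_fit(x, y):
    X = np.column_stack([x - x.mean(), y - y.mean()])
    # The smallest right singular vector is normal to the best-fit line.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    nx, ny = Vt[-1]
    a = -nx / ny
    b = y.mean() - a * x.mean()
    return a, b

# Synthetic Ms/Mw pairs with noise in BOTH scales (illustrative relation).
rng = np.random.default_rng(2)
ms_true = rng.uniform(4.0, 7.5, 300)
ms = ms_true + rng.normal(0, 0.1, 300)
mw = 0.67 * ms_true + 2.07 + rng.normal(0, 0.1, 300)
a, b = orthogonal_fit(ms, mw)   # recovers the underlying relation
```

    Applying the fitted relation to every Ms entry (and analogous fits for mb, ML, etc.) yields the homogenized Mw catalog the abstract describes.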

  15. Seismic Safety Margins Research Program. Phase I. Interim definition of terms

    International Nuclear Information System (INIS)

    Smith, P.D.; Dong, R.G.

    1980-01-01

    This report documents interim definitions of terms in the Seismic Safety Margins Research Program (SSMRP). The intent is to establish a common terminology integral to the probabilistic methods that predict more realistically the behavior of nuclear power plants during an earthquake. These definitions are a response to a request by the Nuclear Regulatory Commission Advisory Committee on Reactor Safeguards at its meeting held November 15-16, 1979

  16. Reading a 400,000-year record of earthquake frequency for an intraplate fault.

    Science.gov (United States)

    Williams, Randolph T; Goodwin, Laurel B; Sharp, Warren D; Mozley, Peter S

    2017-05-09

    Our understanding of the frequency of large earthquakes at timescales longer than instrumental and historical records is based mostly on paleoseismic studies of fast-moving plate-boundary faults. Similar study of intraplate faults has been limited until now, because intraplate earthquake recurrence intervals are generally long (10s to 100s of thousands of years) relative to conventional paleoseismic records determined by trenching. Long-term variations in the earthquake recurrence intervals of intraplate faults therefore are poorly understood. Longer paleoseismic records for intraplate faults are required both to better quantify their earthquake recurrence intervals and to test competing models of earthquake frequency (e.g., time-dependent, time-independent, and clustered). We present the results of U-Th dating of calcite veins in the Loma Blanca normal fault zone, Rio Grande rift, New Mexico, United States, that constrain earthquake recurrence intervals over much of the past ∼550 ka, the longest direct record of seismic frequency documented for any fault to date. The 13 distinct seismic events delineated by this effort demonstrate that for >400 ka, the Loma Blanca fault produced periodic large earthquakes, consistent with a time-dependent model of earthquake recurrence. However, this time-dependent series was interrupted by a cluster of earthquakes at ∼430 ka. The carbon isotope composition of calcite formed during this seismic cluster records rapid degassing of CO2, suggesting an interval of anomalous fluid source. In concert with U-Th dates recording decreased recurrence intervals, we infer seismicity during this interval records fault-valve behavior. These data provide insight into the long-term seismic behavior of the Loma Blanca fault and, by inference, other intraplate faults.

  18. Surface deformation associated with the November 23, 1977, Caucete, Argentina, earthquake sequence

    Science.gov (United States)

    Kadinsky-Cade, K.; Reilinger, R.; Isacks, B.

    1985-01-01

    The 1977 Caucete (San Juan) earthquake considered in the present paper occurred near the Sierra Pie de Palo in the Sierras Pampeanas tectonic province of western Argentina. In the study reported, coseismic surface deformation is combined with seismic observations (main shock and aftershocks, both teleseismic and local data) to place constraints on the geometry and slip of the main fault responsible for the 1977 earthquake. The implications of the 1977 event for long-term crustal shortening and earthquake recurrence rates in this region are also discussed. It is concluded that the 1977 Caucete earthquake was accompanied by more than 1 m of vertical uplift.

  19. Prediction and evaluation of nonlinear site response with potentially liquefiable layers in the area of Nafplion (Peloponnesus, Greece for a repeat of historical earthquakes

    Directory of Open Access Journals (Sweden)

    V. K. Karastathis

    2010-11-01

    Full Text Available We examine the possible non-linear behaviour of potentially liquefiable layers at selected sites located within the expansion area of the town of Nafplion, East Peloponnese, Greece. Input motion is computed for three scenario earthquakes, selected on the basis of historical seismicity data, using a stochastic strong ground motion simulation technique which takes into account the finite dimensions of the earthquake sources. Site-specific ground acceleration synthetics and soil profiles are then used to evaluate the liquefaction potential at the sites of interest. The activation scenario of the Iria fault, which is the closest one to Nafplion (M = 6.4), is found to be the most hazardous in terms of liquefaction initiation. In this scenario almost all the examined sites exhibit liquefaction features at depths of 6-12 m. For scenario earthquakes at two more distant seismic sources (Epidaurus fault, M 6.3; Xylokastro fault, M 6.7), strong ground motion amplification phenomena by the shallow soft soil layer are expected to be observed.
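
    Liquefaction-potential evaluations of this kind typically compare a cyclic stress ratio (CSR), from the simplified procedure after Seed and Idriss, against the soil's cyclic resistance ratio (CRR). The sketch below uses that standard form with invented soil and shaking values; it is not the site-specific analysis of the paper.

```python
# Simplified liquefaction screening (after Seed & Idriss): a factor of
# safety below 1 indicates likely liquefaction triggering.
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, r_d):
    """a_max_g: peak ground acceleration as a fraction of g;
    sigma_v / sigma_v_eff: total / effective vertical stress (kPa);
    r_d: depth-dependent stress reduction factor (~1 near the surface)."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

def factor_of_safety(crr, csr):
    return crr / csr

# Hypothetical loose sandy layer at ~8 m depth under a strong scenario motion.
csr = cyclic_stress_ratio(a_max_g=0.30, sigma_v=150.0, sigma_v_eff=80.0, r_d=0.95)
fs = factor_of_safety(crr=0.20, csr=csr)   # FS < 1: liquefaction expected
```

    In a full site study, a_max comes from the simulated scenario motions and CRR from in-situ test data (e.g. SPT or CPT correlations), evaluated layer by layer.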

  20. Large earthquake rates from geologic, geodetic, and seismological perspectives

    Science.gov (United States)

    Jackson, D. D.

    2017-12-01

    Earthquake rate and recurrence information comes primarily from geology, geodesy, and seismology. Geology gives the longest temporal perspective, but it reveals only surface deformation, relatable to earthquakes only with many assumptions. Geodesy is also limited to surface observations, but it detects evidence of the processes leading to earthquakes, again subject to important assumptions. Seismology reveals actual earthquakes, but its history is too short to capture important properties of very large ones. Unfortunately, the ranges of these observation types barely overlap, so that integrating them into a consistent picture adequate to infer future prospects requires a great deal of trust. Perhaps the most important boundary is the temporal one at the beginning of the instrumental seismic era, about a century ago. We have virtually no seismological or geodetic information on large earthquakes before then, and little geological information after. Virtually all modern forecasts of large earthquakes assume some form of equivalence between tectonic- and seismic moment rates as functions of location, time, and magnitude threshold. That assumption links geology, geodesy, and seismology, but it invokes a host of other assumptions and incurs very significant uncertainties. Questions include temporal behavior of seismic and tectonic moment rates; shape of the earthquake magnitude distribution; upper magnitude limit; scaling between rupture length, width, and displacement; depth dependence of stress coupling; value of crustal rigidity; and relation between faults at depth and their surface fault traces, to name just a few. In this report I'll estimate the quantitative implications for large earthquake rates.
Global studies like the GEAR1 project suggest that surface deformation from geology and geodesy best shows the geography of very large, rare earthquakes in the long term, while seismological observations of small earthquakes best forecast moderate earthquakes.

  1. An Artificial Neural Network Based Short-term Dynamic Prediction of Algae Bloom

    Directory of Open Access Journals (Sweden)

    Yao Junyang

    2014-06-01

    Full Text Available This paper proposes a method for short-term prediction of algae blooms based on an artificial neural network. Firstly, principal component analysis is applied to water environmental factors in algae bloom raceway ponds to obtain the main factors that influence the formation of algae blooms. Then, a short-term dynamic prediction model based on a neural network is built, with the current chlorophyll-a values as input and the chlorophyll-a values at the next moment as output, to realize short-term dynamic prediction of algae blooms. Simulation results show that the model can realize short-term prediction of algae blooms effectively.
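
    A minimal numpy version of the described setup, current chlorophyll-a value in, next value out, through one hidden layer trained with plain gradient descent. The synthetic series, architecture, and learning rate are illustrative assumptions, not the paper's model.

```python
import numpy as np

# One-hidden-layer network mapping the current (scaled) chlorophyll-a
# value to the next one, trained by full-batch gradient descent.
rng = np.random.default_rng(3)
t = np.arange(400, dtype=float)
series = 10 + 3 * np.sin(2 * np.pi * t / 50)   # synthetic chlorophyll-a
x = series[:-1].reshape(-1, 1) / 15.0          # current value (scaled)
y = series[1:].reshape(-1, 1) / 15.0           # next value (target)

W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    h = np.tanh(x @ W1 + b1)                   # hidden activations
    pred = h @ W2 + b2
    err = pred - y
    gW2 = h.T @ err / len(x); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)           # backprop through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

# One-step-ahead RMSE in the original (unscaled) units.
rmse = float(np.sqrt(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))) * 15.0
```

    Note that with a single scalar input the mapping is inherently ambiguous on a cyclic series (rising vs falling branch), which bounds the achievable accuracy; the PCA step in the paper supplies richer inputs to mitigate exactly this.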

  2. Solar eruptions - soil radon - earthquakes

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time a new natural phenomenon was established: a contrasting increase in the soil radon level under the influence of solar flares. Such an increase is one of the geochemical indicators of earthquakes. Most researchers consider this a phenomenon of exclusively terrestrial processes. Investigations of the link between earthquakes and solar activity, carried out during the last decade in different countries, are based on the analysis of statistical data ΣΕ (t) and W (t). As established, the overall seismicity of the Earth and of its separate regions depends on an 11-year cycle of solar activity. The experimental data provided in the paper serve as a first step toward revealing the cause-and-effect solar-terrestrial chain "solar eruption - lithosphere radon - earthquakes"; further collection of experimental data is needed. For the first time, through the radon constituent of terrestrial radiation, an objectification has been made of the elementary lattice of the Hartmann network contoured by the biolocation method. As found, radon concentration variations in the Hartmann network nodes determine the dynamics of solar-terrestrial relationships. Of the three types of rapidly running processes conditioned by solar-terrestrial bonds, earthquakes are attributed to rapidly running destructive processes that occur most intensely at the junctures of tectonic massifs, along transform and deep faults. The basic factors provoking earthquakes are both magnetic-structural effects and long-term (over 5 months) bombardment of the lithosphere surface by highly energetic particles of corpuscular solar flows, as confirmed by photometry. As a result of the solar flares that occurred from 29 October to 4 November 2003, a sharply contrasting increase in soil radon, an earthquake indicator, was established on the territory of Yerevan City. 
A month and a half later, earthquakes occurred in San Francisco, Iran, Turkey

  3. Statistical physics approach to earthquake occurrence and forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Arcangelis, Lucilla de [Department of Industrial and Information Engineering, Second University of Naples, Aversa (CE) (Italy); Godano, Cataldo [Department of Mathematics and Physics, Second University of Naples, Caserta (Italy); Grasso, Jean Robert [ISTerre, IRD-CNRS-OSUG, University of Grenoble, Saint Martin d’Héres (France); Lippiello, Eugenio, E-mail: eugenio.lippiello@unina2.it [Department of Mathematics and Physics, Second University of Naples, Caserta (Italy)

    2016-04-25

    different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.

  4. Testimonies to the L'Aquila earthquake (2009) and to the L'Aquila process

    Science.gov (United States)

    Kalenda, Pavel; Nemec, Vaclav

    2014-05-01

    A lot of confusion, misinformation, false solidarity, efforts to misuse geoethics, and other unethical activities in favour of the top Italian seismologists responsible for a bad and superficial evaluation of the situation 6 days prior to the earthquake - this is the general characteristic of the whole period of 5 years separating us from the horrible morning of April 6, 2009, in L'Aquila, with its 309 human victims. The first author of this presentation, a seismologist, had the unusual opportunity to visit the unfortunate city in April 2009. He received first-hand information that a real, scientifically based prediction had already existed for some shocks in the area on March 29 and 30, 2009. The author of the prediction, Gianpaolo Giuliani, was obliged to stop any public information diffused by means of the internet. A new prediction was known to him on March 31 - the day when the "Commission of Great Risks" offered a public assurance that any immediate earthquake could be practically excluded. In reality, the members of the commission completely ignored this prediction, declaring it a false alarm by "somebody" (without even using Giuliani's name). The observations by Giuliani were of high quality from the scientific point of view. G. Giuliani predicted the L'Aquila earthquake in a professional way - for the first time during many years of observations. The anomalies which preceded the L'Aquila earthquake were detected in many places in Europe at the same time. The question is what locality would have been identified as the potential focal area had G. Giuliani known the other observations in Europe. Deformation (and other) anomalies are observable before almost all global M8 earthquakes. Earthquakes are preceded by deformation and are predictable. The testimony of the second author is based on many unfortunate personal experiences with representatives of the INGV Rome and their supporters from India and even Australia. 
In July 2010, prosecutor Fabio Picuti charged the Commission

  5. Earthquake Risk Mitigation in the Tokyo Metropolitan area

    Science.gov (United States)

    Hirata, N.; Sakai, S.; Kasahara, K.; Nakagawa, S.; Nanjo, K.; Panayotopoulos, Y.; Tsuruoka, H.

    2010-12-01

    Seismic disaster risk mitigation in urban areas constitutes a challenge requiring collaboration of scientific, engineering, and social-science fields. Examples of collaborative efforts include research on detailed plate structure with identification of all significant faults; development of dense seismic networks; strong ground motion prediction, which uses information on near-surface seismic site effects and fault models; earthquake-resistant and earthquake-proof structures; and cross-discipline infrastructure for effective risk mitigation just after catastrophic events. Risk mitigation strategy for the next great earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern because this plate has caused past mega-thrust earthquakes, such as the 1703 Genroku earthquake (magnitude M8.0) and the 1923 Kanto earthquake (M7.9), which had 105,000 fatalities. A M7 or greater (M7+) earthquake in this area at present has high potential to produce devastating loss of life and property with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that such an M7+ earthquake would cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) of economic loss. This earthquake is evaluated to occur with a probability of 70% within 30 years by the Earthquake Research Committee of Japan. In order to mitigate the disaster for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area (2007-2011) was launched in collaboration with scientists, engineers, and social scientists in nationwide institutions. The results obtained in the respective fields will be integrated by project termination to improve information on the strategy assessment for seismic risk mitigation in the Tokyo metropolitan area. In this talk, we give an outline of our project as an example of collaborative research on earthquake risk mitigation. 
Discussion is extended to our effort in progress and

  6. A new ensemble model for short term wind power prediction

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albu, Razvan-Daniel; Felea, Ioan

    2012-01-01

    As the objective of this study, a non-linear ensemble system is used to develop a new model for predicting wind speed on a short-term time scale. Short-term wind power prediction has become an extremely important field of research for the energy sector. Regardless of the recent advancements in research...... on prediction models, it was observed that different models have different capabilities and also that no single model is suitable under all situations. The idea behind EPS (ensemble prediction systems) is to take advantage of the unique features of each subsystem to capture diverse patterns that exist in the dataset...
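
The EPS idea, letting each subsystem contribute where it performs best, can be sketched with simple inverse-error weighting on synthetic data; this weighting rule is a common baseline, not necessarily the combination used in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Hypothetical "actual" wind power and three subsystem forecasts with
# different error characteristics (all synthetic).
actual = 50 + 10 * np.sin(np.linspace(0, 6 * np.pi, n)) + rng.normal(0, 1, n)
forecasts = np.stack([
    actual + rng.normal(0, 2, n),   # accurate subsystem
    actual + rng.normal(3, 4, n),   # biased, noisier subsystem
    actual + rng.normal(0, 8, n),   # poor subsystem
])

# Inverse-MSE weights learned on the first half (calibration window),
# then applied on the second half (evaluation window).
calib, evalw = slice(0, n // 2), slice(n // 2, n)
mse = ((forecasts[:, calib] - actual[calib]) ** 2).mean(axis=1)
w = (1 / mse) / (1 / mse).sum()
ensemble = np.tensordot(w, forecasts, axes=1)

def rmse(pred):
    return float(np.sqrt(((pred[evalw] - actual[evalw]) ** 2).mean()))

rmse_ens = rmse(ensemble)
rmse_members = [rmse(f) for f in forecasts]
```

The weighted combination leans on the accurate subsystem while still drawing a little information from the others, which is the essential EPS behavior.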

  7. Permeability, storage and hydraulic diffusivity controlled by earthquakes

    Science.gov (United States)

    Brodsky, E. E.; Fulton, P. M.; Xue, L.

    2016-12-01

    Earthquakes can increase permeability in fractured rocks. In the far field, such permeability increases are attributed to seismic waves and can last for months after the initial earthquake. Laboratory studies suggest that unclogging of fractures by the transient flow driven by seismic waves is a viable mechanism. These dynamic permeability increases may contribute to permeability enhancement in the seismic clouds accompanying hydraulic fracking. Permeability enhancement by seismic waves could potentially be engineered, and the experiments suggest the process will be most effective at a preferred frequency. We have recently observed similar processes inside active fault zones after major earthquakes. A borehole observatory in the fault that generated the M9.0 2011 Tohoku earthquake reveals a sequence of temperature pulses during the secondary aftershock sequence of an M7.3 aftershock. The pulses are attributed to fluid advection by a flow through a zone of transiently increased permeability. Directly after the M7.3 earthquake, the newly damaged fault zone is highly susceptible to further permeability enhancement, but ultimately heals within a month and becomes no longer as sensitive. The observation suggests that the newly damaged fault zone is more prone to fluid pulsing than would be expected based on the long-term permeability structure. Even longer term healing is seen inside the fault zone of the 2008 M7.9 Wenchuan earthquake. The competition between damage and healing (or clogging and unclogging) results in dynamically controlled permeability, storage and hydraulic diffusivity. Recent measurements of in situ fault zone architecture at the 1-10 meter scale suggest that active fault zones often have hydraulic diffusivities near 10⁻² m²/s. This uniformity is true even within the damage zone of the San Andreas fault, where permeability and storage increases balance each other to achieve this value of diffusivity over a 400 m wide region. 
We speculate that fault zones
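
The quoted diffusivity scale can be sanity-checked from textbook definitions (D = K / Ss, with K = k·rho·g/mu); every parameter value below is an assumed order-of-magnitude input, not a measurement from the study:

```python
# Assumed, order-of-magnitude inputs (not data from the paper):
k = 1e-14      # permeability (m^2), a plausible damage-zone value
rho = 1000.0   # water density (kg/m^3)
g = 9.81       # gravitational acceleration (m/s^2)
mu = 1e-3      # water viscosity (Pa*s)
Ss = 1e-5      # specific storage (1/m)

K = k * rho * g / mu   # hydraulic conductivity (m/s)
D = K / Ss             # hydraulic diffusivity (m^2/s), lands near 10**-2
```

With these plausible inputs the diffusivity comes out on the order of 10⁻² m²/s, the same scale the abstract reports for active fault zones.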

  8. GIS Based System for Post-Earthquake Crisis Management Using Cellular Network

    Science.gov (United States)

    Raeesi, M.; Sadeghi-Niaraki, A.

    2013-09-01

    Earthquakes are among the most destructive natural disasters. Earthquakes happen mainly near the edges of tectonic plates, but they may happen just about anywhere. Earthquakes cannot be predicted. Quick response after disasters like earthquakes decreases loss of life and costs. Massive earthquakes often cause structures to collapse, trapping victims under dense rubble for long periods of time. After an earthquake has destroyed some areas, several teams are sent to find the locations of the destroyed areas. The search and rescue phase is usually maintained for many days, and reducing this time is very important for the survival of trapped people. A Geographical Information System (GIS) can be used for decreasing response time and improving management in critical situations. Position estimation in a short period of time is important. This paper proposes a GIS-based system for post-earthquake disaster management. This system relies on several mobile positioning methods such as the cell-ID and TA method, signal strength method, angle of arrival method, time of arrival method, and time difference of arrival method. For quick positioning, the system can be helped by any person who has a mobile device. After positioning and specifying the critical points, the points are sent to a central site for managing the procedure of quick response. This solution establishes a quick way to manage the post-earthquake crisis.
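
Of the listed methods, time-of-arrival ranging reduces to a linear least-squares problem once one range equation is subtracted from the others (the quadratic terms cancel). A minimal sketch with hypothetical tower coordinates and ranges:

```python
import numpy as np

# Hypothetical cell-tower coordinates (km) and a true handset position.
towers = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
true_pos = np.array([0.7, 1.1])

rng = np.random.default_rng(2)
ranges = np.linalg.norm(towers - true_pos, axis=1) + rng.normal(0, 0.01, len(towers))

# Subtracting the first range equation r0^2 = |p - t0|^2 from the others
# removes |p|^2, leaving the linear system A @ p = b.
A = 2 * (towers[1:] - towers[0])
b = (ranges[0] ** 2 - ranges[1:] ** 2
     + (towers[1:] ** 2).sum(axis=1) - (towers[0] ** 2).sum())
est, *_ = np.linalg.lstsq(A, b, rcond=None)
```

With four towers the system is overdetermined, so the least-squares solution averages out some of the ~10 m range noise.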

  10. Earthquake related displacement fields near underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Zandt, G.; Bouchon, M.

    1979-04-01

    Relative displacements of rock masses are evaluated in terms of geological evidence, seismological evidence, data from simulation experiments, and analytical predictive models. Numerical models have been developed to determine displacement fields as a function of depth, distance, and azimuth from an earthquake source. Computer calculations for several types of faults indicate that displacements decrease rapidly with distance from the fault, but that displacements can either increase or decrease as a function of depth depending on the type and geometry of the fault. For long shallow vertical strike-slip faults the displacement decreases markedly with depth. For square strike-slip faults and for dip-slip faults displacement does not decrease as markedly with depth. Geologic structure, material properties, and depth affect the seismic source spectrum. Amplification of the high frequencies of shear waves is larger by a factor of about 2 for layered geologic models than for an elastic half space.

  11. Posttraumatic stress disorder: a serious post-earthquake complication

    Directory of Open Access Journals (Sweden)

    Mudassir Farooqui

    Full Text Available Abstract. Objectives: Earthquakes are unpredictable and devastating natural disasters. They can cause massive destruction and loss of life, and survivors may suffer psychological symptoms of severe intensity. Our goal in this article is to review studies published in the last 20 years to compile what is known about posttraumatic stress disorder (PTSD) occurring after earthquakes. The review also describes other psychiatric complications that can be associated with earthquakes, to provide readers with a better overall understanding, and discusses several sociodemographic factors that can be associated with post-earthquake PTSD. Method: A search for literature was conducted on major databases such as MEDLINE, PubMed, EMBASE, and PsycINFO, in neurology and psychiatry journals, and in many other medical journals. Terms used for electronic searches included, but were not limited to, posttraumatic stress disorder (PTSD), posttraumatic symptoms, anxiety, depression, major depressive disorder, earthquake, and natural disaster. The relevant information was then utilized to determine the relationships between earthquakes and posttraumatic stress symptoms. Results: It was found that PTSD is the most commonly occurring mental health condition among earthquake survivors. Major depressive disorder, generalized anxiety disorder, obsessive compulsive disorder, social phobia, and specific phobias were also listed. Conclusion: The PTSD prevalence rate varied widely. It was dependent on multiple risk factors in target populations and also on the interval of time that had elapsed between the exposure to the deadly incident and measurement. Females seemed to be the most widely affected group, while elderly people and young children exhibit considerable psychosocial impact.

  12. Long-term change of site response after the Mw 9.0 Tohoku earthquake in Japan

    Science.gov (United States)

    Wu, Chunquan; Peng, Zhigang

    2012-12-01

    The recent Mw 9.0 off the Pacific coast of Tohoku earthquake is the largest recorded earthquake in Japan's history. The Tohoku main shock and its aftershocks generated widespread strong shaking as large as ~3000 Gal along the east coast of Japan. Wu and Peng (2011) found a clear drop of resonant frequency of up to 70% during the Tohoku main shock at 6 sites, and a correlation between resonance (peak) frequency and peak ground acceleration (PGA) during the main shock. Here we follow that study and systematically analyze long-term changes of material properties in the shallow crust from one year before to 5 months after the Tohoku main shock, using seismic data recorded by the Japanese strong-motion network KiK-net. We use sliding-window spectral ratios computed from a pair of surface and borehole stations to track the temporal changes in the site response of 6 sites. Our results show two stages of logarithmic recovery after a sharp drop of resonance frequency during the Tohoku main shock. The first stage is a rapid recovery within several hundred seconds to several hours, and the second stage is a slow recovery of more than five months. We also investigate whether the damage caused by the Tohoku main shock could make the near-surface layers more susceptible to further damage, but we do not observe clear changes in susceptibility before and after the Tohoku main shock.
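
The surface-to-borehole spectral-ratio technique can be sketched on synthetic records: below, a white-noise "borehole" trace is amplified around an assumed 5 Hz site resonance to build a "surface" trace, and windowed spectral ratios recover the resonance peak. This illustrates the method only, not the actual KiK-net processing:

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 100.0                       # sampling rate (Hz), assumed
n = 2 ** 14
borehole = rng.normal(0, 1, n)   # synthetic borehole record: white noise

# Build the synthetic surface record by amplifying the borehole record around
# an assumed 5 Hz site resonance (a made-up transfer function).
freq = np.fft.rfftfreq(n, 1 / fs)
gain = 1 + 4 * np.exp(-((freq - 5.0) ** 2) / (2 * 0.5 ** 2))
surface = np.fft.irfft(np.fft.rfft(borehole) * gain, n)

# Sliding-window spectral ratio: accumulate tapered window power spectra of
# each record, then take the square root of their ratio.
win, hop = 512, 256
taper = np.hanning(win)
ps, pb = np.zeros(win // 2 + 1), np.zeros(win // 2 + 1)
for start in range(0, n - win + 1, hop):
    ps += np.abs(np.fft.rfft(surface[start:start + win] * taper)) ** 2
    pb += np.abs(np.fft.rfft(borehole[start:start + win] * taper)) ** 2
ratio = np.sqrt(ps / pb)

fwin = np.fft.rfftfreq(win, 1 / fs)
peak_freq = float(fwin[np.argmax(ratio)])   # should sit near the 5 Hz resonance
```

Tracking how this peak frequency shifts from window to window through time is what reveals the co-seismic drop and the logarithmic recovery described in the abstract.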

  13. Temporal variation of gravity field prior to the Ludian Ms6.5 and Kangding Ms6.3 earthquakes

    Directory of Open Access Journals (Sweden)

    Hongtao Hao

    2015-11-01

    Full Text Available Using mobile gravity data from the central area of Sichuan and Yunnan Provinces, the relationship between gravity variation and earthquakes was studied based on the Ludian Ms6.5 earthquake that occurred on August 3rd, 2014, and the Kangding Ms6.3 earthquake that occurred on November 22nd, 2014; the mechanism of gravity variation was also explored. The results are as follows: (1) Prior to both earthquakes, gravity variation exhibited similar characteristics to those observed before both the Tangshan and Wenchuan earthquakes, in which the typical precursor anomalies were positive gravity variation near the epicenter and the occurrence of a high-gravity-gradient zone across the epicenter prior to the earthquake. (2) A relatively accurate prediction of the occurrence locations of the two earthquakes was made by the Gravity Network Center of China (GNCC) based on these precursor anomalies. In the gravity study report on the 2014 earthquake trends submitted at the end of 2013, the Daofu-Shimian section at the junction of the Xianshuihe and Longmenshan fault zones was noted as an earthquake-risk region with a predicted magnitude of 6.5, which covered the epicenter of the Kangding Ms6.3 earthquake. In another report on earthquake trends in southwestern China submitted in mid-2014, the Lianfeng-Zhaotong fault zone was also classified as an earthquake-risk region with a magnitude of 6.0, and the central area of this region basically overlapped with the epicenter of the Ludian Ms6.5 earthquake. (3) The gravity variation characteristics are reasonably consistent with crustal movements, and deep material migration is likely the primary cause of gravity variation.

  14. Statistics and Analysis of the Relations between Rainstorm Floods and Earthquakes

    Directory of Open Access Journals (Sweden)

    Baodeng Hou

    2016-01-01

    Full Text Available The frequent occurrence of geophysical disasters under climate change has drawn Chinese scholars to pay attention to disaster relations. If the occurrence sequence of disasters could be identified, long-term disaster forecasting could be realized. Based on the Earth Degassing Effect (EDE), taken here as valid, this paper took the magnitude, epicenter, and occurrence time of earthquakes, as well as the epicenter and occurrence time of rainstorm floods, as basic factors to establish an integrated model to study the correlation between rainstorm floods and earthquakes. The 2461 severe earthquakes that occurred in China or within 3000 km of China and the 169 heavy rainstorm floods that occurred in China over the past 200+ years served as the input data of the model. The computational results showed that although most of the rainstorm floods have nothing to do with severe earthquakes from a statistical perspective, some floods might relate to earthquakes. This is especially true when the earthquakes happen in the vapor transmission zone where rainstorms lead to abundant water vapor. In this regard, earthquakes are more likely to cause big rainstorm floods. However, many cases of rainstorm floods could be found after severe earthquakes, with a large extent of uncertainty.
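
A space-time matching model of the kind described can be sketched as follows; the catalogs, window sizes, and distance threshold are all invented for illustration, and a date-shuffling test provides the chance baseline:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic catalogs over ~10 years: columns are (day, lon, lat). The real
# inputs would be the 2461 severe earthquakes and 169 rainstorm floods.
quakes = np.column_stack([rng.uniform(0, 3650, 200),
                          rng.uniform(75, 135, 200),
                          rng.uniform(18, 53, 200)])
floods = np.column_stack([rng.uniform(0, 3650, 40),
                          rng.uniform(75, 135, 40),
                          rng.uniform(18, 53, 40)])

def paired_count(quakes, floods, max_days=90.0, max_deg=5.0):
    """Number of floods preceded by >=1 earthquake inside the space-time window."""
    count = 0
    for day, lon, lat in floods:
        dt = day - quakes[:, 0]
        dist = np.hypot(quakes[:, 1] - lon, quakes[:, 2] - lat)
        count += bool(np.any((dt > 0) & (dt <= max_days) & (dist <= max_deg)))
    return count

observed = paired_count(quakes, floods)

# Chance baseline: re-draw flood dates at random and recount.
null = np.array([
    paired_count(quakes, np.column_stack([rng.uniform(0, 3650, 40),
                                          floods[:, 1], floods[:, 2]]))
    for _ in range(200)
])
excess = observed - null.mean()   # ~0 here, since the synthetic catalogs are random
```

With real catalogs, a persistent positive `excess` beyond the shuffled baseline would be the statistical signal the paper looks for.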

  15. Decision making biases in the communication of earthquake risk

    Science.gov (United States)

    Welsh, M. B.; Steacy, S.; Begg, S. H.; Navarro, D. J.

    2015-12-01

    The L'Aquila trial, with 6 scientists convicted of manslaughter, shocked the scientific community, leading to urgent re-appraisal of communication methods for low-probability, high-impact events. Before the trial, a commission investigating the earthquake recommended that risk assessment be formalised via operational earthquake forecasts and that social scientists be enlisted to assist in developing communication strategies. Psychological research has identified numerous decision biases relevant to this, including hindsight bias, where people (after the fact) overestimate an event's predictability. This affects experts as well as naïve participants, as it relates to their ability to construct a plausible causal story rather than to the likelihood of the event. Another problem is availability, which causes overestimation of the likelihood of observed rare events due to their greater noteworthiness. This, however, is complicated by the 'description-experience' gap, whereby people underestimate probabilities for events they have not experienced. That is, people who have experienced strong earthquakes judge them more likely, while those who have not judge them less likely - relative to actual probabilities. Finally, format changes alter people's decisions: people treat '1 in 10,000' as different from 0.01% despite their mathematical equivalence. Such effects fall under the broad term framing, which describes how different framings of the same event alter decisions. In particular, people's attitude to risk depends significantly on how scenarios are described. We examine the effect of these biases on the communication of change in risk. South Australian participants gave responses to scenarios describing familiar (bushfire) or unfamiliar (earthquake) risks. While bushfires are rare in any specific location, significant fire events occur each year and are extensively covered. By comparison, our study location (Adelaide) last had an M5 quake in 1954. 
Preliminary results suggest the description

  16. Initial Earthquake Centrifuge Model Experiments for the Study of Liquefaction

    National Research Council Canada - National Science Library

    Steedman, R

    1998-01-01

    .... These are intended to gather data suitable for the development of improved design approaches for the prediction of liquefaction under earthquake loading using the new centrifuge facility at the WES...

  17. Building the Southern California Earthquake Center

    Science.gov (United States)

    Jordan, T. H.; Henyey, T.; McRaney, J. K.

    2004-12-01

    Kei Aki was the founding director of the Southern California Earthquake Center (SCEC), a multi-institutional collaboration formed in 1991 as a Science and Technology Center (STC) under the National Science Foundation (NSF) and the U. S. Geological Survey (USGS). Aki and his colleagues articulated a system-level vision for the Center: investigations by disciplinary working groups would be woven together into a "Master Model" for Southern California. In this presentation, we will outline how the Master-Model concept has evolved and how SCEC's structure has adapted to meet scientific challenges of system-level earthquake science. In its first decade, SCEC conducted two regional imaging experiments (LARSE I & II); published the "Phase-N" reports on (1) the Landers earthquake, (2) a new earthquake rupture forecast for Southern California, and (3) new models for seismic attenuation and site effects; it developed two prototype "Community Models" (the Crustal Motion Map and Community Velocity Model) and, perhaps most important, sustained a long-term, multi-institutional, interdisciplinary collaboration. The latter fostered pioneering numerical simulations of earthquake ruptures, fault interactions, and wave propagation. These accomplishments provided the impetus for a successful proposal in 2000 to reestablish SCEC as a "stand alone" center under NSF/USGS auspices. SCEC remains consistent with the founders' vision: it continues to advance seismic hazard analysis through a system-level synthesis that is based on community models and an ever expanding array of information technology. SCEC now represents a fully articulated "collaboratory" for earthquake science, and many of its features are extensible to other active-fault systems and other system-level collaborations. We will discuss the implications of the SCEC experience for EarthScope, the USGS's program in seismic hazard analysis, NSF's nascent Cyberinfrastructure Initiative, and other large collaboratory programs.

  18. Effect of slip-area scaling on the earthquake frequency-magnitude relationship

    Science.gov (United States)

    Senatorski, Piotr

    2017-06-01

    The earthquake frequency-magnitude relationship is considered from the maximum entropy principle (MEP) perspective. The MEP suggests sampling with constraints as a simple stochastic model of seismicity. The model is based on von Neumann's acceptance-rejection method, with the b-value as the parameter that breaks the symmetry between small and large earthquakes. The Gutenberg-Richter law's b-value forms a link between earthquake statistics and physics. A dependence between the b-value and the rupture-area versus slip scaling exponent is derived. This relationship enables us to explain the observed ranges of b-values for different types of earthquakes. Specifically, the different b-value ranges for tectonic and for induced (hydraulic fracturing) seismicity are explained in terms of their different triggering mechanisms: applied stress increase and fault strength reduction, respectively.
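
Since the argument turns on the b-value, it is worth recalling how b is usually estimated: the Aki (1965) maximum-likelihood estimator, with Utsu's correction for binned magnitudes. A quick self-check on a synthetic Gutenberg-Richter sample:

```python
import math
import random

def b_value_mle(mags, m_min, dm=0.0):
    """Aki (1965) maximum-likelihood b-value; dm/2 is Utsu's correction for
    magnitudes binned at spacing dm (use dm=0 for continuous magnitudes)."""
    sample = [m for m in mags if m >= m_min]
    mean_m = sum(sample) / len(sample)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2))

# Synthetic Gutenberg-Richter catalog with b = 1: magnitudes above m_min are
# exponentially distributed with rate b * ln(10).
random.seed(0)
b_true, m_min = 1.0, 2.0
mags = [m_min + random.expovariate(b_true * math.log(10)) for _ in range(20000)]
b_hat = b_value_mle(mags, m_min)
```

The recovered `b_hat` should land close to the true value of 1; with real catalogs, the completeness magnitude m_min must be chosen carefully or the estimate is biased.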

  19. Earthquake research for the safer siting of critical facilities

    Energy Technology Data Exchange (ETDEWEB)

    Cluff, J.L. (ed.)

    1980-01-01

    The task of providing the necessities for living, such as adequate electrical power, water, and fuel, is becoming more complicated with time. Some of the facilities that provide these necessities would present potential hazards to the population if serious damage were to occur to them during earthquakes. Other facilities must remain operable immediately after an earthquake to provide life-support services to people who have been affected. The purpose of this report is to recommend research that will improve the information available to those who must decide where to site these critical facilities, and thereby mitigate the effects of the earthquake hazard. The term critical facility is used in this report to describe facilities that could seriously affect the public well-being through loss of life, large financial loss, or degradation of the environment if they were to fail. The term critical facility also is used to refer to facilities that, although they pose a limited hazard to the public, are considered critical because they must continue to function in the event of a disaster so that they can provide vital services.

  20. Performance of Earthquake Early Warning Systems during the Major Events of the 2016-2017 Central Italy Seismic Sequence.

    Science.gov (United States)

    Festa, G.; Picozzi, M.; Alessandro, C.; Colombelli, S.; Cattaneo, M.; Chiaraluce, L.; Elia, L.; Martino, C.; Marzorati, S.; Supino, M.; Zollo, A.

    2017-12-01

    Earthquake early warning systems (EEWS) are systems that nowadays contribute to seismic risk mitigation actions, both in terms of losses and of societal resilience, by issuing an alert promptly after the earthquake origin and before the ground shaking impacts the targets to be protected. EEWS can be grouped into two main classes: network-based and stand-alone systems. Network-based EEWS make use of dense seismic networks surrounding the fault generating the event (e.g. a Near Fault Observatory, NFO). Rapid processing of the early portion of the P wave allows for location and magnitude estimation of the event, then used to predict the shaking through ground motion prediction equations. Stand-alone systems instead analyze the early P-wave signal to predict, at the recording site itself, the ground shaking carried by the late S or surface waves, through empirically calibrated scaling relationships. We compared the network-based (PRESTo, PRobabilistic and Evolutionary early warning SysTem, www.prestoews.org, Satriano et al., 2011) and the stand-alone (SAVE, on-Site-Alert-leVEl, Caruso et al., 2017) systems by analyzing their performance during the 2016-2017 Central Italy sequence. We analyzed 9 earthquakes having magnitude above 5.0 ... security actions. PRESTo also evaluated the accuracy of location and magnitude. Both systems predict the ground shaking near the event source well, with a success rate around 90% within the potential damage zone. The lead-time is significantly larger for the network-based system, increasing to more than 10 s at 40 km from the event epicentre. The stand-alone system performs better in the near-source region, showing a positive albeit small lead-time ... operational in Italy, based on the available acceleration networks, by improving the capability of reducing the lead-time related to data telemetry.
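
The lead-time trade-off between the two classes of systems follows from simple kinematics; a toy calculation with an assumed typical crustal S-wave speed (not a value from the study):

```python
# Toy alert-time kinematics for earthquake early warning. The S-wave speed is
# an assumed typical crustal value; alert_delay_s lumps together P-wave travel
# to nearby stations, processing, and telemetry.
VS = 3.5   # S-wave speed (km/s)

def lead_time(epicentral_km: float, alert_delay_s: float) -> float:
    """Warning time at a target: S-wave arrival minus the total alert delay."""
    return epicentral_km / VS - alert_delay_s

def blind_zone_km(alert_delay_s: float) -> float:
    """Radius inside which the S wave outruns the alert (no warning)."""
    return VS * alert_delay_s
```

A 5 s alert delay gives a 17.5 km blind zone and several seconds of warning at 40 km, which is why network-based systems shine at distance while on-site systems are the only option near the source.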

  1. Statistical Evaluations of Variations in Dairy Cows’ Milk Yields as a Precursor of Earthquakes

    Science.gov (United States)

    Yamauchi, Hiroyuki; Hayakawa, Masashi; Asano, Tomokazu; Ohtani, Nobuyo; Ohta, Mitsuaki

    2017-01-01

    Simple Summary: There are many reports of abnormal changes occurring in various natural systems prior to earthquakes. Unusual animal behavior is one of these abnormalities; however, there are few objective indicators and to date, reliability has remained uncertain. We found that milk yields of dairy cows decreased prior to an earthquake in our previous case study. In this study, we examined the reliability of decreases in milk yields as a precursor for earthquakes using long-term observation data. In the results, milk yields decreased approximately three weeks before earthquakes. We have come to the conclusion that dairy cow milk yields have applicability as an objectively observable unusual animal behavior prior to earthquakes, and dairy cows respond to some physical or chemical precursors of earthquakes. Abstract: Previous studies have provided quantitative data regarding unusual animal behavior prior to earthquakes; however, few studies include long-term, observational data. Our previous study revealed that the milk yields of dairy cows decreased prior to an extremely large earthquake. To clarify whether the milk yields decrease prior to earthquakes, we examined the relationship between earthquakes of various magnitudes and daily milk yields. The observation period was one year. In the results, cross-correlation analyses revealed a significant negative correlation between earthquake occurrence and milk yields approximately three weeks beforehand. Approximately a week and a half beforehand, a positive correlation was revealed, and the correlation gradually receded to zero as the day of the earthquake approached. Future studies that use data from a longer observation period are needed because this study only considered ten earthquakes and therefore does not have strong statistical power. Additionally, we compared the milk yields with the subionospheric very low frequency/low frequency (VLF/LF) propagation data indicating ionospheric perturbations. 
The results showed
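The lagged cross-correlation analysis described in this record can be sketched as follows. The data here are synthetic (a yield dip planted 21 days before each event day), not the study's dairy records, and the function name and lag convention are illustrative assumptions.

```python
import numpy as np

def lagged_crosscorr(yields, quakes, max_lag):
    """Correlation between standardized daily milk yields and a binary
    earthquake-occurrence series at each lag. A negative lag means the
    yields are taken |lag| days BEFORE the earthquake day."""
    y = (yields - yields.mean()) / yields.std()
    q = (quakes - quakes.mean()) / quakes.std()
    cc = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag < 0:
            cc[lag] = np.mean(y[:lag] * q[-lag:])   # yields lead the quakes
        elif lag > 0:
            cc[lag] = np.mean(y[lag:] * q[:-lag])
        else:
            cc[lag] = np.mean(y * q)
    return cc

# Synthetic year of data: three "earthquake" days, with a milk-yield dip
# planted 21 days before each one.
rng = np.random.default_rng(0)
n = 365
quakes = np.zeros(n)
quakes[[100, 200, 300]] = 1.0
yields = 30.0 + rng.normal(0.0, 0.3, n)
for day in (100, 200, 300):
    yields[day - 21] -= 3.0
cc = lagged_crosscorr(yields, quakes, max_lag=30)
```

On this synthetic series the most negative correlation appears at a lag of -21 days, mirroring the roughly three-week lead reported in the abstract.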

  2. Listening to the 2011 magnitude 9.0 Tohoku-Oki, Japan, earthquake

    Science.gov (United States)

    Peng, Zhigang; Aiken, Chastity; Kilb, Debi; Shelly, David R.; Enescu, Bogdan

    2012-01-01

    The magnitude 9.0 Tohoku-Oki, Japan, earthquake on 11 March 2011 is the largest earthquake to date in Japan’s modern history and is ranked as the fourth largest earthquake in the world since 1900. This earthquake occurred within the northeast Japan subduction zone (Figure 1), where the Pacific plate is subducting beneath the Okhotsk plate at a rate of ∼8–9 cm/yr (DeMets et al. 2010). This type of extremely large earthquake within a subduction zone is generally termed a “megathrust” earthquake. Strong shaking from this magnitude 9 earthquake engulfed the entire Japanese Islands, reaching a maximum acceleration ∼3 times that of gravity (3 g). Two days prior to the main event, a foreshock sequence occurred, including one earthquake of magnitude 7.2. Following the main event, numerous aftershocks occurred around the main slip region; the largest of these was magnitude 7.9. The entire foreshock-mainshock-aftershock sequence was well recorded by thousands of sensitive seismometers and geodetic instruments across Japan, resulting in the best-recorded megathrust earthquake in history. This devastating earthquake resulted in significant damage and high death tolls caused primarily by the associated large tsunami. This tsunami reached heights of more than 30 m, and inundation propagated inland more than 5 km from the Pacific coast, which also caused a nuclear crisis that is still affecting people’s lives in certain regions of Japan.

  3. Earthquakes and Tectonics Expert Judgment Elicitation Project

    International Nuclear Information System (INIS)

    Coppersmith, K.J.; Perman, R.C.; Youngs, R.R.

    1993-02-01

    This report summarizes the results of the Earthquakes and Tectonics Expert Judgement Elicitation Project sponsored by the Electric Power Research Institute (EPRI). The objectives of this study were two-fold: (1) to demonstrate methods for the elicitation of expert judgement, and (2) to quantify the uncertainties associated with earthquake and tectonics issues for use in the EPRI-HLW performance assessment. Specifically, the technical issue considered is the probability of differential fault displacement through the proposed repository at Yucca Mountain, Nevada. For this study, a strategy for quantifying uncertainties was developed that relies on the judgements of multiple experts. A panel of seven geologists and seismologists was assembled to quantify the uncertainties associated with earthquake and tectonics issues for the performance assessment model. A series of technical workshops focusing on these issues was conducted. Finally, each expert was individually interviewed in order to elicit his judgement regarding the technical issues and to provide the technical basis for his assessment. This report summarizes the methodologies used to elicit the judgements of the earthquakes and tectonics experts (termed ''specialists''), and summarizes the technical assessments made by the expert panel.

  4. Prediction of short-term and long-term VOC emissions from SBR bitumen-backed carpet under different temperatures

    NARCIS (Netherlands)

    Yang, X.; Chen, Q.; Bluyssen, P.M.

    1998-01-01

    This paper presents two models for volatile organic compound (VOC) emissions from carpet. One is a numerical model using the computational fluid dynamics (CFD) technique for short-term predictions, the other an analytical model for long-term predictions. The numerical model can (1) deal with

  5. Proposal on data collection for an international earthquake experience data

    International Nuclear Information System (INIS)

    Masopust, R.

    2001-01-01

    Earthquake experience data has been recognized as an efficient basis for verification of the seismic adequacy of equipment installed in NPPs. This paper is meant to initiate the setup of a database that uses seismic experience to establish the generic seismic resistance of NPP equipment, applicable in particular to the Central and East European countries. Such an earthquake experience database should then be compared to the already existing and well-known SQUG-GIP database. Setting up such an operational earthquake database will require a significant amount of effort. It must be understood that this goal can be achieved only through long-term, sustained activities and the coordinated cooperation of various institutions. (author)

  6. Real-Time Detection of Rupture Development: Earthquake Early Warning Using P Waves From Growing Ruptures

    Science.gov (United States)

    Kodera, Yuki

    2018-01-01

    Large earthquakes with long rupture durations emit P wave energy throughout the rupture period. Incorporating late-onset P waves into earthquake early warning (EEW) algorithms could contribute to robust predictions of strong ground motion. Here I describe a technique to detect in real time P waves from growing ruptures to improve the timeliness of an EEW algorithm based on seismic wavefield estimation. The proposed P wave detector, which employs a simple polarization analysis, successfully detected P waves from strong motion generation areas of the 2011 Mw 9.0 Tohoku-oki earthquake rupture. An analysis using 23 large (M ≥ 7) events from Japan confirmed that seismic intensity predictions based on the P wave detector significantly increased lead times without appreciably decreasing the prediction accuracy. P waves from growing ruptures, being one of the fastest carriers of information on ongoing rupture development, have the potential to improve the performance of EEW systems.
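A simple polarization analysis of the kind mentioned above can be sketched via eigen-analysis of the three-component covariance matrix: P-wave arrivals are strongly rectilinear with near-vertical particle motion. This is a generic polarization measure under my own assumptions, not the detector actually used in the paper.

```python
import numpy as np

def polarization(z, n, e):
    """Eigen-analysis of the covariance of a short 3-component window.
    Returns (rectilinearity, incidence angle in degrees from vertical).
    P-wave onsets tend to be strongly rectilinear with steep incidence."""
    cov = np.cov(np.vstack([z, n, e]))
    w, v = np.linalg.eigh(cov)                 # eigenvalues in ascending order
    rectilinearity = 1.0 - (w[0] + w[1]) / (2.0 * w[2])
    principal = v[:, 2]                        # direction of dominant motion
    incidence = np.degrees(np.arccos(abs(principal[0])))  # angle from Z axis
    return rectilinearity, incidence

# Synthetic rectilinear arrival with energy mostly on the vertical (Z) channel.
t = np.linspace(0.0, 1.0, 200)
wave = np.sin(2.0 * np.pi * 5.0 * t)
rect, inc = polarization(1.0 * wave, 0.1 * wave, 0.1 * wave)
```

An operational detector would apply such a measure in sliding windows and combine it with amplitude criteria to pick late-onset P energy.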

  7. Time history nonlinear earthquake response analysis considering materials and geometrical nonlinearity

    International Nuclear Information System (INIS)

    Kobayashi, T.; Yoshikawa, K.; Takaoka, E.; Nakazawa, M.; Shikama, Y.

    2002-01-01

    A time history nonlinear earthquake response analysis method was proposed and applied to earthquake response prediction analysis for the Large Scale Seismic Test (LSST) Program in Hualien, Taiwan, in which a 1/4 scale model of a nuclear reactor containment structure was constructed on a sandy gravel layer. The analysis considered both strain-dependent material nonlinearity and geometrical nonlinearity due to base mat uplift. The 'Lattice Model' was employed for the soil-structure interaction model. An earthquake record on the soil surface at the site was used as the control motion and deconvolved to the input motion of the analysis model at GL -52 m, with a maximum acceleration of 300 Gal. Two analyses were considered: (A) time history nonlinear and (B) equivalent linear; the advantage of the time history nonlinear earthquake response analysis method is discussed.

  8. Earthquake Early Warning: User Education and Designing Effective Messages

    Science.gov (United States)

    Burkett, E. R.; Sellnow, D. D.; Jones, L.; Sellnow, T. L.

    2014-12-01

    The U.S. Geological Survey (USGS) and partners are transitioning from test-user trials of a demonstration earthquake early warning system (ShakeAlert) to deciding and preparing how to release earthquake early warning information, alert messages, and products to the public and other stakeholders. An earthquake early warning system uses seismic station networks to rapidly gather information about an occurring earthquake and send notifications to user devices ahead of the arrival of potentially damaging ground shaking at their locations. Earthquake early warning alerts can thereby allow time for actions to protect lives and property before the arrival of damaging shaking, if users are properly educated on how to use and react to such notifications. A collaboration team of risk communications researchers and earth scientists is researching the effectiveness of a chosen subset of potential earthquake early warning interface designs and messages, which could be displayed on a device such as a smartphone. Preliminary results indicate, for instance, that users prefer alerts that include 1) a map to relate their location to the earthquake and 2) instructions for what to do in response to the expected level of shaking. A number of important factors must be considered to design a message that will promote appropriate self-protective behavior. While users prefer to see a map, how much information can be processed in limited time? Are graphical representations of wavefronts helpful or confusing? The most important factor in promoting a helpful response is the predicted earthquake intensity, or how strong the expected shaking will be at the user's location. 
Unlike Japanese users of early warning, few Californians are familiar with the earthquake intensity scale, so we are exploring how differentiating instructions between intensity levels (e.g., "Be aware" for lower shaking levels and "Drop, cover, hold on" at high levels) can be paired with self-directed supplemental

  9. Correlations between solid tides and worldwide earthquakes MS ≥ 7.0 since 1900

    Directory of Open Access Journals (Sweden)

    Q. H. Xu

    2012-03-01

    Full Text Available Most studies on the correlations between earthquakes and solid tides have mainly concluded that the syzygies (i.e. new or full moons) of each lunar cycle have more earthquakes than other days in the month. We show a correlation between the aftershock sequence of the ML = 6.3 Christchurch, New Zealand, earthquake and the diurnal solid tide. Ms ≥ 7 earthquakes worldwide since 1900 are more likely to occur during the 0°, 90°, 180° or 270° phases (i.e. earthquake-prone phases) of the semidiurnal solid earth tidal curve (M2). Thus, the semidiurnal solid tide triggers earthquakes. However, the long-term triggering effect of the lunar periodicity is uncertain. This proposal is helpful in defining possible origin times of aftershocks several days after a mainshock and can be used for warning of subsequent larger shocks.
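As a sketch of how proximity to the "earthquake-prone phases" could be checked, the snippet below converts elapsed time into an M2 phase and tests whether it falls near 0°, 90°, 180° or 270°. The reference epoch and the 15° tolerance are my own illustrative assumptions; a real analysis would anchor the phase to a computed solid-tide time series.

```python
M2_PERIOD_H = 12.4206012  # period of the principal lunar semidiurnal (M2) tide, hours

def m2_phase_deg(hours_since_peak):
    """Phase of the M2 cycle in degrees, with 0 at a (hypothetical)
    reference tidal peak."""
    return ((hours_since_peak / M2_PERIOD_H) * 360.0) % 360.0

def near_prone_phase(phase_deg, tol_deg=15.0):
    """True if the phase lies within tol_deg of 0, 90, 180 or 270 degrees
    (the earthquake-prone phases named in the abstract); the tolerance
    is an illustrative choice."""
    d = phase_deg % 90.0
    return min(d, 90.0 - d) <= tol_deg
```

For example, an event a quarter of an M2 period after the reference peak falls at phase 90°, one of the prone phases.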

  10. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    Science.gov (United States)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of
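The Short-Term-Average / Long-Term-Average detector described above can be sketched on a tweet-count time series as follows. The window lengths, trigger threshold, and synthetic data are illustrative choices, not the tuning of the USGS system.

```python
import numpy as np

def sta_lta(counts, sta_len, lta_len):
    """Short-Term-Average / Long-Term-Average ratio over a per-second
    tweet-count series; values well above 1 flag a sudden burst."""
    counts = np.asarray(counts, dtype=float)
    ratio = np.zeros(len(counts))
    for i in range(lta_len, len(counts)):
        sta = counts[i - sta_len:i].mean()   # short trailing window
        lta = counts[i - lta_len:i].mean()   # long trailing window
        ratio[i] = sta / lta if lta > 0.0 else 0.0
    return ratio

# Background chatter of ~2 "earthquake" tweets per second, with a burst
# starting at t = 120 s mimicking a widely felt event.
rng = np.random.default_rng(1)
counts = rng.poisson(2, 200).astype(float)
counts[120:130] += 40.0
ratio = sta_lta(counts, sta_len=5, lta_len=60)
triggers = np.where(ratio > 5.0)[0]
```

Raising the threshold (here 5.0) or lengthening the short window trades missed events against false triggers, exactly the tuning trade-off the abstract describes.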

  11. Prediction and prevention of rockburst in metal mines – A case study of Sanshandao gold mine

    Directory of Open Access Journals (Sweden)

    Meifeng Cai

    2016-04-01

    Full Text Available Rockburst is a kind of artificial earthquake induced by human activities, such as mining excavations. The mechanism of rockburst induced by mining disturbance is revealed here in terms of energy. For understanding the rockburst mechanism, two necessary conditions for the occurrence of rockburst are presented: (1) the rock mass has the capability to store a huge amount of energy and possesses a strong bumping-prone characteristic when damaged; and (2) the geological conditions in the mining area have favorable geo-stress environments that can form high-stress concentration areas and accumulate huge energy. These two conditions are also the basic criteria for prediction of rockburst. In view of energy analysis, it is observed that artificial and natural earthquakes have similar regularities in many aspects, such as the relationship between the energy value and burst magnitude. By using the relationship between energy and magnitude of natural earthquakes, rockburst is predicted by disturbance energy analysis. A practical example is illustrated using the above-mentioned theory and technique to predict rockburst in a gold mine in China. Finally, prevention and control techniques for rockburst are also provided based on knowledge of the rockburst mechanism.
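The energy-magnitude link invoked above is commonly expressed through the classic Gutenberg-Richter energy relation log10 E [J] = 1.5 M + 4.8. The abstract does not give the exact relation used, so the sketch below applies that classic form purely for illustration.

```python
import math

def magnitude_from_energy(energy_joules):
    """Equivalent event magnitude for a given released energy, via the
    Gutenberg-Richter relation log10 E[J] = 1.5*M + 4.8."""
    return (math.log10(energy_joules) - 4.8) / 1.5

def energy_from_magnitude(magnitude):
    """Energy in joules released by an event of the given magnitude."""
    return 10.0 ** (1.5 * magnitude + 4.8)
```

A useful consequence of the relation: each unit increase in magnitude corresponds to roughly a 32-fold (10^1.5) increase in released energy, which is why a modest stress-concentration zone can still store rockburst-scale energy.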

  12. Human cervicovaginal fluid biomarkers to predict term and preterm labor

    Science.gov (United States)

    Heng, Yujing J.; Liong, Stella; Permezel, Michael; Rice, Gregory E.; Di Quinzio, Megan K. W.; Georgiou, Harry M.

    2015-01-01

    Preterm birth (PTB; birth before 37 completed weeks of gestation) remains the major cause of neonatal morbidity and mortality. The current generation of biomarkers predictive of PTB has limited utility. In pregnancy, the human cervicovaginal fluid (CVF) proteome is a reflection of the local biochemical milieu and is influenced by the physical changes occurring in the vagina, cervix and adjacent overlying fetal membranes. Term and preterm labor (PTL) share common pathways of cervical ripening, myometrial activation and fetal membrane rupture leading to birth. We therefore hypothesize that CVF biomarkers predictive of labor may be similar in both the term and preterm labor setting. In this review, we summarize some of the existing published literature as well as our team's breadth of work utilizing the CVF for the discovery and validation of putative CVF biomarkers predictive of human labor. Our team established an efficient method for collecting serial CVF samples for optimal 2-dimensional gel electrophoresis resolution and analysis. We first embarked on CVF biomarker discovery for the prediction of spontaneous onset of term labor using 2D-electrophoresis and solution array multiple analyte profiling. 2D-electrophoretic analyses were subsequently performed on CVF samples associated with PTB. Several proteins have been successfully validated, demonstrating that these biomarkers are associated with, and may be predictive of, both term labor and PTL. In addition, the measurement of these putative biomarkers was found to be robust to the influences of vaginal microflora and/or semen. The future development of a multiple-biomarker bedside test would help improve the prediction of PTB and the clinical management of patients. PMID:26029118

  13. Long-Term Effects of the 2011 Japan Earthquake and Tsunami on Incidence of Fatal and Nonfatal Myocardial Infarction.

    Science.gov (United States)

    Nakamura, Motoyuki; Tanaka, Kentarou; Tanaka, Fumitaka; Matsuura, Yuuki; Komi, Ryousuke; Niiyama, Masanobu; Kawakami, Mikio; Koeda, Yorihiko; Sakai, Toshiaki; Onoda, Toshiyuki; Itoh, Tomonori

    2017-08-01

    This study aimed to examine the long-term effects of the 2011 Japan earthquake and tsunami on the incidence of fatal and nonfatal myocardial infarction (MI). In the present study, the incidence of 2 types of cardiac events was comprehensively recorded. The study area was divided into 2 zones based on the severity of tsunami damage, which was determined by the percentage of the inundated area within the residential area (tsunami (r = 0.77; p tsunami was associated with a continual increase in the incidence of fatal MI among disaster survivors. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  14. Rapid earthquake hazard and loss assessment for Euro-Mediterranean region

    Science.gov (United States)

    Erdik, Mustafa; Sesetyan, Karin; Demircioglu, Mine; Hancilar, Ufuk; Zulfikar, Can; Cakti, Eser; Kamer, Yaver; Yenidogan, Cem; Tuzun, Cuneyt; Cagnan, Zehra; Harmandar, Ebru

    2010-10-01

    The near-real-time estimation of ground shaking and losses after a major earthquake in the Euro-Mediterranean region was performed in the framework of the Joint Research Activity 3 (JRA-3) component of the EU FP6 project entitled "Network of Research Infrastructures for European Seismology, NERIES". This work consists of finding the most likely location of the earthquake source by estimating the fault rupture parameters on the basis of rapid inversion of data from on-line regional broadband stations. It also includes an estimation of the spatial distribution of selected site-specific ground motion parameters at engineering bedrock through region-specific ground motion prediction equations (GMPEs) or physical simulation of ground motion. By using the Earthquake Loss Estimation Routine (ELER) software, the multi-level methodology developed for real-time estimation of losses is capable of incorporating regional variability and sources of uncertainty stemming from GMPEs, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships.
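A ground motion prediction equation of the generic functional form used in such shaking-estimation pipelines can be sketched as follows. The functional form (log-linear in magnitude, log-distance attenuation with a depth term) is typical, but the coefficients are invented for this sketch and are not those of any GMPE adopted in NERIES/ELER.

```python
import math

def gmpe_pga_g(magnitude, r_km, a=-2.0, b=0.5, c=1.3, h=6.0):
    """Illustrative GMPE of the common form
    log10(PGA[g]) = a + b*M - c*log10(sqrt(R^2 + h^2)),
    where R is epicentral distance in km and h a pseudo-depth term.
    Coefficients are placeholders, not published values."""
    r_eff = math.hypot(r_km, h)   # effective distance including depth term
    return 10.0 ** (a + b * magnitude - c * math.log10(r_eff))
```

Evaluated over a station grid, such a relation (plus site amplification factors) gives the spatial shaking field that vulnerability relationships then convert into losses.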

  15. Calibration and validation of earthquake catastrophe models. Case study: Impact Forecasting Earthquake Model for Algeria

    Science.gov (United States)

    Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.

    2012-04-01

    Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry in order to assure a model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those which could happen in the future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as a part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) the seismic intensity attenuation model, by use of macroseismic observations and maps from past earthquakes in Algeria; (2) calculation of country-specific vulnerability modifiers, by use of past damage observations in the country. The ground motion prediction relationship of Benouar (1994) proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to 10% to 40% larger vulnerability indexes for different building types compared to average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, considering the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by use of "as-if" historical scenario simulations of three past earthquakes in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli earthquakes. The calculated return periods of the losses for client market portfolio align with the

  16. Ionospheric earthquake precursors

    International Nuclear Information System (INIS)

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.

    1996-01-01

    Results of experimental studies of ionospheric earthquake precursors, the development of models of processes in the earthquake focus, and the physical mechanisms of formation of various types of precursors are considered. The composition of an experimental space-based system for monitoring earthquake precursors is determined. 36 refs., 5 figs

  17. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    International Nuclear Information System (INIS)

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19 September 2006. Six factors for the evaluation of earthquake vibration are considered on the basis of recent earthquakes: 1) evaluation of earthquake vibration by a method using a fault model, 2) investigation and approval of active faults, 3) direct-hit earthquakes, 4) assumption of a short active fault as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) remaining risk. A guiding principle of the revision required a new evaluation method of earthquake vibration using fault models, and evaluation of the probability of earthquake vibration. The remaining risk means that facilities and people are endangered when an earthquake stronger than the design basis occurs; accordingly, the scatter has to be considered in the evaluation of earthquake vibration. The earthquake belt and strong vibration pulse of the 1995 Hyogo-Nanbu earthquake, the relation between the lengths of surface earthquake faults and hypocentral faults, and the distribution of seismic intensity off Kushiro in 1993 are shown. (S.Y.)

  18. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  19. Investigation of the ULF electromagnetic phenomena related to earthquakes. Contemporary achievements and the perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Kopytenko, Y.; Ismagilov, V. [SPbF IZMIRAN, St. Petersburg (Russian Federation); Hayakawa, M. [The University of Electro-Communications, Chofu, Tokyo (Japan); Smirnova, N.; Troyan, V. [St. Petersburg Univ., St. Petersburg (Russian Federation). Inst. of Physics; Peterson, T. [TFPLAB, Cleveland, OH (United States)

    2001-04-01

    The results of ULF electromagnetic signal observations in seismoactive regions prior to earthquakes are presented and discussed. The new differential measurement technique developed in SPbF IZMIRAN for locating ULF emission sources of space and lithospheric origin is described. The MVC-2DS geophysical instrumentation is introduced as a promising tool for the registration of ULF signals related to earthquakes (both seismic and electromagnetic ones). Methods are proposed for ULF data processing to investigate the preparation processes in earthquake source regions and to distinguish seismogenic signals against the background of space pulsations. Some examples of the application of those methods to the study of earthquake precursory signatures are presented. Perspectives for seismo-electromagnetic tomography experiments in seismoactive regions, using the MVC-2DS technique, are discussed in relation to the development of earthquake prediction methods.

  20. Changes in groundwater chemistry before two consecutive earthquakes in Iceland

    KAUST Repository

    Skelton, Alasdair

    2014-09-21

    Groundwater chemistry has been observed to change before earthquakes and is proposed as a precursor signal. Such changes include variations in radon count rates [1, 2], concentrations of dissolved elements [3-5] and stable isotope ratios [4, 5]. Changes in seismic wave velocities [6], water levels in boreholes [7], micro-seismicity [8] and shear wave splitting [9] are also thought to precede earthquakes. Precursor activity has been attributed to expansion of rock volume [7, 10, 11]. However, most studies of precursory phenomena lack sufficient data to rule out other explanations unrelated to earthquakes [12]. For example, reproducibility of a precursor signal has seldom been shown and few precursors have been evaluated statistically. Here we analyse the stable isotope ratios and dissolved element concentrations of groundwater taken from a borehole in northern Iceland between 2008 and 2013. We find that the chemistry of the groundwater changed four to six months before two greater than magnitude 5 earthquakes that occurred in October 2012 and April 2013. Statistical analyses indicate that the changes in groundwater chemistry were associated with the earthquakes. We suggest that the changes were caused by crustal dilation associated with stress build-up before each earthquake, which caused different groundwater components to mix. Although the changes we detect are specific for the site in Iceland, we infer that similar processes may be active elsewhere, and that groundwater chemistry is a promising target for future studies on the predictability of earthquakes.

  1. Changes in groundwater chemistry before two consecutive earthquakes in Iceland

    KAUST Repository

    Skelton, Alasdair; Andrén, Margareta; Kristmannsdóttir, Hrefna; Stockmann, Gabrielle; Mörth, Carl-Magnus; Sveinbjörnsdóttir, Árny; Jonsson, Sigurjon; Sturkell, Erik; Guðrúnardóttir, Helga Rakel; Hjartarson, Hreinn; Siegmund, Heike; Kockum, Ingrid

    2014-01-01

    Groundwater chemistry has been observed to change before earthquakes and is proposed as a precursor signal. Such changes include variations in radon count rates [1, 2], concentrations of dissolved elements [3-5] and stable isotope ratios [4, 5]. Changes in seismic wave velocities [6], water levels in boreholes [7], micro-seismicity [8] and shear wave splitting [9] are also thought to precede earthquakes. Precursor activity has been attributed to expansion of rock volume [7, 10, 11]. However, most studies of precursory phenomena lack sufficient data to rule out other explanations unrelated to earthquakes [12]. For example, reproducibility of a precursor signal has seldom been shown and few precursors have been evaluated statistically. Here we analyse the stable isotope ratios and dissolved element concentrations of groundwater taken from a borehole in northern Iceland between 2008 and 2013. We find that the chemistry of the groundwater changed four to six months before two greater than magnitude 5 earthquakes that occurred in October 2012 and April 2013. Statistical analyses indicate that the changes in groundwater chemistry were associated with the earthquakes. We suggest that the changes were caused by crustal dilation associated with stress build-up before each earthquake, which caused different groundwater components to mix. Although the changes we detect are specific for the site in Iceland, we infer that similar processes may be active elsewhere, and that groundwater chemistry is a promising target for future studies on the predictability of earthquakes.

  2. Seismogeodesy for rapid earthquake and tsunami characterization

    Science.gov (United States)

    Bock, Y.

    2016-12-01

    Rapid estimation of earthquake magnitude and fault mechanism is critical for earthquake and tsunami warning systems. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. These methods are well developed for ocean basin-wide warnings but are not timely enough to protect vulnerable populations and infrastructure from the effects of local tsunamis, where waves may arrive within 15-30 minutes of earthquake onset time. Direct measurements of displacements by GPS networks at subduction zones allow for rapid magnitude and slip estimation in the near-source region that is not affected by the instrumental limitations and magnitude saturation experienced by local seismic networks. However, GPS displacements by themselves are too noisy for strict earthquake early warning (P-wave detection). Optimally combining high-rate GPS and seismic data (in particular, accelerometers that do not clip), referred to as seismogeodesy, provides a broadband instrument that does not clip in the near field, is impervious to magnitude saturation, and provides accurate static and dynamic displacements and velocities in real time. Here we describe a NASA-funded effort to integrate GPS and seismogeodetic observations as part of NOAA's Tsunami Warning Centers in Alaska and Hawaii. It consists of a series of plug-in modules that allow for a hierarchy of rapid seismogeodetic products, including automatic P-wave picking, hypocenter estimation, S-wave prediction, magnitude scaling relationships based on P-wave amplitude (Pd) and peak ground displacement (PGD), and finite-source CMT solutions and fault slip models as input for tsunami warnings and models. For the NOAA/NASA project, the modules are being integrated into an existing USGS Earthworm environment, currently limited to traditional seismic data. We are focused on a network of
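PGD magnitude scaling relationships of the kind mentioned above typically take the form log10(PGD) = A + B·Mw + C·Mw·log10(R) and are inverted for Mw in real time as displacement data stream in. The coefficients below are placeholders for illustration, not the operational values of any warning center.

```python
import math

# Placeholder coefficients for a PGD scaling law of the form
# log10(PGD[cm]) = A + B*Mw + C*Mw*log10(R[km]); illustrative only.
A, B, C = -4.434, 1.047, -0.138

def mw_from_pgd(pgd_cm, r_km):
    """Invert the PGD scaling law for moment magnitude, given peak ground
    displacement (cm) and hypocentral distance (km)."""
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(r_km))
```

Because PGD grows with the final earthquake size rather than saturating like short-period seismic amplitudes, such an inversion can keep tracking magnitude as a great rupture continues to grow.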

  3. Temporal Prediction Errors Affect Short-Term Memory Scanning Response Time.

    Science.gov (United States)

    Limongi, Roberto; Silva, Angélica M

    2016-11-01

    The Sternberg short-term memory scanning task has been used to unveil cognitive operations involved in time perception. Participants produce time intervals during the task, and the researcher explores how task performance affects interval production - where time estimation error is the dependent variable of interest. The perspective of predictive behavior regards time estimation error as a temporal prediction error (PE), an independent variable that controls cognition, behavior, and learning. Based on this perspective, we investigated whether temporal PEs affect short-term memory scanning. Participants performed temporal predictions while they maintained information in memory. Model inference revealed that PEs affected memory scanning response time independently of the memory-set size effect. We discuss the results within the context of formal and mechanistic models of short-term memory scanning and predictive coding, a Bayes-based theory of brain function. We state the hypothesis that our finding could be associated with weak frontostriatal connections and weak striatal activity.

  4. Long-Term Prediction of Severe Hypoglycemia in Type 1 Diabetes

    DEFF Research Database (Denmark)

    Henriksen, Marie Moth; Færch, Louise; Thorsteinsson, Birger

    2016-01-01

    BACKGROUND: Prediction of the risk of severe hypoglycemia (SH) in patients with type 1 diabetes is important to prevent future episodes, but it is unknown whether the long-term risk of SH can be predicted. The aim of the study is to assess whether long-term prediction of SH is possible in type 1 diabetes. METHODS: A follow-up study was performed with 98 patients with type 1 diabetes. At baseline and at follow-up, the patients filled in a questionnaire about diabetes history and complications, the number of SH episodes in the preceding year, and state of awareness, and HbA1c and C-peptide levels were measured [...]. CONCLUSIONS: Long-term prediction of severe hypoglycemia in type 1 diabetes was not possible, although baseline hypoglycemia unawareness tended to remain a predictor of the risk of SH at follow-up. Therefore, it is important to repeatedly assess the different risk factors of SH to determine the actual risk.

  5. Predicting long-term graft survival in adult kidney transplant recipients

    Directory of Open Access Journals (Sweden)

    Brett W Pinsky

    2012-01-01

    Full Text Available The ability to accurately predict a population's long-term survival has important implications for quantifying the benefits of transplantation. To identify a model that can accurately predict a kidney transplant population's long-term graft survival, we retrospectively studied United Network for Organ Sharing data from 13,111 kidney-only transplants completed in 1988-1989. Nineteen-year death-censored graft survival (DCGS) projections were calculated and compared with the population's actual graft survival. The projection curves were created using a two-part estimation model that (1) fits a Kaplan-Meier survival curve immediately after transplant (Part A) and (2) uses truncated observational data to model a survival function for long-term projection (Part B). Projection curves were examined using varying amounts of time to fit both parts of the model. The accuracy of the projection curve was determined by examining whether predicted survival fell within the 95% confidence interval for the 19-year Kaplan-Meier survival, and by the sample size needed to detect the difference between projected and observed survival in a clinical trial. The 19-year DCGS was 40.7% (39.8-41.6%). Excellent predictability (41.3%) can be achieved when Part A is fit for three years and Part B is projected using two additional years of data. Using less than five total years of data tended to overestimate the population's long-term survival. Accurate prediction of long-term DCGS is possible, but requires attention to the quantity of data used in the projection method.
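Part A of the two-part model is an ordinary Kaplan-Meier fit. As a sketch of the product-limit estimator underlying it (a generic implementation, not the authors' code; the toy times and censoring indicators are hypothetical):

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times  : follow-up time for each graft
    events : 1 = graft failure observed, 0 = censored
    Returns a list of (time, survival) steps."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    curve = []
    surv = 1.0
    for t in np.unique(times[events == 1]):      # distinct failure times
        at_risk = np.sum(times >= t)             # grafts still under observation
        failed = np.sum((times == t) & (events == 1))
        surv *= 1.0 - failed / at_risk
        curve.append((float(t), float(surv)))
    return curve

# Toy data: 6 grafts, failures at years 1, 2 and 3, the rest censored
curve = kaplan_meier([1, 2, 2, 3, 4, 5], [1, 1, 0, 1, 0, 0])
# The curve steps down at each failure time: S(1)=5/6, S(2)=2/3, S(3)=4/9
```

Part B of the authors' model then extrapolates beyond the fitted window; that long-term projection step is specific to their method and is not sketched here.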

  6. A 30-year history of earthquake crisis communication in California and lessons for the future

    Science.gov (United States)

    Jones, L.

    2015-12-01

    The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985, about the probability (approximately 5%) that a M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the 30 years since, publication of aftershock advisories has become routine, and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting, with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. The time to create and release a statement became shorter with experience after the first public advisory (for the 1988 Lake Elsman earthquake), which was released 18 hours after the triggering event, but the process was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (the National Earthquake Prediction Evaluation Council) so that statements can be sent to the public automatically. This talk will review the advisories, the variations in wording, and the public response, and compare these with social science research on successful crisis communication to create recommendations for future advisories.

  7. Risk assessment study of fire following earthquake: a case study of petrochemical enterprises in China

    Science.gov (United States)

    Li, J.; Wang, Y.; Chen, H.; Lin, L.

    2013-04-01

    After an earthquake, the fire risk of petrochemical enterprises is higher than that of other enterprises because their production processes involve flammable and explosive materials. Using Chinese petrochemical enterprises as the research object, this paper draws on a literature review and case summaries to study, among other factors, the classification of petrochemical enterprises, the proportion of daily fires, and the fire loss ratio. The paper builds a risk assessment model for fire following earthquake in petrochemical enterprises, based on a previous earthquake fire hazard model and the earthquake loss prediction assessment method; it calculates the expected loss from fire following earthquake in various counties and draws a risk map. Moreover, this research identifies high-risk areas, concentrated in the Beijing-Tianjin-Tangshan region and Shandong, Jiangsu, and Zhejiang provinces. Differences in enterprise type produce different levels and distributions of earthquake fire risk among petrochemical enterprises. Furthermore, areas at high risk of post-earthquake fires but with low levels of seismic fortification require extra attention to ensure appropriate mechanisms are in place.

  8. Impact of the Northridge earthquake on the mental health of veterans: results from a panel study.

    Science.gov (United States)

    Dobalian, Aram; Stein, Judith A; Heslin, Kevin C; Riopelle, Deborah; Venkatesh, Brinda; Lanto, Andrew B; Simon, Barbara; Yano, Elizabeth M; Rubenstein, Lisa V

    2011-09-01

    The 1994 earthquake that struck Northridge, California, led to the closure of the Veterans Health Administration Medical Center at Sepulveda. This article examines the earthquake's impact on the mental health of an existing cohort of veterans who had previously used the Sepulveda Veterans Health Administration Medical Center. From 1 to 3 months after the disaster, trained interviewers made repeated attempts to contact participants by telephone to administer a follow-up survey in a repeated-measures design, based on a survey that had been administered before the earthquake. Postearthquake data were obtained on 1144 of 1800 (64%) male veterans for whom there were previous data. We tested a predictive latent variable path model of the relations between sociodemographic characteristics, predisaster physical and emotional health measures, and postdisaster emotional health and perceived earthquake impact. Perceived earthquake impact was predicted by predisaster emotional distress, functional limitations, and number of health conditions. Postdisaster emotional distress was predicted by preexisting emotional distress and earthquake impact. The regression coefficient from earthquake impact to postearthquake emotional distress was larger than the stability coefficient from preearthquake emotional distress. Postearthquake emotional distress was also affected indirectly by preearthquake emotional distress, health conditions, younger age, and lower socioeconomic status. The postdisaster emotional health of veterans who experienced greater earthquake impact would likely have benefited from postdisaster intervention, regardless of their predisaster emotional health. Younger veterans and veterans with generally poor physical and emotional health were more vulnerable to greater postearthquake emotional distress. Veterans of lower socioeconomic status were disproportionately likely to experience more effects of the disaster because they had more predisaster emotional distress, more functional

  9. Discussion of the design of satellite-laser measurement stations in the eastern Mediterranean under the geological aspect. Contribution to the earthquake prediction research by the Wegener Group and to NASA's Crustal Dynamics Project

    Science.gov (United States)

    Paluska, A.; Pavoni, N.

    1983-01-01

    Research conducted to determine the locations of stations for measuring crustal dynamics and predicting earthquakes is discussed. Procedural aspects, extraregional kinematic tendencies, and regional tectonic deformation mechanisms are described.

  10. Automatic Event Detection and Picking of P, S Seismic Phases for Earthquake Early Warning: A Case Study of the 2008 Wenchuan Earthquake

    Science.gov (United States)

    WANG, Z.; Zhao, B.

    2015-12-01

    We develop an automatic seismic phase arrival detection and picking algorithm for impending earthquakes with diverse focal mechanisms and depths. Polarization analysis of the three-component seismograms is used to distinguish between P and S waves through a sliding time window. When applying the short-term average/long-term average (STA/LTA) method to the polarized data, we also construct a new characteristic function that sensitively reflects changes in signal amplitude and frequency, providing better detection of the phase arrival. An improved method combining higher-order statistics with the Akaike information criterion (AIC) picker is then applied to the refined signal to lock onto the arrival time with a higher degree of accuracy. We test our techniques on the aftershocks of the Ms8.0 Wenchuan earthquake, processing hundreds of three-component acceleration records with magnitudes of 4.0 to 6.4. In comparison to analyst picks, the proposed detection algorithms are shown to perform well and can be applied to a single instrument within a network of stations for large seismic events in an Earthquake Early Warning System (EEWS).
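The STA/LTA stage described above can be sketched with a plain energy-based characteristic function. This is the classic trigger, not the paper's enhanced characteristic function, and the window lengths and threshold are illustrative:

```python
import numpy as np

def sta_lta(signal, fs, sta_win=0.5, lta_win=10.0):
    """Energy-based STA/LTA characteristic function.

    sta_win / lta_win are window lengths in seconds; the defaults
    here are illustrative, not the paper's tuned values."""
    sta_n = int(sta_win * fs)
    lta_n = int(lta_win * fs)
    energy = np.asarray(signal, dtype=float) ** 2
    csum = np.cumsum(energy)
    # sta[i] averages the last sta_n samples, lta[i] the last lta_n samples
    sta = (csum[lta_n:] - csum[lta_n - sta_n:-sta_n]) / sta_n
    lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n
    ratio = np.zeros(len(energy))
    ratio[lta_n:] = sta / np.maximum(lta, 1e-12)
    return ratio

def pick_onset(ratio, threshold=4.0):
    """Index of the first sample where the STA/LTA ratio crosses the trigger."""
    above = np.flatnonzero(ratio >= threshold)
    return int(above[0]) if above.size else None

# Synthetic demo: 30 s of noise at 100 Hz with an arrival at t = 20 s
rng = np.random.default_rng(0)
fs = 100.0
sig = 0.1 * rng.standard_normal(3000)
t = np.arange(1000) / fs
sig[2000:] += 2.0 * np.sin(2 * np.pi * 5.0 * t)
pick = pick_onset(sta_lta(sig, fs))   # lands within a few samples of 2000
```

A short STA window makes the trigger responsive to sudden amplitude changes, while the long LTA window tracks the background noise level; the paper refines this with polarization filtering and an AIC repick around the trigger.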

  11. Fault lubrication during earthquakes.

    Science.gov (United States)

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m s(-1)) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, elucidating constraints are derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s(-1). The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  12. Comprehensive analysis of earthquake source spectra in southern California

    OpenAIRE

    Shearer, Peter M.; Prieto, Germán A.; Hauksson, Egill

    2006-01-01

    We compute and analyze P wave spectra from earthquakes in southern California between 1989 and 2001 using a method that isolates source-, receiver-, and path-dependent terms. We correct observed source spectra for attenuation using both fixed and spatially varying empirical Green's function methods. Estimated Brune-type stress drops for over 60,000 M_L = 1.5 to 3.1 earthquakes range from 0.2 to 20 MPa with no dependence on moment or local b value. Median computed stress drop increases with de...

  13. Earthquakes, September-October 1986

    Science.gov (United States)

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  14. EARTHQUAKE-INDUCED DEFORMATION STRUCTURES AND RELATED TO EARTHQUAKE MAGNITUDES

    Directory of Open Access Journals (Sweden)

    Savaş TOPAL

    2003-02-01

    Full Text Available Earthquake-induced deformation structures, called seismites, may be helpful in classifying the paleoseismic history of a location and in estimating the magnitudes of potential future earthquakes. In this paper, seismites were investigated according to the types formed in deep and shallow lake sediments. Seismites are observed in the form of sand dikes, introduced and fractured gravels, and pillow structures in shallow lakes, and of pseudonodules, mushroom-like silts protruding into laminites, mixed layers, disturbed varved lamination, and loop bedding in deep lake sediments. Drawing on previous studies, the earthquake-induced deformation structures were ordered according to their formation and the corresponding earthquake magnitudes. In this ordering, the lowest earthquake record is loop bedding and the highest is introduced and fractured gravels in lacustrine deposits.

  15. Evaluation of the conservativeness of the methodology for estimating earthquake-induced movements of fractures intersecting canisters

    International Nuclear Information System (INIS)

    La Pointe, Paul R.; Cladouhos, Trenton T.; Outters, Nils; Follin, Sven

    2000-04-01

    This study evaluates the parameter sensitivity and the conservativeness of the methodology outlined in TR 99-03. Sensitivity analysis focuses on understanding how variability in input parameter values impacts the calculated fracture displacements. These studies clarify what parameters play the greatest role in fracture movements, and help define critical values of these parameters in terms of canister failures. The thresholds or intervals of values that lead to a certain level of canister failure calculated in this study could be useful for evaluating future candidate sites. Key parameters include: 1. magnitude/frequency of earthquakes; 2. the distance of the earthquake from the canisters; 3. the size and aspect ratio of fractures intersecting canisters; and 4. the orientation of the fractures. The results of this study show that distance and earthquake magnitude are the most important factors, followed by fracture size. Fracture orientation is much less important. Regression relations were developed to predict induced fracture slip as a function of distance and either earthquake magnitude or slip on the earthquake fault. These regression relations were validated by using them to estimate the number of canister failures due to single damaging earthquakes at Aberg, and comparing these estimates with those presented in TR 99-03. The methodology described in TR 99-03 employs several conservative simplifications in order to devise a numerically feasible method to estimate fracture movements due to earthquakes outside of the repository over the next 100,000 years. These simplifications include: 1. fractures are assumed to be frictionless and cohesionless; 2. all energy transmitted to the fracture by the earthquake is assumed to produce elastic deformation of the fracture; no energy is diverted into fracture propagation; and 3. shielding effects of other fractures between the earthquake and the fracture are neglected. 
The numerical modeling effectively assumes that the

  16. Evaluation of the conservativeness of the methodology for estimating earthquake-induced movements of fractures intersecting canisters

    Energy Technology Data Exchange (ETDEWEB)

    La Pointe, Paul R.; Cladouhos, Trenton T. [Golder Associates Inc., Las Vegas, NV (United States); Outters, Nils; Follin, Sven [Golder Grundteknik KB, Stockholm (Sweden)

    2000-04-01

    This study evaluates the parameter sensitivity and the conservativeness of the methodology outlined in TR 99-03. Sensitivity analysis focuses on understanding how variability in input parameter values impacts the calculated fracture displacements. These studies clarify what parameters play the greatest role in fracture movements, and help define critical values of these parameters in terms of canister failures. The thresholds or intervals of values that lead to a certain level of canister failure calculated in this study could be useful for evaluating future candidate sites. Key parameters include: 1. magnitude/frequency of earthquakes; 2. the distance of the earthquake from the canisters; 3. the size and aspect ratio of fractures intersecting canisters; and 4. the orientation of the fractures. The results of this study show that distance and earthquake magnitude are the most important factors, followed by fracture size. Fracture orientation is much less important. Regression relations were developed to predict induced fracture slip as a function of distance and either earthquake magnitude or slip on the earthquake fault. These regression relations were validated by using them to estimate the number of canister failures due to single damaging earthquakes at Aberg, and comparing these estimates with those presented in TR 99-03. The methodology described in TR 99-03 employs several conservative simplifications in order to devise a numerically feasible method to estimate fracture movements due to earthquakes outside of the repository over the next 100,000 years. These simplifications include: 1. fractures are assumed to be frictionless and cohesionless; 2. all energy transmitted to the fracture by the earthquake is assumed to produce elastic deformation of the fracture; no energy is diverted into fracture propagation; and 3. shielding effects of other fractures between the earthquake and the fracture are neglected. 
The numerical modeling effectively assumes that the
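The regression relations described in these two records (induced fracture slip as a function of earthquake magnitude and distance) are typically fit in log space. A sketch with synthetic data, assuming a hypothetical form log10(slip) = a + b·Mw + c·log10(distance); the form, coefficients, and data are illustrative, not those of TR 99-03:

```python
import numpy as np

# Hypothetical regression form (illustrative, not the TR 99-03 relation):
#   log10(slip_mm) = a + b*Mw + c*log10(dist_km)
true_a, true_b, true_c = -4.0, 1.0, -1.5

# Synthetic "observations" with scatter standing in for modeled slip data
rng = np.random.default_rng(1)
mw = rng.uniform(5.0, 8.0, 200)
dist_km = rng.uniform(1.0, 50.0, 200)
log_slip = (true_a + true_b * mw + true_c * np.log10(dist_km)
            + 0.1 * rng.standard_normal(200))

# Ordinary least squares in log space recovers the coefficients
X = np.column_stack([np.ones_like(mw), mw, np.log10(dist_km)])
(a, b, c), *_ = np.linalg.lstsq(X, log_slip, rcond=None)
```

The fitted `b` and `c` quantify exactly the sensitivities the study highlights: how strongly induced slip grows with magnitude and decays with distance from the canisters.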

  17. THE RESPONSE OF MONTEREY BAY TO THE 2010 CHILEAN EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    Laurence C. Breaker

    2011-01-01

    Full Text Available The primary frequencies contained in the arrival sequence produced in Monterey Bay by the tsunami from the Chilean earthquake of 2010 were extracted to determine the seiche modes that were produced. Singular Spectrum Analysis (SSA) and Ensemble Empirical Mode Decomposition (EEMD) were employed to extract the primary frequencies of interest. The wave train from the Chilean tsunami lasted for at least four days, due to multipath arrivals that may not have included reflections from outside the bay but most likely did include secondary undulations and energy trapping, in the form of edge waves, inside the bay. The SSA decomposition resolved oscillations with periods of 52-57, 34-35, 26-27, and 21-22 minutes, all frequencies that have been predicted and/or observed in previous studies. The EEMD decomposition detected oscillations with periods of 50-55 and 21-22 minutes. Periods in the range of 50-57 minutes varied due to measurement uncertainties but almost certainly correspond to the first longitudinal mode of oscillation of Monterey Bay; periods of 34-35 minutes correspond to the first transverse mode of oscillation, which assumes a nodal line across the entrance of the bay; a period of 26-27 minutes, although previously observed, may not represent a fundamental oscillation; and a period of 21-22 minutes has been predicted and observed previously. A period of ~37 minutes, close to the period of 34-35 minutes, was generated in Monterey Bay by the Great Alaskan Earthquake of 1964 and most likely represents the same mode of oscillation. The tsunamis associated with the Great Alaskan Earthquake and the Chilean Earthquake both entered Monterey Bay but initially arrived outside the bay from opposite directions. Unlike the Great Alaskan Earthquake, however, which excited only one resonant mode inside the bay, the Chilean Earthquake excited several modes, suggesting that the asymmetric shape of the entrance to Monterey Bay was an important factor and that the
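As a much simpler stand-in for the SSA/EEMD decompositions used in the study, a dominant seiche period can be read off a plain periodogram. A sketch with a synthetic two-mode sea-level record, with sampling interval and mode periods chosen to mimic the reported 50-57 and 21-22 minute oscillations:

```python
import numpy as np

def dominant_period_minutes(sea_level, dt_min):
    """Period (minutes) of the strongest peak in a plain periodogram.

    A much simpler stand-in for the SSA/EEMD decompositions of the study,
    which can separate overlapping and time-varying modes."""
    x = np.asarray(sea_level, dtype=float)
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt_min)   # cycles per minute
    power[0] = 0.0                              # ignore the DC term
    return 1.0 / freqs[np.argmax(power)]

# Synthetic two-mode record: 55- and 22-minute seiches sampled every minute
dt = 1.0
t = np.arange(0.0, 2 * 24 * 60, dt)             # two days of data
record = (1.0 * np.sin(2 * np.pi * t / 55.0)
          + 0.5 * np.sin(2 * np.pi * t / 22.0))
```

`dominant_period_minutes(record, dt)` returns roughly 55 minutes, quantized to the frequency grid of the two-day record; resolving the weaker 22-minute mode and any time variation is where SSA/EEMD earn their keep.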

  18. Tsunami hazard assessments with consideration of uncertain earthquake characteristics

    Science.gov (United States)

    Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.

    2017-12-01

    The uncertainty quantification of tsunami assessments under uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, an uncertainty propagation method must be adopted that determines tsunami uncertainties at a feasible computational cost. In this study we propose a new methodology, which improves on existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics: the slip distribution and the location. First, the methodology generates consistent earthquake slip samples by means of a Karhunen-Loeve (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by Le Veque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced Order Model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case: we study tsunamis generated at the site of the 2014 Chilean earthquake, generating earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates earthquake samples consistent with the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations. 
    We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for
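The K-L expansion step can be sketched in one dimension: draw correlated slip samples from the eigendecomposition of a prescribed covariance kernel. The exponential kernel, fault length, and Gaussian marginals below are illustrative assumptions; the study additionally applies a translation process to obtain non-Gaussian marginals and works on non-rectangular 2-D rupture areas:

```python
import numpy as np

# 1-D fault discretized along strike, with an assumed exponential
# correlation kernel (geometry and kernel are illustrative).
n = 50
x = np.linspace(0.0, 100.0, n)                  # km along strike
corr_len = 20.0                                 # assumed correlation length
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Karhunen-Loeve expansion: eigendecomposition of the covariance
eigval, eigvec = np.linalg.eigh(cov)
eigval = np.clip(eigval, 0.0, None)             # guard tiny negative round-off

def sample_slip(rng, mean_slip=5.0, std=1.0):
    """One correlated slip realization (m) from the K-L expansion."""
    z = rng.standard_normal(n)                  # independent standard normals
    return mean_slip + std * eigvec @ (np.sqrt(eigval) * z)

rng = np.random.default_rng(0)
slips = np.array([sample_slip(rng) for _ in range(2000)])
```

Neighboring patches within each realization are strongly correlated, while across realizations the marginal mean and variance match the prescribed values; truncating the expansion to the largest eigenvalues is what makes reduced-order propagation schemes like SROM tractable.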

  19. The debate on the prognostic value of earthquake foreshocks: a meta-analysis.

    Science.gov (United States)

    Mignan, Arnaud

    2014-02-14

    The hypothesis that earthquake foreshocks have a prognostic value is challenged by simulations of the normal behaviour of seismicity, where no distinction between foreshocks, mainshocks and aftershocks can be made. In the former view, foreshocks are passive tracers of a tectonic preparatory process that yields the mainshock (i.e., loading by aseismic slip) while in the latter, a foreshock is any earthquake that triggers a larger one. Although both processes can coexist, earthquake prediction is plausible in the first case while virtually impossible in the second. Here I present a meta-analysis of 37 foreshock studies published between 1982 and 2013 to show that the justification of one hypothesis or the other depends on the selected magnitude interval between minimum foreshock magnitude m(min) and mainshock magnitude M. From this literature survey, anomalous foreshocks are found to emerge when m(min) < M - 3.0. These results suggest that a deviation from the normal behaviour of seismicity may be observed only when microseismicity is considered. These results are to be taken with caution since the 37 studies do not all show the same level of reliability. These observations should nonetheless encourage new research in earthquake predictability with focus on the potential role of microseismicity.

  20. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    Science.gov (United States)

    Madariaga, R.

    2013-05-01

    The 27 February 2010 Maule earthquake occurred in a well-identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we will review the information gathered so far. The event broke a region that was much longer along strike than the gap left by the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when it occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously broken in a similar earthquake in 1751, but several events in the magnitude 8 range also occurred in the area, principally in 1835 as already mentioned and, more recently, on 1 December 1928 to the north and on 21 May 1960 (a day and a half before the great Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north are undergoing a very large increase in seismicity, with numerous clusters of seismicity that move along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730. This is the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions with earthquakes in the magnitude 8 range, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but the area has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough of the stresses from the 1730 rupture zone. Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11