WorldWideScience

Sample records for term earthquake prediction

  1. Statistical short-term earthquake prediction.

    Science.gov (United States)

    Kagan, Y Y; Knopoff, L

    1987-06-19

    A statistical procedure, derived from a theoretical model of fracture growth, is used to identify a foreshock sequence while it is in progress. As a predictor, the procedure reduces the average uncertainty in the rate of occurrence for a future strong earthquake by a factor of more than 1000 when compared with the Poisson rate of occurrence. About one-third of all main shocks with local magnitude greater than or equal to 4.0 in central California can be predicted in this way, starting from a 7-year database that has a lower magnitude cutoff of 1.5. The time scale of such predictions is of the order of a few hours to a few days for foreshocks in the magnitude range from 2.0 to 5.0.
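
    To put the quoted gain in concrete terms: the probability gain of an alarm-based predictor is the conditional rate of the target event while an alarm is on, divided by the unconditional Poisson rate. A minimal sketch of that ratio (the rates below are hypothetical, not taken from the paper):

    ```python
    # Illustrative sketch, not the authors' code: probability gain of an
    # alarm-based predictor over a Poisson baseline. All rates are hypothetical.

    def probability_gain(conditional_rate: float, background_rate: float) -> float:
        """Conditional rate of the target event during an alarm divided by
        its unconditional (Poisson) background rate, in the same units."""
        return conditional_rate / background_rate

    background = 1.0 / 365.0   # one M>=4.0 mainshock per year near the alarm site
    conditional = 2.7          # events/day while a foreshock alarm is active
    print(f"probability gain ~ {probability_gain(conditional, background):.0f}x")
    # -> roughly 1000x, the order of magnitude quoted in the abstract
    ```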

  2. Stabilizing intermediate-term medium-range earthquake predictions

    International Nuclear Information System (INIS)

    Kossobokov, V.G.; Romashkova, L.L.; Panza, G.F.; Peresan, A.

    2001-12-01

    A new scheme for the application of the intermediate-term medium-range earthquake prediction algorithm M8 is proposed. The scheme accounts for the natural distribution of seismic activity, eliminates the subjectivity in the positioning of the areas of investigation and provides additional stability of the predictions with respect to the original variant. According to the retroactive testing in Italy and adjacent regions, this improvement is achieved without any significant change of the alarm volume in comparison with the results published so far. (author)

  3. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  4. Feasibility study of short-term earthquake prediction using ionospheric anomalies immediately before large earthquakes

    Science.gov (United States)

    Heki, K.; He, L.

    2017-12-01

    We have shown that positive and negative electron density anomalies emerge above the fault immediately before it ruptures, about 40/20/10 minutes before Mw 9/8/7 earthquakes (Heki, 2011 GRL; Heki and Enomoto, 2013 JGR; He and Heki, 2017 JGR). These signals are stronger for earthquakes with larger Mw and under higher background vertical TEC (total electron content) (Heki and Enomoto, 2015 JGR). The epicenter and the positive and negative anomalies align along the local geomagnetic field (He and Heki, 2016 GRL), suggesting that electric fields within the ionosphere are responsible for the anomalies (Kuo et al., 2014 JGR; Kelley et al., 2017 JGR). Here we consider the next Nankai Trough earthquake, which may occur within a few tens of years in Southwest Japan, and discuss whether we could recognize its preseismic signatures in TEC by real-time observation with GNSS. During high geomagnetic activity, large-scale traveling ionospheric disturbances (LSTID) often propagate from the auroral ovals toward mid-latitude regions and leave signatures similar to preseismic anomalies. This is the main obstacle to using preseismic TEC changes for practical short-term earthquake prediction. In this presentation, we show that the same anomalies appeared 40 minutes before the mainshock above northern Australia, the geomagnetically conjugate point of the 2011 Tohoku-oki earthquake epicenter. This not only demonstrates that electric fields play a role in producing the preseismic TEC anomalies, but also offers a possibility to discriminate preseismic anomalies from those caused by LSTID. By monitoring TEC in the conjugate areas of the two hemispheres, we can recognize anomalies with simultaneous onset as those caused by within-ionosphere electric fields (e.g., preseismic anomalies, night-time MSTID) and anomalies without simultaneous onset as gravity-wave-origin disturbances (e.g., LSTID, daytime MSTID); a simple simultaneity check is sketched below.
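
    The conjugate-point discrimination rule in the closing sentence can be reduced to a simple simultaneity test: an anomaly is treated as electric-field driven only if its onset appears in the TEC series of both conjugate points at nearly the same time. A minimal sketch under that assumption (detrending window, threshold, and tolerance are all hypothetical choices, not values from the papers cited above):

    ```python
    import numpy as np

    # Sketch of a conjugate-point simultaneity check. Onset = first sample
    # where the detrended TEC residual exceeds k standard deviations.

    def onset_index(tec, k=3.0, win=30):
        trend = np.convolve(tec, np.ones(win) / win, mode="same")  # crude running mean
        resid = tec - trend
        hot = np.flatnonzero(np.abs(resid) > k * resid.std())
        return int(hot[0]) if hot.size else None

    def simultaneous_onset(tec_a, tec_b, dt_sec=30.0, tol_sec=300.0):
        """True if both conjugate-point series show an onset within tol_sec."""
        ia, ib = onset_index(tec_a), onset_index(tec_b)
        return ia is not None and ib is not None and abs(ia - ib) * dt_sec <= tol_sec

    # Synthetic demo: the same narrow anomaly buried in independent noise.
    t = np.arange(1200)
    bump = np.exp(-((t - 600) / 5.0) ** 2)
    a = 0.05 * np.random.default_rng(1).standard_normal(1200) + bump
    b = 0.05 * np.random.default_rng(2).standard_normal(1200) + bump
    print(simultaneous_onset(a, b))   # -> True for this synthetic pair
    ```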

  5. VAN method of short-term earthquake prediction shows promise

    Science.gov (United States)

    Uyeda, Seiya

    Although optimism prevailed in the 1970s, the present consensus on earthquake prediction appears to be quite pessimistic. However, short-term prediction based on geoelectric potential monitoring has stood the test of time in Greece for more than a decade [Varotsos and Kulhanek, 1993; Lighthill, 1996]. The method used is called the VAN method. The geoelectric potential changes constantly due to causes such as magnetotelluric effects, lightning, rainfall, leakage from manmade sources, and electrochemical instabilities of electrodes. All of this noise must be eliminated before preseismic signals are identified, if they exist at all. The VAN group apparently accomplished this task for the first time. They installed multiple short (100-200 m) dipoles with different lengths in both north-south and east-west directions and long (1-10 km) dipoles in appropriate orientations at their stations (one of their mega-stations, Ioannina, for example, now has 137 dipoles in operation) and found that practically all of the noise could be eliminated by applying a set of criteria to the data.
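
    One of the VAN selection criteria reported in the literature can be paraphrased compactly: for a true seismic electric signal, the apparent field ΔV/L should be essentially the same on parallel dipoles of different lengths L, whereas electrode noise and local leakage fail this consistency test. A toy version of that check, under that paraphrase (readings and tolerance are hypothetical):

    ```python
    # Toy check of a "same Delta-V over L" consistency criterion for parallel
    # dipoles. A candidate signal passes if the inferred field Delta-V / L
    # agrees across dipoles of different lengths within a tolerance.
    # Dipole lengths, readings, and the tolerance below are hypothetical.

    def passes_dv_over_l(readings, tol=0.2):
        """readings: list of (delta_v_millivolts, length_m) for parallel dipoles."""
        fields = [dv / l for dv, l in readings]
        mean = sum(fields) / len(fields)
        return all(abs(f - mean) <= tol * abs(mean) for f in fields)

    # e.g. two N-S dipoles, 100 m and 200 m:
    print(passes_dv_over_l([(1.0, 100.0), (2.1, 200.0)]))  # True: consistent field
    print(passes_dv_over_l([(1.0, 100.0), (0.5, 200.0)]))  # False: likely electrode noise
    ```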

  6. Four Examples of Short-Term and Imminent Prediction of Earthquakes

    Science.gov (United States)

    Zeng, Zuoxun; Liu, Genshen; Wu, Dabin; Sibgatulin, Victor

    2014-05-01

    We show here four examples of short-term and imminent prediction of earthquakes in China last year: the Nima earthquake (Ms5.2), the Minxian earthquake (Ms6.6), the Nantou earthquake (Ms6.7) and the Dujiangyan earthquake (Ms4.1). Imminent prediction of the Nima earthquake (Ms5.2): based on comprehensive analysis of the prediction of Victor Sibgatulin using natural electromagnetic pulse anomalies, the prediction of Song Song and Song Kefu using observation of a precursory halo, and his own observation of the locations of a degasification of the earth in Naqu, Tibet, the first author predicted an earthquake of around Ms 6 within 10 days in the area of the degasification point (31.5N, 89.0E) at 0:54 on May 8th, 2013. He supplied another degasification point (31N, 86E) for the epicenter prediction at 8:34 of the same day. At 18:54:30 on May 15th, 2013, an earthquake of Ms5.2 occurred in Nima County, Naqu, China. Imminent prediction of the Minxian earthquake (Ms6.6): at 7:45 on July 22nd, 2013, an earthquake of magnitude Ms6.6 occurred at the border between Minxian and Zhangxian of Dingxi City (34.5N, 104.2E), Gansu Province. We review the imminent prediction process and basis for this earthquake using the fingerprint method. Curves of 9 or 15 anomalous components versus time can be output from the SW monitor for earthquake precursors; these components include geomagnetism, geoelectricity, crustal stresses, resonance and crustal inclination. When we compress the time axis, the output curves become distinct geometric images, and the precursor images differ for earthquakes in different regions; alike or similar images correspond to earthquakes in a certain region. According to seven years of observation of the precursor images and their corresponding earthquakes, we usually obtain the fingerprint 6 days before the corresponding earthquakes. The magnitude prediction requires comparison between the amplitudes of the fingerprints from the same

  7. Long-term predictability of regions and dates of strong earthquakes

    Science.gov (United States)

    Kubyshen, Alexander; Doda, Leonid; Shopin, Sergey

    2016-04-01

    Results on the long-term predictability of strong earthquakes are discussed. It is shown that the dates of earthquakes with M>5.5 can be determined several months in advance of the event, and the magnitude and region of an approaching earthquake can be specified within a month of the event. The number of M6+ earthquakes expected to occur during the analyzed year is determined using a special sequence diagram of seismic activity for the century time frame; this date analysis can be performed 15-20 years in advance. The data are verified by a monthly sequence diagram of seismic activity. The number of strong earthquakes expected to occur in the analyzed month is determined by several methods having different prediction horizons. Days of potential earthquakes with M5.5+ are determined using astronomical data: earthquakes occur on days of oppositions of Solar System planets (arranged in a single line), and the strongest earthquakes occur when the vector from the Sun to the Solar System barycenter lies in the ecliptic plane. Details of this astronomical multivariate indicator still require further research, but its practical significance is confirmed by practice. Another empirical indicator of an approaching M6+ earthquake is a synchronous variation of meteorological parameters: an abrupt decrease of the minimal daily temperature, an increase of relative humidity, and an abrupt change of atmospheric pressure (RAMES method). The time difference between the predicted and actual date is no more than one day. This indicator is registered 104 days before the earthquake, so it was called Harmonic 104 or H-104. This fact looks paradoxical, but the works of A. Sytinskiy and V. Bokov on the correlation of global atmospheric circulation and seismic events give a physical basis for this empirical fact. Also, 104 days is a quarter of a Chandler period, which gives insight into the correlation between the anomalies of Earth orientation

  8. The USGS plan for short-term prediction of the anticipated Parkfield earthquake

    Science.gov (United States)

    Bakun, W.H.

    1988-01-01

    Aside from the goal of better understanding the Parkfield earthquake cycle, it is the intention of the U.S. Geological Survey to attempt to issue a warning shortly before the anticipated earthquake. Although short-term earthquake warnings are not yet generally feasible, the wealth of information available for the previous significant Parkfield earthquakes suggests that if the next earthquake follows the pattern of "characteristic" Parkfield shocks, such a warning might be possible. Focusing on earthquake precursors reported for the previous "characteristic" shocks, particularly the 1934 and 1966 events, the USGS developed a plan in late 1985 on which to base earthquake warnings for Parkfield and has assisted state, county, and local officials in the Parkfield area to prepare a coordinated, reasonable response to a warning, should one be issued.

  9. Intermediate-term earthquake prediction and seismic zoning in Northern Italy

    International Nuclear Information System (INIS)

    Panza, G.F.; Orozova Stanishkova, I.; Costa, G.; Vaccari, F.

    1993-12-01

    The algorithm CN for intermediate-term earthquake prediction has been applied to an area in Northern Italy chosen according to a recently proposed seismotectonic model. Earthquakes with magnitude ≥ 5.4 occur in the area with significant frequency, and their occurrence is predicted by algorithm CN. A seismic hazard analysis has therefore been performed using a deterministic procedure based on the computation of complete synthetic seismograms. The results are summarized in a map giving the distribution of peak ground acceleration, but the complete time series are also available and can be used by civil engineers in the design of new earthquake-resistant constructions and in the retrofitting of existing ones. This risk-reduction action should be intensified in connection with warnings issued on the basis of the forward predictions made by CN. (author). Refs, 7 figs, 1 tab

  10. Foreshock sequences and short-term earthquake predictability on East Pacific Rise transform faults.

    Science.gov (United States)

    McGuire, Jeffrey J; Boettcher, Margaret S; Jordan, Thomas H

    2005-03-24

    East Pacific Rise transform faults are characterized by high slip rates (more than ten centimetres a year), predominantly aseismic slip and maximum earthquake magnitudes of about 6.5. Using recordings from a hydroacoustic array deployed by the National Oceanic and Atmospheric Administration, we show here that East Pacific Rise transform faults also have a low number of aftershocks and high foreshock rates compared to continental strike-slip faults. The high ratio of foreshocks to aftershocks implies that such transform-fault seismicity cannot be explained by seismic triggering models in which there is no fundamental distinction between foreshocks, mainshocks and aftershocks. The foreshock sequences on East Pacific Rise transform faults can be used to predict (retrospectively) earthquakes of magnitude 5.4 or greater, in narrow spatial and temporal windows and with a high probability gain. The predictability of such transform earthquakes is consistent with a model in which slow slip transients trigger earthquakes, enrich their low-frequency radiation and accommodate much of the aseismic plate motion.

  11. Earthquake prediction with electromagnetic phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Hayakawa, Masashi, E-mail: hayakawa@hi-seismo-em.jp [Hayakawa Institute of Seismo Electromagnetics Co. Ltd., University of Electro-Communications (UEC) Incubation Center, 1-5-1 Chofugaoka, Chofu, Tokyo 182-8585 (Japan); Advanced Wireless & Communications Research Center, UEC, Chofu, Tokyo (Japan); Earthquake Analysis Laboratory, Information Systems Inc., 4-8-15 Minami-aoyama, Minato-ku, Tokyo 107-0062 (Japan); Fuji Security Systems Co. Ltd., Iwato-cho 1, Shinjyuku-ku, Tokyo (Japan)

    2016-02-01

    Short-term earthquake (EQ) prediction is defined as prospective prediction on a time scale of about one week, and it is considered one of the most important and urgent topics for human beings. If short-term prediction is realized, casualties will be drastically reduced. Unlike conventional seismic measurement, we have proposed the use of electromagnetic phenomena as precursors to EQs, and an extensive amount of progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper reviews short-term EQ prediction, including the myth that EQ prediction by seismometers is impossible, the reason why we are interested in electromagnetics, the history of seismo-electromagnetics, ionospheric perturbation as the most promising candidate for EQ prediction, the future of EQ predictology from the two standpoints of a practical science and a pure science, and finally a brief summary.

  12. Intermediate-term medium-range earthquake prediction algorithm M8: A new spatially stabilized application in Italy

    International Nuclear Information System (INIS)

    Romashkova, L.L.; Kossobokov, V.G.; Peresan, A.; Panza, G.F.

    2001-12-01

    A series of experiments based on the intermediate-term earthquake prediction algorithm M8 has been performed for the retrospective simulation of forward predictions in the Italian territory, with the aim of designing an experimental routine for real-time predictions. These experiments revealed two main difficulties for the application of M8 in Italy. The first is that regional catalogues are usually limited in space. The second concerns a certain arbitrariness and instability with respect to the positioning of the circles of investigation. Here we design a new scheme for the application of the algorithm M8 that is less subjective and less sensitive to the position of the circles of investigation. To perform this test, we consider a recent revision of the Italian catalogue, named UCI2001, composed of CCI1996, NEIC and ALPOR data for the period 1900-1985 and updated with NEIC data, which reduces the spatial heterogeneity of the data at the boundaries of Italy. The new variant of the M8 algorithm application reduces the number of spurious alarms and increases the reliability of predictions. As a result, three out of four earthquakes with magnitude larger than 6.0 are predicted in the retrospective simulation of forward prediction during the period 1972-2001, with a space-time volume of alarms comparable to that obtained with the non-stabilized variant of the M8 algorithm in Italy. (author)

  13. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  14. Earthquake prediction by Kiana Method

    International Nuclear Information System (INIS)

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has been one of mankind's earliest desires, and scientists have worked hard at it for a long time. The results of these efforts can generally be divided into two methods of prediction: 1) the statistical method, and 2) the empirical method. In the first method, earthquakes are predicted using statistics and probabilities, while the second method utilizes a variety of precursors for earthquake prediction; the latter method is time-consuming and more costly. However, neither method has been fully satisfactory so far. In this paper a new method, entitled the 'Kiana Method', is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method the electrical and magnetic precursors are measured in an area, and the time and magnitude of a future earthquake are then calculated using electrical formulas, in particular those for electrical capacitors. In this method, by daily measurement of electrical resistance in an area, we determine whether or not the area is capable of earthquake occurrence in the future. If the result shows a positive sign, then the occurrence time and magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  15. Radon observation for earthquake prediction

    Energy Technology Data Exchange (ETDEWEB)

    Wakita, Hiroshi [Tokyo Univ. (Japan)]

    1998-12-31

    Systematic observation of groundwater radon for the purpose of earthquake prediction began in Japan in late 1973. Continuous observations are conducted at fixed stations using deep wells and springs. During the observation period, significant precursory changes, including those before the 1978 Izu-Oshima-kinkai (M7.0) earthquake, as well as numerous coseismic changes, were observed. At the time of the 1995 Kobe (M7.2) earthquake, significant changes in chemical components, including radon dissolved in groundwater, were observed near the epicentral region. Precursory changes are presumably caused by permeability changes due to micro-fracturing in basement rock or by migration of water from different sources during the preparation stage of earthquakes. Coseismic changes may be caused by seismic shaking and by changes in regional stress. Significant drops of radon concentration in groundwater have been observed after earthquakes at the KSM site. The occurrence of such drops appears to be time-dependent, and possibly reflects changes in the regional stress state of the observation area. The absence of radon drops seems to be correlated with periods of reduced regional seismic activity. Experience accumulated over the past two decades allows us to reach some conclusions: 1) changes in groundwater radon do occur prior to large earthquakes; 2) some sites are particularly sensitive to earthquake occurrence; and 3) the sensitivity changes over time. (author)

  16. Probabilistic approach to earthquake prediction.

    Directory of Open Access Journals (Sweden)

    G. D'Addezio

    2002-06-01

    Full Text Available The evaluation of any earthquake forecast hypothesis requires the application of rigorous statistical methods. It implies a univocal definition of the model characterising the anomaly or precursor concerned, so that it can be objectively recognised in any circumstance and by any observer. A valid forecast hypothesis is expected to maximise successes and minimise false alarms. The probability gain associated with a precursor is also a popular way to estimate the quality of the predictions based on that precursor. Some scientists make use of a statistical approach based on the computation of the likelihood of an observed realisation of seismic events, and on the comparison of the likelihoods obtained under different hypotheses (a minimal version of this comparison is sketched below). This method can be extended to algorithms that allow the computation of the density distribution of the conditional probability of earthquake occurrence in space, time and magnitude. Whatever method is chosen for building up a new hypothesis, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this test could, however, be problematic for seismicity characterised by long-term recurrence intervals. Even using the historical record, which may span time windows varying from a few centuries to a few millennia, we have a low probability of catching more than one or two events on the same fault. By extending the record of past earthquakes back in time up to several millennia, paleoseismology represents a great opportunity to study how earthquakes recur through time and thus provide innovative contributions to time-dependent seismic hazard assessment. Sets of paleoseismologically dated earthquakes have been established for some faults in the Mediterranean area: the Irpinia fault in Southern Italy, the Fucino fault in Central Italy, the El Asnam fault in Algeria and the Skinos fault in Central Greece. By using the age of the
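
    As a minimal illustration of the likelihood comparison described above, the following sketch scores two rate forecasts against observed counts in a few space-time bins using the Poisson log-likelihood (bins and numbers are invented for illustration):

    ```python
    import math

    # Sketch: compare two forecast hypotheses by the Poisson log-likelihood of
    # the observed event counts. lam[i] = forecast expected count in bin i,
    # n[i] = observed count. All numbers are invented for illustration.

    def poisson_loglik(lam, n):
        return sum(k * math.log(l) - l - math.lgamma(k + 1) for l, k in zip(lam, n))

    observed = [0, 2, 1, 0, 5]                    # counts in 5 space-time bins
    hypothesis_a = [0.5, 1.5, 1.0, 0.5, 4.0]      # clustered-rate hypothesis
    hypothesis_b = [sum(observed) / 5.0] * 5      # uniform-rate hypothesis

    print("log-likelihood A:", round(poisson_loglik(hypothesis_a, observed), 2))
    print("log-likelihood B:", round(poisson_loglik(hypothesis_b, observed), 2))
    # The hypothesis with the higher log-likelihood is better supported; as the
    # abstract stresses, a proper test then uses a new, independent data set.
    ```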

  17. Earthquake prediction theory and its relation to precursors

    International Nuclear Information System (INIS)

    Negarestani, A.; Setayeshi, S.; Ghannadi-Maragheh, M.; Akasheh, B.

    2001-01-01

    Since we do not have enough knowledge about the physics of earthquakes, the study of seismic precursors plays an important role in earthquake prediction. Earthquake prediction is a science that discusses precursory phenomena during the seismogenic process, investigates the correlations among them and the intrinsic relation between precursors and the seismogenic process, judges the seismic status comprehensively, and finally makes an earthquake prediction. There are two approaches to earthquake prediction: the first is to study the physics of the seismogenic process and to determine the parameters of the process based on source theories; the second is to use seismic precursors. In this paper the theory of earthquakes is reviewed. We also study earthquake theory using models of earthquake origin and the relation between the seismogenic process and the various accompanying precursory phenomena. Earthquake prediction is divided into three categories: long-term, medium-term and short-term. We study anomalous seismic behavior, the electric field, crustal deformation, gravity, the Earth's magnetism, changes in groundwater level, groundwater geochemistry and changes in radon gas emission. Finally, it is concluded that there is a correlation between radon gas emission and earthquake phenomena. Some samples from actual processing in this area are also given.

  18. Earthquake predictions using seismic velocity ratios

    Science.gov (United States)

    Sherburne, R. W.

    1979-01-01

    Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions in reducing human and economic losses, and the value of long-range earthquake prediction for planning, are obvious. Less clear are the long-range economic and social impacts of earthquake prediction on a specific area. The general consensus among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency.

  19. Dim prospects for earthquake prediction

    Science.gov (United States)

    Geller, Robert J.

    I was misquoted by C. Lomnitz's [1998] Forum letter (Eos, August 4, 1998, p. 373), which said: "I wonder whether Sasha Gusev [1998] actually believes that branding earthquake prediction a 'proven nonscience' [Geller, 1997a] is a paradigm for others to copy." Readers are invited to verify for themselves that neither "proven nonscience" nor any similar phrase was used by Geller [1997a].

  20. Is It Possible to Predict Strong Earthquakes?

    Science.gov (United States)

    Polyakov, Y. S.; Ryabinin, G. V.; Solovyeva, A. B.; Timashev, S. F.

    2015-07-01

    The possibility of earthquake prediction is one of the key open questions in modern geophysics. We propose an approach based on the analysis of common short-term candidate precursors (2 weeks to 3 months prior to a strong earthquake) with subsequent processing of brain activity signals generated in specific types of rats (kept in laboratory settings) that reportedly sense an impending earthquake a few days prior to the event. We illustrate the identification of short-term precursors using groundwater sodium-ion concentration data in the time frame from 2010 to 2014 (a major earthquake occurred on 28 February 2013) recorded at two different sites in the southeastern part of the Kamchatka Peninsula, Russia. The candidate precursors are observed as synchronized peaks in the nonstationarity factors, introduced within the flicker-noise spectroscopy framework for signal processing, for the high-frequency components of both time series. These peaks correspond to local reorganizations of the underlying geophysical system that are believed to precede strong earthquakes. The rodent brain activity signals are selected as potential "immediate" (up to 2 weeks) deterministic precursors because of recent scientific reports confirming that rodents sense imminent earthquakes and the population-genetic model of Kirschvink (Bull. Seismol. Soc. Am. 90, 312-323, 2000) showing how a reliable genetic seismic escape response system may have developed over a period of several hundred million years in certain animals. The use of brain activity signals, such as electroencephalograms, in contrast to conventional abnormal animal behavior observations, enables one to apply the standard "input-sensor-response" approach to determine what input signals trigger specific seismic escape brain activity responses.

  1. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

    Science.gov (United States)

    Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

    2012-01-01

    The behavior of individual events in repeating earthquake sequences in California, Taiwan and Japan is better predicted by a model with fixed inter-event time or fixed slip than by the time- and slip-predictable models for earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript that shows similar results for laboratory experiments, we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or a fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, though its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the time and slip of these events are predicted quite well by the fixed-slip and fixed-recurrence models, so in some sense they are time- and slip-predictable (a minimal scoring comparison is sketched below). While fixed-recurrence and fixed-slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.
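
    A minimal version of this model comparison, on a synthetic repeating sequence (numbers invented, not the authors' data): the fixed-recurrence model predicts the mean of the prior intervals, while the time-predictable model predicts the predecessor's slip divided by the long-term slip rate. The synthetic data below are constructed with steady intervals and variable slips, the regime the abstract reports:

    ```python
    # Sketch: score fixed-recurrence vs. time-predictable forecasts of the next
    # inter-event time for a synthetic repeating-earthquake sequence.
    # Times (years) and slips (cm) are synthetic illustrations.

    times = [0.0, 2.0, 4.1, 6.0, 8.1, 10.0]   # event times
    slips = [2.5, 1.6, 2.4, 1.8, 2.3, 2.0]    # slip in each event
    rate = sum(slips[:-1]) / (times[-1] - times[0])   # long-term slip rate, cm/yr

    err_fixed, err_tp = [], []
    for i in range(2, len(times)):
        observed_dt = times[i] - times[i - 1]
        mean_dt = (times[i - 1] - times[0]) / (i - 1)  # fixed recurrence: mean prior interval
        tp_dt = slips[i - 1] / rate                    # time-predictable: predecessor slip / rate
        err_fixed.append(abs(observed_dt - mean_dt))
        err_tp.append(abs(observed_dt - tp_dt))

    print("mean |error|, fixed recurrence :", round(sum(err_fixed) / len(err_fixed), 3))
    print("mean |error|, time-predictable:", round(sum(err_tp) / len(err_tp), 3))
    ```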

  2. Earthquake Prediction in a Big Data World

    Science.gov (United States)

    Kossobokov, V. G.

    2016-12-01

    The digital revolution that started about 15 years ago has already surpassed a global information storage capacity of more than 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data world provide unprecedented opportunities for enhancing studies of the Earth system. However, they also open wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task, and it implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regretfully, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and of short-term earthquake forecasting (StEF), claims of a high potential of the method are based on a flawed application of statistics and are therefore hardly suitable for communication to decision makers. Self-testing must be done in advance of claiming prediction of hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s, and the Seismic Roulette null hypothesis as a metric of the alerted space, is evident. The set of errors, i.e., the rates of failure and of the alerted space-time volume, can easily be compared to random guessing, and this comparison permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function (see the sketch below). This and other information obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if reliable but not necessarily perfect, with related recommendations on the level of risks for decision making in regard to engineering design, insurance
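
    For reference, the Error Diagram mentioned here reduces each prediction strategy to a point (tau, nu): the fraction of space-time kept on alert versus the fraction of target earthquakes missed. Random guessing lies on the diagonal nu = 1 - tau. A minimal sketch under those definitions (counts are hypothetical):

    ```python
    # Sketch: one point of a Molchan error diagram for an alarm-based strategy.
    # tau = alerted fraction of the space-time volume (weighted as appropriate),
    # nu  = fraction of target earthquakes missed by the alarms.
    # Random guessing satisfies nu = 1 - tau; useful strategies plot below it.

    def molchan_point(n_targets: int, n_missed: int, alerted_fraction: float):
        tau = alerted_fraction
        nu = n_missed / n_targets
        skill = (1.0 - tau) - nu   # > 0 means better than random guessing
        return tau, nu, skill

    tau, nu, skill = molchan_point(n_targets=10, n_missed=2, alerted_fraction=0.3)
    print(f"tau={tau:.2f}, nu={nu:.2f}, margin above random guessing={skill:.2f}")
    ```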

  3. Collaboratory for the Study of Earthquake Predictability

    Science.gov (United States)

    Schorlemmer, D.; Jordan, T. H.; Zechar, J. D.; Gerstenberger, M. C.; Wiemer, S.; Maechling, P. J.

    2006-12-01

    Earthquake prediction is one of the most difficult problems in physical science and, owing to its societal implications, one of the most controversial. The study of earthquake predictability has been impeded by the lack of an adequate experimental infrastructure---the capability to conduct scientific prediction experiments under rigorous, controlled conditions and evaluate them using accepted criteria specified in advance. To remedy this deficiency, the Southern California Earthquake Center (SCEC) is working with its international partners, which include the European Union (through the Swiss Seismological Service) and New Zealand (through GNS Science), to develop a virtual, distributed laboratory with a cyberinfrastructure adequate to support a global program of research on earthquake predictability. This Collaboratory for the Study of Earthquake Predictability (CSEP) will extend the testing activities of SCEC's Working Group on Regional Earthquake Likelihood Models, from which we will present first results. CSEP will support rigorous procedures for registering prediction experiments on regional and global scales, community-endorsed standards for assessing probability-based and alarm-based predictions, access to authorized data sets and monitoring products from designated natural laboratories, and software to allow researchers to participate in prediction experiments. CSEP will encourage research on earthquake predictability by supporting an environment for scientific prediction experiments that allows the predictive skill of proposed algorithms to be rigorously compared with standardized reference methods and data sets. It will thereby reduce the controversies surrounding earthquake prediction, and it will allow the results of prediction experiments to be communicated to the scientific community, governmental agencies, and the general public in an appropriate research context.

  4. The October 1992 Parkfield, California, earthquake prediction

    Science.gov (United States)

    Langbein, J.

    1992-01-01

    A magnitude 4.7 earthquake occurred near Parkfield, California, on October 20, 1992, at 05:28 UTC (October 19 at 10:28 p.m. local or Pacific Daylight Time). This moderate shock, interpreted as a potential foreshock of a damaging earthquake on the San Andreas fault, triggered long-standing federal, state and local government plans to issue a public warning of an imminent magnitude 6 earthquake near Parkfield. Although the predicted earthquake did not take place, the sophisticated suites of instruments deployed as part of the Parkfield Earthquake Prediction Experiment recorded valuable data associated with an unusual series of events. This article describes the geological aspects of these events, which occurred near Parkfield in October 1992. The accompanying article, an edited version of a press conference by Richard Andrews, the Director of the California Office of Emergency Services (OES), describes the governmental response to the prediction.

  5. Quantitative Earthquake Prediction on Global and Regional Scales

    International Nuclear Information System (INIS)

    Kossobokov, Vladimir G.

    2006-01-01

    The Earth is a hierarchy of volumes of different sizes. Driven by planetary convection, these volumes are involved in joint and relative movement. The movement is controlled by a wide variety of processes on and around the fractal mesh of boundary zones, and it produces earthquakes. This hierarchy of movable volumes composes a large non-linear dynamical system. Prediction of such a system, in the sense of extrapolation of a trajectory into the future, is futile. However, upon coarse-graining, integral empirical regularities emerge, opening possibilities of prediction in the sense of the commonly accepted consensus definition worked out in 1976 by the US National Research Council. Implications of understanding the hierarchical nature of the lithosphere and its dynamics, based on systematic monitoring and evidence of its unified space-energy similarity at different scales, help avoid basic errors in earthquake prediction claims. They suggest rules and recipes for adequate earthquake prediction classification, comparison and optimization. The approach has already led to the design of a reproducible intermediate-term middle-range earthquake prediction technique. Its real-time testing, aimed at prediction of the largest earthquakes worldwide, has proved beyond any reasonable doubt the effectiveness of practical earthquake forecasting. In the first approximation, the accuracy is about 1-5 years and 5-10 times the anticipated source dimension. Further analysis allows reducing the spatial uncertainty down to 1-3 source dimensions, although at a cost of additional failures-to-predict. Despite the limited accuracy, considerable damage could be prevented by timely, knowledgeable use of the existing predictions and earthquake prediction strategies. The December 26, 2004 Indian Ocean disaster seems to be the first indication that the methodology, designed for prediction of M8.0+ earthquakes, can be rescaled for prediction of both smaller magnitude earthquakes (e.g., down to M5.5+ in Italy) and

  6. Quantitative Earthquake Prediction on Global and Regional Scales

    Science.gov (United States)

    Kossobokov, Vladimir G.

    2006-03-01

    The Earth is a hierarchy of volumes of different sizes. Driven by planetary convection, these volumes are involved in joint and relative movement. The movement is controlled by a wide variety of processes on and around the fractal mesh of boundary zones, and it produces earthquakes. This hierarchy of movable volumes composes a large non-linear dynamical system. Prediction of such a system, in the sense of extrapolation of a trajectory into the future, is futile. However, upon coarse-graining, integral empirical regularities emerge, opening possibilities of prediction in the sense of the commonly accepted consensus definition worked out in 1976 by the US National Research Council. Implications of understanding the hierarchical nature of the lithosphere and its dynamics, based on systematic monitoring and evidence of its unified space-energy similarity at different scales, help avoid basic errors in earthquake prediction claims. They suggest rules and recipes for adequate earthquake prediction classification, comparison and optimization. The approach has already led to the design of a reproducible intermediate-term middle-range earthquake prediction technique. Its real-time testing, aimed at prediction of the largest earthquakes worldwide, has proved beyond any reasonable doubt the effectiveness of practical earthquake forecasting. In the first approximation, the accuracy is about 1-5 years and 5-10 times the anticipated source dimension. Further analysis allows reducing the spatial uncertainty down to 1-3 source dimensions, although at a cost of additional failures-to-predict. Despite the limited accuracy, considerable damage could be prevented by timely, knowledgeable use of the existing predictions and earthquake prediction strategies. The December 26, 2004 Indian Ocean disaster seems to be the first indication that the methodology, designed for prediction of M8.0+ earthquakes, can be rescaled for prediction of both smaller magnitude earthquakes (e.g., down to M5.5+ in Italy) and

  7. Prospective testing of Coulomb short-term earthquake forecasts

    Science.gov (United States)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary responses. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each "perpetrator" earthquake but before the triggered earthquakes, or "victims". The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily and allows daily updates of the models. However, a lot can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough, and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be computed without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for "censoring" of early aftershock data, and a quantitative model for detection threshold as a function of

  8. Strong ground motion prediction using virtual earthquakes.

    Science.gov (United States)

    Denolle, M A; Dunham, E M; Prieto, G A; Beroza, G C

    2014-01-24

    Sedimentary basins increase the damaging effects of earthquakes by trapping and amplifying seismic waves. Simulations of seismic wave propagation in sedimentary basins capture this effect; however, there exists no method to validate these results for earthquakes that have not yet occurred. We present a new approach for ground motion prediction that uses the ambient seismic field. We apply our method to a suite of magnitude 7 scenario earthquakes on the southern San Andreas fault and compare our ground motion predictions with simulations. Both methods find strong amplification and coupling of source and structure effects, but they predict substantially different shaking patterns across the Los Angeles Basin. The virtual earthquake approach provides a new avenue for predicting long-period strong ground motion.
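
    The key operation behind the virtual earthquake approach, extracting an empirical Green's function from the ambient field, can be illustrated with a toy cross-correlation: two stations record the same noise with a relative delay, and stacking correlations over many windows recovers that travel time. This is a bare-bones sketch on synthetic data; real workflows add spectral whitening, amplitude normalization, and careful source geometry:

    ```python
    import numpy as np

    # Toy sketch of the ambient-field idea: station B sees the same noise field
    # as station A, 50 samples later; the stacked cross-correlation peaks at
    # the inter-station travel time (a miniature empirical Green's function).

    rng = np.random.default_rng(0)
    fs, nwin, wlen, delay = 100, 50, 1024, 50   # Hz, windows, samples, samples
    stack = np.zeros(2 * wlen - 1)
    for _ in range(nwin):
        noise = rng.standard_normal(wlen + delay)
        sta_a = noise[delay:]                   # station A record
        sta_b = noise[:-delay]                  # station B: same field, delayed
        stack += np.correlate(sta_a, sta_b, mode="full")
    stack /= nwin

    lag_samples = abs(int(np.argmax(stack)) - (wlen - 1))
    print(f"recovered travel time ~ {lag_samples / fs:.2f} s "
          f"(expected {delay / fs:.2f} s)")
    ```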

  9. Earthquake prediction in Japan and natural time analysis of seismicity

    Science.gov (United States)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    The M9 super-giant earthquake and huge tsunami that devastated East Japan on 11 March 2011 caused more than 20,000 casualties and serious damage to the Fukushima nuclear plant. This earthquake was predicted neither short-term nor long-term; seismologists were shocked because it was not even considered possible at the East Japan subduction zone. However, it was not the only unpredicted earthquake. In fact, throughout several decades of the National Earthquake Prediction Project, not a single earthquake was predicted. In reality, practically no effective research has been conducted on the most important problem, short-term prediction. This happened because the Japanese national project was devoted to the construction of elaborate seismic networks, which was not the best way to pursue short-term prediction. After the Kobe disaster, in order to parry the mounting criticism of their history of no success, they defiantly changed their policy to "stop aiming at short-term prediction because it is impossible and concentrate resources on fundamental research", which meant obtaining more funding for research without prediction. The public were not, and are not, informed about this change. Obviously, earthquake prediction will be possible only when reliable precursory phenomena are caught, and we have insisted that this will most likely be done through non-seismic means such as geochemical/hydrological and electromagnetic monitoring. Admittedly, the lack of convincing precursors for the M9 super-giant earthquake has an adverse effect on our case, although its epicenter was far offshore, out of the range of the operating monitoring systems. In this presentation, we show a new possibility of finding remarkable precursory signals, ironically, in ordinary seismological catalogs. In the frame of the new time domain termed natural time, an order parameter of seismicity, κ1, has been introduced: the variance of natural time χ weighted by the normalised energy release at each χ. In the case that Seismic Electric Signals
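
    For reference, the order parameter named here has a compact definition in the natural time literature: for N events with energies (or seismic moments) Q_k, one sets χ_k = k/N, weights p_k = Q_k / ΣQ_j, and computes the variance κ1 = Σ p_k χ_k² − (Σ p_k χ_k)². A direct sketch of that formula (the event energies are hypothetical):

    ```python
    # Sketch of the natural-time order parameter kappa_1: the variance of
    # natural time chi_k = k/N weighted by the normalized energy release p_k
    # of each event. The energy list below is hypothetical.

    def kappa1(energies):
        n = len(energies)
        total = sum(energies)
        p = [q / total for q in energies]          # normalized energy of event k
        chi = [(k + 1) / n for k in range(n)]      # natural time of event k
        mean = sum(pk * ck for pk, ck in zip(p, chi))
        return sum(pk * ck ** 2 for pk, ck in zip(p, chi)) - mean ** 2

    print(kappa1([1.0, 3.5, 2.0, 8.0, 1.2, 4.1]))  # e.g. moments of 6 events
    ```

    In that body of work, values of κ1 approaching 0.070 are reported as the signature of an approach to criticality.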

  10. Using remote sensing to predict earthquake impacts

    Science.gov (United States)

    Fylaktos, Asimakis; Yfantidou, Anastasia

    2017-09-01

    Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), which led to substantial loss of life and extensive damage to property, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel 1A and optical imagery from Landsat 8 to examine differences in topography with the aid of the multi-temporal monitoring technique, which is suited to the observation of any surface deformation. This database is a cluster of information regarding the consequences of the earthquakes, grouped into categories such as property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformation, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of such earthquakes. In the future, we can enrich this database with more regions and enhance the variety of its applications; for instance, we could estimate the future impacts of any type of earthquake in several areas and design a preliminary model of emergency for immediate evacuation and quick recovery response. It is important to know how the surface moves in particular geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.

  11. A mathematical model for predicting earthquake occurrence ...

    African Journals Online (AJOL)

    We consider the continental crust under damage. We use observed microseism results from many seismic stations of the world, established to study the time series of the activities of the continental crust, with a view to predicting the possible time of occurrence of an earthquake. We consider microseism time series ...

  12. A Deterministic Approach to Earthquake Prediction

    Directory of Open Access Journals (Sweden)

    Vittorio Sgrigna

    2012-01-01

    Full Text Available The paper aims at giving suggestions for a deterministic approach to investigate possible earthquake prediction and warning. A fundamental contribution can come from observations and physical modeling of earthquake precursors aimed at seeing the earthquake phenomenon in perspective, within the framework of a unified theory able to explain the causes of its genesis and the dynamics, rheology, and microphysics of its preparation, occurrence, postseismic relaxation, and interseismic phases. Studies based on combined ground and space observations of earthquake precursors are essential to address the issue. Unfortunately, up to now, what is lacking is the demonstration of a causal relationship (with explained physical processes), looking for a correlation between data gathered simultaneously and continuously by space observations and ground-based measurements. In doing this, modern and/or new methods and technologies have to be adopted to try to solve the problem. Coordinated space- and ground-based observations imply available test sites on the Earth's surface to correlate ground data, collected by appropriate networks of instruments, with space data detected on board Low-Earth-Orbit (LEO) satellites. Moreover, a strong new theoretical scientific effort is necessary to try to understand the physics of the earthquake.

  13. 78 FR 64973 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2013-10-30

    ... DEPARTMENT OF THE INTERIOR Geological Survey [GX14GG009950000] National Earthquake Prediction...: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council (NEPEC) will hold a... Council shall advise the Director of the U.S. Geological Survey on proposed earthquake predictions, on the...

  14. 76 FR 69761 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2011-11-09

    ... DEPARTMENT OF THE INTERIOR U.S. Geological Survey National Earthquake Prediction Evaluation... 96-472, the National Earthquake Prediction Evaluation Council (NEPEC) will hold a 1\\1/2\\-day meeting.... Geological Survey on proposed earthquake predictions, on the completeness and scientific validity of the...

  15. 76 FR 19123 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2011-04-06

    ... Earthquake Prediction Evaluation Council (NEPEC) AGENCY: U.S. Geological Survey, Interior. ACTION: Notice of meeting. SUMMARY: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council... proposed earthquake predictions, on the completeness and scientific validity of the available data related...

  16. Signals of ENPEMF Used in Earthquake Prediction

    Science.gov (United States)

    Hao, G.; Dong, H.; Zeng, Z.; Wu, G.; Zabrodin, S. M.

    2012-12-01

    The signals of the Earth's natural pulse electromagnetic field (ENPEMF) are a combination of the abnormal crustal magnetic field pulses affected by earthquakes, the induced field of the Earth's endogenous magnetic field, the induced magnetic field of the exogenous variation magnetic field, geomagnetic pulsation disturbance, and other energy coupling processes between Sun and Earth. As an instantaneous disturbance of the variation field of natural geomagnetism, ENPEMF can be used to predict earthquakes. This theory was introduced by A.A. Vorobyov, who expressed the hypothesis that pulses can arise not only in the atmosphere but within the Earth's crust due to processes of tectonic-to-electric energy conversion (Vorobyov, 1970; Vorobyov, 1979). The global field time scale of ENPEMF signals has specific stability: although the wave curves may not overlap completely in different regions, the smoothed diurnal ENPEMF patterns always exhibit the same trend per month. This feature is a good reference for observing abnormalities of the Earth's natural magnetic field in a specific region. The frequencies of ENPEMF signals generally lie in the kHz range, where frequencies within the 5-25 kHz range can be applied to monitor earthquakes; in Wuhan, the best observation frequency is 14.5 kHz. Two special devices are placed in accordance with the S-N and W-E directions. A dramatic variation in the comparison between the pulse waveforms obtained from the instruments and the normal reference envelope diagram should indicate a high possibility of an earthquake (a toy version of this comparison is sketched below). The proposed ENPEMF-based earthquake detection method can improve the geodynamic monitoring effect and can enrich earthquake prediction methods. We suggest that prospective further research concerns the exact source composition of ENPEMF signals, the distinction between noise and useful signals, and the effect of the Earth's gravity tide and solid tidal waves. This method may also provide a promising application in
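
    The comparison rule described above, observed pulse envelope versus a smoothed reference envelope, can be sketched as a simple outlier test (the hourly counts, reference shape, and threshold are all hypothetical, not taken from the abstract):

    ```python
    # Sketch: flag hours where the observed ENPEMF pulse-count envelope departs
    # strongly from the smoothed monthly reference pattern for the same hour.

    def anomalous_hours(observed, reference, k=2.5):
        """observed, reference: 24 hourly values; returns flagged hours."""
        resid = [o - r for o, r in zip(observed, reference)]
        mu = sum(resid) / len(resid)
        sd = (sum((x - mu) ** 2 for x in resid) / len(resid)) ** 0.5
        return [h for h, x in enumerate(resid) if abs(x - mu) > k * sd]

    reference = [50 + 30 * (1 if 6 <= h <= 18 else 0) for h in range(24)]  # diurnal shape
    observed = reference[:]
    observed[3] += 400     # inject a nighttime pulse burst
    print(anomalous_hours(observed, reference))   # -> [3]
    ```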

  17. Short-term and long-term earthquake occurrence models for Italy: ETES, ERS and LTST

    Directory of Open Access Journals (Sweden)

    Maura Murru

    2010-11-01

    Full Text Available This study describes three earthquake occurrence models as applied to the whole Italian territory, to assess the occurrence probabilities of future (M ≥ 5.0) earthquakes: two short-term (24-hour) models, and one long-term (5- and 10-year) model. The first model for short-term forecasts is a purely stochastic epidemic-type earthquake sequence (ETES) model. The second short-term model is an epidemic rate-state (ERS) forecast based on a model that is physically constrained by applying the Dieterich rate-state constitutive law to earthquake clustering. The third forecast is based on a long-term stress transfer (LTST) model that considers the perturbations of earthquake probability for interacting faults by static Coulomb stress changes. These models have been submitted to the Collaboratory for the Study of Earthquake Predictability (CSEP) for forecast testing for Italy (ETH Zurich), and they were locked down to test their validity on real data in a future setting starting from August 1, 2009.
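
    The ETES model belongs to the epidemic (ETAS-type) family, whose conditional intensity has a standard textbook form: a background rate plus Omori-law contributions from every past event, scaled exponentially with magnitude. A sketch of that form with illustrative parameters (not the calibrated Italian values):

    ```python
    import math

    # Sketch: conditional intensity of an ETAS-type (epidemic) model,
    # lambda(t) = mu + sum_i K * 10**(alpha*(M_i - M0)) / (t - t_i + c)**p.
    # Parameters are textbook-style illustrations, per-day units.

    MU, K, ALPHA, C, P, M0 = 0.02, 0.05, 1.0, 0.01, 1.1, 5.0

    def etas_rate(t, catalog):
        """catalog: list of (event_time_days, magnitude) with times < t."""
        rate = MU
        for ti, mi in catalog:
            if ti < t:
                rate += K * 10 ** (ALPHA * (mi - M0)) / (t - ti + C) ** P
        return rate

    catalog = [(0.0, 5.8), (0.4, 5.1)]   # hypothetical M >= 5 events
    print(f"rate 1 day after the mainshock: {etas_rate(1.0, catalog):.3f} events/day")
    ```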

  18. Testing earthquake prediction algorithms: Statistically significant advance prediction of the largest earthquakes in the Circum-Pacific, 1992-1997

    Science.gov (United States)

    Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.

    1999-01-01

    Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first; then the areas of alarm are reduced by MSc, at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, respectively, when estimated with a normalized product measure of the empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and for MSc (see the sketch below). For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40% and five were predicted by M8-MSc in 13% of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reversed faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8, Phys. Earth Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction, J. Geophys. Res. 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier
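
    The quoted significance levels follow from the random-guessing null hypothesis: if the alarms occupy a fraction tau of the (suitably measured) space-time volume, each target earthquake hits an alarm by chance with probability tau, so the p-value for k or more hits among n targets is a binomial tail. A sketch reproducing the M8 case above:

    ```python
    from math import comb

    # Sketch: significance of an alarm-based prediction under the
    # random-guessing null hypothesis (each of n target earthquakes
    # independently falls inside the alarms with probability tau).

    def p_value(n_targets: int, n_hits: int, tau: float) -> float:
        return sum(comb(n_targets, k) * tau ** k * (1 - tau) ** (n_targets - k)
                   for k in range(n_hits, n_targets + 1))

    # M8 case from the abstract: 5 of 5 M8.0+ events hit, with tau = 0.36.
    print(f"p = {p_value(5, 5, 0.36):.4f}")   # ~0.006, i.e. beyond 99% confidence
    ```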

  19. 77 FR 53225 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2012-08-31

    ... DEPARTMENT OF THE INTERIOR Geological Survey [USGS-GX12GG00995NP00] National Earthquake Prediction... meeting. SUMMARY: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council... National Earthquake Information Center (NEIC), 1711 Illinois Avenue, Golden, Colorado 80401. The Council is...

  20. Moment-ratio imaging of seismic regions for earthquake prediction

    Science.gov (United States)

    Lomnitz, Cinna

    1993-10-01

    An algorithm for predicting large earthquakes is proposed. The reciprocal ratio (mri) of the residual seismic moment to the total moment release in a region is used for imaging seismic moment precursors. Peaks in mri preceded recent major earthquakes, including the 1985 Michoacan, 1985 central Chile, and 1992 Eureka, California earthquakes.

  1. Three Millennia of Seemingly Time-Predictable Earthquakes, Tell Ateret

    Science.gov (United States)

    Agnon, Amotz; Marco, Shmuel; Ellenblum, Ronnie

    2014-05-01

    Among various idealized recurrence models of large earthquakes, the "time-predictable" model has a straightforward mechanical interpretation, consistent with simple friction laws. On a time-predictable fault, the time interval between an earthquake and its predecessor is proportional to the slip during the predecessor. The alternative "slip-predictable" model states that the slip during earthquake rupture is proportional to the preceding time interval. Verifying these models requires extended records of high-precision data for both timing and amount of slip. The precision of paleoearthquake data can rarely confirm or rule out predictability, and recent papers argue for either time- or slip-predictable behavior. The Ateret site, on the trace of the Dead Sea fault at the Jordan Gorge segment, offers unique precision for determining space-time patterns. Five consecutive slip events, each associated with deformed and offset sets of walls, are correlated with historical earthquakes. Two correlations are based on detailed archaeological, historical, and numismatic evidence. The other three are tentative. The offsets of three of the events are determined with high precision; the other two are not as certain. Accepting all five correlations, the fault exhibits a striking time-predictable behavior, with a long-term slip rate of 3 mm/yr. However, the 30 October 1759 ~0.5 m rupture predicts a subsequent rupture along the Jordan Gorge toward the end of the last century. We speculate that earthquakes on secondary faults (the 25 November 1759 on the Rachaya branch and the 1 January 1837 on the Roum branch, both M≥7) have disrupted the 3 kyr time-predictable pattern.
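
    In symbols, restating the definitions above (with v the long-term slip rate, here about 3 mm/yr):

```latex
\text{time-predictable:}\quad T_i \;=\; \frac{u_{i-1}}{v},
\qquad
\text{slip-predictable:}\quad u_i \;=\; v\,T_i,
```

    where T_i is the interval between events i-1 and i, and u_{i-1} is the coseismic slip in the preceding event.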

  2. Testing for the 'predictability' of dynamically triggered earthquakes in The Geysers geothermal field

    Science.gov (United States)

    Aiken, Chastity; Meng, Xiaofeng; Hardebeck, Jeanne

    2018-03-01

    The Geysers geothermal field is well known for being susceptible to dynamic triggering of earthquakes by large distant earthquakes, owing to the introduction of fluids for energy production. Yet it is unknown whether dynamic triggering of earthquakes is 'predictable' or whether dynamic triggering could lead to a potential hazard for energy production. In this paper, our goal is to investigate the characteristics of triggering and the physical conditions that promote triggering, to determine whether or not triggering is in any way foreseeable. We find that, at present, triggering in The Geysers is not easily 'predictable' in terms of when and where based on observable physical conditions. However, triggered earthquake magnitude positively correlates with peak imparted dynamic stress, and larger dynamic stresses tend to trigger sequences similar to mainshock-aftershock sequences. Thus, we may be able to 'predict' what size earthquakes to expect at The Geysers following a large distant earthquake.

  3. Gambling scores for earthquake predictions and forecasts

    Science.gov (United States)

    Zhuang, Jiancang

    2010-04-01

    This paper presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular scheme of forecasts and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some points of his reputation. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and also takes away the reputation points bet by the forecaster if he loses. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and when the reference model is the Poisson model.
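
    A minimal sketch of the bookkeeping described above, assuming the simplest fair rule: a bet of r reputation points against a reference model that assigns probability p0 to the target event returns r(1 - p0)/p0 on success and loses r on failure. Function and variable names are illustrative.

```python
def gambling_score(bets):
    """Accumulate reputation change under a fair rule: betting r points
    against a reference model that assigns probability p0 to the target
    event returns r*(1-p0)/p0 on success and loses r on failure.

    bets : iterable of (r, p0, occurred) tuples
    """
    total = 0.0
    for r, p0, occurred in bets:
        total += r * (1.0 - p0) / p0 if occurred else -r
    return total

# Rare events (small p0) pay out more when correctly predicted.
print(gambling_score([(1, 0.01, True)]))   # +99.0
print(gambling_score([(1, 0.5, True)]))    # +1.0
print(gambling_score([(1, 0.01, False)]))  # -1.0
```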

  4. Stigma in science: the case of earthquake prediction.

    Science.gov (United States)

    Joffe, Helene; Rossetto, Tiziana; Bradley, Caroline; O'Connor, Cliodhna

    2018-01-01

    This paper explores how earthquake scientists conceptualise earthquake prediction, particularly given the conviction of six earthquake scientists for manslaughter (subsequently overturned) on 22 October 2012 for having given inappropriate advice to the public prior to the L'Aquila earthquake of 6 April 2009. In the first study of its kind, semi-structured interviews were conducted with 17 earthquake scientists and the transcribed interviews were analysed thematically. The scientists primarily denigrated earthquake prediction, showing strong emotive responses and distancing themselves from earthquake 'prediction' in favour of 'forecasting'. Earthquake prediction was regarded as impossible and harmful. The stigmatisation of the subject is discussed in the light of research on boundary work and stigma in science. The evaluation reveals how mitigation becomes the more favoured endeavour, creating a normative environment that disadvantages those who continue to pursue earthquake prediction research. Recommendations are made for communication with the public on earthquake risk, with a focus on how scientists portray uncertainty. © 2018 The Author(s). Disasters © Overseas Development Institute, 2018.

  5. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    Science.gov (United States)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity; the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset date of significant earthquakes, the assumption being that each earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes cluster around planetary trigger dates (Sun conjunct Moon, Moon opposite Sun, Moon conjunct or opposite North or South Nodes). In order to test the improvement of the method we used all +8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3, i.e. ±1 day of the target date) and for <6.5R. The successes are counted for each one of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is on the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with the earthquake date, in which case the FDL method coincides with the MFDL. Based on the MFDL method we present the prediction method capable of predicting global events or localized earthquakes, and we discuss the accuracy of the method as far as the prediction and location parts are concerned. We show example calendar style predictions for global events as well as for the Greek region using
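
    The abstract does not spell out how FDL dates are constructed. Purely as an illustration of the date-seeding idea, the sketch below offsets a seed date by Fibonacci and Lucas numbers taken as day counts and scores hits within a ±1-day window; every detail here (the sequences used, their combination, the offsets being days) is an assumption, not the published recipe.

```python
from datetime import date, timedelta

def fib_lucas(n):
    """First n Fibonacci-like and Lucas numbers (illustrative assumption:
    the FDL papers combine such sequences; the exact recipe may differ)."""
    fib, luc = [1, 2], [1, 3]
    while len(fib) < n:
        fib.append(fib[-1] + fib[-2])
        luc.append(luc[-1] + luc[-2])
    return sorted(set(fib + luc))

def fdl_dates(seed, n=15):
    """Candidate future dates: seed date offset by each number, in days."""
    return [seed + timedelta(days=k) for k in fib_lucas(n)]

def hits(candidates, targets, window=1):
    """Count target events falling within +/- window days of a candidate."""
    return sum(any(abs((t - c).days) <= window for c in candidates)
               for t in targets)

seed = date(1906, 4, 18)  # hypothetical seed event
print(fdl_dates(seed, 10)[:5])
```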

  6. Gambling score in earthquake prediction analysis

    Science.gov (United States)

    Molchan, G.; Romashkova, L.

    2011-03-01

    The number of successes and the space-time alarm rate are commonly used to characterize the strength of an earthquake prediction method and the significance of prediction results. It has been recently suggested to use a new characteristic to evaluate the forecaster's skill, the gambling score (GS), which incorporates the difficulty of guessing each target event by using different weights for different alarms. We expand parametrization of the GS and use the M8 prediction algorithm to illustrate difficulties of the new approach in the analysis of the prediction significance. We show that the level of significance strongly depends (1) on the choice of alarm weights, (2) on the partitioning of the entire alarm volume into component parts and (3) on the accuracy of the spatial rate measure of target events. These tools are at the disposal of the researcher and can affect the significance estimate. Formally, all reasonable GSs discussed here corroborate that the M8 method is non-trivial in the prediction of 8.0 ≤M < 8.5 events because the point estimates of the significance are in the range 0.5-5 per cent. However, the conservative estimate 3.7 per cent based on the number of successes seems preferable owing to two circumstances: (1) it is based on relative values of the spatial rate and hence is more stable and (2) the statistic of successes enables us to construct analytically an upper estimate of the significance taking into account the uncertainty of the spatial rate measure.

  7. Prediction of earthquakes: a data evaluation and exchange problem

    Energy Technology Data Exchange (ETDEWEB)

    Melchior, Paul

    1978-11-15

    Recent experiences in earthquake prediction are recalled. Precursor information seems to be available from geodetic measurements, hydrological and geochemical measurements, electric and magnetic measurements, purely seismic phenomena, and zoological phenomena; some new methods are proposed. A list of possible earthquake triggers is given. The dilatancy model is contrasted with a dry model; they seem to be equally successful. In conclusion, the space and time range of the precursors is discussed in relation to the magnitude of earthquakes. (RWR)

  8. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    Science.gov (United States)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not common understanding in China when the M 7.9 Wenchuan earthquake struck the Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save life in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  9. An application of earthquake prediction algorithm M8 in eastern ...

    Indian Academy of Sciences (India)

    2Institute of Earthquake Prediction Theory and Mathematical Geophysics, ... located about 70 km from a preceding M7.3 earthquake that occurred in ... local extremes of the seismic density distribution, and in the third approach, CI centers were distributed ...... Bird P 2003 An updated digital model of plate boundaries;.

  10. Implications of fault constitutive properties for earthquake prediction.

    Science.gov (United States)

    Dieterich, J H; Kilgore, B

    1996-04-30

    The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, scaling of Dc is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.
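
    For reference, the rate- and state-dependent friction law the abstract builds on is commonly written as follows (with the 'aging' evolution law; the authors' formulation may include additional terms):

```latex
\mu \;=\; \mu_0 + a \ln\frac{V}{V_0} + b \ln\frac{V_0\,\theta}{D_c},
\qquad
\frac{d\theta}{dt} \;=\; 1 - \frac{V\theta}{D_c},
```

    where V is the slip rate, \theta the state variable, D_c the characteristic sliding distance discussed above, and a, b material constants; steady-state velocity weakening requires b > a.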

  11. Some considerations regarding earthquake prediction - The case of Vrancea region -

    International Nuclear Information System (INIS)

    Enescu, Bogdan; Enescu, Dumitru

    2000-01-01

    Earthquake prediction research has been conducted for over 100 years with no obvious success. In recent years, modern concepts of earthquake dynamics have added another source of skepticism regarding the possibility of predicting earthquakes. However, there are some recognizable trends, refined in recent years, which may give rise to more reliable and solid approaches to this complex subject. In the light of these trends, emphasized by Aki, we try to analyze the new developments in the field, especially concerning the Vrancea region. (authors)

  12. Can Vrancea earthquakes be accurately predicted from unusual bio-system behavior and seismic-electromagnetic records?

    International Nuclear Information System (INIS)

    Enescu, D.; Chitaru, C.; Enescu, B.D.

    1999-01-01

    The relevance of bio-seismic research for the short-term prediction of strong Vrancea earthquakes is underscored. Unusual animal behavior before and during Vrancea earthquakes is described and illustrated for the individual case of the major earthquake of March 4, 1977. Several hypotheses to account for the uncommon behavior of bio-systems in relation to earthquakes in general, and strong Vrancea earthquakes in particular, are discussed in the second section. It is recalled that promising preliminary results concerning the identification of seismic-electromagnetic precursor signals have been obtained in the Vrancea seismogenic area using special, highly sensitive equipment. The need to correlate bio-seismic and seismic-electromagnetic research is evident. Further investigations are suggested and urgent steps are proposed in order to achieve successful short-term prediction of strong Vrancea earthquakes. (authors)

  13. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    Science.gov (United States)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

    headquarters until 9 p.m.: families, school classes with and without teachers, civil protection groups, journalists. This initiative, built up in a few weeks, had a very large feedback, also due to the media highlighting the presumed prediction. Although we could not rule out the possibility of a strong earthquake in central Italy (with effects in Rome), we tried to explain the meaning of short-term earthquake prediction vs. probabilistic seismic hazard assessment. Although many people remained fearful (many decided to take a day off and leave the town or stay in public parks), we contributed to reducing this feeling and therefore the social cost of this strange Roman day. Moreover, another lesson learned is that these (fortunately sporadic) circumstances, when people's attention is high, are important opportunities for science communication. We thank all the INGV colleagues who contributed to the May 11 Open Day, in particular the Press Office, the Educational and Outreach laboratory, the Graphics Laboratory and SissaMedialab. P.S. no large earthquake happened

  14. Long-term impact of earthquakes on sleep quality.

    Science.gov (United States)

    Tempesta, Daniela; Curcio, Giuseppe; De Gennaro, Luigi; Ferrara, Michele

    2013-01-01

    We investigated the impact of the 6.3 magnitude 2009 L'Aquila (Italy) earthquake on standardized self-report measures of sleep quality (Pittsburgh Sleep Quality Index, PSQI) and the frequency of disruptive nocturnal behaviours (Pittsburgh Sleep Quality Index-Addendum, PSQI-A) two years after the natural disaster. Self-reported sleep quality was assessed in 665 L'Aquila citizens exposed to the earthquake and compared with a different sample (n = 754) of L'Aquila citizens tested 24 months before the earthquake. In addition, sleep quality and disruptive nocturnal behaviours (DNB) of people exposed to the traumatic experience were compared with people who in the same period lived in different areas ranging between 40 and 115 km from the earthquake epicenter (n = 3574). The comparison between L'Aquila citizens before and after the earthquake showed a significant deterioration of sleep quality after exposure to the trauma. In addition, two years after the earthquake L'Aquila citizens showed the highest PSQI scores and the highest incidence of DNB compared to subjects living in the surroundings. Interestingly, above-threshold PSQI scores were found in participants living within 70 km of the epicenter, while trauma-related DNBs were found in people living within a range of 40 km. Multiple regressions confirmed that proximity to the epicenter is predictive of sleep disturbances and DNB, also suggesting a possible mediating effect of depression on PSQI scores. The psychological effects of an earthquake may thus be much more pervasive and longer-lasting than its building destruction, lasting for years and involving a much larger population. Reduced sleep quality and an increased frequency of DNB after two years may be a risk factor for the development of depression and posttraumatic stress disorder.

  15. EPOS1 - a multiparameter measuring system for earthquake prediction research

    Energy Technology Data Exchange (ETDEWEB)

    Streil, T.; Oeser, V. [SARAD GmbH, Dresden (Germany); Heinicke, J.; Koch, U.; Wiegand, J.

    1998-12-31

    The approach to earthquake prediction by geophysical, geochemical and hydrological measurements is a long and winding road. Nevertheless, the results show a progress in that field (e.g. Kobe). This progress is also a result of a new generation of measuring equipment. SARAD has developed a versatile measuring system (EPOS1) based on experiences and recent results from different research groups. It is able to record selected parameters suitable to earthquake prediction research. A micro-computer system handles data exchange, data management and control. It is connected to a modular sensor system. Sensor modules can be selected according to the actual needs at the measuring site. (author)

  16. Failures and suggestions in Earthquake forecasting and prediction

    Science.gov (United States)

    Sacks, S. I.

    2013-12-01

    Seismologists have had poor success in earthquake prediction. However, wide-ranging observations from earlier great earthquakes show that precursory data can exist. In particular, two aspects seem promising. In agreement with simple physical modeling, b-values decrease in highly loaded fault zones for years before failure. Potentially more usefully, in high-stress regions the breakdown of dilatant patches leading to failure can yield observations related to expelled water. The volume increase (dilatancy) caused by high shear stresses decreases the pore pressure. Eventually, water flows back in, restoring the pore pressure, promoting failure and expelling the extra water. Of course, in a generally stressed region there may be many small patches that fail, such as observed before the 1975 Haicheng earthquake. Only a few days before the major event will most of the dilatancy breakdown occur in the fault zone itself, as for the destructive Tangshan event of 1976. 'Water release' effects have been observed before the 1923 great Kanto earthquake, the 1984 Yamasaki event, the 1975 Haicheng and 1976 Tangshan earthquakes, and also the 1995 Kobe earthquake. While there are obvious difficulties in water-release observations, not least because there is currently no observational network anywhere, historical data do suggest some promise if we broaden our approach to this difficult subject.
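
    The b-value observation mentioned above is easy to monitor in practice. A minimal sketch, assuming a catalogue of magnitudes complete above mc and using the Aki (1965) maximum-likelihood estimator; names and window sizes are illustrative.

```python
import numpy as np

LOG10_E = np.log10(np.e)

def b_value(mags, mc):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= mc
    (continuous-magnitude form; the usual half-bin correction for
    binned catalogues is omitted for brevity)."""
    m = np.asarray(mags)
    m = m[m >= mc]
    return LOG10_E / (m.mean() - mc)

def sliding_b(mags, mc, window=200, step=50):
    """b-value in sliding event windows; a sustained decrease is the
    kind of precursory signal the abstract refers to."""
    return [b_value(mags[i:i + window], mc)
            for i in range(0, len(mags) - window + 1, step)]

# Synthetic Gutenberg-Richter catalogue with b = 1 for a sanity check.
rng = np.random.default_rng(0)
mags = 0.5 + rng.exponential(scale=1 / np.log(10), size=2000)
print(b_value(mags, mc=0.5))      # ~1.0
print(sliding_b(mags, mc=0.5)[:3])
```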

  17. Modelling earth current precursors in earthquake prediction

    Directory of Open Access Journals (Sweden)

    R. Di Maio

    1997-06-01

    Full Text Available This paper deals with the theory of earth current precursors of earthquakes. A dilatancy-diffusion-polarization model is proposed to explain the anomalies of the electric potential that are observed on the ground surface prior to some earthquakes. The electric polarization is believed to be an electrokinetic effect due to the invasion of fluids into new pores, which are opened inside a stressed, dilated rock body. The time and space variation of the distribution of the electric potential in a layered earth, as well as in a faulted half-space, is studied in detail. The result is that the surface response depends on the underground conductivity distribution and on the relative disposition of the measuring dipole with respect to the buried bipole source. A field procedure based on the use of an areal layout of the recording sites is proposed, in order to obtain the most complete information on the time and space evolution of the precursory phenomena in any given seismic region.
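
    As a toy counterpart to the layered and faulted media studied in the paper, the surface potential of a buried current bipole in a homogeneous half-space has a closed form, sketched below; the geometry and parameter values are invented for the example.

```python
import numpy as np

def surface_potential(x, y, src, sink, current, rho):
    """Electric potential at surface point (x, y, 0) due to a buried
    current bipole in a homogeneous half-space of resistivity rho.
    At the free surface the image term doubles the whole-space value,
    giving V = I*rho / (2*pi*r) per pole.

    src, sink : (x, y, z) positions of the +I and -I poles, z > 0 down
    """
    def pole(px, py, pz, sign):
        r = np.hypot(np.hypot(x - px, y - py), pz)  # 3-D distance to pole
        return sign * current * rho / (2.0 * np.pi * r)
    return pole(*src, +1.0) + pole(*sink, -1.0)

# Potential 100 m from a 1 A bipole buried at 2 km depth, rho = 100 ohm-m.
v = surface_potential(100.0, 0.0, (0, 0, 2000), (500, 0, 2000), 1.0, 100.0)
print(f"{v * 1e3:.3f} mV")
```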

  18. Earthquake Prediction Research In Iceland, Applications For Hazard Assessments and Warnings

    Science.gov (United States)

    Stefansson, R.

    Earthquake prediction research in Iceland, applications for hazard assessments and warnings. The first multinational earthquake prediction research project in Iceland was the European Council encouraged SIL project of the Nordic countries, 1988-1995. The path selected for this research was to study the physics of crustal processes leading to earthquakes. It was considered that small earthquakes, down to magnitude zero, were the most significant for this purpose, because of the detailed information which they provide both in time and space. The test area for the project was the earthquake-prone region of the South Iceland seismic zone (SISZ). The PRENLAB and PRENLAB-2 projects, 1996-2000, supported by the European Union, were a direct continuation of the SIL project, but with a more multidisciplinary approach. PRENLAB stands for "Earthquake prediction research in a natural laboratory". The basic objective was to advance our understanding in general of where, when and how dangerous earthquake motion might strike. Methods were developed to study crustal processes and conditions, by microearthquake information, by continuous GPS, InSAR, theoretical modelling, fault mapping and paleoseismology. New algorithms were developed for short-term warnings. A very useful short-term warning was issued twice in the year 2000, one for the sudden start of an eruption of Volcano Hekla on February 26, and the other 25 hours before the second (in a sequence of two) magnitude 6.6 (Ms) earthquake in the South Iceland seismic zone on June 21, with the correct location and approximate size. A formal short-term warning, although not going to the public, was also issued before a magnitude 5 earthquake in November 1998. In the presentation it will be shortly described what these warnings were based on. A general hazard assessment was presented in scientific journals 10-15 years ago, assessing within a few kilometers the location of the faults of the two 2000 earthquakes and suggesting

  19. Flicker-noise Spectroscopy In Earthquake Prediction Research

    Science.gov (United States)

    Desherevsky, A. V.; Lukk, A. A.; Sidorin, A. Y.; Timashev, S. F.

    It has been found that a two-component model including a seasonal and a flicker-noise component is a more adequate model of the statistical structure of time series of long-term geophysical observation data. Unlike white noise, which signifies the absence of any relation between the system's current dynamics and past events, the presence of flicker noise indicates that such a relation in the system does exist. Flicker noise possesses the property of scale invariance. It seems natural to relate the self-similarity of the statistical properties of geophysical parameter variations on different scales to the self-similar (fractal) properties of the geophysical medium. At the same time, self-similar time variations of geophysical parameters may indicate the presence of deterministic chaos in the geophysical system's evolution. An important element of the proposed approach is the application of stochastic models of the preparation of each concrete large seismic event. Instead of regular, for example bay-form, precursor variations, the occurrence of precursors of another kind, associated in particular with variations in parameter fluctuations, should be expected. To solve the problem of large earthquake prediction we use Flicker-Noise Spectroscopy (FNS) as the basis of a new approach proposed by us. The basis of the FNS methodology is a postulate about the informational significance of sequences of various dynamic irregularities (bursts or spikes, jumps with different characteristic values, discontinuities of derivatives) of the measured temporal, spatial and energetic variables on each level of hierarchical organization of the studied systems. The proposed new method, using integral values of the analyzed signals - power spectra and moments ("structural functions") of different order - as information relations, has demonstrated principally new opportunities in the search for large earthquake precursors already at a preliminary stage of data analysis. This research was supported by
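
    A minimal sketch of the two FNS quantities the abstract names, the power spectrum and the difference moments ('structural functions'); the authors' full methodology, which parametrizes the different types of irregularities, goes well beyond this.

```python
import numpy as np

def fns_measures(x, dt=1.0, orders=(2,), max_lag=100):
    """Two basic flicker-noise-spectroscopy quantities for a series x:
    the power spectral density and the difference ('structural') moments
    Phi_p(tau) = <|x(t+tau) - x(t)|^p>."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    freqs = np.fft.rfftfreq(len(x), dt)
    psd = np.abs(np.fft.rfft(x)) ** 2 * dt / len(x)
    phi = {p: np.array([np.mean(np.abs(x[tau:] - x[:-tau]) ** p)
                        for tau in range(1, max_lag)])
           for p in orders}
    return freqs, psd, phi

# Usage on a brown-noise-like synthetic series.
x = np.cumsum(np.random.default_rng(0).normal(size=4096))
freqs, psd, phi = fns_measures(x, orders=(2, 4))
```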

  20. Introduction: Long term prediction

    International Nuclear Information System (INIS)

    Beranger, G.

    2003-01-01

    Making a decision upon the right choice of a material appropriate to a given application should be based on taking into account several parameters, as follows: cost, standards, regulations, safety, recycling, chemical properties, supply, transformation, forming, assembly, mechanical and physical properties, as well as the behaviour in practical conditions. Data taken from a private communication (J. H. Davidson) are reproduced, presenting the lifetime range of materials from a couple of minutes to half a million hours, corresponding to applications from missile technology up to high-temperature nuclear reactors or steam turbines. In the case of deep storage of nuclear waste the time required is completely different from these values, since we have to ensure the integrity of the storage system for several thousand years. The vitrified nuclear wastes should be stored in metallic canisters made of iron and carbon steels, stainless steels, copper and copper alloys, nickel alloys or titanium alloys. Some of these materials are passivating metals, i.e. they develop a thin protective film, 2 or 3 nm thick - the so-called passive films. These films prevent general corrosion of the metal over a large range of chemical conditions of the environment. In some specific conditions, localized corrosion, such as the phenomenon of pitting, occurs. Consequently, it is absolutely necessary to determine these chemical conditions and their stability in time to understand the behavior of a given material. In other words, the corrosion system is constituted by the complex material/surface/medium. For high-level nuclear wastes the main features for resolving the problem concern: geological disposal; deep storage in clay; waste metallic canisters; backfill mixture (clay-gypsum) or concrete; long-term behavior; data needed for modelling and for predicting; choice of an appropriate solution among several metallic candidates. The analysis of the complex material/surface/medium is of great importance

  1. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  2. Roles of Radon-222 and other natural radionuclides in earthquake prediction

    International Nuclear Information System (INIS)

    Smith, A.R.; Wollenberg, H.A.; Mosier, D.F.

    1980-01-01

    The concentration of 222 Rn in subsurface waters is one of the natural parameters being investigated to help develop the capability to predict destructive earthquakes. Since 1966, scientists in several nations have sought to link radon variations with ongoing seismic activity, primarily through the dilatancy model of earthquake occurrence. Within the range of these studies, alpha-, beta-, and gamma-radiation detection techniques have been used in both discrete-sampling and continuous-monitoring programs. These measurement techniques are reviewed in terms of instrumentation adapted to seismic-monitoring purposes. A recent Lawrence Berkeley Laboratory study conducted in central California incorporated discrete sampling of wells in the aftershock area of the 1975 Oroville earthquake and continuous monitoring of water radon in a well on the San Andreas Fault. The results presented show short-term radon variations that may be associated with aftershocks, and diurnal changes that may reflect earth tidal forces

  3. Fault Branching and Long-Term Earthquake Rupture Scenario for Strike-Slip Earthquake

    Science.gov (United States)

    Klinger, Y.; CHOI, J. H.; Vallage, A.

    2017-12-01

    Careful examination of surface rupture for large continental strike-slip earthquakes reveals that for the majority of earthquakes, at least one major branch is involved in the rupture pattern. Often, branching might be either related to the location of the epicenter or located toward the end of the rupture, and possibly related to the stopping of the rupture. In this work, we examine large continental earthquakes that show significant branches at different scales and for which ground surface rupture has been mapped in great details. In each case, rupture conditions are described, including dynamic parameters, past earthquakes history, and regional stress orientation, to see if the dynamic stress field would a priori favor branching. In one case we show that rupture propagation and branching are directly impacted by preexisting geological structures. These structures serve as pathways for the rupture attempting to propagate out of its shear plane. At larger scale, we show that in some cases, rupturing a branch might be systematic, hampering possibilities for the development of a larger seismic rupture. Long-term geomorphology hints at the existence of a strong asperity in the zone where the rupture branched off the main fault. There, no evidence of throughgoing rupture could be seen along the main fault, while the branch is well connected to the main fault. This set of observations suggests that for specific configurations, some rupture scenarios involving systematic branching are more likely than others.

  4. Analysing earthquake slip models with the spatial prediction comparison test

    KAUST Repository

    Zhang, L.; Mai, Paul Martin; Thingbaijam, Kiran Kumar; Razafindrakoto, H. N. T.; Genton, Marc G.

    2014-01-01

    Earthquake rupture models inferred from inversions of geophysical and/or geodetic data exhibit remarkable variability due to uncertainties in modelling assumptions, the use of different inversion algorithms, or variations in data selection and data processing. A robust statistical comparison of different rupture models obtained for a single earthquake is needed to quantify the intra-event variability, both for benchmark exercises and for real earthquakes. The same approach may be useful to characterize (dis-)similarities in events that are typically grouped into a common class of events (e.g. moderate-size crustal strike-slip earthquakes or tsunamigenic large subduction earthquakes). For this purpose, we examine the performance of the spatial prediction comparison test (SPCT), a statistical test developed to compare spatial (random) fields by means of a chosen loss function that describes an error relation between a 2-D field (‘model’) and a reference model. We implement and calibrate the SPCT approach for a suite of synthetic 2-D slip distributions, generated as spatial random fields with various characteristics, and then apply the method to results of a benchmark inversion exercise with known solution. We find the SPCT to be sensitive to different spatial correlation lengths, and different heterogeneity levels of the slip distributions. The SPCT approach proves to be a simple and effective tool for ranking the slip models with respect to a reference model.
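
    To make the ingredients concrete, the sketch below builds the pointwise loss-differential field between two candidate slip models under a squared-error loss, plus a deliberately naive test statistic; the actual SPCT corrects the variance for spatial correlation of the field, which this toy version omits.

```python
import numpy as np

def loss_differential(slip_a, slip_b, slip_ref, loss=lambda e: e**2):
    """Pointwise loss differential between two candidate slip models
    evaluated against a reference model (squared-error loss assumed)."""
    return loss(slip_a - slip_ref) - loss(slip_b - slip_ref)

def naive_spct_stat(d):
    """Toy test statistic for H0: E[D] = 0. The real SPCT adjusts the
    variance for spatial correlation of the field D; treating cells as
    independent, as here, overstates significance."""
    d = d.ravel()
    return d.mean() / (d.std(ddof=1) / np.sqrt(d.size))

# Usage on synthetic 32 x 32 slip fields.
rng = np.random.default_rng(0)
ref = rng.normal(0, 1, (32, 32))
d = loss_differential(ref + rng.normal(0, 0.5, ref.shape),
                      ref + rng.normal(0, 0.8, ref.shape), ref)
print(naive_spct_stat(d))
```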

  6. 75 FR 63854 - National Earthquake Prediction Evaluation Council (NEPEC) Advisory Committee

    Science.gov (United States)

    2010-10-18

    ... DEPARTMENT OF THE INTERIOR Geological Survey National Earthquake Prediction Evaluation Council...: Pursuant to Public Law 96-472, the National Earthquake Prediction Evaluation Council (NEPEC) will hold a 2... proposed earthquake predictions, on the completeness and scientific validity of the available data related...

  7. Tsunami Prediction and Earthquake Parameters Estimation in the Red Sea

    KAUST Repository

    Sawlan, Zaid A

    2012-12-01

    Tsunami concerns have increased in the world after the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami. Consequently, tsunami models have developed rapidly in the last few years. One of the advanced tsunami models is the GeoClaw tsunami model introduced by LeVeque (2011). This model is adaptive and consistent. Because of different sources of uncertainty in the model, observations are needed to improve model prediction through a data assimilation framework. Model inputs are earthquake parameters and topography. This thesis introduces a real-time tsunami forecasting method that combines a tsunami model with observations using a hybrid ensemble Kalman filter and ensemble Kalman smoother. The filter is used for state prediction while the smoother performs smoothing to estimate the earthquake parameters. This method reduces the error produced by uncertain inputs. In addition, a state-parameter EnKF is implemented to estimate earthquake parameters. Although the number of observations is small, the estimated parameters generate a better tsunami prediction than the model alone. Methods and results of prediction experiments in the Red Sea are presented and the prospect of developing an operational tsunami prediction system in the Red Sea is discussed.
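
    The analysis step of a stochastic (perturbed-observation) EnKF, the building block behind the hybrid filter/smoother described above, can be sketched in a few lines; the shapes, names, and linear observation operator are assumptions of the example, not the thesis implementation.

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_err_std, rng):
    """Stochastic EnKF analysis step (perturbed observations).

    ensemble : (n_state, n_members) forecast ensemble
    obs      : (n_obs,) observed tsunami heights
    H        : (n_obs, n_state) linear observation operator
    """
    n_state, n_mem = ensemble.shape
    X = ensemble - ensemble.mean(axis=1, keepdims=True)      # state anomalies
    HX = H @ ensemble
    HXp = HX - HX.mean(axis=1, keepdims=True)                # obs-space anomalies
    R = obs_err_std**2 * np.eye(len(obs))
    # Kalman gain from ensemble covariances: K = P H^T (H P H^T + R)^-1
    K = (X @ HXp.T) @ np.linalg.inv(HXp @ HXp.T + (n_mem - 1) * R)
    perturbed = obs[:, None] + rng.normal(0, obs_err_std, (len(obs), n_mem))
    return ensemble + K @ (perturbed - HX)

# Toy usage: 10 state variables, 20 members, 3 observed components.
rng = np.random.default_rng(0)
ens = rng.normal(0, 1, (10, 20))
H = np.eye(3, 10)
print(enkf_update(ens, np.zeros(3), H, 0.1, rng).shape)  # (10, 20)
```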

  8. Time-predictable model applicability for earthquake occurrence in northeast India and vicinity

    Directory of Open Access Journals (Sweden)

    A. Panthi

    2011-03-01

    Full Text Available Northeast India and its vicinity is one of the seismically most active regions in the world, where a few large and several moderate earthquakes have occurred in the past. In this study the region of northeast India has been considered for an earthquake generation model, using earthquake data reported in the catalogues of the National Geophysical Data Centre, the National Earthquake Information Centre, the United States Geological Survey, and the book prepared by Gupta et al. (1986), for the period 1906–2008. Events having a surface wave magnitude of Ms ≥ 5.5 were considered for statistical analysis. In this region, nineteen seismogenic sources were identified by observing the clustering of earthquakes. It is observed that the time interval between two consecutive mainshocks depends upon the preceding mainshock magnitude (Mp) and not on the following mainshock (Mf). This result corroborates the validity of the time-predictable model in northeast India and its adjoining regions. A linear relation between the logarithm of the repeat time (T) of two consecutive events and the magnitude of the preceding mainshock is established in the form log T = c Mp + a, where "c" is the positive slope of the line and "a" is a function of the minimum magnitude of the earthquakes considered. The values of the parameters "c" and "a" are estimated to be 0.21 and 0.35 in northeast India and its adjoining regions. The lower-than-average value of "c" implies that earthquake occurrence in this region differs from that at plate boundaries. The result derived can be used for long-term seismic hazard estimation in the delineated seismogenic regions.
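
    With the estimated parameters, the relation above turns directly into a repeat-time calculator; the sketch assumes T is measured in years, which the abstract does not state explicitly.

```python
def repeat_time_years(mp, c=0.21, a=0.35):
    """Expected repeat time from log10(T) = c*Mp + a, using the values
    of c and a estimated in the abstract (T assumed in years)."""
    return 10 ** (c * mp + a)

for mp in (5.5, 6.5, 7.5):
    print(f"Mp = {mp}: T ~ {repeat_time_years(mp):.0f} yr")
# Mp = 5.5 -> ~32 yr; Mp = 6.5 -> ~52 yr; Mp = 7.5 -> ~84 yr
```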

  9. Monitoring of the future strong Vrancea events by using the CN formal earthquake prediction algorithm

    International Nuclear Information System (INIS)

    Moldoveanu, C.L.; Novikova, O.V.; Panza, G.F.; Radulian, M.

    2003-06-01

    The preparation process of the strong subcrustal events originating in the Vrancea region, Romania, is monitored using an intermediate-term medium-range earthquake prediction method - the CN algorithm (Keilis-Borok and Rotwain, 1990). We present the results of the monitoring of the preparation of future strong earthquakes for the time interval from January 1, 1994 (1994.1.1), to January 1, 2003 (2003.1.1), using the updated catalogue of the Romanian local network. The database considered for the CN monitoring of the preparation of future strong earthquakes in Vrancea covers the period from 1966.3.1 to 2003.1.1 and the geographical rectangle 44.8 deg - 48.4 deg N, 25.0 deg - 28.0 deg E. The algorithm correctly identifies, by retrospective prediction, the TIPs (times of increased probability) for all three strong earthquakes (Mo=6.4) that occurred in Vrancea during this period. The cumulated duration of the TIPs represents 26.5% of the total period of time considered (1966.3.1-2003.1.1). The monitoring of current seismicity using the algorithm CN has been carried out since 1994. No strong earthquakes occurred from 1994.1.1 to 2003.1.1, but the CN declared an extended false alarm from 1999.5.1 to 2000.11.1. No alarm is currently declared in the region (on January 1, 2003), as can be seen from the TIPs diagram shown. (author)

  10. CN earthquake prediction algorithm and the monitoring of the future strong Vrancea events

    International Nuclear Information System (INIS)

    Moldoveanu, C.L.; Radulian, M.; Novikova, O.V.; Panza, G.F.

    2002-01-01

    The strong earthquakes originating at intermediate depth in the Vrancea region (located in the SE corner of the highly bent Carpathian arc) represent one of the most important natural disasters able to induce heavy effects (a high toll of casualties and extensive damage) in the Romanian territory. The occurrence of these earthquakes is irregular, but not infrequent. Their effects are felt over a large territory, from Central Europe to Moscow and from Greece to Scandinavia. The largest cultural and economical center exposed to the seismic risk due to the Vrancea earthquakes is Bucharest. This metropolitan area (230 km 2 wide) is characterized by the presence of 2.5 million inhabitants (10% of the country's population) and by a considerable number of high-risk structures and infrastructures. The best way to face strong earthquakes is to mitigate the seismic risk by using the two possible complementary approaches represented by (a) the antiseismic design of structures and infrastructures (able to withstand strong earthquakes without significant damage), and (b) strong earthquake prediction (in terms of alarm intervals declared for long-, intermediate- or short-term space- and time-windows). Intermediate-term medium-range earthquake prediction represents the most realistic target to be reached at the present state of knowledge. The alarm declared in this case extends over a time window of about one year or more, and a space window of a few hundred kilometers. In the case of Vrancea events the spatial uncertainty is much less, about 100 km. The main measures for the mitigation of the seismic risk allowed by intermediate-term medium-range prediction are: (a) verification of the stability of buildings and infrastructures, and reinforcement measures when required, (b) elaboration of emergency plans of action, (c) scheduling of the main actions required in order to restore the normality of social and economical life after the earthquake. The paper presents the

  11. The ordered network structure and prediction summary for M ≥ 7 earthquakes in Xinjiang region of China

    International Nuclear Information System (INIS)

    Men, Ke-Pei; Zhao, Kai

    2014-01-01

    M ≥ 7 earthquakes have shown an obvious commensurability and orderliness in Xinjiang, China, and its adjacent region since 1800. The main orderly values are 30 a × k (k = 1, 2, 3), 11-12 a, 41-43 a, 18-19 a, and 5-6 a. Guided by the information forecasting theory of Wen-Bo Weng and based on previous research results, combining ordered network structure analysis with complex network technology, we focus on the prediction summary of M ≥ 7 earthquakes by using the ordered network structure, and add new information to further optimize the network, hence constructing the 2D- and 3D-ordered network structures of M ≥ 7 earthquakes. In this paper, the network structure fully reveals the regularity of seismic activity of M ≥ 7 earthquakes in the study region during the past 210 years. On this basis, the Karakorum M7.1 earthquake in 1996, the M7.9 earthquake on the frontier of Russia, Mongolia, and China in 2003, and the two Yutian M7.3 earthquakes in 2008 and 2014 were predicted successfully. At the same time, a new prediction opinion is presented that the future two M ≥ 7 earthquakes will probably occur around 2019-2020 and 2025-2026 in this region. The results show that large earthquakes occurring in a defined region can be predicted. The method of ordered network structure analysis produces satisfactory results for the mid- and long-term prediction of M ≥ 7 earthquakes.

  12. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  13. The use of radon gas techniques for earthquake prediction

    International Nuclear Information System (INIS)

    Al-Hilal, M.

    1993-01-01

    This scientific article explains the applications of radon gas measurements in water and soil for monitoring fault activity and earthquake prediction. It also emphasizes, through worldwide examples from the Tashkent Basin in the U.S.S.R. and the San Andreas fault in the U.S.A., that the use of the radon gas technique in fault-originated water, as well as in soil gases, can be considered an important geological tool within the general framework of earthquake prediction, because of the coherent and time-anomalous relationship between the density of alpha particles due to radon decay and the level of tectonic activity along fault zones. The article also indicates, through the practical experience of the author, the possibility of applying such techniques in certain parts of Syria. (author). 6 refs., 4 figs

  14. Predicting Posttraumatic Stress Symptom Prevalence and Local Distribution after an Earthquake with Scarce Data.

    Science.gov (United States)

    Dussaillant, Francisca; Apablaza, Mauricio

    2017-08-01

    After a major earthquake, the assignment of scarce mental health emergency personnel to different geographic areas is crucial to the effective management of the crisis. The scarce information that is available in the aftermath of a disaster may be valuable in helping predict where the populations most in need are located. The objectives of this study were to derive algorithms to predict posttraumatic stress (PTS) symptom prevalence and local distribution after an earthquake, and to test whether there are algorithms that require few input data and are still reasonably predictive. A rich database of PTS symptoms, collected after Chile's 2010 earthquake and tsunami, was used. Several model specifications for the mean and centiles of the distribution of PTS symptoms, together with posttraumatic stress disorder (PTSD) prevalence, were estimated via linear and quantile regressions. The models varied in the set of covariates included. Adjusted R2 for the most liberal specifications (in terms of the number of covariates included) ranged from 0.62 to 0.74, depending on the outcome. When including only peak ground acceleration (PGA), poverty rate, and household damage in linear and quadratic form, predictive capacity was still good (adjusted R2 from 0.59 to 0.67 were obtained). Information about local poverty, household damage, and PGA can be used as an aid to predict PTS symptom prevalence and local distribution after an earthquake. This can help improve the assignment of mental health personnel to the affected localities. Dussaillant F, Apablaza M. Predicting posttraumatic stress symptom prevalence and local distribution after an earthquake with scarce data. Prehosp Disaster Med. 2017;32(4):357-367.
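
    A sketch of the kind of specification described above, using statsmodels for the mean (OLS) and centile (quantile regression) models; the synthetic data and column names (pts, pga, poverty, damage) are hypothetical stand-ins for the survey variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey data; column names are hypothetical.
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "pga": rng.uniform(0.05, 0.8, n),      # peak ground acceleration (g)
    "poverty": rng.uniform(0.0, 0.4, n),   # local poverty rate
    "damage": rng.uniform(0.0, 1.0, n),    # household damage index
})
df["pts"] = (10 * df.pga + 8 * df.poverty + 5 * df.damage
             + 3 * df.damage**2 + rng.normal(0, 2, n))  # PTS symptom score

# Parsimonious specification echoing the abstract: PGA, poverty, and
# household damage in linear and quadratic form.
formula = "pts ~ pga + poverty + damage + I(damage**2)"
print(smf.ols(formula, data=df).fit().rsquared)
print(smf.quantreg(formula, data=df).fit(q=0.75).params)  # a centile model
```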

  15. Prediction of the occurrence of related strong earthquakes in Italy

    International Nuclear Information System (INIS)

    Vorobieva, I.A.; Panza, G.F.

    1993-06-01

    In the seismic flow it is often observed that a Strong Earthquake (SE) is followed by Related Strong Earthquakes (RSEs), which occur near the epicentre of the SE with origin time rather close to the origin time of the SE. An algorithm for the prediction of the occurrence of a RSE has been developed; it was applied for the first time to the seismicity data of the California-Nevada region and has been successfully tested in several regions of the world, the statistical significance of the result being 97%. So far, it has been possible to make five successful forward predictions, with no false alarms or failures to predict. The algorithm is applied here to the Italian territory, where the occurrence of RSEs is a particularly rare phenomenon. Our results show that the standard algorithm is successfully applicable directly, without any adjustment of the parameters. Eleven SEs are considered. Of them, three are followed by a RSE, as predicted by the algorithm; eight SEs are not followed by a RSE, and the algorithm predicts this behaviour for seven of them, giving rise to only one false alarm. Since in Italy the series of strong earthquakes are quite often relatively short, the algorithm has been extended to handle such situations. The result of this experiment indicates that it is possible to attempt to test a SE for the occurrence of a RSE soon after the occurrence of the SE itself, performing timely 'preliminary' recognition on reduced data sets. This fact, the high confidence level of the retrospective analysis, and the first successful forward predictions made in different parts of the world indicate that, even if additional tests are desirable, the algorithm can already be considered for routine application to Civil Defence. (author). Refs, 3 figs, 7 tabs

  16. Predictability of Landslide Timing From Quasi-Periodic Precursory Earthquakes

    Science.gov (United States)

    Bell, Andrew F.

    2018-02-01

    Accelerating rates of geophysical signals are observed before a range of material failure phenomena. They provide insights into the physical processes controlling failure and the basis for failure forecasts. However, examples of accelerating seismicity before landslides are rare, and their behavior and forecasting potential are largely unknown. Here I use a Bayesian methodology to apply a novel gamma point process model to investigate a sequence of quasiperiodic repeating earthquakes preceding a large landslide at Nuugaatsiaq in Greenland in June 2017. The evolution in earthquake rate is best explained by an inverse power law increase with time toward failure, as predicted by material failure theory. However, the commonly accepted power law exponent value of 1.0 is inconsistent with the data. Instead, the mean posterior value of 0.71 indicates a particularly rapid acceleration toward failure and suggests that only relatively short warning times may be possible for similar landslides in future.
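
    The failure law referred to above is the inverse power-law acceleration of event rate toward the failure time t_f:

```latex
\dot n(t) \;\propto\; (t_f - t)^{-p},
```

    with p = 1.0 being the commonly accepted exponent that, per the abstract, the data reject in favour of a posterior mean p of about 0.71, with correspondingly shorter usable warning times.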

  17. Can radon gas measurements be used to predict earthquakes?

    International Nuclear Information System (INIS)

    2009-01-01

    After the tragic earthquake of April 6, 2009 in Aquila (Abruzzo), a debate has begun in Italy regarding the alleged prediction of this earthquake by a scientist working in the Gran Sasso National Laboratory, based on radon content measurements. Radon is a radioactive gas originating from the decay of natural radioactive elements present in the soil. IRSN specialists are actively involved in ongoing research projects on the impact of mechanical stresses on radon emissions from underground structures, and some of their results dating from several years ago are being brought up in this debate. These specialists are therefore currently presenting their perspective on the relationships between radon emissions and seismic activity, based on publications on the subject. (authors)

  18. Automated radon-thoron monitoring for earthquake prediction research

    International Nuclear Information System (INIS)

    Shapiro, M.H.; Melvin, J.D.; Copping, N.A.; Tombrello, T.A.; Whitcomb, J.H.

    1980-01-01

    This paper describes an automated instrument for earthquake prediction research which monitors the emission of radon ( 222 Rn) and thoron ( 220 Rn) from rock. The instrument uses aerosol filtration techniques and beta counting to determine radon and thoron levels. Data from the first year of operation of a field prototype suggest an annual cycle in the radon level at the site which is related to thermoelastic strains in the crust. Two anomalous increases in the radon level of short duration have been observed during the first year of operation. One anomaly appears to have been a precursor for a nearby earthquake (2.8 magnitude, Richter scale), and the other may have been associated with changing hydrological conditions resulting from heavy rainfall

  19. Prediction of strong earthquake motions on rock surface using evolutionary process models

    International Nuclear Information System (INIS)

    Kameda, H.; Sugito, M.

    1984-01-01

    Stochastic process models are developed for prediction of strong earthquake motions for engineering design purposes. Earthquake motions with nonstationary frequency content are modeled by using the concept of evolutionary processes. Discussion is focused on the earthquake motions on bed rocks which are important for construction of nuclear power plants in seismic regions. On this basis, two earthquake motion prediction models are developed, one (EMP-IB Model) for prediction with given magnitude and epicentral distance, and the other (EMP-IIB Model) to account for the successive fault ruptures and the site location relative to the fault of great earthquakes. (Author) [pt

  20. Application of geochemical methods in earthquake prediction in China

    Energy Technology Data Exchange (ETDEWEB)

    Fong-liang, J.; Gui-ru, L.

    1981-05-01

    Several geochemical anomalies were observed before the Haicheng, Longling, Tangshan, and Songpan earthquakes and their strong aftershocks. They included changes in groundwater radon levels; chemical composition of the groundwater (concentrations of Ca⁺⁺, Mg⁺⁺, Cl⁻, SO₄⁻⁻, and HCO₃⁻ ions); conductivity; and dissolved gases such as H₂, CO₂, etc. In addition, anomalous changes in water color and quality were observed before these large earthquakes. Before some events gases escaped from the surface, and there were reports of 'ground odors' being smelled by local residents. The large amount of radon data can be grouped into long-term and short-term anomalies. The long-term anomalies have a radon emission build-up time of from a few months to more than a year. The short-term anomalies have durations from a few hours or less to a few months.

  1. Earthquake prediction research with plastic nuclear track detectors

    International Nuclear Information System (INIS)

    Woith, H.; Enge, W.; Beaujean, R.; Oschlies, K.

    1988-01-01

    Since 1984 a German-Turkish project on earthquake prediction research has been operating at the North Anatolian fault zone in Turkey. Among many other parameters, changes in radon emission have also been investigated. Plastic nuclear track detectors (Kodak cellulose nitrate LR 115) are used to record alpha particles emitted by radon and thoron atoms and their daughter isotopes. The detectors are replaced and analyzed every 3 weeks, so that a quasi-continuous time sequence of the radon soil-gas emission is recorded. We present a comparison between measurements made with electronic counters and plastic track detectors. (author)

  2. Study of Earthquake Disaster Prediction System of Langfang city Based on GIS

    Science.gov (United States)

    Huang, Meng; Zhang, Dian; Li, Pan; Zhang, YunHui; Zhang, RuoFei

    2017-07-01

    In this paper, responding to China's need to improve its earthquake disaster prevention capability, we put forward an implementation plan for a GIS-based earthquake disaster prediction system for Langfang city. Built on a GIS spatial database and using coordinate transformation, GIS spatial analysis, and PHP development technologies, the system applies a seismic damage factor algorithm to predict the damage to the city under earthquake disasters of different intensities. The system uses a B/S (browser/server) architecture and provides two-dimensional visualization of damage degree and spatial distribution, comprehensive query and analysis, and efficient decision-support functions to identify seismically weak areas of the city and enable rapid warning. The system has transformed the city's earthquake disaster reduction work from static planning to dynamic management and improved the city's earthquake and disaster prevention capability.

  3. Load-Unload Response Ratio and Accelerating Moment/Energy Release Critical Region Scaling and Earthquake Prediction

    Science.gov (United States)

    Yin, X. C.; Mora, P.; Peng, K.; Wang, Y. C.; Weatherley, D.

    The main idea of the Load-Unload Response Ratio (LURR) is that when a system is stable, its response to loading corresponds to its response to unloading, whereas when the system is approaching an unstable state, the responses to loading and unloading become quite different. High LURR values and observations of Accelerating Moment/Energy Release (AMR/AER) prior to large earthquakes have led different research groups to suggest that intermediate-term earthquake prediction is possible, and imply that the LURR and AMR/AER observations may have a similar physical origin. To study this possibility, we conducted a retrospective examination of several Australian and Chinese earthquakes with magnitudes ranging from 5.0 to 7.9, including Australia's deadly Newcastle earthquake and the devastating Tangshan earthquake. Both LURR values and best-fit power-law time-to-failure functions were computed using data within a range of distances from the epicenter. Like the best-fit power-law functions in AMR/AER, the LURR value was optimal using data within a certain epicentral distance, implying a critical region for LURR. Furthermore, the LURR critical region size scales with mainshock magnitude and is similar to the AMR/AER critical region size. These results suggest a common physical origin for both the AMR/AER and LURR observations. Further research may provide clues that yield an understanding of this mechanism and help lead to a solid foundation for intermediate-term earthquake prediction.
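
    A minimal sketch of how a LURR value can be formed from a catalog, assuming the loading/unloading state of each event (in practice derived from tidal stress resolved onto the fault plane) has already been computed. The energy-magnitude relation and the exponent m = 0.5 (Benioff strain) follow common usage; all names are illustrative.

    ```python
    # Sketch: LURR as the ratio of summed energy-release measure E^m during
    # loading to that during unloading (m = 0.5 gives Benioff strain).
    import numpy as np

    def lurr(magnitudes, is_loading, m=0.5):
        energy = 10.0 ** (1.5 * magnitudes + 4.8)  # Gutenberg-Richter energy (J)
        response = energy ** m
        load = response[is_loading].sum()
        unload = response[~is_loading].sum()
        return load / unload if unload > 0 else np.inf

    mags = np.array([2.1, 3.0, 2.5, 2.8])
    loading = np.array([True, True, False, True])
    print(lurr(mags, loading))
    ```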

  4. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    NARCIS (Netherlands)

    Cheong, S.A.; Tan, T.L.; Chen, C.-C.; Chang, W.-L.; Liu, Z.; Chew, L.Y.; Sloot, P.M.A.; Johnson, N.F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data.

  5. Radon monitoring and its application for earthquake prediction

    International Nuclear Information System (INIS)

    Ramchandran, T.V.; Shaikh, A.N.; Khan, A.H.; Mayya, Y.S.; Puranik, V.D.; Venkat Raj, V.

    2004-12-01

    Concentrations of a wide range of terrestrial gases containing radionuclides, like 222Rn (radon), as well as H₂ (hydrogen), Hg (mercury), CO₂ (carbon dioxide) and ⁴He (helium), in ground water and soil air have commonly been found to be anomalously high along active faults, suggesting that these faults may be the path of least resistance for the outgassing processes of the solid earth. Among the naturally occurring radionuclides, the 238U decay series has received great attention in connection with earthquake prediction and monitoring research all over the world. Due to its nearly ubiquitous occurrence, appreciable abundance, chemical inactivity and convenient half-life (3.823 d), 222Rn in the 238U series is the most extensively studied one in this regard. In this report, a brief account of the applications of 222Rn monitoring carried out all over the world, studies carried out in India, modeling of earthquake predictions, measurement techniques, measurement equipment and its availability in India, the Indian radon monitoring programme and its prospects are presented. (author)

  6. Use of Kazakh nuclear explosions for testing dilatancy diffusion model of earthquake prediction

    International Nuclear Information System (INIS)

    Srivastava, H.N.

    1979-01-01

    P-wave travel time anomalies from Kazakh explosions during the years 1965-1972 were studied with reference to the Jeffreys-Bullen (1952) and Herrin (1968) travel time tables and discussed using the F-ratio test at seven stations in Himachal Pradesh. For these events, the temporal and spatial variations of travel time residuals were examined from the point of view of the long-term changes in velocity known to precede earthquakes, and of local geology. The results show a preference for the Herrin travel time tables at these epicentral distances from the Kazakh explosions. The F-ratio test indicated that the variation between sample means of different stations in the network was greater than can be attributed to sampling error. Although the spatial variation of mean residuals (1965-1972) could generally be explained on the basis of the local geology, the temporal variations of such residuals from Kazakh explosions offer limited application in the testing of the dilatancy model of earthquake prediction. (auth.)

  7. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    Science.gov (United States)

    Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467
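
    The abstract notes that the model's equilibrium dynamics reproduces the Gutenberg-Richter law log10 N = a - bM. Below is a sketch of the standard maximum-likelihood b-value estimate (Aki's estimator) against which such a claim would be checked; the completeness magnitude and bin width are assumptions.

    ```python
    # Sketch: maximum-likelihood b-value of the Gutenberg-Richter law.
    import numpy as np

    def b_value(magnitudes, m_c, dm=0.1):
        """Aki (1965) estimator with the usual binning correction dm/2."""
        m = np.asarray(magnitudes, dtype=float)
        m = m[m >= m_c]                      # keep events above completeness
        return np.log10(np.e) / (m.mean() - m_c + dm / 2.0)

    print(b_value([3.1, 3.4, 4.0, 3.2, 5.1, 3.6, 3.3], m_c=3.0))
    ```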

  8. WHY WE CANNOT PREDICT STRONG EARTHQUAKES IN THE EARTH’S CRUST

    Directory of Open Access Journals (Sweden)

    Iosif L. Gufeld

    2011-01-01

    needed to address the issues raised in this publication, including problems and possibilities of prediction of earthquakes in the crust. Incontrovertible achievements of the Earth sciences are reviewed, considering specific features of seismic events and variations of various parameters of the lithosphere, the block structure of the lithosphere and processes in the lithosphere. Much attention is given to analyses of the driving forces of the seismotectonic process. The studies of variations of parameters of the medium, including rapid (hourly or daily) changes, show that processes that predetermine the state of stresses or the energy capacity of the medium (Figures 2 and 3) in the lithosphere are overlooked. Analyses are based on processes of interaction between ascending flows of hydrogen and helium and the solid lithosphere. A consequence of such processes is gas porosity that controls many parameters of the medium and the oscillation regime of the three-dimensional state of stresses of the block structures (Figures 6, 7, and 12), which impacts the dynamics of block movements. The endogenous activity of the lithosphere and its instability are controlled by degassing of light gases. The paper reviews processes of preparation for strong earthquakes in the crust with regard to the block structure of platform areas and subduction zones (Figures 13 and 14). It is demonstrated that the conventional methods yield ambiguous assessments of seismic hazard both in terms of time and locations of epicenter zones, and focal areas of subduction zones are out of control in principle. Processes that actually take place in the lithosphere are the causes of such ambiguity, i.e. the lack of any deterministic relations in the development of critical seismotectonic situations. Methods for identification of the geological medium characterized by continuously variable parameters are considered. Directions of fundamental studies of the seismic process and principles of seismic activity monitoring are

  9. Study on China’s Earthquake Prediction by Mathematical Analysis and its Application in Catastrophe Insurance

    Science.gov (United States)

    Jianjun, X.; Bingjie, Y.; Rongji, W.

    2018-03-01

    The purpose of this paper was to improve the level of catastrophe insurance. Firstly, earthquake predictions were carried out using a mathematical analysis method. Secondly, foreign catastrophe insurance policies and models were compared. Thirdly, suggestions on catastrophe insurance for China were discussed. Further study should pay more attention to earthquake prediction by introducing big data.

  10. Predicting Dynamic Response of Structures under Earthquake Loads Using Logical Analysis of Data

    Directory of Open Access Journals (Sweden)

    Ayman Abd-Elhamed

    2018-04-01

    In this paper, logical analysis of data (LAD) is used to predict the seismic response of building structures employing the captured dynamic responses. In order to prepare the data, computational simulations using a single-degree-of-freedom (SDOF) building model under different ground motion records are carried out. The selected excitation records are real and have different peak ground accelerations (PGA). The sensitivity of the seismic response, in terms of floor displacements, to variation in earthquake characteristics such as soil class, characteristic period, time step of records, peak ground displacement, and peak ground velocity has also been considered. The dynamic equation of motion describing the building model and the applied earthquake load is presented and solved incrementally using the Runge-Kutta method. LAD then finds the characteristic patterns which allow forecasting of the seismic response of building structures. The accuracy of LAD is compared to that of an artificial neural network (ANN), since the latter is the best-known machine learning technique. Based on the conducted study, the proposed LAD model has proven to be an efficient technique to learn, simulate, and blindly predict the dynamic response behaviour of building structures subjected to earthquake loads.
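
    A sketch of the simulation step described above: a linear SDOF oscillator under ground acceleration, integrated with classical fourth-order Runge-Kutta. The damping ratio, natural frequency, and time step are placeholder values; the paper's actual model parameters may differ.

    ```python
    # Sketch: m*u'' + c*u' + k*u = -m*ag(t), integrated with RK4.
    import numpy as np

    def sdof_rk4(ag, m=1.0, zeta=0.05, omega_n=2 * np.pi, dt=0.01, n_steps=2000):
        c = 2.0 * zeta * omega_n * m
        k = omega_n ** 2 * m

        def f(t, y):
            u, v = y
            a = (-m * ag(t) - c * v - k * u) / m
            return np.array([v, a])

        y = np.zeros(2)                    # initial displacement and velocity
        history = np.empty((n_steps, 2))
        for i in range(n_steps):
            t = i * dt
            k1 = f(t, y)
            k2 = f(t + dt / 2, y + dt / 2 * k1)
            k3 = f(t + dt / 2, y + dt / 2 * k2)
            k4 = f(t + dt, y + dt * k3)
            y = y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
            history[i] = y
        return history                     # columns: displacement, velocity

    # Example: response to a 2 Hz sinusoidal ground acceleration.
    resp = sdof_rk4(lambda t: 0.3 * 9.81 * np.sin(2 * np.pi * 2.0 * t))
    print(np.abs(resp[:, 0]).max())        # peak displacement
    ```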

  11. Prediction of site specific ground motion for large earthquake

    International Nuclear Information System (INIS)

    Kamae, Katsuhiro; Irikura, Kojiro; Fukuchi, Yasunaga.

    1990-01-01

    In this paper, we apply the semi-empirical synthesis method of IRIKURA (1983, 1986) to the estimation of site-specific ground motion using accelerograms observed at Kumatori in Osaka prefecture. The target earthquakes used here are a comparatively distant earthquake (Δ=95 km, M=5.6) caused by the YAMASAKI fault and a near earthquake (Δ=27 km, M=5.6). The results obtained are as follows. 1) The accelerograms from the distant earthquake (M=5.6) are synthesized using the aftershock records (M=4.3) of the 1983 YAMASAKI fault earthquake, whose source parameters have been obtained by other authors from the hypocentral distribution of the aftershocks. The resultant synthetic motions show a good agreement with the observed ones. 2) The synthesis for a near earthquake (M=5.6; we call this the target earthquake) is made using a small earthquake which occurred in the neighborhood of the target earthquake. Here, we apply two methods for setting the parameters for synthesis. One method is to use the parameters of the YAMASAKI fault earthquake, which has the same magnitude as the target earthquake, and the other is to use parameters obtained from several existing empirical formulas. The resultant synthetic motion with the former parameters shows a good agreement with the observed one, but that with the latter does not. 3) We estimate the source parameters from the source spectra of several earthquakes which have been observed at this site. Consequently, we find that small earthquakes (M<4) should be used carefully as Green's functions because their stress drops are not constant. 4) We propose that not only the magnitudes but also the seismic moments of the target earthquake and the small earthquake should be designated. (J.P.N.)

  12. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    Science.gov (United States)

    Serata, S.

    2006-12-01

    The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. Development work over the past 25 years has established the Stressmeter as an automatic stress measurement system for studying the timing of forthcoming major earthquakes, in support of current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area over a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress condition of the mine, the direct cause of the earthquakes was disclosed, and the cause was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development, and the Stressmeter was introduced to the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of maximum and minimum lateral stresses in excellent agreement with the theoretical value, i.e., G = 0.25. The conventional methods of overcoring, hydrofracturing and deformation, which were introduced to compete with the Serata method, all failed, demonstrating the fundamental difficulties of the conventional methods. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be the same as all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained by the major international stress measurement work in the Hot Dry Rock Projects conducted in the USA, England and Germany are found to be in good agreement with the Stressmeter results obtained in Japan. Based on this broad agreement, a solid geomechanical

  13. An Earthquake Prediction System Using The Time Series Analyses of Earthquake Property And Crust Motion

    International Nuclear Information System (INIS)

    Takeda, Fumihide; Takeo, Makoto

    2004-01-01

    We have developed a short-term deterministic earthquake (EQ) forecasting system, similar to those used for typhoons and hurricanes, which has been under test operation at the website http://www.tec21.jp/ since June of 2003. We use the hypocenter and crust displacement data recently opened to the public by the Japanese seismograph and global positioning system (GPS) networks, respectively. Our system divides the forecasting area into five regional areas of Japan, each of which is about 5 deg. by 5 deg. We have found that it can forecast the focus, date of occurrence and magnitude (M) of an impending EQ (whose M is larger than about 6), all within narrow limits. We describe the system with two examples. One is the 2003/09/26 EQ of M 8 in the Hokkaido area, which is a hindcast. The other is a successful rollout of the most recent forecast, on the 2004/05/30 EQ of M 6.7 off the coast of the southern Kanto (Tokyo) area.

  14. A forecast experiment of earthquake activity in Japan under Collaboratory for the Study of Earthquake Predictability (CSEP)

    Science.gov (United States)

    Hirata, N.; Yokoi, S.; Nanjo, K. Z.; Tsuruoka, H.

    2012-04-01

    One major focus of the current Japanese earthquake prediction research program (2009-2013), which is now integrated with the research program for prediction of volcanic eruptions, is to move toward creating testable earthquake forecast models. For this purpose we started an experiment in forecasting earthquake activity in Japan under the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) through an international collaboration. We established the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan, and to conduct verifiable prospective tests of their model performance. We started the 1st earthquake forecast testing experiment in Japan within the CSEP framework. We use the earthquake catalogue maintained and provided by the Japan Meteorological Agency (JMA). The experiment consists of 12 categories, with 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called "All Japan," "Mainland," and "Kanto." A total of 105 models were submitted and are currently under the CSEP official suite of tests for evaluating the performance of forecasts. The experiments have been completed for 92 rounds for the 1-day class, 6 rounds for the 3-month class, and 3 rounds for the 1-year class. For the 1-day testing class, all models passed all of CSEP's evaluation tests in more than 90% of rounds. The results of the 3-month testing class also gave us new knowledge concerning statistical forecasting models. All models showed good performance for magnitude forecasting. On the other hand, the observed spatial distribution is hardly consistent with most models when many earthquakes occur at one spot. We are now preparing a 3-D forecasting experiment with a depth range of 0 to 100 km in the Kanto region. The testing center is improving the evaluation system for the 1-day class experiment so that forecasting and testing results are finished within one day. The special issue of 1st part titled Earthquake Forecast

  15. Earthquake prediction rumors can help in building earthquake awareness: the case of May the 11th 2011 in Rome (Italy)

    Science.gov (United States)

    Amato, A.; Arcoraci, L.; Casarotti, E.; Cultrera, G.; Di Stefano, R.; Margheriti, L.; Nostro, C.; Selvaggi, G.; May-11 Team

    2012-04-01

    Banner headlines in an Italian newspaper read on May 11, 2011: "Absence boom in offices: the urban legend in Rome becomes psychosis". This was the effect of a large-magnitude earthquake prediction for Rome on May 11, 2011. This prediction was never officially released, but it grew on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions and related them to earthquakes. Indeed, around May 11, 2011, there was a planetary alignment, and this increased the credibility of the earthquake prediction. Given the echo of this earthquake prediction, INGV decided to organize on May 11 (the same day the earthquake was predicted to happen) an Open Day at its headquarters in Rome to inform the public about Italian seismicity and earthquake physics. The Open Day was preceded by a press conference two days before, attended by about 40 journalists from newspapers, local and national TV, press agencies and web news magazines. Hundreds of articles appeared in the following two days, advertising the May 11 Open Day. On May 11 the INGV headquarters was peacefully invaded by over 3,000 visitors from 9 am to 9 pm: families, students, civil protection groups and many journalists. The program included conferences on a wide variety of subjects (from the social impact of rumors to seismic risk reduction) and distribution of books and brochures, in addition to several activities: meetings with INGV researchers to discuss scientific issues, visits to the seismic monitoring room (open 24h/7 all year), and guided tours through interactive exhibitions on earthquakes and the Earth's deep structure. During the same day, thirteen new videos were also posted on our youtube/INGVterremoti channel to explain the earthquake process and hazard, and to provide periodic real-time updates on seismicity in Italy. On May 11 no large earthquake happened in Italy. The initiative, built up in a few weeks, had a very large feedback

  16. Retrospective Evaluation of the Long-Term CSEP-Italy Earthquake Forecasts

    Science.gov (United States)

    Werner, M. J.; Zechar, J. D.; Marzocchi, W.; Wiemer, S.

    2010-12-01

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We considered the twelve time-independent earthquake forecasts among this set and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. Here, we present the results of tests that measure the consistency of the forecasts with the past observations. Besides being an evaluation of the submitted time-independent forecasts, this exercise provided insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between the robustness of results and experiment duration.

  17. Predicting earthquakes by analyzing accelerating precursory seismic activity

    Science.gov (United States)

    Varnes, D.J.

    1989-01-01

    During 11 sequences of earthquakes that in retrospect can be classed as foreshocks, the accelerating rate at which seismic moment is released follows, at least in part, a simple equation. This equation (1) is dΣ/dt = C/(t_f - t)^n, where Σ is the cumulative sum until time t of the square roots of the seismic moments of individual foreshocks computed from reported magnitudes, C and n are constants, and t_f is a limiting time at which the rate of seismic moment accumulation becomes infinite. The possible time of a major foreshock or main shock, t_f, is found by the best fit of equation (1), or its integral, to step-like plots of Σ versus time, using successive estimates of t_f in linearized regressions until the maximum coefficient of determination, r², is obtained. Analyzed examples include sequences preceding earthquakes at Cremasta, Greece, 2/5/66; Haicheng, China, 2/4/75; Oaxaca, Mexico, 11/29/78; Petatlan, Mexico, 3/14/79; and Central Chile, 3/3/85. In 29 estimates of main-shock time made as the sequences developed, the errors in 20 were less than one-half, and in 9 less than one-tenth, of the time remaining between the time of the last data used and the main shock. Some precursory sequences, or parts of them, yield no solution. Two sequences appear to include in their first parts the aftershocks of a previous event; plots using the integral of equation (1) show that the sequences are easily separable into aftershock and foreshock segments. Synthetic seismic sequences of shocks at equal time intervals were constructed to follow equation (1), using four values of n. In each series the resulting distribution of magnitudes closely follows the linear Gutenberg-Richter relation log N = a - bM, and the product of n times b for each series is the same constant. In various forms and for decades, equation (1) has been used successfully to predict failure times of stressed metals and ceramics, landslides in soil and rock slopes, and volcanic eruptions.
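
    A sketch of the fitting loop the abstract describes: scan candidate failure times t_f and keep the one that maximizes r² for the linearized form log(dΣ/dt) = log C - n log(t_f - t). The finite-difference rate estimate and the grid search are illustrative simplifications, not the paper's exact regression scheme.

    ```python
    # Sketch: estimate the failure time t_f of dS/dt = C / (t_f - t)^n.
    import numpy as np

    def fit_failure_time(t, S, tf_grid):
        """Return (t_f, r2) maximizing the coefficient of determination of
        log(dS/dt) vs log(t_f - t); S is cumulative sqrt(seismic moment)."""
        tm = 0.5 * (t[1:] + t[:-1])              # midpoint times
        rate = np.diff(S) / np.diff(t)           # finite-difference dS/dt
        ok = rate > 0
        best = (None, -np.inf)
        for tf in tf_grid:
            if tf <= t[-1]:
                continue                         # t_f must lie in the future
            x = np.log(tf - tm[ok])
            y = np.log(rate[ok])
            r = np.corrcoef(x, y)[0, 1]
            if r ** 2 > best[1]:
                best = (tf, r ** 2)
        return best
    ```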

  18. Long-term Postseismic Deformation Following the 1964 Alaska Earthquake

    Science.gov (United States)

    Freymueller, J. T.; Cohen, S. C.; Hreinsdöttir, S.; Suito, H.

    2003-12-01

    Geodetic data provide a rich data set describing the postseismic deformation that followed the 1964 Alaska earthquake (Mw 9.2). This is particularly true for vertical deformation, since tide gauges and leveling surveys provide extensive spatial coverage. Leveling was carried out over all of the major roads of Alaska in 1964-65, and over the last several years we have resurveyed an extensive data set using GPS. Along Turnagain Arm of Cook Inlet, south of Anchorage, a trench-normal profile was surveyed repeatedly over the first decade after the earthquake, and many of these sites have been surveyed with GPS. After using a geoid model to correct for the difference between geometric and orthometric heights, the leveling+GPS surveys reveal up to 1.25 meters of uplift since 1964. The largest uplifts are concentrated in the northern part of the Kenai Peninsula, SW of Turnagain Arm. In some places, steep gradients in the cumulative uplift measurements point to a very shallow source for the deformation. The average 1964-late 1990s uplift rates were substantially higher than the present-day uplift rates, which rarely exceed 10 mm/yr. Both leveling and tide gauge data document a decay in uplift rate over time as the postseismic signal decreases. However, even today the postseismic deformation represents a substantial portion of the total observed deformation signal, illustrating that very long-lived postseismic deformation is an important element of the subduction zone earthquake cycle for the very largest earthquakes. This is in contrast to much smaller events, such as M~8 earthquakes, for which postseismic deformation in many cases decays within a few years. This suggests that the very largest earthquakes may excite different processes than smaller events.

  19. A numerical simulation strategy on occupant evacuation behaviors and casualty prediction in a building during earthquakes

    Science.gov (United States)

    Li, Shuang; Yu, Xiaohui; Zhang, Yanjuan; Zhai, Changhai

    2018-01-01

    Casualty prediction in a building during earthquakes supports economic loss estimation in the performance-based earthquake engineering methodology. Although post-earthquake observations reveal that evacuation affects the number of occupant casualties during earthquakes, few current studies consider occupant movements in the building in casualty prediction procedures. To bridge this knowledge gap, a numerical simulation method using a refined cellular automata model is presented, which can describe various occupant dynamic behaviors and building dimensions. The simulation of occupant evacuation is verified against a recorded evacuation process from a school classroom in the real-life 2013 Ya'an earthquake in China. The occupant casualties in the building under earthquakes are evaluated by coupling the building collapse process simulated by the finite element method, the occupant evacuation simulation, and the casualty occurrence criteria, with time and space synchronization. A case study of casualty prediction in a building during an earthquake is provided to demonstrate the effect of occupant movements on casualty prediction.

  20. Quantitative prediction of strong motion for a potential earthquake fault

    Directory of Open Access Journals (Sweden)

    Shamita Das

    2010-02-01

    This paper describes a new method for calculating strong motion records for a given seismic region on the basis of the laws of physics, using information on the tectonics and physical properties of the earthquake fault. Our method is based on an earthquake model, called a «barrier model», which is characterized by five source parameters: fault length, width, maximum slip, rupture velocity, and barrier interval. The first three parameters may be constrained from plate tectonics, and the fourth parameter is roughly a constant. The most important parameter controlling the earthquake strong motion is the last one, the «barrier interval». There are three methods to estimate the barrier interval for a given seismic region: (1) surface measurement of slip across fault breaks, (2) model fitting with observed near- and far-field seismograms, and (3) scaling-law data for small earthquakes in the region. The barrier intervals were estimated for a dozen earthquakes and four seismic regions by the above three methods. Our preliminary results for California suggest that the barrier interval may be determined if the maximum slip is given. The relation between the barrier interval and maximum slip varies from one seismic region to another. For example, the interval appears to be unusually long for Kilauea, Hawaii, which may explain why only scattered evidence of strong ground shaking was observed in the epicentral area of the Island of Hawaii earthquake of November 29, 1975. The stress drop associated with an individual fault segment, estimated from the barrier interval and maximum slip, lies between 100 and 1000 bars. These values are about one order of magnitude greater than those estimated earlier by the use of crack models without barriers. Thus, the barrier model can resolve, at least partially, the well-known discrepancy between the stress drops measured in the laboratory and those estimated for earthquakes.

  1. Short- and Long-Term Earthquake Forecasts Based on Statistical Models

    Science.gov (United States)

    Console, Rodolfo; Taroni, Matteo; Murru, Maura; Falcone, Giuseppe; Marzocchi, Warner

    2017-04-01

    The epidemic-type aftershock sequence (ETAS) models have been experimentally used to forecast the space-time earthquake occurrence rate during the sequence that followed the 2009 L'Aquila earthquake and for the 2012 Emilia earthquake sequence. These forecasts represented the first two pioneering attempts to check the feasibility of providing operational earthquake forecasting (OEF) in Italy. After the 2009 L'Aquila earthquake the Italian Department of Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) for the development of the first official OEF in Italy, which was implemented for testing purposes by the newly established "Centro di Pericolosità Sismica" (CPS, the Seismic Hazard Center) at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). According to the ICEF guidelines, the system is open, transparent, reproducible and testable. The scientific information delivered by OEF-Italy is shaped in different formats according to the interested stakeholders, such as scientists, national and regional authorities, and the general public. Communication to the public is certainly the most challenging issue, and careful pilot tests are necessary to check the effectiveness of the communication strategy before opening the information to the public. With regard to long-term time-dependent earthquake forecasts, the application of a newly developed simulation algorithm to the Calabria region provided typical features in the time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.

  2. Long-term earthquake forecasts based on the epidemic-type aftershock sequence (ETAS) model for short-term clustering

    Directory of Open Access Journals (Sweden)

    Jiancang Zhuang

    2012-07-01

    Based on the ETAS (epidemic-type aftershock sequence) model, which is used for describing the features of short-term clustering of earthquake occurrence, this paper presents some theories and techniques related to evaluating the probability distribution of the maximum magnitude in a given space-time window, where the Gutenberg-Richter law for the earthquake magnitude distribution cannot be directly applied. It is seen that the distribution of the maximum magnitude in a given space-time volume is determined in the long term by the background seismicity rate and the magnitude distribution of the largest events in each earthquake cluster. The techniques introduced were applied to the seismicity in the Japan region in the period from 1926 to 2009. It was found that the regions most likely to have big earthquakes are along the Tohoku (northeastern Japan) Arc and the Kuril Arc, both with much higher probabilities than the offshore Nankai and Tokai regions.
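
    For reference, a sketch of the temporal ETAS conditional intensity that underlies such clustering analyses: a background rate plus magnitude-weighted Omori-type aftershock terms. The parameter values below are placeholders, not those fitted for Japan in the paper.

    ```python
    # Sketch: temporal ETAS conditional intensity
    # lambda(t) = mu + sum_i K * exp(alpha*(M_i - M_c)) / (t - t_i + c)^p
    import numpy as np

    def etas_intensity(t, event_times, event_mags,
                       mu=0.2, K=0.05, alpha=1.5, c=0.01, p=1.1, m_c=3.0):
        past = event_times < t                       # only earlier events count
        dt = t - event_times[past]
        boost = np.exp(alpha * (event_mags[past] - m_c))
        return mu + np.sum(K * boost / (dt + c) ** p)

    times = np.array([0.0, 1.2, 3.5])
    mags = np.array([4.0, 3.2, 5.1])
    print(etas_intensity(4.0, times, mags))
    ```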

  3. Predicting the Maximum Earthquake Magnitude from Seismic Data in Israel and Its Neighboring Countries.

    Science.gov (United States)

    Last, Mark; Rabinowitz, Nitzan; Leonard, Gideon

    2016-01-01

    This paper explores several data mining and time series analysis methods for predicting the magnitude of the largest seismic event in the next year, based on the previously recorded seismic events in the same region. The methods are evaluated on a catalog of 9,042 earthquake events which took place between 01/01/1983 and 31/12/2010 in the area of Israel and its neighboring countries. The data was obtained from the Geophysical Institute of Israel. Each earthquake record in the catalog is associated with one of 33 seismic regions. The data was cleaned by removing foreshocks and aftershocks. In our study, we have focused on the ten most active regions, which account for more than 80% of the total number of earthquakes in the area. The goal is to predict whether the maximum earthquake magnitude in the following year will exceed the median of maximum yearly magnitudes in the same region. Since the analyzed catalog includes only 28 years of complete data, the last five annual records of each region (referring to the years 2006-2010) are kept for testing, while the previous annual records are used for training. The predictive features are based on the Gutenberg-Richter ratio as well as on some new seismic indicators based on moving averages of the number of earthquakes in each area. The new predictive features prove to be much more useful than the indicators traditionally used in the earthquake prediction literature. The most accurate result (AUC = 0.698) is reached by the Multi-Objective Info-Fuzzy Network (M-IFN) algorithm, which takes into account the association between two target variables: the number of earthquakes and the maximum earthquake magnitude during the same year.
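
    A sketch of the target construction and one moving-average feature of the kind described above, in pandas. The column names and the 3-year window are assumptions, not the paper's exact schema or indicator set.

    ```python
    # Sketch: per-region yearly targets (max magnitude above the regional
    # median of yearly maxima) plus a lagged moving-average count feature.
    import pandas as pd

    def yearly_targets(catalog: pd.DataFrame) -> pd.DataFrame:
        """catalog needs columns: region, year, magnitude."""
        ymax = (catalog.groupby(["region", "year"])["magnitude"]
                       .max().rename("max_mag").reset_index())
        med = ymax.groupby("region")["max_mag"].transform("median")
        ymax["target"] = (ymax["max_mag"] > med).astype(int)
        counts = (catalog.groupby(["region", "year"]).size()
                         .rename("n_events").reset_index())
        ymax = ymax.merge(counts, on=["region", "year"], how="left")
        # Moving average of event counts over the 3 preceding years.
        ymax["n_events_ma3"] = (ymax.groupby("region")["n_events"]
                                    .transform(lambda s: s.shift(1).rolling(3).mean()))
        return ymax
    ```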

  4. [Medium- and long-term health effects of the L'Aquila earthquake (Central Italy, 2009) and of other earthquakes in high-income Countries: a systematic review].

    Science.gov (United States)

    Ripoll Gallardo, Alba; Alesina, Marta; Pacelli, Barbara; Serrone, Dario; Iacutone, Giovanni; Faggiano, Fabrizio; Della Corte, Francesco; Allara, Elias

    2016-01-01

    Objectives: to compare the methodological characteristics of the studies investigating the medium- and long-term health effects of the L'Aquila earthquake with the features of studies conducted after other earthquakes in high-income countries. Methods: a systematic comparison between the studies which evaluated the health effects of the L'Aquila earthquake (Central Italy, 6th April 2009) and those conducted after other earthquakes in comparable settings. Medline, Scopus, and 6 sources of grey literature were systematically searched. Inclusion criteria comprised measurement of health outcomes at least one month after the earthquake, investigation of earthquakes that occurred in high-income countries, and presence of at least one temporal or geographical control group. Results: out of 2,976 titles, 13 studies regarding the L'Aquila earthquake and 51 studies concerning other earthquakes were included. The L'Aquila and the Kobe/Hanshin-Awaji (Japan, 17th January 1995) earthquakes were the most investigated. Studies on the L'Aquila earthquake had a median sample size of 1,240 subjects, a median duration of 24 months, and most frequently used a cross-sectional design (7/13). Studies on other earthquakes had a median sample size of 320 subjects, a median duration of 15 months, and most frequently used a time series design (19/51). Conclusions: the L'Aquila studies often focused on mental health, while the earthquake effects on mortality, cardiovascular outcomes, and health systems were less frequently evaluated. A more intensive use of routine data could benefit future epidemiological surveillance in the aftermath of earthquakes.

  5. Applications of the gambling score in evaluating earthquake predictions and forecasts

    Science.gov (United States)

    Zhuang, Jiancang; Zechar, Jeremy D.; Jiang, Changsheng; Console, Rodolfo; Murru, Maura; Falcone, Giuseppe

    2010-05-01

    This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular scheme of forecasts and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some points of his reputation. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and takes away the reputation points bet by the forecaster if he loses. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. For discrete predictions, we apply this method to evaluate the performance of Shebalin's predictions made using the Reverse Tracing of Precursors (RTP) algorithm and of the outputs of predictions from the Annual Consultation Meeting on Earthquake Tendency held by the China Earthquake Administration. For the continuous case, we use it to compare the probability forecasts of seismicity in the Abruzzo region before and after the L'Aquila earthquake based on the ETAS model and the PPE model.
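
    A sketch of the bookkeeping for the discrete case, under the assumption that a "fair rule" means zero expected gain under the reference model: a successful alarm on an event of reference probability p0 then pays bet × (1 - p0)/p0, and a failed alarm loses the stake. The binary-alarm framing and all names are illustrative, not the paper's exact formulation.

    ```python
    # Sketch: reputation-point accounting for binary earthquake alarms.
    def gambling_score(bets, outcomes, p0, start=100.0):
        """bets: points wagered per alarm; outcomes: True if the predicted
        event occurred; p0: reference-model probability for each alarm."""
        points = start
        for bet, hit, p in zip(bets, outcomes, p0):
            if hit:
                points += bet * (1.0 - p) / p   # fair-odds payoff from the house
            else:
                points -= bet                   # stake lost to the house
        return points

    # One correct alarm against p0 = 0.1, one failed alarm against p0 = 0.2.
    print(gambling_score(bets=[10, 10], outcomes=[True, False], p0=[0.1, 0.2]))
    ```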

  6. Damage Level Prediction of Reinforced Concrete Building Based on Earthquake Time History Using Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Suryanita Reni

    2017-01-01

    A strong earthquake can damage a building if earthquake loads were not considered in the building's design. This study aims to predict the damage level of a building due to an earthquake using the Artificial Neural Network method. The building model is a reinforced concrete building with ten floors and a height between floors of 3.6 m. The model building received earthquake loads based on nine earthquake time history records, each scaled to 0.5g, 0.75g, and 1.0g. The Artificial Neural Networks are designed in 4 architectural models using the MATLAB program. Model 1 used displacement, velocity, and acceleration as inputs; Model 2 used displacement only; Model 3 used velocity only; and Model 4 used acceleration only. The output of the Neural Networks is the damage level of the building, categorized as Safe (1), Immediate Occupancy (2), Life Safety (3), or Collapse Prevention (4). According to the results, the Neural Network models predict the damage level at rates between 85% and 95%. Therefore, an Artificial Neural Network is one solution for analyzing structural responses and damage levels promptly and efficiently when an earthquake occurs.
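
    The paper's networks were built in MATLAB; as a rough equivalent, here is a scikit-learn sketch of Model 1 (displacement, velocity, and acceleration as inputs; damage level 1-4 as output). The random data, network size, and preprocessing are placeholders, not the paper's architecture or training set.

    ```python
    # Sketch: small MLP mapping response features to a damage-level class.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 3))        # displacement, velocity, acceleration
    y = rng.integers(1, 5, size=200)     # damage level 1..4 (placeholder labels)

    model = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(16,),
                                        max_iter=2000, random_state=0))
    model.fit(X, y)
    print(model.predict(X[:5]))
    ```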

  7. A Trial for Earthquake Prediction by Precise Monitoring of Deep Ground Water Temperature

    Science.gov (United States)

    Nasuhara, Y.; Otsuki, K.; Yamauchi, T.

    2006-12-01

    A near-future large earthquake is estimated to occur off Miyagi prefecture, northeast Japan, within 20 years at a probability of about 80%. In order to predict this earthquake, we have observed groundwater temperature in a borehole at Sendai city, 100 km west of the asperity. This borehole penetrates the fault zone of the NE-trending active reverse fault, the Nagamachi-Rifu fault zone, at 820 m depth. Our concept for the ground water observation is that fault zones are natural amplifiers of crustal strain, and hence at 820 m depth we set a very precise quartz temperature sensor with a resolution of 0.0002 deg. C. We confirmed that our observation system works normally by both pumping tests and the systematic temperature changes at different depths. Since the observation started on June 20, 2004, we have found mysterious intermittent temperature fluctuations of two types; one with a period of 5-10 days and an amplitude of ca. 0.1 deg. C, and the other with a period of 11-21 days and an amplitude of ca. 0.2 deg. C. Based on an examination using the product of the Grashof and Prandtl numbers, natural convection of water can occur in the borehole. However, since these temperature fluctuations are observed only at depths around 820 m, it is likely that they represent hydrological behavior specific to the Nagamachi-Rifu fault zone. It is noteworthy that small temperature changes correlatable with the earth tide are superposed on the long-term, large-amplitude fluctuations. The amplitude on the days of the full moon and new moon is ca. 0.001 deg. C. The bottoms of these temperature fluctuations always lag about 6 hours behind the peaks of the earth tide. This is interpreted as water in the borehole being sucked into the fault zone when tensional normal stress acts on it on the days of the full moon and new moon. The amplitude of the crustal strain due to the earth tide was measured at ca. 2×10⁻⁸ strain near our observation site. High frequency temperature noise of

  8. Long-term effects of earthquake experience of young persons on cardiovascular disease risk factors

    Science.gov (United States)

    Li, Na; Wang, Yumei; Yu, Lulu; Song, Mei; Wang, Lan; Ji, Chunpeng

    2016-01-01

    Introduction: The aim of the study was to examine the long-term effect on cardiovascular disease risk factors of stress from direct experience of an earthquake as a young person. Material and methods: We selected workers born between July 1, 1958 and July 1, 1976 who were examined at Kailuan General Hospital between May and October of 2013. Data on cardiovascular events were taken from the workers' annual health examination conducted between 2006 and 2007. All subjects were divided into three groups according to their experience of the Tangshan earthquake of July 28, 1976, as follows: control group, exposed group 1, and exposed group 2. We compared cardiovascular disease risk factors between the three groups as well as by gender and age. Results: One thousand one hundred and ninety-six workers were included in the final statistical analysis. Among all subjects, resting heart rate (p = 0.003), total cholesterol, and fasting plasma glucose were significantly higher in those who had experienced the earthquake compared with unexposed controls, but were unrelated to loss of relatives. No significant difference in triglyceride levels was observed between the three groups (p = 0.900). Further refinement showed that the effects were restricted to males 40 years of age or older at the time of analysis, but were due primarily to age at the time of earthquake exposure (p = 0.002). Conclusions: Earthquake experience in the early years of life has long-term effects on adult resting heart rate, total cholesterol, and fasting plasma glucose, especially among men. PMID:28144258

  9. Long-Term Fault Memory: A New Time-Dependent Recurrence Model for Large Earthquake Clusters on Plate Boundaries

    Science.gov (United States)

    Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.; Campbell, M. R.

    2017-12-01

    A challenge for earthquake hazard assessment is that geologic records often show large earthquakes occurring in temporal clusters separated by periods of quiescence. For example, in Cascadia, a paleoseismic record going back 10,000 years shows four to five clusters separated by approximately 1,000 year gaps. If we are still in the cluster that began 1700 years ago, a large earthquake is likely to happen soon. If the cluster has ended, a great earthquake is less likely. For a Gaussian distribution of recurrence times, the probability of an earthquake in the next 50 years is six times larger if we are still in the most recent cluster. Earthquake hazard assessments typically employ one of two recurrence models, neither of which directly incorporate clustering. In one, earthquake probability is time-independent and modeled as Poissonian, so an earthquake is equally likely at any time. The fault has no "memory" because when a prior earthquake occurred has no bearing on when the next will occur. The other common model is a time-dependent earthquake cycle in which the probability of an earthquake increases with time until one happens, after which the probability resets to zero. Because the probability is reset after each earthquake, the fault "remembers" only the last earthquake. This approach can be used with any assumed probability density function for recurrence times. We propose an alternative, Long-Term Fault Memory (LTFM), a modified earthquake cycle model where the probability of an earthquake increases with time until one happens, after which it decreases, but not necessarily to zero. Hence the probability of the next earthquake depends on the fault's history over multiple cycles, giving "long-term memory". Physically, this reflects an earthquake releasing only part of the elastic strain stored on the fault. We use the LTFM to simulate earthquake clustering along the San Andreas Fault and Cascadia. In some portions of the simulated earthquake history, events would
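
    A sketch of the idea described above, under a simple reading of it: strain accumulates steadily, event probability grows with stored strain, and each event releases only a random fraction of the strain, so memory persists across cycles. The rates, scaling, and release fractions below are invented for illustration and are not the authors' calibration.

    ```python
    # Sketch: a toy long-term-fault-memory simulation via a strain accumulator.
    import numpy as np

    def simulate_ltfm(n_years=10000, rate=1.0, scale=500.0, seed=0):
        rng = np.random.default_rng(seed)
        strain, events = 0.0, []
        for year in range(n_years):
            strain += rate                       # steady tectonic loading
            p = 1.0 - np.exp(-strain / scale)    # probability rises with strain
            if rng.random() < p * 0.01:          # thin to a plausible yearly rate
                events.append(year)
                strain *= rng.uniform(0.2, 0.8)  # partial strain release
        return events

    ev = simulate_ltfm()
    print(len(ev), np.diff(ev)[:10])             # counts and first recurrence gaps
    ```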

  10. Long-term perspectives on giant earthquakes and tsunamis at subduction zones

    Science.gov (United States)

    Satake, K.; Atwater, B.F.; ,

    2007-01-01

    Histories of earthquakes and tsunamis, inferred from geological evidence, aid in anticipating future catastrophes. This natural warning system now influences building codes and tsunami planning in the United States, Canada, and Japan, particularly where geology demonstrates the past occurrence of earthquakes and tsunamis larger than those known from written and instrumental records. Under favorable circumstances, paleoseismology can thus provide long-term advisories of unusually large tsunamis. The extraordinary Indian Ocean tsunami of 2004 resulted from a fault rupture more than 1000 km in length that included and dwarfed fault patches that had broken historically during lesser shocks. Such variation in rupture mode, known from written history at a few subduction zones, is also characteristic of earthquake histories inferred from geology on the Pacific Rim. Copyright © 2007 by Annual Reviews. All rights reserved.

  11. Radon/helium studies for earthquake prediction N-W Himalaya

    International Nuclear Information System (INIS)

    Virk, H.S.

    1999-01-01

    The paper presents preliminary data from radon monitoring started in the Himalayan orogenic belt. Radon anomalies are correlated with microseismic activity in the N-W Himalaya. The He/Rn ratio will be used as a predictive tool for earthquakes.

  12. Geosphere Stability for long-term isolation of radioactive waste. Case study for hydrological change with earthquakes and faulting

    International Nuclear Information System (INIS)

    Niwa, Masakazu

    2016-01-01

    Appropriate estimation and safety assessment of long-term changes in the geological environment are essential to improving the reliability of geological disposal. In particular, the study of faults is important both for understanding regional groundwater flow and for assessing faults as triggers of future earthquakes. Here, the possibility of earthquake-induced changes in the permeability of fault materials was examined based on monitoring data of groundwater pressure before and after the 2011 off the Pacific coast of Tohoku Earthquake. (author)

  13. Long-term impact of earthquake stress on fasting glucose control and diabetes prevalence among Chinese adults of Tangshan

    OpenAIRE

    An, Cuixia; Zhang, Yun; Yu, Lulu; Li, Na; Song, Mei; Wang, Lan; Zhao, Xiaochuan; Gao, Yuanyuan; Wang, Xueyi

    2014-01-01

    Objective: To investigate the long-term influence of stresses from the 1976 Tangshan earthquake on blood glucose control and the incidence of diabetes mellitus in Chinese people of Tangshan. Methods: 1,551 adults ≥ 37 years of age were recruited for this investigation in Tangshan city of China, where one of the deadliest earthquakes occurred in 1976. All subjects completed a questionnaire. 1,030 of them who experienced that earthquake were selected into the exposure group, while 521 were gathe...

  14. My Road to Transform Faulting 1963; Long-Term Precursors to Recent Great Earthquakes

    Science.gov (United States)

    Sykes, L. R.

    2017-12-01

    My road to plate tectonics started serendipitously in 1963 in a remote area of the southeast Pacific when I was studying the propagation of short-period seismic surface waves for my PhD. The earthquakes I used as sources were poorly located. I discovered that my relocated epicenters followed the crest of the East Pacific Rise but then suddenly took a sharp turn to the east at what I interpreted to be a major fracture zone 1000 km long before turning again to the north near 55 degrees south. I noted that earthquakes along that zone only occurred between the two ridge crests, an observation Tuzo Wilson used to develop his hypothesis of transform faulting. Finding a great, unknown fracture zone led me to conclude that work on similar faults that intersect the Mid-Oceanic Ridge System was more important than my study of surface waves. I found similar great faults over the next two years and obtained refined locations of earthquakes along several island arcs. When I was in Fiji and Tonga during 1965 studying deep earthquakes, James Dorman wrote to me about Wilson's paper and I thought about testing his hypothesis. I started work on it the spring of 1966 immediately after I learned about the symmetrical "magic magnetic anomaly profile" across the East Pacific Rise of Pitman and Heirtzler. I quickly obtained earthquake mechanisms that verified the transform hypothesis and its related concepts of seafloor spreading and continental drift. As an undergraduate in the late 1950s, my mentor told me that respectable young earth scientists should not work on vague and false mobilistic concepts like continental drift since continents cannot plow through strong oceanic crust. Hence, until spring 1966, I did not take continental drift seriously. The second part of my presentation involves new evidence from seismology and GPS of what appear to be long-term precursors to a number of great earthquakes of the past decade.

  15. A global earthquake discrimination scheme to optimize ground-motion prediction equation selection

    Science.gov (United States)

    Garcia, Daniel; Wald, David J.; Hearne, Michael

    2012-01-01

    We present a new automatic earthquake discrimination procedure to determine in near-real time the tectonic regime and seismotectonic domain of an earthquake, its most likely source type, and the corresponding ground-motion prediction equation (GMPE) class to be used in the U.S. Geological Survey (USGS) Global ShakeMap system. This method makes use of the Flinn–Engdahl regionalization scheme, seismotectonic information (plate boundaries, global geology, seismicity catalogs, and regional and local studies), and the source parameters available from the USGS National Earthquake Information Center in the minutes following an earthquake to give the best estimation of the setting and mechanism of the event. Depending on the tectonic setting, additional criteria based on hypocentral depth, style of faulting, and regional seismicity may be applied. For subduction zones, these criteria include the use of focal mechanism information and detailed interface models to discriminate among outer-rise, upper-plate, interface, and intraslab seismicity. The scheme is validated against a large database of recent historical earthquakes. Though developed to assess GMPE selection in Global ShakeMap operations, we anticipate a variety of uses for this strategy, from real-time processing systems to any analysis involving tectonic classification of sources from seismic catalogs.

  16. Development of compact long-term broadband ocean bottom seismometer for seafloor observation of slow earthquakes

    Science.gov (United States)

    Yamashita, Y.; Shinohara, M.; Yamada, T.; Shiobara, H.

    2017-12-01

    It is important to understand coupling between plates in a subduction zone for studies of earthquake generation. Recently, low-frequency tremor and very low frequency earthquakes (VLFEs) were discovered at the plate boundary near a trench. These events (slow earthquakes) at the shallow plate boundary should be related to slow slip on the plate boundary. For observation of slow earthquakes, Broadband Ocean Bottom Seismometers (BBOBSs) are useful; however, the number of BBOBSs is limited due to cost. On the other hand, a number of Long-term OBSs (LT-OBSs) with a recording period of one year are available. However, the LT-OBS has a seismometer with a natural period of 1 second, so its frequency band is slightly narrow for observing slow earthquakes. We therefore developed a compact long-term broadband OBS by replacing the seismic sensor of the LT-OBS with a broadband seismometer. We adopted a seismic sensor with a natural period of 20 seconds (Trillium Compact Broadband Seismometer, Nanometrics). Because the tilt of an OBS on the seafloor cannot be controlled due to free-fall deployment, a leveling system for the seismic sensor is necessary. The broadband seismic sensor has a cylindrical shape with a diameter of 90 mm and a height of 100 mm, and the developed leveling system can mount the sensor with no modification of its shape. The leveling system has a diameter of 160 mm and a height of 110 mm, the same size as the existing leveling system of the LT-OBS. The leveling system has two horizontal axes, each driven by a motor. Leveling can be performed up to 20 degrees using a micro-processor (Arduino), with a resolution of less than one degree. The system starts leveling immediately on power-up of the controller. After leveling, the seismic sensor is powered and the controller records the leveling angles to SD RAM. The controller is then shut down to consume no power. The compact long-term broadband ocean bottom seismometer is useful for seafloor observation of slow earthquakes. In addition

  17. Real-time numerical shake prediction and updating for earthquake early warning

    Science.gov (United States)

    Wang, Tianyun; Jin, Xing; Wei, Yongxiang; Huang, Yandan

    2017-12-01

    Ground motion prediction is important for earthquake early warning systems, because the region's peak ground motion indicates the potential disaster. In order to predict the peak ground motion quickly and precisely with limited station wave records, we propose a real-time numerical shake prediction and updating method. Our method first predicts the ground motion based on a ground motion prediction equation after P-wave detection at several stations, denoted as the initial prediction. In order to correct the error of the initial prediction, an updating scheme based on real-time simulation of wave propagation is designed. A data assimilation technique is incorporated to predict the distribution of seismic wave energy precisely. Radiative transfer theory and Monte Carlo simulation are used for modeling wave propagation in 2-D space, and the peak ground motion is calculated as quickly as possible. Our method has the potential to predict the shakemap, so that the likely disaster can be anticipated before the strong shaking arrives. The 2008 Ms 8.0 Wenchuan earthquake is studied as an example to show the validity of the proposed method.

  18. Tsunami Prediction and Earthquake Parameters Estimation in the Red Sea

    KAUST Repository

    Sawlan, Zaid A

    2012-01-01

    parameters and topography. This thesis introduces a real-time tsunami forecasting method that combines tsunami model with observations using a hybrid ensemble Kalman filter and ensemble Kalman smoother. The filter is used for state prediction while

  19. Prediction of long-term creep curves

    International Nuclear Information System (INIS)

    Oikawa, Hiroshi; Maruyama, Kouichi

    1992-01-01

This paper discusses how to predict long-term irradiation-enhanced creep properties from short-term tests. The predictive method based on the θ concept was examined using creep data for ferritic steels. The method was successful in predicting creep curves, including the tertiary creep stage, as well as rupture lifetimes. Some material constants involved in the method are insensitive to the irradiation environment, and their values obtained in thermal creep are applicable to irradiation-enhanced creep. The creep mechanisms of most engineering materials change distinctly at the athermal yield stress in the non-creep regime. One should be aware that short-term tests must be carried out at stresses lower than the athermal yield stress in order to predict the creep behavior of structural components correctly. (orig.)
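
    The θ concept fits a small set of constants to short-term creep data and extrapolates the full curve. A minimal sketch, assuming the standard Evans-Wilshire θ-projection form (the paper's exact variant may differ):

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of the theta-projection idea: fit a creep curve to short-term data,
# then extrapolate to longer times. Data below are synthetic placeholders.

def theta_creep(t, th1, th2, th3, th4):
    # primary (decaying) + tertiary (accelerating) components
    return th1 * (1.0 - np.exp(-th2 * t)) + th3 * (np.exp(th4 * t) - 1.0)

# synthetic "short-term test" data (hours vs. strain); illustrative only
t_obs = np.linspace(0, 500, 26)
true = (0.01, 0.02, 0.002, 0.004)
rng = np.random.default_rng(1)
eps_obs = theta_creep(t_obs, *true) + rng.normal(0, 1e-4, t_obs.size)

popt, _ = curve_fit(theta_creep, t_obs, eps_obs,
                    p0=(0.01, 0.01, 0.001, 0.001), maxfev=20000)

# extrapolate the fitted curve beyond the tested interval
print("predicted strain at t=1000 h:", theta_creep(1000.0, *popt))
```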

  20. Predicted Attenuation Relation and Observed Ground Motion of Gorkha Nepal Earthquake of 25 April 2015

    Science.gov (United States)

    Singh, R. P.; Ahmad, R.

    2015-12-01

A comparison of observed ground motion parameters of the recent Gorkha, Nepal earthquake of 25 April 2015 (Mw 7.8) with the parameters predicted using existing attenuation relations for the Himalayan region will be presented. The earthquake took about 8,000 lives, destroyed thousands of poorly constructed buildings, and was felt by millions of people living in Nepal, China, India, Bangladesh, and Bhutan. Knowledge of ground motion parameters is very important for developing seismic codes in seismically prone regions such as the Himalaya and for better building design. The ground motion parameters recorded in the mainshock and aftershocks are compared with attenuation relations for the Himalayan region; the predicted parameters show good correlation with the observations. The results will be of great use to civil engineers in updating existing building codes in the Himalayan and surrounding regions and in evaluating seismic hazards. They clearly show that only attenuation relations developed for the Himalayan region should be used; relations based on other regions fail to provide good estimates of the observed ground motion parameters.
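
    Such a comparison reduces to evaluating log-residuals between observed ground motions and an attenuation relation. A minimal sketch with a generic functional form; the coefficients and "observations" below are placeholders, not any published Himalayan relation or actual Gorkha data.

```python
import numpy as np

# Hedged sketch of comparing observed PGA with an attenuation relation of
# generic form; all numbers here are illustrative placeholders.

def predicted_pga(M, R_km, c=(-4.0, 1.0, -1.5, 10.0)):
    """ln PGA(g) = c0 + c1*M + c2*ln(R + c3); purely illustrative."""
    c0, c1, c2, c3 = c
    return np.exp(c0 + c1 * M + c2 * np.log(R_km + c3))

# hypothetical station observations for an Mw 7.8 event
R_obs = np.array([30.0, 80.0, 150.0])     # hypocentral distances, km
pga_obs = np.array([0.25, 0.09, 0.035])   # observed PGA, g (made up)

residuals = np.log(pga_obs) - np.log(predicted_pga(7.8, R_obs))
print("log-residuals (obs - pred):", np.round(residuals, 2))
# systematically positive (negative) residuals indicate that the relation
# under-predicts (over-predicts), as discussed for non-Himalayan GMPEs above
```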

  1. Prediction of Global Damage and Reliability Based Upon Sequential Identification and Updating of RC Structures Subject to Earthquakes

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Skjærbæk, P. S.; Köylüoglu, H. U.

The paper deals with the prediction of global damage and future structural reliability with special emphasis on the sensitivity, bias and uncertainty of these predictions dependent on the statistically equivalent realizations of the future earthquake. The predictions are based on a modified Clough-Johnston single-degree-of-freedom (SDOF) oscillator with three parameters which are calibrated to fit the displacement response and the damage development in the past earthquake.

  2. Long-term impact of earthquake stress on fasting glucose control and diabetes prevalence among Chinese adults of Tangshan.

    Science.gov (United States)

    An, Cuixia; Zhang, Yun; Yu, Lulu; Li, Na; Song, Mei; Wang, Lan; Zhao, Xiaochuan; Gao, Yuanyuan; Wang, Xueyi

    2014-01-01

To investigate the long-term influence of stress from the 1976 Tangshan earthquake on blood glucose control and the incidence of diabetes mellitus in Chinese people of Tangshan, 1,551 adults ≥ 37 years of age were recruited in Tangshan city, China, where one of the deadliest earthquakes occurred in 1976. All subjects completed a questionnaire. The 1,030 who had experienced that earthquake were assigned to the exposure group, while 521 who had not been exposed to any earthquake formed the control group. Subjects first identified with diabetes, or with normal fasting blood glucose (FBG) but a diabetic history, were included in the calculation of diabetes prevalence. Statistical analysis was applied to the baseline data and to the incidences of impaired fasting glucose (IFG) and diabetes in all groups. The comparisons indicate no significant difference in average fasting glucose levels between the control group and the exposure group. However, the prevalence of IFG and diabetes in the exposure group differs significantly from the control group, with diabetes prevalence significantly higher in the exposure group. Women were more likely than men to develop diabetes after experiencing earthquake stress. The earthquake stress was linked to higher diabetes incidence as an independent risk factor and has long-term impacts on diabetes incidence. Emergency and long-term management of IFG and diabetes in populations exposed to earthquake stress deserves attention.

  3. Recent Achievements of the Collaboratory for the Study of Earthquake Predictability

    Science.gov (United States)

    Jackson, D. D.; Liukis, M.; Werner, M. J.; Schorlemmer, D.; Yu, J.; Maechling, P. J.; Zechar, J. D.; Jordan, T. H.

    2015-12-01

    Maria Liukis, SCEC, USC; Maximilian Werner, University of Bristol; Danijel Schorlemmer, GFZ Potsdam; John Yu, SCEC, USC; Philip Maechling, SCEC, USC; Jeremy Zechar, Swiss Seismological Service, ETH; Thomas H. Jordan, SCEC, USC, and the CSEP Working Group The Collaboratory for the Study of Earthquake Predictability (CSEP) supports a global program to conduct prospective earthquake forecasting experiments. CSEP testing centers are now operational in California, New Zealand, Japan, China, and Europe with 435 models under evaluation. The California testing center, operated by SCEC, has been operational since Sept 1, 2007, and currently hosts 30-minute, 1-day, 3-month, 1-year and 5-year forecasts, both alarm-based and probabilistic, for California, the Western Pacific, and worldwide. We have reduced testing latency, implemented prototype evaluation of M8 forecasts, and are currently developing formats and procedures to evaluate externally-hosted forecasts and predictions. These efforts are related to CSEP support of the USGS program in operational earthquake forecasting and a DHS project to register and test external forecast procedures from experts outside seismology. A retrospective experiment for the 2010-2012 Canterbury earthquake sequence has been completed, and the results indicate that some physics-based and hybrid models outperform purely statistical (e.g., ETAS) models. The experiment also demonstrates the power of the CSEP cyberinfrastructure for retrospective testing. Our current development includes evaluation strategies that increase computational efficiency for high-resolution global experiments, such as the evaluation of the Global Earthquake Activity Rate (GEAR) model. We describe the open-source CSEP software that is available to researchers as they develop their forecast models (http://northridge.usc.edu/trac/csep/wiki/MiniCSEP). We also discuss applications of CSEP infrastructure to geodetic transient detection and how CSEP procedures are being
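
    As an illustration of the kind of evaluation a CSEP testing center runs, the N-test (number test) compares the observed count of target events with a forecast's expected count under a Poisson assumption, as commonly defined in the CSEP literature. A minimal sketch (the operational suite includes several further tests):

```python
from scipy.stats import poisson

# Sketch of the CSEP N-test: compare the observed event count with the
# forecast's expected count under a Poisson assumption.

def n_test(n_obs, n_forecast):
    """Return (delta1, delta2): probabilities of observing at least / at most
    n_obs events if the forecast rate n_forecast is correct."""
    delta1 = 1.0 - poisson.cdf(n_obs - 1, n_forecast)  # P(N >= n_obs)
    delta2 = poisson.cdf(n_obs, n_forecast)            # P(N <= n_obs)
    return delta1, delta2

# example: forecast expects 12.4 events, 7 were observed
d1, d2 = n_test(7, 12.4)
print(f"delta1={d1:.3f}, delta2={d2:.3f}")
# a small delta2 would indicate the forecast significantly over-predicts
```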

  4. Implementation of short-term prediction

    Energy Technology Data Exchange (ETDEWEB)

Landberg, L.; Joensen, A.; Giebel, G. [and others]

    1999-03-01

This paper gives a general overview of the results from an EU JOULE funded project (`Implementing short-term prediction at utilities`, JOR3-CT95-0008). References are given to specialised papers where applicable. The goal of the project was to implement wind farm power output prediction systems in operational environments at a number of utilities in Europe. Two models were developed, one by Risoe and one by the Technical University of Denmark (DTU). Both prediction models used HIRLAM predictions from the Danish Meteorological Institute (DMI). (au) EFP-94; EU-JOULE. 11 refs.

  5. CSEP-Japan: The Japanese node of the collaboratory for the study of earthquake predictability

    Science.gov (United States)

    Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

    2011-12-01

The Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project on earthquake predictability research. The final goal of this project is to search for the intrinsic predictability of the earthquake rupture process through forecast testing experiments. The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, called CSEP-Japan. This testing center provides open access for researchers contributing earthquake forecast models applicable to Japan. A total of 91 earthquake forecast models were submitted to the prospective experiment starting from 1 November 2009. The models are separated into 4 testing classes (1 day, 3 months, 1 year and 3 years) and 3 testing regions: an area covering Japan including offshore areas, the Japanese mainland, and the Kanto district. We evaluate the performance of the models with the official suite of tests defined by CSEP. The experiments of the 1-day, 3-month, 1-year and 3-year forecasting classes have been implemented for 92 rounds, 4 rounds, 1 round and 0 rounds (now in progress), respectively. The results of the 3-month class gave us new knowledge concerning statistical forecasting models. All models showed good performance for magnitude forecasting. On the other hand, the observed spatial distribution is hardly consistent with most models in some cases where many earthquakes occurred at the same spot. Throughout the experiment, it has become clear that some of CSEP's evaluation tests, such as the L-test, show strong correlation with the N-test. We are now developing our own (cyber-)infrastructure to support the forecast experiment as follows. (1) Japanese seismicity has changed since the 2011 Tohoku earthquake. The 3rd call for forecasting models was announced in order to promote model improvement for forecasting earthquakes after this event. We therefore provide the Japanese seismicity catalog maintained by JMA for modelers to study how seismicity

  6. A Hybrid Ground-Motion Prediction Equation for Earthquakes in Western Alberta

    Science.gov (United States)

    Spriggs, N.; Yenier, E.; Law, A.; Moores, A. O.

    2015-12-01

Estimation of ground-motion amplitudes that may be produced by future earthquakes constitutes the foundation of seismic hazard assessment and earthquake-resistant structural design. This is typically done by using a prediction equation that quantifies amplitudes as a function of key seismological variables such as magnitude, distance and site condition. In this study, we develop a hybrid empirical prediction equation for earthquakes in western Alberta, where evaluation of seismic hazard associated with induced seismicity is of particular interest. We use peak ground motions and response spectra from recorded seismic events to model the regional source and attenuation attributes. The available empirical data are limited in the magnitude range of engineering interest (M>4). Therefore, we combine empirical data with a simulation-based model in order to obtain seismologically informed predictions for moderate-to-large magnitude events. The methodology is two-fold. First, we investigate the shape of geometrical spreading in Alberta. We supplement the seismic data with ground motions obtained from mining/quarry blasts, in order to gain insights into the regional attenuation over a wide distance range. A comparison of ground-motion amplitudes for earthquakes and mining/quarry blasts shows that both event types decay at similar rates with distance and demonstrate a significant Moho-bounce effect. In the second stage, we calibrate the source and attenuation parameters of a simulation-based prediction equation to match the available amplitude data from seismic events. We model the geometrical spreading using a trilinear function with attenuation rates obtained from the first stage, and calculate coefficients of anelastic attenuation and site amplification via regression analysis. This provides a hybrid ground-motion prediction equation that is calibrated for observed motions in western Alberta and is applicable to moderate-to-large magnitude events.
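
    The trilinear geometrical spreading described above can be written as a piecewise log-linear function of distance. A minimal sketch with placeholder hinge distances and decay rates (the calibrated Alberta values are not reproduced here):

```python
import numpy as np

# Sketch of a trilinear geometrical-spreading function; hinge distances and
# decay rates below are illustrative placeholders, not calibrated values.

def trilinear_spreading(R, r1=70.0, r2=140.0, b1=-1.3, b2=0.2, b3=-0.5):
    """log10 geometric attenuation vs. distance R (km): decay ~ R^b1 out to
    r1, ~R^b2 (Moho-bounce flattening) to r2, and ~R^b3 beyond, continuous
    at the hinges."""
    R = np.asarray(R, dtype=float)
    g = np.where(R <= r1, b1 * np.log10(R),
        np.where(R <= r2,
                 b1 * np.log10(r1) + b2 * np.log10(R / r1),
                 b1 * np.log10(r1) + b2 * np.log10(r2 / r1)
                 + b3 * np.log10(R / r2)))
    return g

print(trilinear_spreading([10.0, 100.0, 300.0]))
```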

  7. Earthquake prediction in California using regression algorithms and cloud-based big data infrastructure

    Science.gov (United States)

    Asencio-Cortés, G.; Morales-Esteban, A.; Shang, X.; Martínez-Álvarez, F.

    2018-06-01

Earthquake magnitude prediction is a challenging problem that has been widely studied during the last decades. Statistical, geophysical and machine learning approaches can be found in the literature, with no particularly satisfactory results. In recent years, powerful computational techniques to analyze big data have emerged, making the analysis of massive datasets possible. These new methods make use of physical resources like cloud-based architectures. California is known for being one of the regions with the highest seismic activity in the world, and many data are available. In this work, the use of several regression algorithms combined with ensemble learning is explored in the context of big data (a 1 GB catalog is used), in order to predict earthquake magnitude within the next seven days. The Apache Spark framework, the H2O library in R, and Amazon cloud infrastructure were used, reporting very promising results.
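
    A minimal stand-in for this workflow, using scikit-learn in place of the paper's Apache Spark / H2O stack; the features and data below are synthetic placeholders, not the paper's catalog.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import train_test_split

# Illustrative sketch: ensemble regressors trained on seismicity features to
# predict the maximum magnitude expected in the next seven days.

rng = np.random.default_rng(42)
n = 5000
X = np.column_stack([
    rng.uniform(2.5, 5.5, n),   # e.g. recent mean magnitude
    rng.uniform(0.6, 1.4, n),   # e.g. local b-value estimate
    rng.poisson(20, n),         # e.g. weekly event count
])
y = 0.8 * X[:, 0] - 1.5 * (X[:, 1] - 1.0) + 0.01 * X[:, 2] \
    + rng.normal(0, 0.3, n)     # synthetic "next-week max magnitude"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = [RandomForestRegressor(n_estimators=200, random_state=0),
          GradientBoostingRegressor(random_state=0)]
preds = [m.fit(X_tr, y_tr).predict(X_te) for m in models]
ensemble = np.mean(preds, axis=0)          # simple averaging ensemble
print("RMSE:", np.sqrt(np.mean((ensemble - y_te) ** 2)))
```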

  8. Inelastic spectra to predict period elongation of structures under earthquake loading

    DEFF Research Database (Denmark)

    Katsanos, Evangelos; Sextos, A.G.

    2015-01-01

Period lengthening, exhibited by structures when subjected to strong ground motions, constitutes an implicit proxy of structural inelasticity and associated damage. However, reliable prediction of the inelastic period is a tedious and multi-parametric task, related to both epistemic … for period lengthening as a function of Ry and Tel. These equations may be used in the framework of earthquake record selection and scaling.

  9. Comparison of Ground Motion Prediction Equations (GMPE) for Chile and Canada With Recent Chilean Megathust Earthquakes

    Science.gov (United States)

    Herrera, C.; Cassidy, J. F.; Dosso, S. E.

    2017-12-01

Ground shaking assessment allows quantification of the hazards associated with the occurrence of earthquakes. Chile and western Canada are two areas that have experienced, and are susceptible to, large crustal, in-slab and megathrust earthquakes that can affect the population significantly. In this context, we compare the GMPEs used in the 2015 National Building Code of Canada and the most recent GMPEs calculated for Chile with observed accelerations generated by four recent Chilean megathrust earthquakes (MW ≥ 7.7) of the past decade; this is essential to quantify how well current models predict observations of major events. We collected 3-component waveform data from more than 90 stations of the Centro Sismologico Nacional and the Universidad de Chile, and processed them by removing the trend and applying a band-pass filter. For each station we obtained the Peak Ground Acceleration (PGA) and, using damped response spectra, calculated the Pseudo Spectral Acceleration (PSA). Finally, we compared those observations with the most recent Chilean and Canadian GMPEs. Given the lack of geotechnical information for most of the Chilean stations, we also used a new method to obtain VS30 by inverting H/V ratios with a trans-dimensional Bayesian inversion, which allows us to improve the correction of observations for soil conditions. As expected, our results show a good fit between observations and the Chilean GMPEs. However, although the shape of the Canadian GMPEs is coherent with the distribution of observations, they generally underpredict the observed PGA and PSA at shorter periods for most of the considered earthquakes; an example can be seen in the attached figure for the 2014 Iquique earthquake. These results have important implications for the hazards associated with large earthquakes, especially for western Canada, where the probability of a

  10. Impact of Short-term Changes In Earthquake Hazard on Risk In Christchurch, New Zealand

    Science.gov (United States)

    Nyst, M.

    2012-12-01

The recent Mw 7.1, 4 September 2010 Darfield and Mw 6.2, 22 February 2011 Christchurch, New Zealand earthquakes and the following aftershock activity completely changed the existing view of earthquake hazard in the Christchurch area. Not only have several faults been added to the New Zealand fault database; the main shocks were also followed by significant increases in seismicity due to high aftershock activity throughout the Christchurch region, which is still ongoing. Probabilistic seismic hazard assessment (PSHA) models take into account a stochastic event set, the full range of possible events that can cause damage or loss at a particular location. This allows insurance companies to examine their risk profiles via average annual losses (AAL) and loss-exceedance curves. The loss-exceedance curve is derived from the full suite of seismic events that could impact the insured exposure and plots the probability of exceeding a particular loss level over a certain period. Insurers manage their risk by focusing on a certain return-period exceedance benchmark, typically between the 100- and 250-year return-period loss levels, and then reserve the amount of money needed to cover that loss level, their so-called capacity. This component of risk management is not very sensitive to short-term changes in risk due to aftershock seismicity, as it is mostly dominated by longer-return-period, larger-magnitude, more damaging events. However, because secondary uncertainties are taken into account when calculating the exceedance probability, even the longer-return-period losses can still be significantly impacted by the inclusion of time-dependent earthquake behavior. AAL is calculated by summing the product of the expected loss level and the annual rate for all events in the event set that cause damage or loss at a particular location. This relatively simple metric is an important factor in setting annual premiums. By annualizing the expected losses
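
    Both metrics follow directly from the stochastic event set. A minimal sketch with an illustrative four-event set (the rates and losses are placeholders):

```python
import numpy as np

# Sketch of the two risk metrics described above, computed from a stochastic
# event set; event annual rates and losses are illustrative placeholders.

rates = np.array([0.10, 0.02, 0.004, 0.0008])    # events per year
losses = np.array([1e6, 2e7, 1.5e8, 9e8])        # loss per event, $

# Average Annual Loss: sum of (loss x annual rate) over the event set
aal = np.sum(rates * losses)

# loss-exceedance: annual rate (and probability) of exceeding each loss level
order = np.argsort(losses)[::-1]
exceed_rate = np.cumsum(rates[order])            # rate of loss >= level
exceed_prob = 1.0 - np.exp(-exceed_rate)         # Poisson annual probability

for L, r, p in zip(losses[order], exceed_rate, exceed_prob):
    print(f"loss >= {L:9.2e}: rate {r:.4f}/yr, annual prob {p:.4f}")
print(f"AAL = {aal:,.0f}")
# a 250-year return-period loss is the level whose exceedance rate is 1/250
```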

  11. Long-term change of activity of very low-frequency earthquakes in southwest Japan

    Science.gov (United States)

    Baba, S.; Takeo, A.; Obara, K.; Kato, A.; Maeda, T.; Matsuzawa, T.

    2017-12-01

On the plate interface near the seismogenic zone of megathrust earthquakes, various types of slow earthquakes have been detected, including non-volcanic tremor, slow slip events (SSEs) and very low-frequency earthquakes (VLFEs). VLFEs are classified into deep VLFEs, which occur on the downdip side of the seismogenic zone, and shallow VLFEs, which occur on the updip side, i.e., at several kilometers depth in southwest Japan. As members of the slow earthquake family, VLFEs are expected to be a proxy for interplate slip, because they have the same mechanisms as interplate slip and are detected during episodic tremor and slip (ETS). However, long-term changes in VLFE seismicity have not been well constrained compared to deep low-frequency tremor. We thus studied long-term changes in the activity of VLFEs in southwest Japan, where ETS and long-term SSEs have been most intensive. We used continuous seismograms of F-net broadband seismometers operated by NIED from April 2004 to March 2017. After applying a band-pass filter with a frequency range of 0.02-0.05 Hz, we adopted the matched-filter technique to detect VLFEs. We prepared templates by calculating synthetic waveforms for each hypocenter grid point, assuming typical focal mechanisms of VLFEs. The correlation coefficients between templates and continuous F-net seismograms were calculated at each grid point every 1 s in all components; the grid interval is 0.1 degree in both longitude and latitude. A VLFE was declared detected if the average of the correlation coefficients exceeded the threshold, which we defined as eight times the median absolute deviation of the distribution. At grid points in the Bungo channel, where long-term SSEs occur frequently, the cumulative number of detected VLFEs increased rapidly in 2010 and 2014, modulated by stress loading from the long-term SSEs. At inland grid points near the Bungo channel, the cumulative number increases steeply every half year. This stepwise
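
    The detection step, sliding normalized cross-correlation against a template with an 8×MAD threshold, can be sketched as follows. This toy example uses a synthetic template and noise, not F-net data.

```python
import numpy as np

# Toy matched-filter sketch: slide a template over a continuous trace,
# compute the correlation coefficient at every lag, and declare a detection
# where it exceeds eight times the median absolute deviation (MAD).

def sliding_cc(trace, template):
    """Normalized cross-correlation of template against every window."""
    nt = template.size
    t = (template - template.mean()) / (template.std() * nt)
    out = np.empty(trace.size - nt + 1)
    for i in range(out.size):
        w = trace[i:i + nt]
        out[i] = np.sum(t * (w - w.mean())) / (w.std() + 1e-12)
    return out

rng = np.random.default_rng(3)
template = np.sin(2 * np.pi * 0.03 * np.arange(200))  # low-frequency proxy
trace = rng.normal(0, 1.0, 5000)
trace[3000:3200] += 2.0 * template                    # buried "VLFE"

cc = sliding_cc(trace, template)
mad = np.median(np.abs(cc - np.median(cc)))
detections = np.flatnonzero(cc > 8 * mad)             # 8 x MAD threshold
print("detections near:", detections[:5], "... peak at", cc.argmax())
```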

  12. Near-fault earthquake ground motion prediction by a high-performance spectral element numerical code

    International Nuclear Information System (INIS)

    Paolucci, Roberto; Stupazzini, Marco

    2008-01-01

    Near-fault effects have been widely recognised to produce specific features of earthquake ground motion, that cannot be reliably predicted by 1D seismic wave propagation modelling, used as a standard in engineering applications. These features may have a relevant impact on the structural response, especially in the nonlinear range, that is hard to predict and to be put in a design format, due to the scarcity of significant earthquake records and of reliable numerical simulations. In this contribution a pilot study is presented for the evaluation of seismic ground-motions in the near-fault region, based on a high-performance numerical code for 3D seismic wave propagation analyses, including the seismic fault, the wave propagation path and the near-surface geological or topographical irregularity. For this purpose, the software package GeoELSE is adopted, based on the spectral element method. The set-up of the numerical benchmark of 3D ground motion simulation in the valley of Grenoble (French Alps) is chosen to study the effect of the complex interaction between basin geometry and radiation mechanism on the variability of earthquake ground motion

  13. Analysis methods for predicting the behaviour of isolators and formulation of simplified models for use in predicting response of structures to earthquake type input

    International Nuclear Information System (INIS)

    2002-01-01

    This report describes the simplified models for predicting the response of high-damping natural rubber bearings (HDNRB) to earthquake ground motions and benchmark problems for assessing the accuracy of finite element analyses in designing base-isolators. (author)

  14. Earthquake prediction using extinct monogenetic volcanoes: A possible new research strategy

    Science.gov (United States)

    Szakács, Alexandru

    2011-04-01

Volcanoes are extremely effective transmitters of matter, energy and information from the deep Earth towards its surface. Their capacity as information carriers is so far far from fully exploited. Volcanic conduits can generally be viewed as rod-like or sheet-like vertical features of relatively homogeneous composition and structure, crosscutting geological structures of far greater complexity and compositional heterogeneity. Information-carrying signals, such as earthquake precursor signals originating deep below the Earth's surface, are transmitted with much less loss of information through homogeneous, vertically extended structures than through the horizontally segmented, heterogeneous lithosphere or crust. Volcanic conduits can thus be viewed as upside-down "antennas" or waveguides which can serve as privileged pathways for any possible earthquake precursor signal. In particular, the conduits of monogenetic volcanoes are promising transmitters of deep-Earth information to be received and decoded at surface monitoring stations, because their rock fill is expected to be more homogeneous than that of polygenetic volcanoes. Among monogenetic volcanoes, those with dominantly effusive activity appear to be the best candidates for privileged earthquake monitoring sites. In more detail, effusive monogenetic volcanic conduits filled with rocks of primitive parental magma composition, indicating direct ascent from sub-lithospheric magma-generating areas, are the most suitable. Further selection criteria may include the age of the volcanism and the presence of mantle xenoliths in surface volcanic products, indicating a direct and straightforward link between the deep lithospheric mantle and the surface through the conduit. Innovative earthquake prediction research strategies can be developed on these grounds by considering the conduits of selected extinct monogenetic volcanoes and deep trans-crustal fractures as privileged emplacement sites for seismic monitoring stations

  15. State Vector: A New Approach to Prediction of the Failure of Brittle Heterogeneous Media and Large Earthquakes

    Science.gov (United States)

    Yu, Huai-Zhong; Yin, Xiang-Chu; Zhu, Qing-Yong; Yan, Yu-Ding

    2006-12-01

The concept of the state vector stems from statistical physics, where it is usually used to describe activity patterns of a physical field in a coarse-grained manner. In this paper, we propose an approach in which the state vector is applied to describe quantitatively the damage evolution of brittle heterogeneous systems, and some interesting results are presented: prior to the macro-fracture of rock specimens and the occurrence of a strong earthquake, the evolution of four relevant scalar time series derived from the state vectors changed anomalously. As retrospective studies, some prominent large earthquakes that occurred in the Chinese Mainland (e.g., the M 7.4 Haicheng earthquake on February 4, 1975, and the M 7.8 Tangshan earthquake on July 28, 1976) were investigated. The results show considerable promise that the time-dependent state vectors could serve as a kind of precursor for predicting earthquakes.

  16. Prediction of the area affected by earthquake-induced landsliding based on seismological parameters

    Science.gov (United States)

    Marc, Odin; Meunier, Patrick; Hovius, Niels

    2017-07-01

    We present an analytical, seismologically consistent expression for the surface area of the region within which most landslides triggered by an earthquake are located (landslide distribution area). This expression is based on scaling laws relating seismic moment, source depth, and focal mechanism with ground shaking and fault rupture length and assumes a globally constant threshold of acceleration for onset of systematic mass wasting. The seismological assumptions are identical to those recently used to propose a seismologically consistent expression for the total volume and area of landslides triggered by an earthquake. To test the accuracy of the model we gathered geophysical information and estimates of the landslide distribution area for 83 earthquakes. To reduce uncertainties and inconsistencies in the estimation of the landslide distribution area, we propose an objective definition based on the shortest distance from the seismic wave emission line containing 95 % of the total landslide area. Without any empirical calibration the model explains 56 % of the variance in our dataset, and predicts 35 to 49 out of 83 cases within a factor of 2, depending on how we account for uncertainties on the seismic source depth. For most cases with comprehensive landslide inventories we show that our prediction compares well with the smallest region around the fault containing 95 % of the total landslide area. Aspects ignored by the model that could explain the residuals include local variations of the threshold of acceleration and processes modulating the surface ground shaking, such as the distribution of seismic energy release on the fault plane, the dynamic stress drop, and rupture directivity. Nevertheless, its simplicity and first-order accuracy suggest that the model can yield plausible and useful estimates of the landslide distribution area in near-real time, with earthquake parameters issued by standard detection routines.

  17. Real-time 3-D space numerical shake prediction for earthquake early warning

    Science.gov (United States)

    Wang, Tianyun; Jin, Xing; Huang, Yandan; Wei, Yongxiang

    2017-12-01

In earthquake early warning systems, real-time shake prediction through wave propagation simulation is a promising approach. Compared with traditional methods, it does not suffer from inaccurate estimation of source parameters. For computational efficiency, these methods assume that the wavefield propagates on the 2-D surface of the Earth. In fact, since seismic waves propagate in the 3-D body of the Earth, 2-D modeling of wave propagation results in inaccurate wave estimation. In this paper, we propose a 3-D space numerical shake prediction method, which simulates wave propagation in 3-D space using radiative transfer theory and incorporates a data assimilation technique to estimate the distribution of wave energy. The 2011 Tohoku earthquake is studied as an example to show the validity of the proposed model. The 2-D and 3-D space models are compared in this article, and the prediction results show that numerical shake prediction based on the 3-D space model can estimate real-time ground motion precisely, and overprediction is alleviated when the 3-D space model is used.

  18. Intermediate-term middle-range predictions in Italy: a review

    International Nuclear Information System (INIS)

    Peresan, A.; Kossobokov, V.; Romashkova, L.; Panza, G.F.

    2003-11-01

The Italian territory has been the object of several studies devoted to the analysis of seismicity and to earthquake precursor research. Although a number of observations have been claimed to precede large earthquakes, only a few systematic studies have been carried out, and almost no tests of their performance are available up to now. In this paper we review the application to the Italian territory of two formally defined intermediate-term middle-range earthquake prediction algorithms, namely CN and M8S. The general methodology common to the two algorithms makes use of general concepts of pattern recognition that permit dealing with multiple sets of seismic precursors and allow for systematic monitoring of seismicity, as well as for widespread testing of prediction performance. Italy is the only region of moderate seismic activity where the M8S and CN algorithms are applied simultaneously for routine monitoring. Significant efforts have been made to minimize the intrinsic spatial uncertainty of the predictions and the subjectivity in the definition of the areas where precursors should be identified. Several experiments have been dedicated to assessing the robustness of the methodology against the unavoidable uncertainties in the data. With these results acquired, predictions have been routinely issued by the CN algorithm since January 1998 and by the M8S algorithm since January 2002. In July 2003 an experiment was launched for the real-time testing of M8S and CN predictions. (author)

  19. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes.

    Science.gov (United States)

    Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor

    2016-01-01

A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and of the built environment in order to improve communities' preparedness and response capabilities and to mitigate future consequences. An innovative model was formulated, based on a widely used loss estimation model (HAZUS), by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model's algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. The integrated model's outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher proportions of at-risk populations were found to be more vulnerable in this regard. The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience prior to the occurrence of an earthquake could lead to a decrease in the expected number of casualties.
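
    The integration step, entering risk-factor effect measures into the loss model through logistic (log-odds) equations, can be sketched as follows. The baseline probability and odds ratios below are hypothetical, not the study's published estimates.

```python
import numpy as np

# Hedged sketch: human-related risk factors adjust the structural loss
# model's casualty probability via a log-odds shift.

def adjusted_casualty_prob(p_baseline, odds_ratios, exposures):
    """Shift the baseline probability by the summed log-odds of the active
    risk factors (exposures are 0/1 indicators)."""
    logit = np.log(p_baseline / (1.0 - p_baseline))
    logit += np.sum(np.log(odds_ratios) * exposures)
    return 1.0 / (1.0 + np.exp(-logit))

# hypothetical odds ratios: age>65, female, physical disability, low SES
ors = np.array([1.8, 1.4, 2.5, 1.6])
p0 = 0.02                                   # structural-model baseline
print(adjusted_casualty_prob(p0, ors, np.array([1, 0, 1, 0])))
```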

  20. Prediction of accident sequence probabilities in a nuclear power plant due to earthquake events

    International Nuclear Information System (INIS)

    Hudson, J.M.; Collins, J.D.

    1980-01-01

    This paper presents a methodology to predict accident probabilities in nuclear power plants subject to earthquakes. The resulting computer program accesses response data to compute component failure probabilities using fragility functions. Using logical failure definitions for systems, and the calculated component failure probabilities, initiating event and safety system failure probabilities are synthesized. The incorporation of accident sequence expressions allows the calculation of terminal event probabilities. Accident sequences, with their occurrence probabilities, are finally coupled to a specific release category. A unique aspect of the methodology is an analytical procedure for calculating top event probabilities based on the correlated failure of primary events
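
    Component failure probabilities from fragility functions are commonly modeled as lognormal curves. A minimal sketch of that computation, plus a simple series-system combination; note that this sketch assumes independent component failures, whereas the methodology above emphasizes correlated failures. Median capacities and log-standard deviations are illustrative values.

```python
import numpy as np
from scipy.stats import norm

# Sketch of a standard lognormal fragility function for component failure.

def fragility(a, a_median=0.6, beta=0.4):
    """P(component failure | peak ground acceleration a, in g)."""
    return norm.cdf(np.log(a / a_median) / beta)

def system_failure(a, medians, betas):
    """Series system of independent components: fails if any one fails."""
    p_comp = norm.cdf(np.log(a / np.asarray(medians)) / np.asarray(betas))
    return 1.0 - np.prod(1.0 - p_comp)

print(fragility(0.3), system_failure(0.3, [0.6, 0.9, 1.2], [0.4, 0.35, 0.5]))
```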

  1. Prediction of strong ground motion based on scaling law of earthquake

    International Nuclear Information System (INIS)

    Kamae, Katsuhiro; Irikura, Kojiro; Fukuchi, Yasunaga.

    1991-01-01

In order to predict strong ground motion more practically, it is important to study how to use a semi-empirical method when no appropriate observation records of actual small events are available as empirical Green's functions. We propose a prediction procedure that uses artificially simulated small ground motions as a substitute for the actual motions. First, we simulate the small-event motion by means of the stochastic simulation method proposed by Boore (1983), taking into account path effects, such as attenuation and the broadening of the waveform envelope, empirically for the region of interest. Finally, we attempt to predict the strong ground motion due to a future large earthquake (M 7, Δ = 13 km) using the same summation procedure as the empirical Green's function method. The characteristics of the synthetic motion based on the simulated M 5 motion were in good agreement with those obtained by the empirical Green's function method. (author)
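
    The Boore (1983) stochastic method shapes windowed white noise in the frequency domain with an omega-squared source spectrum and attenuation. A compact sketch with illustrative parameters (not the study's values):

```python
import numpy as np

# Boore (1983)-style stochastic simulation sketch of a small-event record:
# enveloped white noise, shaped by an omega-squared acceleration source
# spectrum with path Q and site kappa attenuation. Units are arbitrary.

rng = np.random.default_rng(7)
dt, n = 0.01, 2048
t = np.arange(n) * dt
fc, kappa, Q, R, beta = 2.0, 0.04, 300.0, 13.0, 3.5  # illustrative values

# windowed Gaussian noise with a simple time-domain envelope
env = t * np.exp(-t / 2.0)
noise = rng.normal(0, 1, n) * env / env.max()

f = np.fft.rfftfreq(n, dt)
spec = np.fft.rfft(noise)
source = (2 * np.pi * f) ** 2 / (1.0 + (f / fc) ** 2)  # omega-squared shape
atten = np.exp(-np.pi * f * (kappa + R / (Q * beta)))  # site kappa + path Q
spec *= source * atten
acc = np.fft.irfft(spec, n)            # synthetic acceleration time series

print("peak |acc| (arbitrary units):", np.abs(acc).max())
```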

  2. Earthquake Forecasting Methodology Catalogue - A collection and comparison of the state-of-the-art in earthquake forecasting and prediction methodologies

    Science.gov (United States)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2015-04-01

Earthquake forecasting and prediction has been one of the key struggles of modern geosciences for the last few decades. A large number of approaches for various time periods have been developed for different locations around the world. A categorization and review of more than 20 new and old methods was undertaken to develop a state-of-the-art catalogue of forecasting algorithms and methodologies. The methods have been categorized into time-independent, time-dependent and hybrid methods, where the last group represents methods in which data beyond historical earthquake statistics are used. It is necessary to distinguish in this way between purely statistical approaches, where historical earthquake data represent the only direct data source, and algorithms which incorporate further information, e.g. spatial data on fault distributions, or which incorporate physical models like static triggering to indicate future earthquakes. Furthermore, the location of application has been taken into account to identify methods which can be applied, e.g., in active tectonic regions like California or in less active continental regions. In general, most of the methods cover well-known high-seismicity regions like Italy, Japan or California. Many more elements have been reviewed, including the application of established theories and methods, e.g. for the determination of the completeness magnitude, or whether the modified Omori law was used or not. Target temporal scales are identified, as well as the publication history. All these different aspects have been reviewed and catalogued to provide an easy-to-use tool for the development of earthquake forecasting algorithms and to give an overview of the state of the art.

  3. Hanford grout: predicting long-term performance

    International Nuclear Information System (INIS)

    Sewart, G.H.; Mitchell, D.H.; Treat, R.L.; McMakin, A.H.

    1987-01-01

    Grouted disposal is being planned for the low-level portion of liquid radioactive wastes at the Hanford site in Washington state. The performance of the disposal system must be such that it will protect people and the environment for thousands of years after disposal. To predict whether a specific grout disposal system will comply with existing and foreseen regulations, a performance assessment (PA) is performed. Long-term PAs are conducted for a range of performance conditions. Performance assessment is an inexact science. Quantifying projected impacts is especially difficult when only scant data exist on the behavior of certain components of the disposal system over thousands of years. To develop defensible results, we are honing the models and obtaining experimental data. The combination of engineered features and PA refinements is being used to ensure that Hanford grout will meet its principal goal: to protect people and the environment in the future

4. The ordered network structure of M ≥ 6 strong earthquakes and its prediction in the Jiangsu-South Yellow Sea region

    Energy Technology Data Exchange (ETDEWEB)

    Men, Ke-Pei [Nanjing Univ. of Information Science and Technology (China). College of Mathematics and Statistics; Cui, Lei [California Univ., Santa Barbara, CA (United States). Applied Probability and Statistics Dept.

    2013-05-15

The Jiangsu-South Yellow Sea region is one of the key seismic monitoring and defence areas in eastern China. Since 1846, M ≥ 6 strong earthquakes have shown an obvious commensurability and orderliness in this region. The main orderly values are 74-75 a, 57-58 a, 11-12 a, and 5-6 a, of which 74-75 a and 57-58 a play an outstanding predictive role. According to the information prediction theory of Wen-Bo Weng, we constructed the ordered network structure of M ≥ 6 strong earthquakes for the South Yellow Sea and for the whole region. On this basis, we analyzed and discussed the variation of seismicity in detail and made a trend prediction of M ≥ 6 strong earthquakes in the future. The results show that a new quiet episode began in 1998 and may continue until about 2042; the first M ≥ 6 strong earthquake of the next active episode will probably occur around 2053, most likely in the sea area of the South Yellow Sea; the second and third events, or a strong earthquake swarm, will probably occur around 2058 and 2070, respectively. (orig.)

  5. Prediction of maximum earthquake intensities for the San Francisco Bay region

    Science.gov (United States)

    Borcherdt, Roger D.; Gibbs, James F.

    1975-01-01

The intensity data for the California earthquake of April 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the empirical relation derived between 1906 intensities and distance perpendicular to the fault for 917 sites underlain by rocks of the Franciscan Formation is: Intensity = 2.69 - 1.90 log (Distance) (km). For sites on other geologic units, intensity increments, derived with respect to this empirical relation, correlate strongly with the Average Horizontal Spectral Amplifications (AHSA) determined from 99 three-component recordings of ground motion generated by nuclear explosions in Nevada. The resulting empirical relation is: Intensity Increment = 0.27 + 2.70 log (AHSA), and average intensity increments for the various geologic units are -0.29 for granite, 0.19 for Franciscan Formation, 0.64 for the Great Valley Sequence, 0.82 for Santa Clara Formation, 1.34 for alluvium, 2.43 for bay mud. The maximum intensity map predicted from these empirical relations delineates areas in the San Francisco Bay region of potentially high intensity from future earthquakes on either the San Andreas fault or the Hayward fault.
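
    The two empirical relations quoted above can be applied directly; a minimal sketch (the increments are defined relative to the Franciscan-based distance relation):

```python
import numpy as np

# Direct implementation of the two empirical relations quoted in this record.

GEOLOGIC_INCREMENT = {            # average increments quoted above
    "granite": -0.29, "franciscan": 0.19, "great_valley": 0.64,
    "santa_clara": 0.82, "alluvium": 1.34, "bay_mud": 2.43,
}

def intensity_1906(distance_km):
    """Empirical 1906 relation for Franciscan Formation sites."""
    return 2.69 - 1.90 * np.log10(distance_km)

def intensity_increment(ahsa):
    """Increment from Average Horizontal Spectral Amplification."""
    return 0.27 + 2.70 * np.log10(ahsa)

def predicted_intensity(distance_km, unit="franciscan"):
    return intensity_1906(distance_km) + GEOLOGIC_INCREMENT[unit]

# e.g. difference between bay mud and granite sites at the same distance
print(predicted_intensity(5.0, "bay_mud") - predicted_intensity(5.0, "granite"))
```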

  6. Prediction of maximum earthquake intensities for the San Francisco Bay region

    Energy Technology Data Exchange (ETDEWEB)

    Borcherdt, R.D.; Gibbs, J.F.

    1975-01-01

    The intensity data for the California earthquake of Apr 18, 1906, are strongly dependent on distance from the zone of surface faulting and the geological character of the ground. Considering only those sites (approximately one square city block in size) for which there is good evidence for the degree of ascribed intensity, the empirical relation derived between 1906 intensities and distance perpendicular to the fault for 917 sites underlain by rocks of the Franciscan formation is intensity = 2.69 - 1.90 log (distance) (km). For sites on other geologic units, intensity increments, derived with respect to this empirical relation, correlate strongly with the average horizontal spectral amplifications (AHSA) determined from 99 three-component recordings of ground motion generated by nuclear explosions in Nevada. The resulting empirical relation is intensity increment = 0.27 + 2.70 log (AHSA), and average intensity increments for the various geologic units are -0.29 for granite, 0.19 for Franciscan formation, 0.64 for the Great Valley sequence, 0.82 for Santa Clara formation, 1.34 for alluvium, and 2.43 for bay mud. The maximum intensity map predicted from these empirical relations delineates areas in the San Francisco Bay region of potentially high intensity from future earthquakes on either the San Andreas fault or the Hayward fault.

  7. Moment Magnitudes and Local Magnitudes for Small Earthquakes: Implications for Ground-Motion Prediction and b-values

    Science.gov (United States)

    Baltay, A.; Hanks, T. C.; Vernon, F.

    2016-12-01

We illustrate two essential consequences of the systematic difference between moment magnitude and local magnitude for small earthquakes, illuminating the underlying earthquake physics. Moment magnitude, M = 2/3 log M0 - 10.7, is uniformly valid for all earthquake sizes [Hanks and Kanamori, 1979]. However, the relationship between local magnitude ML and moment is itself magnitude dependent: for moderate events ML and M are roughly equivalent, but for small earthquakes ML scales approximately as 1.5 M, because the recorded ground motions of small events are band-limited by fmax. Just as importantly, if this relation is overlooked, prediction of large-magnitude ground motion from small earthquakes will be misguided. We also consider the effect of this magnitude scale difference on b-value. The oft-cited b-value of 1 should hold for small magnitudes, given M. Use of ML necessitates b = 2/3 for the same data set; use of mixed, or unknown, magnitudes complicates the matter further. This is of particular import when estimating the rate of large earthquakes when one has limited data on their recurrence, as is the case for induced earthquakes in the central US.
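
    A short derivation of the stated b-value consequence, assuming (as the abstract implies) that small-event ML scales as roughly 1.5 M up to an additive constant c:

```latex
\begin{align*}
  N(\geq M) &\propto 10^{-bM}, \qquad b = 1, \\
  M_L &\approx 1.5\,M + c \;\Rightarrow\; M = \frac{M_L - c}{1.5}, \\
  N(\geq M_L) &\propto 10^{-(M_L - c)/1.5} \propto 10^{-\frac{2}{3}M_L},
\end{align*}
% i.e. the same catalogue exhibits an apparent b-value of 2/3 in M_L.
```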

  8. The effects of spatially varying earthquake impacts on mood and anxiety symptom treatments among long-term Christchurch residents following the 2010/11 Canterbury earthquakes, New Zealand.

    Science.gov (United States)

    Hogg, Daniel; Kingham, Simon; Wilson, Thomas M; Ardagh, Michael

    2016-09-01

This study investigates the effects of disruptions to different community environments, community resilience and cumulated felt earthquake intensities on yearly mood and anxiety symptom treatments from the New Zealand Ministry of Health's administrative databases between September 2009 and August 2012. The sample includes 172,284 long-term residents from different Christchurch communities. Living in a better physical environment was associated with lower mood and anxiety treatment rates after the beginning of the Canterbury earthquake sequence whereas an inverse effect could be found for social community environment and community resilience. These results may be confounded by pre-existing patterns, as well as intensified treatment-seeking behaviour and intervention programmes in severely affected areas. Nevertheless, the findings indicate that adverse mental health outcomes can be found in communities with worse physical but stronger social environments or community resilience post-disaster. Also, they do not necessarily follow felt intensities since cumulative earthquake intensity did not show a significant effect.

  9. The potential of continuous, local atomic clock measurements for earthquake prediction and volcanology

    Directory of Open Access Journals (Sweden)

    Bondarescu Mihai

    2015-01-01

Modern optical atomic clocks, along with the optical fiber technology currently being developed, can measure the geoid, the equipotential surface that extends the mean sea level on continents, to a precision that competes with existing technology. In this proceeding, we point out that atomic clocks have the potential not only to map the sea-level surface on continents, but also to monitor variations of the geoid as a function of time with unprecedented timing resolution. The local time series of the geoid has a plethora of applications, including potential improvement in the prediction of earthquakes and volcanic eruptions, and closer monitoring of ground uplift in areas where hydraulic fracturing is performed.

  10. Earthquake Prediction Analysis Based on Empirical Seismic Rate: The M8 Algorithm

    International Nuclear Information System (INIS)

    Molchan, G.; Romashkova, L.

    2010-07-01

The quality of space-time earthquake prediction is usually characterized by a two-dimensional error diagram (n, τ), where n is the rate of failures-to-predict and τ is the normalized measure of space-time alarm. The most reasonable space measure for analysis of a prediction strategy is the rate of target events λ(dg) in a sub-area dg. In that case the quantity H = 1 - (n + τ) determines the prediction capability of the strategy. The uncertainty of λ(dg) causes difficulties in estimating H and the statistical significance, α, of prediction results. We investigate this problem theoretically and show how the uncertainty of the measure can be taken into account in two situations, viz., the estimation of α and the construction of a confidence zone for the (n, τ)-parameters of the random strategies. We use our approach to analyse the results from prediction of M ≥ 8.0 events by the M8 method for the period 1985-2009 (the M8.0+ test). The model of λ(dg) based on the events Mw ≥ 5.5, 1977-2004, and the magnitude range of target events 8.0 ≤ M < 8.5 are considered as basic to this M8 analysis. We find the point and upper estimates of α and show that they are still unstable because the number of target events in the experiment is small. However, our results argue in favour of non-triviality of the M8 prediction algorithm. (author)

  11. Earthquake prediction analysis based on empirical seismic rate: the M8 algorithm

    Science.gov (United States)

    Molchan, G.; Romashkova, L.

    2010-12-01

The quality of space-time earthquake prediction is usually characterized by a 2-D error diagram (n, τ), where n is the fraction of failures-to-predict and τ is the local rate of alarm averaged in space. The most reasonable averaging measure for analysis of a prediction strategy is the normalized rate of target events λ(dg) in a subarea dg. In that case the quantity H = 1 - (n + τ) determines the prediction capability of the strategy. The uncertainty of λ(dg) causes difficulties in estimating H and the statistical significance, α, of prediction results. We investigate this problem theoretically and show how the uncertainty of the measure can be taken into account in two situations, viz., the estimation of α and the construction of a confidence zone for the (n, τ)-parameters of the random strategies. We use our approach to analyse the results from prediction of M ≥ 8.0 events by the M8 method for the period 1985-2009 (the M8.0+ test). The model of λ(dg) based on the events Mw ≥ 5.5, 1977-2004, and the magnitude range of target events 8.0 ≤ M < 8.5 are considered as basic to this M8 analysis. We find the point and upper estimates of α and show that they are still unstable because the number of target events in the experiment is small. However, our results argue in favour of non-triviality of the M8 prediction algorithm.

  12. Constraining the Long-Term Average of Earthquake Recurrence Intervals From Paleo- and Historic Earthquakes by Assimilating Information From Instrumental Seismicity

    Science.gov (United States)

    Zoeller, G.

    2017-12-01

Paleo- and historic earthquakes are the most important source of information for the estimation of long-term recurrence intervals in fault zones, because sequences of paleoearthquakes cover more than one seismic cycle. On the other hand, these events are often rare, dating uncertainties are enormous, and the problem of missing or misinterpreted events creates additional difficulties. Taking these shortcomings into account, long-term recurrence intervals are usually unstable as long as no additional information is included. In the present study, we assume that the time to the next major earthquake depends on the rate of small and intermediate events between the large ones, in terms of a "clock-change" model that leads to a Brownian Passage Time distribution for recurrence intervals. We take advantage of an earlier finding that the aperiodicity of this distribution can be related to the Gutenberg-Richter b-value, which is usually around one and can be estimated easily from instrumental seismicity in the region under consideration. This allows the uncertainties in the estimation of the mean recurrence interval to be reduced significantly, especially for short paleoearthquake sequences and high dating uncertainties. We present illustrative case studies from Southern California and compare the method with the commonly used approach of exponentially distributed recurrence times assuming a stationary Poisson process.
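
    The Brownian Passage Time distribution coincides with the inverse Gaussian distribution, so recurrence intervals can be sampled directly. A minimal sketch with an illustrative mean recurrence and aperiodicity; the paper's b-value-to-aperiodicity mapping is not reproduced here.

```python
import numpy as np
from scipy.stats import invgauss

# Sketch of the recurrence-time model described above: BPT recurrence
# intervals sampled via scipy's inverse Gaussian. Parameters illustrative.

mean_ri = 400.0     # mean recurrence interval, years (illustrative)
alpha = 0.5         # aperiodicity (coefficient of variation)

# BPT(mu, alpha) == inverse Gaussian with mean mu and shape lam = mu/alpha^2
lam = mean_ri / alpha ** 2
samples = invgauss.rvs(mu=mean_ri / lam, scale=lam, size=10000,
                       random_state=0)
print("sample mean:", samples.mean(), "CV:", samples.std() / samples.mean())
```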

  13. Physical bases of the generation of short-term earthquake precursors: A complex model of ionization-induced geophysical processes in the lithosphere-atmosphere-ionosphere-magnetosphere system

    Science.gov (United States)

    Pulinets, S. A.; Ouzounov, D. P.; Karelin, A. V.; Davidenko, D. V.

    2015-07-01

This paper describes the current understanding of the interaction between geospheres through a complex set of physical and chemical processes under the influence of ionization. The sources of ionization include the Earth's natural radioactivity and its intensification before earthquakes in seismically active regions, anthropogenic radioactivity caused by nuclear weapons testing and accidents at nuclear power plants and radioactive waste storage sites, the impact of galactic and solar cosmic rays, and active geophysical experiments using artificial ionization equipment. This approach treats the environment as an open complex system with dissipation, where inherent processes can be considered in the framework of the synergistic approach. We demonstrate the synergy between the evolution of thermal and electromagnetic anomalies in the Earth's atmosphere, ionosphere, and magnetosphere. This makes it possible to determine the direction of the interaction process, which is especially important in applications related to short-term earthquake prediction. That is why the emphasis in this study is on the processes preceding the final stage of earthquake preparation; the effects of other ionization sources are used to demonstrate that the model is versatile and broadly applicable in geophysics.

  14. Long‐term creep rates on the Hayward Fault: evidence for controls on the size and frequency of large earthquakes

    Science.gov (United States)

    Lienkaemper, James J.; McFarland, Forrest S.; Simpson, Robert W.; Bilham, Roger; Ponce, David A.; Boatwright, John; Caskey, S. John

    2012-01-01

    The Hayward fault (HF) in California exhibits large (Mw 6.5–7.1) earthquakes with short recurrence times (161±65 yr), probably kept short by a 26%–78% aseismic release rate (including postseismic). Its interseismic release rate varies locally over time, as we infer from many decades of surface creep data. Earliest estimates of creep rate, primarily from infrequent surveys of offset cultural features, revealed distinct spatial variation in rates along the fault, but no detectable temporal variation. Since the 1989 Mw 6.9 Loma Prieta earthquake (LPE), monitoring on 32 alinement arrays and 5 creepmeters has greatly improved the spatial and temporal resolution of creep rate. We now identify significant temporal variations, mostly associated with local and regional earthquakes. The largest rate change was a 6‐yr cessation of creep along a 5‐km length near the south end of the HF, attributed to a regional stress drop from the LPE, ending in 1996 with a 2‐cm creep event. North of there near Union City starting in 1991, rates apparently increased by 25% above pre‐LPE levels on a 16‐km‐long reach of the fault. Near Oakland in 2007 an Mw 4.2 earthquake initiated a 1–2 cm creep event extending 10–15 km along the fault. Using new better‐constrained long‐term creep rates, we updated earlier estimates of depth to locking along the HF. The locking depths outline a single, ∼50‐km‐long locked or retarded patch with the potential for an Mw∼6.8 event equaling the 1868 HF earthquake. We propose that this inferred patch regulates the size and frequency of large earthquakes on HF.

  15. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1997-01-01

… the arrival of the first earthquake from non-destructive vibration tests or via structural analysis. The previous excitation and displacement response time series is employed for the identification of the instantaneous softening using an ARMA model. The hysteresis parameters are updated after each earthquake. … The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were …

  16. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1994-01-01

… the arrival of the first earthquake from non-destructive vibration tests or via structural analysis. The previous excitation and displacement response time series is employed for the identification of the instantaneous softening using an ARMA model. The hysteresis parameters are updated after each earthquake. … The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were …

  17. Predicted Liquefaction in the Greater Oakland and Northern Santa Clara Valley Areas for a Repeat of the 1868 Hayward Earthquake

    Science.gov (United States)

    Holzer, T. L.; Noce, T. E.; Bennett, M. J.

    2008-12-01

    Probabilities of surface manifestations of liquefaction due to a repeat of the 1868 (M6.7-7.0) earthquake on the southern segment of the Hayward Fault were calculated for two areas along the margin of San Francisco Bay, California: greater Oakland and the northern Santa Clara Valley. Liquefaction is predicted to be more common in the greater Oakland area than in the northern Santa Clara Valley owing to the presence of 57 km2 of susceptible sandy artificial fill. Most of the fills were placed into San Francisco Bay during the first half of the 20th century to build military bases, port facilities, and shoreline communities like Alameda and Bay Farm Island. Probabilities of liquefaction in the area underlain by this sandy artificial fill range from 0.2 to ~0.5 for a M7.0 earthquake, and decrease to 0.1 to ~0.4 for a M6.7 earthquake. In the greater Oakland area, liquefaction probabilities generally are less than 0.05 for Holocene alluvial fan deposits, which underlie most of the remaining flat-lying urban area. In the northern Santa Clara Valley for a M7.0 earthquake on the Hayward Fault and an assumed water-table depth of 1.5 m (the historically shallowest water level), liquefaction probabilities range from 0.1 to 0.2 along Coyote and Guadalupe Creeks, but are less than 0.05 elsewhere. For a M6.7 earthquake, probabilities are greater than 0.1 along Coyote Creek but decrease along Guadalupe Creek to less than 0.1. Areas with high probabilities in the Santa Clara Valley are underlain by latest Holocene alluvial fan levee deposits where liquefaction and lateral spreading occurred during large earthquakes in 1868 and 1906. The liquefaction scenario maps were created with ArcGIS ModelBuilder. Peak ground accelerations first were computed with the new Boore and Atkinson NGA attenuation relation (2008, Earthquake Spectra, 24:1, p. 99-138), using VS30 to account for local site response. Spatial liquefaction probabilities were then estimated using the predicted ground motions
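    The final step above maps a scenario ground motion to a liquefaction probability. A minimal sketch of that idea, assuming a simple logistic curve in PGA with made-up coefficients (the study's actual CPT-calibrated probability curves are not reproduced here):

```python
# Hedged sketch: map a scenario PGA to a liquefaction probability with a
# logistic curve. The coefficients b0, b1 and the PGA values are hypothetical
# placeholders, not the calibrated values used in the study.
import math

def liquefaction_probability(pga_g, b0=-3.0, b1=8.0):
    """Logistic model P = 1 / (1 + exp(-(b0 + b1 * PGA))); coefficients are illustrative."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * pga_g)))

for pga in (0.1, 0.3, 0.5):  # example scenario accelerations in g
    print(f"PGA = {pga:.1f} g -> P(liquefaction) ~ {liquefaction_probability(pga):.2f}")
```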

  18. NGA-West 2 Equations for predicting PGA, PGV, and 5%-Damped PSA for shallow crustal earthquakes

    Science.gov (United States)

    Boore, David M.; Stewart, Jon P.; Seyhan, Emel; Atkinson, Gail M.

    2013-01-01

    We provide ground-motion prediction equations for computing medians and standard deviations of average horizontal component intensity measures (IMs) for shallow crustal earthquakes in active tectonic regions. The equations were derived from a global database with M 3.0–7.9 events. We derived equations for the primary M- and distance-dependence of the IMs after fixing the VS30-based nonlinear site term from a parallel NGA-West 2 study. We then evaluated additional effects using mixed effects residuals analysis, which revealed no trends with source depth over the M range of interest, indistinct Class 1 and 2 event IMs, and basin depth effects that increase and decrease long-period IMs for depths larger and smaller, respectively, than means from regional VS30-depth relations. Our aleatory variability model captures decreasing between-event variability with M, as well as within-event variability that increases or decreases with M depending on period, increases with distance, and decreases for soft sites.
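    The functional form described above, a magnitude term plus a path term plus a VS30-based site term summed in natural-log space, can be sketched as follows; all coefficients below are illustrative placeholders, not the published NGA-West 2 values:

```python
# Illustrative sketch of how a GMPE of this family is evaluated: a magnitude
# term, a geometric-spreading path term, and a linear VS30 site term summed
# in natural-log space. All coefficients are made-up placeholders.
import numpy as np

def ln_median_im(mag, r_jb_km, vs30, e0=-1.0, e1=0.9, c1=-1.2, h=5.0, c_site=-0.5, v_ref=760.0):
    r = np.sqrt(r_jb_km**2 + h**2)          # effective distance with fictitious depth h
    f_source = e0 + e1 * (mag - 6.0)        # magnitude scaling about a reference M6
    f_path = c1 * np.log(r)                 # geometric spreading
    f_site = c_site * np.log(vs30 / v_ref)  # linear site amplification
    return f_source + f_path + f_site       # ln of the median intensity measure

print(np.exp(ln_median_im(mag=7.0, r_jb_km=20.0, vs30=400.0)))  # median IM (arbitrary units)
```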

  19. Short-term wind power prediction

    DEFF Research Database (Denmark)

    Joensen, Alfred K.

    2003-01-01

    The present thesis consists of 10 research papers published during the period 1997-2002 together with a summary report. The objective of the work described in the thesis is to develop models and methods for calculation of high accuracy predictions of wind power generated electricity, and to implement these models and methods in an on-line software application. The economical value of having predictions available is also briefly considered. The summary report outlines the background and motivation for developing wind power prediction models. The meteorological theory which is relevant…

  20. Multiparameter monitoring of short-term earthquake precursors and its physical basis. Implementation in the Kamchatka region

    Directory of Open Access Journals (Sweden)

    Pulinets Sergey

    2016-01-01

    We apply an experimental approach to the multiparameter monitoring of short-term earthquake precursors whose reliability was confirmed by the recently created Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model [1]. A key element of the model is the process of ion-induced nucleation (IIN) and the formation of cluster ions, occurring as a result of the ionization of the near-surface air layer by radon emanating from the Earth's crust within the earthquake preparation zone. This process is similar to the formation of droplet embryos for cloud formation under the action of galactic cosmic rays. Its consequence is the generation of a number of precursors that can be divided into two groups: (a) thermal and meteorological, and (b) electromagnetic and ionospheric. We demonstrate elements of prospective monitoring of some strong earthquakes in the Kamchatka region, together with statistical results for the chemical potential correction parameter over more than 10 years of observations for earthquakes with M ≥ 6. As a further experimental step, monitoring data from Kamchatka volcanoes are also presented.

  1. The East Aegean Sea strong earthquake sequence of October–November 2005: lessons learned for earthquake prediction from foreshocks

    Directory of Open Access Journals (Sweden)

    G. A. Papadopoulos

    2006-01-01

    The seismic sequence of October–November 2005 in the Samos area, East Aegean Sea, was studied with the aim of establishing criteria for (a) the rapid recognition of both the ongoing foreshock activity and the mainshock, and (b) the rapid discrimination between the foreshock and aftershock phases of activity. It has been shown that before the mainshock of 20 October 2005, foreshock activity is not recognizable in the standard earthquake catalogue. However, a detailed examination of the records of the SMG station, which is the closest to the activated area, revealed that hundreds of small shocks not listed in the standard catalogue were recorded in the time interval from 12 October 2005 up to 21 November 2005. The production of reliable relations between seismic signal duration and duration magnitude for earthquakes included in the standard catalogue made it possible to use signal durations in SMG records to determine duration magnitudes for 2054 small shocks not included in the standard catalogue. In this way a new catalogue with magnitude determinations for 3027 events was obtained, while the standard catalogue contains 1025 events. At least 55 of them occurred from 12 October 2005 up to the occurrence of the two strong foreshocks of 17 October 2005. This implies that foreshock activity developed a few days before the strong shocks of 17 October 2005 but escaped recognition by the routine procedure of seismic analysis. The onset of the foreshock phase of activity is recognizable by the significant increase of the mean seismicity rate, which increased exponentially with time. According to the least-squares approach, the b-value of the magnitude-frequency relation dropped significantly during the foreshock activity with respect to the b-value prevailing in the declustered background seismicity. However, the maximum likelihood approach does not indicate such a drop of b. The b-value found for the aftershocks that…
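    The contrast drawn above between least-squares and maximum-likelihood b-value estimates can be reproduced on a synthetic catalogue; the completeness magnitude and catalogue below are assumptions for illustration only:

```python
# Sketch of the two b-value estimators contrasted in the abstract: a
# least-squares fit to the cumulative magnitude-frequency curve versus the
# Aki (1965) maximum-likelihood estimate. Synthetic magnitudes stand in for
# the Samos catalogue.
import numpy as np

rng = np.random.default_rng(0)
m_c = 1.5                                                           # completeness magnitude
mags = m_c + rng.exponential(scale=1 / (1.0 * np.log(10)), size=2000)  # b = 1 synthetic catalogue

# Maximum likelihood (Aki): b = log10(e) / (mean(M) - Mc)
b_ml = np.log10(np.e) / (mags.mean() - m_c)

# Least squares on log10 N(>=M) versus M
edges = np.arange(m_c, mags.max(), 0.1)
n_cum = np.array([(mags >= m).sum() for m in edges])
mask = n_cum > 0
slope, _ = np.polyfit(edges[mask], np.log10(n_cum[mask]), 1)
b_ls = -slope

print(f"b (max. likelihood) = {b_ml:.2f}, b (least squares) = {b_ls:.2f}")
```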

  2. Ground Motion Prediction for Great Interplate Earthquakes in Kanto Basin Considering Variation of Source Parameters

    Science.gov (United States)

    Sekiguchi, H.; Yoshimi, M.; Horikawa, H.

    2011-12-01

    Broadband ground motions are estimated in the Kanto sedimentary basin, which contains the Tokyo metropolitan area, for anticipated great interplate earthquakes along the surrounding plate boundaries. Possible scenarios of great earthquakes along the Sagami trough are modeled by combining characteristic properties of the source area with adequate variation in source parameters, in order to evaluate the possible ground motion variation due to the next Kanto earthquake. South of the rupture area of the 2011 Tohoku earthquake along the Japan trench, we also consider a possible M8 earthquake. The ground motions are computed with a four-step hybrid technique. We first calculate low-frequency ground motions at the engineering basement. We then calculate higher-frequency ground motions at the same position, and combine the lower- and higher-frequency motions using a matched filter. We finally calculate ground motions at the surface by computing the response of the alluvium-diluvium layers to the combined motions at the engineering basement.
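    The matched-filter step of the hybrid technique combines the two frequency bands with complementary low-pass/high-pass filters around a matching frequency. A minimal sketch, assuming a hypothetical 1 Hz matching frequency and placeholder synthetics:

```python
# Sketch of a matched-filter combination: low-frequency and high-frequency
# synthetics for the same site are merged with complementary Butterworth
# filters at a hypothetical 1 Hz matching frequency. The waveforms are
# placeholders, not actual simulation output.
import numpy as np
from scipy.signal import butter, filtfilt

dt = 0.01                                    # 100 samples/s
t = np.arange(0, 40, dt)
lf = np.sin(2 * np.pi * 0.2 * t)             # placeholder low-frequency synthetic
hf = 0.3 * np.random.default_rng(1).standard_normal(t.size)  # placeholder high-frequency synthetic

f_match, nyq = 1.0, 0.5 / dt
b_lo, a_lo = butter(4, f_match / nyq, btype="low")
b_hi, a_hi = butter(4, f_match / nyq, btype="high")

broadband = filtfilt(b_lo, a_lo, lf) + filtfilt(b_hi, a_hi, hf)
print(broadband[:5])
```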

  3. The bayesian probabilistic prediction of the next earthquake in the ometepec segment of the mexican subduction zone

    Science.gov (United States)

    Ferraes, Sergio G.

    1992-06-01

    A predictive equation to estimate the interoccurrence time (τ) of the next earthquake (M ≥ 6) in the Ometepec segment is presented, based on Bayes' theorem and the Gaussian process. Bayes' theorem is used to relate the Gaussian process to both a log-normal distribution of recurrence times (τ) and a log-normal distribution of magnitudes (M) (Nishenko and Buland, 1987; Lomnitz, 1964). We constructed two new random variables X = ln M and Y = ln τ with normal marginal densities, and based on the Gaussian process model we assume that their joint density is normal. Using this information, we determine the Bayesian conditional probability. Finally, a predictive equation is derived, based on the criterion of maximizing the Bayesian conditional probability. The model forecasts the next interoccurrence time, conditional on the magnitude of the last event. Realistic estimates of future damaging earthquakes are based on relocated historical earthquakes. However, at the present time there is a controversy between Nishenko-Singh and González-Ruíz-McNally concerning the rupturing process of the 1907 earthquake. We use our Bayesian analysis to examine and discuss this very important controversy. To clarify the full significance of the analysis, we put forward the results using two catalogues: (1) the Ometepec catalogue without the 1907 earthquake (González-Ruíz-McNally), and (2) the Ometepec catalogue including the 1907 earthquake (Nishenko-Singh). The comparison of the prediction errors reveals that for the Nishenko-Singh catalogue the errors are considerably smaller than the average error for the González-Ruíz-McNally catalogue of relocated events. Finally, using the Nishenko-Singh catalogue, which locates the 1907 event inside the Ometepec segment, we conclude that the next expected damaging earthquake (M ≥ 6.0) will occur approximately within the next time interval τ = 11.82 years from the last event (which occurred on July 2, 1984), or equivalently will…
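    Under the bivariate normal model described above, maximizing the conditional density of Y = ln τ given X = ln M gives the usual conditional-mean predictor. A sketch with hypothetical catalogue statistics (not the Ometepec values):

```python
# Sketch of the Gaussian/Bayes idea in the abstract: with X = ln M and
# Y = ln tau jointly normal, the conditional density of Y given X = x peaks at
# mu_Y + rho*(sigma_Y/sigma_X)*(x - mu_X); the forecast interval is exp of that.
# All catalogue statistics below are hypothetical placeholders.
import math

mu_x, sigma_x = math.log(6.8), 0.10   # mean/std of ln M (placeholders)
mu_y, sigma_y = math.log(12.0), 0.45  # mean/std of ln tau in years (placeholders)
rho = 0.6                             # correlation between ln M and ln tau (placeholder)

def predicted_interval_years(last_magnitude):
    x = math.log(last_magnitude)
    y = mu_y + rho * (sigma_y / sigma_x) * (x - mu_x)  # maximizes the conditional density
    return math.exp(y)

print(f"tau ~ {predicted_interval_years(7.0):.1f} yr after an M 7.0 event")
```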

  4. Long-term effect of early-life stress from earthquake exposure on working memory in adulthood.

    Science.gov (United States)

    Li, Na; Wang, Yumei; Zhao, Xiaochuan; Gao, Yuanyuan; Song, Mei; Yu, Lulu; Wang, Lan; Li, Ning; Chen, Qianqian; Li, Yunpeng; Cai, Jiajia; Wang, Xueyi

    2015-01-01

    The present study aimed to investigate the long-term effect of exposure to the 1976 Tangshan earthquake in early life on working memory performance in adulthood. A total of 907 subjects born and raised in Tangshan were enrolled and divided into three groups according to date of birth: infant exposure (3-12 months, n=274), prenatal exposure (n=269), and no exposure (born at least 1 year after the earthquake, n=364). The prenatal group was further divided into first-, second-, and third-trimester subgroups based on the timing of exposure during pregnancy. The Hopkins Verbal Learning Test-Revised and the Brief Visuospatial Memory Test-Revised (BVMT-R) were used to measure working memory performance. Unconditional logistic regression analysis was used to identify factors influencing impaired working memory. The Hopkins Verbal Learning Test-Revised scores did not differ significantly across the three groups. Compared with the no-exposure group, BVMT-R scores were slightly lower in the prenatal exposure group and markedly lower in the infant exposure group. When the BVMT-R scores were analyzed across the three subgroups, subjects whose mothers were exposed to the earthquake in the second and third trimesters of pregnancy had significantly lower BVMT-R scores than those exposed in the first trimester. Education level and early-life earthquake exposure were identified as independent risk factors for reduced visuospatial memory performance, indicated by lower BVMT-R scores. Infant exposure to earthquake-related stress impairs visuospatial memory in adulthood. Fetuses in the middle and late stages of development are more vulnerable to stress-induced damage that consequently results in impaired visuospatial memory. Education and early-life trauma can also influence working memory performance in adulthood.

  5. Short-term predictions in forex trading

    Science.gov (United States)

    Muriel, A.

    2004-12-01

    Using a kinetic equation that is used to model turbulence (Physica A, 1985-1988, Physica D, 2001-2003), we redefine variables to model the time evolution of the foreign exchange rates of three major currencies. We display live and predicted data for one period of trading in October, 2003.

  6. Antioptimization of earthquake excitation and response

    Directory of Open Access Journals (Sweden)

    G. Zuccaro

    1998-01-01

    The paper presents a novel approach to predicting the response of earthquake-excited structures. The earthquake excitation is expanded in terms of a series of deterministic functions. The coefficients of the series are represented as a point in N-dimensional space. Each available accelerogram at a certain site is then represented as a point in the above space, modeling the available fragmentary historical data. The minimum-volume ellipsoid containing all points is constructed. The ellipsoidal models of uncertainty, pertinent to earthquake excitation, are developed. The maximum response of a structure subjected to the earthquake excitation, within ellipsoidal modeling of the latter, is determined. This procedure of determining the least favorable response was termed in the literature (Elishakoff, 1991) an antioptimization. It appears that under the inherent uncertainty of earthquake excitation, antioptimization analysis is a viable alternative to the stochastic approach.
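    When the response depends linearly on the excitation coefficients, the antioptimization over an ellipsoid has a closed form. A sketch with arbitrary placeholder values:

```python
# Sketch of the antioptimization step: if the structural response depends
# linearly on the excitation coefficients q, r = c.q, and q is only known to
# lie in the ellipsoid (q - q0)^T W (q - q0) <= 1, then the least favorable
# response has the closed form r_max = c.q0 + sqrt(c^T W^-1 c). All values
# below are arbitrary placeholders.
import numpy as np

c = np.array([1.0, -0.5, 2.0])   # response sensitivity to each coefficient
q0 = np.array([0.2, 0.0, -0.1])  # ellipsoid center from historical records
W = np.diag([4.0, 9.0, 1.0])     # ellipsoid shape matrix

r_max = c @ q0 + np.sqrt(c @ np.linalg.solve(W, c))
print(f"least favorable response: {r_max:.3f}")
```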

  7. Long-term geomagnetic changes observed in association with earthquake swarm activities in the Izu Peninsula, Japan

    Energy Technology Data Exchange (ETDEWEB)

    Oshiman, N. [Kyoto University Kyoto (Japan). Disaster Prevention Research Institute; Sasai, Y.; Ishikawa, Y.; Koyama, S. [Tokyo Univ., Tokyo (Japan). Earthquake Research Institute; Honkura, Y. [Tokyo Univ., Tokyo (Japan). Dept. of Earth and Planetary Sciences

    2001-04-01

    Anomalous crustal uplift has continued since 1976 in the Izu Peninsula, Japan. Earthquake swarms have also occurred intermittently off the coast of Ito since 1978. Observations of the total intensity of the geomagnetic field in the peninsula started in 1976 to detect anomalous changes associated with those crustal activities. In particular, a dense continuous observation network using proton magnetometers was established in the northeastern part of the peninsula immediately after the sea-floor eruption off the coast of Ito in 1989. No remarkable swarm activities were observed there from 1990 to 1992. However, after the occurrence of a small swarm in January 1993, five large swarm activities were observed. At some observation sites, a remarkable long-term trend in the total geomagnetic field was observed, associated with the change in the distribution pattern of the seismicity of the earthquake swarms.

  8. Ground water and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ts'ai, T.H.

    1977-11-01

    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water, as well as changes in color, bubbling, gas emission, and noises and geysers, are also often observed before earthquakes. Analysis of these features can help predict earthquakes. Other factors unrelated to earthquakes can cause some of these changes, too. As a first step it is necessary to find sites which are sensitive to changes in ground stress, to be used as sensor points for predicting earthquakes; the necessary features are described. Recording of seismic waves of earthquake aftershocks is also an important part of earthquake prediction.

  9. Long term prediction of unconventional oil production

    International Nuclear Information System (INIS)

    Mohr, S.H.; Evans, G.M.

    2010-01-01

    Although considerable discussion surrounds unconventional oil's ability to mitigate the effects of peaking conventional oil production, very few models of unconventional oil production exist. The aim of this article was to project unconventional oil production to determine how significant its production may be. Two models were developed to predict unconventional oil production, one for in situ production and the other for mining the resources. Unconventional oil production is anticipated to reach between 18 and 32 Gb/y (49-88 Mb/d) in 2076-2084, before declining. If conventional oil production has peaked, then projected unconventional oil production alone cannot offset the decline of conventional oil.

  10. On a report that the 2012 M 6.0 earthquake in Italy was predicted after seeing an unusual cloud formation

    Science.gov (United States)

    Thomas, J.N.; Masci, F; Love, Jeffrey J.

    2015-01-01

    Several recently published reports have suggested that semi-stationary linear-cloud formations might be causally precursory to earthquakes. We examine the report of Guangmeng and Jie (2013), who claim to have predicted the 2012 M 6.0 earthquake in the Po Valley of northern Italy after seeing a satellite photograph (a digital image) showing a linear-cloud formation over the eastern Apennine Mountains of central Italy. From inspection of 4 years of satellite images we find numerous examples of linear-cloud formations over Italy. A simple test shows no obvious statistical relationship between the occurrence of these cloud formations and earthquakes that occurred in and around Italy. All of the linear-cloud formations we have identified in satellite images, including that which Guangmeng and Jie (2013) claim to have used to predict the 2012 earthquake, appear to be orographic – formed by the interaction of moisture-laden wind flowing over mountains. Guangmeng and Jie (2013) have not clearly stated how linear-cloud formations can be used to predict the size, location, and time of an earthquake, and they have not published an account of all of their predictions (including any unsuccessful predictions). We are skeptical of the validity of the claim by Guangmeng and Jie (2013) that they have managed to predict any earthquakes.

  11. Long-term predictions using natural analogues

    International Nuclear Information System (INIS)

    Ewing, R.C.

    1995-01-01

    One of the unique and scientifically most challenging aspects of nuclear waste isolation is the extrapolation of short-term laboratory data (hours to years) to the long time periods (10³-10⁵ years) required by regulatory agencies for performance assessment. The direct validation of these extrapolations is not possible, but methods must be developed to demonstrate compliance with government regulations and to satisfy the lay public that there is a demonstrable and reasonable basis for accepting the long-term extrapolations. Natural systems (e.g., "natural analogues") provide perhaps the only means of partial "validation," as well as data that may be used directly in the models that are used in the extrapolation. Natural systems provide data on very large spatial (nm to km) and temporal (10³-10⁸ years) scales and in highly complex terranes in which unknown synergisms may affect radionuclide migration. This paper reviews the application (and most importantly, the limitations) of data from natural analogue systems to the "validation" of performance assessments.

  12. Design and Optimization of a Telemetric system for application in earthquake prediction

    Science.gov (United States)

    Bogdos, G.; Tassoulas, E.; Vereses, A.; Papapanagiotou, A.; Filippi, K.; Koulouras, G.; Nomicos, C.

    2009-04-01

    The aim of this project is to design a telemetric system able to collect data from a digitizer, transform it into an appropriate form, and transfer it to a central system where on-line data processing takes place. On-line mathematical processing (fractal analysis) of pre-seismic electromagnetic signals and instant display may lead to reliable earthquake prediction methodologies. The core network consists of ad-hoc connections and heterogeneous topologies, with wired and wireless links cooperating for accurate and on-time transmission. The data are highly time-sensitive, so transmission needs to be immediate. All stations are situated in rural locations to avoid electromagnetic interference; this imposes continuous monitoring and the provision of backup data links. The central stations collect the data from every station and store them in a predefined database. Special software is designed to process the incoming data mathematically and display them graphically. The development work included digitizer design, workstation software design, a transmission protocol study with simulation in OPNET, database programming, mathematical data processing, and software development for graphical representation. The complete package was tested under laboratory conditions and then under real conditions. The project is of considerable interest to the scientific community, as the platform could eventually be implemented and installed at large scale across the Greek countryside. The platform is designed so that data mining techniques and mathematical processing can be applied and any extension can be accommodated. The main specialization of this project is that these mechanisms and mathematical transformations can be applied to live data, enabling rapid interpretation of the measured and stored data. The primary intention of this study is to support and streamline the analysis process.

  13. CyberShake-derived ground-motion prediction models for the Los Angeles region with application to earthquake early warning

    Science.gov (United States)

    Bose, Maren; Graves, Robert; Gill, David; Callaghan, Scott; Maechling, Phillip J.

    2014-01-01

    Real-time applications such as earthquake early warning (EEW) typically use empirical ground-motion prediction equations (GMPEs) along with event magnitude and source-to-site distances to estimate expected shaking levels. In this simplified approach, effects due to finite-fault geometry, directivity, and site and basin response are often generalized, which may lead to a significant under- or overestimation of shaking from large earthquakes (M > 6.5) in some locations. For enhanced site-specific ground-motion predictions considering 3-D wave-propagation effects, we develop support vector regression (SVR) models from the SCEC CyberShake low-frequency simulations for more than 415 000 finite-fault rupture scenarios (6.5 ≤ M ≤ 8.5) for southern California defined in UCERF 2.0. We use CyberShake to demonstrate the application of synthetic waveform data to EEW as a ‘proof of concept’, being aware that these simulations are not yet fully validated and might not appropriately sample the range of rupture uncertainty. Our regression models predict the maximum and the temporal evolution of instrumental intensity (MMI) at 71 selected test sites using only the hypocentre, magnitude and rupture ratio, which characterizes uni- and bilateral rupture propagation. Our regression approach is completely data-driven (where here the CyberShake simulations are considered data) and does not enforce pre-defined functional forms or dependencies among input parameters. The models were established from a subset (∼20 per cent) of CyberShake simulations, but can explain MMI values of all >400 k rupture scenarios with a standard deviation of about 0.4 intensity units. We apply our models to determine threshold magnitudes (and warning times) that earthquakes on various active faults in southern California need to exceed to cause at least ‘moderate’, ‘strong’ or ‘very strong’ shaking in the Los Angeles (LA) basin. These thresholds are used to construct a simple and robust EEW algorithm: to…

  14. Development of a technique for long-term detection of precursors of strong earthquakes using high-resolution satellite images

    Science.gov (United States)

    Soto-Pinto, C. A.; Arellano-Baeza, A. A.; Ouzounov, D. P.

    2012-12-01

    Among the variety of processes involved in seismic activity, the principal one is the accumulation and relaxation of stress in the crust, which takes place at depths of tens of kilometers. While the Earth's surface bears at most indirect signs of the accumulation and relaxation of crustal stress, it has long been understood that there is a strong correspondence between the structure of the underlying crust and the landscape. We assume that the structure of the lineaments reflects an internal structure of the Earth's crust, and that variations in lineament number and arrangement reflect changes in the stress patterns related to seismic activity. Contrary to the existing assumption that lineament structure changes only on geological timescales, we have found that the much faster seismic activity strongly affects the system of lineaments extracted from high-resolution multispectral satellite images. Previous studies have shown that accumulation of stress in the crust prior to a strong earthquake is directly related to an increase in the number of lineaments and to a preferential orientation of the lineament configuration in satellite images of epicenter zones. This effect increases with earthquake magnitude and can be observed from approximately one month beforehand. To study this effect in detail we have developed software based on a series of algorithms for automatic detection of lineaments. We found that the Hough transform, applied after discontinuity-detection mechanisms such as the Canny edge detector or directional filters, is the most robust technique for detecting and characterizing changes in the lineament patterns related to strong earthquakes, and it can be used as a robust long-term precursor of earthquakes, indicating regions of strong stress accumulation.
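    The detection chain named above, a Canny edge detector followed by a Hough transform, can be sketched with OpenCV; the image path and all thresholds are placeholders:

```python
# Sketch of automatic lineament extraction: Canny edge detection followed by a
# probabilistic Hough transform. Assumes OpenCV (cv2) is installed; "scene.png"
# is a placeholder path for a satellite image.
import cv2
import numpy as np

gray = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)
if gray is None:
    raise SystemExit("provide a satellite image as scene.png")

edges = cv2.Canny(gray, threshold1=50, threshold2=150)   # discontinuity detection
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=40, maxLineGap=5)  # straight-segment extraction

if lines is not None:
    angles = [np.degrees(np.arctan2(y2 - y1, x2 - x1)) for x1, y1, x2, y2 in lines[:, 0]]
    print(f"{len(lines)} lineaments, mean orientation {np.mean(angles):.1f} deg")
```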

  15. A new ensemble model for short term wind power prediction

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albu, Razvan-Daniel; Felea, Ioan

    2012-01-01

    As the objective of this study, a non-linear ensemble system is used to develop a new model for predicting wind speed on a short-term time scale. Short-term wind power prediction has become an extremely important field of research for the energy sector. Regardless of the recent advancements in research on prediction models, it has been observed that different models have different capabilities and that no single model is suitable under all situations. The idea behind EPS (ensemble prediction systems) is to take advantage of the unique features of each subsystem to detect diverse patterns that exist in the dataset…

  16. Long-Term Effects of the 2011 Japan Earthquake and Tsunami on Incidence of Fatal and Nonfatal Myocardial Infarction.

    Science.gov (United States)

    Nakamura, Motoyuki; Tanaka, Kentarou; Tanaka, Fumitaka; Matsuura, Yuuki; Komi, Ryousuke; Niiyama, Masanobu; Kawakami, Mikio; Koeda, Yorihiko; Sakai, Toshiaki; Onoda, Toshiyuki; Itoh, Tomonori

    2017-08-01

    This study aimed to examine the long-term effects of the 2011 Japan earthquake and tsunami on the incidence of fatal and nonfatal myocardial infarction (MI). In the present study, the incidence of the 2 types of cardiac events was comprehensively recorded. The study area was divided into 2 zones based on the severity of tsunami damage, which was determined by the percentage of the inundated area within the residential area. The incidence of fatal MI correlated with the extent of inundation (r = 0.77), and the tsunami was associated with a continual increase in the incidence of fatal MI among disaster survivors.

  17. Debris-flows scale predictions based on basin spatial parameters calculated from Remote Sensing images in Wenchuan earthquake area

    International Nuclear Information System (INIS)

    Zhang, Huaizhen; Chi, Tianhe; Liu, Tianyue; Wang, Wei; Yang, Lina; Zhao, Yuan; Shao, Jing; Yao, Xiaojing; Fan, Jianrong

    2014-01-01

    Debris flow is a common hazard in the Wenchuan earthquake area. Collapse and Landslide Regions (CLR), caused by earthquakes, can be located from Remote Sensing images. CLR are the direct material source regions for debris flow. The Spatial Distribution of Collapse and Landslide Regions (SDCLR) strongly impacts debris-flow formation. In order to depict SDCLR, we referred to Strahler's hypsometric analysis method and developed 3 functional models to depict SDCLR quantitatively. These models mainly depict SDCLR relative to altitude, the basin mouth, and the main gullies of debris flow. We used the integrals of the functions as the spatial parameters of SDCLR, and these parameters were employed in debris-flow scale predictions. Group-occurring debris flows triggered by the rainstorm of 24 September 2008 in Beichuan County, Sichuan Province, China, were selected to build the empirical equations for debris-flow scale predictions. Given the existing data, only debris-flow runout zone parameters (maximum runout distance L and lateral width B) were estimated in this paper. The results indicate that the predictions were more accurate when the spatial parameters were used. Accordingly, we suggest that spatial parameters of SDCLR should be considered in debris-flow scale prediction, and we propose several strategies to prevent debris flow in the future.
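    The spatial parameters above are integrals of area-altitude-style functions in the spirit of Strahler's hypsometric analysis. A minimal sketch of a hypsometric integral on a synthetic elevation grid, using the standard elevation-relief-ratio estimate:

```python
# Sketch of a Strahler-style hypsometric integral, the kind of normalized
# area-altitude integral the abstract adapts into spatial parameters for the
# collapse/landslide regions (SDCLR). The elevation grid is synthetic; the
# elevation-relief ratio is a standard estimate of the hypsometric integral.
import numpy as np

rng = np.random.default_rng(2)
dem = rng.uniform(800.0, 2400.0, size=(200, 200))  # placeholder basin elevations (m)

hi = (dem.mean() - dem.min()) / (dem.max() - dem.min())  # elevation-relief ratio ~ HI
print(f"hypsometric integral ~ {hi:.2f}")
```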

  18. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of the circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails, however, to comprehensively summarize recent advances in earthquake prediction research.

  19. Seismic-electromagnetic precursors of Romania's Vrancea earthquakes

    International Nuclear Information System (INIS)

    Enescu, B.D.; Enescu, C.; Constantin, A. P.

    1999-01-01

    Diagrams were plotted from electromagnetic data recorded at Muntele Rosu Observatory during December 1996 to January 1997 and December 1997 to September 1998. The times at which Vrancea earthquakes of magnitude M ≥ 3.9 occurred within these periods are marked on the diagrams. The parameters of the earthquakes are given in a table, which also includes information on the magnetic and electric anomalies (perturbations) preceding these earthquakes. The magnetic data show that Vrancea earthquakes are preceded by magnetic perturbations that may be regarded as their short-term precursors. Perturbations that could likewise be seen as short-term precursors of Vrancea earthquakes are also noticed in the electric records. Still, a number of the electric data cast doubt on their precursory nature. Some suggestions are made at the end of the paper on how electromagnetic research should proceed to be of use for Vrancea earthquake prediction. (authors)

  20. GOPET: A tool for automated predictions of Gene Ontology terms

    Directory of Open Access Journals (Sweden)

    Glatting Karl-Heinz

    2006-03-01

    Abstract Background: Vast progress in sequencing projects has called for annotation on a large scale. A number of methods have been developed to address this challenging task. These methods, however, either apply to specific subsets, or their predictions are not formalised, or they do not provide precise confidence values for their predictions. Description: We recently established a learning system for automated annotation, trained with a broad variety of different organisms, to predict the standardised annotation terms from Gene Ontology (GO). Now, this method has been made available to the public via our web service GOPET (Gene Ontology term Prediction and Evaluation Tool). It supplies annotation for sequences of any organism. For each predicted term an appropriate confidence value is provided. The basic method had been developed for predicting molecular-function GO terms; it has now been expanded to predict biological-process terms. The web service is available via http://genius.embnet.dkfz-heidelberg.de/menu/biounit/open-husar Conclusion: Our web service gives experimental researchers as well as the bioinformatics community a valuable sequence annotation device. Additionally, GOPET also provides less significant annotation data which may serve as an extended discovery platform for the user.

  1. Predicting the long-term citation impact of recent publications

    NARCIS (Netherlands)

    Stegehuis, Clara; Litvak, Nelli; Waltman, Ludo

    2015-01-01

    A fundamental problem in citation analysis is the prediction of the long-term citation impact of recent publications. We propose a model to predict a probability distribution for the future number of citations of a publication. Two predictors are used: The impact factor of the journal in which a

  2. Predicting the long-term citation impact of recent publications

    NARCIS (Netherlands)

    Stegehuis, Clara; Litvak, Nelli; Waltman, Ludo

    A fundamental problem in citation analysis is the prediction of the long-term citation impact of recent publications. We propose a model to predict a probability distribution for the future number of citations of a publication. Two predictors are used: the impact factor of the journal in which a

  3. VLF/LF Radio Sounding of Ionospheric Perturbations Associated with Earthquakes

    Directory of Open Access Journals (Sweden)

    Masashi Hayakawa

    2007-07-01

    It has recently been recognized that the ionosphere is very sensitive to seismic effects, and the detection of ionospheric perturbations associated with earthquakes seems to be very promising for short-term earthquake prediction. We have proposed a possible use of VLF/LF (very low frequency, 3-30 kHz / low frequency, 30-300 kHz) radio sounding of the seismo-ionospheric perturbations. A brief history of the use of subionospheric VLF/LF propagation for short-term earthquake prediction is given, followed by a significant finding of ionospheric perturbation for the Kobe earthquake in 1995. After reviewing previous VLF/LF results, we present the latest VLF/LF findings: one is the statistical correlation of ionospheric perturbations with earthquakes, and the second is a case study of the Sumatra earthquake of December 2004, indicating the spatial scale and dynamics of the ionospheric perturbation for this earthquake.

  4. AN EFFECTIVE HYBRID SUPPORT VECTOR REGRESSION WITH CHAOS-EMBEDDED BIOGEOGRAPHY-BASED OPTIMIZATION STRATEGY FOR PREDICTION OF EARTHQUAKE-TRIGGERED SLOPE DEFORMATIONS

    Directory of Open Access Journals (Sweden)

    A. A. Heidari

    2015-12-01

    Earthquakes can pose severe hazards to natural slopes and land infrastructure. One of their chief consequences is landsliding, triggered by sustained shaking. In this research, an efficient procedure is proposed to assist the prediction of earthquake-induced slope displacements (EIDS). A new hybrid SVR-CBBO strategy is implemented to predict the EIDS. For this purpose, first, a chaos paradigm is combined with the initialization of biogeography-based optimization (BBO) to enhance the diversification and intensification capacity of the conventional BBO optimizer. Then, the chaotic BBO is used as the search scheme to find the best values of the SVR parameters. The paper confirms the effectiveness of this new computing approach in the prediction of EIDS. The outcomes affirm that the SVR-BBO strategy with chaos can be employed effectively as a predictive tool for evaluating the EIDS.
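    The hybrid scheme tunes SVR hyperparameters with a chaos-embedded BBO search. A rough sketch of the same idea using scikit-learn's SVR with a randomized search standing in for the BBO optimizer (which scikit-learn does not provide); the data are synthetic:

```python
# Sketch of SVR hyperparameter tuning. A randomized search substitutes for the
# paper's chaos-embedded biogeography-based optimizer; features and targets are
# synthetic placeholders, not real slope-displacement data.
import numpy as np
from scipy.stats import loguniform
from sklearn.model_selection import RandomizedSearchCV
from sklearn.svm import SVR

rng = np.random.default_rng(3)
X = rng.uniform(0, 1, size=(200, 4))  # placeholder seismic/slope features
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.1 * rng.standard_normal(200)

search = RandomizedSearchCV(
    SVR(kernel="rbf"),
    param_distributions={"C": loguniform(1e-1, 1e3),
                         "gamma": loguniform(1e-3, 1e1),
                         "epsilon": loguniform(1e-3, 1e0)},
    n_iter=30, cv=5, random_state=0)
search.fit(X, y)
print(search.best_params_, f"R^2 = {search.best_score_:.3f}")
```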

  5. Parallel Earthquake Simulations on Large-Scale Multicore Supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-01-01

    Earthquakes are among the most destructive natural hazards on Earth. Huge earthquakes striking offshore may cause devastating tsunamis, as evidenced by the 11 March 2011 Japan (moment magnitude Mw 9.0) and the 26 December 2004 Sumatra (Mw 9.1) earthquakes. Earthquake prediction (in terms of the precise time, place, and magnitude of a coming earthquake) is arguably unfeasible in the foreseeable future. To mitigate seismic hazards from future earthquakes in earthquake-prone areas such as California and Japan, scientists have been using numerical simulations to study earthquake rupture propagation along faults and seismic wave propagation in the surrounding media on ever-advancing modern computers over the past several decades. In particular, ground motion simulations for past and future (possible) significant earthquakes have been performed to understand factors that affect ground shaking in populated areas, and to provide ground shaking characteristics and synthetic seismograms for emergency preparation and design of earthquake-resistant structures. These simulation results can guide the development of more rational seismic provisions, leading to safer, more efficient, and more economical structures in earthquake-prone regions.

  6. Unusual Animal Behavior Preceding the 2011 Earthquake off the Pacific Coast of Tohoku, Japan: A Way to Predict the Approach of Large Earthquakes

    Directory of Open Access Journals (Sweden)

    Hiroyuki Yamauchi

    2014-04-01

    Unusual animal behaviors (UABs) have been observed before large earthquakes (EQs); however, their mechanisms are unclear. While information on UABs has been gathered after many EQs, few studies have focused on the proportion of UABs that emerged or on specific behaviors prior to EQs. On 11 March 2011, an EQ (Mw 9.0) occurred in Japan, which left about twenty thousand people dead or missing. We surveyed UABs of pets preceding this EQ using a questionnaire. Additionally, we explored whether dairy cow milk yields varied before this EQ at particular locations. In the results, 236 of 1,259 dog owners and 115 of 703 cat owners observed UABs in their pets, with restless behavior being the most prominent change in both species. Most UABs occurred within one day of the EQ. The UABs showed a precursory relationship with epicentral distance. Interestingly, cow milk yields at a milking facility within 340 km of the epicenter decreased significantly about one week before the EQ, whereas cows at facilities farther away showed no significant decreases. Since both the pets' behavior and the dairy cows' milk yields were affected prior to the EQ, with careful observation they could contribute to EQ predictions.

  7. Implications of next generation attenuation ground motion prediction equations for site coefficients used in earthquake resistant design

    Science.gov (United States)

    Borcherdt, Roger D.

    2014-01-01

    Proposals are developed to update Tables 11.4-1 and 11.4-2 of Minimum Design Loads for Buildings and Other Structures, published as American Society of Civil Engineers Structural Engineering Institute standard 7-10 (ASCE/SEI 7-10). The updates are mean next generation attenuation (NGA) site coefficients inferred directly from the four NGA ground motion prediction equations used to derive the maximum considered earthquake response maps adopted in ASCE/SEI 7-10. Proposals include the recommendation to use straight-line interpolation to infer site coefficients at intermediate values of VS30 (average shear velocity to 30-m depth). The NGA coefficients are shown to agree well with adopted site coefficients at low levels of input motion (0.1 g) and with those observed from the Loma Prieta earthquake. For higher levels of input motion, the majority of the adopted values are within the 95% epistemic-uncertainty limits implied by the NGA estimates, with the exceptions being the mid-period site coefficient, Fv, for site class D and the short-period coefficient, Fa, for site class C, both of which are slightly less than the corresponding 95% limit. The NGA database shows that a median VS30 value of 913 m/s for site class B is more typical than 760 m/s as a value to characterize firm to hard rock sites as the uniform ground condition for future maximum considered earthquake response ground motion estimates. Future updates of NGA ground motion prediction equations can be incorporated easily into future adjustments of adopted site coefficients using the procedures presented herein.

  8. Long-term change of site response after the Mw 9.0 Tohoku earthquake in Japan

    Science.gov (United States)

    Wu, Chunquan; Peng, Zhigang

    2012-12-01

    The recent Mw 9.0 off the Pacific coast of Tohoku earthquake is the largest recorded earthquake in Japan's history. The Tohoku main shock and its aftershocks generated widespread strong shaking, as large as ~3000 Gal, along the east coast of Japan. Wu and Peng (2011) found a clear drop of resonance frequency of up to 70% during the Tohoku main shock at 6 sites, and a correlation of resonance (peak) frequency with peak ground acceleration (PGA) during the main shock. Here we follow that study and systematically analyze long-term changes of material properties in the shallow crust from one year before to 5 months after the Tohoku main shock, using seismic data recorded by the Japanese strong-motion network KiK-net. We use sliding-window spectral ratios computed from pairs of surface and borehole stations to track the temporal changes in the site response of the 6 sites. Our results show two stages of logarithmic recovery after a sharp drop of resonance frequency during the Tohoku main shock: a rapid recovery within several hundred seconds to several hours, and a slow recovery over more than five months. We also investigate whether the damage caused by the Tohoku main shock could make the near-surface layers more susceptible to further damage, but we do not observe clear changes in susceptibility before and after the Tohoku main shock.
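    The sliding-window surface/borehole spectral ratio used above can be sketched as follows; the records are synthetic placeholders, and in practice the spectra would be smoothed before picking the peak (resonance) frequency:

```python
# Sketch of the sliding-window spectral-ratio measurement: for each window,
# the ratio of surface to borehole amplitude spectra is computed and the peak
# frequency tracked through time. Both records here are synthetic noise.
import numpy as np

dt, n = 0.01, 60000
rng = np.random.default_rng(4)
surface = rng.standard_normal(n)   # placeholder surface record
borehole = rng.standard_normal(n)  # placeholder borehole record

win = 2048
freqs = np.fft.rfftfreq(win, dt)
for start in range(0, n - win + 1, 10 * win):
    s = np.abs(np.fft.rfft(surface[start:start + win]))
    b = np.abs(np.fft.rfft(borehole[start:start + win]))
    ratio = s / (b + 1e-12)                    # surface/borehole spectral ratio
    f_peak = freqs[np.argmax(ratio[1:]) + 1]   # skip the DC bin
    print(f"t = {start * dt:7.1f} s  peak ~ {f_peak:.2f} Hz")
```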

  9. The Effects of a Short-term Cognitive Behavioral Group Intervention on Bam Earthquake Related PTSD Symptoms in Adolescents

    Directory of Open Access Journals (Sweden)

    Fatemeh Naderi

    2009-04-01

    Full Text Available "n "n "nObjective :Post traumatic stress disorder (PTSD may be the first reaction after disasters. Many studies have shown the efficacy of cognitive- behavioral therapy in treatment of post traumatic stress disorder. The main objective of this study is to evaluate the effect of group CBT in adolescent survivors of a large scale disaster (Bam earthquake. "n "nMethods: In a controlled trial, we evaluated the efficacy of a short term method of group cognitive-behavioral therapy in adolescent survivors of Bam earthquake who had PTSD symptoms and compared it with a control group. The adolescents who had severe PTSD or other psychiatric disorders that needed pharmacological interventions were excluded. We evaluated PTSD symptoms using Post traumatic Stress Scale (PSS pre and post intervention and compared them with a control group. "n "nResults: 100 adolescents were included in the study and 15 were excluded during the intervention. The mean age of the participants was 14.6±2.1 years. The mean score of total PTSD symptoms and the symptoms of avoidance was reduced after interventions, and was statistically significant. The mean change of re-experience and hyper arousal symptoms of PTSD were not significant. "n "nConclusion: Psychological debriefing and group cognitive behavioral therapy may be effective in reducing some of the PTSD symptoms.

  10. Experimental evidence on formation of imminent and short-term hydrochemical precursors for earthquakes

    International Nuclear Information System (INIS)

    Du Jianguo; Amita, Kazuhiro; Ohsawa, Shinji; Zhang Youlian; Kang Chunli; Yamada, Makoto

    2010-01-01

    The formation of imminent hydrochemical precursors of earthquakes is investigated through simulation of water-rock reactions in a brittle aquifer. Sixty-one soaking experiments were carried out with granodiorite and trachyandesite grains of different sizes and three chemically distinct waters, for 6 to 168 h. The experimental data demonstrate that water-rock reaction can result in both measurable increases and decreases of ion concentrations over short times, and that the extent of the hydrochemical variations is controlled by the grain size, by dissolution and secondary mineral precipitation, and by the chemistry of the rock and groundwater. The results indicate that water-rock reactions in brittle aquifers and aquitards may be an important genetic mechanism of hydrochemical seismic precursors when the aquifers and aquitards are fractured in response to tectonic stress.

  11. Human cervicovaginal fluid biomarkers to predict term and preterm labor

    Science.gov (United States)

    Heng, Yujing J.; Liong, Stella; Permezel, Michael; Rice, Gregory E.; Di Quinzio, Megan K. W.; Georgiou, Harry M.

    2015-01-01

    Preterm birth (PTB; birth before 37 completed weeks of gestation) remains the major cause of neonatal morbidity and mortality. The current generation of biomarkers predictive of PTB has limited utility. In pregnancy, the human cervicovaginal fluid (CVF) proteome is a reflection of the local biochemical milieu and is influenced by the physical changes occurring in the vagina, cervix and adjacent overlying fetal membranes. Term and preterm labor (PTL) share common pathways of cervical ripening, myometrial activation and fetal membrane rupture leading to birth. We therefore hypothesize that CVF biomarkers predictive of labor may be similar in both the term and preterm labor setting. In this review, we summarize some of the existing published literature as well as our team's breadth of work utilizing the CVF for the discovery and validation of putative CVF biomarkers predictive of human labor. Our team established an efficient method for collecting serial CVF samples for optimal 2-dimensional gel electrophoresis resolution and analysis. We first embarked on CVF biomarker discovery for the prediction of spontaneous onset of term labor using 2D-electrophoresis and solution array multiple analyte profiling. 2D-electrophoretic analyses were subsequently performed on CVF samples associated with PTB. Several proteins have been successfully validated and demonstrate that these biomarkers are associated with term and PTL and may be predictive of both term and PTL. In addition, the measurement of these putative biomarkers was found to be robust to the influences of vaginal microflora and/or semen. The future development of a multiple-biomarker bedside test would help improve the prediction of PTB and the clinical management of patients. PMID:26029118

  12. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    Science.gov (United States)

    Kanamori, H

    1996-04-30

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.

  13. LONG-TERM CULTURAL IMPACTS OF DISASTER DECISION-MAKING: The Case of Post Earthquake Reconstruction in Marathwada, India

    Directory of Open Access Journals (Sweden)

    Rohit Jigyasu

    2013-11-01

    Emergency situations are special in that they present decision makers with a context characterized by extraordinary constraints on resources, the need for urgent action, and a critical psychosocial state markedly different from the normal situation. However, actions taken under these extraordinary conditions can have a profound bearing on the long-term recovery of a community and its heritage. This paper considers the critical aspects of decision-making in emergency situations that bear on the sustainable long-term recovery of cultural heritage. It is difficult, however, to judge these essential considerations beforehand without evaluating the impacts of the decisions in hindsight. They are illustrated here through a case study of post-earthquake reconstruction in Marathwada, India, assessing the long-term impact of rehabilitation policies formulated in the immediate aftermath of the earthquake. Patterns of adaptation and change in these areas demonstrate how small decisions taken during an emergency can have wider socio-economic and physical implications. The cases also show the importance of understanding the local context, especially local vulnerabilities as well as capacities, skills, and resources, while making decisions. They likewise emphasize the necessity, and the ways, of engaging various stakeholders, especially the local community, not as passive recipients but as important actors in the decision-making process. These considerations are significant for conservation professionals making decisions during emergencies, especially with regard to immediate protection, repairs, and the long-term recovery of cultural heritage, while we largely remain at the periphery of the reconstruction process.

  14. Long-term seismic observations along the Myanmar-Sunda subduction margin: insights into the 2004 Mw > 9.0 earthquake

    Science.gov (United States)

    Khan, Prosanta Kumar; Banerjee, Jayashree; Shamim, Sk; Mohanty, Manoranjan

    2018-03-01

    The present study investigates the temporal variation of a few seismic parameters along the Myanmar (Zone I), Andaman-Nicobar-Northwest Sumatra (Zone II), Southeast Sumatra-West Indonesia (Zone III) and East Indonesia (Zone IV) converging boundaries, with reference to the generation of the 26 December 2004 Mw > 9.0 off-Sumatra mega-earthquake. The four segments are distinguished based on tectonic parameters, distinct geological settings, great earthquake occurrences, and the characteristics of the Wadati-Benioff zone. Two important seismic parameters, seismic energy release and b value, are computed over 6-month time windows for these segments during the entire 1976-2013 period. The b values show a constant decrease in Zones II, III, and IV, whereas Zone I does not show any such pattern prior to the 2004 mega-event. The release of seismic energy was also gradually decreasing in Zones II and III until the 2004 event, and a similar, weaker pattern was noted in Zone IV. These observations may indicate that stress accumulation was dominant near the Sumatra-Java area, located toward the southeast of Zone II and the northwest of Zone III. The strain energy released during the 2004 event subsequently migrated northward, rupturing 1300 km of the boundary between Northwest Sumatra and the North Andaman. The occurrence of the 2004 mega-event was apparently concealed behind the long-term seismic quiescence near the Sumatra and Nicobar margin. A systematic study of the patterns of seismic energy release and b values, together with long-term observation of the collective behaviour of the margin tectonics, might have given clues to the possibility of the 2004 mega-event.

  15. Scalable data-driven short-term traffic prediction

    NARCIS (Netherlands)

    Friso, K.; Wismans, L. J.J.; Tijink, M. B.

    2017-01-01

    Short-term traffic prediction has a lot of potential for traffic management. However, most research has traditionally focused either on traffic models, which do not scale well computationally to large networks, or on data-driven methods for freeways, leaving out urban arterials completely. Urban

  16. The Iquique earthquake sequence of April 2014: Bayesian modeling accounting for prediction uncertainty

    Science.gov (United States)

    Duputel, Zacharie; Jiang, Junle; Jolivet, Romain; Simons, Mark; Rivera, Luis; Ampuero, Jean-Paul; Riel, Bryan; Owen, Susan E; Moore, Angelyn W; Samsonov, Sergey V; Ortega Culaciati, Francisco; Minson, Sarah E.

    2016-01-01

    The subduction zone in northern Chile is a well-identified seismic gap that last ruptured in 1877. On 1 April 2014, this region was struck by a large earthquake following a two-week-long series of foreshocks. This study combines a wide range of observations, including geodetic, tsunami, and seismic data, to produce a reliable kinematic slip model of the Mw = 8.1 main shock and a static slip model of the Mw = 7.7 aftershock. We use a novel Bayesian modeling approach that accounts for uncertainty in the Green's functions, both static and dynamic, while avoiding nonphysical regularization. The results reveal a sharp slip zone, more compact than previously thought, located downdip of the foreshock sequence and updip of high-frequency sources inferred by back-projection analysis. Neither the main shock nor the Mw = 7.7 aftershock ruptured to the trench, and most of the seismic gap was left unbroken, leaving open the possibility of a future large earthquake in the region.

  17. Controls on the long term earthquake behavior of an intraplate fault revealed by U-Th and stable isotope analyses of syntectonic calcite veins

    Science.gov (United States)

    Williams, Randolph; Goodwin, Laurel; Sharp, Warren; Mozley, Peter

    2017-04-01

    U-Th dates on calcite precipitated in coseismic extension fractures in the Loma Blanca normal fault zone, Rio Grande rift, NM, USA, constrain earthquake recurrence intervals from 150-565 ka. This is the longest direct record of seismicity documented for a fault in any tectonic environment. Combined U-Th and stable isotope analyses of these calcite veins define 13 distinct earthquake events. These data show that for more than 400 ka the Loma Blanca fault produced earthquakes with a mean recurrence interval of 40 ± 7 ka. The coefficient of variation for these events is 0.40, indicating strongly periodic seismicity consistent with a time-dependent model of earthquake recurrence. Stochastic statistical analyses further validate the inference that earthquake behavior on the Loma Blanca was time-dependent. The time-dependent nature of these earthquakes suggests that the seismic cycle was fundamentally controlled by a stress renewal process. However, this periodic cycle was punctuated by an episode of clustered seismicity at 430 ka. Recurrence intervals within the earthquake cluster were as low as 5-11 ka. Breccia veins formed during this episode exhibit carbon isotope signatures consistent with having formed through pronounced degassing of a CO2 charged brine during post-failure, fault-localized fluid migration. The 40 ka periodicity of the long-term earthquake record of the Loma Blanca fault is similar in magnitude to recurrence intervals documented through paleoseismic studies of other normal faults in the Rio Grande rift and Basin and Range Province. We propose that it represents a background rate of failure in intraplate extension. The short-term, clustered seismicity that occurred on the fault records an interruption of the stress renewal process, likely by elevated fluid pressure in deeper structural levels of the fault, consistent with fault-valve behavior. The relationship between recurrence interval and inferred fluid degassing suggests that pore fluid pressure
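    The recurrence statistics quoted above (mean interval and coefficient of variation) follow directly from the dated events. A sketch with illustrative event ages, not the published U-Th dates:

```python
# Sketch of the recurrence statistics: from a set of dated events, compute the
# mean recurrence interval and its coefficient of variation (COV = std/mean;
# a low COV implies quasi-periodic, time-dependent behavior). The event ages
# below are illustrative placeholders.
import numpy as np

event_ages_ka = np.array([565, 520, 480, 445, 400, 360, 325, 285, 245, 205, 165, 150])
intervals = -np.diff(event_ages_ka)  # ages decrease toward the present
mean_ri = intervals.mean()
cov = intervals.std(ddof=1) / mean_ri

print(f"mean recurrence ~ {mean_ri:.0f} ka, COV ~ {cov:.2f}")
```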

  18. Short-term volcano-tectonic earthquake forecasts based on a moving mean recurrence time algorithm: the El Hierro seismo-volcanic crisis experience

    Science.gov (United States)

    García, Alicia; De la Cruz-Reyna, Servando; Marrero, José M.; Ortiz, Ramón

    2016-05-01

    Under certain conditions, volcano-tectonic (VT) earthquakes may pose significant hazards to people living in or near active volcanic regions, especially on volcanic islands; however, hazard arising from VT activity caused by localized volcanic sources is rarely addressed in the literature. The evolution of VT earthquakes resulting from a magmatic intrusion shows some orderly behaviour that may allow the occurrence and magnitude of major events to be forecast. Thus, governmental decision makers can be supplied with warnings of the increased probability of larger-magnitude earthquakes on the short-term timescale. We present here a methodology for forecasting the occurrence of large-magnitude VT events during volcanic crises; it is based on a mean recurrence time (MRT) algorithm that translates Gutenberg-Richter distribution parameter fluctuations into time windows of increased probability of a major VT earthquake. The MRT forecasting algorithm was developed after observing a repetitive pattern in the seismic swarm episodes occurring between July and November 2011 at El Hierro (Canary Islands). Since then, this methodology has been applied to successive seismic crises registered at El Hierro, achieving a high success rate in the real-time forecasting of volcano-tectonic earthquakes within 10-day time windows.
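
    The abstract describes the MRT algorithm only verbally; the published formulation should be consulted for details. As a rough, hypothetical illustration of the underlying idea (the window length, alarm threshold, and synthetic catalog below are all assumptions), the following Python sketch estimates the Gutenberg-Richter b-value in a sliding window with Aki's maximum-likelihood formula and flags time windows where the implied mean recurrence time of a target-magnitude event drops below a threshold:

```python
import numpy as np

def b_value_mle(m, m_c, dm=0.1):
    """Aki (1965) maximum-likelihood b-value with Utsu's binning correction."""
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

def mrt_alarms(times, mags, m_c, m_target, window=100, mrt_threshold=10.0):
    """Slide a fixed-count window over a catalog; in each window fit the
    Gutenberg-Richter relation and flag an alarm when the implied mean
    recurrence time of an event >= m_target falls below mrt_threshold (days)."""
    alarms = []
    for i in range(window, len(mags) + 1):
        m = mags[i - window:i]
        span_days = times[i - 1] - times[i - window]
        b = b_value_mle(m, m_c)
        a = np.log10(window / span_days) + b * m_c   # daily a-value
        mrt = 10.0 ** (b * m_target - a)             # days between M >= m_target
        if mrt < mrt_threshold:
            alarms.append(times[i - 1])
    return alarms

# synthetic swarm: 600 events over 40 days with G-R magnitudes above M 1.5
rng = np.random.default_rng(0)
times = np.sort(rng.uniform(0.0, 40.0, 600))
mags = 1.5 + rng.exponential(0.45, 600)
print(f"{len(mrt_alarms(times, mags, m_c=1.5, m_target=4.0))} alarm windows flagged")
```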

  19. Measuring the effectiveness of earthquake forecasting in insurance strategies

    Science.gov (United States)

    Mignan, A.; Muir-Wood, R.

    2009-04-01

    Given the difficulty of judging whether the skill of a particular methodology of earthquake forecasts is offset by the inevitable false alarms and missed predictions, it is important to find a means to weigh the successes and failures according to a common currency. Rather than judge subjectively the relative costs and benefits of predictions, we develop a simple method to determine if the use of earthquake forecasts can increase the profitability of active financial risk management strategies employed in standard insurance procedures. Three types of risk management transactions are employed: (1) insurance underwriting, (2) reinsurance purchasing and (3) investment in CAT bonds. For each case premiums are collected based on modelled technical risk costs and losses are modelled for the portfolio in force at the time of the earthquake. A set of predetermined actions follows from the announcement of any change in earthquake hazard, so that, for each earthquake forecaster, the financial performance of an active risk management strategy can be compared with the equivalent passive strategy in which no notice is taken of earthquake forecasts. Overall performance can be tracked through time to determine which strategy gives the best long-term financial performance. This will be determined by whether the skill in forecasting the location and timing of a significant earthquake (where loss is avoided) is outweighed by false predictions (when no premium is collected). This methodology is to be tested in California, where catastrophe modeling is reasonably mature and where a number of researchers issue earthquake forecasts.
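
    As a toy illustration of the comparison the authors propose (the event probability, premium, loss, and forecaster hit and false-alarm rates below are all invented for the sketch), the following Python snippet compares the cumulative profit of a passive underwriting strategy with an active one that suspends underwriting whenever an alarm is issued:

```python
import numpy as np

rng = np.random.default_rng(1)
n_years = 10_000
p_quake = 0.02             # assumed annual probability of a damaging event
premium, loss = 1.0, 40.0  # assumed premium income and event loss per year

quakes = rng.random(n_years) < p_quake
# imperfect forecaster: alarm before 30% of events, 5% false-alarm rate
alarms = np.where(quakes,
                  rng.random(n_years) < 0.30,
                  rng.random(n_years) < 0.05)

# passive strategy: always underwrite; active: stand aside during alarms
passive = np.where(quakes, premium - loss, premium)
active = np.where(alarms, 0.0, passive)
print(f"passive total profit: {passive.sum():9.1f}")
print(f"active  total profit: {active.sum():9.1f}")
# the forecasts add value only if avoided losses outweigh forgone premiums
```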

  20. Summary of the GK15 ground‐motion prediction equation for horizontal PGA and 5% damped PSA from shallow crustal continental earthquakes

    Science.gov (United States)

    Graizer, Vladimir; Kalkan, Erol

    2016-01-01

    We present a revised ground‐motion prediction equation (GMPE) for computing medians and standard deviations of peak ground acceleration (PGA) and 5% damped pseudospectral acceleration (PSA) response ordinates of the horizontal component of randomly oriented ground motions to be used for seismic‐hazard analyses and engineering applications. This GMPE is derived from the expanded Next Generation Attenuation (NGA)‐West 1 database (see Data and Resources; Chiou et al., 2008). The revised model includes an anelastic attenuation term as a function of quality factor (Q0) to capture regional differences in far‐source (beyond 150 km) attenuation, and a new frequency‐dependent sedimentary‐basin scaling term as a function of depth to the 1.5  km/s shear‐wave velocity isosurface to improve ground‐motion predictions at sites located on deep sedimentary basins. The new Graizer–Kalkan 2015 (GK15) model, developed to be simple, is applicable for the western United States and other similar shallow crustal continental regions in active tectonic environments for earthquakes with moment magnitudes (M) 5.0–8.0, distances 0–250 km, average shear‐wave velocities in the upper 30 m (VS30) 200–1300  m/s, and spectral periods (T) 0.01–5 s. Our aleatory variability model captures interevent (between‐event) variability, which decreases with magnitude and increases with distance. The mixed‐effect residuals analysis reveals that the GK15 has no trend with respect to the independent predictor parameters. Compared to our 2007–2009 GMPE, the PGA values are very similar, whereas spectral ordinates predicted are larger at T<0.2  s and they are smaller at longer periods.
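
    The full GK15 functional form and coefficients are given in the paper and are not reproduced here; the sketch below is only a generic GMPE skeleton in Python, with placeholder coefficients, showing how the ingredients named above (magnitude scaling, geometric spreading, a Q0-dependent far-source anelastic term, and a VS30 site term) typically combine in log space:

```python
import numpy as np

def ln_pga_schematic(M, R_km, vs30, q0,
                     c1=-5.2, c2=1.0, c3=0.003, c4=-0.5, v_ref=760.0):
    """Schematic GMPE: ln(PGA in g) built from magnitude scaling, geometric
    spreading with near-source saturation, a Q0-scaled anelastic term that
    acts only beyond 150 km, and linear VS30 site scaling.
    All coefficients are placeholders, not the published GK15 values."""
    magnitude = c1 + c2 * M
    spreading = -np.log(np.sqrt(R_km**2 + 1.0))
    anelastic = -c3 * (150.0 / q0) * np.maximum(R_km - 150.0, 0.0)
    site = c4 * np.log(vs30 / v_ref)
    return magnitude + spreading + anelastic + site

# hypothetical scenario: M 6.5 at 30 km, stiff-soil site with VS30 = 400 m/s
print(f"median PGA ~ {np.exp(ln_pga_schematic(6.5, 30.0, 400.0, 150.0)):.3f} g")
```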

  1. Improving Transit Predictions of Known Exoplanets with TERMS

    Directory of Open Access Journals (Sweden)

    Mahadevan S.

    2011-02-01

    Transiting planet discoveries have largely been restricted to the short-period or low-periastron distance regimes due to the bias inherent in the geometric transit probability. Through the refinement of planetary orbital parameters, and hence the reduction of transit-window size, long-period planets become feasible targets for photometric follow-up. Here we describe the TERMS project that is monitoring these host stars at predicted transit times.

  2. Differential maps, difference maps, interpolated maps, and long term prediction

    International Nuclear Information System (INIS)

    Talman, R.

    1988-06-01

    Mapping techniques may be thought to be attractive for the long term prediction of motion in accelerators, especially because a simple map can approximately represent an arbitrarily complicated lattice. The intention of this paper is to develop prejudices as to the validity of such methods by applying them to a simple, exactly solvable example. It is shown that a numerical interpolation map, such as can be generated in the accelerator tracking program TEAPOT, predicts the evolution more accurately than an analytically derived differential map of the same order. Even so, in the presence of "appreciable" nonlinearity, it is shown to be impractical to achieve "accurate" prediction beyond some hundreds of cycles of oscillation. This suggests that the value of nonlinear maps is restricted to the parameterization of only the "leading" deviation from linearity. 41 refs., 6 figs.
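
    As a concrete, much simplified illustration of the paper's comparison (the map, its coefficients, and the initial conditions below are invented; the paper's own example is the exactly solvable lattice it describes), the Python sketch tracks a particle through an exactly known nonlinear one-turn map and through a low-order approximation of it, showing the prediction error growing over hundreds of turns:

```python
import numpy as np

PHI, K = 0.7, 0.8   # assumed phase advance per turn and kick strength

def exact_map(x, p):
    """'True' one-turn map: linear rotation followed by a quadratic kick."""
    c, s = np.cos(PHI), np.sin(PHI)
    x, p = c * x + s * p, -s * x + c * p
    return x, p + K * x**2

def low_order_map(x, p):
    """Approximate map of lower order: the quadratic kick is dropped,
    mimicking a truncated differential map."""
    c, s = np.cos(PHI), np.sin(PHI)
    return c * x + s * p, -s * x + c * p

x_t, p_t = 0.1, 0.0   # tracked with the exact map
x_a, p_a = 0.1, 0.0   # tracked with the approximation
for turn in range(1, 501):
    x_t, p_t = exact_map(x_t, p_t)
    x_a, p_a = low_order_map(x_a, p_a)
    if turn % 100 == 0:
        print(f"turn {turn:3d}: |dx| = {abs(x_t - x_a):.3e}")
# the approximation stays close for tens of turns but drifts over hundreds,
# echoing the paper's point that maps capture only the leading nonlinearity
```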

  3. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru.

    Science.gov (United States)

    Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan

    2015-01-01

    Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were obtained from Demographic and Health Surveys. Earthquake data were obtained from the U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake-affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case: the 8.4 magnitude earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS-based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.

  5. Earthquakes and economic growth

    OpenAIRE

    Fisker, Peter Simonsen

    2012-01-01

    This study explores the economic consequences of earthquakes. In particular, it investigates how exposure to earthquakes affects economic growth both across and within countries. The key result of the empirical analysis is that while there are no observable effects at the country level, earthquake exposure significantly decreases 5-year economic growth at the local level. Areas at lower stages of economic development suffer more in terms of economic growth than richer areas. In addition,...

  6. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the study of (1) earthquakes; (2) the origin, propagation, and energy of seismic phenomena; (3) the prediction of these phenomena; and (4) the structure of the earth. Earthquake engineering or engineering seismology includes the…

  7. Long-term associative learning predicts verbal short-term memory performance.

    Science.gov (United States)

    Jones, Gary; Macken, Bill

    2018-02-01

    Studies using tests such as digit span and nonword repetition have implicated short-term memory across a range of developmental domains. Such tests ostensibly assess specialized processes for the short-term manipulation and maintenance of information that are often argued to enable long-term learning. However, there is considerable evidence for an influence of long-term linguistic learning on performance in short-term memory tasks that brings into question the role of a specialized short-term memory system separate from long-term knowledge. Using natural language corpora, we show experimentally and computationally that performance on three widely used measures of short-term memory (digit span, nonword repetition, and sentence recall) can be predicted from simple associative learning operating on the linguistic environment to which a typical child may have been exposed. The findings support the broad view that short-term verbal memory performance reflects the application of long-term language knowledge to the experimental setting.

  8. Continuous borehole strain and pore pressure in the near field of the 28 September 2004 M 6.0 parkfield, California, earthquake: Implications for nucleation, fault response, earthquake prediction and tremor

    Science.gov (United States)

    Johnston, M.J.S.; Borcherdt, R.D.; Linde, A.T.; Gladwin, M.T.

    2006-01-01

    Near-field observations of high-precision borehole strain and pore pressure show no indication of coherent accelerating strain or pore pressure during the weeks to seconds before the 28 September 2004 M 6.0 Parkfield earthquake. Minor changes in strain rate did occur at a few sites during the last 24 hr before the earthquake, but these changes are neither significant nor have the form expected for strain during slip coalescence initiating fault failure. Seconds before the event, strain is stable at the 10^-11 level. Final prerupture nucleation slip in the hypocentral region is constrained to have a moment less than 2 × 10^12 N·m (M 2.2) and a source size less than 30 m. Ground displacement data indicate similar constraints. Localized rupture nucleation and runaway preclude useful prediction of damaging earthquakes. Coseismic dynamic strains of about 10 microstrain peak-to-peak were superimposed on volumetric strain offsets of about 0.5 microstrain to the northwest of the epicenter and about 0.2 microstrain to the southeast of the epicenter, consistent with right-lateral slip. Observed strain and Global Positioning System (GPS) offsets can be simply fit with 20 cm of slip between 4 and 10 km depth on a 20-km segment of the fault north of Gold Hill (M0 = 7 × 10^17 N·m). Variable slip inversion models using GPS data and seismic data indicate similar moments. Observed postseismic strain is 60% to 300% of the coseismic strain, indicating incomplete release of accumulated strain. No measurable change in fault zone compliance preceding or following the earthquake is indicated by stable earth tidal response. No indications of strain change accompany nonvolcanic tremor events reported prior to and following the earthquake.

  9. Surface rupturing earthquakes repeated in the 300 years along the ISTL active fault system, central Japan

    Science.gov (United States)

    Katsube, Aya; Kondo, Hisao; Kurosawa, Hideki

    2017-06-01

    Surface rupturing earthquakes produced by intraplate active faults generally have long recurrence intervals of a few thousand to tens of thousands of years. We here report the first evidence for an extremely short recurrence interval of 300 years for surface rupturing earthquakes on an intraplate system in Japan. The Kamishiro fault of the Itoigawa-Shizuoka Tectonic Line (ISTL) active fault system generated a Mw 6.2 earthquake in 2014. A paleoseismic trench excavation across the 2014 surface rupture showed evidence for the 2014 event and two prior paleoearthquakes. The slip of the penultimate earthquake was similar to that of the 2014 earthquake, and its timing was constrained to be after A.D. 1645. Judging from the timing, the damaged area, and the amount of slip, the penultimate earthquake most probably corresponds to a historical earthquake in A.D. 1714. The recurrence interval of the two most recent earthquakes is thus extremely short compared with intervals on other active faults known globally. Furthermore, the slip repetition during the last three earthquakes is in accordance with the time-predictable recurrence model rather than the characteristic earthquake model. In addition, the spatial extent of the 2014 surface rupture accords with the distribution of a serpentinite block, suggesting that a relatively low coefficient of friction may account for the unusually frequent earthquakes. These findings would affect long-term forecasts of earthquake probability and seismic hazard assessment on active faults.

  10. The profound reach of the 11 April 2012 M 8.6 Indian Ocean earthquake: Short‐term global triggering followed by a longer‐term global shadow

    Science.gov (United States)

    Pollitz, Fred; Burgmann, Roland; Stein, Ross S.; Sevilgen, Volkan

    2014-01-01

    The 11 April 2012 M 8.6 Indian Ocean earthquake was an unusually large intraoceanic strike‐slip event. For several days, the global M≥4.5 and M≥6.5 seismicity rate at remote distances (i.e., thousands of kilometers from the mainshock) was elevated. The strike‐slip mainshock appears through its Love waves to have triggered a global burst of strike‐slip aftershocks over several days. But the M≥6.5 rate subsequently dropped to zero for the succeeding 95 days, although the M≤6.0 global rate was close to background during this period. Such an extended period without an M≥6.5 event has happened rarely over the past century, and never after a large mainshock. Quiescent periods following previous large (M≥8) mainshocks over the past century are either much shorter or begin so long after a given mainshock that no physical interpretation is warranted. The 2012 mainshock is unique in terms of both the short‐lived global increase and subsequent long quiescent period. We believe that the two components are linked and interpret this pattern as the product of dynamic stressing of a global system of faults. Transient dynamic stresses can encourage short‐term triggering, but, paradoxically, it can also inhibit rupture temporarily until background tectonic loading restores the system to its premainshock stress levels.

  11. Can mine tremors be predicted? Observational studies of earthquake nucleation, triggering and rupture in South African mines

    CSIR Research Space (South Africa)

    Durrheim, RJ

    2012-05-01

    Earthquakes, and the tsunamis and landslides they trigger, pose a serious risk to people living close to plate boundaries, and a lesser but still significant risk to inhabitants of stable continental regions where destructive earthquakes are rare... of experiments that seek to identify reliable precursors of damaging seismic events.

  12. Prediction of long-term behaviour for nuclear waste disposal

    International Nuclear Information System (INIS)

    Shoesmith, D.W.; Ikeda, B.M.; King, F.; Sunder, S.

    1996-09-01

    The modelling procedures developed for the long-term prediction of the corrosion of used fuel and of titanium and copper nuclear waste containers are described. The corrosion behaviour of these materials changes with time as the conditions within the conceptual disposal vault evolve from an early warm, oxidizing phase to an indefinite period of cool, anoxic conditions. For the two candidate container materials, this evolution of conditions means that the containers will be initially susceptible to localized corrosion but that in the long term, corrosion should be more general in nature. The propagation of the pitting of Cu and of the crevice corrosion of Ti alloys is modelled using statistical models. General corrosion processes are modelled deterministically. For the fuel, deterministic electrochemical models have been developed to predict the long-term dissolution rate of UO2. The corrosion behaviour of materials in the disposal vault can be influenced by reengineering the vault environment. For instance, increasing the areal loading of containers will produce higher vault temperatures, resulting in more extensive drying of the porous backfill materials. The initiation of crevice corrosion on Ti may then be delayed, leading to longer container lifetimes. For copper containers, minimizing the amount of O2 initially trapped in the pores of the backfill, or adding reducing agents to consume this O2 faster, will limit the extent of corrosion, permitting a reduction of the container wall thickness necessary for containment. (author). 55 refs., 19 figs.

  13. Toward real-time regional earthquake simulation of Taiwan earthquakes

    Science.gov (United States)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  14. Long-Term Prediction of Satellite Orbit Using Analytical Method

    Directory of Open Access Journals (Sweden)

    Jae-Cheol Yoon

    1997-12-01

    A long-term prediction algorithm for geostationary orbit was developed using an analytical method. The perturbation force models include the geopotential up to degree and order five, luni-solar gravitation, and solar radiation pressure. All of the perturbation effects were analyzed through secular variations, short-period variations, and long-period variations of the equinoctial elements: the semi-major axis, eccentricity vector, inclination vector, and mean longitude of the satellite. Results of the analytical orbit propagator were compared with those of a Cowell orbit propagator for KOREASAT. The comparison indicated that the analytical solution could predict the semi-major axis with an accuracy of better than ~35 meters over a period of 3 months.
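
    The paper's full analytical propagator is not reproduced in the abstract; as a minimal sketch of its "secular variation" ingredient, the Python snippet below evaluates the standard first-order J2 secular drift rates of the node and perigee (the Earth constants are standard; the near-geostationary example orbit is an assumption):

```python
import numpy as np

MU = 398600.4418   # km^3/s^2, Earth's gravitational parameter
RE = 6378.137      # km, Earth's equatorial radius
J2 = 1.08263e-3    # Earth's dominant zonal harmonic

def j2_secular_rates(a_km, e, i_rad):
    """First-order secular drift rates (rad/s) of the right ascension of the
    ascending node and the argument of perigee due to J2 alone."""
    n = np.sqrt(MU / a_km**3)          # mean motion
    p = a_km * (1.0 - e**2)            # semi-latus rectum
    f = 1.5 * n * J2 * (RE / p) ** 2
    node_rate = -f * np.cos(i_rad)
    perigee_rate = 0.5 * f * (5.0 * np.cos(i_rad) ** 2 - 1.0)
    return node_rate, perigee_rate

# hypothetical near-geostationary orbit with a small residual inclination
node_rate, _ = j2_secular_rates(a_km=42164.0, e=1e-4, i_rad=np.radians(0.1))
print(f"node drift: {np.degrees(node_rate) * 86400 * 365.25:+.2f} deg/year")
```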

  15. Probabilistic source term predictions for use with decision support systems

    International Nuclear Information System (INIS)

    Grindon, E.; Kinniburgh, C.G.

    2003-01-01

    Decision Support Systems for use in off-site emergency management, following an incident at a Nuclear Power Plant (NPP) within Europe, are becoming accepted as a useful and appropriate tool to aid decision makers. An area which is not so well developed is the 'upstream' prediction of the source term released into the environment. Rapid prediction of this source term is crucial to the appropriate early management of a nuclear emergency. The initial source term prediction would today typically be based on simple tabulations taking little or no account of plant status. It is the interface between the inward-looking plant control room team and the outward-looking off-site emergency management team that needs to be addressed. This is not an easy proposition, as these two distinct disciplines have little common basis from which to communicate their immediate findings and concerns. Within the Euratom Fifth Framework Programme (FP5), complementary approaches are being developed for the pre-release stage, each based on software tools to help bridge this gap. Traditionally, source terms (or releases into the environment) provided for use with Decision Support Systems are estimated on a deterministic basis. These approaches use a single, deterministic assumption about plant status. The associated source term represents the 'best estimate' based on available information. No information is provided on the potential for uncertainty in the source term estimate. Using probabilistic methods, the outcome is typically a number of possible plant states, each with an associated source term and probability. These represent both the best estimate and the spread of the likely source term. However, this is a novel approach and the usefulness of such source term prediction tools is yet to be tested on a wide scale. The benefits of probabilistic source term estimation are presented here, using as an example the SPRINT tool developed within the FP5 STERPS project. System for the

  16. NGA-West2 equations for predicting vertical-component PGA, PGV, and 5%-damped PSA from shallow crustal earthquakes

    Science.gov (United States)

    Stewart, Jonathan P.; Boore, David M.; Seyhan, Emel; Atkinson, Gail M.

    2016-01-01

    We present ground motion prediction equations (GMPEs) for computing natural log means and standard deviations of vertical-component intensity measures (IMs) for shallow crustal earthquakes in active tectonic regions. The equations were derived from a global database with M 3.0–7.9 events. The functions are similar to those for our horizontal GMPEs. We derive equations for the primary M- and distance-dependence of peak acceleration, peak velocity, and 5%-damped pseudo-spectral accelerations at oscillator periods between 0.01–10 s. We observe pronounced M-dependent geometric spreading and region-dependent anelastic attenuation for high-frequency IMs. We do not observe significant region-dependence in site amplification. Aleatory uncertainty is found to decrease with increasing magnitude; within-event variability is independent of distance. Compared to our horizontal-component GMPEs, attenuation rates are broadly comparable (somewhat slower geometric spreading, faster apparent anelastic attenuation), VS30-scaling is reduced, nonlinear site response is much weaker, within-event variability is comparable, and between-event variability is greater.

  17. Report by the 'Mega-earthquakes and mega-tsunamis' subgroup

    International Nuclear Information System (INIS)

    Friedel, Jacques; Courtillot, Vincent; Dercourt, Jean; Jaupart, Claude; Le Pichon, Xavier; Poirier, Jean-Paul; Salencon, Jean; Tapponnier, Paul; Dautray, Robert; Carpentier, Alain; Taquet, Philippe; Blanchet, Rene; Le Mouel, Jean-Louis; BARD, Pierre-Yves; Bernard, Pascal; Montagner, Jean-Paul; Armijo, Rolando; Shapiro, Nikolai; Tait, Steve; Cara, Michel; Madariaga, Raul; Pecker, Alain; Schindele, Francois; Douglas, John

    2011-06-01

    This report comprises a presentation of scientific data on subduction earthquakes, on tsunamis and on the Tohoku earthquake. It proposes a detailed description of the French situation (in the West Indies, in metropolitan France, and in terms of soil response), and a discussion of social and economic issues (governance, seismic regulation and nuclear safety, para-seismic protection of constructions). The report is completed by other large documents: presentation of data on the Japanese earthquake, discussion of prediction and governance errors in the management of earthquake mitigation in Japan, and discussions of tsunami prevention, of needs for research on accelerometers, and of the seismic risk in France.

  18. Quantifying capability of a local seismic network in terms of locations and focal mechanism solutions of weak earthquakes

    Science.gov (United States)

    Fojtíková, Lucia; Kristeková, Miriam; Málek, Jiří; Sokos, Efthimios; Csicsay, Kristián; Zahradník, Jiří

    2016-01-01

    Extension of permanent seismic networks is usually governed by a number of technical, economic, logistic, and other factors. A planned upgrade of a network can be justified by a theoretical assessment of the network's capability in terms of reliable estimation of the key earthquake parameters (e.g., location and focal mechanisms). Such an assessment can be useful not only for scientific purposes but also as concrete evidence when seeking the funding needed to upgrade and operate the network. Moreover, the theoretical assessment can also identify the configuration beyond which no improvement can be achieved with additional stations, establishing a tradeoff between the improvement and additional expenses. This paper suggests a combination of suitable methods and applies them to the Little Carpathians local seismic network (Slovakia, Central Europe), which monitors an epicentral zone important from the seismic-hazard point of view. Three configurations of the network are considered: 13 stations existing before 2011, 3 stations added in 2011, and 7 new planned stations. Theoretical errors of the relative location are estimated by a new method developed specifically in this paper. The resolvability of focal mechanisms determined by waveform inversion is analyzed by a recent approach based on 6D moment-tensor error ellipsoids. We consider potential seismic events situated anywhere in the studied region, thus enabling "mapping" of the expected errors. Results clearly demonstrate that the network extension remarkably decreases the errors, mainly in the planned 23-station configuration. The three-station extension of the network already made in 2011 allowed for a few real-data examples. Free software made available by the authors enables similar application to any other existing or planned network.
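
    The paper's new relative-location error method is not given in the abstract; the Python sketch below only illustrates the generic linearized ingredient such assessments build on: for a trial source, form the Jacobian of P-wave travel times with respect to hypocenter and origin time in a homogeneous half-space, and take the model covariance of the least-squares solution as an error proxy. The station geometry, velocity, and picking error are assumptions; repeating the calculation over a grid of trial sources yields the kind of error "mapping" described above.

```python
import numpy as np

V_P, PICK_SD = 6.0, 0.05   # assumed P velocity (km/s) and picking error (s)

def location_error_km(source, stations):
    """Error proxy for a linearized least-squares hypocenter location:
    travel times t = |station - source| / V_P + t0, Jacobian G over
    (x, y, z, t0), and the square root of the trace of the spatial part
    of the model covariance PICK_SD^2 * (G^T G)^-1."""
    d = stations - source                  # vectors source -> stations
    r = np.linalg.norm(d, axis=1)
    G = np.column_stack([-d / (r[:, None] * V_P), np.ones(len(stations))])
    cov = PICK_SD**2 * np.linalg.inv(G.T @ G)
    return np.sqrt(np.trace(cov[:3, :3]))

# hypothetical 5-station network (x, y, z in km; z positive down)
stations = np.array([[0.0, 0.0, 0.0], [25.0, 3.0, 0.0], [5.0, 22.0, 0.0],
                     [18.0, 18.0, 0.0], [12.0, -4.0, 0.0]])
for depth in (2.0, 5.0, 10.0):
    src = np.array([10.0, 9.0, depth])
    print(f"depth {depth:4.1f} km: error proxy ~ {location_error_km(src, stations):.2f} km")
# shallow sources show a large proxy, reflecting the familiar weak depth
# constraint of surface-only networks
```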

  19. Discussion of the design of satellite-laser measurement stations in the eastern Mediterranean under the geological aspect. Contribution to the earthquake prediction research by the Wegener Group and to NASA's Crustal Dynamics Project

    Science.gov (United States)

    Paluska, A.; Pavoni, N.

    1983-01-01

    Research conducted for determining the location of stations for measuring crustal dynamics and predicting earthquakes is discussed. Procedural aspects, the extraregional kinematic tendencies, and regional tectonic deformation mechanisms are described.

  20. Human short-term spatial memory: precision predicts capacity.

    Science.gov (United States)

    Banta Lavenex, Pamela; Boujon, Valérie; Ndarugendamwo, Angélique; Lavenex, Pierre

    2015-03-01

    Here, we aimed to determine the capacity of human short-term memory for allocentric spatial information in a real-world setting. Young adults were tested on their ability to learn, on a trial-unique basis, and remember over a 1-min interval the location(s) of 1, 3, 5, or 7 illuminating pads, among 23 pads distributed in a 4 m × 4 m arena surrounded by curtains on three sides. Participants had to walk to and touch the pads with their foot to illuminate the goal locations. In contrast to the predictions from classical slot models of working memory capacity limited to a fixed number of items, i.e., Miller's magical number 7 or Cowan's magical number 4, we found that the number of visited locations to find the goals was consistently about 1.6 times the number of goals, whereas the number of correct choices before erring and the number of errorless trials varied with memory load even when memory load was below the hypothetical memory capacity. In contrast to resource models of visual working memory, we found no evidence that memory resources were evenly distributed among unlimited numbers of items to be remembered. Instead, we found that memory for even one individual location was imprecise, and that memory performance for one location could be used to predict memory performance for multiple locations. Our findings are consistent with a theoretical model suggesting that the precision of the memory for individual locations might determine the capacity of human short-term memory for spatial information.

  1. Earthquake chemical precursors in groundwater: a review

    Science.gov (United States)

    Paudel, Shukra Raj; Banjara, Sushant Prasad; Wagle, Amrita; Freund, Friedemann T.

    2018-03-01

    We review changes in groundwater chemistry as precursory signs for earthquakes. In particular, we discuss pH, total dissolved solids (TDS), electrical conductivity, and dissolved gases in relation to their significance for earthquake prediction or forecasting. These parameters are widely believed to vary in response to seismic and pre-seismic activity. However, the same parameters also vary in response to non-seismic processes. The inability to reliably distinguish changes caused by seismic or pre-seismic activity from changes caused by non-seismic activity has impeded progress in earthquake science. Short-term earthquake prediction is unlikely to be achieved, however, by pH, TDS, electrical conductivity, and dissolved gas measurements alone. On the other hand, the production of free hydroxyl radicals (•OH) and subsequent reactions, such as the formation of H2O2 and the oxidation of As(III) to As(V) in groundwater, have distinctive precursory characteristics. This study deviates from the prevailing mechanical mantra: it addresses earthquake-related non-seismic mechanisms, focusing on the stress-induced electrification of rocks, the generation of positive hole charge carriers and their long-distance propagation through the rock column, and on electrochemical processes at the rock-water interface.

  2. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    A geographic information system (GIS) for the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of damage and the isoseismals of the earthquake are studied. By comparison with a standard earthquake intensity attenuation relationship, anomalous damage distributions of the earthquake are identified, and their relationship with tectonics, site conditions and basins is analyzed. The influence on ground motion of the earthquake source and of underground structures near the source is also studied. Finally, the implications of the anomalous damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction and earthquake emergency response are discussed.

  3. Earthquake cycles and physical modeling of the process leading up to a large earthquake

    Science.gov (United States)

    Ohnaka, Mitiyasu

    2004-08-01

    A thorough discussion is made of what the rational constitutive law for earthquake ruptures ought to be, from the standpoint of the physics of rock friction and fracture and on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology for forecasting large earthquakes, the entire process of one cycle for a typical, large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale-dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step for establishing the methodology for forecasting large earthquakes.
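
    The abstract argues for a slip-dependent constitutive law without giving its functional form; a common minimal member of that class, not necessarily Ohnaka's own formulation, is the linear slip-weakening law sketched below in Python (the parameter values are illustrative):

```python
import numpy as np

def linear_slip_weakening(u, tau_peak, tau_res, d_c):
    """Linear slip-weakening: shear strength drops from tau_peak to tau_res
    as slip u grows from 0 to the critical slip distance d_c, then stays
    at the residual level. A minimal slip-dependent constitutive law."""
    u = np.asarray(u, dtype=float)
    tau = tau_peak - (tau_peak - tau_res) * (u / d_c)
    return np.where(u < d_c, tau, tau_res)

# illustrative parameters: 10 MPa peak, 6 MPa residual, Dc = 0.5 m
slip = np.linspace(0.0, 1.0, 6)
for u, t in zip(slip, linear_slip_weakening(slip, 10.0, 6.0, 0.5)):
    print(f"slip {u:4.2f} m -> strength {t:4.1f} MPa")
# the fracture energy per unit area is the area under the weakening curve:
# G_c = 0.5 * (tau_peak - tau_res) * d_c = 1.0 MJ/m^2 here
```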

  4. Interpretation of changes in water level accompanying fault creep and implications for earthquake prediction.

    Science.gov (United States)

    Wesson, R.L.

    1981-01-01

    Quantitative calculations of the effect of a fault creep event on observations of changes in water level in wells provide an approach to the tectonic interpretation of these phenomena. For the pore pressure field associated with an idealized creep event having an exponential displacement versus time curve, an analytic expression has been obtained in terms of exponential-integral functions. The pore pressure versus time curves for observation points near the fault are pulselike; a sharp pressure increase (or decrease, depending on the direction of propagation) is followed by more gradual decay to the normal level after the creep event. The time function of the water level change may be obtained by applying the filter (derived by A. G. Johnson and others to determine the influence of atmospheric pressure on water level) to the analytic pore pressure versus time curves. The resulting water level curves show a fairly rapid increase (or decrease) and then a very gradual return to normal. The results of this analytic model do not reproduce the steplike changes in water level observed by Johnson and others. If the procedure used to obtain the water level from the pore pressure is correct, these results suggest that steplike changes in water level are not produced by smoothly propagating creep events but by creep events that propagate discontinuously, by changes in the bulk properties of the region around the well, or by some other mechanism. - Author

  5. Assigning probability gain for precursors of four large Chinese earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, T.; Aki, K.

    1983-03-10

    We extend the concept of probability gain associated with a precursor (Aki, 1981) to a set of precursors which may be mutually dependent. Making use of a new formula, we derive a criterion for selecting precursors from a given data set in order to calculate the probability gain. The probabilities per unit time immediately before four large Chinese earthquakes are calculated. They are approximately 0.09, 0.09, 0.07 and 0.08 per day for the 1975 Haicheng (M = 7.3), 1976 Tangshan (M = 7.8), 1976 Longling (M = 7.6), and 1976 Songpan (M = 7.2) earthquakes, respectively. These results are encouraging because they suggest that the investigated precursory phenomena may have included the complete information for earthquake prediction, at least for the above earthquakes. With this method, the step-by-step approach to prediction used in China may be quantified in terms of the probability of earthquake occurrence. The ln P versus t curve (where P is the probability of earthquake occurrence at time t) shows that ln P does not increase linearly with t but more rapidly as the time of the earthquake approaches.
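
    The paper's new formula for mutually dependent precursors is not reproduced in the abstract; the Python sketch below shows only the simpler independent-precursor baseline of Aki (1981), in which the background Poisson rate is multiplied by each precursor's probability gain (the numerical values are illustrative, chosen to land near the ~0.09 per day order reported above):

```python
def prob_per_unit_time(background_rate, gains):
    """Conditional occurrence rate for independent precursors: the background
    Poisson rate times the product of the individual probability gains.
    (The paper generalizes this to mutually dependent precursors.)"""
    rate = background_rate
    for g in gains:
        rate *= g
    return rate

# illustrative numbers: background rate of 1e-4 events/day for the region,
# three observed precursors with assumed gains of 30, 10 and 3
p = prob_per_unit_time(1e-4, [30.0, 10.0, 3.0])
print(f"conditional probability ~ {p:.2f} per day")   # 0.09 per day
```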

  6. Quantifying capability of a local seismic network in terms of locations and focal mechanism solutions of weak earthquakes

    Czech Academy of Sciences Publication Activity Database

    Fojtíková, Lucia; Kristeková, M.; Málek, Jiří; Sokos, E.; Csicsay, K.; Zahradník, J.

    2016-01-01

    Roč. 20, č. 1 (2016), 93-106 ISSN 1383-4649 R&D Projects: GA ČR GAP210/12/2336 Institutional support: RVO:67985891 Keywords: Focal-mechanism uncertainty * Little Carpathians * Relative location uncertainty * Seismic network * Uncertainty mapping * Waveform inversion * Weak earthquakes Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 1.089, year: 2016

  7. Predicting the long term stabilisation of uranium mill tailings

    International Nuclear Information System (INIS)

    Trojacek, J.

    2004-01-01

    The long-term stabilization and remediation of uranium mill tailings ponds is an important task for DIAMO. After uranium mining was stopped, DIAMO has to remediate more than 400 ha of tailings ponds at three locations. It is currently planned to cover the surface with low-permeability materials with a slope of approx. 3% to protect the interior of the disposal facility from infiltrating rainwater. This entails covering the free surface of these ponds with several hundred thousand tons of inert material. As a result of this load, the porewater from the tailings is expelled and the body of the impounded materials consolidates. Consolidation of tailings proceeds irregularly, depending on the internal structure of the tailings layers and on the progress of loading. The surface will need to be recontoured for a long time into the future. The topic of the DIAMO project is to predict and optimise the final surface contour of the tailings pond body, and to determine the time schedule and locations for recontouring work. The K1 tailings pond in Dolni Rozinka (Southern Moravia) is a typical example of such a task. The average thickness of the tailings layer is around 25 m and the average porewater content varies from 25 up to 40%. In the years 1998-99 a PHARE pilot project was undertaken that aimed to predict the quantity and quality of drainage and infiltration waters as a function of time. A new investigation programme (field, laboratory and modelling) has been implemented. The range of material properties and the distribution of types of tailings were established. Preliminary calculations of the tailings consolidation were made for the fine slime zone. The results show that significant subsidence of the surface is to be expected after loading with inert material for the construction of an interim cover. (author)
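
    The abstract does not specify the consolidation model used in the project; a standard starting point for this kind of prediction is Terzaghi's one-dimensional consolidation theory, sketched below in Python with hypothetical values for the layer thickness, drainage path, consolidation coefficient, and ultimate load-induced settlement:

```python
import numpy as np

def degree_of_consolidation(t_years, cv_m2_per_year, drain_path_m, n_terms=100):
    """Terzaghi 1-D consolidation: average degree of consolidation U(t)
    from the classical series solution in the time factor Tv."""
    Tv = cv_m2_per_year * t_years / drain_path_m**2
    m = np.arange(n_terms)
    M = np.pi * (2 * m + 1) / 2.0
    return 1.0 - np.sum((2.0 / M**2) * np.exp(-(M**2) * Tv))

# hypothetical K1-like layer: 25 m thick with single drainage (path 25 m),
# cv = 1 m^2/year, ultimate load-induced settlement of 2 m
for t in (1, 5, 20, 50, 100):
    U = degree_of_consolidation(t, cv_m2_per_year=1.0, drain_path_m=25.0)
    print(f"t = {t:3d} yr: U = {U:6.1%}, settlement ~ {2.0 * U:.2f} m")
# such curves are the basis for scheduling future recontouring work
```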

  8. Prediction of short-term and long-term VOC emissions from SBR bitumen-backed carpet under different temperatures

    NARCIS (Netherlands)

    Yang, X.; Chen, Q.; Bluyssen, P.M.

    1998-01-01

    This paper presents two models for volatile organic compound (VOC) emissions from carpet. One is a numerical model using the computational fluid dynamics (CFD) technique for short-term predictions, the other an analytical model for long-term predictions. The numerical model can (1) deal with

  9. Earthquake precursory events around epicenters and local active faults

    Science.gov (United States)

    Valizadeh Alvan, H.; Mansor, S. B.; Haydari Azad, F.

    2013-05-01

    The chain of underground events which are triggered by seismic activity and physical/chemical interactions prior to a shake in the earth's crust may produce surface and above-surface phenomena. During the past decades many researchers have explored the possibility of short-term earthquake prediction using remote sensing data. Currently, there are several theories about the preparation stages of earthquakes, most of which stress rises in heat and seismic-wave activity as the main signs of an impending earthquake. Their differences lie only in the secondary phenomena which are triggered by these events. In any case, with the recent advances in remote sensing sensors and techniques, we are now able to provide wider, more accurate monitoring of land, ocean and atmosphere. Among all theoretical factors, changes in Surface Latent Heat Flux (SLHF), Sea & Land Surface Temperature (SST & LST) and surface chlorophyll-a are the easiest to record from earth-observing satellites. SLHF is the amount of energy exchange in the form of water vapor between the earth's surface and atmosphere. Abnormal variations in this factor have frequently been reported as an earthquake precursor during the past years. The accumulated stress in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shakes. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. In the case of oceanic earthquakes, higher temperature at the ocean bed may lead to a higher amount of Chl-a on the sea surface. On the other hand, it has also been said that the leak of radon gas which occurs as rocks break during earthquake preparation causes the formation of airborne ions and higher Air Temperature (AT). We have chosen to perform a statistical, long-term, and short-term approach by considering the reoccurrence intervals of past

  10. Methods for prediction of strong earthquake ground motion. Final technical report, October 1, 1976--September 30, 1977

    International Nuclear Information System (INIS)

    Trifunac, M.D.

    1977-09-01

    The purpose of this report is to summarize the results of the work on characterization of strong earthquake ground motion. The objective of this effort has been to initiate the presentation of a simple yet detailed methodology for characterization of strong earthquake ground motion for use in licensing and evaluation of operating Nuclear Power Plants. This report emphasizes the simplicity of the methodology by presenting only the end results in a format that may be useful for the development of site-specific criteria in seismic risk analysis, for work on the development of modern standards and regulatory guides, and for re-evaluation of existing power plant sites.

  11. Relaxation creep model of impending earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Morgounov, V. A. [Russian Academy of Sciences, Institute of Physics of the Earth, Moscow (Russian Federation)

    2001-04-01

    An alternative view of the current status and prospects of seismic prediction studies is discussed. Within the uncertainty relation between the cognoscibility and unpredictability of earthquakes, priority is given to work on short-term earthquake prediction, because the final stage of earthquake nucleation is characterized by a substantial activation of the process: its strain rate increases by orders of magnitude and the signal-to-noise ratio rises considerably. Based on the phenomenon of creep under stress-relaxation conditions, a model is proposed to explain the different signatures of precursors of impending tectonic earthquakes. The onset of tertiary creep appears to correspond to the onset of instability, which inevitably ends in failure unless the system is unloaded. At this stage the process acquires a largely self-regulating character and becomes essentially irreversible, an important component of prediction reliability. In situ data suggest that it is possible in principle to diagnose the preparation process by ground measurements of acoustic and electromagnetic emission in rocks held under constant strain in a state of self-relaxing stress up to the moment of fracture. It was found that electromagnetic emission precedes, but does not accompany, the phase of macrocrack development.

  12. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Directory of Open Access Journals (Sweden)

    Mustafa ULAS

    2013-01-01

    Many people die because of earthquakes every year. It is therefore crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are reviewed in order to guide new research toward novel prediction methods.

  13. Long-term impacts of tropical storms and earthquakes on human population growth in Haiti and the Dominican Republic

    OpenAIRE

    Christian D. Klose; Christian Webersik

    2011-01-01

    Since the 18th century, Haiti and the Dominican Republic have experienced similar natural forces, including earthquakes and tropical storms. These countries are two of the most prone of all Latin American and Caribbean countries to natural hazard events, while Haiti seems to be more vulnerable to natural forces. This article discusses to what extent geohazards have shaped both nations' demographic developments. The data show that neither atmospheric nor seismic forces that directly hit ...

  14. Long-term impacts of tropical storms and earthquakes on human population growth in Haiti and Dominican Republic

    OpenAIRE

    Christian D. Klose; Christian Webersik

    2010-01-01

    The two Caribbean states, Haiti and the Dominican Republic, have experienced similar natural forces since the 18th century, including hurricanes and earthquakes. Although both countries are among the most prone of all Latin American and Caribbean countries to natural hazard events, historically Haiti tends to be more vulnerable to natural forces. The purpose of this article is to understand to what extent geohazards shape demographic changes. Research findings of this study show tha...

  15. Earthquake recurrence models fail when earthquakes fail to reset the stress field

    Science.gov (United States)

    Tormann, Thessa; Wiemer, Stefan; Hardebeck, Jeanne L.

    2012-01-01

    Parkfield's regularly occurring M6 mainshocks, about every 25 years, have for over two decades stoked seismologists' hopes of successfully predicting an earthquake of significant size. However, with the longest known inter-event time of 38 years, the latest M6 in the series (28 Sep 2004) did not conform to any of the applied forecast models, questioning once more the predictability of earthquakes in general. Our study investigates the spatial pattern of b-values along the Parkfield segment through the seismic cycle and documents a stably stressed structure. The forecasted rate of M6 earthquakes based on Parkfield's microseismicity b-values corresponds well to observed rates. We interpret the observed b-value stability in terms of the evolution of the stress field in that area: the M6 Parkfield earthquakes do not fully unload the stress on the fault, explaining why time-recurrent models fail. We present the 1989 M6.9 Loma Prieta earthquake as a counterexample, which did release a significant portion of the stress along its fault segment and yielded a substantial change in b-values.
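
    As a minimal illustration of forecasting the M6 rate from microseismicity (the study's spatially resolved b-value mapping is considerably more involved), the Python sketch below fits Gutenberg-Richter a- and b-values to a synthetic catalog by maximum likelihood and extrapolates the relation to M 6; all catalog parameters are invented:

```python
import numpy as np

def gr_forecast(mags, years_observed, m_c, m_target, dm=0.1):
    """Fit the Gutenberg-Richter relation log10 N(>=M) = a - b*M to a catalog
    (Aki/Utsu maximum-likelihood b-value) and extrapolate the annual rate of
    events >= m_target."""
    m = mags[mags >= m_c]
    b = np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))
    a = np.log10(len(m) / years_observed) + b * m_c   # annual a-value
    return b, 10.0 ** (a - b * m_target)

# synthetic microseismicity: b ~ 0.9, ~200 M>=1.5 events/year over 20 years
rng = np.random.default_rng(2)
mags = 1.5 + rng.exponential(np.log10(np.e) / 0.9, size=4000)
b_hat, rate_m6 = gr_forecast(mags, years_observed=20.0, m_c=1.5, m_target=6.0)
print(f"b = {b_hat:.2f}; M>=6 rate = {rate_m6:.4f}/yr "
      f"(recurrence ~ {1.0 / rate_m6:.0f} yr)")
```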

  16. Statistical aspects and risks of human-caused earthquakes

    Science.gov (United States)

    Klose, C. D.

    2013-12-01

    The seismological community invests ample human capital and financial resources to research and predict risks associated with earthquakes. Industries such as the insurance and re-insurance sector are equally interested in using probabilistic risk models developed by the scientific community to transfer risks. These models are used to predict expected losses due to naturally occurring earthquakes. But what about the risks associated with human-caused earthquakes? Such risk models are largely absent from both industry and academic discourse. In countries around the world, informed citizens are becoming increasingly aware and concerned that this economic bias is not sustainable for long-term economic growth, environmental and human security. Ultimately, citizens look to their government officials to hold industry accountable. In the Netherlands, for example, the hydrocarbon industry is held accountable for causing earthquakes near Groningen. In Switzerland, geothermal power plants were shut down or suspended because they caused earthquakes in the cantons of Basel and St. Gallen. The public and the private non-extractive industry need access to information about earthquake risks in connection with sub/urban geoengineering activities, including natural gas production through fracking, geothermal energy production, carbon sequestration, mining and water irrigation. This presentation illuminates statistical aspects of human-caused earthquakes with respect to different geologic environments. Statistical findings are based on the first catalog of human-caused earthquakes (in Klose 2013). Findings are discussed which include the odds of dying during a medium-size earthquake set off by geomechanical pollution. Any kind of geoengineering activity causes this type of pollution and increases the likelihood of triggering nearby faults to rupture.

  17. The severity of an earthquake

    Science.gov (United States)

    ,

    1997-01-01

    The severity of an earthquake can be expressed in terms of both intensity and magnitude. However, the two terms are quite different, and they are often confused. Intensity is based on the observed effects of ground shaking on people, buildings, and natural features. It varies from place to place within the disturbed region depending on the location of the observer with respect to the earthquake epicenter. Magnitude is related to the amount of seismic energy released at the hypocenter of the earthquake. It is based on the amplitude of the earthquake waves recorded on instruments
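
    The fact sheet itself gives no formula; to make the magnitude side of the distinction quantitative, the Python sketch below applies the classical Gutenberg-Richter energy relation log10 E = 1.5 M + 4.8 (E in joules), added here as standard background:

```python
def radiated_energy_joules(magnitude):
    """Classical Gutenberg-Richter energy relation: log10 E = 1.5*M + 4.8,
    with E the radiated seismic energy in joules."""
    return 10.0 ** (1.5 * magnitude + 4.8)

for m in (4.0, 5.0, 6.0, 7.0):
    print(f"M {m:.1f}: ~{radiated_energy_joules(m):.1e} J")
# each whole magnitude step carries ~32 times more radiated energy, while
# intensity describes only the locally observed effects of the shaking
```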

  18. Short-term changes in arterial inflammation predict long-term changes in atherosclerosis progression

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, Philip [Massachusetts General Hospital and Harvard Medical School, Cardiology Division and Cardiac MR PET CT Program, Boston, MA (United States); McMaster University, Population Health Research Institute, Department of Medicine, and Department of Radiology, Hamilton, ON (Canada); Ishai, Amorina; Tawakol, Ahmed [Massachusetts General Hospital and Harvard Medical School, Cardiology Division and Cardiac MR PET CT Program, Boston, MA (United States); Mani, Venkatesh [Icahn School of Medicine at Mount Sinai School of Medicine, Translational and Molecular Imaging Institute and Department of Radiology, New York, NY (United States); Kallend, David [The Medicines Company, Parsippany, NJ (United States); Rudd, James H.F. [University of Cambridge, Division of Cardiovascular Medicine, Cambridge (United Kingdom); Fayad, Zahi A. [Icahn School of Medicine at Mount Sinai School of Medicine, Translational and Molecular Imaging Institute and Department of Radiology, New York, NY (United States); Icahn School of Medicine at Mount Sinai School of Medicine, Hess CSM Building Floor TMII, Rm S1-104, Translational and Molecular Imaging Institute and Department of Radiology, New York, NY (United States)

    2017-01-15

    It remains unclear whether changes in arterial wall inflammation are associated with subsequent changes in the rate of structural progression of atherosclerosis. In this sub-study of the dal-PLAQUE clinical trial, multi-modal imaging was performed using 18-fludeoxyglucose (FDG) positron emission tomography (PET, at 0 and 6 months) and magnetic resonance imaging (MRI, at 0 and 24 months). The primary objective was to determine whether increasing FDG uptake at 6 months predicted atherosclerosis progression on MRI at 2 years. Arterial inflammation was measured by the carotid FDG target-to-background ratio (TBR), and atherosclerotic plaque progression was defined as the percentage change in carotid mean wall area (MWA) and mean wall thickness (MWT) on MRI between baseline and 24 months. A total of 42 participants were included in this sub-study. The mean age of the population was 62.5 years, and 12 (28.6 %) were women. In participants with (vs. without) any increase in arterial inflammation over 6 months, the long-term changes in both MWT (% change MWT: 17.49 % vs. 1.74 %, p = 0.038) and MWA (% change MWA: 25.50 % vs. 3.59 %, p = 0.027) were significantly greater. Results remained significant after adjusting for clinical and biochemical covariates. Individuals with no increase in arterial inflammation over 6 months had no significant structural progression of atherosclerosis over 24 months as measured by MWT (p = 0.616) or MWA (p = 0.373). Short-term changes in arterial inflammation are associated with long-term structural atherosclerosis progression. These data support the concept that therapies that reduce arterial inflammation may attenuate or halt progression of atherosclerosis. (orig.)

  19. Short-term changes in arterial inflammation predict long-term changes in atherosclerosis progression

    International Nuclear Information System (INIS)

    Joseph, Philip; Ishai, Amorina; Tawakol, Ahmed; Mani, Venkatesh; Kallend, David; Rudd, James H.F.; Fayad, Zahi A.

    2017-01-01

    It remains unclear whether changes in arterial wall inflammation are associated with subsequent changes in the rate of structural progression of atherosclerosis. In this sub-study of the dal-PLAQUE clinical trial, multi-modal imaging was performed using 18-fludeoxyglucose (FDG) positron emission tomography (PET, at 0 and 6 months) and magnetic resonance imaging (MRI, at 0 and 24 months). The primary objective was to determine whether increasing FDG uptake at 6 months predicted atherosclerosis progression on MRI at 2 years. Arterial inflammation was measured by the carotid FDG target-to-background ratio (TBR), and atherosclerotic plaque progression was defined as the percentage change in carotid mean wall area (MWA) and mean wall thickness (MWT) on MRI between baseline and 24 months. A total of 42 participants were included in this sub-study. The mean age of the population was 62.5 years, and 12 (28.6 %) were women. In participants with (vs. without) any increase in arterial inflammation over 6 months, the long-term changes in both MWT (% change MWT: 17.49 % vs. 1.74 %, p = 0.038) and MWA (% change MWA: 25.50 % vs. 3.59 %, p = 0.027) were significantly greater. Results remained significant after adjusting for clinical and biochemical covariates. Individuals with no increase in arterial inflammation over 6 months had no significant structural progression of atherosclerosis over 24 months as measured by MWT (p = 0.616) or MWA (p = 0.373). Short-term changes in arterial inflammation are associated with long-term structural atherosclerosis progression. These data support the concept that therapies that reduce arterial inflammation may attenuate or halt progression of atherosclerosis. (orig.)

  20. The NHV rehabilitation services program improves long-term physical functioning in survivors of the 2008 Sichuan earthquake: a longitudinal quasi experiment.

    Directory of Open Access Journals (Sweden)

    Xia Zhang

    Full Text Available BACKGROUND: Long-term disability following natural disasters significantly burdens survivors and the impacted society. Nevertheless, medical rehabilitation programming has been historically neglected in disaster relief planning. 'NHV' is a rehabilitation services program comprised of non-governmental organizations (NGOs) (N), local health departments (H), and professional rehabilitation volunteers (V) which aims to improve long-term physical functioning in survivors of the 2008 Sichuan earthquake. We aimed to evaluate the effectiveness of the NHV program. METHODS/FINDINGS: 510 of 591 enrolled earthquake survivors participated in this longitudinal quasi-experimental study (86.3%). The early intervention group (NHV-E) consisted of 298 survivors who received institutional-based rehabilitation (IBR) followed by community-based rehabilitation (CBR); the late intervention group (NHV-L) was comprised of 101 survivors who began rehabilitation one year later. The control group of 111 earthquake survivors did not receive IBR/CBR. Physical functioning was assessed using the Barthel Index (BI). Data were analyzed with a mixed-effects Tobit regression model. Physical functioning was significantly increased in the NHV-E and NHV-L groups at follow-up but not in the control group after adjustment for gender, age, type of injury, and time to measurement. We found significant effects of both NHV (11.14, 95% CI 9.0-13.3) and spontaneous recovery (5.03; 95% CI 1.73-8.34). The effect of NHV-E (11.3, 95% CI 9.0-13.7) was marginally greater than that of NHV-L (10.7, 95% CI 7.9-13.6). It could, however, not be determined whether specific IBR or CBR program components were effective since individual component exposures were not evaluated. CONCLUSION: Our analysis shows that the NHV improved the long-term physical functioning of Sichuan earthquake survivors with disabling injuries. The comprehensive rehabilitation program benefitted the individual and society, rehabilitation services

  1. The NHV rehabilitation services program improves long-term physical functioning in survivors of the 2008 Sichuan earthquake: a longitudinal quasi experiment.

    Science.gov (United States)

    Zhang, Xia; Reinhardt, Jan D; Gosney, James E; Li, Jianan

    2013-01-01

    Long-term disability following natural disasters significantly burdens survivors and the impacted society. Nevertheless, medical rehabilitation programming has been historically neglected in disaster relief planning. 'NHV' is a rehabilitation services program comprised of non-governmental organizations (NGOs) (N), local health departments (H), and professional rehabilitation volunteers (V) which aims to improve long-term physical functioning in survivors of the 2008 Sichuan earthquake. We aimed to evaluate the effectiveness of the NHV program. 510 of 591 enrolled earthquake survivors participated in this longitudinal quasi-experimental study (86.3%). The early intervention group (NHV-E) consisted of 298 survivors who received institutional-based rehabilitation (IBR) followed by community-based rehabilitation (CBR); the late intervention group (NHV-L) was comprised of 101 survivors who began rehabilitation one year later. The control group of 111 earthquake survivors did not receive IBR/CBR. Physical functioning was assessed using the Barthel Index (BI). Data were analyzed with a mixed-effects Tobit regression model. Physical functioning was significantly increased in the NHV-E and NHV-L groups at follow-up but not in the control group after adjustment for gender, age, type of injury, and time to measurement. We found significant effects of both NHV (11.14, 95% CI 9.0-13.3) and spontaneous recovery (5.03; 95% CI 1.73-8.34). The effect of NHV-E (11.3, 95% CI 9.0-13.7) was marginally greater than that of NHV-L (10.7, 95% CI 7.9-13.6). It could, however, not be determined whether specific IBR or CBR program components were effective since individual component exposures were not evaluated. Our analysis shows that the NHV improved the long-term physical functioning of Sichuan earthquake survivors with disabling injuries. The comprehensive rehabilitation program benefitted the individual and society, rehabilitation services in China, and international rehabilitation

  2. Prediction and evaluation of nonlinear site response with potentially liquefiable layers in the area of Nafplion (Peloponnesus, Greece) for a repeat of historical earthquakes

    Directory of Open Access Journals (Sweden)

    V. K. Karastathis

    2010-11-01

    Full Text Available We examine the possible non-linear behaviour of potentially liquefiable layers at selected sites located within the expansion area of the town of Nafplion, East Peloponnese, Greece. Input motion is computed for three scenario earthquakes, selected on the basis of historical seismicity data, using a stochastic strong ground motion simulation technique, which takes into account the finite dimensions of the earthquake sources. Site-specific ground acceleration synthetics and soil profiles are then used to evaluate the liquefaction potential at the sites of interest. The activation scenario of the Iria fault, which is the closest one to Nafplion (M=6.4), is found to be the most hazardous in terms of liquefaction initiation. In this scenario almost all the examined sites exhibit liquefaction features at depths of 6–12 m. For scenario earthquakes at two more distant seismic sources (Epidaurus fault, M6.3; Xylokastro fault, M6.7), strong ground motion amplification phenomena by the shallow soft soil layer are expected to be observed.

  3. Statistical validation of earthquake related observations

    Science.gov (United States)

    Kossobokov, V. G.

    2011-12-01

    The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable or, conversely, delicately-designed models. The widespread practice of deceptive modeling, considered a "reasonable proxy" of the natural seismic process, leads to seismic hazard assessments of unknown quality, whose errors propagate non-linearly into inflicted estimates of risk and, eventually, into unexpected societal losses of unacceptable level. Studies aimed at the forecast/prediction of earthquakes must include validation in retrospective (at least) and, eventually, in prospective tests. In the absence of such control a suggested "precursor/signal" remains a "candidate", whose link to the target seismic event is a model assumption. Predicting in advance is the only decisive test of forecasts/predictions and, therefore, the score-card of any "established precursor/signal", represented by the empirical probabilities of alarms and failures-to-predict achieved in prospective testing, must prove statistical significance by rejecting the null-hypothesis of random coincidental occurrence in advance of target earthquakes. We reiterate the suggestion of the so-called "Seismic Roulette" null-hypothesis as the most adequate undisturbed random alternative accounting for the empirical spatial distribution of earthquakes: (i) consider a roulette wheel with as many sectors as the number of earthquake locations from a sample catalog representing the seismic locus, one sector per location; (ii) make your bet according to the prediction (i.e., determine which locations are inside the area of alarm, and put one chip in each of the corresponding sectors); (iii) Nature turns the wheel; (iv) accumulate statistics of wins and losses along with the number of chips spent. If a precursor in charge of prediction exposes an imperfection of Seismic Roulette then, having in mind
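
    To make the roulette test concrete, the sketch below simulates the null-hypothesis with a toy alarm: one sector per catalog epicenter, a bet on the alarmed sectors, and a Monte Carlo estimate of how often random spins would match the observed hit count. All sector counts and indices are hypothetical, not data from the study.

```python
# A minimal sketch of the "Seismic Roulette" null-hypothesis test: under the
# null, each target earthquake falls on a uniformly random catalog sector.
import random

def seismic_roulette_pvalue(n_sectors, alarm_sectors, target_sectors,
                            n_trials=100_000, seed=0):
    """Monte Carlo probability of scoring at least the observed number of
    hits when each target event falls on a uniformly random sector."""
    alarm = set(alarm_sectors)
    observed_hits = sum(1 for s in target_sectors if s in alarm)
    rng = random.Random(seed)
    n_targets = len(target_sectors)
    at_least = 0
    for _ in range(n_trials):
        hits = sum(1 for _ in range(n_targets)
                   if rng.randrange(n_sectors) in alarm)  # one spin per target
        if hits >= observed_hits:
            at_least += 1
    return observed_hits, at_least / n_trials

# Hypothetical scorecard: 500 catalog locations, an alarm covering 50 of
# them, and 10 target earthquakes of which 6 fell inside alarmed sectors.
hits, p = seismic_roulette_pvalue(
    500, range(50), [3, 7, 12, 21, 33, 45, 210, 320, 410, 480])
print(f"hits = {hits}, p-value = {p:.4f}")  # a small p rejects chance
```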

  4. Short-term prediction of local wind conditions

    DEFF Research Database (Denmark)

    Landberg, L.

    2001-01-01

    This paper will describe a system which predicts the expected power output of a number of wind farms. The system is automatic and operates on-line. The paper will quantify the accuracy of the predictions and will also give examples of the performance for specific storm events. An actual...

  5. Sun, Moon and Earthquakes

    Science.gov (United States)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km x 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. [Fig. 12 (A+B): the left-hand panel provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, EQ count 376); the right-hand panel plots (EMD+SEM) vs. GMT timings for the same data. All 376 events, including the main event, faithfully follow the straight-line curve.]

  6. Long-term associative learning predicts verbal short-term memory performance

    OpenAIRE

    Jones, Gary; Macken, Bill

    2017-01-01

    Studies using tests such as digit span and nonword repetition have implicated short-term memory across a range of developmental domains. Such tests ostensibly assess specialized processes for the short-term manipulation and maintenance of information that are often argued to enable long-term learning. However, there is considerable evidence for an influence of long-term linguistic learning on performance in short-term memory tasks that brings into question the role of a specialized short-term...

  7. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is apparently changed after the time series is rearranged. This suggests that the SOC theory should not be used to oppose the efforts of earthquake prediction

  8. Long-term outcomes of patients evacuated from hospitals near the Fukushima Daiichi nuclear power plant after the Great East Japan Earthquake.

    Science.gov (United States)

    Igarashi, Yutaka; Tagami, Takashi; Hagiwara, Jun; Kanaya, Takahiro; Kido, Norihiro; Omura, Mariko; Tosa, Ryoichi; Yokota, Hiroyuki

    2018-01-01

    After the accident at the Fukushima Daiichi nuclear power plant due to the Great East Japan Earthquake in March 2011, the Japanese government issued a mandatory evacuation order for people living within a 20 km radius of the nuclear power plant. The aim of the current study was to investigate the long-term outcomes of these patients and identify factors related to mortality. Patients who were evacuated from hospitals near the Fukushima Daiichi nuclear power plant to the Aizu Chuo Hospital from 15 to 26 March, 2011 were included in this study. The following data were collected from medical records: age, sex, activities of daily life, the hospital the patient was admitted to at the time of the earthquake, the distance between the facility and the nuclear power plant, reasons for evacuation and number of transfers. Patient outcomes were collected from medical records and/or by telephone follow-up in January 2012. A total of 97 patients (28 men and 69 women) were transferred from 10 hospitals via ambulances or buses. No patients died or experienced exacerbation during transfer. The median age of the patients was 86 years. Of the total, 36 patients were not able to obey commands, 44 were bed-ridden and 61 were unable to sustain themselves via oral intake of food. Among the 86 patients who were followed up, 41 (48%) had died by the end of 2011. Multiple regression analysis showed that non-oral intake [Hazard Ratio (HR): 6.07, 95% Confidence Interval (CI): 1.94-19.0] and male sex [HR: 8.35, 95% CI: 2.14-32.5] had a significant impact on mortality. This study found that 48% of the evacuated patients had died within 9 months after the earthquake and that they had a significantly higher mortality rate than nursing home residents. Non-oral intake and male sex had a significant impact on mortality. These patients should be considered especially vulnerable in the case of hospital evacuation.

  9. Long-term characteristics of geological conditions in Japan. Pt. 1. Fundamental concept for future's prediction of geological conditions and the subjects

    International Nuclear Information System (INIS)

    Tanaka, Kazuhiro; Chigira, Masahiro.

    1997-01-01

    It is very important to evaluate the long-term stability of geological conditions such as volcanic activity, uplift-subsidence, earthquakes, faulting and sea level change when the long-term safety performance of HLW geological disposal is investigated. We proposed an extrapolation method using the geological data obtained for the geologic time of the last 500 ka to predict future tectonic movements in Japan. Furthermore, we extract geological conditions that would affect the long-term safety of HLW geological disposal with regard to direct and indirect radionuclide release scenarios. As a result, it was concluded that volcanic activity and tectonic movements, including faulting and uplift-subsidence, should be considered, and that systems for surveying them and methods for evaluating them should be developed. (author)

  10. Resource loss, self-efficacy, and family support predict posttraumatic stress symptoms: a 3-year study of earthquake survivors.

    Science.gov (United States)

    Warner, Lisa Marie; Gutiérrez-Doña, Benicio; Villegas Angulo, Maricela; Schwarzer, Ralf

    2015-01-01

    Social support and self-efficacy are regarded as coping resources that may facilitate readjustment after traumatic events. The 2009 Cinchona earthquake in Costa Rica serves as an example for such an event to study resources to prevent subsequent severity of posttraumatic stress symptoms. At Time 1 (1-6 months after the earthquake in 2009), N=200 survivors were interviewed, assessing resource loss, received family support, and posttraumatic stress response. At Time 2 in 2012, severity of posttraumatic stress symptoms and general self-efficacy beliefs were assessed. Regression analyses estimated the severity of posttraumatic stress symptoms accounted for by all variables. Moderator and mediator models were examined to understand the interplay of received family support and self-efficacy with posttraumatic stress symptoms. Baseline posttraumatic stress symptoms and resource loss (T1) accounted for significant but small amounts of the variance in the severity of posttraumatic stress symptoms (T2). The main effects of self-efficacy (T2) and social support (T1) were negligible, but social support buffered resource loss, indicating that only less supported survivors were affected by resource loss. Self-efficacy at T2 moderated the support-stress relationship, indicating that low levels of self-efficacy could be compensated by higher levels of family support. Receiving family support at T1 enabled survivors to feel self-efficacious, underlining the enabling hypothesis. Receiving social support from relatives shortly after an earthquake was found to be an important coping resource, as it alleviated the association between resource loss and the severity of posttraumatic stress response, compensated for deficits of self-efficacy, and enabled self-efficacy, which was in turn associated with more adaptive adjustment 3 years after the earthquake.

  11. An Artificial Neural Network Based Short-term Dynamic Prediction of Algae Bloom

    Directory of Open Access Journals (Sweden)

    Yao Junyang

    2014-06-01

    Full Text Available This paper proposes a method for the short-term prediction of algae blooms based on an artificial neural network. Firstly, principal component analysis is applied to water environmental factors in algae bloom raceway ponds to obtain the main factors that influence the formation of algae blooms. Then, a short-term dynamic prediction model based on a neural network is built, with the current chlorophyll_a values as input and the chlorophyll_a values at the next moment as output, to realize short-term dynamic prediction of algae blooms. Simulation results show that the model can realize short-term prediction of algae blooms effectively.
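
    As a rough illustration of the input/output scheme described above (current chlorophyll-a readings in, next value out), the following sketch trains a small feed-forward network on a synthetic chlorophyll-a series; the lag length, network size, and data are assumptions for illustration, not the authors' configuration.

```python
# A minimal sketch of one-step-ahead neural-network prediction on a
# synthetic chlorophyll-a series (seasonal-like oscillation plus noise).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(500)
chl = 10 + 5 * np.sin(2 * np.pi * t / 50) + rng.normal(0, 0.3, t.size)

lag = 5                                   # feed the last 5 readings in ...
X = np.array([chl[i:i + lag] for i in range(len(chl) - lag)])
y = chl[lag:]                             # ... to predict the next one

split = 400                               # train on the first 400 windows
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])

pred = model.predict(X[split:])
rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
print(f"one-step-ahead RMSE on held-out data: {rmse:.3f}")
```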

  12. Major depressive disorder subtypes to predict long-term course

    Science.gov (United States)

    van Loo, Hanna M.; Cai, Tianxi; Gruber, Michael J.; Li, Junlong; de Jonge, Peter; Petukhova, Maria; Rose, Sherri; Sampson, Nancy A.; Schoevers, Robert A.; Wardenaar, Klaas J.; Wilcox, Marsha A.; Al-Hamzawi, Ali Obaid; Andrade, Laura Helena; Bromet, Evelyn J.; Bunting, Brendan; Fayyad, John; Florescu, Silvia E.; Gureje, Oye; Hu, Chiyi; Huang, Yueqin; Levinson, Daphna; Medina-Mora, Maria Elena; Nakane, Yoshibumi; Posada-Villa, Jose; Scott, Kate M.; Xavier, Miguel; Zarkov, Zahari; Kessler, Ronald C.

    2016-01-01

    Background Variation in course of major depressive disorder (MDD) is not strongly predicted by existing subtype distinctions. A new subtyping approach is considered here. Methods Two data mining techniques, ensemble recursive partitioning and Lasso generalized linear models (GLMs) followed by k-means cluster analysis, are used to search for subtypes based on index episode symptoms predicting subsequent MDD course in the World Mental Health (WMH) Surveys. The WMH surveys are community surveys in 16 countries. Lifetime DSM-IV MDD was reported by 8,261 respondents. Retrospectively reported outcomes included measures of persistence (number of years with an episode; number of years with an episode lasting most of the year) and severity (hospitalization for MDD; disability due to MDD). Results Recursive partitioning found significant clusters defined by the conjunctions of early onset, suicidality, and anxiety (irritability, panic, nervousness-worry-anxiety) during the index episode. GLMs found additional associations involving a number of individual symptoms. Predicted values of the four outcomes were strongly correlated. Cluster analysis of these predicted values found three clusters having consistently high, intermediate, or low predicted scores across all outcomes. The high-risk cluster (30.0% of respondents) accounted for 52.9-69.7% of high persistence and severity and was most strongly predicted by index episode severe dysphoria, suicidality, anxiety, and early onset. A total symptom count, in comparison, was not a significant predictor. Conclusions Despite being based on retrospective reports, results suggest that useful MDD subtyping distinctions can be made using data mining methods. Further studies are needed to test and expand these results with prospective data. PMID:24425049

  13. Spatial Distribution of the Coefficient of Variation for the Paleo-Earthquakes in Japan

    Science.gov (United States)

    Nomura, S.; Ogata, Y.

    2015-12-01

    Renewal processes, point processes in which the intervals between consecutive events are independently and identically distributed, are frequently used to describe such repeating earthquake mechanisms and to forecast the next earthquakes. However, one of the difficulties in applying recurrent earthquake models is the scarcity of historical data. Most studied fault segments have few, or only one, observed earthquakes, which often have poorly constrained historic and/or radiocarbon ages. The maximum likelihood estimate from such a small data set can have a large bias and error, which tends to yield a high probability for the next event in a very short time span when the recurrence intervals have similar lengths. On the other hand, recurrence intervals at a fault depend on the long-term slip rate caused by tectonic motion on average. In addition, recurrence times also fluctuate owing to nearby earthquakes or fault activities, which encourage or discourage the surrounding seismicity. These factors have spatial trends due to the heterogeneity of tectonic motion and seismicity. Thus, this paper introduces a spatial structure on the key parameters of renewal processes for recurrent earthquakes and estimates it using spatial statistics. Spatial variations of the mean and variance parameters of recurrence times are estimated in a Bayesian framework and the next earthquakes are forecasted by Bayesian predictive distributions. The proposed model is applied to a catalog of recurrent earthquakes in Japan and its result is compared with the current forecast adopted by the Earthquake Research Committee of Japan.
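
    The small-sample problem described above can be made concrete with a short sketch: a lognormal renewal model fitted to three hypothetical paleo-intervals, with and without shrinkage of the mean toward an assumed regional prior, and a Monte Carlo estimate of the conditional probability of the next event. All numbers are illustrative, not values from the study.

```python
# A minimal sketch, under assumed numbers, of why few paleo-intervals make
# renewal forecasts unstable and how a regional prior can stabilize them.
import numpy as np

intervals = np.array([180.0, 220.0, 150.0])      # yrs; hypothetical record
log_iv = np.log(intervals)

# Maximum likelihood from the site alone: noisy for n = 3.
mu_ml, sigma_ml = log_iv.mean(), log_iv.std(ddof=1)

# Shrinkage of the mean toward a hypothetical regional prior (mean only,
# for brevity): prior mean 5.3 (~exp = 200 yr), weight of 5 pseudo-intervals.
mu0, n0 = 5.3, 5.0
n = len(log_iv)
mu_post = (n0 * mu0 + n * log_iv.mean()) / (n0 + n)

def p_event_within(mu, sigma, elapsed, horizon, n_mc=200_000, seed=1):
    """P(next event within `horizon` yr | quiet for `elapsed` yr) for a
    lognormal renewal model, by Monte Carlo over the open interval."""
    rng = np.random.default_rng(seed)
    iv = rng.lognormal(mu, sigma, n_mc)
    open_iv = iv[iv > elapsed]                   # condition on survival so far
    return np.mean(open_iv <= elapsed + horizon)

print("site-only :", p_event_within(mu_ml, sigma_ml, elapsed=160, horizon=30))
# 0.3 is an assumed regional variance floor, not an estimated quantity.
print("with prior:", p_event_within(mu_post, max(sigma_ml, 0.3),
                                    elapsed=160, horizon=30))
```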

  14. Next-Term Student Performance Prediction: A Recommender Systems Approach

    Science.gov (United States)

    Sweeney, Mack; Rangwala, Huzefa; Lester, Jaime; Johri, Aditya

    2016-01-01

    An enduring issue in higher education is student retention to successful graduation. National statistics indicate that most higher education institutions have four-year degree completion rates around 50%, or just half of their student populations. While there are prediction models which illuminate what factors assist with college student success,…

  15. Long‐term time‐dependent probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3)

    Science.gov (United States)

    Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua

    2015-01-01

    The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for un-segmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too long to produce M 6.7 sized events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region, and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the
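
    As a minimal illustration of the renewal-model probabilities discussed above, the sketch below evaluates a Brownian Passage Time (inverse Gaussian) conditional probability for an assumed fault (mean recurrence 150 yr, aperiodicity 0.5, 140 yr open interval); none of these values are taken from UCERF3.

```python
# A minimal sketch of an elastic-rebound-style conditional probability under
# a Brownian Passage Time renewal model, compared with a Poisson reference.
import math
from scipy.stats import invgauss

def bpt_conditional_prob(mean_rt, aperiodicity, elapsed, window):
    """P(rupture within `window` yr | quiet for `elapsed` yr) under a
    Brownian Passage Time (inverse Gaussian) renewal model."""
    # scipy's invgauss(mu, scale=s) has mean mu*s and CV^2 = mu, so
    # mu = aperiodicity**2 and scale = mean_rt/mu reproduce the BPT model.
    mu = aperiodicity ** 2
    dist = invgauss(mu, scale=mean_rt / mu)
    return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / dist.sf(elapsed)

p_renewal = bpt_conditional_prob(mean_rt=150, aperiodicity=0.5,
                                 elapsed=140, window=30)
p_poisson = 1 - math.exp(-30 / 150)        # time-independent reference
print(f"BPT 30-yr probability: {p_renewal:.3f} (Poisson: {p_poisson:.3f})")
# p_renewal / p_poisson is the "implied gain" relative to a Poisson model.
```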

  16. Pattern recognition methodologies and deterministic evaluation of seismic hazard: A strategy to increase earthquake preparedness

    International Nuclear Information System (INIS)

    Peresan, Antonella; Panza, Giuliano F.; Gorshkov, Alexander I.; Aoudia, Abdelkrim

    2001-05-01

    Several algorithms, structured according to a general pattern-recognition scheme, have been developed for the space-time identification of strong events. Currently, two such algorithms are applied to the Italian territory: one for the recognition of earthquake-prone areas and the other, namely the CN algorithm, for earthquake prediction purposes. These procedures can be viewed as independent experts, hence they can be combined to better constrain the alerted seismogenic area. We examine here the possibility of integrating CN intermediate-term medium-range earthquake predictions, pattern recognition of earthquake-prone areas and deterministic hazard maps, in order to associate CN Times of Increased Probability (TIPs) with a set of appropriate scenarios of ground motion. The advantage of this procedure mainly consists in the time information provided by the predictions, useful for increasing the preparedness of safety measures and for indicating a priority for detailed seismic risk studies to be performed at a local scale. (author)

  17. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    Science.gov (United States)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism, within 2 min after the occurrence of an earthquake. All of the source parameters are then automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model is developed for the whole of Taiwan in this study. We have improved the SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve the topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulations by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real time.

  18. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk

  19. Autoregressive models as a tool to discriminate chaos from randomness in geoelectrical time series: an application to earthquake prediction

    Directory of Open Access Journals (Sweden)

    C. Serio

    1997-06-01

    Full Text Available The time dynamics of geoelectrical precursory time series has been investigated and a method to discriminate chaotic behaviour in geoelectrical precursory time series is proposed. It allows us to detect low-dimensional chaos when the only information about the time series comes from the time series themselves. The short-term predictability of these time series is evaluated using two possible forecasting approaches: global autoregressive approximation and local autoregressive approximation. The first views the data as a realization of a linear stochastic process, whereas the second considers the data points as a realization of a deterministic process, supposedly non-linear. The comparison of the predictive skill of the two techniques is a test to discriminate between low-dimensional chaos and random dynamics. The analyzed time series are geoelectrical measurements recorded by an automatic station located in Tito (Southern Italy), in one of the most seismic areas of the Mediterranean region. Our findings are that the global (linear) approach is superior to the local one and that the physical system governing the phenomena of electrical nature is characterized by a large number of degrees of freedom. Power spectra of the filtered time series follow a P(f) = F·f^(-a) scaling law: they exhibit the typical behaviour of a broad class of fractal stochastic processes and they are a signature of self-organized systems.
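
    The global-versus-local comparison can be sketched as follows on a synthetic series (a logistic map standing in for a low-dimensional chaotic signal; the geoelectrical data themselves are not reproduced here). If the local autoregressive predictor clearly outperforms the global one, low-dimensional chaos is suggested; if not, as the authors find for their data, a high-dimensional stochastic description is favored.

```python
# A minimal sketch of the discrimination test: one-step skill of a global AR
# fit versus a local (nearest-neighbour) linear predictor on a chaotic map.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
x = np.zeros(n); x[0] = 0.3
for i in range(1, n):                        # logistic map: low-dim chaos
    x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])

lag, split = 3, 1500
X = np.array([x[i:i + lag] for i in range(n - lag)])
y = x[lag:]
Xtr, ytr, Xte, yte = X[:split], y[:split], X[split:], y[split:]

# Global AR: one least-squares fit over all training windows.
coef, *_ = np.linalg.lstsq(np.c_[Xtr, np.ones(len(Xtr))], ytr, rcond=None)
err_global = yte - (np.c_[Xte, np.ones(len(Xte))] @ coef)

# Local AR: fit the same linear form on the k nearest training windows.
def local_pred(q, k=30):
    idx = np.argsort(np.linalg.norm(Xtr - q, axis=1))[:k]
    c, *_ = np.linalg.lstsq(np.c_[Xtr[idx], np.ones(k)], ytr[idx], rcond=None)
    return np.r_[q, 1.0] @ c

err_local = yte - np.array([local_pred(q) for q in Xte])
print("global RMSE:", np.sqrt(np.mean(err_global ** 2)))
print("local  RMSE:", np.sqrt(np.mean(err_local ** 2)))  # << global => chaos
```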

  20. Predicting short-term stock fluctuations by using processing fluency

    Science.gov (United States)

    Alter, Adam L.; Oppenheimer, Daniel M.

    2006-01-01

    Three studies investigated the impact of the psychological principle of fluency (that people tend to prefer easily processed information) on short-term share price movements. In both a laboratory study and two analyses of naturalistic real-world stock market data, fluently named stocks robustly outperformed stocks with disfluent names in the short term. For example, in one study, an initial investment of $1,000 yielded a profit of $112 more after 1 day of trading for a basket of fluently named shares than for a basket of disfluently named shares. These results imply that simple, cognitive approaches to modeling human behavior sometimes outperform more typical, complex alternatives. PMID:16754871

  1. Use of "Crowd-Sourcing" and other collaborations to solve the short-term, earthquake forecasting problem

    Science.gov (United States)

    Bleier, T.; Heraud, J. A.; Dunson, J. C.

    2015-12-01

    QuakeFinder (QF) and its international collaborators have installed and currently maintain 165 three-axis induction magnetometer instrument sites in California, Peru, Taiwan, Greece, Chile and Sumatra. The data from these instruments are being analyzed for pre-quake signatures. This analysis consists of both private research by QuakeFinder and work by institutional collaborators (PUCP in Peru, NCU in Taiwan, PUCC in Chile, NOA in Greece, Syiah Kuala University in Indonesia, LASP at U of Colo., Stanford, and USGS). Recently, NASA Hq. and QuakeFinder tried a new approach to help with the analysis of this huge (50+ TB) data archive. A collaboration with Appirio/TopCoder, Harvard University, Amazon, QuakeFinder, and NASA Hq. resulted in an open algorithm development contest called "Quest for Quakes", in which contestants (freelance algorithm developers) attempted to identify quakes from a subset of the QuakeFinder data (3 TB). The contest included a $25K prize pool and contained 100 cases where earthquakes (and null sets) included data from up to 5 remote sites, near and far from quakes greater than M4. These data sets were made available through Amazon.com to hundreds of contestants over a two-week contest period. In a more traditional approach, several new algorithms were tried by actively sharing the QF data with universities over a longer period. These algorithms included Principal Component Analysis (PCA) and deep neural networks in an effort to automatically identify earthquake signals within typical, noise-filled environments. This presentation examines the pros and cons of employing these two approaches, from both logistical and scientific perspectives.

  2. Possible deep fault slip preceding the 2004 Parkfield earthquake, inferred from detailed observations of tectonic tremor

    Science.gov (United States)

    Shelly, David R.

    2009-01-01

    Earthquake predictability depends, in part, on the degree to which sudden slip is preceded by slow aseismic slip. Recently, observations of deep tremor have enabled inferences of deep slow slip even when detection by other means is not possible, but these data are limited to certain areas and mostly the last decade. The region near Parkfield, California, provides a unique convergence of several years of high-quality tremor data bracketing a moderate earthquake, the 2004 magnitude 6.0 event. Here, I present detailed observations of tectonic tremor from mid-2001 through 2008 that indicate deep fault slip both before and after the Parkfield earthquake that cannot be detected with surface geodetic instruments. While there is no obvious short-term precursor, I find unidirectional tremor migration accompanied by elevated tremor rates in the 3 months prior to the earthquake, which suggests accelerated creep on the fault ∼16 km beneath the eventual earthquake hypocenter.

  3. Modeling Seismic Cycles of Great Megathrust Earthquakes Across the Scales With Focus at Postseismic Phase

    Science.gov (United States)

    Sobolev, Stephan V.; Muldashev, Iskander A.

    2017-12-01

    Subduction is a substantially multiscale process in which stresses are built up by long-term tectonic motions, modified by sudden jerky deformations during earthquakes, and then restored by multiple subsequent relaxation processes. Here we develop a cross-scale thermomechanical model aimed at simulating the subduction process on time scales from one minute to a million years. The model employs elasticity, nonlinear transient viscous rheology, and rate-and-state friction. It generates spontaneous earthquake sequences and, by using an adaptive time step algorithm, recreates the deformation process as observed naturally during a single seismic cycle and over multiple seismic cycles. The model predicts that viscosity in the mantle wedge drops by more than three orders of magnitude during a great earthquake with a magnitude above 9. As a result, the surface velocities just an hour or a day after the earthquake are controlled by viscoelastic relaxation in the several hundred km of mantle landward of the trench, and not by afterslip localized at the fault as is currently believed. Our model replicates centuries-long seismic cycles exhibited by the greatest earthquakes and is consistent with the postseismic surface displacements recorded after the Great Tohoku Earthquake. We demonstrate that there is no contradiction between the extremely low mechanical coupling at the subduction megathrust in South Chile inferred from long-term geodynamic models and the occurrence of the largest earthquakes, like the Great Chile 1960 Earthquake.

  4. Error analysis of short term wind power prediction models

    International Nuclear Information System (INIS)

    De Giorgi, Maria Grazia; Ficarella, Antonio; Tarantino, Marco

    2011-01-01

    The integration of wind farms in power networks has become an important problem. This is because the electricity produced cannot be preserved because of the high cost of storage, and electricity production must follow market demand. Short- to long-range wind forecasting over different lengths/periods of time is becoming an important process for the management of wind farms. Time series modelling of wind speeds is based upon the valid assumption that all the causative factors are implicitly accounted for in the sequence of occurrence of the process itself. Hence time series modelling is equivalent to physical modelling. Auto Regressive Moving Average (ARMA) models, which perform a linear mapping between inputs and outputs, and Artificial Neural Networks (ANNs) and Adaptive Neuro-Fuzzy Inference Systems (ANFIS), which perform a non-linear mapping, provide a robust approach to wind power prediction. In this work, these models are developed in order to forecast the power production of a wind farm with three wind turbines, using real load data and comparing different time prediction periods. This comparative analysis takes into account, for the first time, various forecasting methods and time horizons, together with a deep performance analysis focused upon the normalised mean error and its statistical distribution, in order to identify forecasting methods whose error distribution lies within a narrower curve and with which it is therefore less probable to make errors in prediction. (author)

  5. Error analysis of short term wind power prediction models

    Energy Technology Data Exchange (ETDEWEB)

    De Giorgi, Maria Grazia; Ficarella, Antonio; Tarantino, Marco [Dipartimento di Ingegneria dell' Innovazione, Universita del Salento, Via per Monteroni, 73100 Lecce (Italy)

    2011-04-15

    The integration of wind farms in power networks has become an important problem. This is because the electricity produced cannot be preserved because of the high cost of storage, and electricity production must follow market demand. Short- to long-range wind forecasting over different lengths/periods of time is becoming an important process for the management of wind farms. Time series modelling of wind speeds is based upon the valid assumption that all the causative factors are implicitly accounted for in the sequence of occurrence of the process itself. Hence time series modelling is equivalent to physical modelling. Auto Regressive Moving Average (ARMA) models, which perform a linear mapping between inputs and outputs, and Artificial Neural Networks (ANNs) and Adaptive Neuro-Fuzzy Inference Systems (ANFIS), which perform a non-linear mapping, provide a robust approach to wind power prediction. In this work, these models are developed in order to forecast the power production of a wind farm with three wind turbines, using real load data and comparing different time prediction periods. This comparative analysis takes into account, for the first time, various forecasting methods and time horizons, together with a deep performance analysis focused upon the normalised mean error and its statistical distribution, in order to identify forecasting methods whose error distribution lies within a narrower curve and with which it is therefore less probable to make errors in prediction. (author)
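
    For the linear branch of such comparisons, a minimal ARMA sketch on a synthetic wind-speed series looks as follows; the model order and all data are assumptions for illustration, not the configuration used in the paper.

```python
# A minimal sketch of ARMA-style time series forecasting of wind speed,
# evaluated with a normalised mean error as in the comparison above.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
n = 600
speed = np.empty(n); speed[0] = 8.0
for i in range(1, n):                      # AR(1)-like synthetic wind speed
    speed[i] = 0.9 * speed[i - 1] + 0.8 + rng.normal(0, 0.7)

train, test = speed[:500], speed[500:]
model = ARIMA(train, order=(2, 0, 1)).fit()    # assumed ARMA(2,1) order
forecast = model.forecast(steps=len(test))     # multi-step-ahead prediction

nme = np.mean(np.abs(forecast - test)) / np.mean(test)  # normalised mean error
print(f"normalised mean absolute error over {len(test)} steps: {nme:.3f}")
```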

  6. Applicability of short-term accelerated biofouling studies to predict long-term biofouling accumulation in reverse osmosis membrane systems

    KAUST Repository

    Sanawar, Huma

    2018-02-02

    Biofouling studies addressing biofouling control are mostly executed as short-term studies. It is unclear whether data collected from these experiments are representative of long-term biofouling as occurring in full-scale membrane systems. This study investigated whether short-term biofouling studies accelerated by biodegradable nutrient dosage to the feed water are predictive of long-term biofouling development without nutrient dosage. Since the presence of a feed spacer has a strong effect on the degree of biofouling, this study employed six geometrically different feed spacers. Membrane fouling simulators (MFSs) were operated with the same (i) membrane, (ii) feed flow and (iii) feed water, but with feed spacers varying in geometry. For the short-term experiment, biofilm formation was enhanced by nutrient dosage to the MFS feed water, whereas no nutrient dosage was applied in the long-term experiment. Pressure drop development was monitored to characterize the extent of biofouling, while the accumulated viable biomass content at the end of the experimental run was quantified by adenosine triphosphate (ATP) measurements. The impact of feed spacer geometry on biofouling was compared for the short-term and long-term biofouling studies. The results of the study revealed that the feed spacers exhibited the same biofouling behavior for (i) the short-term (9-d) study with nutrient dosage and (ii) the long-term (96-d) study without nutrient dosage. For the six different feed spacers, the accumulated viable biomass content (pg ATP/cm2) was roughly the same, but the biofouling impact in terms of pressure drop increase in time was significantly different. The biofouling impact ranking of the six feed spacers was the same for the short-term and long-term biofouling studies. Therefore, it can be concluded that short-term accelerated biofouling studies in MFSs are a representative and suitable approach for the prediction of biofouling in membrane filtration systems after long-term

  7. Role of Subdural Electrocorticography in Prediction of Long-Term Seizure Outcome in Epilepsy Surgery

    Science.gov (United States)

    Asano, Eishi; Juhasz, Csaba; Shah, Aashit; Sood, Sandeep; Chugani, Harry T.

    2009-01-01

    Since prediction of long-term seizure outcome using preoperative diagnostic modalities remains suboptimal in epilepsy surgery, we evaluated whether interictal spike frequency measures obtained from extraoperative subdural electrocorticography (ECoG) recording could predict long-term seizure outcome. This study included 61 young patients (age…

  8. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  9. Multi-component observation in deep boreholes, and its applications to earthquake prediction research and rock mechanics

    International Nuclear Information System (INIS)

    Ishii, Hiroshi

    2014-01-01

    The Tono Research Institute of Earthquake Science (TRIES) has developed a multicomponent instrument that can be operated in deep boreholes (e.g., those one km in depth). It is equipped with stress meters, strain meters, tilt meters, seismometers, magnetometers, and thermometers; in addition, these sensors can be arbitrarily combined. The stress meters, which were developed recently, can observe stress and strain; in the future, data obtained from these sensors will offer new information for seismology and rock mechanics. The typical probe is 12 cm in diameter, 7.8 m in total length, and 290 kg in total weight; it consists of many meters connected in tandem. (authors)

  10. Analysis of Earthquake Catalogs for CSEP Testing Region Italy

    International Nuclear Information System (INIS)

    Peresan, A.; Romashkova, L.; Nekrasova, A.; Kossobokov, V.; Panza, G.F.

    2010-07-01

    A comprehensive analysis shows that the set of catalogs provided by the Istituto Nazionale di Geofisica e Vulcanologia (INGV, Italy) as the authoritative database for the Collaboratory for the Study of Earthquake Predictability - Testing Region Italy (CSEP-TRI), is hardly a unified one acceptable for the necessary tuning of models/algorithms, as well as for running rigorous prospective predictability tests at intermediate- or long-term scale. (author)

  11. Long-term predictive capability of erosion models

    Science.gov (United States)

    Veerabhadra, P.; Buckley, D. H.

    1983-01-01

    A brief overview of long-term cavitation and liquid impingement erosion and modeling methods proposed by different investigators, including the curve-fit approach is presented. A table was prepared to highlight the number of variables necessary for each model in order to compute the erosion-versus-time curves. A power law relation based on the average erosion rate is suggested which may solve several modeling problems.

  12. Tokyo Metropolitan Earthquake Preparedness Project - A Progress Report

    Science.gov (United States)

    Hayashi, H.

    2010-12-01

    Munich Re once ranked the Tokyo metropolitan region, the capital of Japan, as the most vulnerable area for earthquake disasters, followed by the San Francisco Bay Area, US, and Osaka, Japan. Seismologists also predict that the Tokyo metropolitan region may have at least one near-field earthquake with a probability of 70% in the next 30 years. Given this prediction, the Japanese Government took it seriously, conducted damage estimations, and revealed that, as the worst case scenario, a 7.3-magnitude earthquake under heavy winds, as shown in Fig. 1, would kill a total of 11,000 people, and the total of direct and indirect losses would amount to 112,000,000,000,000 yen (about $1,300,000,000,000 at $1 = 85 yen). In addition to mortality and financial losses, a total of 25 million people would be severely impacted by this earthquake in four prefectures. If this earthquake occurs, 300,000 elevators will stop suddenly, and 12,500 people would be confined in them for a long time. Seven million people will come to use over 20,000 public shelters spread over the impacted area. Over one million temporary housing units should be built to accommodate 4.6 million people who lost their dwellings. 2.5 million people will relocate outside of the damaged area. In short, an unprecedented scale of earthquake disaster is expected and we must prepare for it. Even though disaster mitigation is undoubtedly the best solution, it is more realistic to assume that the expected earthquake will hit before we complete this business. In other words, we must take into account another solution to make the people and the assets in this region more resilient to the Tokyo metropolitan earthquake. This is the question we have been tackling for the last four years. To increase societal resilience to the Tokyo metropolitan earthquake, we adopted a holistic approach to integrate both emergency response and long-term recovery. There are three goals for long-term recovery, which consist of Physical recovery, Economic

  13. Analyses of computer programs for the probabilistic estimation of design earthquake and seismological characteristics of the Korean Peninsula

    International Nuclear Information System (INIS)

    Lee, Gi Hwa

    1997-11-01

    The purpose of the present study is to develop predictive equations from simulated motions which are adequate for the Korean Peninsula and analyze and utilize the computer programs for the probabilistic estimation of design earthquakes. In part I of the report, computer programs for the probabilistic estimation of design earthquake are analyzed and applied to the seismic hazard characterizations in the Korean Peninsula. In part II of the report, available instrumental earthquake records are analyzed to estimate earthquake source characteristics and medium properties, which are incorporated into simulation process. And earthquake records are simulated by using the estimated parameters. Finally, predictive equations constructed from the simulation are given in terms of magnitude and hypocentral distances

  14. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    Science.gov (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and about one-half of all policies in highly earthquake prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering incurring building losses in Istanbul in the occurrence of a large earthquake. The annualized earthquake losses in Istanbul are between €140-300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about €2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. In such a model the losses will not be indemnified, but will instead be calculated directly on the basis of indexed ground motion levels and damages. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing
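
    A parametric scheme of the kind proposed above can be sketched as a simple indexed payout rule: the trigger levels and payout fractions below are invented for illustration and are not TCIP parameters.

```python
# A minimal sketch of a parametric insurance trigger: the payout is computed
# directly from an indexed ground-motion level, with no claim adjustment.
def parametric_payout(pga_g, limit):
    """Piecewise payout as a fraction of the policy limit by shaking level."""
    tiers = [(0.40, 1.00), (0.25, 0.50), (0.15, 0.25)]  # (PGA in g, fraction)
    for threshold, fraction in tiers:
        if pga_g >= threshold:
            return fraction * limit
    return 0.0

for pga in (0.10, 0.18, 0.30, 0.45):
    print(f"PGA {pga:.2f} g -> payout {parametric_payout(pga, 100_000):,.0f}")
```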

  15. The Barrier code for predicting long-term concrete performance

    International Nuclear Information System (INIS)

    Shuman, R.; Rogers, V.C.; Shaw, R.A.

    1989-01-01

    There are numerous features incorporated into a LLW disposal facility that deal directly with critical safety objectives required by the NRC in 10 CFR 61. Engineered barriers or structures incorporating concrete are commonly being considered for waste disposal facilities. The Barrier computer code calculates the long-term degradation of concrete structures in LLW disposal facilities. It couples this degradation with water infiltration into the facility, nuclide leaching from the waste, contaminated water release from the facility, and associated doses to members of the critical population group. The concrete degradation methodology of Barrier is described

  16. The Comparison Study of Short-Term Prediction Methods to Enhance the Model Predictive Controller Applied to Microgrid Energy Management

    Directory of Open Access Journals (Sweden)

    César Hernández-Hernández

    2017-06-01

    Full Text Available Electricity load forecasting, optimal power system operation and energy management play key roles that can bring significant operational advantages to microgrids. This paper studies how methods based on time series and neural networks can be used to predict energy demand and production, allowing them to be combined with model predictive control. Comparisons of different prediction methods and different optimum energy distribution scenarios are provided, permitting us to determine when short-term energy prediction models should be used. The proposed prediction models in addition to the model predictive control strategy appear as a promising solution to energy management in microgrids. The controller has the task of performing the management of electricity purchase and sale to the power grid, maximizing the use of renewable energy sources and managing the use of the energy storage system. Simulations were performed with different weather conditions of solar irradiation. The obtained results are encouraging for future practical implementation.
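
    The single receding-horizon step at the heart of such a controller can be sketched as a small linear program: given assumed forecasts of demand and solar production, choose grid purchases/sales and battery charging to minimise cost, subject to battery limits. All prices, limits, and forecasts below are illustrative.

```python
# A minimal sketch of one model-predictive-control step for a microgrid,
# posed as a linear program over a short horizon.
import numpy as np
from scipy.optimize import linprog

H = 4                                    # horizon length (hours)
demand = np.array([3.0, 4.0, 5.0, 3.5])  # forecast load, kWh
pv     = np.array([4.5, 2.0, 0.5, 0.0])  # forecast solar, kWh
c_buy, c_sell = 0.25, 0.10               # prices; selling pays less than buying
soc0, cap, b_max = 2.0, 6.0, 2.0         # battery: initial, capacity, rate

# Variables per step: [grid_buy, grid_sell, batt_charge] (charge < 0 = discharge)
c = np.tile([c_buy, -c_sell, 0.0], H)    # minimise purchases minus sales

# Energy balance each step: buy - sell - charge = demand - pv
A_eq = np.zeros((H, 3 * H)); b_eq = demand - pv
for t in range(H):
    A_eq[t, 3 * t:3 * t + 3] = [1.0, -1.0, -1.0]

# Battery state: 0 <= soc0 + cumulative charge <= cap at every step
A_ub, b_ub = [], []
for t in range(H):
    row = np.zeros(3 * H); row[2:3 * (t + 1):3] = 1.0
    A_ub += [row, -row]; b_ub += [cap - soc0, soc0]

bounds = [(0, None), (0, None), (-b_max, b_max)] * H
res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              A_eq=A_eq, b_eq=b_eq, bounds=bounds)
plan = res.x.reshape(H, 3)
print("buy/sell/charge per step:\n", np.round(plan, 2))
# In closed loop only the first step is applied, then the horizon rolls on.
```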

  17. Report by the 'Mega-earthquakes and mega-tsunamis' subgroup; Rapport du sous-groupe Sismique 'Megaseismes et megatsunamis'

    Energy Technology Data Exchange (ETDEWEB)

    Friedel, Jacques; Courtillot, Vincent; Dercourt, Jean; Jaupart, Claude; Le Pichon, Xavier; Poirier, Jean-Paul; Salencon, Jean; Tapponnier, Paul; Dautray, Robert; Carpentier, Alain; Taquet, Philippe; Blanchet, Rene; Le Mouel, Jean-Louis [Academie des sciences, 23, quai de Conti, 75006 Paris (France); BARD, Pierre-Yves [Observatoire des sciences de l' Univers de l' universite de Grenoble - OSUG, Universite Joseph Fourier, BP 53, 38041 Grenoble Cedex 9 (France); Bernard, Pascal; Montagner, Jean-Paul; Armijo, Rolando; Shapiro, Nikolai; Tait, Steve [Institut de physique du globe de Paris, 1, rue Jussieu - 75238 Paris cedex 05 (France); Cara, Michel [Ecole et Observatoire des sciences de la Terre de l' universite de Strasbourg - EOST, F-67084 Strasbourg cedex (France); Madariaga, Raul [Ecole normale superieure, 45, rue d' Ulm / 29, rue d' Ulm, F-75230 Paris cedex 05 (France); Pecker, Alain [Academie des technologies, Grand Palais des Champs Elysees - Porte C - Avenue Franklin D. Roosevelt - 75008 Paris (France); Schindele, Francois [CEA/DAM, DIF/DASE/SLDG, 91297 ARPAJON Cedex (France); Douglas, John [BRGM, 3 avenue Claude-Guillemin - BP 36009 - 45060 Orleans Cedex 2 (France)

    2011-06-15

    This report comprises a presentation of scientific data on subduction earthquakes, on tsunamis and on the Tohoku earthquake. It proposes a detailed description of the French situation (in the West Indies, in metropolitan France, and in terms of soil response), and a discussion of social and economic issues (governance, seismic regulation and nuclear safety, earthquake-resistant design of buildings). The report is complemented by other substantial documents: a presentation of data on the Japanese earthquake, a discussion of prediction and governance errors in the management of earthquake mitigation in Japan, and discussions of tsunami prevention, of research needs on accelerometers, and of the seismic risk in France

  18. Operator aids for prediction of source term attenuation

    International Nuclear Information System (INIS)

    Powers, D.A.

    2004-01-01

    Simplified expressions for the attenuation of radionuclide releases by sprays and by water pools are devised. These expressions are obtained by correlation of the 10th, 50th and 90th percentiles of uncertainty distributions for the water pool decontamination factor and the spray decontamination coefficient. These uncertainty distributions were obtained by Monte Carlo uncertainty analyses using detailed, mechanistic models of the pools and sprays. Uncertainties considered in the analyses include uncertainties in the phenomena and uncertainties in the initial and boundary conditions dictated by the progression of severe accidents. Final results are graphically displayed in terms of the decontamination factor achieved at selected levels of conservatism versus pool depth and water subcooling or, in the case of sprays, versus time. (author)

  19. The prediction of the long-term behaviour of glasses

    International Nuclear Information System (INIS)

    Courtois, Ch.; Regent, A.; Plas, F.

    1997-01-01

    Several experts drew conclusions about the scientific content of this week-long seminar. All agreed in highlighting the variety and quality of the work done. It appears that there is a consensus about the phenomenology of the long-term behaviour of glasses. All the parameters that are likely to intervene in alteration processes have been identified, but some particular points require further studies: - the impact of alpha, beta and gamma irradiation, - the alteration of glass in non-saturated water, - the coupling effect with the materials surrounding glass (metal canister, over-container...), - the optimization of glass composition to deal with high burn-up spent fuels, - the relation between the formation free energy of glasses and their alteration kinetics, - the release of radionuclides trapped in glass, and - the use of mutual analogue. (A.C.)

  20. Gestational weight gain among minority adolescents predicts term birth weight.

    Science.gov (United States)

    Ekambaram, Maheswari; Irigoyen, Matilde; DeFreitas, Johelin; Rajbhandari, Sharina; Geaney, Jessica Lynn; Braitman, Leonard Edward

    2018-03-07

    In adolescents, there is limited evidence on the independent and additive effects of prepregnancy body mass index (BMI) and gestational weight gain on infant birth weight. Data also show that this effect may vary by race. We sought to examine the impact of maternal prepregnancy BMI and gestational weight gain on birth weight and risk of large for gestational age (LGA) in term newborns of minority adolescent mothers. This was a retrospective cohort study of 411 singleton live term infants born to mothers ≤ 18 years. Data were abstracted from electronic medical records. Gestational weight gain was correlated with infant birth weight (ρ = 0.36). Gestational weight gain, gestational age and Hispanic ethnicity were independent predictors of birth weight, controlling for maternal age, BMI, parity, tobacco/drug use and preeclampsia. The probability of having an LGA infant increased with weight gain [adjusted odds ratio (aOR) 1.14, 95% confidence interval (CI) 1.07-1.21] but not with BMI. Mothers who gained weight in excess of 2009 Institute of Medicine (IOM) recommendations had a greater risk of having an LGA infant compared to those who gained within recommendations (aOR 5.7, 95% CI 1.6-19.5). Minority adolescents with greater gestational weight gain had infants with higher birth weight and greater risk of LGA; BMI was not associated with either outcome. Further studies are needed to examine the applicability of the 2009 BMI-specific IOM gestational weight gain recommendations to adolescents in minority populations.

  1. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Science.gov (United States)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  2. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
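
    The following sketch illustrates the core comparison under stated assumptions: a toy series of annual counts, a Poisson fit by its mean, and a negative-binomial fit by the method of moments; the paper's catalogues and estimators are not reproduced.

```python
# Sketch: compare Poisson and negative-binomial (NBD) fits to yearly
# earthquake counts via their log-likelihoods. Counts are illustrative.
import numpy as np
from scipy import stats

counts = np.array([12, 30, 15, 49, 18, 22, 61, 17, 25, 33])  # events/year

# Poisson: a single parameter, the mean rate.
lam = counts.mean()
ll_pois = stats.poisson.logpmf(counts, lam).sum()

# NBD via method of moments: mean m, variance v > m signals clustering.
m, v = counts.mean(), counts.var(ddof=1)
p = m / v                    # scipy's success probability
r = m * p / (1 - p)          # dispersion parameter
ll_nbd = stats.nbinom.logpmf(counts, r, p).sum()

print(f"log-lik Poisson {ll_pois:.1f}  NBD {ll_nbd:.1f}")
print(f"overdispersion index v/m = {v/m:.2f}")  # >1 favours the NBD
```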

  3. Artificial intelligence to predict short-term wind speed

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Tiago; Soares, Joao; Ramos, Sergio; Vale, Zita [Polytechnic of Porto (Portugal). GECAD - ISEP

    2012-07-01

    The use of renewable energy is increasing exponentially in many countries due to the introduction of new energy and environmental policies. Thus, the focus on energy and on the environment makes the efficient integration of renewable energy into the electric power system extremely important. Several European countries have been seeing a high penetration of wind power, representing, gradually, a significant penetration on electricity generation. The introduction of wind power in the network power system causes new challenges for the power system operator due to the variability and uncertainty in weather conditions and, consequently, in the wind power generation. As result, the scheduling dispatch has a significantly portion of uncertainty. In order to deal with the uncertainty in wind power and, with that, introduce improvements in the power system operator efficiency, the wind power forecasting may reveal as a useful tool. This paper proposes a data-mining-based methodology to forecast wind speed. This method is based on the use of data mining techniques applied to a real database of historical wind data. The paper includes a case study based on a real database regarding the last three years to predict wind speed at 5 minute intervals. (orig.)

  4. Visual short term memory related brain activity predicts mathematical abilities.

    Science.gov (United States)

    Boulet-Craig, Aubrée; Robaey, Philippe; Lacourse, Karine; Jerbi, Karim; Oswald, Victor; Krajinovic, Maja; Laverdière, Caroline; Sinnett, Daniel; Jolicoeur, Pierre; Lippé, Sarah

    2017-07-01

    Previous research suggests visual short-term memory (VSTM) capacity and mathematical abilities are significantly related. Moreover, both processes activate similar brain regions within the parietal cortex, in particular, the intraparietal sulcus; however, it is still unclear whether the neuronal underpinnings of VSTM directly correlate with mathematical operation and reasoning abilities. The main objective was to investigate the association between parieto-occipital brain activity during the retention period of a VSTM task and performance in mathematics. The authors measured mathematical abilities and VSTM capacity as well as brain activity during memory maintenance using magnetoencephalography (MEG) in 19 healthy adult participants. Event-related magnetic fields (ERFs) were computed on the MEG data. Linear regressions were used to estimate the strength of the relation between VSTM related brain activity and mathematical abilities. The amplitude of parieto-occipital cerebral activity during the retention of visual information was related to performance in 2 standardized mathematical tasks: mathematical reasoning and calculation fluency. The findings show that brain activity during retention period of a VSTM task is associated with mathematical abilities. Contributions of VSTM processes to numerical cognition should be considered in cognitive interventions. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Short-term changes on MRI predict long-term changes on radiography in rheumatoid arthritis

    DEFF Research Database (Denmark)

    Peterfy, Charles; Strand, Vibeke; Tian, Lu

    2017-01-01

    Objective In rheumatoid arthritis (RA), MRI provides earlier detection of structural damage than radiography (X-ray) and more sensitive detection of intra-articular inflammation than clinical examination. This analysis was designed to evaluate the ability of early MRI findings to predict subsequent

  6. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.

  7. SHORT-TERM AND LONG-TERM WATER LEVEL PREDICTION AT ONE RIVER MEASUREMENT LOCATION

    Directory of Open Access Journals (Sweden)

    Rudolf Scitovski

    2012-12-01

    Full Text Available Global hydrological cycles depend mainly on climate changes, whose occurrence is predominantly triggered by solar and terrestrial influences, and knowledge of the high water regime is widely applied in hydrology. Regular monitoring and study of river water level behavior is important from several perspectives. On the basis of the given data, and using modifications of general approaches known from the literature, especially from hydrological investigations, the paper considers the problem of long- and short-term water level forecasting at one river measurement location. Long-term forecasting is treated as the problem of investigating the periodicity of water level behavior using linear-trigonometric regression, while short-term forecasting is based on a modification of the nearest neighbor method. The proposed methods are tested on data on the Drava River level near Donji Miholjac, Croatia, in the period between the beginning of 1900 and the end of 2012.
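
    A minimal sketch of the long-term component, assuming a linear trend plus one annual harmonic fitted by least squares to synthetic daily levels; the actual Drava data and the authors' full model are not reproduced.

```python
# Linear-trigonometric regression: trend + annual harmonic, least squares.
import numpy as np

t = np.arange(3650)                       # day index over ten years
levels = (2.0 + 1e-4 * t + 0.8 * np.sin(2 * np.pi * t / 365.25)
          + 0.2 * np.random.default_rng(0).normal(size=t.size))

w = 2 * np.pi / 365.25                    # annual angular frequency
X = np.column_stack([np.ones_like(t), t, np.sin(w * t), np.cos(w * t)])
coef, *_ = np.linalg.lstsq(X, levels, rcond=None)

t_fut = np.arange(3650, 3650 + 30)        # 30-day extrapolation
X_fut = np.column_stack([np.ones_like(t_fut), t_fut,
                         np.sin(w * t_fut), np.cos(w * t_fut)])
print(X_fut @ coef)                       # long-term forecast
```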

  8. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much former investigation in seismology evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would be indispensable.

  9. Predicting Short-Term Subway Ridership and Prioritizing Its Influential Factors Using Gradient Boosting Decision Trees

    Directory of Open Access Journals (Sweden)

    Chuan Ding

    2016-10-01

    Full Text Available Understanding the relationship between short-term subway ridership and its influential factors is crucial to improving the accuracy of short-term subway ridership prediction. Although there has been a growing body of studies on short-term ridership prediction approaches, limited effort has been made to investigate short-term subway ridership prediction considering bus transfer activities and temporal features. To fill this gap, a relatively recent data mining approach called gradient boosting decision trees (GBDT) is applied to short-term subway ridership prediction and used to capture the associations with the independent variables. Taking three subway stations in Beijing as cases, the short-term subway ridership and alighting passengers from adjacent bus stops are obtained from transit smart card data. To optimize the model performance with different combinations of regularization parameters, a series of GBDT models are built with various learning rates and tree complexities by fitting a maximum number of trees. The optimal model performance confirms that the gradient boosting approach can incorporate different types of predictors, fit complex nonlinear relationships, and automatically handle the multicollinearity effect with high accuracy. In contrast to other machine learning methods—or “black-box” procedures—the GBDT model can identify and rank the relative influences of bus transfer activities and temporal features on short-term subway ridership. These findings suggest that the GBDT model has considerable advantages in improving short-term subway ridership prediction in a multimodal public transportation system.
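
    A brief sketch of the approach on synthetic data: a gradient boosting regressor fitted to time-of-day, weekday and bus-alighting features, with the relative influence ranking the abstract highlights; the feature names and data are illustrative, not the Beijing dataset.

```python
# Gradient boosting for short-term ridership with interpretable
# feature influences. Data are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
n = 2000
hour = rng.integers(6, 23, n)             # time-of-day feature
weekday = rng.integers(0, 7, n)           # temporal feature
bus_alight = rng.poisson(40, n)           # bus transfer activity
ridership = (50 + 5 * bus_alight + 30 * (weekday < 5)
             + 100 * np.exp(-((hour - 8) ** 2) / 4) + rng.normal(0, 20, n))

X = np.column_stack([hour, weekday, bus_alight])
model = GradientBoostingRegressor(learning_rate=0.05, max_depth=3,
                                  n_estimators=500).fit(X, ridership)
for name, imp in zip(["hour", "weekday", "bus_alight"],
                     model.feature_importances_):
    print(f"{name:>10}: {imp:.2f}")       # relative influence ranking
```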

  10. Medium- and Long-term Prediction of LOD Change with the Leap-step Autoregressive Model

    Science.gov (United States)

    Liu, Q. B.; Wang, Q. J.; Lei, M. F.

    2015-09-01

    It is known that the accuracy of medium- and long-term prediction of changes of length of day (LOD) based on the combined least-square and autoregressive (LS+AR) model decreases gradually. The leap-step autoregressive (LSAR) model is more accurate and stable in medium- and long-term prediction, and is therefore used to forecast the LOD changes in this work. The LOD series from EOP 08 C04, provided by IERS (International Earth Rotation and Reference Systems Service), is then used to compare the effectiveness of the LSAR and traditional AR methods. The predicted series resulting from the two models show that the prediction accuracy of the LSAR model is better than that of the AR model in medium- and long-term prediction.

  11. Distribution of Short-Term and Lifetime Predicted Risks of Cardiovascular Diseases in Peruvian Adults.

    Science.gov (United States)

    Quispe, Renato; Bazo-Alvarez, Juan Carlos; Burroughs Peña, Melissa S; Poterico, Julio A; Gilman, Robert H; Checkley, William; Bernabé-Ortiz, Antonio; Huffman, Mark D; Miranda, J Jaime

    2015-08-07

    Short-term risk assessment tools for prediction of cardiovascular disease events are widely recommended in clinical practice and are used largely for single time-point estimations; however, persons with low predicted short-term risk may have higher risks across longer time horizons. We estimated short-term and lifetime cardiovascular disease risk in a pooled population from 2 studies of Peruvian populations. Short-term risk was estimated using the atherosclerotic cardiovascular disease Pooled Cohort Risk Equations. Lifetime risk was evaluated using the algorithm derived from the Framingham Heart Study cohort. Using previously published thresholds, participants were classified into 3 categories: low short-term and low lifetime risk, low short-term and high lifetime risk, and high short-term predicted risk. We also compared the distribution of these risk profiles across educational level, wealth index, and place of residence. We included 2844 participants (50% men, mean age 55.9 years [SD 10.2 years]) in the analysis. Approximately 1 of every 3 participants (34% [95% CI 33 to 36]) had a high short-term estimated cardiovascular disease risk. Among those with a low short-term predicted risk, more than half (54% [95% CI 52 to 56]) had a high lifetime predicted risk. Short-term and lifetime predicted risks were higher for participants with lower versus higher wealth indexes and educational levels and for those living in urban versus rural areas. Many Peruvian adults were classified as low short-term risk but high lifetime risk. Vulnerable adults, such as those from low socioeconomic status and those living in urban areas, may need greater attention regarding cardiovascular preventive strategies. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  12. Distribution of Short-Term and Lifetime Predicted Risks of Cardiovascular Diseases in Peruvian Adults

    Science.gov (United States)

    Quispe, Renato; Bazo-Alvarez, Juan Carlos; Burroughs Peña, Melissa S; Poterico, Julio A; Gilman, Robert H; Checkley, William; Bernabé-Ortiz, Antonio; Huffman, Mark D; Miranda, J Jaime

    2015-01-01

    Background Short-term risk assessment tools for prediction of cardiovascular disease events are widely recommended in clinical practice and are used largely for single time-point estimations; however, persons with low predicted short-term risk may have higher risks across longer time horizons. Methods and Results We estimated short-term and lifetime cardiovascular disease risk in a pooled population from 2 studies of Peruvian populations. Short-term risk was estimated using the atherosclerotic cardiovascular disease Pooled Cohort Risk Equations. Lifetime risk was evaluated using the algorithm derived from the Framingham Heart Study cohort. Using previously published thresholds, participants were classified into 3 categories: low short-term and low lifetime risk, low short-term and high lifetime risk, and high short-term predicted risk. We also compared the distribution of these risk profiles across educational level, wealth index, and place of residence. We included 2844 participants (50% men, mean age 55.9 years [SD 10.2 years]) in the analysis. Approximately 1 of every 3 participants (34% [95% CI 33 to 36]) had a high short-term estimated cardiovascular disease risk. Among those with a low short-term predicted risk, more than half (54% [95% CI 52 to 56]) had a high lifetime predicted risk. Short-term and lifetime predicted risks were higher for participants with lower versus higher wealth indexes and educational levels and for those living in urban versus rural areas. Many Peruvian adults were classified as low short-term risk but high lifetime risk. Vulnerable adults, such as those from low socioeconomic status and those living in urban areas, may need greater attention regarding cardiovascular preventive strategies. PMID:26254303

  13. Medium- and Long-term Prediction of LOD Change by the Leap-step Autoregressive Model

    Science.gov (United States)

    Wang, Qijie

    2015-08-01

    The accuracy of medium- and long-term prediction of length of day (LOD) change based on the combined least-square and autoregressive (LS+AR) model deteriorates gradually. The leap-step autoregressive (LSAR) model can significantly reduce the edge effect of the observation sequence. In particular, the LSAR model greatly improves the resolution of the signal's low-frequency components, and it can therefore improve prediction efficiency. In this work, LSAR is used to forecast the LOD change. The LOD series from EOP 08 C04 provided by IERS is modeled by both the LSAR and AR models, and the results of the two models are analyzed and compared. When the prediction length is between 10 and 30 days, the accuracy improvement is less than 10%. When the prediction length exceeds 30 days, the accuracy improves markedly, with a maximum gain of around 19%. The results show that the LSAR model has higher prediction accuracy and stability in medium- and long-term prediction.
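
    The abstract does not spell out the LSAR formulation; the sketch below fits an ordinary least-squares AR model on every k-th sample as one reading of the leap-step idea (an assumption, not the authors' exact algorithm), applied to a synthetic stand-in for the LOD residual series.

```python
# Rough sketch: AR forecast fitted on a leap-step (subsampled) series.
import numpy as np

rng = np.random.default_rng(2)
x = np.cumsum(rng.normal(0, 0.01, 2000))   # stand-in LOD residual series

def fit_ar(series, order):
    """Least-squares AR(order) coefficients."""
    rows = [series[i:i + order] for i in range(len(series) - order)]
    A, y = np.array(rows), series[order:]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

k, order = 5, 4
coef = fit_ar(x[::k], order)               # fit on every k-th sample

window = list(x[::k][-order:])
for _ in range(6):                         # iterate 6 leap steps ahead
    window.append(np.dot(coef, window[-order:]))
print(window[order:])                      # 30-day path (6 steps x k days)
```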

  14. Early Seizure Frequency and Aetiology Predict Long-Term Medical Outcome in Childhood-Onset Epilepsy

    Science.gov (United States)

    Sillanpaa, Matti; Schmidt, Dieter

    2009-01-01

    In clinical practice, it is important to predict as soon as possible after diagnosis and starting treatment, which children are destined to develop medically intractable seizures and be at risk of increased mortality. In this study, we determined factors predictive of long-term seizure and mortality outcome in a population-based cohort of 102…

  15. Prediction of long term stability for geological disposal of radioactive waste

    International Nuclear Information System (INIS)

    Sasaki, Takeshi; Morikawa, Seiji; Koide, Hitoshi; Kono, Itoshi

    1998-01-01

    In geological disposal of radioactive wastes, the prediction of diastrophism has received much attention, and long-term future predictions ranging from some thousands to some tens of thousands of years may be necessary for some target nuclides. As there are various methods of future prediction, it is essential to use a computational dynamic procedure to conduct a quantitative prediction. However, advancement of prediction methods is hindered by the fact that information on the deep underground contains many uncertain elements, the data being scarce and indirect. In this paper, a long-term prediction procedure for diastrophism relating to the geological disposal of low-level radioactive wastes requiring isolation terms of some thousands of years was investigated, and examples are shown of the flow of the investigation and of its modeling method using the finite element method. How a fault can be analyzed appears to be key to upgrading the accuracy of future diastrophism prediction. As diastrophism is a long-term and complex phenomenon and its prediction involves many uncertain elements, it is important to judge the results of numerical analysis comprehensively, from both geological and rock-engineering standpoints. (G.K.)

  16. Predicting the liquefaction phenomena from shear velocity profiling: Empirical approach to 6.3 Mw, May 2006 Yogyakarta earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Hartantyo, Eddy, E-mail: hartantyo@ugm.ac.id [PhD student, Physics Department, FMIPA, UGM. Sekip Utara Yogyakarta 55281 Indonesia (Indonesia); Brotopuspito, Kirbani S.; Sismanto; Waluyo [Geophysics Laboratory, FMIPA, Universitas Gadjah Mada, Sekip Utara Yogyakarta 55281 (Indonesia)

    2015-04-24

    Liquefaction phenomena were reported after a 6.5 Mw earthquake hit Yogyakarta province on the morning of 27 May 2006. Several researchers have reported the damage, casualties and soil failures due to the quake, including mapping and analysis of the liquefaction phenomena, mostly based on SPT tests. This study attempts to map liquefaction susceptibility by means of shear velocity profiling using modified Multichannel Analysis of Surface Waves (MASW). This paper is a preliminary report using only several measured MASW points. We built an 8-channel seismic data logger with 4.5 Hz geophones for this purpose. Several different offsets were used to record the high and low frequencies of surface waves. The phase-velocity diagrams were stacked in the frequency domain rather than in the time domain, for clearer and easier dispersion curve picking. All codes are implemented in Matlab. From these procedures, a shear velocity profile was collected beneath each geophone spread. By mapping the minimum depth of the shallow water table, calculating PGA with soil classification, using an empirical formula for saturated soil weight from the shear velocity profile, and calculating CRR and CSR at every depth, the liquefaction characteristics can be identified in every layer. From the acquired data, a liquefaction potential at some depths below the water table was obtained.
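
    A hedged sketch of the final step described above, using the Seed-Idriss cyclic stress ratio and an Andrus-Stokoe-style CRR curve from stress-corrected shear-wave velocity; the profile numbers are invented and the paper's exact formulas may differ.

```python
# Simplified liquefaction check: factor of safety = CRR / CSR.
def csr(a_max_g, sigma_v, sigma_v_eff, depth_m):
    # Seed-Idriss cyclic stress ratio with a standard depth reduction rd.
    rd = 1.0 - 0.00765 * depth_m if depth_m <= 9.15 else 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def crr_from_vs1(vs1, vs1_star=215.0, a=0.022, b=2.8):
    # Clean-sand curve; vs1 is the stress-corrected Vs in m/s.
    return a * (vs1 / 100.0) ** 2 + b * (1.0 / (vs1_star - vs1) - 1.0 / vs1_star)

depth, sigma_v, sigma_v_eff = 6.0, 110.0, 75.0   # kPa, assumed profile
fs = crr_from_vs1(180.0) / csr(0.3, sigma_v, sigma_v_eff, depth)
print(f"factor of safety against liquefaction: {fs:.2f}")  # <1 -> liquefiable
```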

  17. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  18. Long‐Term Post‐CABG Survival: Performance of Clinical Risk Models Versus Actuarial Predictions

    Science.gov (United States)

    Carr, Brendan M.; Romeiser, Jamie; Ruan, Joyce; Gupta, Sandeep; Seifert, Frank C.; Zhu, Wei

    2015-01-01

    Abstract Background/aim Clinical risk models are commonly used to predict short‐term coronary artery bypass grafting (CABG) mortality but are less commonly used to predict long‐term mortality. The added value of long‐term mortality clinical risk models over traditional actuarial models has not been evaluated. To address this, the predictive performance of a long‐term clinical risk model was compared with that of an actuarial model to identify the clinical variable(s) most responsible for any differences observed. Methods Long‐term mortality for 1028 CABG patients was estimated using the Hannan New York State clinical risk model and an actuarial model (based on age, gender, and race/ethnicity). Vital status was assessed using the Social Security Death Index. Observed/expected (O/E) ratios were calculated, and the models' predictive performances were compared using a nested c‐index approach. Linear regression analyses identified the subgroup of risk factors driving the differences observed. Results Mortality rates were 3%, 9%, and 17% at one‐, three‐, and five years, respectively (median follow‐up: five years). The clinical risk model provided more accurate predictions. Greater divergence between model estimates occurred with increasing long‐term mortality risk, with baseline renal dysfunction identified as a particularly important driver of these differences. Conclusions Long‐term mortality clinical risk models provide enhanced predictive power compared to actuarial models. Using the Hannan risk model, a patient's long‐term mortality risk can be accurately assessed and subgroups of higher‐risk patients can be identified for enhanced follow‐up care. More research appears warranted to refine long‐term CABG clinical risk models. doi: 10.1111/jocs.12665 (J Card Surg 2016;31:23–30) PMID:26543019

  19. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    Science.gov (United States)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and

  20. Probabilistic tsunami hazard assessment based on the long-term evaluation of subduction-zone earthquakes along the Sagami Trough, Japan

    Science.gov (United States)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Maeda, T.; Matsuyama, H.; Toyama, N.; Kito, T.; Murata, Y.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.; Hakamata, T.

    2017-12-01

    For the forthcoming large earthquakes along the Sagami Trough, where the Philippine Sea Plate is subducting beneath the northeast Japan arc, the Earthquake Research Committee (ERC)/Headquarters for Earthquake Research Promotion, Japanese government (2014a) assessed that M7 and M8 class earthquakes will occur there and defined the possible extent of the earthquake source areas. The assessed occurrence probabilities within the next 30 years (from Jan. 1, 2014) are 70% for the M7 class and 0-5% for the M8 class earthquakes. First, we set 10 possible earthquake source areas (ESAs) for M8 class earthquakes and 920 ESAs for M7 class earthquakes. Next, we constructed 125 characterized earthquake fault models (CEFMs) for the M8 class and 938 CEFMs for the M7 class, based on the "tsunami recipe" of ERC (2017) (Kitoh et al., 2016, JpGU). All the CEFMs are allowed to have a large slip area to express fault slip heterogeneity. For all the CEFMs, we calculate tsunamis by solving a nonlinear long wave equation using FDM, including runup calculation, over a nesting grid system with a minimum grid size of 50 meters. Finally, we re-distributed the occurrence probability over all CEFMs (Abe et al., 2014, JpGU) and gathered exceedance probabilities for variable tsunami heights, calculated from all the CEFMs, at every observation point along the Pacific coast to obtain the PTHA. We incorporated aleatory uncertainties inherent in the tsunami calculation and in the earthquake fault slip heterogeneity. We considered two kinds of probabilistic hazard models: a "present-time hazard model", under the assumption that earthquake occurrence basically follows a renewal process based on the BPT distribution where the latest faulting time is known, and a "long-time averaged hazard model", under the assumption that earthquake occurrence follows a stationary Poisson process. We fixed our viewpoint, for example, on the probability that the tsunami height will exceed 3 meters at coastal points in next
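
    Under the stationary Poisson assumption of the long-time averaged model, per-source rates combine into an exceedance probability as below; the rates are hypothetical placeholders for CEFMs whose simulated tsunami exceeds the chosen height at one coastal point.

```python
# Poisson aggregation of per-source rates into an exceedance probability.
import math

# Hypothetical annual occurrence rates of fault models whose computed
# tsunami exceeds 3 m at one coastal point (after probability redistribution).
rates = [1e-4, 4e-4, 2.5e-4, 5e-5]

T = 30.0  # exposure window in years
p_exceed = 1.0 - math.exp(-sum(rates) * T)
print(f"P(height > 3 m within {T:.0f} yr) = {p_exceed:.3%}")
```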

  1. Temporal Prediction Errors Affect Short-Term Memory Scanning Response Time.

    Science.gov (United States)

    Limongi, Roberto; Silva, Angélica M

    2016-11-01

    The Sternberg short-term memory scanning task has been used to unveil cognitive operations involved in time perception. Participants produce time intervals during the task, and the researcher explores how task performance affects interval production - where time estimation error is the dependent variable of interest. The perspective of predictive behavior regards time estimation error as a temporal prediction error (PE), an independent variable that controls cognition, behavior, and learning. Based on this perspective, we investigated whether temporal PEs affect short-term memory scanning. Participants performed temporal predictions while they maintained information in memory. Model inference revealed that PEs affected memory scanning response time independently of the memory-set size effect. We discuss the results within the context of formal and mechanistic models of short-term memory scanning and predictive coding, a Bayes-based theory of brain function. We state the hypothesis that our finding could be associated with weak frontostriatal connections and weak striatal activity.

  2. Event terms in the response spectra prediction equation and their deviation due to stress drop variations

    Science.gov (United States)

    Kawase, H.; Nakano, K.

    2015-12-01

    We investigated the characteristics of strong ground motions separated from acceleration Fourier spectra and 5%-damped acceleration response spectra calculated from weak and moderate ground motions observed by K-NET, KiK-net, and the JMA Shindokei Network in Japan, using the generalized spectral inversion method. The separation method used the outcrop motions at YMGH01 as reference, where we extracted the site responses due to shallow weathered layers. We include events with JMA magnitude equal to or larger than 4.5 observed from 1996 to 2011. We find that our frequency-dependent Q values are comparable to those of previous studies. From the corner frequencies of the Fourier source spectra, we calculated Brune's stress parameters and found a clear magnitude dependence, in which smaller events spread over a wider range while maintaining the same maximum value. We confirm that this is exactly the case for several mainshock-aftershock sequences. The average stress parameters for crustal earthquakes are much smaller than those of subduction zones, which can be explained by their depth dependence. We then compared the strong motion characteristics based on the acceleration response spectra and found that the separated characteristics of strong ground motions differ, especially in the frequency range below 1 Hz. These differences come from the difference between Fourier spectra and response spectra found in the observed data; that is, predominant components in the high-frequency range of the Fourier spectra increase the response in the lower-frequency range, where Fourier amplitudes are small, because a strong high-frequency component acts as an impulse on a single-degree-of-freedom system. After separation of the source terms for the 5%-damped response spectra we can obtain regression coefficients with respect to magnitude, which lead to a new GMPE as shown in Fig. 1. Although stress drops for inland earthquakes are 1/7 of the subduction

  3. Prediction on long-term mean and mean square pollutant concentrations in an urban atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, S; Lamb, R G; Seinfeld, J H

    1976-01-01

    The general problem of predicting long-term average (say yearly) pollutant concentrations in an urban atmosphere is formulated. The pollutant concentration can be viewed as a random process, the complete description of which requires knowledge of its probability density function, which is unknown. The mean concentration is the first moment of the concentration distribution, and at present there exist a number of models for predicting the long-term mean concentration of an inert pollutant. The second moment, or mean square concentration, indicates additional features of the distribution, such as the level of fluctuations about the mean. In the paper a model proposed by Lamb for the long-term mean concentration is reviewed, and a new model for prediction of the long-term mean square concentration of an inert air pollutant is derived. The properties and uses of the model are discussed, and the equations defining the model are presented in a form for direct application to an urban area.
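
    For reference, the two moments discussed are the standard ones; writing p(c; x) for the (unknown) probability density of concentration c at location x:

```latex
% First and second moments of the concentration distribution, and the
% variance that quantifies fluctuations about the mean:
\langle c(\mathbf{x})\rangle   = \int_0^\infty c   \, p(c;\mathbf{x}) \, \mathrm{d}c, \qquad
\langle c^2(\mathbf{x})\rangle = \int_0^\infty c^2 \, p(c;\mathbf{x}) \, \mathrm{d}c, \qquad
\sigma_c^2(\mathbf{x}) = \langle c^2(\mathbf{x})\rangle - \langle c(\mathbf{x})\rangle^2 .
```

    The variance line makes explicit why the mean square concentration characterizes the level of fluctuations about the long-term mean.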

  4. A comparison of the medium-term impact and recovery of the Pakistan floods and the Haiti earthquake: objective and subjective measures.

    Science.gov (United States)

    Weiss, William M; Kirsch, Thomas D; Doocy, Shannon; Perrin, Paul

    2014-06-01

    The 2010 Haiti earthquake and Pakistan floods were similar in their massive human impact. Although the specific events were very different, the humanitarian response to disasters is supposed to achieve the same ends. This paper contrasts the disaster effects and aims to contrast the medium-term response. In January 2011, similarly structured population-based surveys were carried out in the most affected areas using stratified cluster designs (80×20 in Pakistan and 60×20 in Haiti) with probability proportional to size sampling. Displacement persisted in Haiti and Pakistan at 53% and 39% of households, respectively. In Pakistan, 95% of households reported damage to their homes and loss of income or livelihoods, and in Haiti, the rates were 93% and 85%, respectively. Frequency of displacement, and income or livelihood loss, were significantly higher in Pakistan, whereas disaster-related deaths or injuries were significantly more prevalent in Haiti. Given the rise in disaster frequency and costs, and the volatility of humanitarian funding streams as a result of the recent global financial crisis, it is increasingly important to measure the impact of humanitarian response against the goal of a return to normalcy.

  5. Atomic mass prediction from the mass formula with empirical shell terms

    International Nuclear Information System (INIS)

    Uno, Masahiro; Yamada, Masami

    1982-08-01

    The mass-excess prediction for about 8000 nuclides was calculated from two types of atomic mass formulas with the empirical shell terms of Uno and Yamada. The theoretical errors accompanying the calculated mass excess are also presented. These errors were obtained by a new statistical method. The mass-excess prediction includes a term for the gross feature of the nuclear mass surface, the shell terms, and a small correction term for odd-odd nuclei. Two functional forms for the shell terms were used: the first is the constant form, and the second is the linear form. In determining the values of the shell parameters, only the data for even-even and odd-A nuclei were used. A new statistical method was applied, in which the error inherent to the mass formula was taken into account. The obtained shell parameters and the values of mass excess are shown in tables. (Kato, T.)

  6. The seismic cycles of large Romanian earthquake: The physical foundation, and the next large earthquake in Vrancea

    International Nuclear Information System (INIS)

    Purcaru, G.

    2002-01-01

    The occurrence patterns of large/great earthquakes at subduction zone interfaces and in-slab are complex in their space-time dynamics, and make even long-term forecasts very difficult. For some favourable cases where a predictive (empirical) law was found, successful predictions were possible (e.g., Aleutians, Kuriles). For the large Romanian events (M > 6.7), occurring in the Vrancea seismic slab below 60 km, Purcaru (1974) first found a law for the occurrence time and magnitude: the law of 'quasicycles' and 'supercycles', for large and largest events (M > 7.25), respectively. Purcaru's quantitative model with these seismic cycles has three time-bands (periods of large earthquakes) per century, discovered using the (incomplete) earthquake history (1100-1973) of large Vrancea earthquakes for which M was initially estimated (Purcaru, 1974, 1979). Our long-term prediction model is essentially quasi-deterministic: it predicts the time and magnitude uniquely; since it is not strictly deterministic, the forecasts are interval-valued. It predicted the next large earthquake for 1980, in the 3rd time-band (1970-1990); the event occurred in 1977 (M 7.1, Mw 7.5). The prediction was successful in the long-term sense. We discuss the unpredicted events of 1986 and 1990. Since the laws are phenomenological, we give their physical foundation based on the large scale of the rupture zone (RZ) and the subscale of the rupture process (RP). First results show that: (1) the 1940 event (h = 122 km) ruptured the lower part of the oceanic slab entirely along strike and down dip, and similarly for 1977 but in its upper part; (2) the RZs of the 1977 and 1990 events overlap, and the first asperity of the 1977 event was rebroken in 1990. This shows that the size of the events depends strongly on the RZ, asperity size/strength and thus on the failure stress level (FSL), but not on depth; (3) when the FSL of high-strength (HS) larger zones is critical, the largest events (e.g., 1802, 1940) occur, thus explaining the supercycles (the 1940

  7. Short-term memory predictions across the lifespan: monitoring span before and after conducting a task.

    Science.gov (United States)

    Bertrand, Julie Marilyne; Moulin, Chris John Anthony; Souchay, Céline

    2017-05-01

    Our objective was to explore metamemory in short-term memory across the lifespan. Five age groups participated in this study: 3 groups of children (4-13 years old), and younger and older adults. We used a three-phase task: prediction-span-postdiction. For prediction and postdiction phases, participants reported with a Yes/No response if they could recall in order a series of images. For the span task, they had to actually recall such series. From 4 years old, children have some ability to monitor their short-term memory and are able to adjust their prediction after experiencing the task. However, accuracy still improves significantly until adolescence. Although the older adults had a lower span, they were as accurate as young adults in their evaluation, suggesting that metamemory is unimpaired for short-term memory tasks in older adults. •We investigate metamemory for short-term memory tasks across the lifespan. •We find younger children cannot accurately predict their span length. •Older adults are accurate in predicting their span length. •People's metamemory accuracy was related to their short-term memory span.

  8. Reliability of Modern Scores to Predict Long-Term Mortality After Isolated Aortic Valve Operations.

    Science.gov (United States)

    Barili, Fabio; Pacini, Davide; D'Ovidio, Mariangela; Ventura, Martina; Alamanni, Francesco; Di Bartolomeo, Roberto; Grossi, Claudio; Davoli, Marina; Fusco, Danilo; Perucci, Carlo; Parolari, Alessandro

    2016-02-01

    Contemporary scores for estimating perioperative mortality have been proposed to also predict long-term mortality. The aim of the study was to evaluate the performance of the updated European System for Cardiac Operative Risk Evaluation II, The Society of Thoracic Surgeons Predicted Risk of Mortality score, and the Age, Creatinine, Left Ventricular Ejection Fraction score for predicting long-term mortality in a contemporary cohort of isolated aortic valve replacement (AVR). We also sought to develop for each score a simple algorithm based on predicted perioperative risk to predict long-term survival. Complete data on 1,444 patients who underwent isolated AVR in a 7-year period were retrieved from three prospective institutional databases and linked with the Italian Tax Register Information System. Data were evaluated with performance analyses and time-to-event semiparametric regression. Survival was 83.0% ± 1.1% at 5 years and 67.8% ± 1.9% at 8 years. Discrimination and calibration of all three scores worsened for prediction of death at 1 year and 5 years. Nonetheless, a significant relationship was found between long-term survival and quartiles of all three scores, with hazard ratios including 1.34 (95% CI, 1.28 to 1.40) for The Society of Thoracic Surgeons score and 1.08 (95% CI, 1.06 to 1.10) for the Age, Creatinine, Left Ventricular Ejection Fraction score. The predicted risk generated by the European System for Cardiac Operative Risk Evaluation II, The Society of Thoracic Surgeons score, and the Age, Creatinine, Left Ventricular Ejection Fraction score cannot be considered a direct estimate of the long-term risk of death. Nonetheless, the three scores can be used to derive an estimate of the long-term risk of death in patients who undergo isolated AVR with the use of a simple algorithm. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  9. Short-term wind power prediction based on LSSVM–GSA model

    International Nuclear Information System (INIS)

    Yuan, Xiaohui; Chen, Chen; Yuan, Yanbin; Huang, Yuehua; Tan, Qingxiong

    2015-01-01

    Highlights: • A hybrid model is developed for short-term wind power prediction. • The model is based on LSSVM and gravitational search algorithm. • Gravitational search algorithm is used to optimize parameters of LSSVM. • Effect of different kernel function of LSSVM on wind power prediction is discussed. • Comparative studies show that prediction accuracy of wind power is improved. - Abstract: Wind power forecasting can improve the economical and technical integration of wind energy into the existing electricity grid. Due to its intermittency and randomness, it is hard to forecast wind power accurately. For the purpose of utilizing wind power to the utmost extent, it is very important to make an accurate prediction of the output power of a wind farm under the premise of guaranteeing the security and the stability of the operation of the power system. In this paper, a hybrid model (LSSVM–GSA) based on the least squares support vector machine (LSSVM) and gravitational search algorithm (GSA) is proposed to forecast the short-term wind power. As the kernel function and the related parameters of the LSSVM have a great influence on the performance of the prediction model, the paper establishes LSSVM model based on different kernel functions for short-term wind power prediction. And then an optimal kernel function is determined and the parameters of the LSSVM model are optimized by using GSA. Compared with the Back Propagation (BP) neural network and support vector machine (SVM) model, the simulation results show that the hybrid LSSVM–GSA model based on exponential radial basis kernel function and GSA has higher accuracy for short-term wind power prediction. Therefore, the proposed LSSVM–GSA is a better model for short-term wind power prediction
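
    The core of an LSSVM regressor is a single linear system; the sketch below solves the standard dual equations with an RBF kernel on synthetic data, with the gravitational search step replaced by fixed hyperparameters, a stand-in for (not a reproduction of) the paper's GSA tuning.

```python
# Minimal LSSVM regression: the standard dual linear system, RBF kernel.
import numpy as np

def rbf(a, b, sigma):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    n = len(y)
    K = rbf(X, X, sigma)
    # Block system: [[0, 1^T], [1, K + I/gamma]] @ [b, alpha] = [0, y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]                  # bias b, dual weights alpha

def lssvm_predict(X_train, alpha, b, sigma, X_new):
    return rbf(X_new, X_train, sigma) @ alpha + b

rng = np.random.default_rng(3)
X = rng.uniform(0, 10, (200, 1))            # e.g. lagged wind power values
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
b, alpha = lssvm_fit(X, y, gamma=10.0, sigma=1.0)
print(lssvm_predict(X, alpha, b, 1.0, np.array([[5.0]])))
```

    In practice (gamma, sigma) would be selected by an optimizer such as GSA or a grid search over validation error; the kernel choice itself is one of the comparisons the paper reports.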

  10. Earthquake forewarning in the Cascadia region

    Science.gov (United States)

    Gomberg, Joan S.; Atwater, Brian F.; Beeler, Nicholas M.; Bodin, Paul; Davis, Earl; Frankel, Arthur; Hayes, Gavin P.; McConnell, Laura; Melbourne, Tim; Oppenheimer, David H.; Parrish, John G.; Roeloffs, Evelyn A.; Rogers, Gary D.; Sherrod, Brian; Vidale, John; Walsh, Timothy J.; Weaver, Craig S.; Whitmore, Paul M.

    2015-08-10

    This report, prepared for the National Earthquake Prediction Evaluation Council (NEPEC), is intended as a step toward improving communications about earthquake hazards between information providers and users who coordinate emergency-response activities in the Cascadia region of the Pacific Northwest. NEPEC charged a subcommittee of scientists with writing this report about forewarnings of increased probabilities of a damaging earthquake. We begin by clarifying some terminology; a “prediction” refers to a deterministic statement that a particular future earthquake will or will not occur. In contrast to the 0- or 100-percent likelihood of a deterministic prediction, a “forecast” describes the probability of an earthquake occurring, which may range from >0 to <100 percent. Forewarnings would be based on observed processes or conditions, which may include increased rates of M>4 earthquakes on the plate interface north of the Mendocino region

  11. Earthquake Facts

    Science.gov (United States)

    ... North Dakota, and Wisconsin. The core of the earth was the first internal structural element to be identified. In 1906 R.D. Oldham discovered it from his studies of earthquake records. The inner core is solid, and the outer core is liquid and so does not transmit ...

  12. Understanding Earthquakes

    Science.gov (United States)

    Davis, Amanda; Gray, Ron

    2018-01-01

    December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

  13. Electrical streaming potential precursors to catastrophic earthquakes in China

    Directory of Open Access Journals (Sweden)

    F. Qian

    1997-06-01

    Full Text Available The majority of self-potential anomalies at 7 stations within 160 km of the epicentre showed a similar pattern of rapid onset and slow decay before and during the M 7.8 Tangshan earthquake of 1976. Considering that some of these anomalies were associated with episodic spouting from boreholes or with increases in pore pressure in wells, the observed anomalies are streaming potentials generated by local events of sudden movement and by the diffusion of high-pressure fluid in parallel faults. These transient events, triggered by tidal forces, exhibited a periodic nature and a statistical tendency to migrate towards the epicentre about one month before the earthquake. As a result of these events, the pore pressure reached a final equilibrium state higher than the initial state in a large enough section of the fault region. Consequently, the local effective shear strength of the material in the fault zone decreased, and finally the catastrophic earthquake was induced. Similar phenomena also occurred one month before the M 7.3 Haicheng earthquake of 1975. Therefore, short-term earthquake prediction can be made by electrical measurements, which are the kind of geophysical measurements most closely related to pore fluid behavior in the deep crust.

  14. Radon anomaly in soil gas as an earthquake precursor

    International Nuclear Information System (INIS)

    Miklavcic, I.; Radolic, V.; Vukovic, B.; Poje, M.; Varga, M.; Stanic, D.; Planinic, J.

    2008-01-01

    The mechanical processes of earthquake preparation are always accompanied by deformations; afterwards, complex short- or long-term precursory phenomena can appear. Anomalies of radon concentration in soil gas are registered a few weeks or months before many earthquakes. Radon concentrations in soil gas were continuously measured with LR-115 nuclear track detectors at site A (Osijek) during a 4-year period, and with a Barasol semiconductor detector at site B (Kasina) during 2 years. We investigated the influence of meteorological parameters on the temporal radon variations, and we determined a multiple regression equation that enabled the reduction (deconvolution) of the radon variation caused by barometric pressure, rainfall and temperature. The pre-earthquake radon anomalies at site A indicated 46% of the seismic events, on the criterion M≥3, R<200 km, and 21% at site B. Empirical equations relating earthquake magnitude, epicentral distance and precursor time enabled estimation or prediction of an earthquake that will arise at epicentral distance R from the monitoring site within the expected precursor time T

  15. Radon anomaly in soil gas as an earthquake precursor

    Energy Technology Data Exchange (ETDEWEB)

    Miklavcic, I.; Radolic, V.; Vukovic, B.; Poje, M.; Varga, M.; Stanic, D. [Department of Physics, University of Osijek, Trg Ljudevita Gaja 6, POB 125, 31000 Osijek (Croatia); Planinic, J. [Department of Physics, University of Osijek, Trg Ljudevita Gaja 6, POB 125, 31000 Osijek (Croatia)], E-mail: planinic@ffos.hr

    2008-10-15

    The mechanical processes of earthquake preparation are always accompanied by deformations; afterwards, complex short- or long-term precursory phenomena can appear. Anomalies of radon concentration in soil gas are registered a few weeks or months before many earthquakes. Radon concentrations in soil gas were continuously measured with LR-115 nuclear track detectors at site A (Osijek) during a 4-year period, and with a Barasol semiconductor detector at site B (Kasina) during 2 years. We investigated the influence of meteorological parameters on the temporal radon variations, and we determined a multiple regression equation that enabled the reduction (deconvolution) of the radon variation caused by barometric pressure, rainfall and temperature. The pre-earthquake radon anomalies at site A indicated 46% of the seismic events, on the criterion M≥3, R<200 km, and 21% at site B. Empirical equations relating earthquake magnitude, epicentral distance and precursor time enabled estimation or prediction of an earthquake that will arise at epicentral distance R from the monitoring site within the expected precursor time T.
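
    A sketch of the deconvolution step both records describe, assuming an ordinary least-squares regression of radon on pressure, rainfall and temperature over synthetic data; the stations' actual coefficients are not reproduced.

```python
# Regress radon on meteorological drivers, then screen the residual
# (meteorology removed) for pre-earthquake anomalies. Synthetic data.
import numpy as np

rng = np.random.default_rng(4)
n = 365
pressure = 1013 + rng.normal(0, 6, n)
rain = rng.exponential(2, n)
temp = 12 + 10 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
radon = (8 - 0.05 * (pressure - 1013) + 0.3 * rain + 0.1 * temp
         + rng.normal(0, 0.8, n))

X = np.column_stack([np.ones(n), pressure, rain, temp])
beta, *_ = np.linalg.lstsq(X, radon, rcond=None)
residual = radon - X @ beta                   # deconvolved series

threshold = residual.mean() + 2 * residual.std()
print("anomalous days:", np.flatnonzero(residual > threshold))
```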

  16. Prediction of Human Phenotype Ontology terms by means of hierarchical ensemble methods.

    Science.gov (United States)

    Notaro, Marco; Schubach, Max; Robinson, Peter N; Valentini, Giorgio

    2017-10-12

    The prediction of human gene-abnormal phenotype associations is a fundamental step toward the discovery of novel genes associated with human disorders, especially when no genes are known to be associated with a specific disease. In this context the Human Phenotype Ontology (HPO) provides a standard categorization of the abnormalities associated with human diseases. While the problem of predicting gene-disease associations has been widely investigated, the related problem of gene-phenotypic feature (i.e., HPO term) associations has been largely overlooked, even though for most human genes no HPO term associations are known, and despite the increasing application of the HPO to relevant medical problems. Moreover, most of the methods proposed in the literature are not able to capture the hierarchical relationships between HPO terms, resulting in inconsistent and relatively inaccurate predictions. We present two hierarchical ensemble methods that we formally prove to provide biologically consistent predictions according to the hierarchical structure of the HPO. The modular structure of the proposed methods, which consists of a "flat" learning first step and a hierarchical combination of the predictions in the second step, allows the predictions of virtually any flat learning method to be enhanced. The experimental results show that hierarchical ensemble methods are able to predict novel associations between genes and abnormal phenotypes with results that are competitive with state-of-the-art algorithms and with a significant reduction of the computational complexity. Hierarchical ensembles are efficient computational methods that guarantee biologically meaningful predictions that obey the true path rule, and can be used as a tool to improve and make consistent HPO term predictions starting from virtually any flat learning method. The implementation of the proposed methods is available as an R package from the CRAN repository.
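
    One simple correction consistent with the true path rule is bottom-up max propagation, shown below on a toy ontology; this illustrates the consistency requirement, not necessarily the authors' two ensemble algorithms.

```python
# After a flat learner scores each HPO term for a gene, the true path
# rule requires that an annotation to a term imply annotation to all
# its ancestors, so no term may score below any of its descendants.
from collections import defaultdict

parents = {                     # child -> parents (toy ontology DAG)
    "HP:B": ["HP:A"], "HP:C": ["HP:A"], "HP:D": ["HP:B", "HP:C"],
}
flat = {"HP:A": 0.2, "HP:B": 0.1, "HP:C": 0.5, "HP:D": 0.7}

children = defaultdict(list)
for child, ps in parents.items():
    for p in ps:
        children[p].append(child)

def consistent(term):
    """Bottom-up max propagation: score(term) >= score(any descendant)."""
    return max([flat[term]] + [consistent(c) for c in children[term]])

scores = {t: consistent(t) for t in flat}
print(scores)   # every ancestor of HP:D is lifted to 0.7
```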

  17. Long-term orbit prediction for Tiangong-1 spacecraft using the mean atmosphere model

    Science.gov (United States)

    Tang, Jingshi; Liu, Lin; Cheng, Haowen; Hu, Songjie; Duan, Jianfeng

    2015-03-01

    China is planning to complete its first space station by 2020. For the long-term management and maintenance, the orbit of the space station needs to be predicted for a long period of time. Since the space station is expected to work in a low-Earth orbit, the error in the a priori atmosphere model contributes significantly to the rapid increase of the predicted orbit error. When the orbit is predicted for 20 days, the error in the a priori atmosphere model, if not properly corrected, could induce semi-major axis errors of up to a few kilometers and overall position errors of several thousand kilometers. In this work, we use a mean atmosphere model averaged from NRLMSISE00. The a priori reference mean density can be corrected during the orbit determination. For the long-term orbit prediction, we use a sufficiently long period of observations and obtain a series of diurnal mean densities. This series contains the recent variation of the atmospheric density and can be analyzed for various periodic components. After being properly fitted, the mean density can be predicted and then applied in the orbit prediction. Here we carry out the test with China's Tiangong-1 spacecraft at an altitude of about 340 km and we show that this method is simple and flexible. The densities predicted with this approach can serve in the long-term orbit prediction. In several 20-day prediction tests, most predicted orbits show semi-major axis errors better than 700 m and overall position errors better than 400 km.
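
    The density extrapolation step can be sketched as an ordinary least-squares fit of a trend plus a few periodic components to the series of diurnal mean densities. The periods used below are assumptions, as is the synthetic density series.

```python
import numpy as np

def fit_mean_density(t, rho, periods=(27.0, 13.5)):   # periods in days, assumed
    """Least-squares fit of trend + harmonics; returns a predictor function."""
    def design(tt):
        cols = [np.ones_like(tt), tt]
        for P in periods:
            cols += [np.cos(2 * np.pi * tt / P), np.sin(2 * np.pi * tt / P)]
        return np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(design(t), rho, rcond=None)
    return lambda tt: design(tt) @ coef

t = np.arange(120.0)                                  # 120 days of mean densities
rho = 3e-12 + 1e-13 * np.cos(2 * np.pi * t / 27.0)    # synthetic series (kg/m^3)
predict = fit_mean_density(t, rho)
rho_forecast = predict(np.arange(120.0, 140.0))       # densities for a 20-day arc
```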

  18. Stochastic Short-term High-resolution Prediction of Solar Irradiance and Photovoltaic Power Output

    Energy Technology Data Exchange (ETDEWEB)

    Melin, Alexander M. [ORNL]; Olama, Mohammed M. [ORNL]; Dong, Jin [ORNL]; Djouadi, Seddik M. [ORNL]; Zhang, Yichen [University of Tennessee, Knoxville (UTK), Department of Electrical Engineering and Computer Science]

    2017-09-01

    The increased penetration of solar photovoltaic (PV) energy sources into electric grids has increased the need for accurate modeling and prediction of solar irradiance and power production. Existing modeling and prediction techniques focus on long-term low-resolution prediction over minutes to years. This paper examines the stochastic modeling and short-term high-resolution prediction of solar irradiance and PV power output. We propose a stochastic state-space model to characterize the behaviors of solar irradiance and PV power output. This prediction model is suitable for the development of optimal power controllers for PV sources. A filter-based expectation-maximization and Kalman filtering mechanism is employed to estimate the parameters and states in the state-space model. The mechanism results in a finite dimensional filter which only uses the first and second order statistics. The structure of the scheme contributes to a direct prediction of the solar irradiance and PV power output without any linearization process or simplifying assumptions of the signal’s model. This enables the system to accurately predict small as well as large fluctuations of the solar signals. The mechanism is recursive allowing the solar irradiance and PV power to be predicted online from measurements. The mechanism is tested using solar irradiance and PV power measurement data collected locally in our lab.
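
    The state-space filtering idea can be illustrated with a scalar Kalman recursion. The model, noise levels, and synthetic irradiance below are illustrative assumptions; the paper estimates the model parameters with a filter-based expectation-maximization algorithm rather than fixing them by hand.

```python
import numpy as np

def kalman_one_step(y, a=0.95, q=0.1, r=0.5):
    """One-step-ahead predictions for x_k = a*x_{k-1} + w, y_k = x_k + v."""
    x_est, p = y[0], 1.0
    preds = []
    for yk in y[1:]:
        x_pred, p_pred = a * x_est, a * a * p + q      # predict
        preds.append(x_pred)
        k = p_pred / (p_pred + r)                      # Kalman gain
        x_est = x_pred + k * (yk - x_pred)             # update with measurement
        p = (1 - k) * p_pred
    return np.array(preds)

rng = np.random.default_rng(1)
irradiance = 600 + 50 * np.sin(np.linspace(0, 3, 200)) + rng.normal(0, 5, 200)
level = irradiance.mean()                              # filter the deviations
pred = level + kalman_one_step(irradiance - level)     # online, recursive
```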

  19. Standardizing the performance evaluation of short-term wind prediction models

    DEFF Research Database (Denmark)

    Madsen, Henrik; Pinson, Pierre; Kariniotakis, G.

    2005-01-01

    Short-term wind power prediction is a primary requirement for efficient large-scale integration of wind generation in power systems and electricity markets. The choice of an appropriate prediction model among the numerous available models is not trivial, and has to be based on an objective evaluation of model performance. This paper proposes a standardized protocol for the evaluation of short-term wind-power prediction systems. A number of reference prediction models are also described, and their use for performance comparison is analysed. The use of the protocol is demonstrated using results from both on-shore and off-shore wind farms. The work was developed in the frame of the Anemos project (EU R&D project), where the protocol has been used to evaluate more than 10 prediction systems.
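
    Such a protocol typically defines prediction errors normalized by installed capacity and reported per look-ahead time. Below is a minimal sketch of two such measures with invented data; the exact definitions in the protocol may differ.

```python
import numpy as np

def nmae(measured, predicted, capacity):
    """Mean absolute error normalized by installed capacity."""
    return np.mean(np.abs(measured - predicted)) / capacity

def nrmse(measured, predicted, capacity):
    """Root mean squared error normalized by installed capacity."""
    return np.sqrt(np.mean((measured - predicted) ** 2)) / capacity

measured = np.array([3.1, 4.8, 6.0, 5.2])   # MW, at one look-ahead time
predicted = np.array([2.9, 5.1, 5.5, 4.6])  # MW
print(nmae(measured, predicted, capacity=10.0),
      nrmse(measured, predicted, capacity=10.0))
```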

  20. Development of an integrated method for long-term water quality prediction using seasonal climate forecast

    Directory of Open Access Journals (Sweden)

    J. Cho

    2016-10-01

    The APEC Climate Center (APCC) produces climate prediction information utilizing a multi-climate model ensemble (MME) technique. In this study, four different downscaling methods, in accordance with the degree of utilizing the seasonal climate prediction information, were developed in order to improve predictability and to refine the spatial scale. These methods include: (1) the Simple Bias Correction (SBC) method, which directly uses APCC's dynamic prediction data with a 3- to 6-month lead time; (2) the Moving Window Regression (MWR) method, which indirectly utilizes dynamic prediction data; (3) the Climate Index Regression (CIR) method, which predominantly uses observation-based climate indices; and (4) the Integrated Time Regression (ITR) method, which uses predictors selected from both CIR and MWR. Then, a sampling-based temporal downscaling was conducted using the Mahalanobis distance method in order to create daily weather inputs to the Soil and Water Assessment Tool (SWAT) model. Long-term predictability of water quality within the Wecheon watershed of the Nakdong River Basin was evaluated. According to the Korean Ministry of Environment's Provisions of Water Quality Prediction and Response Measures, modeling-based predictability was evaluated by using 3-month lead prediction data issued in February, May, August, and November as model input to SWAT. Finally, an integrated approach, which takes into account various climate information and downscaling methods for water quality prediction, was presented. This integrated approach can be used to prevent potential problems caused by extreme climate in advance.

  1. Scenario-based earthquake hazard and risk assessment for Baku (Azerbaijan)

    Directory of Open Access Journals (Sweden)

    G. Babayev

    2010-12-01

    A rapid growth of population, intensive civil and industrial building, land and water instabilities (e.g., landslides, significant underground water level fluctuations), and the lack of public awareness regarding seismic hazard contribute to the increase of vulnerability of Baku (the capital city of the Republic of Azerbaijan) to earthquakes. In this study, we assess earthquake risk in the city, determined as a convolution of seismic hazard (in terms of the surface peak ground acceleration, PGA), vulnerability (due to building construction fragility, population features, the gross domestic product per capita, and landslide occurrence), and exposure of infrastructure and critical facilities. The earthquake risk assessment provides useful information to identify the factors influencing the risk. A deterministic seismic hazard for Baku is analysed for four earthquake scenarios: near, far, local, and extreme events. The seismic hazard models demonstrate the level of ground shaking in the city: high PGA values are predicted in the southern coastal and north-eastern parts of the city and in some parts of the downtown. The PGA attains its maximal values for the local and extreme earthquake scenarios. We show that the quality of buildings and the probability of their damage, the distribution of urban population, exposure, and the pattern of peak ground acceleration contribute to the seismic risk, while the vulnerability factors play a more prominent role for all earthquake scenarios. Our results can support the elaboration of strategic countermeasure plans for earthquake risk mitigation in the city of Baku.
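
    The risk convolution can be sketched schematically as a cell-by-cell combination of normalized hazard, vulnerability and exposure layers. The multiplicative form, the random grids and the hotspot threshold below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(2)
shape = (50, 50)                              # city discretized into grid cells
pga = rng.uniform(0.1, 0.4, shape)            # hazard: scenario PGA (g)
vulnerability = rng.uniform(0.2, 1.0, shape)  # fragility, population, GDP, ...
exposure = rng.uniform(0.0, 1.0, shape)       # infrastructure, critical facilities

def normalize(x):
    return (x - x.min()) / (x.max() - x.min())

risk = normalize(pga) * normalize(vulnerability) * normalize(exposure)
hotspots = np.argwhere(risk > 0.6)            # cells for countermeasure planning
```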

  2. Strong ground motion of the 2016 Kumamoto earthquake

    Science.gov (United States)

    Aoi, S.; Kunugi, T.; Suzuki, W.; Kubo, H.; Morikawa, N.; Fujiwara, H.

    2016-12-01

    The 2016 Kumamoto earthquake sequence is composed of an Mw 6.1 event that occurred in the Kumamoto region at 21:26 on April 14 and an Mw 7.1 event that occurred 28 hours later, at 1:25 on April 16, 2016 (JST). These earthquakes are considered to have ruptured mainly the Hinagu fault zone for the Mw 6.1 event and the Futagawa fault zone for the Mw 7.1 event, for both of which the Headquarters for Earthquake Research Promotion had performed long-term evaluation as well as seismic hazard assessment prior to the 2016 Kumamoto earthquake. Strong shaking with seismic intensity 7 on the JMA scale was observed four times in total: in Mashiki town for the Mw 6.1 and Mw 7.1 events, in Nishihara village for the Mw 7.1 event, and at NIED/KiK-net Mashiki (KMMH16) for the Mw 7.1 event. KiK-net Mashiki (KMMH16) recorded peak ground acceleration of more than 1000 cm/s/s, and Nishihara village recorded peak ground velocity of more than 250 cm/s. Ground motions were observed over a wider area for the Mw 7.1 event than for the Mw 6.1 event. Peak ground accelerations and peak ground velocities of K-NET/KiK-net stations are consistent with the ground motion prediction equations by Si and Midorikawa (1999). Peak ground velocities at distances longer than 200 km attenuate slowly, which can be attributed to the large Love wave with a dominant period around 10 seconds. The 5%-damped pseudo spectral velocity of the Mashiki town record shows a peak at periods of 1-2 s that exceeds the ground motion response of JR Takatori of the 1995 Kobe earthquake and of Kawaguchi town of the 2004 Chuetsu earthquake. The 5%-damped pseudo spectral velocity of the Nishihara village record shows a 350 cm/s peak at periods of 3-4 s, similar to several stations in the Kathmandu basin reported by Takai et al. (2016) for the 2015 Gorkha earthquake in Nepal. Ground motions at several stations in Oita exceed the ground motion prediction equations due to an earthquake induced by the Mw 7.1 event. Peak ground accelerations of K-NET Yufuin (OIT009) records 90 cm/s/s for the Mw 7

  3. Predicting long-term temperature increase for time-dependent SAR levels with a single short-term temperature response.

    Science.gov (United States)

    Carluccio, Giuseppe; Bruno, Mary; Collins, Christopher M

    2016-05-01

    We present a novel method for rapid prediction of temperature in vivo for a series of pulse sequences with differing levels and distributions of specific energy absorption rate (SAR). After the temperature response to a brief period of heating is characterized, a rapid estimate of temperature during a series of periods at different heating levels is made using a linear heat equation and impulse-response (IR) concepts. Here the initial characterization and long-term prediction for a complete spine exam are made with Pennes' bioheat equation where, at first, core body temperature is allowed to increase and local perfusion is not. Then corrections through time allowing variation in local perfusion are introduced. The fast IR-based method predicted maximum temperature increase within 1% of that with a full finite difference simulation, but required less than 3.5% of the computation time. Even higher accelerations are possible depending on the time step size chosen, with loss in temporal resolution. Correction for temperature-dependent perfusion requires negligible additional time and can be adjusted to be more or less conservative than the corresponding finite difference simulation. With appropriate methods, it is possible to rapidly predict temperature increase throughout the body for actual MR examinations. © 2015 Wiley Periodicals, Inc.
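
    The impulse-response idea rests on the linearity of the heat equation: once the temperature response to a brief unit burst of SAR has been characterized, the response to any SAR schedule follows by discrete convolution. In the sketch below the impulse response is a made-up decaying exponential standing in for the characterized response, and all numbers are illustrative.

```python
import numpy as np

dt = 10.0                                   # time step (s), assumed
t = np.arange(0, 600, dt)
impulse_response = np.exp(-t / 120.0)       # placeholder for the characterized
impulse_response /= impulse_response.sum()  # response; normalized, illustrative

# SAR schedule (W/kg) for a series of sequences, e.g. a multi-scan spine exam
sar = np.concatenate([np.full(30, 2.0), np.full(30, 0.5), np.full(30, 3.0)])

# Linearity: temperature rise = convolution of the SAR history with the
# impulse response (arbitrary scaling for this sketch)
temperature_rise = np.convolve(sar, impulse_response)[: sar.size]
```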

  4. A review on the young history of the wind power short-term prediction

    DEFF Research Database (Denmark)

    Costa, A.; Crespo, A.; Navarro, J.

    2008-01-01

    This paper makes a brief review of 30 years of history of wind power short-term prediction, from the first ideas and sketches on the theme to the actual state of the art of models and tools, giving emphasis to the most significant proposals and developments. The two principal lines of thought on short-term prediction (mathematical and physical) are treated here without distinction, and comparisons between models and tools are avoided, mainly because, on the one hand, a standard for a measure of performance is still not adopted and, on the other hand, it is very important that the data are exactly...

  5. Short-Term Wind Speed Prediction Using EEMD-LSSVM Model

    Directory of Open Access Journals (Sweden)

    Aiqing Kang

    2017-01-01

    A hybrid Ensemble Empirical Mode Decomposition (EEMD) and Least Square Support Vector Machine (LSSVM) model is proposed to improve short-term wind speed forecasting precision. The EEMD is first utilized to decompose the original wind speed time series into a set of subseries. Then LSSVM models are established to forecast these subseries. The partial autocorrelation function is adopted to analyze the inner relationships within the historical wind speed series in order to determine the input variables of the LSSVM models for prediction of each subseries. Finally, the superposition principle is employed to sum the predicted values of the subseries as the final wind speed prediction. The performance of the hybrid model is evaluated based on six metrics. Compared with LSSVM, Back Propagation Neural Networks (BP), Auto-Regressive Integrated Moving Average (ARIMA), a combination of Empirical Mode Decomposition (EMD) with LSSVM, and a hybrid of EEMD with ARIMA, the wind speed forecasting results show that the proposed hybrid model outperforms these models in terms of the six metrics. Furthermore, scatter diagrams of predicted versus actual wind speed and histograms of prediction errors are presented to verify the superiority of the hybrid model in short-term wind speed prediction.
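
    A minimal sketch of the hybrid scheme follows, assuming the third-party PyEMD package for EEMD and substituting scikit-learn's epsilon-SVR for LSSVM (a stand-in, since least-squares SVM is not part of scikit-learn). Lagged values of each subseries serve as model inputs, and the one-step forecasts of the subseries are summed.

```python
import numpy as np
from PyEMD import EEMD
from sklearn.svm import SVR

def lagged(x, p=3):
    """Rows of p consecutive values predicting the next one."""
    X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
    return X, x[p:]

rng = np.random.default_rng(3)
wind = 8 + 2 * np.sin(np.arange(300) / 10) + rng.normal(0, 0.5, 300)  # m/s

imfs = EEMD().eemd(wind)                     # decompose into subseries
forecast = 0.0
for imf in imfs:
    X, y = lagged(imf)
    model = SVR().fit(X, y)                  # epsilon-SVR stand-in for LSSVM
    forecast += model.predict(imf[-3:].reshape(1, -1))[0]  # next value of IMF
print("one-step wind speed forecast:", forecast)
```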

  6. Remaining uncertainties in predicting long-term performance of nuclear waste glass from experiments

    International Nuclear Information System (INIS)

    Grambow, B.

    1994-01-01

    The current knowledge on the glass dissolution mechanism and the representation of glass dissolution concepts within overall repository performance assessment models are briefly summarized, and uncertainties related to mechanism, radionuclide chemistry and parameters are discussed. Understanding of the major glass dissolution processes has increased significantly in recent years. Long-term glass stability is related to the long-term maintenance of silica-saturated conditions. The behavior of individual radionuclides in the presence of a dissolving glass has not been sufficiently investigated, and results do not yet allow meaningful predictions. Conservative long-term predictions of glass matrix dissolution, as an upper limit for radionuclide release, can be made with sufficient confidence; however, these estimations generally result in a situation where the barrier function of the glass is masked by the efficiency of the geologic barrier. Realistic long-term predictions may show that the borosilicate waste glass contributes to overall repository safety to a much larger extent than indicated by such overconservatism. Today realistic predictions remain highly uncertain and much more research work is necessary. In particular, the long-term rate under silica-saturated conditions needs to be understood, and the behavior of individual radionuclides in the presence of a dissolving glass deserves more systematic investigation.

  7. Predicting short term mood developments among depressed patients using adherence and ecological momentary assessment data

    Directory of Open Access Journals (Sweden)

    Adam Mikus

    2018-06-01

    Technology-driven interventions provide us with an increasing amount of fine-grained data about the patient. This data includes regular ecological momentary assessments (EMA) but also response times to EMA questions by a user. When observing this data, we see a huge variation between the patterns exhibited by different patients. Some are more stable while others vary a lot over time. This poses a challenging problem for the domain of artificial intelligence and makes one wonder whether it is possible to predict the future mental state of a patient using the data that is available. In the end, these predictions could potentially contribute to interventions that tailor the feedback to the user on a daily basis, for example by warning a user that a fall-back might be expected during the next days, or by applying a strategy to prevent the fall-back from occurring in the first place. In this work, we focus on short-term mood prediction by considering the adherence and usage data as an additional predictor. We apply recurrent neural networks to handle the temporal aspects best and explore whether individual, group-level, or one single predictive model provides the highest predictive performance, measured using the root mean squared error (RMSE). We use data collected from patients from five countries who used the ICT4Depression/MoodBuster platform in the context of the EU E-COMPARED project. In total, we used the data from 143 patients (with between 9 and 425 days of EMA data) who were diagnosed with a major depressive disorder according to DSM-IV. Results show that we can predict short-term mood change quite accurately (RMSE ranging between 0.065 and 0.11). The past EMA mood ratings proved to be the most influential, while adherence and usage data did not improve prediction accuracy. In general, group-level predictions proved to be the most promising; however, differences were not significant. Short-term mood prediction remains a difficult task
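
    The recurrent-network idea can be sketched with a small LSTM, assuming PyTorch: a window of past EMA mood ratings plus an adherence feature is regressed onto the next day's mood. Dimensions, training data and hyperparameters are illustrative only.

```python
import torch
import torch.nn as nn

class MoodLSTM(nn.Module):
    def __init__(self, n_features=2, hidden=16):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, days, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # regress next-day mood

torch.manual_seed(0)
x = torch.rand(32, 7, 2)                  # 32 windows: 7 days x (mood, adherence)
y = torch.rand(32, 1)                     # next-day mood rating (scaled 0-1)

model = MoodLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
print("train RMSE:", loss.sqrt().item())
```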

  8. The Impact of EuroSCORE II Risk Factors on Prediction of Long-Term Mortality.

    Science.gov (United States)

    Barili, Fabio; Pacini, Davide; D'Ovidio, Mariangela; Dang, Nicholas C; Alamanni, Francesco; Di Bartolomeo, Roberto; Grossi, Claudio; Davoli, Marina; Fusco, Danilo; Parolari, Alessandro

    2016-10-01

    The European System for Cardiac Operation Risk Evaluation (EuroSCORE) II has not yet been tested for predicting long-term mortality. This study was undertaken to evaluate the relationship between EuroSCORE II and long-term mortality and to develop a new algorithm based on EuroSCORE II factors to predict long-term survival after cardiac surgery. Complete data on 10,033 patients who underwent major cardiac surgery during a 7-year period were retrieved from three prospective institutional databases and linked with the Italian Tax Register Information System. Mortality at follow-up was analyzed with time-to-event analysis. The Kaplan-Meier estimates of survival at 1 and 5 years were, respectively, 95.0% ± 0.2% and 84.7% ± 0.4%. Both discrimination and calibration of EuroSCORE II decreased in the prediction of 1-year and 5-year mortality. Nonetheless, EuroSCORE II was confirmed to be an independent predictor of long-term mortality with a nonlinear trend. Several EuroSCORE II variables were independent risk factors for long-term mortality in a regression model, most of all very low ejection fraction (less than 20%), salvage operation, and dialysis. In the final model, isolated mitral valve surgery and isolated coronary artery bypass graft surgery were associated with improved long-term survival. The EuroSCORE II cannot be considered a direct estimator of long-term risk of death, as its performance fades for mortality at follow-up longer than 30 days. Nonetheless, it is nonlinearly associated with long-term mortality, and most of its variables are risk factors for long-term mortality. Hence, they can be used in a different algorithm to stratify the risk of long-term mortality after surgery. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  9. Generation of earthquake signals

    International Nuclear Information System (INIS)

    Kjell, G.

    1994-01-01

    Seismic verification can be performed either as a full-scale test on a shaker table or as numerical calculations. In both cases it is necessary to have an earthquake acceleration time history. This report describes the generation of such time histories by filtering white noise. Analogue and digital filtering methods are compared. Different methods of predicting the response spectrum of a white noise signal filtered by a band-pass filter are discussed. Prediction of both the average response level and the statistical variation around this level is considered. Examples with both the IEEE 301 standard response spectrum and a ground spectrum suggested for Swedish nuclear power stations are included in the report.
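
    The filtering approach can be sketched as follows: Gaussian white noise is band-pass filtered and shaped by a simple build-up/decay envelope to obtain an earthquake-like acceleration time history. The 1-10 Hz band, the Butterworth filter and the envelope parameters are assumptions; matching a specific target response spectrum would require further iteration.

```python
import numpy as np
from scipy.signal import butter, lfilter

fs = 100.0                                    # sampling rate (Hz)
t = np.arange(0, 20, 1 / fs)                  # 20 s record
white = np.random.default_rng(4).normal(size=t.size)

b, a = butter(4, [1.0, 10.0], btype="bandpass", fs=fs)
accel = lfilter(b, a, white)                  # band-limited noise

# Simple build-up and decay envelope so the record resembles a strong-motion
# accelerogram; the shape parameters are arbitrary.
envelope = np.minimum(t / 2.0, 1.0) * np.exp(-0.15 * np.maximum(t - 10.0, 0.0))
accel *= envelope
```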

  10. Exploring the predictive power of interaction terms in a sophisticated risk equalization model using regression trees.

    Science.gov (United States)

    van Veen, S H C M; van Kleef, R C; van de Ven, W P M M; van Vliet, R C J A

    2018-02-01

    This study explores the predictive power of interaction terms between the risk adjusters in the Dutch risk equalization (RE) model of 2014. Due to the sophistication of this RE-model and the complexity of the associations in the dataset (N = ~16.7 million), there are theoretically more than a million interaction terms. We used regression tree modelling, which has been applied rarely within the field of RE, to identify interaction terms that statistically significantly explain variation in observed expenses that is not already explained by the risk adjusters in this RE-model. The interaction terms identified were used as additional risk adjusters in the RE-model. We found evidence that interaction terms can improve the prediction of expenses overall and for specific groups in the population. However, the prediction of expenses for some other selective groups may deteriorate. Thus, interactions can reduce financial incentives for risk selection for some groups but may increase them for others. Furthermore, because regression trees are not robust, additional criteria are needed to decide which interaction terms should be used in practice. These criteria could be the right incentive structure for risk selection and efficiency or the opinion of medical experts. Copyright © 2017 John Wiley & Sons, Ltd.
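
    One common way to hunt for such interactions is to fit a regression tree to the residuals of the additive model, so that splits combining two risk adjusters point to candidate interaction terms. The sketch below uses hypothetical features and simulated data, not the Dutch RE dataset.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(5)
n = 10000
age = rng.integers(18, 90, n)
chronic = rng.integers(0, 2, n)               # chronic-condition indicator
X = np.column_stack([age, chronic])

# True expenses contain an age x chronic interaction that an additive linear
# model misses, so its residuals still carry structure.
expenses = 100 + 5 * age + 800 * chronic + 30 * age * chronic \
           + rng.normal(0, 200, n)
design = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(design, expenses, rcond=None)
residuals = expenses - design @ beta

tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=500).fit(X, residuals)
print(export_text(tree, feature_names=["age", "chronic"]))  # splits reveal it
```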

  11. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds of feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
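
    A short-term-average/long-term-average detector on a per-minute tweet-count series can be sketched in a few lines. The window lengths, the trigger threshold and the toy burst below are illustrative, not the USGS operating values.

```python
import numpy as np

def sta_lta(counts, sta_win=2, lta_win=30, eps=1e-9):
    """Ratio of short- to long-term moving averages of tweet counts
    (symmetric smoothing for brevity; an operational detector is causal)."""
    sta = np.convolve(counts, np.ones(sta_win) / sta_win, mode="same")
    lta = np.convolve(counts, np.ones(lta_win) / lta_win, mode="same")
    return sta / (lta + eps)

counts = np.full(120, 3.0)                      # ~3 "earthquake" tweets/min
counts[60:65] = [40, 120, 90, 60, 30]           # burst after a widely felt event
triggers = np.where(sta_lta(counts) > 4.0)[0]   # minutes flagged as detections
```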

  12. Long-Term Monitoring of the Impacts of Disaster on Human Activity Using DMSP/OLS Nighttime Light Data: A Case Study of the 2008 Wenchuan, China Earthquake

    Directory of Open Access Journals (Sweden)

    Xue Li

    2018-04-01

    Time series monitoring of earthquake-stricken areas is significant in evaluating post-disaster reconstruction and recovery. The time series of nighttime light (NTL) data collected by the defense meteorological satellite program-operational linescan system (DMSP/OLS) sensors provides a unique and valuable resource to study changes in human activity (HA) because of the long period of available data. In this paper, the digital number (DN) of the DMSP/OLS NTL images is used as a proxy for the intensity of HA, since there is a high correlation between them. The purpose of this study is to develop a methodology to analyze the changes in intensity and distribution of HA in different areas affected by the 2008 earthquake in Wenchuan, China. In order to compare the trends of HA before and after the earthquake, the DMSP/OLS NTL images from 2003 to 2013 were processed and analyzed. However, their analysis capability is greatly limited owing to a lack of in-flight calibration. To improve the continuity and comparability of DMSP/OLS NTL images, this study developed an automatic intercalibration method to systematically correct the NTL data. The results reveal that: (1) compared with the HA before the earthquake, the reconstruction and recovery after the Wenchuan earthquake have led to a significant increase of HA in earthquake-stricken areas within three years after the earthquake; (2) the fluctuation of HA in severely affected areas is greater than that in less affected areas; (3) recovery efforts increased development in the most affected areas to levels that exceeded the rates in similar areas which experienced less damage; and (4) areas alongside roads and close to reconstruction projects exhibited increased development in regions with otherwise low human activity.

  13. The limits of earthquake early warning: Timeliness of ground motion estimates

    OpenAIRE

    Minson, Sarah E.; Meier, Men-Andrin; Baltay, Annemarie S.; Hanks, Thomas C.; Cochran, Elizabeth S.

    2018-01-01

    The basic physics of earthquakes is such that strong ground motion cannot be expected from an earthquake unless the earthquake itself is very close or has grown to be very large. We use simple seismological relationships to calculate the minimum time that must elapse before such ground motion can be expected at a distance from the earthquake, assuming that the earthquake magnitude is not predictable. Earthquake early warning (EEW) systems are in operation or development for many regions aroun...

  14. GIS BASED SYSTEM FOR POST-EARTHQUAKE CRISIS MANAGMENT USING CELLULAR NETWORK

    OpenAIRE

    Raeesi, M.; Sadeghi-Niaraki, A.

    2013-01-01

    Earthquakes are among the most destructive natural disasters. Earthquakes happen mainly near the edges of tectonic plates, but they may happen just about anywhere. Earthquakes cannot be predicted. Quick response after disasters like earthquakes decreases loss of life and costs. Massive earthquakes often cause structures to collapse, trapping victims under dense rubble for long periods of time. After an earthquake has destroyed some areas, several teams are sent to find the location of the d...

  15. Long-term orbit prediction for China's Tiangong-1 spacecraft based on mean atmosphere model

    Science.gov (United States)

    Tang, Jingshi; Liu, Lin; Miao, Manqian

    Tiangong-1 is China's test module for its future space station. It went through three successful rendezvous and dockings with Shenzhou spacecraft from 2011 to 2013. For the long-term management and maintenance, the orbit sometimes needs to be predicted for a long period of time. As Tiangong-1 works in a low-Earth orbit with an altitude of about 300-400 km, the error in the a priori atmosphere model contributes significantly to the rapid increase of the predicted orbit error. When the orbit is predicted for 10-20 days, the error in the a priori atmosphere model, if not properly corrected, could induce semi-major axis errors of up to a few kilometers and overall position errors of several thousand kilometers. In this work, we use a mean atmosphere model averaged from NRLMSIS00. The a priori reference mean density can be corrected during precise orbit determination (POD). For applications in the long-term orbit prediction, the observations are first accumulated. With a sufficiently long period of observations, we are able to obtain a series of diurnal mean densities. This series captures the recent variation of the atmospheric density and can be analyzed for various periodic components. After being properly fitted, the mean density can be predicted and then applied in the orbit prediction. We show that the densities predicted with this approach can serve to increase the accuracy of the predicted orbit. In several 20-day prediction tests, most predicted orbits show semi-major axis errors better than 700 m and overall position errors better than 600 km.

  16. Centrality in earthquake multiplex networks

    Science.gov (United States)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

    Seismic time series have been mapped as complex networks, where a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
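
    A simplified sketch of the multiplex construction, assuming the networkx package: each time window of the catalog forms one layer whose nodes are grid cells and whose edges link cells hosting consecutive earthquakes, and node importance is tracked via eigenvector centrality per layer. A genuine multiplex centrality couples the layers; the plain per-layer average below ignores that coupling.

```python
import networkx as nx

# Two toy layers: sequences of earthquakes visiting grid cells A..D
layers = [
    [("A", "B"), ("B", "C"), ("C", "A")],        # window 1
    [("A", "D"), ("D", "B"), ("B", "A")],        # window 2
]

centrality = {}
for edges in layers:
    G = nx.Graph(edges)
    for node, c in nx.eigenvector_centrality(G, max_iter=500).items():
        centrality[node] = centrality.get(node, 0.0) + c / len(layers)

print(max(centrality, key=centrality.get))  # most central cell across layers
```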

  17. Seismic activity prediction using computational intelligence techniques in northern Pakistan

    Science.gov (United States)

    Asim, Khawaja M.; Awais, Muhammad; Martínez-Álvarez, F.; Iqbal, Talat

    2017-10-01

    An earthquake prediction study is carried out for the region of northern Pakistan. The prediction methodology includes interdisciplinary interaction of seismology and computational intelligence. Eight seismic parameters are computed based upon the past earthquakes. The predictive ability of these eight seismic parameters is evaluated in terms of information gain, which leads to the selection of six parameters to be used in prediction. Multiple computationally intelligent models have been developed for earthquake prediction using the selected seismic parameters. These models include feed-forward neural network, recurrent neural network, random forest, multilayer perceptron, radial basis neural network, and support vector machine. The performance of every prediction model is evaluated, and McNemar's statistical test is applied to observe the statistical significance of the computational methodologies. The feed-forward neural network shows statistically significant predictions, with an accuracy of 75% and a positive predictive value of 78% for northern Pakistan.

  18. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  19. Swarm Intelligence-Based Hybrid Models for Short-Term Power Load Prediction

    Directory of Open Access Journals (Sweden)

    Jianzhou Wang

    2014-01-01

    Swarm intelligence (SI) is widely and successfully applied in the engineering field to solve practical optimization problems, and various hybrid models based on SI algorithms and statistical models have been developed to further improve predictive abilities. In this paper, hybrid intelligent forecasting models based on cuckoo search (CS) as well as singular spectrum analysis (SSA), time series, and machine learning methods are proposed to conduct short-term power load prediction. The forecasting performance of the proposed models is augmented by a rolling multistep strategy over the prediction horizon. The test results demonstrate the effectiveness of the SSA and CS in tuning the seasonal autoregressive integrated moving average (SARIMA) and support vector regression (SVR) models and in improving load forecasting, which indicates that both the SSA-based data denoising and the SI-based intelligent optimization strategy can effectively improve the model's predictive performance. Additionally, the proposed CS-SSA-SARIMA and CS-SSA-SVR models provide very impressive forecasting results, demonstrating their strong robustness and universal forecasting capacities in terms of short-term power load prediction 24 hours in advance.

  20. Implicit attitudes towards smoking predict long-term relapse in abstinent smokers

    NARCIS (Netherlands)

    Spruyt, A.; Lemaigre, V.; Salhi, B.; van Gucht, D.; Tibboel, H.; van Bockstaele, B.; de Houwer, J.; van Meerbeeck, J.; Nackaerts, K.

    2015-01-01

    Rationale: It has previously been argued that implicit attitudes toward substance-related cues drive addictive behavior. Nevertheless, it remains an open question whether behavioral markers of implicit attitude activation can be used to predict long-term relapse. Objectives: The main objective of

  1. Application of Grey Model GM(1, 1) to Ultra Short-Term Predictions of Universal Time

    Science.gov (United States)

    Lei, Yu; Guo, Min; Zhao, Danning; Cai, Hongbing; Hu, Dandan

    2016-03-01

    A mathematical model known as the one-order one-variable grey differential equation model GM(1, 1) has been employed successfully for ultra short-term predictions of Universal Time. An advantage is that the developed method is easy to use. All these reveal a great potential of the GM(1, 1) model for UT1-UTC predictions.
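
    The GM(1, 1) recipe itself is compact: form the accumulated series, fit the whitened equation dx/dt + ax = b by least squares, and predict through the inverse accumulation. The sketch below uses made-up UT1-UTC-like values; GM(1, 1) assumes a positive, roughly exponential series.

```python
import numpy as np

def gm11_predict(x0, steps=3):
    """GM(1, 1) grey model: fit on series x0 and predict `steps` new values."""
    x1 = np.cumsum(x0)                               # accumulated generation (AGO)
    z = 0.5 * (x1[1:] + x1[:-1])                     # background values
    B = np.column_stack([-z, np.ones_like(z)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=x1_hat[0])      # inverse AGO
    x0_hat[0] = x0[0]
    return x0_hat[len(x0):]                          # the `steps` predictions

series = np.array([0.215, 0.209, 0.204, 0.198, 0.193, 0.187])  # hypothetical (s)
print(gm11_predict(series, steps=3))
```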

  2. Using Forecasting to Predict Long-Term Resource Utilization for Web Services

    Science.gov (United States)

    Yoas, Daniel W.

    2013-01-01

    Researchers have spent years understanding resource utilization to improve scheduling, load balancing, and system management through short-term prediction of resource utilization. Early research focused primarily on single operating systems; later, interest shifted to distributed systems and, finally, into web services. In each case researchers…

  3. Overview, comparative assessment and recommendations of forecasting models for short-term water demand prediction

    CSIR Research Space (South Africa)

    Anele, AO

    2017-11-01

    ... short-term water demand (STWD) forecasts. In view of this, an overview of forecasting methods for STWD prediction is presented. Based on that, a comparative assessment of the performance of alternative forecasting models from the different methods is studied. Times...

  4. Analysts forecast error : A robust prediction model and its short term trading

    NARCIS (Netherlands)

    Boudt, Kris; de Goeij, Peter; Thewissen, James; Van Campenhout, Geert

    We examine the profitability of implementing a short term trading strategy based on predicting the error in analysts' earnings per share forecasts using publicly available information. Since large earnings surprises may lead to extreme values in the forecast error series that disrupt their smooth

  5. Serum YKL-40 predicts long-term mortality in patients with stable coronary disease

    DEFF Research Database (Denmark)

    Harutyunyan, Marina; Gøtze, Jens P; Winkel, Per

    2013-01-01

    We investigated whether the inflammatory biomarker YKL-40 could improve the long-term prediction of death made by common risk factors plus high-sensitivity C-reactive protein (hs-CRP) and N-terminal-pro-B natriuretic peptide (NT-proBNP) in patients with stable coronary artery disease (CAD)....

  6. Early Posttransplant Tryptophan Metabolism Predicts Long-term Outcome of Human Kidney Transplantation

    NARCIS (Netherlands)

    Vavrincova-Yaghi, Diana; Seelen, Marc A.; Kema, Ido P.; Deelman, Leo E.; Heuvel, van den Marius; Breukelman, Henk; Van den Eynde, Benoit J.; Henning, Rob H.; van Goor, Harry; Sandovici, Maria

    Background. Chronic transplant dysfunction (CTD) is the leading cause of long-term loss of the renal allograft. So far, no single test is available to reliably predict the risk for CTD. Monitoring of tryptophan (trp) metabolism through indoleamine 2,3-dioxygenase (IDO) has been previously proposed

  7. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    International Nuclear Information System (INIS)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P.

    2012-09-01

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)
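
    The BBN reasoning RASTEP builds on can be illustrated with a toy two-node network, assuming the third-party pgmpy package (class names vary across pgmpy versions). A plant observation updates the probability of an accident state that would select among the pre-calculated source terms; the structure and all probabilities below are invented for illustration.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("ContainmentFailed", "HighDoseRateReading")])
model.add_cpds(
    TabularCPD("ContainmentFailed", 2, [[0.95], [0.05]]),          # prior
    TabularCPD("HighDoseRateReading", 2,
               [[0.99, 0.10],    # P(no high reading | intact, failed)
                [0.01, 0.90]],   # P(high reading   | intact, failed)
               evidence=["ContainmentFailed"], evidence_card=[2]),
)
model.check_model()

posterior = VariableElimination(model).query(
    ["ContainmentFailed"], evidence={"HighDoseRateReading": 1})
print(posterior)   # updated probability of the source-term-relevant state
```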

  8. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)

    2012-09-15

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)

  9. PHENOstruct: Prediction of human phenotype ontology terms using heterogeneous data sources.

    Science.gov (United States)

    Kahanda, Indika; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa

    2015-01-01

    The human phenotype ontology (HPO) was recently developed as a standardized vocabulary for describing the phenotype abnormalities associated with human diseases. At present, only a small fraction of human protein-coding genes have HPO annotations, but researchers believe that a large portion of currently unannotated genes are related to disease phenotypes. Therefore, it is important to predict gene-HPO term associations using accurate computational methods. In this work we demonstrate the performance advantage of the structured SVM approach, which was shown to be highly effective for Gene Ontology term prediction, in comparison to several baseline methods. Furthermore, we highlight a collection of informative data sources suitable for the problem of predicting gene-HPO associations, including large-scale literature mining data.

  10. Countermeasures to earthquakes in nuclear plants

    International Nuclear Information System (INIS)

    Sato, Kazuhide

    1979-01-01

    The contribution of atomic energy to mankind is immeasurable, but the danger of radioactivity is a special concern. Therefore, in the design of nuclear power plants, safety has been regarded as important, and in Japan, where earthquakes occur frequently, countermeasures to earthquakes have naturally been incorporated in the examination of safety. The radioactive substances handled in nuclear power stations and spent fuel reprocessing plants are briefly explained. The occurrence of earthquakes cannot be predicted effectively, and the disaster due to earthquakes is apt to be remarkably large. In nuclear plants, the prevention of damage to the facilities and the maintenance of their functions are required at the time of earthquakes. Regarding the location of nuclear plants, the history of earthquakes, the possible magnitude of earthquakes, the properties of the ground and the position of nuclear plants should be examined. After the place of installation has been decided, the earthquake used for design is selected, evaluating active faults and determining the standard earthquakes. As the fundamentals of aseismic design, the classification according to importance, the earthquakes for design corresponding to the classes of importance, the combination of loads and allowable stress are explained. (Kako, I.)

  11. Bayesian Methods for Predicting the Shape of Chinese Yam in Terms of Key Diameters

    Directory of Open Access Journals (Sweden)

    Mitsunori Kayano

    2017-01-01

    This paper proposes Bayesian methods for the shape estimation of Chinese yam (Dioscorea opposita) using a few key diameters of the yam. Shape prediction of yam is applicable to determining optimal cutoff positions of a yam for producing seed yams. Our Bayesian method, which is a combination of a Bayesian estimation model and a predictive model, enables automatic, rapid, and low-cost processing of yam. After the construction of the proposed models using a sample data set in Japan, the models provide whole-shape prediction of yam based on only a few key diameters. The Bayesian method performed well on the shape prediction in terms of minimizing the mean squared error between the measured shape and the prediction. In particular, a multiple regression method with key diameters at two fixed positions attained the highest performance for shape prediction. We have developed automatic, rapid, and low-cost yam-processing machines based on the Bayesian estimation model and predictive model. Development of such shape prediction approaches, including our Bayesian method, can be a valuable aid in reducing the cost and time in food processing.

  12. Short Term Prediction of PM10 Concentrations Using Seasonal Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Hamid Hazrul Abdul

    2016-01-01

    Air pollution modelling is one of the important tools usually used to make short-term and long-term predictions. Since air pollution has a big impact, especially on human health, prediction of air pollutant concentrations is needed to help the local authorities give an early warning to people who are at risk of acute and chronic health effects from air pollution. Finding the best time series model allows predictions to be made accurately. This research was carried out to find the best time series model to predict the PM10 concentrations in Nilai, Negeri Sembilan, Malaysia. Considering two seasons, the wet season (northeast monsoon) and the dry season (southwest monsoon), seasonal autoregressive integrated moving average models were used to find the most suitable model to predict the PM10 concentrations in Nilai, Negeri Sembilan, using three error measures. Based on the AIC statistic, results show that ARIMA (1, 1, 1) × (1, 0, 0)12 is the most suitable model to predict PM10 concentrations in Nilai, Negeri Sembilan.
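
    Fitting the selected seasonal model and forecasting ahead is straightforward with statsmodels. In the sketch below the monthly PM10 series is synthetic; only the model order, ARIMA (1, 1, 1) × (1, 0, 0) with period 12, is taken from the study.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(6)
months = 120
pm10 = 45 + 10 * np.sin(2 * np.pi * np.arange(months) / 12) \
       + rng.normal(0, 3, months)                 # synthetic monthly PM10 (ug/m3)

model = SARIMAX(pm10, order=(1, 1, 1), seasonal_order=(1, 0, 0, 12))
result = model.fit(disp=False)
print(result.aic)                  # model selection via AIC, as in the study
print(result.forecast(steps=6))    # six-step-ahead PM10 forecast
```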

  13. MultiLoc2: integrating phylogeny and Gene Ontology terms improves subcellular protein localization prediction

    Directory of Open Access Journals (Sweden)

    Kohlbacher Oliver

    2009-09-01

    Background: Knowledge of subcellular localization of proteins is crucial to proteomics, drug target discovery and systems biology, since localization and biological function are highly correlated. In recent years, numerous computational prediction methods have been developed. Nevertheless, there is still a need for prediction methods that show more robustness and higher accuracy. Results: We extended our previous MultiLoc predictor by incorporating phylogenetic profiles and Gene Ontology terms. Two different datasets were used for training the system, resulting in two versions of this high-accuracy prediction method. One version is specialized for globular proteins and predicts up to five localizations, whereas a second version covers all eleven main eukaryotic subcellular localizations. In a benchmark study with five localizations, MultiLoc2 performs considerably better than other methods for animal and plant proteins and comparably for fungal proteins. Furthermore, MultiLoc2 performs clearly better when using a second dataset that extends the benchmark study to all eleven main eukaryotic subcellular localizations. Conclusion: MultiLoc2 is an extensive high-performance subcellular protein localization prediction system. By incorporating phylogenetic profiles and Gene Ontology terms, MultiLoc2 yields higher accuracies compared to its previous version. Moreover, it outperforms other prediction systems in two benchmark studies. MultiLoc2 is available as a user-friendly and free web service at: http://www-bs.informatik.uni-tuebingen.de/Services/MultiLoc2.

  14. A computational environment for long-term multi-feature and multi-algorithm seizure prediction.

    Science.gov (United States)

    Teixeira, C A; Direito, B; Costa, R P; Valderrama, M; Feldwisch-Drentrup, H; Nikolopoulos, S; Le Van Quyen, M; Schelter, B; Dourado, A

    2010-01-01

    The daily life of epilepsy patients is constrained by the possibility of occurrence of seizures. Until now, seizures cannot be predicted with sufficient sensitivity and specificity. Most seizure prediction studies have focused on a small number of patients, frequently assuming unrealistic hypotheses. This paper adopts the view that for an appropriate development of reliable predictors one should consider long-term recordings and several features and algorithms integrated in one software tool. A computational environment, based on Matlab®, is presented, aiming to be an innovative tool for seizure prediction. It results from the need for a powerful and flexible tool for long-term EEG/ECG analysis by multiple features and algorithms. After being extracted, features can be subjected to several reduction and selection methods, and then used for prediction. The predictions can be conducted based on optimized thresholds or by applying computational intelligence methods. One important aspect is the integrated evaluation of the seizure prediction characteristics of the developed predictors.

  15. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

    Science.gov (United States)

    Jordan, Thomas H.

    2013-04-01

    Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. They should

  16. Predicting long-term graft survival in adult kidney transplant recipients

    Directory of Open Access Journals (Sweden)

    Brett W Pinsky

    2012-01-01

    The ability to accurately predict a population's long-term survival has important implications for quantifying the benefits of transplantation. To identify a model that can accurately predict a kidney transplant population's long-term graft survival, we retrospectively studied the United Network of Organ Sharing data from 13,111 kidney-only transplants completed in 1988-1989. Nineteen-year death-censored graft survival (DCGS) projections were calculated and compared with the population's actual graft survival. The projection curves were created using a two-part estimation model that (1) fits a Kaplan-Meier survival curve immediately after transplant (Part A) and (2) uses truncated observational data to model a survival function for long-term projection (Part B). Projection curves were examined using varying amounts of time to fit both parts of the model. The accuracy of the projection curve was determined by examining whether predicted survival fell within the 95% confidence interval for the 19-year Kaplan-Meier survival, and the sample size needed to detect the difference in projected versus observed survival in a clinical trial. The 19-year DCGS was 40.7% (39.8%-41.6%). Excellent predictability (41.3%) can be achieved when Part A is fit for three years and Part B is projected using two additional years of data. Using less than five total years of data tended to overestimate the population's long-term survival. Accurate prediction of long-term DCGS is possible, but requires attention to the quantity of data used in the projection method.

  17. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the Century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti, quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  18. Thermal infrared anomalies of several strong earthquakes.

    Science.gov (United States)

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies, which have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitudes up to Ms 7.0 by using satellite infrared remote sensing information. We used new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after earthquakes for all cases. The overall performance of the anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by the method of "time-frequency relative power spectrum." (2) There exist evident and different characteristic periods and magnitudes of thermal abnormal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation has obvious characteristics in abnormal duration, range, and morphology. In summary, earthquake thermal infrared anomalies can serve as a useful precursor in earthquake prediction and forecasting.

  19. Long-term Creep Life Prediction for Type 316LN Stainless Steel

    International Nuclear Information System (INIS)

    Kim, Woo Gon; Ryu, Woo Seog; Kim, Sung Ho; Lee, Chan Bok

    2007-01-01

    Since sodium-cooled fast reactor (SFR) components are designed to be used for more than 30 years at a high temperature of 550 °C, one of their most important properties is the long-term creep behavior. To accurately predict the long-term creep life of the components, it is essential to obtain reliable long-term test data extending beyond their design life, but such long-duration data are difficult to acquire because the tests are extremely time-consuming. A variety of time-temperature parameters (TTPs) have therefore been developed to predict long-term creep life from shorter-time tests at higher temperatures. Among them, the Larson-Miller, Orr-Sherby-Dorn, Manson-Haferd and Manson-Succop parameters have been most commonly used. No single parameter is overwhelmingly preferred, and each imposes certain inherent restrictions on the data to which it is applied. Meanwhile, the Minimum Commitment Method (MCM) proposed by Manson and Ensign has been reported to offer greater flexibility for creep rupture analysis, so the MCM is useful as an alternative approach. Until now, the applicability of the MCM has not been investigated for type 316LN stainless steel (SS) because of insufficient creep data. In this paper, the MCM was applied to predict the long-term creep life of type 316LN SS. A large body of creep rupture data was collected from literature surveys and from KAERI experiments. Using short-term experimental data for durations under 2,000 hours, rupture lives above 10^5 hours were predicted by the MCM at temperatures from 550 °C to 800 °C.
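
    The MCM's functional form is not given in the abstract, but the time-temperature-parameter idea it competes with is easy to illustrate. Below is a minimal, hedged sketch of a Larson-Miller extrapolation: short-term rupture data are collapsed onto a single parameter, a master curve is fitted, and the curve is inverted to estimate life at service conditions. The constant C ≈ 20 and all data values are illustrative assumptions, not the paper's.

    ```python
    import numpy as np

    C = 20.0  # Larson-Miller constant; ~20 is a common assumption for steels

    def lmp(T_celsius, t_rupture_h):
        """Larson-Miller parameter P = T_K * (C + log10(t_r))."""
        return (T_celsius + 273.15) * (C + np.log10(t_rupture_h))

    # illustrative short-term rupture data: stress (MPa), temperature (C), life (h)
    stress = np.array([240.0, 200.0, 160.0, 120.0])
    temp = np.array([600.0, 600.0, 650.0, 700.0])
    life = np.array([800.0, 1800.0, 1500.0, 1200.0])

    # master curve: LMP as a quadratic in log10(stress)
    g = np.polyfit(np.log10(stress), lmp(temp, life), 2)

    def predicted_life_h(stress_mpa, T_celsius):
        """Invert the master curve to estimate rupture life at service conditions."""
        P = np.polyval(g, np.log10(stress_mpa))
        return 10 ** (P / (T_celsius + 273.15) - C)

    print(f"estimated life at 550 C / 140 MPa: {predicted_life_h(140, 550):.2e} h")
    ```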

  20. An information infrastructure for earthquake science

    Science.gov (United States)

    Jordan, T. H.; Scec/Itr Collaboration

    2003-04-01

    The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, IRIS, and the USGS, has received a large five-year grant from the NSF's ITR Program and its Geosciences Directorate to build a new information infrastructure for earthquake science. In many respects, the SCEC/ITR Project presents a microcosm of the IT efforts now being organized across the geoscience community, including the EarthScope initiative. The purpose of this presentation is to discuss the experience gained by the project thus far and lay out the challenges that lie ahead; our hope is to encourage cross-discipline collaboration in future IT advancements. Project goals have been formulated in terms of four "computational pathways" related to seismic hazard analysis (SHA). For example, Pathway 1 involves the construction of an open-source, object-oriented, and web-enabled framework for SHA computations that can incorporate a variety of earthquake forecast models, intensity-measure relationships, and site-response models, while Pathway 2 aims to utilize the predictive power of wavefield simulation in modeling time-dependent ground motion for scenario earthquakes and constructing intensity-measure relationships. The overall goal is to create a SCEC "community modeling environment" or collaboratory that will comprise the curated (on-line, documented, maintained) resources needed by researchers to develop and use these four computational pathways. Current activities include (1) the development and verification of the computational modules, (2) the standardization of data structures and interfaces needed for syntactic interoperability, (3) the development of knowledge representation and management tools, (4) the construction of SCEC computational and data grid testbeds, and (5) the creation of user interfaces for knowledge-acquisition, code execution, and visualization. I will emphasize the increasing role of standardized

  1. Earthquake Clusters and Spatio-temporal Migration of earthquakes in Northeastern Tibetan Plateau: a Finite Element Modeling

    Science.gov (United States)

    Sun, Y.; Luo, G.

    2017-12-01

    Seismicity in a region is usually characterized by earthquake clusters and by earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and the spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for investigating these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and the spatio-temporal migration of earthquakes along major fault zones in the northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore the effects of topographic loading and of the viscosity of the middle-lower crust and upper mantle on the model results. Model results show that earthquakes and fault interactions increase the Coulomb stress on neighboring faults or fault segments, accelerating future earthquakes in the region. Earthquakes thus occur sequentially within a short time, producing regional earthquake clusters. Through long-term evolution, stresses on seismogenic faults that are far apart may reach the critical state of failure almost simultaneously, which can likewise produce regional earthquake clusters and earthquake migration. Based on our model's synthetic seismic catalog and paleoseismic data, we analyze the probability of earthquake migration between major faults in the northeastern Tibetan Plateau. We find that, following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M≥7) in the northeastern Tibetan Plateau is most likely to occur on the Haiyuan fault.
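
    The fault-interaction argument rests on the standard Coulomb failure stress change; a minimal sketch, with the sign convention and effective friction value assumed rather than taken from the paper:

    ```python
    def coulomb_stress_change(d_shear_mpa, d_normal_mpa, mu_eff=0.4):
        """dCFS = d_tau + mu' * d_sigma_n (MPa); unclamping (tension) is positive.

        Positive dCFS brings the receiving fault closer to failure.
        """
        return d_shear_mpa + mu_eff * d_normal_mpa

    # e.g. 0.12 MPa of shear loading partly offset by 0.05 MPa of clamping
    print(coulomb_stress_change(0.12, -0.05))  # 0.10 MPa: loaded toward failure
    ```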

  2. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from 1901 to 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  3. Long-term responses of sandy beach crustaceans to the effects of coastal armouring after the 2010 Maule earthquake in South Central Chile

    Science.gov (United States)

    Rodil, Iván F.; Jaramillo, Eduardo; Acuña, Emilio; Manzano, Mario; Velasquez, Carlos

    2016-02-01

    Earthquakes and tsunamis are large physical disturbances frequently striking the coast of Chile with dramatic effects on intertidal habitats. Armouring structures built as societal responses to beach erosion and shoreline retreat are also responsible of coastal squeeze and habitat loss. The ecological implications of interactions between coastal armouring and earthquakes have recently started to be studied for beach ecosystems. How long interactive impacts persist is still unclear because monitoring after disturbance generally extends for a few months. During five years after the Maule earthquake (South Central Chile, February 27th 2010) we monitored the variability in population abundances of the most common crustacean inhabitants of different beach zones (i.e. upper, medium, and lower intertidal) at two armoured (one concrete seawall and one rocky revetment) and one unarmoured sites along the sandy beach of Llico. Beach morphology changed after the earthquake-mediated uplift, restoring upper- and mid-shore armoured levels that were rapidly colonized by typical crustacean species. However, post-earthquake increasing human activities affected the colonization process of sandy beach crustaceans in front of the seawall. Lower-shore crab Emerita analoga was the less affected by armouring structures, and it was the only crustacean species present at the three sites before and after the earthquake. This study shows that field sampling carried out promptly after major disturbances, and monitoring of the affected sites long after the disturbance is gone are effective approaches to increase the knowledge on the interactive effects of large-scale natural phenomena and artificial defences on beach ecology.

  4. Predictive Validity of the Columbia-Suicide Severity Rating Scale for Short-Term Suicidal Behavior

    DEFF Research Database (Denmark)

    Conway, Paul Maurice; Erlangsen, Annette; Teasdale, Thomas William

    2017-01-01

    Using the Columbia-Suicide Severity Rating Scale (C-SSRS), we examined the predictive and incremental predictive validity of past-month suicidal behavior and ideation for short-term suicidal behavior among adolescents at high risk of suicide. The study was conducted in 2014 on a sample of 85 adolescents (90.6% females) who participated at follow-up (85.9%) out of the 99 (49.7%) baseline respondents. All adolescents were recruited from a specialized suicide-prevention clinic in Denmark. Through multivariate logistic regression analyses, we examined whether baseline suicidal behavior predicted subsequent suicidal behavior (actual attempts and suicidal behavior of any type, including preparatory acts, aborted, interrupted and actual attempts; mean follow-up of 80.8 days, SD = 52.4). Furthermore, we examined whether suicidal ideation severity and intensity incrementally predicted suicidal behavior...

  5. Global earthquake fatalities and population

    Science.gov (United States)

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
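
    The projection logic can be reproduced in a few lines: treat catastrophic events as a Poisson process whose rate is proportional to world population, calibrate the rate constant on the 20th century's 4 observed events, and integrate the rate over an assumed population path reaching 10.1 billion in 2100. The rough population figures below are assumptions for illustration only.

    ```python
    import numpy as np

    # crude yearly world population paths (linear interpolations, persons)
    years_20 = np.arange(1900, 2000)
    pop_20 = np.interp(years_20, [1900, 2000], [1.6e9, 6.1e9])
    years_21 = np.arange(2000, 2100)
    pop_21 = np.interp(years_21, [2000, 2100], [6.1e9, 10.1e9])

    # rate constant: events per person-year, calibrated on 4 events in the 1900s
    k = 4 / pop_20.sum()

    # expected number of >100,000-fatality earthquakes in the 21st century
    print(f"expected 21st-century events: {k * pop_21.sum():.1f}")
    ```

    With these crude inputs the expected count comes out close to the paper's central estimate of 8.7.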

  6. Stochastic model stationarization by eliminating the periodic term and its effect on time series prediction

    Science.gov (United States)

    Moeeni, Hamid; Bonakdari, Hossein; Fatemi, Seyed Ehsan

    2017-04-01

    Because time series stationarization has a key role in the quality of stochastic modeling results, three stationarization methods are analyzed in this study: seasonal differencing, seasonal standardization, and spectral analysis, each used to eliminate the periodic effect on time series stationarity. First, six time series, comprising 4 streamflow series and 2 water temperature series, are stationarized. The stochastic term of these series is subsequently modeled with ARIMA. For the analysis, 9228 models are introduced. It is observed that seasonal standardization and spectral analysis eliminate the periodic term completely, while seasonal differencing retains seasonal correlation structures. The results indicate that all three methods perform acceptably overall. However, model accuracy in monthly streamflow prediction is higher with seasonal differencing than with the other two methods. Another advantage of seasonal differencing over the other methods is that the monthly streamflow is never estimated as negative. Standardization is the best method for predicting monthly water temperature, although its performance is quite similar to that of seasonal differencing, while spectral analysis performs the weakest in all cases. It is concluded that, for each monthly seasonal series, seasonal differencing is the best stationarization method in terms of periodic-effect elimination. Moreover, monthly water temperature is predicted with more accuracy than monthly streamflow: the ratio of the average stochastic term to the amplitude of the periodic term was 0.19 and 0.30, 0.21 and 0.13, and 0.07 and 0.04 for the monthly streamflow and monthly water temperature series, respectively. As a result, the periodic term is more dominant relative to the stochastic term in the monthly water temperature series than in the streamflow series.
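
    As a hedged illustration of the two best-performing stationarization methods, the sketch below applies seasonal differencing and seasonal standardization to a toy monthly series with an annual cycle (the period and data are assumptions, not the study's series):

    ```python
    import numpy as np

    def seasonal_difference(x, period=12):
        """Remove the periodic term by differencing against the same month a year earlier."""
        return x[period:] - x[:-period]

    def seasonal_standardize(x, period=12):
        """Remove the periodic term by subtracting each month's long-term mean
        and dividing by that month's long-term standard deviation."""
        x = x[: len(x) // period * period].reshape(-1, period)
        return ((x - x.mean(axis=0)) / x.std(axis=0)).ravel()

    # toy monthly streamflow: annual cycle plus noise
    rng = np.random.default_rng(1)
    months = np.arange(240)
    flow = 100 + 40 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, size=240)
    print(seasonal_difference(flow).std(), seasonal_standardize(flow).std())
    ```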

  7. Data base pertinent to earthquake design basis

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1988-01-01

    Mitigation of earthquake risk from impending strong earthquakes is possible provided the hazard can be assessed and translated into appropriate design inputs. This requires defining the seismic risk problem, isolating the risk factors and quantifying risk in terms of physical parameters that are suitable for application in design. Like all other geological phenomena, past earthquakes hold the key to the understanding of future ones. Quantification of seismic risk at a site calls for investigating the earthquake aspects of the site region and building a data base. The scope of such investigations is illustrated in Figures 1 and 2. A more detailed definition of the earthquake problem in engineering design is given elsewhere (Sharma, 1987). The present document discusses the earthquake data base, which is required to support a seismic risk evaluation programme in the context of the existing state of the art. (author). 8 tables, 10 figs., 54 refs

  8. Two Machine Learning Approaches for Short-Term Wind Speed Time-Series Prediction.

    Science.gov (United States)

    Ak, Ronay; Fink, Olga; Zio, Enrico

    2016-08-01

    The increasing liberalization of European electricity markets, the growing proportion of intermittent renewable energy being fed into the energy grids, and new challenges in the patterns of energy consumption (such as electric mobility) require flexible and intelligent power grids capable of providing efficient, reliable, economical, and sustainable energy production and distribution. On the supplier side in particular, the integration of renewable energy sources (e.g., wind and solar) into the grid poses an engineering and economic challenge because of the limited ability to control and dispatch these energy sources, owing to their intermittent characteristics. Time-series prediction of wind speed for wind power production is a particularly important and challenging task, wherein prediction intervals (PIs), rather than point estimates, are the preferable output because they convey the confidence in the prediction. In this paper, two different machine learning approaches for assessing PIs of time-series predictions are considered and compared: 1) multilayer perceptron neural networks trained with a multiobjective genetic algorithm and 2) extreme learning machines combined with the nearest-neighbors approach. The proposed approaches are applied to short-term wind speed prediction from a real data set of hourly wind speed measurements for the region of Regina in Saskatchewan, Canada. Both approaches demonstrate good prediction precision and provide complementary advantages with respect to different evaluation criteria.
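
    The extreme-learning-machine details cannot be reconstructed from the abstract, but the nearest-neighbors ingredient of the second approach suggests a simple hedged sketch: take the spread of outcomes among the k most similar historical input patterns as an empirical prediction interval (the window length, k, and toy data are illustrative assumptions):

    ```python
    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def knn_interval(X_train, y_train, x_query, k=20, alpha=0.1):
        """(1 - alpha) interval from the targets of the k nearest training patterns."""
        nn = NearestNeighbors(n_neighbors=k).fit(X_train)
        idx = nn.kneighbors(x_query.reshape(1, -1), return_distance=False)[0]
        lo, hi = np.quantile(y_train[idx], [alpha / 2, 1 - alpha / 2])
        return lo, hi

    # toy hourly wind speeds: predict the next hour from the previous 6 hours
    rng = np.random.default_rng(2)
    speeds = np.abs(10 + np.cumsum(rng.normal(0, 0.5, 5000)))
    X = np.lib.stride_tricks.sliding_window_view(speeds[:-1], 6)
    y = speeds[6:]
    print(knn_interval(X, y, X[-1]))
    ```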

  9. Hepatic Venous Pressure Gradient Predicts Long-Term Mortality in Patients with Decompensated Cirrhosis

    Science.gov (United States)

    Kim, Tae Yeob; Lee, Jae Gon; Kim, Ji Yeoun; Kim, Sun Min; Kim, Jinoo; Jeong, Woo Kyoung

    2016-01-01

    Purpose The present study aimed to investigate the role of the hepatic venous pressure gradient (HVPG) in predicting long-term mortality in patients with decompensated cirrhosis. Materials and Methods Clinical data from 97 non-critically-ill cirrhotic patients with HVPG measurements were retrospectively and consecutively collected between 2009 and 2012. Patients were classified according to clinical stage and the presence of ascites. The prognostic accuracy of HVPG for death, survival curves, and hazard ratios were analyzed. Results During a median follow-up of 24 (interquartile range, 13-36) months, 22 patients (22.7%) died. The areas under the receiver operating characteristic curves of HVPG for predicting 1-year, 2-year, and overall mortality were 0.801, 0.737, and 0.687, respectively. Mortality at 1 and 2 years was significantly higher in patients with HVPG >17 mm Hg than in those with HVPG ≤17 mm Hg (p=0.015). In the ascites group, the mortality rates at 1 and 2 years were 3.9% and 17.6% with HVPG ≤17 mm Hg and 17.5% and 35.2% with HVPG >17 mm Hg, respectively (p=0.044). Regarding risk factors for mortality, both HVPG and the model for end-stage liver disease were positively related to long-term mortality in all patients. In particular, for patients with ascites, both prothrombin time and HVPG were independent risk factors for predicting poor outcomes. Conclusion HVPG is useful for predicting long-term mortality in patients with decompensated cirrhosis, especially in the presence of ascites. PMID:26632394

  10. Greek long-term energy consumption prediction using artificial neural networks

    International Nuclear Information System (INIS)

    Ekonomou, L.

    2010-01-01

    In this paper, artificial neural networks (ANNs) are applied to the prediction of Greek long-term energy consumption. The multilayer perceptron (MLP) model has been used for this purpose, with several candidate architectures tested in order to select the one with the best generalization ability. Actual recorded input and output data that influence long-term energy consumption were used in the training, validation and testing process. The developed ANN model is used to predict Greek energy consumption for 2005-2008, 2010, 2012 and 2015. The ANN results for the years 2005-2008 were compared with the results produced by a linear regression method and a support vector machine method, and with real energy consumption records, showing great accuracy. The proposed approach can be useful in the effective implementation of energy policies, since accurate predictions of energy consumption affect capital investment, environmental quality, revenue analysis and market research management, while at the same time preserving supply security. Furthermore, it constitutes an accurate tool for the Greek long-term energy consumption prediction problem, which until now has not been addressed effectively.

  11. Long-Term Prediction of Severe Hypoglycemia in Type 1 Diabetes

    DEFF Research Database (Denmark)

    Henriksen, Marie Moth; Færch, Louise; Thorsteinsson, Birger

    2016-01-01

    BACKGROUND: Prediction of risk of severe hypoglycemia (SH) in patients with type 1 diabetes is important to prevent future episodes, but it is unknown if it is possible to predict the long-term risk of SH. The aim of the study is to assess if long-term prediction of SH is possible in type 1 diabetes. METHODS: A follow-up study was performed with 98 patients with type 1 diabetes. At baseline and at follow-up, the patients filled in a questionnaire about diabetes history and complications, number of SH episodes in the preceding year and state of awareness, and HbA1c and C-peptide levels were measured. CONCLUSIONS: Long-term prediction of severe hypoglycemia in type 1 diabetes was not possible, although baseline hypoglycemia unawareness tended to remain a predictor for risk of SH at follow-up. Therefore, it is important to repeatedly assess the different risk factors of SH to determine the actual risk.

  12. Molecular constraints on synaptic tagging and maintenance of long-term potentiation: a predictive model.

    Directory of Open Access Journals (Sweden)

    Paul Smolen

    Full Text Available Protein synthesis-dependent, late long-term potentiation (LTP) and depression (LTD) at glutamatergic hippocampal synapses are well characterized examples of long-term synaptic plasticity. Persistent increased activity of protein kinase M ζ (PKMζ) is thought essential for maintaining LTP. Additional spatial and temporal features that govern LTP and LTD induction are embodied in the synaptic tagging and capture (STC) and cross capture hypotheses. Only synapses that have been "tagged" by a stimulus sufficient for LTP and learning can "capture" PKMζ. A model was developed to simulate the dynamics of key molecules required for LTP and LTD. The model concisely represents relationships between tagging, capture, LTD, and LTP maintenance. The model successfully simulated LTP maintained by persistent synaptic PKMζ, STC, LTD, and cross capture, and makes testable predictions concerning the dynamics of PKMζ. The maintenance of LTP, and consequently of at least some forms of long-term memory, is predicted to require continual positive feedback in which PKMζ enhances its own synthesis only at potentiated synapses. This feedback underlies bistability in the activity of PKMζ. Second, cross capture requires the induction of LTD to induce dendritic PKMζ synthesis, although this may require tagging of a nearby synapse for LTP. The model also simulates the effects of PKMζ inhibition, and makes additional predictions for the dynamics of CaM kinases. Experiments testing the above predictions would significantly advance the understanding of memory maintenance.

  13. Molecular constraints on synaptic tagging and maintenance of long-term potentiation: a predictive model.

    Science.gov (United States)

    Smolen, Paul; Baxter, Douglas A; Byrne, John H

    2012-01-01

    Protein synthesis-dependent, late long-term potentiation (LTP) and depression (LTD) at glutamatergic hippocampal synapses are well characterized examples of long-term synaptic plasticity. Persistent increased activity of protein kinase M ζ (PKMζ) is thought essential for maintaining LTP. Additional spatial and temporal features that govern LTP and LTD induction are embodied in the synaptic tagging and capture (STC) and cross capture hypotheses. Only synapses that have been "tagged" by a stimulus sufficient for LTP and learning can "capture" PKMζ. A model was developed to simulate the dynamics of key molecules required for LTP and LTD. The model concisely represents relationships between tagging, capture, LTD, and LTP maintenance. The model successfully simulated LTP maintained by persistent synaptic PKMζ, STC, LTD, and cross capture, and makes testable predictions concerning the dynamics of PKMζ. The maintenance of LTP, and consequently of at least some forms of long-term memory, is predicted to require continual positive feedback in which PKMζ enhances its own synthesis only at potentiated synapses. This feedback underlies bistability in the activity of PKMζ. Second, cross capture requires the induction of LTD to induce dendritic PKMζ synthesis, although this may require tagging of a nearby synapse for LTP. The model also simulates the effects of PKMζ inhibition, and makes additional predictions for the dynamics of CaM kinases. Experiments testing the above predictions would significantly advance the understanding of memory maintenance.

  14. Small discussion of electromagnetic wave anomalies preceding earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    1980-01-01

    Six brief pieces on various aspects of electromagnetic wave anomalies are presented. They cover: earthquake electromagnetic emanations; the use of magnetic induction information for earthquake forecasting; electromagnetic pulse emissions as pre-earthquake indicators; the use of magnetic sensors to determine medium-wavelength field strength for earthquake prediction purposes; magnetic deviation indicators inside reinforced-concrete buildings; and a discussion of the general physical principles involved.

  15. Predicting short-term weight loss using four leading health behavior change theories

    Directory of Open Access Journals (Sweden)

    Barata José T

    2007-04-01

    Full Text Available Abstract Background This study was conceived to analyze how exercise and weight management psychosocial variables, derived from several health behavior change theories, predict weight change in a short-term intervention. The theories under analysis were the Social Cognitive Theory, the Transtheoretical Model, the Theory of Planned Behavior, and Self-Determination Theory. Methods Subjects were 142 overweight and obese women (BMI = 30.2 ± 3.7 kg/m2; age = 38.3 ± 5.8 y), participating in a 16-week university-based weight control program. Body weight and a comprehensive psychometric battery were assessed at baseline and at the program's end. Results Weight decreased significantly (-3.6 ± 3.4%). Conclusion The present models were able to predict 20-30% of the variance in short-term weight loss, and changes in weight management self-efficacy accounted for a large share of the predictive power. As expected from previous studies, exercise variables were only moderately associated with short-term outcomes; they are expected to play a larger explanatory role in longer-term results.

  16. Fundamental Frequency Variation of Neonatal Spontaneous Crying Predicts Language Acquisition in Preterm and Term Infants.

    Science.gov (United States)

    Shinya, Yuta; Kawai, Masahiko; Niwa, Fusako; Imafuku, Masahiro; Myowa, Masako

    2017-01-01

    Spontaneous cries of infants exhibit rich melodic features (i.e., time variation of fundamental frequency [F0]) even during the neonatal period, and the development of these characteristics might provide an essential base for later expressive prosody in language. However, little is known about the melodic features of spontaneous cries in preterm infants, who have a higher risk of later language-related problems. Thus, the present study investigated how preterm birth influenced melodic features of spontaneous crying at term-equivalent age as well as how these melodic features related to language outcomes at 18 months of corrected age in preterm and term infants. At term, moderate-to-late preterm (MLP) infants showed spontaneous cries with significantly higher F0 variation and melody complexity than term infants, while there were no significant differences between very preterm (VP) and term infants. Furthermore, larger F0 variation within cry series at term was significantly related to better language and cognitive outcomes, particularly expressive language skills, at 18 months. On the other hand, no other melodic features at term predicted any developmental outcomes at 18 months. The present results suggest that the additional postnatal vocal experience of MLP preterm infants increased F0 variation and the complexity of spontaneous cries at term. Additionally, the increases in F0 variation may partly reflect the development of voluntary vocal control, which, in turn, contributes to expressive language in infancy.

  17. Ultra-Short-Term Wind Power Prediction Using a Hybrid Model

    Science.gov (United States)

    Mohammed, E.; Wang, S.; Yu, J.

    2017-05-01

    This paper aims to develop and apply a hybrid model of two data-analytical methods, multiple linear regression and least squares (MLR&LS), for ultra-short-term wind power prediction (WPP), taking the electricity demand of Northeast China as an example. The data were obtained from historical wind power records of an offshore region and from a wind farm of the wind power plant in the area. The WPP is achieved in two stages: first, the ratios of wind power are forecast using the proposed hybrid method, and then these ratios are transformed to obtain the forecast wind power values. The hybrid model combines the persistence method with MLR and LS. The proposed method includes two prediction types, multi-point prediction and single-point prediction. The WPP is tested against different models such as the autoregressive moving average (ARMA), autoregressive integrated moving average (ARIMA) and artificial neural network (ANN). By comparing the results of the above models, the validity of the proposed hybrid model is confirmed in terms of error and correlation coefficient, and the comparison confirms that the proposed method works effectively. Additionally, forecasting errors were computed and compared to improve understanding of how to depict highly variable WPP and the correlations between actual and predicted wind power.
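
    A minimal sketch of the multiple-linear-regression-by-least-squares ingredient, fitting wind power ratios on a few lagged values (the paper's exact regressors and persistence combination are not specified in the abstract; the lags and toy data here are assumptions):

    ```python
    import numpy as np

    def fit_mlr(X, y):
        """Ordinary least-squares fit with an intercept column."""
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return coef

    def predict_mlr(coef, X):
        return np.column_stack([np.ones(len(X)), X]) @ coef

    # toy wind power ratios in [0, 1], regressed on their three previous values
    rng = np.random.default_rng(3)
    power = np.clip(0.5 + np.cumsum(rng.normal(0, 0.02, 2000)), 0, 1)
    X = np.column_stack([power[2:-1], power[1:-2], power[:-3]])
    y = power[3:]
    coef = fit_mlr(X[:-100], y[:-100])
    print(np.abs(predict_mlr(coef, X[-100:]) - y[-100:]).mean())  # hold-out MAE
    ```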

  18. Prediction of Sea Surface Temperature Using Long Short-Term Memory

    Science.gov (United States)

    Zhang, Qin; Wang, Hui; Dong, Junyu; Zhong, Guoqiang; Sun, Xin

    2017-10-01

    This letter adopts long short-term memory (LSTM) to predict sea surface temperature (SST); to our knowledge, this is the first attempt to use a recurrent neural network to solve the SST prediction problem and to make one-week and one-month daily predictions. We formulate SST prediction as a time series regression problem. LSTM is a special kind of recurrent neural network that introduces a gate mechanism into the vanilla RNN to prevent the vanishing or exploding gradient problem. It has a strong ability to model the temporal relationships of time series data and can handle the long-term dependency problem well. The proposed network architecture is composed of two kinds of layers: an LSTM layer and a fully connected dense layer. The LSTM layer is utilized to model the time series relationship, and the fully connected layer maps the output of the LSTM layer to a final prediction. We explore the optimal settings of this architecture by experiment and report the accuracy for the coastal seas of China to confirm the effectiveness of the proposed method. In addition, we also show its online-updating characteristics.
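
    A hedged Keras sketch of the two-layer architecture described: an LSTM layer models the temporal structure of an SST window and a dense layer maps its output to the next-day prediction (layer sizes, window length, and the toy data are assumptions, not the authors' settings):

    ```python
    import numpy as np
    from tensorflow import keras

    window = 30  # days of history per input sample (assumed)
    model = keras.Sequential([
        keras.layers.Input(shape=(window, 1)),
        keras.layers.LSTM(32),      # models the temporal relationship
        keras.layers.Dense(1),      # maps LSTM output to next-day SST
    ])
    model.compile(optimizer="adam", loss="mse")

    # toy SST series with a seasonal cycle
    t = np.arange(3000)
    sst = 20 + 5 * np.sin(2 * np.pi * t / 365) \
        + np.random.default_rng(4).normal(0, 0.3, 3000)
    X = np.lib.stride_tricks.sliding_window_view(sst[:-1], window)[..., None]
    y = sst[window:]
    model.fit(X, y, epochs=2, batch_size=64, verbose=0)
    print(model.predict(X[-1:], verbose=0))
    ```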

  19. A Long-Term Prediction Model of Beijing Haze Episodes Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Xiaoping Yang

    2016-01-01

    Full Text Available The rapid industrial development has led to intermittent outbreaks of PM2.5 pollution, or haze, in developing countries, which has brought about great environmental issues, especially in big cities such as Beijing and New Delhi. We investigate the factors and mechanisms of haze change and present a long-term prediction model of Beijing haze episodes using time series analysis. We construct a dynamic structural measurement model of the daily haze increment and reduce the model to a vector autoregressive model. Typical case studies on 886 continuous days indicate that our model performs very well on next-day Air Quality Index (AQI) prediction, and in severely polluted cases (AQI ≥ 300) the accuracy of the AQI prediction even reaches 87.8%. A one-week prediction experiment shows that our model has excellent sensitivity when a sudden haze burst or dissipation happens, which results in good long-term stability of the accuracy of the next 3-7 days' AQI prediction.
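
    A minimal statsmodels sketch of the reduced vector-autoregressive step: fit a VAR on daily AQI jointly with hypothetical meteorological drivers and produce a 7-day forecast. The variable choice and data are illustrative; the paper's structural increment model is not reproduced here.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(5)
    n = 886  # the paper's continuous-day case-study length
    data = pd.DataFrame({
        "aqi": np.abs(150 + np.cumsum(rng.normal(0, 10, n))),
        "wind": np.abs(3 + rng.normal(0, 1, n)),
        "humidity": np.clip(60 + rng.normal(0, 10, n), 0, 100),
    })
    fit = VAR(data).fit(7)  # 7 lags, i.e. one week of memory (assumed)
    print(fit.forecast(data.values[-fit.k_ar:], steps=7)[:, 0])  # 7-day AQI path
    ```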

  20. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) is still, 25 years later, one of the most painful events etched into Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damages (561 dead, 10,000 injured, and 3,000 families left homeless). Nowadays, the most frequent and important question is, "what if this earthquake were repeated today?" In this study, we simulate the ground motion shaking of an earthquake of the same size as that of 12 October 1992 and the consequent socio-economic impacts in terms of losses and damages. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the risk assessment clearly indicates that the losses and damages could be two or three times greater in Cairo than in the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) are at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at low seismic risk. Moreover, the building damage estimates show that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk: deteriorating buildings and infrastructure make the city particularly vulnerable to earthquakes. For instance, more than 90 % of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb), and about 75 % of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  1. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method for assigning probabilities to ground accelerations at Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. The relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated by theoretical wave propagation computations. We found that the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites, an annual probability of 10^-5 has been proposed as of interest. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get an epicentral acceleration for this earthquake of 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  2. Unsaturated consolidation theory for the prediction of long-term municipal solid waste landfill settlement.

    Science.gov (United States)

    Liu, Chia-Nan; Chen, Rong-Her; Chen, Kuo-Sheng

    2006-02-01

    The understanding of long-term landfill settlement is important for landfill design and rehabilitation. However, suitable models that consider both the mechanical and biodecomposition mechanisms in predicting long-term landfill settlement are generally not available. In this paper, a model based on unsaturated consolidation theory that incorporates the biodegradation process is introduced to simulate landfill settlement behaviour. The details of the problem formulation and the derivation of the solution of the resulting differential equation for gas pressure are presented. A step-by-step analytical procedure employing this approach for estimating settlement is proposed. The proposed model captures the typical features of both short-term and long-term behaviour and yields results that are comparable with field measurements.

  3. A hybrid PSO-ANFIS approach for short-term wind power prediction in Portugal

    International Nuclear Information System (INIS)

    Pousinho, H.M.I.; Mendes, V.M.F.; Catalao, J.P.S.

    2011-01-01

    The increased integration of wind power into the electric grid, as nowadays occurs in Portugal, poses new challenges due to its intermittency and volatility. Wind power prediction plays a key role in tackling these challenges. The contribution of this paper is to propose a new hybrid approach, combining particle swarm optimization and adaptive-network-based fuzzy inference system, for short-term wind power prediction in Portugal. Significant improvements regarding forecasting accuracy are attainable using the proposed approach, in comparison with the results obtained with five other approaches.

  4. A hybrid PSO-ANFIS approach for short-term wind power prediction in Portugal

    Energy Technology Data Exchange (ETDEWEB)

    Pousinho, H.M.I. [Department of Electromechanical Engineering, University of Beira Interior, R. Fonte do Lameiro, 6201-001 Covilha (Portugal); Mendes, V.M.F. [Department of Electrical Engineering and Automation, Instituto Superior de Engenharia de Lisboa, R. Conselheiro Emidio Navarro, 1950-062 Lisbon (Portugal); Catalao, J.P.S. [Department of Electromechanical Engineering, University of Beira Interior, R. Fonte do Lameiro, 6201-001 Covilha (Portugal); Center for Innovation in Electrical and Energy Engineering, Instituto Superior Tecnico, Technical University of Lisbon, Av. Rovisco Pais, 1049-001 Lisbon (Portugal)

    2011-01-15

    The increased integration of wind power into the electric grid, as nowadays occurs in Portugal, poses new challenges due to its intermittency and volatility. Wind power prediction plays a key role in tackling these challenges. The contribution of this paper is to propose a new hybrid approach, combining particle swarm optimization and adaptive-network-based fuzzy inference system, for short-term wind power prediction in Portugal. Significant improvements regarding forecasting accuracy are attainable using the proposed approach, in comparison with the results obtained with five other approaches. (author)
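
    The ANFIS internals are beyond a short sketch, but the particle-swarm half is easy to illustrate: below, PSO tunes the two parameters of a toy persistence-plus-trend forecaster to minimize mean squared forecast error, standing in for the membership-function parameters that PSO tunes in the paper (all constants are illustrative assumptions):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    power = np.clip(0.5 + np.cumsum(rng.normal(0, 0.02, 1000)), 0, 1)  # toy series

    def mse(theta):
        """Forecast error of p_hat[t+1] = a * p[t] + b * (p[t] - p[t-1])."""
        a, b = theta
        pred = a * power[1:-1] + b * (power[1:-1] - power[:-2])
        return np.mean((pred - power[2:]) ** 2)

    n_particles, dim = 20, 2
    x = rng.uniform(-1, 2, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([mse(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(100):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        f = np.array([mse(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    print(gbest, mse(gbest))
    ```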

  5. Scientific basis for long-term prediction of waste-form performance under repository conditions

    International Nuclear Information System (INIS)

    Mendel, J.E.

    1982-10-01

    This paper presents an overview of the fundamental principles involved in predicting long-term performance of waste forms by the as-low-as-reasonably-achievable approach. Repository conditions which make up the waste-form environment, the aging of the waste form, the important radionuclides in the waste form, the chemistry of repository fluids, and multicomponent interactions testing were considered in order to describe these principles. The need for confidence limits on the prediction of waste-form performance and ways of achieving a definition of the confidence limits are discussed

  6. Prediction of long-term crustal movement for geological disposal of radioactive waste

    International Nuclear Information System (INIS)

    Sasaki, Takeshi; Morikawa, Seiji; Tabei, Kazuto; Koide, Hitoshi; Tashiro, Toshiharu

    2000-01-01

    Long-term stability of the geological environment is essential for the safe geological disposal of radioactive waste, for which it is necessary to predict the crustal movement during an assessment period. As a case study, a numerical analysis method for the prediction of crustal movement in Japan is proposed. A three-dimensional elastic analysis by FEM for the geological block structure of the Kinki region and the Awaji-Rokko area is presented. Stability analysis for a disposal cavern is also investigated. (author)

  7. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average (STA/LTA) algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast: about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provide very short first-impression narratives from people who experienced the shaking.
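
    A minimal sketch of such an STA/LTA trigger on a per-minute count of tweets containing "earthquake"; the window lengths and threshold are illustrative, not the USGS settings:

    ```python
    import numpy as np

    def sta_lta_triggers(counts, sta_len=2, lta_len=60, threshold=10.0):
        """Indices where the short-term average exceeds `threshold` times the
        long-term average of the tweet-frequency series."""
        counts = np.asarray(counts, dtype=float)
        triggers = []
        for i in range(lta_len, len(counts)):
            sta = counts[i - sta_len:i].mean()
            lta = counts[i - lta_len:i].mean()
            if lta > 0 and sta / lta > threshold:
                triggers.append(i)
        return triggers

    rng = np.random.default_rng(7)
    tweets = rng.poisson(3, 600)                # background chatter per minute
    tweets[300:305] += [200, 150, 90, 40, 20]   # burst after a widely felt quake
    print(sta_lta_triggers(tweets))
    ```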

  8. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region, in 22 of the test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.

  9. Predicting short-term mortality and long-term survival for hospitalized US patients with alcoholic hepatitis.

    Science.gov (United States)

    Cuthbert, Jennifer A; Arslanlar, Sami; Yepuri, Jay; Montrose, Marc; Ahn, Chul W; Shah, Jessica P

    2014-07-01

    No study has evaluated current scoring systems for their accuracy in predicting the short- and long-term outcomes of alcoholic hepatitis in a US population. We reviewed electronic records for patients with alcoholic liver disease (ALD) admitted to Parkland Memorial Hospital between January 2002 and August 2005. Data and outcomes for 148 of 1,761 admissions meeting pre-defined criteria were collected. The discriminant function (DF) was revised (INRdf) to account for changes in prothrombin time reagents that could potentially affect identification of risk using the previous DF threshold of >32. Admission and theoretical peak scores were calculated by use of the Model for End-stage Liver Disease (MELD). Analysis models compared five different scoring systems. INRdf was closely correlated with the old DF (r^2 = 0.95). Multivariate analysis of the data showed that survival for 28 days was significantly associated with a scoring system using a combination of age, bilirubin, coagulation status, and creatinine; this combination also predicted short-term mortality and identified patients with >50 % mortality at four weeks and >80 % mortality at six months without specific treatment.

  10. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory

    Directory of Open Access Journals (Sweden)

    Haimin Yang

    2017-01-01

    Full Text Available Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, the data often contains many outliers with the increasing length of time series in real world. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper, we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average through modifying Adam, a popular stochastic gradient method algorithm for training deep neural networks. In our algorithm, the large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM.

  11. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory.

    Science.gov (United States)

    Yang, Haimin; Pan, Zhisong; Tao, Qing

    2017-01-01

    Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, the data often contains many outliers with the increasing length of time series in real world. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper, we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average through modifying Adam, a popular stochastic gradient method algorithm for training deep neural networks. In our algorithm, the large value of the relative prediction error corresponds to a small learning rate, and vice versa. The experiments on both synthetic data and real time series show that our method achieves better performance compared to the existing methods based on LSTM.
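
    A hedged sketch of the RoAdam idea on a scalar parameter: standard Adam moments, with the step additionally divided by a smoothed ratio of the current to the previous absolute prediction error, so that a sudden outlier shrinks the update instead of yanking the model (the constants and clipping range are illustrative, not the paper's):

    ```python
    import numpy as np

    def roadam_step(theta, grad, state, err, lr=0.01, b1=0.9, b2=0.999,
                    b3=0.999, eps=1e-8, rmin=0.5, rmax=2.0):
        m, v, d, prev_err, t = state
        t += 1
        m = b1 * m + (1 - b1) * grad              # first moment (Adam)
        v = b2 * v + (1 - b2) * grad ** 2         # second moment (Adam)
        r = np.clip(abs(err) / max(abs(prev_err), eps), rmin, rmax)
        d = b3 * d + (1 - b3) * r                 # smoothed relative error
        mhat, vhat = m / (1 - b1 ** t), v / (1 - b2 ** t)
        theta = theta - lr * mhat / (d * (np.sqrt(vhat) + eps))
        return theta, (m, v, d, abs(err), t)

    # fit y = w * x online; one gross outlier is injected midway
    rng = np.random.default_rng(8)
    w, state = 0.0, (0.0, 0.0, 1.0, 1.0, 0)
    for i in range(500):
        x = rng.normal()
        y = 3.0 * x + rng.normal(0, 0.1) + (50.0 if i == 250 else 0.0)
        err = w * x - y
        w, state = roadam_step(w, 2 * err * x, state, err)
    print(f"fitted w: {w:.2f}")  # should end near 3 despite the outlier
    ```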

  12. Short-term PV/T module temperature prediction based on PCA-RBF neural network

    Science.gov (United States)

    Li, Jiyong; Zhao, Zhendong; Li, Yisheng; Xiao, Jing; Tang, Yunfeng

    2018-02-01

    To address the non-linearity and large inertia of temperature control in PV/T systems, short-term temperature prediction of the PV/T module is proposed, so that the PV/T system controller can act ahead of time according to the short-term forecast and thus optimize the control effect. Based on an analysis of the correlation between PV/T module temperature and meteorological factors, and of the temperatures at adjacent points of the time series, the principal component analysis (PCA) method is used to pre-process the original input sample data. Combined with RBF neural network theory, simulation results show that PCA pre-processing gives the network model higher prediction accuracy and stronger generalization performance than an RBF neural network without principal component extraction.
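
    A hedged sketch of the PCA-RBF pipeline: compress correlated inputs with PCA, then fit an RBF network, implemented here as Gaussian features on k-means centres followed by a linear read-out (feature dimensions, sizes, and the toy data are illustrative assumptions):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans
    from sklearn.linear_model import Ridge

    def rbf_features(Z, centres, width):
        """Gaussian radial basis features of Z around the given centres."""
        d2 = ((Z[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * width ** 2))

    rng = np.random.default_rng(9)
    X = rng.normal(size=(1000, 8))   # stand-in meteorological + lagged inputs
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.05, 1000)  # toy target

    Z = PCA(n_components=3).fit_transform(X)   # principal-component pre-processing
    centres = KMeans(n_clusters=20, n_init=10, random_state=0).fit(Z).cluster_centers_
    width = np.median(np.linalg.norm(Z - Z.mean(0), axis=1))
    model = Ridge(alpha=1e-3).fit(rbf_features(Z, centres, width), y)
    print(model.score(rbf_features(Z, centres, width), y))  # in-sample R^2
    ```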

  13. Fault lubrication during earthquakes.

    Science.gov (United States)

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m s^-1) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, elucidating constraints are derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s^-1. The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  14. Lessons of L'Aquila for Operational Earthquake Forecasting

    Science.gov (United States)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms

  15. Long short-term memory neural network for air pollutant concentration predictions: Method development and evaluation.

    Science.gov (United States)

    Li, Xiang; Peng, Ling; Yao, Xiaojing; Cui, Shaolong; Hu, Yuan; You, Chengzeng; Chi, Tianhe

    2017-12-01

    Air pollutant concentration forecasting is an effective method of protecting public health by providing an early warning against harmful air pollutants. However, existing methods of air pollutant concentration prediction fail to effectively model long-term dependencies, and most neglect spatial correlations. In this paper, a novel long short-term memory neural network extended (LSTME) model that inherently considers spatiotemporal correlations is proposed for air pollutant concentration prediction. Long short-term memory (LSTM) layers were used to automatically extract inherent useful features from historical air pollutant data, and auxiliary data, including meteorological data and time stamp data, were merged into the proposed model to enhance the performance. Hourly PM2.5 (particulate matter with an aerodynamic diameter less than or equal to 2.5 μm) concentration data collected at 12 air quality monitoring stations in Beijing City from Jan/01/2014 to May/28/2016 were used to validate the effectiveness of the proposed LSTME model. Experiments were performed using the spatiotemporal deep learning (STDL) model, the time delay neural network (TDNN) model, the autoregressive moving average (ARMA) model, the support vector regression (SVR) model, and the traditional LSTM NN model, and a comparison of the results demonstrated that the LSTME model is superior to the other statistics-based models. Additionally, the use of auxiliary data improved model performance. For the one-hour prediction tasks, the proposed model performed well and exhibited a mean absolute percentage error (MAPE) of 11.93%. In addition, we conducted multiscale predictions over different time spans and achieved satisfactory performance, even for 13-24 h prediction tasks (MAPE = 31.47%). Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Neonatal Pulmonary MRI of Bronchopulmonary Dysplasia Predicts Short-term Clinical Outcomes.

    Science.gov (United States)

    Higano, Nara S; Spielberg, David R; Fleck, Robert J; Schapiro, Andrew H; Walkup, Laura L; Hahn, Andrew D; Tkach, Jean A; Kingma, Paul S; Merhar, Stephanie L; Fain, Sean B; Woods, Jason C

    2018-05-23

    Bronchopulmonary dysplasia (BPD) is a serious neonatal pulmonary condition associated with premature birth, but the underlying parenchymal disease and trajectory are poorly characterized. The current NICHD/NHLBI definition of BPD severity is based on degree of prematurity and extent of oxygen requirement. However, no clear link exists between initial diagnosis and clinical outcomes. We hypothesized that magnetic resonance imaging (MRI) of structural parenchymal abnormalities will correlate with NICHD-defined BPD disease severity and predict short-term respiratory outcomes. Forty-two neonates (20 severe BPD, 6 moderate, 7 mild, 9 non-BPD controls; 40±3 weeks post-menstrual age) underwent quiet-breathing structural pulmonary MRI (ultrashort echo-time and gradient echo) in a NICU-sited, neonatal-sized 1.5T scanner, without sedation or respiratory support unless already clinically prescribed. Disease severity was scored independently by two radiologists. Mean scores were compared to clinical severity and short-term respiratory outcomes. Outcomes were predicted using univariate and multivariable models including clinical data and scores. MRI scores significantly correlated with severities and predicted respiratory support at NICU discharge (P<0.0001). In multivariable models, MRI scores were by far the strongest predictor of respiratory support duration over clinical data, including birth weight and gestational age. Notably, NICHD severity level was not predictive of discharge support. Quiet-breathing neonatal pulmonary MRI can independently assess structural abnormalities of BPD, describe disease severity, and predict short-term outcomes more accurately than any individual standard clinical measure. Importantly, this non-ionizing technique can be implemented to phenotype disease and has potential to serially assess efficacy of individualized therapies.

  17. Time-Series Prediction: Application to the Short-Term Electric Energy Demand

    OpenAIRE

    Troncoso Lora, Alicia; Riquelme Santos, Jesús Manuel; Riquelme Santos, José Cristóbal; Gómez Expósito, Antonio; Martínez Ramos, José Luis

    2003-01-01

    This paper describes a time-series prediction method based on the kNN technique. The proposed methodology is applied to the 24-hour load forecasting problem. Also, based on recorded data, an alternative model is developed by means of a conventional dynamic regression technique, where the parameters are estimated by solving a least squares problem. Finally, results obtained from the application of both techniques to the Spanish transmission system are compared in terms of maximum, average and ...
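
    A minimal sketch of the kNN idea for 24-hour load forecasting: find the k historical days whose load profile most resembles today's and average the profiles of the days that followed them (k and the toy data are illustrative assumptions):

    ```python
    import numpy as np

    def knn_forecast(daily, k=5):
        """Forecast tomorrow's 24-hour profile from the successors of the k
        historical days most similar to the last observed day."""
        today = daily[-1]
        dists = np.linalg.norm(daily[:-1] - today, axis=1)
        nearest = np.argsort(dists)[:k]
        return daily[nearest + 1].mean(axis=0)

    rng = np.random.default_rng(10)
    hours = np.arange(24)
    base = 100 + 30 * np.sin(2 * np.pi * (hours - 6) / 24)   # daily load shape
    daily = base + rng.normal(0, 3, (365, 24))               # a year of loads
    print(knn_forecast(daily).round(1))
    ```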

  18. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    International Nuclear Information System (INIS)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K.

    2013-10-01

    The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radionuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed has been the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast-running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method by which experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amount of quantitative data included in a BBN. (Author)
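
    A toy sketch (not RASTEP itself) of the core mechanism described: a BBN node with a conditional probability table is updated by an observation, and the resulting end-state probabilities are attached to pre-calculated source terms. All probability values, state names, and source-term labels below are invented for illustration.

        # Hand-rolled two-node Bayesian belief network with CPTs, updated by
        # enumeration when an observable is instantiated. Values are invented.

        # Prior over plant damage state
        p_state = {"intact": 0.90, "core_damage": 0.10}
        # CPT: P(containment_pressure_high | state)
        p_high_given = {"intact": 0.05, "core_damage": 0.80}
        # Pre-calculated source term assigned to each end state (labels invented)
        source_term = {"intact": "ST0: no release", "core_damage": "ST7: early large release"}

        def posterior(observed_high: bool):
            """P(state | pressure observation) by Bayes' rule."""
            like = {s: (p_high_given[s] if observed_high else 1 - p_high_given[s])
                    for s in p_state}
            joint = {s: like[s] * p_state[s] for s in p_state}
            z = sum(joint.values())
            return {s: joint[s] / z for s in joint}

        for s, p in posterior(observed_high=True).items():
            print(f"{source_term[s]}: probability {p:.2f}")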

  19. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd's Register Consulting AB, Sundbyberg (Sweden)]

    2013-10-15

    The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radionuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed has been the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast-running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method by which experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amount of quantitative data included in a BBN. (Author)

  20. Reduction of wind power induced reserve requirements by advanced shortest-term forecasts and prediction intervals

    Energy Technology Data Exchange (ETDEWEB)

    Dobschinski, Jan; Wessel, Arne; Lange, Bernhard; Bremen, Lueder von [Fraunhofer Institut fuer Windenergie und Energiesystemtechnik (IWES), Kassel (Germany)]

    2009-07-01

    In electricity systems with a large penetration of wind power, the limited predictability of wind power generation leads to an increase in reserve and balancing requirements. The present study first concentrates on the capability of dynamic day-ahead prediction intervals to reduce the wind-power-induced reserve and balancing requirements. In a second approach, the reduction of large forecast errors of the German wind power generation by using advanced shortest-term predictions has been evaluated. With a focus on the allocation of minute reserve power, the aim is to estimate the maximal uncertainty remaining after trading activities on the intraday market. Finally, both approaches were used in a case study concerning the reserve requirements induced by the total German wind power expansion in 2007. (orig.)
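
    A minimal sketch of one way to form dynamic prediction intervals of the kind discussed: take empirical quantiles of recent forecast errors and attach them to the current forecast; the reserve must then cover this band. The coverage level and the use of unconditioned errors are simplifying assumptions.

        # Sketch of an empirical prediction interval around a wind power forecast.
        # A fuller scheme would condition the error quantiles on forecast level,
        # season, or weather regime; this version pools all recent errors.
        import numpy as np

        def prediction_interval(forecast, past_forecasts, past_actuals, coverage=0.95):
            errors = past_actuals - past_forecasts
            lo, hi = np.quantile(errors, [(1 - coverage) / 2, 1 - (1 - coverage) / 2])
            return forecast + lo, forecast + hi   # reserve must cover this band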

  1. Long short-term memory neural network for air pollutant concentration predictions: Method development and evaluation

    International Nuclear Information System (INIS)

    Li, Xiang; Peng, Ling; Yao, Xiaojing; Cui, Shaolong; Hu, Yuan; You, Chengzeng; Chi, Tianhe

    2017-01-01

    Air pollutant concentration forecasting is an effective method of protecting public health by providing an early warning against harmful air pollutants. However, existing methods of air pollutant concentration prediction fail to effectively model long-term dependencies, and most neglect spatial correlations. In this paper, a novel long short-term memory neural network extended (LSTME) model that inherently considers spatiotemporal correlations is proposed for air pollutant concentration prediction. Long short-term memory (LSTM) layers were used to automatically extract inherent useful features from historical air pollutant data, and auxiliary data, including meteorological data and time stamp data, were merged into the proposed model to enhance the performance. Hourly PM2.5 (particulate matter with an aerodynamic diameter less than or equal to 2.5 μm) concentration data collected at 12 air quality monitoring stations in Beijing City from Jan/01/2014 to May/28/2016 were used to validate the effectiveness of the proposed LSTME model. Experiments were performed using the spatiotemporal deep learning (STDL) model, the time delay neural network (TDNN) model, the autoregressive moving average (ARMA) model, the support vector regression (SVR) model, and the traditional LSTM NN model, and a comparison of the results demonstrated that the LSTME model is superior to the other statistics-based models. Additionally, the use of auxiliary data improved model performance. For the one-hour prediction tasks, the proposed model performed well and exhibited a mean absolute percentage error (MAPE) of 11.93%. In addition, we conducted multiscale predictions over different time spans and achieved satisfactory performance, even for 13–24 h prediction tasks (MAPE = 31.47%). - Highlights: • Regional air pollutant concentration shows an obvious spatiotemporal correlation. • Our prediction model presents superior performance. • Climate data and metadata can significantly improve prediction performance.

  2. A prediction method for long-term behavior of prestressed concrete containment vessels

    International Nuclear Information System (INIS)

    Ozaki, M.; Abe, T.; Watanabe, Y.; Kato, A.; Yamaguchi, T.; Yamamoto, M.

    1995-01-01

    This paper presents results of studies on the long-term behavior of PCCVs at the Tsuruga Unit 2 and Ohi Unit 3/4 power stations. The objective of this study is to evaluate the measured strain in the concrete and the reduction of force in the tendons, and to establish prediction methods for long-term PCCV behavior. The measured strains were compared with those calculated from creep and shrinkage of the concrete, and the discrepancies were investigated. Furthermore, the reduced tendon forces are calculated considering losses due to elasticity, relaxation, creep and shrinkage. The measured reduction in the tendon forces is compared with the calculated values. When changes in temperature and humidity are taken into account, the measured strains and tendon forces are in good agreement with those calculated. From the above results, it was confirmed that the residual prestresses in the PCCVs maintain the values predicted at the design stage, and that the prediction method for long-term behavior is sufficiently reliable. (author). 10 refs., 8 figs., 3 tabs

  3. Predicting visual semantic descriptive terms from radiological image data: preliminary results with liver lesions in CT.

    Science.gov (United States)

    Depeursinge, Adrien; Kurtz, Camille; Beaulieu, Christopher; Napel, Sandy; Rubin, Daniel

    2014-08-01

    We describe a framework to model visual semantics of liver lesions in CT images in order to predict the visual semantic terms (VST) reported by radiologists in describing these lesions. Computational models of VST are learned from image data using linear combinations of high-order steerable Riesz wavelets and support vector machines (SVM). In a first step, these models are used to predict the presence of each semantic term that describes liver lesions. In a second step, the distances between all VST models are calculated to establish a nonhierarchical computationally-derived ontology of VST containing inter-term synonymy and complementarity. A preliminary evaluation of the proposed framework was carried out using 74 liver lesions annotated with a set of 18 VSTs from the RadLex ontology. A leave-one-patient-out cross-validation resulted in an average area under the ROC curve of 0.853 for predicting the presence of each VST. The proposed framework is expected to foster human-computer synergies for the interpretation of radiological images while using rotation-covariant computational models of VSTs to 1) quantify their local likelihood and 2) explicitly link them with pixel-based image content in the context of a given imaging domain.
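
    A sketch of the first step described (one binary classifier per visual semantic term, evaluated leave-one-patient-out) using scikit-learn; the Riesz-wavelet feature extraction is omitted, and the feature matrix X is assumed to be precomputed.

        # Sketch of per-term SVM prediction with leave-one-patient-out evaluation.
        # X: (n_lesions, n_features) wavelet-based features (assumed precomputed);
        # y_term: 0/1 presence of one visual semantic term per lesion.
        import numpy as np
        from sklearn.model_selection import LeaveOneGroupOut
        from sklearn.svm import SVC
        from sklearn.metrics import roc_auc_score

        def auc_per_term(X, y_term, patient_ids):
            scores = np.zeros(len(y_term))
            for train, test in LeaveOneGroupOut().split(X, y_term, groups=patient_ids):
                clf = SVC(kernel="linear").fit(X[train], y_term[train])
                scores[test] = clf.decision_function(X[test])
            return roc_auc_score(y_term, scores)   # area under the ROC curve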

  4. Rethinking earthquake-related DC-ULF electromagnetic phenomena: towards a physics-based approach

    Directory of Open Access Journals (Sweden)

    Q. Huang

    2011-11-01

    Numerous electromagnetic changes possibly related to earthquakes have been independently reported, and attempts have even been made to apply them to short-term earthquake prediction. However, there are active debates on the above issue because the seismogenic process is rather complicated and the studies have been mainly empirical (i.e., a kind of experience-based approach). Thus, a physics-based study would be helpful for understanding earthquake-related electromagnetic phenomena and strengthening their applications. As a potential physics-based approach, I present an integrated research scheme, taking into account the interaction among observation, methodology, and physical model. For simplicity, this work focuses only on the earthquake-related DC-ULF electromagnetic phenomena. The main approach includes the following key problems: (1) how to perform a reliable and appropriate observation with some clear physical quantities; (2) how to develop a robust methodology to reveal weak earthquake-related electromagnetic signals from a noisy background; and (3) how to develop plausible physical models based on theoretical analyses and/or laboratory experiments for the explanation of the earthquake-related electromagnetic signals observed in field conditions.

  5. Theta coupling between V4 and prefrontal cortex predicts visual short-term memory performance.

    Science.gov (United States)

    Liebe, Stefanie; Hoerzer, Gregor M; Logothetis, Nikos K; Rainer, Gregor

    2012-01-29

    Short-term memory requires communication between multiple brain regions that collectively mediate the encoding and maintenance of sensory information. It has been suggested that oscillatory synchronization underlies intercortical communication. Yet, whether and how distant cortical areas cooperate during visual memory remains elusive. We examined neural interactions between visual area V4 and the lateral prefrontal cortex using simultaneous local field potential (LFP) recordings and single-unit activity (SUA) in monkeys performing a visual short-term memory task. During the memory period, we observed enhanced between-area phase synchronization in theta frequencies (3-9 Hz) of LFPs together with elevated phase locking of SUA to theta oscillations across regions. In addition, we found that the strength of intercortical locking was predictive of the animals' behavioral performance. This suggests that theta-band synchronization coordinates action potential communication between V4 and prefrontal cortex that may contribute to the maintenance of visual short-term memories.
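
    The between-area theta synchronization described here is commonly quantified with a phase-locking value (PLV); a sketch follows, assuming two LFP channels sampled at fs. The filter order and the exact PLV formulation are assumptions, since the paper's precise pipeline is not given in the abstract.

        # Sketch of theta-band (3-9 Hz) phase-locking between two LFP channels:
        # band-pass filter, extract instantaneous phase via the Hilbert transform,
        # and compute the phase-locking value (0 = no locking, 1 = perfect locking).
        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def theta_plv(lfp_a, lfp_b, fs=1000.0, band=(3.0, 9.0)):
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
            pha = np.angle(hilbert(filtfilt(b, a, lfp_a)))
            phb = np.angle(hilbert(filtfilt(b, a, lfp_b)))
            return np.abs(np.mean(np.exp(1j * (pha - phb))))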

  6. Geological and historical evidence of irregular recurrent earthquakes in Japan.

    Science.gov (United States)

    Satake, Kenji

    2015-10-28

    Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres releasing strains accumulated from decades to centuries of plate motions. Assuming a simple 'characteristic earthquake' model that similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes including geological traces from giant (M∼9) earthquakes indicate a variety of size and recurrence interval of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that average recurrence interval of great earthquakes is approximately 100 years, but the tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake with an interval of approximately 600 years is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence interval of a few hundred years and a few thousand years had been recognized, but studies show that the recent three Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for the long-term forecast, and several attempts such as use of geological data for the evaluation of future earthquake probabilities or the estimation of maximum earthquake size in each subduction zone are being conducted by government committees. © 2015 The Author(s).

  7. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
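
    A sketch of the likelihood-based consistency idea in simplified form: given forecast rates in space-magnitude-time bins and the observed counts, compare the joint Poisson log-likelihood of the observation with its simulated distribution under the forecast. This is in the spirit of the RELM L-test; the actual test specification is more detailed.

        # Simplified likelihood consistency test: small returned quantile means the
        # observed catalog is unlikely under the forecast rates.
        import numpy as np
        from scipy.stats import poisson

        def l_test(rates, counts, n_sim=10000, seed=0):
            """rates: forecast expected counts per bin; counts: observed counts per bin."""
            rng = np.random.default_rng(seed)
            ll_obs = poisson.logpmf(counts, rates).sum()
            ll_sim = np.array([poisson.logpmf(rng.poisson(rates), rates).sum()
                               for _ in range(n_sim)])
            return (ll_sim <= ll_obs).mean()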

  8. Tiltmeter studies in earthquake prediction

    Science.gov (United States)

    Johnston, M.

    1978-01-01

    Our knowledge is still very limited as to the way in which the Earth's surface deforms around active faults and why it does so. By far the easiest method of providing clues to the mechanisms involved is to record the associated pattern of tilt of the Earth's surface. 

  9. A comparative study on prediction methods for China's medium- and long-term coal demand

    International Nuclear Information System (INIS)

    Li, Bing-Bing; Liang, Qiao-Mei; Wang, Jin-Cheng

    2015-01-01

    Given the dominant position of coal in China's energy structure, and in order to ensure a safe and stable energy supply, it is essential to perform a scientific and effective prediction of China's medium- and long-term coal demand. Based on historical data of coal consumption and related factors such as GDP (gross domestic product), coal price, industrial structure, total population, energy structure, energy efficiency, coal production and urbanization rate from 1987 to 2012, this study compared the prediction performance of five types of models. These models include the VAR (vector autoregressive) model, the RBF (radial basis function) neural network model, the GA-DEM (genetic algorithm demand estimation model), the PSO-DEM (particle swarm optimization demand estimation model) and the IO (input-output) model. Comparing the results of the different models with the corresponding actual coal consumption over a testing period from 2006 to 2012, it is concluded that the PSO-DEM model predicts China's total coal demand best, with a MAPE (mean absolute percentage error) close to or below 2%. - Highlights: • The prediction effects of five methods for China's coal demand were compared. • Each model has acceptable prediction results, with MAPE below 5%. • The particle swarm optimization demand estimation model has better forecast efficacy.
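
    The comparison criterion used here, MAPE, is straightforward to compute; a short sketch follows, with placeholder names for the demand series and the model outputs.

        # Mean absolute percentage error over a common testing period.
        import numpy as np

        def mape(actual, predicted):
            actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
            return 100.0 * np.mean(np.abs((actual - predicted) / actual))

        # e.g. {name: mape(coal_demand_test, forecasts[name]) for name in forecasts}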

  10. Predicting long-term moisture contents of earthen covers at uranium mill tailings sites

    International Nuclear Information System (INIS)

    Gee, G.W.; Nielson, K.K.; Rogers, V.C.

    1984-09-01

    The three methods for long-term moisture prediction covered in this report are: estimates from water retention (permanent wilting point) data, correlation with climate and soil type, and detailed model simulation. The test results have shown: soils vary greatly in residual moisture. Expected long-term moisture saturation ratios (based on generalized soil characteristics) range from 0.2 to 0.8 for soils ranging in texture from sand to clay, respectively. These values hold for noncompacted field soils. Measured radon diffusion coefficients for soils at 15-bar water contents ranged from 5.0E-2 cm²/s to 5.0E-3 cm²/s for sands and clays, respectively, at typical field densities. In contrast, fine-textured pit-run earthen materials, subjected to optimum compaction (>85% Proctor density) and dried to the 15-bar water content, ranged from 0.7 to 0.9 moisture saturation. Compacted pit-run soils at these moisture contents exhibited radon diffusion coefficients as low as 3.0E-4 cm²/s. The residual moisture saturation for cover soils is not known since no engineered barrier has been in place for more than a few years. A comparison of methods for predicting moisture saturation indicates that model simulations are useful for predicting effects of climatic changes on residual soil moisture, but that long-term moisture also can be predicted with some degree of confidence using generalized soil properties or empirical correlations based both on soils and climatic information. The optimal soil cover design will likely include more than one layer of soil. A two-layer system using a thick (1-m minimum) plant root zone of uncompacted soil placed over a moistened, tightly compacted fine-textured soil is recommended. This design concept has been tested successfully at the Grand Junction, Colorado, tailings piles.

  11. Seismic dynamics in advance and after the recent strong earthquakes in Italy and New Zealand

    Science.gov (United States)

    Nekrasova, A.; Kossobokov, V. G.

    2017-12-01

    We consider seismic events as a sequence of avalanches in the self-organized system of blocks-and-faults of the Earth's lithosphere and characterize earthquake series with the distribution of the control parameter η = τ × 10^(B(5−M)) × L^C of the Unified Scaling Law for Earthquakes, USLE (where τ is the inter-event time, B is analogous to the Gutenberg-Richter b-value, C is the fractal dimension of the seismic locus, and L is the linear size of the locus). A systematic analysis of earthquake series in Central Italy and New Zealand, 1993-2017, suggests the existence, in the long term, of different rather steady levels of seismic activity characterized by near-constant values of η, which, in the mid-term, intermittently switch at times of transitions associated with strong catastrophic events. On such a transition, seismic activity may, in the short term, follow different scenarios with inter-event time scaling of different kinds, including constant, logarithmic, power-law, exponential rise/decay, or a mixture of those. The results do not support the presence of universality in seismic energy release. The observed variability of seismic activity in advance of and after strong (M6.0+) earthquakes in Italy and significant (M7.0+) earthquakes in New Zealand provides important constraints on modelling realistic earthquake sequences by geophysicists and can be used to improve local seismic hazard assessments, including earthquake forecast/prediction methodologies. The transitions of seismic regime in Central Italy and New Zealand that started in 2016 are still in progress and require special attention and geotechnical monitoring. It would be premature to make any kind of definitive conclusion on the level of seismic hazard, which is evidently high at this particular moment in both regions. The study was supported by the Russian Science Foundation, Grant No. 16-17-00093.
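
    With the control parameter written as above, η is simple to evaluate along an earthquake sequence; a sketch follows, with placeholder values for B, C and the linear size L (the actual values are region-specific estimates, and the choice of which event's magnitude enters each pair is a simplifying assumption here).

        # eta = tau * 10**(B*(5 - M)) * L**C for successive events, following the
        # reconstructed USLE control parameter. Parameter values are placeholders.
        import numpy as np

        def usle_eta(times, mags, B=1.0, C=1.2, L=100.0):
            """times in years (sorted), mags magnitudes; L is the locus size in km."""
            tau = np.diff(times)                        # inter-event times
            return tau * 10.0 ** (B * (5.0 - mags[1:])) * L ** C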

  12. Prediction of the Long Term Cooling Performance for the 3-Pin Fuel Test Loop

    Energy Technology Data Exchange (ETDEWEB)

    Park, S. K.; Chi, D. Y.; Sim, B. S.; Park, K. N.; Ahn, S. H.; Lee, J. M.; Lee, C. Y.; Kim, H. R

    2005-12-15

    In the long-term cooling phase, after the emergency cooling water injection ends, the residual heat removal performance of the 3-pin fuel test loop has been predicted by a simplified heat transfer model. In the long-term cooling phase the residual heat is 1323 W for the PWR fuel test mode and 1449 W for the CANDU fuel test mode. Each residual heat value is assumed to be 2% of the fission power of the test fuel used in the anticipated operational occurrence and design basis accident analyses. Each fission power used for the analyses is 105% of the rated fission power in normal operation. In the long-term cooling phase the residual heat is removed to the HANARO pool through the double pressure vessels of the in-pile test section. Saturated pool boiling is assumed on the test fuel, and condensation heat transfer is expected on the inner wall of the fuel carrier and the flow divider. Natural convection heat transfer on a heated vertical wall is also assumed on the outer wall of the outer pressure vessel. Conduction heat transfer alone is considered in the gap between the double pressure vessels, which is charged with neon gas, and in the downcomer filled with coolant. The heat transfer rate between the coolant temperature of 152 °C in the in-pile test section and the water temperature of 45 °C in the HANARO pool is predicted to be about 1666 W. The 152 °C is the saturation temperature at the coolant pressure predicted by the MARS code. The cooling capacity of 1666 W is greater than the residual heats of 1323 W and 1449 W. Consequently, the long-term cooling performance of the 3-pin fuel test loop is sufficient for anticipated operational occurrences and design basis accidents.

  13. An adaptive short-term prediction scheme for wind energy storage management

    International Nuclear Information System (INIS)

    Blonbou, Ruddy; Monjoly, Stephanie; Dorville, Jean-Francois

    2011-01-01

    Research highlights: → We develop a real-time algorithm for grid-connected wind energy storage management. → The method aims to guarantee, with a ±5% error margin, the power sent to the grid. → Dynamic scheduling of energy storage is based on short-term energy prediction. → Accurate predictions reduce the need for storage capacity. -- Abstract: An efficient forecasting scheme that includes some information on the likelihood of the forecast, based on better knowledge of the wind variation characteristics and their influence on power output variation, is of key importance for the optimal integration of wind energy in an island's power system. In the Guadeloupean archipelago (French West Indies), with a total wind power capacity of 25 MW, wind energy can represent up to 5% of the instantaneous electricity production. At this level, the wind energy contribution can be equivalent to the current network primary control reserve, which makes balancing difficult. The share of wind energy is due to grow even further, since the objective is set to reach 118 MW by 2020. It is absolutely evident to the network operator that, owing to security concerns for the electrical grid, the share of wind generation should not increase unless solutions are found to solve the prediction problem. The University of French West-Indies and Guyana has developed a short-term wind energy prediction scheme that uses artificial neural networks and adaptive learning procedures based on a Bayesian approach and Gaussian approximation. This paper reports the results of the evaluation of the proposed approach; the improvement with respect to the simple persistence prediction model was globally good. A discussion of how such a tool, combined with energy storage capacity, could help smooth the wind power variation and improve the wind energy penetration rate into the island utility network is also proposed.

  14. Interevent times in a new alarm-based earthquake forecasting model

    Science.gov (United States)

    Talbi, Abdelhak; Nanjo, Kazuyoshi; Zhuang, Jiancang; Satake, Kenji; Hamdache, Mohamed

    2013-09-01

    This study introduces a new earthquake forecasting model that uses the moment ratio (MR) of the first to second order moments of earthquake interevent times as a precursory alarm index to forecast large earthquake events. This MR model is based on the idea that the MR is associated with anomalous long-term changes in background seismicity prior to large earthquake events. In a given region, the MR statistic is defined as the inverse of the index of dispersion or Fano factor, with MR values (or scores) providing a biased estimate of the relative regional frequency of background events, here termed the background fraction. To test the forecasting performance of this proposed MR model, a composite Japan-wide earthquake catalogue for the years between 679 and 2012 was compiled using the Japan Meteorological Agency catalogue for the period between 1923 and 2012, and the Utsu historical seismicity records between 679 and 1922. MR values were estimated by sampling interevent times from events with magnitude M ≥ 6 using an earthquake random sampling (ERS) algorithm developed during previous research. Three retrospective tests of M ≥ 7 target earthquakes were undertaken to evaluate the long-, intermediate- and short-term performance of MR forecasting, using mainly Molchan diagrams and optimal spatial maps obtained by minimizing a forecasting error defined as the sum of the miss and alarm rates. This testing indicates that the MR forecasting technique performs well at long, intermediate and short terms. The MR maps produced during long-term testing indicate significant alarm levels before 15 of the 18 shallow earthquakes within the testing region during the past two decades, with an alarm region covering about 20 per cent (alarm rate) of the testing region. The number of shallow events missed by forecasting was reduced by about 60 per cent after using the MR method instead of the relative intensity (RI) forecasting method. At short term, our model succeeded in forecasting the
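
    Following the definition given here (MR as the inverse of the index of dispersion, or Fano factor, of sampled interevent times), a sketch of the alarm index for one spatial cell follows; the windowing and the ERS sampling step of the paper are omitted.

        # Moment-ratio alarm index for one cell: mean/variance of interevent times
        # (the inverse Fano factor); higher scores flag anomalous background activity.
        import numpy as np

        def moment_ratio(event_times):
            tau = np.diff(np.sort(event_times))   # interevent times within the cell
            return tau.mean() / tau.var()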

  15. The use of radon as an earthquake precursor

    International Nuclear Information System (INIS)

    Ramola, R.C.; Singh, M.; Sandhu, A.S.; Singh, S.; Virk, H.S.

    1990-01-01

    Radon monitoring has been part of an integrated approach to earthquake prediction since the discovery of coherent, time-anomalous radon concentrations prior to, during and after the 1966 Tashkent earthquake. In this paper some studies of groundwater and soil gas radon content in relation to earthquake activity are reviewed. Laboratory experiments and the development of groundwater and soil gas radon monitoring systems are described. In addition, radon monitoring studies conducted at the Guru Nanak Dev University Campus since 1986 are presented in detail. During these studies some anomalous changes in radon concentration were recorded before earthquakes occurred in the region. The anomalous radon increases are independent of meteorological conditions and appear to be caused by strain changes that precede the earthquake. Anomalous changes in radon concentration before an earthquake suggest that radon monitoring can serve as an additional technique in the earthquake prediction programme in India. (author)

  16. Predicting long-term risk for relationship dissolution using nonparametric conditional survival trees.

    Science.gov (United States)

    Kliem, Sören; Weusthoff, Sarah; Hahlweg, Kurt; Baucom, Katherine J W; Baucom, Brian R

    2015-12-01

    Identifying risk factors for divorce or separation is an important step in the prevention of negative individual outcomes and societal costs associated with relationship dissolution. Programs that aim to prevent relationship distress and dissolution typically focus on changing processes that occur during couple conflict, although the predictive ability of conflict-specific variables has not been examined in the context of other factors related to relationship dissolution. The authors examine whether emotional responding and communication during couple conflict predict relationship dissolution after controlling for overall relationship quality and individual well-being. Using nonparametric conditional survival trees, the study at hand simultaneously examined the predictive abilities of physiological (systolic and diastolic blood pressure, heart rate, cortisol) and behavioral (fundamental frequency; f0) indices of emotional responding, as well as observationally coded positive and negative communication behavior, on long-term relationship stability after controlling for relationship satisfaction and symptoms of depression. One hundred thirty-six spouses were assessed after participating in a randomized clinical trial of a relationship distress prevention program, as well as 11 years thereafter; 32.5% of the couples' relationships had dissolved by follow-up. For men, the only significant predictor of relationship dissolution was the cortisol change score (p = .012). For women, only f0 range was a significant predictor of relationship dissolution (p = .034). These findings highlight the importance of emotional responding during couple conflict for long-term relationship stability. (c) 2015 APA, all rights reserved.

  17. Prediction of long-term precipitate evolution in austenitic heat-resistant steels

    Energy Technology Data Exchange (ETDEWEB)

    Shim, Jae-Hyeok; Jung, Woo-Sang; Cho, Young Whan [Korea Institute of Science and Technology, Seoul (Korea, Republic of). Materials/Devices Div.]; Kozeschnik, Ernst [Vienna Univ. of Technology (Austria). Inst. of Materials Science and Technology]

    2010-07-01

    Numerical prediction of the long-term precipitate evolution in five different austenitic heat-resistant stainless steels, NF709, Super304H, Sanicro25, CF8C-PLUS and HTUPS, has been carried out. MX and M₂₃C₆ are predicted to remain the major precipitates during long-term aging in these steels. The addition of 3 wt% Cu produces very fine Cu-rich precipitates during aging in Super304H and Sanicro25. It is found that the amount of Z phase starts to increase remarkably between 1,000 and 10,000 hours of aging at the expense of MX precipitates in the steels containing a high nitrogen content. However, the growth rate of Z phase is relatively slow, and its average size reaches at most a few tens of nanometers after 100,000 hours of aging at 700 °C, compared with 9-12% Cr ferritic/martensitic heat-resistant steels. The predicted precipitation sequence and precipitate sizes during aging are in general agreement with experimental observations. (orig.)

  18. c-Fos expression predicts long-term social memory retrieval in mice.

    Science.gov (United States)

    Lüscher Dias, Thomaz; Fernandes Golino, Hudson; Moura de Oliveira, Vinícius Elias; Dutra Moraes, Márcio Flávio; Schenatto Pereira, Grace

    2016-10-15

    The way the rodent brain generally processes socially relevant information is rather well understood. How social information is stored into long-term social memory, however, is still under debate. Here, brain c-Fos expression was measured after adult mice were exposed to familiar or novel juveniles, and expression was compared across several memory-related and socially relevant brain areas. The machine-learning algorithm Random Forest was then used to predict the social interaction category of adult mice based on c-Fos expression in these areas. Interaction with a familiar conspecific altered brain activation in the olfactory bulb, amygdala, hippocampus, lateral septum and medial prefrontal cortex. Remarkably, Random Forest was able to predict interaction with a familiar juvenile with 100% accuracy. Activity in the olfactory bulb, amygdala, hippocampus and medial prefrontal cortex was crucial to this prediction. From our results, we suggest that long-term social memory depends on initial social olfactory processing in the medial amygdala and its output connections, acting synergistically with non-social contextual integration by the hippocampus and top-down modulation of primary olfactory structures by the medial prefrontal cortex. Copyright © 2016 Elsevier B.V. All rights reserved.
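
    A sketch of the analysis described: a Random Forest classifier predicting the interaction category from regional c-Fos expression, with feature importances indicating which areas drive the prediction. The region list, cross-validation scheme, and forest size are illustrative assumptions.

        # Random Forest on regional c-Fos counts; feature importances point to the
        # most informative brain areas. Column names are illustrative.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        regions = ["olfactory_bulb", "amygdala", "hippocampus",
                   "lateral_septum", "medial_prefrontal_cortex"]

        def classify(X, y, seed=0):
            """X: (n_mice, n_regions) c-Fos expression; y: 0 = novel, 1 = familiar."""
            rf = RandomForestClassifier(n_estimators=500, random_state=seed)
            acc = cross_val_score(rf, X, y, cv=5).mean()
            rf.fit(X, y)
            return acc, dict(zip(regions, rf.feature_importances_))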

  19. Prediction method of long-term reliability in improving residual stresses by means of surface finishing

    International Nuclear Information System (INIS)

    Sera, Takehiko; Hirano, Shinro; Chigusa, Naoki; Okano, Shigetaka; Saida, Kazuyoshi; Mochizuki, Masahito; Nishimoto, Kazutoshi

    2012-01-01

    Surface finishing methods such as Water Jet Peening (WJP) have been applied to welds in some major components of nuclear power plants as a countermeasure against Primary Water Stress Corrosion Cracking (PWSCC). In addition, surface finishing by buffing treatment is being standardized, and buffing has thus also come to be recognized as a well-established method of improving stress. On the other hand, the long-term stability of peening techniques has been confirmed by accelerated tests. However, the stress improvement achieved by surface treatment is limited to thin layers, and the effect of the complicated residual stress distribution in the weld metal beneath the surface is not strictly taken into account for long-term stability. This paper, therefore, describes accelerated tests which confirmed that the long-term stability of the layer subjected to buffing treatment is equal to that subjected to WJP. The long-term reliability of the very thin stress-improved layer was also confirmed through a trial evaluation by thermal elastic-plastic creep analysis, even when the effect of the complicated residual stress distribution in the weld metal was conservatively taken into account. Considering the above findings, an approach is proposed for constructing a method of predicting the long-term reliability of stress improvement by surface finishing. (author)

  20. Predicting outcome in term neonates with hypoxic-ischaemic encephalopathy using simplified MR criteria

    International Nuclear Information System (INIS)

    Jyoti, Rajeev; O'Neil, Ross

    2006-01-01

    MRI is an established investigation in the evaluation of neonates with suspected hypoxic-ischaemic encephalopathy (HIE). However, its role as a predictor of neurodevelopmental outcome remains complex. To establish reproducible simplified MR criteria and evaluate their role in predicting neurodevelopmental outcome in term neonates with HIE. Term neonates with suspected HIE had MRI at 7-10 days of age. MR scans were interpreted according to new simplified criteria by two radiologists blinded to the clinical course and outcome. The new simplified criteria allocated grade 1 to cases with no central and less than 10% peripheral change, grade 2 to those with less than 30% central and/or 10-30% peripheral area change, and grade 3 to those with more than 30% central or peripheral change. MRI changes were compared with clinical neurodevelopmental outcome evaluated prospectively at 1 year of age. Neurodevelopmental outcome was based upon the DQ score (revised Griffith's) and cerebral palsy on neurological assessment. Of 20 subjects, all those showing severe (grade 3) MR changes (35%) died or had poor neurodevelopmental outcome. Subjects with a normal MR scan or with scans showing only mild (grade 1) MR changes (55%) had normal outcomes. One subject showing moderate (grade 2) changes on MRI had a moderate outcome (5%), while another had an atypical pattern of MR changes with a normal outcome (5%). Assessment of full-term neonates with suspected HIE using the simplified MR criteria is highly predictive of neurodevelopmental outcome. (orig.)

  1. Predicting outcome in term neonates with hypoxic-ischaemic encephalopathy using simplified MR criteria

    Energy Technology Data Exchange (ETDEWEB)

    Jyoti, Rajeev; O'Neil, Ross [Canberra Hospital, Medical Imaging, Canberra, ACT (Australia)]

    2006-01-01

    MRI is an established investigation in the evaluation of neonates with suspected hypoxic-ischaemic encephalopathy (HIE). However, its role as a predictor of neurodevelopmental outcome remains complex. To establish reproducible simplified MR criteria and evaluate their role in predicting neurodevelopmental outcome in term neonates with HIE. Term neonates with suspected HIE had MRI at 7-10 days of age. MR scans were interpreted according to new simplified criteria by two radiologists blinded to the clinical course and outcome. The new simplified criteria allocated grade 1 to cases with no central and less than 10% peripheral change, grade 2 to those with less than 30% central and/or 10-30% peripheral area change, and grade 3 to those with more than 30% central or peripheral change. MRI changes were compared with clinical neurodevelopmental outcome evaluated prospectively at 1 year of age. Neurodevelopmental outcome was based upon the DQ score (revised Griffith's) and cerebral palsy on neurological assessment. Of 20 subjects, all those showing severe (grade 3) MR changes (35%) died or had poor neurodevelopmental outcome. Subjects with a normal MR scan or with scans showing only mild (grade 1) MR changes (55%) had normal outcomes. One subject showing moderate (grade 2) changes on MRI had a moderate outcome (5%), while another had an atypical pattern of MR changes with a normal outcome (5%). Assessment of full-term neonates with suspected HIE using the simplified MR criteria is highly predictive of neurodevelopmental outcome. (orig.)

  2. Filling a gap: Public talks about earthquake preparation and the 'Big One'

    Science.gov (United States)

    Reinen, L. A.

    2013-12-01

    Residents of southern California are aware they live in a seismically active area and earthquake drills have trained us to Duck-Cover-Hold On. While many of my acquaintance are familiar with what to do during an earthquake, few have made preparations for living with the aftermath of a large earthquake. The ShakeOut Scenario (Jones et al., USGS Open File Report 2008-1150) describes the physical, social, and economic consequences of a plausible M7.8 earthquake on the southernmost San Andreas Fault. While not detailing an actual event, the ShakeOut Scenario illustrates how individual and community preparation may improve the potential after-affects of a major earthquake in the region. To address the gap between earthquake drills and preparation in my community, for the past several years I have been giving public talks to promote understanding of: the science behind the earthquake predictions; why individual, as well as community, preparation is important; and ways in which individuals can prepare their home and work environments. The public presentations occur in an array of venues, including elementary school and college classes, a community forum linked with the annual ShakeOut Drill, and local businesses including the local microbrewery. While based on the same fundamental information, each presentation is modified for audience and setting. Assessment of the impact of these talks is primarily anecdotal and includes an increase in the number of venues requesting these talks, repeat invitations, and comments from audience members (sometimes months or years after a talk). I will present elements of these talks, the background information used, and examples of how they have effected change in the earthquake preparedness of audience members. Discussion and suggestions (particularly about effective means of conducting rigorous long-term assessment) are strongly encouraged.

  3. A hybrid measure-correlate-predict method for long-term wind condition assessment

    International Nuclear Information System (INIS)

    Zhang, Jie; Chowdhury, Souma; Messac, Achille; Hodge, Bri-Mathias

    2014-01-01

    Highlights: • A hybrid measure-correlate-predict (MCP) methodology with greater accuracy is developed. • Three sets of performance metrics are proposed to evaluate the hybrid MCP method. • Both wind speed and direction are considered in the hybrid MCP method. • The best combination of MCP algorithms is determined. • The developed hybrid MCP method is uniquely helpful for long-term wind resource assessment. - Abstract: This paper develops a hybrid measure-correlate-predict (MCP) strategy to assess long-term wind resource variations at a farm site. The hybrid MCP method uses recorded data from multiple reference stations to estimate long-term wind conditions at a target wind plant site with greater accuracy than is possible with data from a single reference station. The weight of each reference station in the hybrid strategy is determined by the (i) distance and (ii) elevation differences between the target farm site and each reference station. In this case, the wind data is divided into sectors according to the wind direction, and the MCP strategy is implemented for each wind direction sector separately. The applicability of the proposed hybrid strategy is investigated using five MCP methods: (i) linear regression; (ii) variance ratio; (iii) Weibull scale; (iv) artificial neural networks; and (v) support vector regression. To implement the hybrid MCP methodology, we use hourly averaged wind data recorded at five stations in the state of Minnesota between 07-01-1996 and 06-30-2004. Three sets of performance metrics are used to evaluate the hybrid MCP method. The first set of metrics analyzes the statistical performance, including the mean wind speed, wind speed variance, root mean square error, and mean absolute error. The second set of metrics evaluates the distribution of long-term wind speed; to this end, the Weibull distribution and the Multivariate and Multimodal Wind Distribution models are adopted. The third set of metrics analyzes
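
    As a sketch of one ingredient of such a strategy, the code below fits the simplest of the five algorithms, a per-direction-sector linear regression from a reference station to the target site, and applies it to new reference data. The sector count is an assumption, and the multi-station weighting and the other four algorithms are omitted.

        # Sector-wise linear MCP: fit target ~ reference per wind-direction sector
        # on the concurrent period, then apply to the long-term reference record.
        import numpy as np

        def fit_sector_mcp(ref_speed, ref_dir, target_speed, n_sectors=12):
            edges = np.linspace(0.0, 360.0, n_sectors + 1)
            coefs = {}
            for s in range(n_sectors):
                m = (ref_dir >= edges[s]) & (ref_dir < edges[s + 1])
                if m.sum() >= 2:
                    coefs[s] = np.polyfit(ref_speed[m], target_speed[m], 1)  # slope, intercept
            return edges, coefs

        def predict_sector_mcp(ref_speed, ref_dir, edges, coefs):
            sector = np.clip(np.searchsorted(edges, ref_dir, side="right") - 1,
                             0, len(edges) - 2)
            out = np.full_like(ref_speed, np.nan, dtype=float)
            for s, (a, b) in coefs.items():
                m = sector == s
                out[m] = a * ref_speed[m] + b
            return out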

  4. Parallelization of the Coupled Earthquake Model

    Science.gov (United States)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.

    2007-01-01

    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, predicting tsunamis over the Internet had not been done before. This new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  5. Using Long-Short-Term-Memory Recurrent Neural Networks to Predict Aviation Engine Vibrations

    Science.gov (United States)

    ElSaid, AbdElRahman Ahmed

    This thesis examines building viable Recurrent Neural Networks (RNN) using Long Short-Term Memory (LSTM) neurons to predict aircraft engine vibrations. The different networks are trained on a large database of flight data records, obtained from an airline, containing flights that suffered from excessive vibration. RNNs can provide a more generalizable and robust method for prediction than analytical calculations of engine vibration, as analytical calculations must be solved iteratively based on specific empirical engine parameters, and this database contains multiple types of engines. Further, LSTM RNNs provide a "memory" of the contribution of previous time-series data, which can further improve predictions of future vibration values. LSTM RNNs were used over traditional RNNs, as the latter suffer from vanishing/exploding gradients when trained with backpropagation. The study managed to predict vibration values 1, 5, 10, and 20 seconds in the future, with 2.84%, 3.3%, 5.51% and 10.19% mean absolute error, respectively. These neural networks provide a promising means for the future development of warning systems, so that suitable actions can be taken before the occurrence of excess vibration to avoid unfavorable situations during flight.

  6. Multi-step prediction for influenza outbreak by an adjusted long short-term memory.

    Science.gov (United States)

    Zhang, J; Nawata, K

    2018-05-01

    Influenza results in approximately 3-5 million annual cases of severe illness and 250 000-500 000 deaths. We urgently need an accurate multi-step-ahead time-series forecasting model to help hospitals perform dynamic assignment of beds to influenza patients for the annually varying influenza season, and to aid pharmaceutical companies in formulating flexible manufacturing plans for the yearly updated influenza vaccine. In this study, we utilised four different multi-step prediction algorithms with the long short-term memory (LSTM) network. The results showed that implementing multiple single-output predictions in a six-layer LSTM structure achieved the best accuracy: the mean absolute percentage errors from two- to 13-step-ahead prediction of the US influenza-like illness rates were consistently low. The adjusted LSTM has thus been applied and refined to perform multi-step-ahead prediction for influenza outbreaks. Hopefully, this modelling methodology can be applied in other countries and therefore help prevent and control influenza worldwide.
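
    A sketch of the "multiple single-output" (direct) strategy that performed best: one single-output model is trained per forecast horizon, rather than feeding predictions back recursively. A generic ridge regressor stands in for the paper's six-layer LSTM; the window length and horizons below are illustrative.

        # Direct multi-step forecasting: one model per horizon h, all predicting
        # from the same input window of past observations.
        import numpy as np
        from sklearn.linear_model import Ridge

        def fit_direct(series, window=52, horizons=range(1, 14)):
            models = {}
            for h in horizons:
                X = np.array([series[i:i + window]
                              for i in range(len(series) - window - h + 1)])
                y = series[window + h - 1: len(series)]
                models[h] = Ridge().fit(X, y)
            return models

        def predict_direct(models, last_window):
            x = np.asarray(last_window)[None, :]
            return {h: float(m.predict(x)[0]) for h, m in models.items()}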

  7. Modelling techniques for predicting the long term consequences of radiation on natural aquatic populations

    International Nuclear Information System (INIS)

    Wallis, I.G.

    1978-01-01

    The purpose of this working paper is to describe modelling techniques for predicting the long-term consequences of radiation on natural aquatic populations. Ideally, it would be possible to use aquatic population models: (1) to predict changes in the health and well-being of all aquatic populations as a result of changing the composition, amount and location of radionuclide discharges; (2) to compare the effects of steady, fluctuating and accidental releases of radionuclides; and (3) to evaluate the combined impact of the discharge of radionuclides and other wastes, and natural environmental stresses, on aquatic populations. At the outset it should be stated that no existing model can achieve this ideal performance. However, modelling skills and techniques are available to develop useful aquatic population models. This paper discusses the considerations involved in developing these models and briefly describes the various types of population models which have been developed to date.

  8. Serial-order short-term memory predicts vocabulary development: evidence from a longitudinal study.

    Science.gov (United States)

    Leclercq, Anne-Lise; Majerus, Steve

    2010-03-01

    Serial-order short-term memory (STM), as opposed to item STM, has been shown to be very consistently associated with lexical learning abilities in cross-sectional study designs. This study investigated longitudinal predictions between serial-order STM and vocabulary development. Tasks maximizing the temporary retention of either serial-order or item information were administered to kindergarten children aged 4 and 5. At age 4, age 5, and from age 4 to age 5, serial-order STM capacities, but not item STM capacities, were specifically associated with vocabulary development. Moreover, the increase of serial-order STM capacity from age 4 to age 5 predicted the increase of vocabulary knowledge over the same time period. These results support a theoretical position that assumes an important role for serial-order STM capacities in vocabulary acquisition.

  9. General Inattentiveness Is a Long-Term Reliable Trait Independently Predictive of Psychological Health

    DEFF Research Database (Denmark)

    Jensen, Christian Gaden; Niclasen, Janni; Vangkilde, Signe

    2016-01-01

    The Mindful Attention Awareness Scale (MAAS) measures perceived degree of inattentiveness in different contexts and is often used as a reversed indicator of mindfulness. MAAS is hypothesized to reflect a psychological trait or disposition when used outside attentional training contexts, but the long-term test-retest reliability of MAAS scores is virtually untested. It is unknown whether MAAS predicts psychological health after controlling for standardized socioeconomic status classifications. First, MAAS translated to Danish was validated psychometrically within a randomly invited healthy adult community sample (N = 490). Factor analysis confirmed that MAAS scores quantified a unifactorial construct of excellent composite reliability and consistent convergent validity. Structural equation modeling revealed that MAAS scores contributed independently to predicting psychological distress.

  10. Predicting the short-term risk of diabetes in HIV-positive patients

    DEFF Research Database (Denmark)

    Petoumenos, Kathy; Worm, Signe Westring; Fontas, Eric

    2012-01-01

    Introduction: HIV-positive patients receiving combination antiretroviral therapy (cART) frequently experience metabolic complications such as dyslipidemia and insulin resistance, as well as lipodystrophy, increasing the risk of cardiovascular disease (CVD) and diabetes mellitus (DM). Rates of DM and other glucose-associated disorders among HIV-positive patients have been reported to range between 2 and 14%, and in an ageing HIV-positive population, the prevalence of DM is expected to continue to increase. This study aims to develop a model to predict the short-term (six-month) risk of DM in HIV-positive patients. Factors predictive of DM included higher glucose, body mass index (BMI) and triglyceride levels, and older age. Among HIV-related factors, recent CD4 counts of...

  11. Earthquake Hazard Assessment: an Independent Review

    Science.gov (United States)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and differing accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for the evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space, is evident, and such testing must be done in advance of claiming hazardous areas and/or times. The set of errors, i.e. the rates of failure and of the alerted space-time volume, compared with those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of its parameters with regard to specified cost-benefit functions. This and other information obtained in such testing may supply us with a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making in regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".
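
    A sketch of the Molchan error-diagram evaluation mentioned here: for each alarm threshold, the miss rate is plotted against the fraction of space-time on alarm, and the diagonal corresponds to random guessing. Equal cell weights are assumed for simplicity.

        # Molchan error diagram: alerted fraction tau vs. miss rate nu across
        # alarm thresholds. Points below the diagonal tau + nu = 1 indicate skill
        # beyond random guessing.
        import numpy as np

        def molchan_curve(cell_scores, event_cell_scores):
            """cell_scores: hazard score of every space-time cell (equal weights);
            event_cell_scores: scores of the cells containing target events."""
            thresholds = np.unique(cell_scores)[::-1]
            tau, nu = [], []
            for t in thresholds:
                tau.append((cell_scores >= t).mean())        # alerted fraction
                nu.append((event_cell_scores < t).mean())    # miss rate
            return np.array(tau), np.array(nu)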

  12. Long-term prediction of chaotic time series with multi-step prediction horizons by a neural network with Levenberg-Marquardt learning algorithm

    International Nuclear Information System (INIS)

    Mirzaee, Hossein

    2009-01-01

    The Levenberg-Marquardt learning algorithm is applied to train a multilayer perceptron with three hidden layers, each with ten neurons, in order to carefully map the structure of a chaotic time series such as the Mackey-Glass time series. First the MLP network is trained with 1000 data points, and then it is tested with the next 500 data points. After that, the trained and tested network is applied for long-term prediction of the next 120 data points, which come after the test data. The prediction proceeds as follows: the first network input is formed from the last four values of the test data; each predicted value is then shifted into the regression vector that serves as the next input, so that after the first four prediction steps the input regression vector consists entirely of predicted values, and each new prediction is in turn shifted into the input vector for the subsequent step.
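
    The iterated prediction loop described can be sketched as follows, with `model` standing for any trained one-step predictor (here the MLP); the window length of four follows the abstract, while everything else is illustrative.

        # Recursive multi-step forecasting: predictions are fed back into the
        # regression vector, so long-horizon inputs are entirely predicted values.
        import numpy as np

        def iterate_forecast(model, last_four, n_steps=120):
            window = list(last_four)                 # regression vector of length 4
            preds = []
            for _ in range(n_steps):
                y = model.predict(np.array(window)[None, :]).ravel()[0]
                preds.append(y)
                window = window[1:] + [y]            # shift prediction into the input
            return np.array(preds)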

  13. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    Science.gov (United States)

    Jordan, T. H.; the International Commission on Earthquake Forecasting for Civil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated

  14. Mid- and long-term runoff predictions by an improved phase-space reconstruction model

    International Nuclear Information System (INIS)

    Hong, Mei; Wang, Dong; Wang, Yuankun; Zeng, Xiankui; Ge, Shanshan; Yan, Hengqian; Singh, Vijay P.

    2016-01-01

    In recent years, the phase-space reconstruction method has often been used for mid- and long-term runoff predictions. However, the traditional phase-space reconstruction method still needs improvement. Using a genetic algorithm to improve the phase-space reconstruction, a new nonlinear model of monthly runoff is constructed that does not rely heavily on embedding dimensions. Recognizing that the rainfall-runoff process is complex and affected by a number of factors, more variables (e.g. temperature and rainfall) are incorporated in the model. In order to detect the possible presence of chaos in the runoff dynamics, the chaotic characteristics of the model are also analyzed; the analysis shows that the model can represent the nonlinear and chaotic characteristics of the runoff. The model is tested for its forecasting performance in four types of experiments using data from six hydrological stations on the Yellow River and the Yangtze River. Results show that medium- and long-term runoff is satisfactorily forecasted at these stations: not only is the forecast trend accurate, but the mean absolute percentage error is no more than 15%. Moreover, the forecasts for both wet years and dry years are good, which means that the improved model can, to some extent, overcome the traditional "wet years and dry years predictability barrier." The forecasts for different regions are all good, showing the universality of the approach. Compared with selected conceptual and empirical methods, the model exhibits greater reliability and stability in long-term runoff prediction. Our study provides new thinking for research on the association between monthly runoff and other hydrological factors, as well as a new method for predicting monthly runoff. - Highlights: • The improved phase-space reconstruction model of monthly runoff is established. • Two variables (temperature and rainfall) are incorporated
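
    A minimal sketch of the time-delay (phase-space) embedding that such models start from is given below; in the paper a genetic algorithm tunes the reconstruction rather than fixing the delay and dimension by hand, and the synthetic series is illustrative only.

        import numpy as np

        def delay_embed(series, dim, tau):
            """Phase-space reconstruction of a scalar series: row k is
            (x[k], x[k+tau], ..., x[k+(dim-1)*tau])."""
            n = len(series) - (dim - 1) * tau
            return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

        rng = np.random.default_rng(1)
        runoff = np.sin(np.linspace(0, 20, 240)) + 0.1 * rng.standard_normal(240)

        dim, tau = 3, 2                       # in the paper, tuned by the GA
        X = delay_embed(runoff, dim, tau)
        X, y = X[:-1], runoff[(dim - 1) * tau + 1 :]   # next-value targets
        print(X.shape, y.shape)               # states and one-step-ahead targets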

  15. Assessing Long-Term Wind Conditions by Combining Different Measure-Correlate-Predict Algorithms: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, J.; Chowdhury, S.; Messac, A.; Hodge, B. M.

    2013-08-01

    This paper significantly advances the hybrid measure-correlate-predict (MCP) methodology, enabling it to account for variations of both wind speed and direction. The advanced hybrid MCP method uses the recorded data of multiple reference stations to estimate the long-term wind condition at a target wind plant site. The results show that the accuracy of the hybrid MCP method is highly sensitive to the combination of the individual MCP algorithms and reference stations. It was also found that the best combination of MCP algorithms varies based on the length of the correlation period.
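
    The core measure-correlate-predict step can be sketched as below under simple assumptions: a single linear relation fitted on the concurrent period and applied to the long-term reference records. The paper's hybrid method additionally weights several MCP algorithms and accounts for wind direction; the data here are synthetic.

        import numpy as np

        rng = np.random.default_rng(2)
        # Long-term wind speed records at two reference stations
        ref = rng.weibull(2.0, size=(5000, 2)) * 8.0
        # Short concurrent record at the target site
        concurrent = ref[:500]
        target = 0.6 * concurrent[:, 0] + 0.3 * concurrent[:, 1] + rng.normal(0, 0.5, 500)

        # "Measure-correlate": fit target ~ references over the concurrent period
        A = np.column_stack([concurrent, np.ones(len(concurrent))])
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)

        # "Predict": apply the relation to the full long-term reference record
        A_full = np.column_stack([ref, np.ones(len(ref))])
        print("estimated long-term mean wind speed:", (A_full @ coef).mean())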

  16. Predicting the short-term risk of diabetes in HIV-positive patients

    DEFF Research Database (Denmark)

    Petoumenos, Kathy; Worm, Signe W; Fontas, Eric

    2012-01-01

    HIV-positive patients receiving combination antiretroviral therapy (cART) frequently experience metabolic complications such as dyslipidemia and insulin resistance, as well as lipodystrophy, increasing the risk of cardiovascular disease (CVD) and diabetes mellitus (DM). Rates of DM and other glucose-associated disorders among HIV-positive patients have been reported to range between 2 and 14%, and in an ageing HIV-positive population, the prevalence of DM is expected to continue to increase. This study aims to develop a model to predict the short-term (six-month) risk of DM in HIV-positive patients.

  17. The role of nuclear techniques in the long-term prediction of radionuclide transport

    International Nuclear Information System (INIS)

    Airey, P.L.; Duerden, P.

    1985-01-01

    Problems associated with the long-term prediction of the migration of radionuclides, and the role of natural analogues in reducing the inherent uncertainties are discussed. Particular reference is made to the evaluation of uranium ore bodies in the Alligator Rivers region, Northern Territory, as analogues of high-level radioactive waste repositories. A range of nuclear techniques has been used to identify the role of colloids, of alpha recoil and of mineralogy in transport. Specific mention is made of a method being developed which enables models of the migration of solute through fractured rock to be assessed via a combination of alpha track, fission track and PIXE/PIGME techniques

  18. Mid- and long-term runoff predictions by an improved phase-space reconstruction model

    Energy Technology Data Exchange (ETDEWEB)

    Hong, Mei [Research Center of Ocean Environment Numerical Simulation, Institute of Meteorology and oceanography, PLA University of Science and Technology, Nanjing (China); Wang, Dong, E-mail: wangdong@nju.edu.cn [Key Laboratory of Surficial Geochemistry, Ministry of Education, Department of Hydrosciences, School of Earth Sciences and Engineering, Collaborative Innovation Center of South China Sea Studies, State Key Laboratory of Pollution Control and Resource Reuse, Nanjing University, Nanjing 210093 (China); Wang, Yuankun; Zeng, Xiankui [Key Laboratory of Surficial Geochemistry, Ministry of Education, Department of Hydrosciences, School of Earth Sciences and Engineering, Collaborative Innovation Center of South China Sea Studies, State Key Laboratory of Pollution Control and Resource Reuse, Nanjing University, Nanjing 210093 (China); Ge, Shanshan; Yan, Hengqian [Research Center of Ocean Environment Numerical Simulation, Institute of Meteorology and oceanography, PLA University of Science and Technology, Nanjing (China); Singh, Vijay P. [Department of Biological and Agricultural Engineering Zachry Department of Civil Engineering, Texas A & M University, College Station, TX 77843 (United States)

    2016-07-15

    In recent years, the phase-space reconstruction method has often been used for mid- and long-term runoff predictions. However, the traditional phase-space reconstruction method still needs improvement. Using a genetic algorithm to improve the phase-space reconstruction, a new nonlinear model of monthly runoff is constructed that does not rely heavily on embedding dimensions. Recognizing that the rainfall-runoff process is complex and affected by a number of factors, more variables (e.g. temperature and rainfall) are incorporated in the model. In order to detect the possible presence of chaos in the runoff dynamics, the chaotic characteristics of the model are also analyzed; the analysis shows that the model can represent the nonlinear and chaotic characteristics of the runoff. The model is tested for its forecasting performance in four types of experiments using data from six hydrological stations on the Yellow River and the Yangtze River. Results show that medium- and long-term runoff is satisfactorily forecasted at these stations: not only is the forecast trend accurate, but the mean absolute percentage error is no more than 15%. Moreover, the forecasts for both wet years and dry years are good, which means that the improved model can, to some extent, overcome the traditional "wet years and dry years predictability barrier." The forecasts for different regions are all good, showing the universality of the approach. Compared with selected conceptual and empirical methods, the model exhibits greater reliability and stability in long-term runoff prediction. Our study provides new thinking for research on the association between monthly runoff and other hydrological factors, as well as a new method for predicting monthly runoff. - Highlights: • The improved phase-space reconstruction model of monthly runoff is established. • Two variables (temperature and rainfall) are incorporated

  19. Sonographical predictive markers of failure of induction of labour in term pregnancy.

    Science.gov (United States)

    Brik, Maia; Mateos, Silvia; Fernandez-Buhigas, Irene; Garbayo, Paloma; Costa, Gloria; Santacruz, Belen

    2017-02-01

    Predictive markers of failure of induction of labour in term pregnancy were evaluated. A prospective study including 245 women attending induction of labour was performed. The inclusion criteria were singleton pregnancy and gestational age of 37-42 weeks; the main outcomes were failure of induction, induction-to-delivery interval, and mode of delivery. Women with a longer cervical length prior to induction (CLpi) had a higher rate of failure of induction (30.9 ± 6.8 vs. 23.9 ± 9.3 mm, p < 0.05), supporting the CLpi as a predictor of failed induction of labour.

  20. CREATING THE KULTUK POLYGON FOR EARTHQUAKE PREDICTION: VARIATIONS OF (234U/238U) AND 87SR/86SR IN GROUNDWATER FROM ACTIVE FAULTS AT THE WESTERN SHORE OF LAKE BAIKAL

    Directory of Open Access Journals (Sweden)

    S. V. Rasskazov

    2015-01-01

    Introduction. Determinations of (234U/238U) in groundwater samples are used for monitoring current deformations in active faults (parentheses denote activity ratio units). The equilibrium value of the activity ratio, (234U/238U) = γ ≈ 1, corresponds to the atomic ratio 234U/238U ≈ 5.47×10–5. This parameter may vary due to higher contents of the 234U nuclide in groundwater as a result of rock deformation. This effect, discovered by P.I. Chalov and V.V. Cherdyntsev, was described in [Cherdyntsev, 1969, 1973; Chalov, 1975; Chalov et al., 1990; Faure, 1989]. In the 1970s and 1980s, only quite laborious methods were available for measuring uranium isotopic ratios. Today it is possible to determine concentrations and isotopic ratios of uranium by express analytical techniques using inductively coupled plasma mass spectrometry (ICP-MS) [Halicz et al., 2000; Shen et al., 2002; Cizdziel et al., 2005; Chebykin et al., 2007]. Sets of samples can be efficiently analysed by ICP-MS, and regularly collected uranium isotope values can be systematized at a new quality level for the purposes of earthquake prediction. In this study of (234U/238U) in groundwater at the Kultuk polygon, we selected the stations of highest sensitivity, which can ensure proper monitoring of the tectonic activity of the Obruchev and Main Sayan faults. These two faults, which limit the Sharyzhalgai block of the crystalline basement of the Siberian craton in the south, are conjugated in the territory of the Kultuk polygon (Fig. 1). Forty sets of samples taken from 27 June 2012 to 28 January 2014 were analysed, and data on 170 samples are discussed in this paper. Methods. Isotope compositions of uranium and strontium were determined by the methods described in [Chebykin et al., 2007; Pin et al., 1992], with modifications. Analyses of uranium by the ICP-MS technique were performed using an Agilent 7500ce quadrupole mass spectrometer of the Ultramicroanalysis Collective Use Centre; analyses of
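
    The equilibrium figure quoted above can be checked with the standard conversion between atomic and activity ratios; the half-lives below are common table values assumed here rather than taken from the paper.

        # Half-lives in years (standard table values, assumed here)
        T_HALF_U238 = 4.468e9
        T_HALF_U234 = 2.455e5

        def activity_ratio(atomic_ratio):
            """(234U/238U) activity ratio = atomic ratio * (lambda234 / lambda238)
            = atomic ratio * (T1/2(238U) / T1/2(234U))."""
            return atomic_ratio * (T_HALF_U238 / T_HALF_U234)

        # The equilibrium atomic ratio quoted in the abstract:
        print(activity_ratio(5.47e-5))   # ~= 1.0, i.e. gamma ~= 1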

  1. Ability of the MACRO Model to Predict Long-Term Leaching of Metribuzin and Diketometribuzin

    DEFF Research Database (Denmark)

    Rosenbom, Annette E; Kjær, Jeanne; Henriksen, Trine

    2009-01-01

    In a regulatory context, numerical models are increasingly employed to quantify leaching of pesticides and their metabolites. Although the ability of these models to accurately simulate leaching of pesticides has been evaluated, little is known about their ability to accurately simulate long-term leaching of metabolites. A Danish study on the dissipation and sorption of metribuzin, involving both monitoring and batch experiments, concluded that desorption and degradation of metribuzin and leaching of its primary metabolite diketometribuzin continued for 5-6 years after application, posing a risk of groundwater contamination. By applying alternative kinetics (a two-site approach), we captured the observed leaching scenario, thus underlining the necessity of accounting for the long-term sorption and dissipation characteristics when using models to predict the risk of groundwater contamination.

  2. Mortality in the L'Aquila (central Italy) earthquake of 6 April 2009.

    Science.gov (United States)

    Alexander, David; Magni, Michele

    2013-01-07

    This paper presents the results of an analysis of data on mortality in the magnitude 6.3 earthquake that struck the central Italian city and province of L'Aquila during the night of 6 April 2009. The aim is to create a profile of the deaths in terms of age, gender, location, behaviour during the tremors, and other aspects. This could help predict the pattern of casualties and priorities for protection in future earthquakes. To establish a basis for analysis, the literature on seismic mortality is surveyed. The conclusions of previous studies are synthesised regarding patterns of mortality, entrapment, survival times, self-protective behaviour, gender and age. These factors are investigated for the data set covering the 308 fatalities in the L'Aquila earthquake, with help from interview data on behavioural factors obtained from 250 survivors. In this data set, there is a strong bias towards victimisation of young people, the elderly and women. Part of this can be explained by geographical factors regarding building performance: the rest of the explanation refers to the vulnerability of the elderly and the relationship between perception and action among female victims, who tend to be more fatalistic than men and thus did not abandon their homes between a major foreshock and the main shock of the earthquake, three hours later. In terms of casualties, earthquakes commonly discriminate against the elderly and women. Age and gender biases need further investigation and should be taken into account in seismic mitigation initiatives.

  3. Recent applications for rapid estimation of earthquake shaking and losses with ELER Software

    International Nuclear Information System (INIS)

    Demircioglu, M.B.; Erdik, M.; Kamer, Y.; Sesetyan, K.; Tuzun, C.

    2012-01-01

    A methodology and software package entitled Earthquake Loss Estimation Routine (ELER) was developed for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region. The work was carried out under the Joint Research Activity-3 (JRA3) of the EC FP6 project entitled Network of Research Infrastructures for European Seismology (NERIES). The ELER methodology involves: 1) finding the most likely location of the source of the earthquake using a regional seismo-tectonic database; 2) estimating the spatial distribution of selected ground motion parameters at engineering bedrock through region-specific ground motion prediction models, bias-correcting the ground motion estimates with strong ground motion data, if available; 3) estimating the spatial distribution of site-corrected ground motion parameters using a regional geology database and appropriate amplification models; and 4) estimating the losses and uncertainties at various orders of sophistication (buildings, casualties). The multi-level methodology developed for real-time estimation of losses is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships coded into ELER. The present paper provides brief information on the ELER methodology and an example application with the recent major earthquake that hit the Van province in the east of Turkey on 23 October 2011 with moment magnitude (Mw) 7.2. For this earthquake, Kandilli Observatory and Earthquake Research Institute (KOERI) provided almost real-time estimates of building damage and casualty distribution using ELER. (author)
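
    Steps 2-4 of the methodology can be caricatured as below. The GMPE coefficients, site factors and fragility parameters are invented for illustration and are not ELER's actual models.

        import numpy as np
        from scipy.stats import norm

        def gmpe_pga(magnitude, distance_km, site_amp=1.0):
            """Illustrative ground motion prediction equation (not ELER's):
            ln(PGA[g]) = a + b*M - c*ln(R + d), times a site amplification."""
            a, b, c, d = -3.5, 0.9, 1.2, 10.0      # made-up coefficients
            return np.exp(a + b * magnitude - c * np.log(distance_km + d)) * site_amp

        def damage_probability(pga, median=0.3, beta=0.6):
            """Fragility-style lognormal probability of exceeding a damage state."""
            return norm.cdf(np.log(pga / median) / beta)

        # Sites at increasing distance from a Mw 7.2 event (cf. the Van example)
        distances = np.array([5.0, 20.0, 50.0, 100.0])
        pga = gmpe_pga(7.2, distances, site_amp=np.array([1.4, 1.2, 1.0, 1.0]))
        print(np.round(pga, 3), np.round(damage_probability(pga), 2))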

  4. Density-dependent microbial turnover improves soil carbon model predictions of long-term litter manipulations

    Science.gov (United States)

    Georgiou, Katerina; Abramoff, Rose; Harte, John; Riley, William; Torn, Margaret

    2017-04-01

    Climatic, atmospheric, and land-use changes all have the potential to alter soil microbial activity via abiotic effects on soil or mediated by changes in plant inputs. Recently, many promising microbial models of soil organic carbon (SOC) decomposition have been proposed to advance understanding and prediction of climate and carbon (C) feedbacks. Most of these models, however, exhibit unrealistic oscillatory behavior and SOC insensitivity to long-term changes in C inputs. Here we diagnose the sources of instability in four models that span the range of complexity of these recent microbial models, by sequentially adding complexity to a simple model to include microbial physiology, a mineral sorption isotherm, and enzyme dynamics. We propose a formulation that introduces density-dependence of microbial turnover, which acts to limit population sizes and reduce oscillations. We compare these models to results from 24 long-term C-input field manipulations, including the Detritus Input and Removal Treatment (DIRT) experiments, to show that there are clear metrics that can be used to distinguish and validate the inherent dynamics of each model structure. We find that widely used first-order models and microbial models without density-dependence cannot readily capture the range of long-term responses observed across the DIRT experiments as a direct consequence of their model structures. The proposed formulation improves predictions of long-term C-input changes, and implies greater SOC storage associated with CO2-fertilization-driven increases in C inputs over the coming century compared to common microbial models. Finally, we discuss our findings in the context of improving microbial model behavior for inclusion in Earth System Models.
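
    The density-dependence idea lends itself to a compact numerical sketch: a two-pool model in which microbial mortality scales as B**beta with beta > 1, which limits population size and lets SOC respond to a change in litter inputs. All parameter values below are invented for illustration and are not those of the models compared in the study.

        def step(C, B, I, dt=0.1, Vmax=0.8, Km=200.0, eps=0.5, kB=0.01, beta=2.0):
            """One Euler step of a minimal SOC (C) / microbial biomass (B) model.
            Decomposition is Michaelis-Menten in C; mortality kB*B**beta is
            density-dependent for beta > 1, damping the oscillations seen in
            beta = 1 microbial models."""
            uptake = Vmax * B * C / (Km + C)
            dC = I - uptake + kB * B**beta     # litter inputs + microbial necromass
            dB = eps * uptake - kB * B**beta   # growth minus density-dependent death
            return C + dt * dC, B + dt * dB

        C, B = 1000.0, 50.0
        for t in range(40000):                 # spin up, then double litter inputs
            C, B = step(C, B, I=5.0 if t < 20000 else 10.0)
        print(round(C, 1), round(B, 1))        # SOC now responds to input change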

  5. Vegetation cover, tidal amplitude and land area predict short-term marsh vulnerability in Coastal Louisiana

    Science.gov (United States)

    Schoolmaster, Donald; Stagg, Camille L.; Sharp, Leigh Anne; McGinnis, Tommy S.; Wood, Bernard; Piazza, Sarai

    2018-01-01

    The loss of coastal marshes is a topic of great concern, because these habitats provide tangible ecosystem services and are at risk from sea-level rise and human activities. In recent years, significant effort has gone into understanding and modeling the relationships between the biological and physical factors that contribute to marsh stability. Simulation-based process models suggest that marsh stability is the product of a complex feedback between sediment supply, flooding regime and vegetation response, resulting in elevation gains sufficient to match the combination of relative sea-level rise and losses from erosion. However, there have been few direct, empirical tests of these models, because long-term datasets that capture sufficient numbers of marsh loss events in the context of a rigorous monitoring program are rare. We use a multi-year dataset collected by the Coastwide Reference Monitoring System (CRMS), which includes transitions of monitored vegetation plots to open water, to build and test a predictive model of near-term marsh vulnerability. We found that, despite the conclusions of previous process models, elevation change had no ability to predict the transition of vegetated marsh to open water. However, the processes that drive elevation change were significant predictors of transitions. Specifically, vegetation cover in the prior year, land area in the surrounding 1 km2 (an estimate of marsh fragmentation), and the interaction of tidal amplitude and position in the tidal frame were all significant factors predicting marsh loss. This suggests that 1) elevation change is likely a better predictor of marsh loss at time scales longer than those considered in this study, and 2) the significant predictive factors affect marsh vulnerability through pathways other than elevation change, such as resistance to erosion. In addition, we found that the sensitivity of marsh vulnerability to the predictive factors varied spatially across coastal Louisiana
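
    A hedged sketch of the kind of plot-level vulnerability model described above: a logistic regression of open-water transitions on prior-year cover, surrounding land area, and the amplitude-by-position tidal interaction. The data are synthetic, not CRMS observations.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(3)
        n = 500
        cover = rng.uniform(0, 1, n)            # vegetation cover in prior year
        land_area = rng.uniform(0, 1, n)        # land fraction in surrounding 1 km^2
        tidal_amp = rng.uniform(0.1, 0.6, n)    # tidal amplitude (m)
        tidal_pos = rng.uniform(-0.3, 0.3, n)   # position in tidal frame (m)

        # Simulate transitions: more likely with low cover, fragmentation,
        # and an unfavourable amplitude-by-position interaction.
        logit = -1.0 - 2.5 * cover - 2.0 * land_area + 4.0 * tidal_amp * tidal_pos
        lost = rng.random(n) < 1 / (1 + np.exp(-logit))

        X = np.column_stack([cover, land_area, tidal_amp * tidal_pos])
        model = LogisticRegression().fit(X, lost)
        print(dict(zip(["cover", "land_area", "amp_x_pos"], model.coef_[0].round(2))))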

  6. Implicit attitudes towards smoking predict long-term relapse in abstinent smokers.

    Science.gov (United States)

    Spruyt, Adriaan; Lemaigre, Valentine; Salhi, Bihiyga; Van Gucht, Dinska; Tibboel, Helen; Van Bockstaele, Bram; De Houwer, Jan; Van Meerbeeck, Jan; Nackaerts, Kristiaan

    2015-07-01

    It has previously been argued that implicit attitudes toward substance-related cues drive addictive behavior. Nevertheless, it remains an open question whether behavioral markers of implicit attitude activation can be used to predict long-term relapse. The main objective of this study was to examine the relationship between implicit attitudes toward smoking-related cues and long-term relapse in abstaining smokers. Implicit attitudes toward smoking-related cues were assessed by means of the Implicit Association Test (IAT) and the evaluative priming task (EPT). Both measures were completed by a group of smokers who volunteered to quit smoking (patient group) and a group of nonsmokers (control group). Participants in the patient group completed these measures twice: once prior to smoking cessation and once after smoking cessation. Relapse was assessed by means of short telephone survey, 6 months after completion of the second test session. EPT scores obtained prior to smoking cessation were related to long-term relapse and correlated with self-reported nicotine dependence as well as daily cigarette consumption. In contrast, none of the behavioral outcome measures were found to correlate with the IAT scores. These findings corroborate the idea that implicit attitudes toward substance-related cues are critically involved in long-term relapse. A potential explanation for the divergent findings obtained with the IAT and EPT is provided.

  7. Reduced Right Ventricular Function Predicts Long-Term Cardiac Re-Hospitalization after Cardiac Surgery.

    Directory of Open Access Journals (Sweden)

    Leela K Lella

    The significance of right ventricular ejection fraction (RVEF), independent of left ventricular ejection fraction (LVEF), following isolated coronary artery bypass grafting (CABG) and valve procedures remains unknown. The aim of this study is to examine the significance of abnormal RVEF by cardiac magnetic resonance (CMR), independent of LVEF, in predicting outcomes of patients undergoing isolated CABG and valve surgery. From 2007 to 2009, 109 consecutive patients (mean age, 66 years; 38% female) were referred for pre-operative CMR. Abnormal RVEF and LVEF were defined by pre-specified cut-off values. Outcomes beyond 30 days included cardiac re-hospitalization, worsening congestive heart failure and mortality. Mean clinical follow-up was 14 months. Forty-eight patients had reduced RVEF (mean 25%) and 61 patients had normal RVEF (mean 50%) (p<0.001). Fifty-four patients had reduced LVEF (mean 30%) and 55 patients had normal LVEF (mean 59%) (p<0.001). Patients with reduced RVEF had a higher incidence of long-term cardiac re-hospitalization vs. patients with normal RVEF (31% vs. 13%, p<0.05). Abnormal RVEF was a predictor of long-term cardiac re-hospitalization (HR 3.01 [CI 1.5-7.9], p<0.03). Reduced LVEF did not influence long-term cardiac re-hospitalization. Abnormal RVEF is a stronger predictor of long-term cardiac re-hospitalization than abnormal LVEF in patients undergoing isolated CABG and valve procedures.

  8. Synaptic Transmission Optimization Predicts Expression Loci of Long-Term Plasticity.

    Science.gov (United States)

    Costa, Rui Ponte; Padamsey, Zahid; D'Amour, James A; Emptage, Nigel J; Froemke, Robert C; Vogels, Tim P

    2017-09-27

    Long-term modifications of neuronal connections are critical for reliable memory storage in the brain. However, their locus of expression (pre- or postsynaptic) is highly variable. Here we introduce a theoretical framework in which long-term plasticity performs an optimization of the postsynaptic response statistics toward a given mean with minimal variance. Consequently, the state of the synapse at the time of plasticity induction determines the ratio of pre- and postsynaptic modifications. Our theory explains the experimentally observed expression loci of the hippocampal and neocortical synaptic potentiation studies we examined. Moreover, the theory predicts presynaptic expression of long-term depression, consistent with experimental observations. At inhibitory synapses, the theory suggests a statistically efficient excitatory-inhibitory balance in which changes in inhibitory postsynaptic response statistics specifically target the mean excitation. Our results provide a unifying theory for understanding the expression mechanisms and functions of long-term synaptic transmission plasticity. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Ordinary kriging approach to predicting long-term particulate matter concentrations in seven major Korean cities

    Directory of Open Access Journals (Sweden)

    Sun-Young Kim

    2014-09-01

    Objectives: Cohort studies of associations between air pollution and health have used exposure prediction approaches to estimate individual-level concentrations. A common prediction method used in Korean cohort studies is ordinary kriging. In this study, the performance of ordinary kriging models for long-term concentrations of particulate matter less than or equal to 10 μm in diameter (PM10) in seven major Korean cities was investigated, with a focus on spatial prediction ability. Methods: We obtained hourly PM10 data for 2010 at 226 urban-ambient monitoring sites in South Korea and computed annual average PM10 concentrations at each site. Given the annual averages, we developed ordinary kriging prediction models for each of the seven major cities and for the entire country by using an exponential covariance reference model and a maximum likelihood estimation method. For model evaluation, cross-validation was performed, and mean square error and R-squared (R2) statistics were computed. Results: Mean annual average PM10 concentrations in the seven major cities ranged between 45.5 and 66.0 μg/m3 (standard deviation = 2.40 and 9.51 μg/m3, respectively). Cross-validated R2 values in Seoul and Busan were 0.31 and 0.23, respectively, whereas the other five cities had R2 values of zero. The national model produced a higher cross-validated R2 (0.36) than the city-specific models. Conclusions: In general, the ordinary kriging models performed poorly for the seven major cities and the entire country of South Korea, but performance was better for the national model. To improve model performance, future studies should examine different prediction approaches that incorporate PM10 source characteristics.
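
    An ordinary kriging predictor with an exponential covariance can be sketched compactly; here the sill and range are fixed by hand, whereas the study estimates them by maximum likelihood, and the monitor data are synthetic.

        import numpy as np

        def exp_cov(d, sill=1.0, corr_range=50.0):
            """Exponential covariance: C(d) = sill * exp(-d / range)."""
            return sill * np.exp(-d / corr_range)

        def ordinary_krige(xy_obs, z_obs, xy_new):
            """Ordinary kriging with a Lagrange multiplier enforcing that
            the weights sum to one (unbiasedness)."""
            n = len(z_obs)
            d = np.linalg.norm(xy_obs[:, None] - xy_obs[None, :], axis=-1)
            K = np.empty((n + 1, n + 1))
            K[:n, :n] = exp_cov(d)
            K[n, :n] = K[:n, n] = 1.0
            K[n, n] = 0.0
            k = np.append(exp_cov(np.linalg.norm(xy_obs - xy_new, axis=-1)), 1.0)
            w = np.linalg.solve(K, k)
            return w[:n] @ z_obs

        rng = np.random.default_rng(4)
        sites = rng.uniform(0, 100, size=(30, 2))   # monitor coordinates (km)
        pm10 = 55 + 5 * np.sin(sites[:, 0] / 30) + rng.normal(0, 1, 30)
        print(round(ordinary_krige(sites, pm10, np.array([50.0, 50.0])), 1))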

  10. Risk score for predicting long-term mortality after coronary artery bypass graft surgery.

    Science.gov (United States)

    Wu, Chuntao; Camacho, Fabian T; Wechsler, Andrew S; Lahey, Stephen; Culliford, Alfred T; Jordan, Desmond; Gold, Jeffrey P; Higgins, Robert S D; Smith, Craig R; Hannan, Edward L

    2012-05-22

    No simplified bedside risk scores have been created to predict long-term mortality after coronary artery bypass graft (CABG) surgery. The New York State Cardiac Surgery Reporting System was used to identify 8597 patients who underwent isolated CABG surgery in July through December 2000. The National Death Index was used to ascertain patients' vital status through December 31, 2007. A Cox proportional hazards model was fit to predict death after CABG surgery using preprocedural risk factors, and points were assigned to significant predictors of death on the basis of the values of their regression coefficients. For each possible point total, the predicted risks of death at years 1, 3, 5, and 7 were calculated. The 7-year mortality rate was 24.2% in the study population. Significant predictors of death included age, body mass index, ejection fraction, unstable hemodynamic state or shock, left main coronary artery disease, cerebrovascular disease, peripheral arterial disease, congestive heart failure, malignant ventricular arrhythmia, chronic obstructive pulmonary disease, diabetes mellitus, renal failure, and history of open heart surgery. The points assigned to these risk factors ranged from 1 to 7; possible point totals for each patient ranged from 0 to 28. The observed and predicted risks of death at years 1, 3, 5, and 7 across patient groups stratified by point totals were highly correlated. The simplified risk score accurately predicted the risk of mortality after CABG surgery and can be used for informed consent and as an aid in determining treatment choice.
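
    The points-from-coefficients construction the abstract describes is standard and can be sketched as follows; the coefficients and the baseline survival value are invented for illustration and are not the published score.

        import numpy as np

        # Hypothetical Cox coefficients (log hazard ratios) for a few predictors
        coefs = {"age_per_decade": 0.35, "copd": 0.40, "diabetes": 0.30,
                 "renal_failure": 0.75, "chf": 0.55}

        scale = min(coefs.values())              # smallest effect = 1 point
        points = {k: int(round(v / scale)) for k, v in coefs.items()}

        def predicted_mortality(total_points, s0=0.90):
            """Map a point total to predicted mortality at a fixed horizon
            via S(t) = S0(t) ** exp(linear predictor)."""
            return 1.0 - s0 ** np.exp(total_points * scale)

        print(points)
        total = points["age_per_decade"] + points["copd"] + points["renal_failure"]
        print(round(predicted_mortality(total), 3))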

  11. Evaluation of Earthquake Detection Performance in Terms of Quality and Speed in SEISCOMP3 Using New Modules Qceval, Npeval and Sceval

    Science.gov (United States)

    Roessler, D.; Weber, B.; Ellguth, E.; Spazier, J.

    2017-12-01

    The geometry of seismic monitoring networks, site conditions and data availability, as well as monitoring targets and strategies, typically impose trade-offs between data quality, earthquake detection sensitivity, false detections and alert times. Network detection capabilities typically change with alteration of the seismic noise level by human activity or by varying weather and sea conditions. To give helpful information to operators and maintenance coordinators, gempa developed a range of tools to evaluate earthquake detection and network performance, including qceval, npeval and sceval. qceval is a module that analyzes waveform quality parameters in real time and deactivates and reactivates data streams for automatic processing based on waveform quality thresholds. For example, thresholds can be defined for latency, delay, timing quality, spike and gap counts, and rms. As changes in the automatic processing have a direct influence on detection quality and speed, another tool called "npeval" was designed to calculate in real time the expected time needed to detect and locate earthquakes by evaluating the effective network geometry, which is derived from the configuration of stations participating in the detection. The detection times are shown as an additional layer on the map and updated in real time as soon as the effective network geometry changes. Yet another new tool, "sceval", is an automatic module that classifies located seismic events (Origins) in real time. sceval evaluates the spatial distribution of the stations contributing to an Origin, and it confirms or rejects the status of Origins, adds comments, or leaves the Origin unclassified. The comments are passed to an additional sceval plug-in where the end user can customize event types. This unique identification of real and fake events in earthquake catalogues allows network detection thresholds to be lowered. In real-time monitoring situations operators can limit the processing to
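
    As an illustration of the qceval idea only (this is not gempa's implementation or API; all field names and thresholds below are assumptions), a threshold filter over waveform-quality parameters might look like this:

        # Invented thresholds on waveform-quality parameters
        THRESHOLDS = {"latency_s": 30.0, "delay_s": 10.0, "gaps_per_hour": 5,
                      "spikes_per_hour": 3, "min_timing_quality": 50}

        def stream_enabled(qc):
            """Return False if any quality metric violates its threshold,
            i.e. the stream should be removed from automatic processing."""
            if qc["latency_s"] > THRESHOLDS["latency_s"]:
                return False
            if qc["delay_s"] > THRESHOLDS["delay_s"]:
                return False
            if qc["gaps_per_hour"] > THRESHOLDS["gaps_per_hour"]:
                return False
            if qc["spikes_per_hour"] > THRESHOLDS["spikes_per_hour"]:
                return False
            return qc["timing_quality"] >= THRESHOLDS["min_timing_quality"]

        qc_report = {"latency_s": 4.2, "delay_s": 1.1, "gaps_per_hour": 0,
                     "spikes_per_hour": 0, "timing_quality": 92}
        print(stream_enabled(qc_report))   # True -> keep stream active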

  12. Performance of wire-type Rn detectors operated with gas gain in ambient air in view of its possible application to early earthquake predictions

    CERN Document Server

    Charpak, Georges; Breuil, P; Nappi, E; Martinengo, P; Peskov, V

    2010-01-01

    We describe a detector of alpha particles based on wire-type counters (single-wire and multiwire) operating in ambient air at high gas gains (100-1000). The main advantages of these detectors are low cost, robustness, and the ability to operate in humid air. The minimum detectable activity achieved with the multiwire detector for an integration time of 1 min is 140 Bq per m3, which is comparable to that featured by commercial devices. Owing to such features, the detector is suited for massive deployment, for example for continuous monitoring of Rn or Po contamination or, as discussed in the paper, for use in a network of Rn counters in areas affected by earthquakes in order to verify, on a solid statistical basis, the envisaged correlation between a sudden Rn appearance and a forthcoming earthquake.

  13. Taylor Series-Based Long-Term Creep-Life Prediction of Alloy 617

    International Nuclear Information System (INIS)

    Yin, Song Nan; Kim, Woo Gon; Kim, Yong Wan; Park, Jae Young; Kim, Soen Jin

    2010-01-01

    In this study, a Taylor series (T-S) model based on the Arrhenius, McVetty, and Monkman-Grant equations was developed using mathematical analysis. In order to reduce fitting errors, the McVetty equation was transformed by considering the first three terms of the Taylor series expansion. The model parameters were accurately determined by the statistical technique of maximum likelihood estimation, and the model was applied to the creep data of alloy 617. The T-S model results showed better agreement with the experimental data than other models such as the Eno, exponential, and L-M models. In particular, the T-S model was converted into an isothermal Taylor series (IT-S) model that can predict the creep strength at a given temperature. The estimates obtained using the converted IT-S model were found to be better than those from the T-S model for predicting the long-term creep life of alloy 617

  14. MAGIC biomarkers predict long term outcomes for steroid-resistant acute GVHD.

    Science.gov (United States)

    Major-Monfried, Hannah; Renteria, Anne S; Pawarode, Attaphol; Reddy, Pavan; Ayuk, Francis; Holler, Ernst; Efebera, Yvonne A; Hogan, William J; Wölfl, Matthias; Qayed, Muna; Hexner, Elizabeth O; Wudhikarn, Kitsada; Ordemann, Rainer; Young, Rachel; Shah, Jay; Hartwell, Matthew J; Chaudhry, Mohammed; Aziz, Mina; Etra, Aaron; Yanik, Gregory A; Kröger, Nicolaus; Weber, Daniela; Chen, Yi-Bin; Nakamura, Ryotaro; Rösler, Wolf; Kitko, Carrie L; Harris, Andrew C; Pulsipher, Michael; Reshef, Ran; Kowalyk, Steven; Morales, George; Torres, Ivan; Özbek, Umut; Ferrara, James L M; Levine, John E

    2018-03-15

    Acute graft-versus-host disease (GVHD) is treated with systemic corticosteroid immunosuppression. Clinical response after one week of therapy often guides further treatment decisions, but long-term outcomes vary widely between centers, and more accurate predictive tests are urgently needed. We analyzed clinical data and blood samples taken after one week of systemic treatment for GVHD from 507 patients from 17 centers of the Mount Sinai Acute GVHD International Consortium (MAGIC), dividing them into a test cohort (n=236) and two validation cohorts separated in time (n=142 and n=129, respectively). Initial response to systemic steroids correlated with response at four weeks, one-year non-relapse mortality (NRM) and overall survival (OS). A previously validated algorithm of two MAGIC biomarkers (ST2 and REG3α) consistently separated steroid-resistant patients into two groups with dramatically different NRM and OS (p<0.001 for all three cohorts). High biomarker probability, resistance to steroids and GVHD severity (Minnesota risk) were all significant predictors of NRM in multivariate analysis. A direct comparison of receiver operating characteristic curves showed that the area under the curve for biomarker probability (0.82) was significantly greater than that for steroid response (0.68, p=0.004) and for Minnesota risk (0.72, p=0.005). In conclusion, MAGIC biomarker probabilities generated after one week of systemic treatment for GVHD predict long-term outcomes in steroid-resistant GVHD better than clinical criteria and should prove useful in developing better treatment strategies. Copyright © 2018 American Society of Hematology.

  15. A review on the young history of the wind power short-term prediction

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Alexandre; Navarro, Jorge [Wind Energy, Division of Renewable Energies, Department of Energy, CIEMAT, Av. Complutense, 22, Ed. 42, 28044 Madrid (Spain); Crespo, Antonio [Laboratorio de Mecanica de Fluidos, Departmento de Ingenieria Energetica y Fluidomecanica, ETSII, Universidad Politecnica de Madrid, C/Jose Gutierrez Abascal, 2-28006 Madrid (Spain); Lizcano, Gil [Oxford University Centre for the Environment, University of Oxford, South Parks Road, Oxford OX1 3QY (United Kingdom); Madsen, Henrik [Informatics and Mathematical Modelling - IMM, Technical University of Denmark, Richard Petersens Plads, Building 321, Office 019, 2800 Kgs. Lyngby (Denmark); Feitosa, Everaldo [Brazilian Wind Energy Centre - CBEE, Centro de Tecnologia e Geociencias, UFPE-50.740-530 Recife, PE (Brazil)

    2008-08-15

    This paper presents a brief review of 30 years of history of wind power short-term prediction, from the first ideas and sketches on the theme to the current state of the art in models and tools, with emphasis on the most significant proposals and developments. The two principal lines of thought on short-term prediction (mathematical and physical) are treated here without distinction, and comparisons between models and tools are avoided, mainly because, on the one hand, a standard measure of performance has not yet been adopted and, on the other hand, the data must be exactly the same in order to compare two models (this fact makes it almost impossible to carry out a quantitative comparison between a large number of models and methods). In place of a quantitative description, a qualitative approach is preferred for this review, remarking on the contribution (and innovative aspect) of each model. On the basis of the review, some topics for future research are pointed out. (author)

  16. Published attenuation functions compared to 6/29/1992 Little Skull Mountain earthquake motion

    International Nuclear Information System (INIS)

    Hofmann, R.B.; Ibrahim, A.K.

    1994-01-01

    Several western U.S. strong-motion earthquake attenuation functions are compared to peak accelerations recorded during the 6/29/1992 Little Skull Mountain, Nevada, earthquake. The comparison revealed that several definitions of site-to-source distance and at least two definitions of peak acceleration are in use. Probabilistic seismic hazard analysis (PSHA) codes typically estimate accelerations assuming point sources. The computer code SEISM 1 was developed for the eastern U.S., where ground acceleration is usually defined in terms of epicentral distance. Formulae whose distance definitions require knowledge of the earthquake fault slip zone dimensions may predict very different near-field accelerations when epicentral distance is used. Approximations to achieve more consistent PSHA results are derived

  17. Short-term Prediction of Coronary Heart Disease Mortality in the Czech Republic Based on Data from 1968-2014.

    Czech Academy of Sciences Publication Activity Database

    Reissigová, Jindra; Zvolský, M.

    2018-01-01

    Vol. 26, No. 1 (2018), pp. 10-15. ISSN 1210-7778. Institutional support: RVO:67985807. Keywords: mortality * coronary heart diseases * short-term prediction * long-term prediction * national health registries. Subject RIV: BB - Applied Statistics, Operational Research. OECD field: Applied mathematics. Impact factor: 0.682 (2016). https://cejph.szu.cz/artkey/cjp-201801-0002_short-term-prediction-of-coronary-heart-disease-mortality-in-the-czech-republic-based-on-data-from-1968-2014.php

  18. Seismomagnetic effects from the long-awaited 28 September 2004 M 6.0 parkfield earthquake

    Science.gov (United States)

    Johnston, M.J.S.; Sasai, Y.; Egbert, G.D.; Mueller, R.J.

    2006-01-01

    Precise measurements of local magnetic fields have been obtained with a differentially connected array of seven synchronized proton magnetometers located along 60 km of the locked-to-creeping transition region of the San Andreas fault at Parkfield, California, since 1976. The M 6.0 Parkfield earthquake of 28 September 2004 occurred within this array and generated coseismic magnetic field changes of between 0.2 and 0.5 nT at five sites in the network. No preseismic magnetic field changes exceeding background noise levels are apparent in the magnetic data during the month, week, and days before the earthquake (nor expected, in light of the absence of measurable precursory deformation, seismicity, or pore pressure changes). Observations of electric and magnetic fields from 0.01 to 20 Hz are also made at one site near the end of the earthquake rupture and corrected for common-mode signals from the ionosphere/magnetosphere using a second site some 115 km to the northwest along the fault. These magnetic data show no indications of unusual noise before the earthquake in the ULF band (0.01-20 Hz), as suggested may have preceded the 1989 ML 7.1 Loma Prieta earthquake. Nor do we see electric field changes similar to those suggested, from data in Greece, to occur before earthquakes of this magnitude. Uniform and variable slip piezomagnetic models of the earthquake, derived from strain, displacement, and seismic data, generate magnetic field perturbations that are consistent with those observed by the magnetometer array. A higher rate of longer-term magnetic field change, consistent with increased loading in the region, is apparent since 1993. This accompanied an increased rate of secular shear strain observed on a two-color EDM network and a small network of borehole tensor strainmeters, and increased seismicity dominated by three M 4.5-5 earthquakes roughly a year apart in 1992, 1993, and 1994. Models incorporating all of these data indicate increased slip at depth in the region

  19. Prediction of hyperbilirubinemia by noninvasive methods in full-term newborns

    Directory of Open Access Journals (Sweden)

    Danijela Furlan

    2013-02-01

    Introduction: Noninvasive screening methods for bilirubin determination were studied prospectively in a group of full-term healthy newborns, with the aim of early prediction of pathological neonatal hyperbilirubinemia. Laboratory determination of bilirubin by the Jendrassik-Grof (JG) method was compared to noninvasive transcutaneous bilirubin (TcBIL) measurement, together with the determination of bilirubin in cord blood. Methods: The study group consisted of 284 full-term healthy consecutively born infants in the period from March to June 2011. The whole group was divided into a group with physiological (n=199) and a group with pathological (n=85) hyperbilirubinemia according to the level of total bilirubin (cut-off 220 μmol/L). Bilirubin in cord blood (CbBIL) and in capillary blood at the age of three days was determined according to JG; on the 3rd day, TcBIL was also measured with a Bilicheck bilirubinometer. The Kolmogorov-Smirnov and Mann-Whitney tests were used for the statistical analysis. Results: Bilirubin concentrations differed statistically significantly between the groups of newborns with physiological (n=199) and pathological (n=85) hyperbilirubinemia (CbBIL: p<0.001; 3rd-day control sample: p<0.001; TcBIL: p<0.001). Using the cut-off value of cord blood bilirubin of 28 μmol/L, we could predict the development of pathological hyperbilirubinemia with 98.8% prognostic specificity, and with 100% sensitivity that newborns would not require phototherapy (all irradiated newborns were taken into account). We confirmed an excellent agreement between bilirubin concentrations determined by the TcBIL and JG methods for both groups of healthy full-term newborns. Conclusion: Based on our results, we recommend that determination of cord blood bilirubin, in combination with the measurement of TcBIL, be implemented into practice for early prediction of pathological hyperbilirubinemia in full-term healthy newborns. The advantages of both methods in the routine

  20. Predicting Discharge to Institutional Long-Term Care After Stroke: A Systematic Review and Metaanalysis.

    Science.gov (United States)

    Burton, Jennifer K; Ferguson, Eilidh E C; Barugh, Amanda J; Walesby, Katherine E; MacLullich, Alasdair M J; Shenkin, Susan D; Quinn, Terry J

    2018-01-01

    Stroke is a leading cause of disability worldwide, and a significant proportion of stroke survivors require long-term institutional care. Understanding who cannot be discharged home is important for health and social care planning. Our aim was to establish predictive factors for discharge to institutional care after hospitalization for stroke. We registered and conducted a systematic review and meta-analysis (PROSPERO: CRD42015023497) of observational studies, searching MEDLINE, EMBASE, and CINAHL Plus to February 2017; quantitative synthesis was performed where data allowed. Settings were acute and rehabilitation hospitals; participants were adults hospitalized for stroke who were newly admitted directly to long-term institutional care at the time of hospital discharge; the measurements were factors associated with new institutionalization. From 10,420 records, we included 18 studies (n = 32,139 participants). The studies were heterogeneous and conducted in Europe, North America, and East Asia; eight were at high risk of selection bias. The proportion of those surviving to discharge who were newly discharged to long-term care varied from 7% to 39% (median 17%, interquartile range 12%), and the model of care received in the long-term care setting was not defined. Older age and greater stroke severity had a consistently positive association with the need for long-term care admission: individuals who had a severe stroke were 26 times as likely to be admitted to long-term care as those who had a minor stroke, and individuals aged 65 and older had a risk of admission three times that of younger individuals. Potentially modifiable factors were rarely examined. Age and stroke severity are important predictors of institutional long-term care admission directly from the hospital after an acute stroke; potentially modifiable factors should be the target of future research, and stroke outcome studies should report discharge destination, defining the model of care provided in the long-term care setting.

  1. DOP prediction over Egypt from SP3 file for long-term

    Directory of Open Access Journals (Sweden)

    Aly M. El-naggar

    2012-09-01

    An important error source in satellite surveying deals with the geometry of the visible satellite constellation at the time of observation. This is similar to the situation in traditional surveys, where the geometry of the network of observed ground stations affects the accuracy of the computed positions. Dilution of precision (DOP) is an indicator of the quality of the geometry of the visible satellite constellation. The value of DOP is very important during observation sessions, because many projects have special requirements regarding positioning accuracy, and these requirements force the surveyor to hold the survey mission until the dilution of precision at the time of observation meets them. This means that GPS mission planning is an essential task before any GPS survey. This paper is concerned with GPS mission planning: the values of dilution of precision and the number of visible satellites are predicted for the observation points, in order to determine the best observation periods that meet the project requirements. The paper explores the process of long-term DOP prediction using the Standard Product 3 (SP3) file and attempts to mathematically or statistically bound the navigation error. The actual orbital period of a GPS satellite is defined, and DOP contour maps over Egypt are constructed. Using these contour maps, one can predict the DOP by knowing the city name.
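
    DOP follows from satellite geometry alone, which is why it can be predicted ahead of time from an ephemeris such as the SP3 file. The sketch below computes GDOP/PDOP/HDOP/VDOP from made-up azimuth/elevation pairs; in the paper the line-of-sight vectors would come from the SP3 orbits and the observation point.

        import numpy as np

        def dops(unit_los):
            """DOP values from receiver-to-satellite unit vectors (local ENU)."""
            A = np.hstack([unit_los, np.ones((len(unit_los), 1))])  # clock column
            Q = np.linalg.inv(A.T @ A)
            gdop = np.sqrt(np.trace(Q))
            pdop = np.sqrt(Q[0, 0] + Q[1, 1] + Q[2, 2])
            hdop = np.sqrt(Q[0, 0] + Q[1, 1])
            vdop = np.sqrt(Q[2, 2])
            return gdop, pdop, hdop, vdop

        # Four illustrative satellites given as (azimuth, elevation) in degrees
        az, el = np.deg2rad([0, 90, 180, 270]), np.deg2rad([60, 30, 45, 20])
        unit_los = np.column_stack([np.cos(el) * np.sin(az),   # east
                                    np.cos(el) * np.cos(az),   # north
                                    np.sin(el)])               # up
        print([round(v, 2) for v in dops(unit_los)])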

  2. Sensitivity Analysis of Wavelet Neural Network Model for Short-Term Traffic Volume Prediction

    Directory of Open Access Journals (Sweden)

    Jinxing Shen

    2013-01-01

    In order to achieve a more accurate and robust traffic volume prediction model, the sensitivity of the wavelet neural network model (WNNM) is analyzed in this study. Based on real loop-detector data provided by the traffic police detachment of Maanshan, the WNNM is examined with different numbers of input neurons, different numbers of hidden neurons, and traffic volumes aggregated over different time intervals. The test results show that the performance of the WNNM depends heavily on the network parameters and the time interval of the traffic volume. The WNNM with 4 input neurons and 6 hidden neurons is the optimal predictor, offering the best accuracy, stability, and adaptability, and markedly better predictions are achieved when the time interval of the traffic volume is 15 minutes. In addition, the optimized WNNM is compared with the widely used back-propagation neural network (BPNN); the comparison indicates that the WNNM produces much lower values of MAE, MAPE, and VAPE than the BPNN, demonstrating that the WNNM performs better in short-term traffic volume prediction.

  3. Prediction of tritium behavior in rice plant after a short-term exposure of HTO

    International Nuclear Information System (INIS)

    Yook, Dae Sik; Lee, Kun Jai; Choi, Heui Joo; Lee, Chang Min

    2001-01-01

    In many Asian countries, including Korea, rice is a very important food crop: its grain is consumed by humans and its straw is used to feed animals. Because four CANDU reactors are in operation in Korea, relatively large amounts of tritium are released into the environment, and the dose from this tritium through the rice plant must be estimated. Since 1997, KAERI (Korea Atomic Energy Research Institute) has carried out experimental studies to obtain domestic data on various parameters related to the direct tritium contamination of plants, but the analysis of tritium behavior in the rice plant has been insufficient. In this study, the behavior of tritium in the rice plant is predicted and compared with the measurements performed at KAERI. Using the conceptual model of the soil-plant-atmosphere tritiated water transport system suggested by Charles E. Murphy, transient tritium concentrations in soil and leaves were predicted. If the effect of the tritium concentration in the soil is taken into account, the tritium concentration in leaves can be described by a double exponential model; if the tritium concentration in the soil is disregarded, it can be described by a single exponential term, as in other relevant models, e.g. the UFOTRI or STAR-H3 models. The results can be used to predict the tritium concentration in the rice plant near the plant site and to estimate the ingestion dose after a release of tritium to the environment
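
    The distinction drawn above can be made concrete with a toy calculation: keeping the soil term gives a double-exponential decline of leaf HTO after a short exposure, while dropping it leaves a single exponential. The rate constants and soil fraction below are illustrative only, not values from the study.

        import numpy as np

        def leaf_hto(t_hours, c0, k_fast=0.7, k_slow=0.01, soil_fraction=0.15):
            """Leaf tissue water tritium after a short HTO exposure: a fast
            atmospheric-exchange term plus a slow soil-fed term."""
            fast = (1.0 - soil_fraction) * np.exp(-k_fast * t_hours)
            slow = soil_fraction * np.exp(-k_slow * t_hours)
            return c0 * (fast + slow)

        t = np.array([0.0, 1.0, 6.0, 24.0, 72.0])
        print(np.round(leaf_hto(t, c0=1000.0), 1))     # double exponential
        print(np.round(1000.0 * np.exp(-0.7 * t), 1))  # soil term disregarded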

  4. A Gaussian process regression based hybrid approach for short-term wind speed prediction

    International Nuclear Information System (INIS)

    Zhang, Chi; Wei, Haikun; Zhao, Xin; Liu, Tianhong; Zhang, Kanjian

    2016-01-01

    Highlights: • A novel hybrid approach is proposed for short-term wind speed prediction. • This method combines the parametric AR model with the non-parametric GPR model. • The relative importance of different inputs is considered. • Different types of covariance functions are considered and combined. • It can provide both accurate point forecasts and satisfactory prediction intervals. - Abstract: This paper proposes a hybrid model based on autoregressive (AR) model and Gaussian process regression (GPR) for probabilistic wind speed forecasting. In the proposed approach, the AR model is employed to capture the overall structure from wind speed series, and the GPR is adopted to extract the local structure. Additionally, automatic relevance determination (ARD) is used to take into account the relative importance of different inputs, and different types of covariance functions are combined to capture the characteristics of the data. The proposed hybrid model is compared with the persistence model, artificial neural network (ANN), and support vector machine (SVM) for one-step ahead forecasting, using wind speed data collected from three wind farms in China. The forecasting results indicate that the proposed method can not only improve point forecasts compared with other methods, but also generate satisfactory prediction intervals.
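
    The hybrid idea above can be sketched under simple assumptions: an AR model captures the overall structure, and a Gaussian process regression over the AR residuals captures the local structure, with a combined covariance plus a noise term. scikit-learn is used here; the kernel choice, AR order and toy data are assumptions, not the paper's configuration.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import (RBF, RationalQuadratic,
                                                      WhiteKernel)

        rng = np.random.default_rng(5)
        v = 8 + 2 * np.sin(np.arange(300) / 15) + rng.normal(0, 0.4, 300)

        p = 3                                   # AR order (assumed)
        X = np.column_stack([v[i : len(v) - p + i] for i in range(p)])
        y = v[p:]
        A = np.column_stack([X, np.ones(len(X))])
        ar_coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        ar_pred = A @ ar_coef                   # AR captures the overall structure

        # GPR on the AR residuals captures the local structure; the kernel
        # combines two covariance types plus a noise term.
        kernel = RBF(length_scale=10.0) + RationalQuadratic() + WhiteKernel()
        gpr = GaussianProcessRegressor(kernel=kernel)
        gpr.fit(X[:250], (y - ar_pred)[:250])
        resid_mean, resid_std = gpr.predict(X[250:], return_std=True)

        forecast = ar_pred[250:] + resid_mean
        lower, upper = forecast - 2 * resid_std, forecast + 2 * resid_std
        print(round(np.abs(forecast - y[250:]).mean(), 3))   # point-forecast MAE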

  5. Markers of preparatory attention predict visual short-term memory performance.

    Science.gov (United States)

    Murray, Alexandra M; Nobre, Anna C; Stokes, Mark G

    2011-05-01

    Visual short-term memory (VSTM) is limited in capacity, so it is important to encode only the visual information that is most likely to be relevant to behaviour. Here we asked which aspects of selective biasing of VSTM encoding predict subsequent memory-based performance. We measured EEG during a selective VSTM encoding task, in which we varied parametrically the memory load and the precision of recall required to compare a remembered item to a subsequent probe item. On half the trials, a spatial cue indicated that participants only needed to encode items from one hemifield. We observed a typical sequence of markers of anticipatory spatial attention: the early directing-attention negativity (EDAN), the anterior directing-attention negativity (ADAN), and the late directing-attention positivity (LDAP), as well as a marker of VSTM maintenance: the contralateral delay activity (CDA). We found that individual differences in preparatory brain activity (EDAN/ADAN) predicted cue-related changes in recall accuracy, indexed by memory-probe discrimination sensitivity (d'). Importantly, our parametric manipulation of memory-probe similarity also allowed us to model the behavioural data for each participant, providing estimates of the quality of the memory representation and the probability that an item could be retrieved. We found that selective encoding primarily increased the probability of accurate memory recall, and that ERP markers of preparatory attention predicted the cue-related changes in recall probability. Copyright © 2011. Published by Elsevier Ltd.

  6. Antioxidant defenses predict long-term survival in a passerine bird.

    Directory of Open Access Journals (Sweden)

    Nicola Saino

    2011-05-01

    Normal and pathological processes entail the production of oxidative substances that can damage biological molecules and harm physiological functions. Organisms have evolved complex mechanisms of antioxidant defense, and any imbalance between oxidative challenge and antioxidant protection can depress fitness components and accelerate senescence. While the role of oxidative stress in pathogenesis and aging has been studied intensively in humans and model animal species under laboratory conditions, there is a dearth of knowledge on its role in shaping the life-histories of animals under natural selection regimes. Yet, given the pervasive nature and likely fitness consequences of oxidative damage, it can be expected that the need to secure efficient antioxidant protection is powerful in molding the evolutionary ecology of animals. Here, we test whether overall antioxidant defense varies with age and predicts long-term survival, using a wild population of a migratory passerine bird, the barn swallow (Hirundo rustica), as a model. Plasma antioxidant capacity (AOC) of breeding individuals was measured using standard protocols, and annual survival was monitored over five years (2006-2010) on a large sample of selection episodes. AOC did not covary with age in longitudinal analyses after discounting the effect of selection. AOC positively predicted annual survival independently of sex. Individuals were highly consistent in their relative levels of AOC, implying the existence of additive genetic variance and/or environmental (including early maternal) components consistently acting through their lives. Using longitudinal data, we showed that high levels of antioxidant protection positively predict long-term survival in a wild animal population. The present results are therefore novel in disclosing a role for antioxidant protection in determining survival under natural conditions, strongly calling for more longitudinal eco-physiological studies of life-histories in

  7. Long-term response to recombinant human growth hormone treatment: a new predictive mathematical method.

    Science.gov (United States)

    Migliaretti, G; Ditaranto, S; Guiot, C; Vannelli, S; Matarazzo, P; Cappello, N; Stura, I; Cavallo, F

    2018-07-01

    Recombinant GH has been offered to GH-deficient (GHD) subjects for more than 30 years, in order to improve height and growth velocity in children and to enhance metabolic effects in adults. The aim of our work is to describe the long-term effect of rhGH treatment in GHD pediatric patients, suggesting a growth prediction model. A database homogeneous in terms of diagnosis and treatment modalities is defined, based on the GHD patients referred to the Regina Margherita Hospital in Turin (Italy). In this study, 232 GHD patients are selected (204 idiopathic GHD and 28 organic GHD). Each measure is shown as a mean with its standard deviation (SD) and 95% confidence interval (95% CI). To estimate the final height of each patient on the basis of a few measurements, a mathematical growth prediction model [based on the Gompertzian function and a mixed method based on the radial basis functions (RBFs) and the